# Simple questions regarding antiderivative rules and fractions

• Jul 22nd 2012, 03:15 AM
astuart
Simple questions regarding antiderivative rules and fractions
I'm just having a bit of trouble remembering the process of some of the integral rules, namely the power rule and the 'indefinite integral of a constant multiple of a function' rule.

Basically, the example in my textbook is as follows (Not sure how to do the integral symbol, so hope this all makes sense - I'll use 'int.' to denote that symbol).

$\displaystyle int.\,(1/x^{3/2})\,dx = int.\ x^{-3/2}\,dx$ (No issues understanding this)

$\displaystyle = (1/(-1/2))\,x^{-1/2} + C$ (Again, no issues here)

$\displaystyle = -2x^{-1/2} + C = -(2/x^{1/2}) + C$

This is where I get a little lost. I have no idea where the 2 comes from in this part. I would have thought that from $\displaystyle = (1/(-1/2))\,x^{-1/2}$ it would in fact be $\displaystyle ((-1/2)/(-1/2))\,x^{-1/2}$, which would just be $x^{-1/2}$.

Obviously, I'm going wrong here, so an explanation would be great. The textbook makes no mention of why this occurs, and it's most likely something simple.

Cheers
• Jul 22nd 2012, 03:26 AM
Prove It
Re: Simple questions regarding antiderivative rules and fractions
Quote:

Originally Posted by astuart
I'm just having a bit of trouble remembering the process of some of the integral rules, namely the power rule and the 'indefinite integral of a constant multiple of a function' rule.

Basically, the example in my textbook is as follows (Not sure how to do the integral symbol, so hope this all makes sense - I'll use 'int.' to denote that symbol).

$\displaystyle int.\,(1/x^{3/2})\,dx = int.\ x^{-3/2}\,dx$ (No issues understanding this)

$\displaystyle = (1/(-1/2))\,x^{-1/2} + C$ (Again, no issues here)

$\displaystyle = -2x^{-1/2} + C = -(2/x^{1/2}) + C$

This is where I get a little lost. I have no idea where the 2 comes from in this part. I would have thought that from $\displaystyle = (1/(-1/2))\,x^{-1/2}$ it would in fact be $\displaystyle ((-1/2)/(-1/2))\,x^{-1/2}$, which would just be $x^{-1/2}$.

Obviously, I'm going wrong here, so an explanation would be great. The textbook makes no mention of why this occurs, and it's most likely something simple.

Cheers

First, the command for the integral symbol is \int.

Now,

$\displaystyle \begin{align*} \frac{1}{-\frac{1}{2}} &= 1 \div \left(-\frac{1}{2}\right) \\ &= 1 \times \left(-\frac{2}{1}\right) \\ &= -2 \end{align*}$
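Putting that $-2$ back into the working, the textbook's whole computation is just the power rule $\displaystyle \int x^n\,dx = \frac{x^{n+1}}{n+1} + C$ applied with $n = -\frac{3}{2}$ (a restatement of the steps above for completeness):

```latex
\displaystyle \begin{align*}
\int x^{-3/2}\,dx &= \frac{x^{-3/2 + 1}}{-\frac{3}{2} + 1} + C \\
&= \frac{x^{-1/2}}{-\frac{1}{2}} + C \\
&= -2x^{-1/2} + C \\
&= -\frac{2}{\sqrt{x}} + C
\end{align*}
```

Note that the exponent $n$ is used only inside $x^{n+1}$ and in the denominator $n+1$; it never appears in the numerator of the coefficient, which stays $1$.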
• Jul 22nd 2012, 03:35 AM
astuart
Re: Simple questions regarding antiderivative rules and fractions
Thanks, Prove It.

It appears I wasn't simplifying the fraction to get -2, but was instead copying the $n$ from $x^n$ into the numerator when that wasn't necessary at that stage.

Cheers.
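As a quick numerical sanity check (an illustrative sketch, not part of the original thread; the function names are made up here), the antiderivative $-2x^{-1/2}$ can be confirmed by differentiating it numerically and comparing against the integrand $x^{-3/2}$:

```python
def f(x):
    """The integrand x^(-3/2)."""
    return x ** -1.5

def F(x):
    """The claimed antiderivative -2*x^(-1/2)."""
    return -2 * x ** -0.5

def numeric_derivative(g, x, h=1e-6):
    """Central-difference approximation of g'(x)."""
    return (g(x + h) - g(x - h)) / (2 * h)

# F'(x) should match f(x) at every sample point (x > 0).
for x in (0.5, 1.0, 2.0, 4.0):
    assert abs(numeric_derivative(F, x) - f(x)) < 1e-6
```

The loop passes because $\frac{d}{dx}\left(-2x^{-1/2}\right) = -2 \cdot \left(-\frac{1}{2}\right) x^{-3/2} = x^{-3/2}$, which is exactly the original integrand.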