Originally Posted by **amyu2005**

OK, so I have this recurrence:

a(n) = 2a(n-1) + 3a(n-2)

so the degree is k = 2, and the characteristic equation is r^2 - 2r - 3 = 0, which factors as (r - 3)(r + 1) = 0, giving the roots r = 3 and r = -1.

then I used the general solution technique

a(n) = alpha1(r1^n) + alpha2(r2^n)

and substituted my r values to get:

a(n) = alpha1[3^n] + alpha2[(-1)^n]

then I used the initial conditions a(0) = 0 and a(1) = 2 (substituting n = 0 and n = 1) in the equation:

a(0) = alpha1[3^0] + alpha2[(-1)^0] = alpha1 + alpha2 = 0

a(1) = alpha1[3^1] + alpha2[(-1)^1] = 3*alpha1 - alpha2 = 2

to solve for alpha1 and alpha2, but I keep getting fractions for both alphas... (I got alpha1 = 1/2 and alpha2 = -(1/2))
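A quick sanity check (my own sketch, not part of the original question): the quadratic formula gives the characteristic roots directly, and since alpha2 = -alpha1 from the first initial condition, the second condition pins down alpha1 in one step:

```python
# Sanity check for r^2 - 2r - 3 = 0 with a(0) = 0, a(1) = 2.
import math

# quadratic formula: r = (2 +/- sqrt(4 + 12)) / 2
disc = math.sqrt((-2) ** 2 - 4 * 1 * (-3))  # sqrt(16) = 4.0
r1 = (2 + disc) / 2  # 3.0
r2 = (2 - disc) / 2  # -1.0

# a(0) = alpha1 + alpha2 = 0  =>  alpha2 = -alpha1
# a(1) = alpha1*r1 + alpha2*r2 = alpha1*(r1 - r2) = 2
alpha1 = 2 / (r1 - r2)  # 0.5
alpha2 = -alpha1        # -0.5
print(r1, r2, alpha1, alpha2)  # -> 3.0 -1.0 0.5 -0.5
```

Fractional alphas are nothing to worry about here; the closed form can still produce integers for every n.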

Is this right? Because most of the examples in my textbook have the alphas come out to integers...

I'm not sure if I messed up the calculations.
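One way to settle that doubt (again my own check, not from the thread): iterate the recurrence directly and compare it against the closed form a(n) = (1/2)*3^n - (1/2)*(-1)^n. If the alphas were wrong, the two would disagree at some small n:

```python
# Compare the recurrence a(n) = 2*a(n-1) + 3*a(n-2), a(0)=0, a(1)=2,
# with the closed form (3^n - (-1)^n) / 2 (alpha1 = 1/2, alpha2 = -1/2).

def a_recurrence(n: int) -> int:
    prev2, prev1 = 0, 2  # a(0), a(1)
    if n == 0:
        return prev2
    for _ in range(n - 1):
        prev2, prev1 = prev1, 2 * prev1 + 3 * prev2
    return prev1

def a_closed(n: int) -> int:
    # 3**n - (-1)**n is always even, so integer division is exact
    return (3 ** n - (-1) ** n) // 2

for n in range(10):
    assert a_recurrence(n) == a_closed(n)
print("closed form matches the recurrence for n = 0..9")
```

The values match (0, 2, 4, 14, 40, ...), so the fractional alphas are correct: the halves cancel and every term is an integer.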