For each integer n >= 0, let a(n) be the number of bit strings of length n that do not contain the pattern 101.

a.) Show that a(k) = a(k-1) + a(k-3) + a(k-4) + ... + a(0) + 2, for all integers k >= 3.

b.) Use the result of part (a) to show that if k >= 3, then a(k) = 2a(k-1) - a(k-2) + a(k-3).

**everything in parentheses is treated as a subscript

This question is totally unlike anything else from this section of the book, and I really have no idea where to even start. Part (a) almost looks like something I could do by induction, but I don't know if that would be correct. The only thing I've done so far is write out the valid strings for the first few lengths (I don't even know if I'm supposed to do this), and I came up with:

Length 0: ∊

Length 1: 0, 1

Length 2: 11, 10, 01, 00

Length 3: 111, 110, 100, 000, 001, 011, 010 (101 is excluded, obviously).
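Not from the book — just a quick brute-force check I wrote (the function name `a` is my own) to extend the table above past what I can comfortably list by hand:

```python
from itertools import product

def a(n):
    """Brute-force count of length-n bit strings that avoid the pattern 101."""
    return sum(1 for bits in product("01", repeat=n)
               if "101" not in "".join(bits))

print([a(n) for n in range(8)])  # → [1, 2, 4, 7, 12, 21, 37, 65]
```

So the counts 1, 2, 4, 7 match my hand enumeration, and the next terms are 12, 21, 37, 65.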

I also think a recursive formula I can write is

a(k) = 2a(k-1) + 2a(k-2) + a(k-3) (I think, not sure if this is correct)
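As a sanity check (again just my own scratch work, with my own helper name `count`), I compared a brute-force counter against the two formulas the problem asks me to prove, for small k:

```python
from itertools import product

def count(n):
    # brute-force: count length-n bit strings with no 101 substring
    return sum("101" not in "".join(bits) for bits in product("01", repeat=n))

vals = [count(n) for n in range(10)]

# part (a): a(k) = a(k-1) + (a(k-3) + a(k-4) + ... + a(0)) + 2
ok_a = all(vals[k] == vals[k-1] + sum(vals[:k-2]) + 2 for k in range(3, 10))

# part (b): a(k) = 2a(k-1) - a(k-2) + a(k-3)
ok_b = all(vals[k] == 2 * vals[k-1] - vals[k-2] + vals[k-3] for k in range(3, 10))

print(ok_a, ok_b)  # → True True
```

Both identities do hold numerically for k up to 9, so at least I know what I'm trying to prove is true — I just don't see how to prove it.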