You have $\displaystyle C_p v_i = \lambda_i v_i$ so $\displaystyle C_p \begin{bmatrix}v_1 & v_2 & \cdots & v_n\end{bmatrix} = \begin{bmatrix}\lambda_1 v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n \end{bmatrix}$ simply by equating columns.
The LHS is $\displaystyle C_p P$. Show that the RHS is $\displaystyle PD$. (Simply multiply it out).
Now, you just need to show that $\displaystyle P$ is invertible.
No typo. $\displaystyle P$ is the matrix of column vectors $\displaystyle v_1, \ldots, v_n$, so it is an $\displaystyle n \times n$ matrix, as well.
There are many ways to show a matrix is invertible. Can you show these eigenvectors are linearly independent? The key is the fact that each eigenvector spans an eigenspace that is invariant under $\displaystyle C_p$. Assume there is a dependency and show that the "invariant" space goes to zero for at most one eigenvalue.
Can you check whether my proof is correct? You asked: "Can you show these eigenvectors are linearly independent?"
Suppose some linear combination of the eigenvectors with weights $\displaystyle c_1, \ldots, c_k$ equals zero:
$\displaystyle c_1 v_1 + \cdots + c_{k-1}v_{k-1} + c_k v_k = 0$
If we multiply the above expression on the left by the matrix $\displaystyle C_p$, we get:
$\displaystyle c_1 C_p v_1 + \cdots + c_{k-1} C_p v_{k-1} + c_k C_p v_k = 0$
Since each $\displaystyle v_i$ is an eigenvector with eigenvalue $\displaystyle \lambda_i$, we have $\displaystyle C_p v_i = \lambda_i v_i$, so this becomes:
$\displaystyle c_1\lambda_1 v_1 + \cdots + c_{k-1}\lambda_{k-1} v_{k-1} + c_k\lambda_k v_k = 0$
Now multiply the first equation by just $\displaystyle \lambda_k$:
$\displaystyle c_1\lambda_k v_1 + \cdots + c_{k-1}\lambda_{k} v_{k-1} + c_k\lambda_k v_k = 0$
Subtracting the two expressions, we get:
$\displaystyle c_1(\lambda_1 - \lambda_k) v_1 + \cdots + c_{k-1}(\lambda_{k-1} - \lambda_k) v_{k-1} = 0$
By induction on $\displaystyle k$ we may assume that $\displaystyle v_1, \ldots, v_{k-1}$ are linearly independent (the base case is a single eigenvector, which is nonzero by definition), so the equality above only holds if $\displaystyle c_i(\lambda_i - \lambda_k) = 0$ for each $\displaystyle i$. Since we assume from the question that the eigenvalues $\displaystyle (\lambda_1, \lambda_2, \ldots, \lambda_k)$ of the matrix are distinct, each factor $\displaystyle \lambda_i - \lambda_k \ne 0$, so it follows that $\displaystyle c_1 = c_2 = \cdots = c_{k-1} = 0$. With this established we can relate back to the original equation:
$\displaystyle c_1 v_1 + \cdots + c_{k-1}v_{k-1} + c_k v_k = 0$ $\displaystyle \rightarrow c_k v_k = 0$ since $\displaystyle c_1 = c_2 = \cdots = c_{k-1} = 0$. Because the eigenvector $\displaystyle v_k \ne 0$, this forces $\displaystyle c_k = 0$ as well, hence showing the linear independence of the eigenvectors.
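As a quick numerical sanity check of the conclusion (my own toy example, not from the question): for distinct eigenvalues, the matrix whose columns are the eigenvectors has nonzero determinant, which is equivalent to its columns being linearly independent. The eigenvectors $\displaystyle (1, 1)$ and $\displaystyle (1, 2)$ below are an illustrative choice.

```python
# Sanity check: eigenvectors for distinct eigenvalues form an invertible matrix.
# The 2x2 example and the helper name det2 are illustrative, not from the thread.

def det2(M):
    """Determinant of a 2x2 matrix given as a list of rows."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

v1, v2 = [1, 1], [1, 2]                 # eigenvectors for distinct eigenvalues 1 and 2
P = [list(row) for row in zip(v1, v2)]  # place the eigenvectors as columns

print(det2(P))  # nonzero, so the columns v1, v2 are linearly independent
```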
Is this proof sufficient to show that the eigenvectors are indeed linearly independent? Even if I have established the fact that the eigenvectors are linearly independent, how does it relate to the invertibility of the matrix $\displaystyle P$?
Here are some basics about matrix multiplication. Suppose you have a matrix $\displaystyle A$ and two sets of vectors $\displaystyle x_1,x_2,\ldots, x_k$ and $\displaystyle b_1,b_2, \ldots, b_k$ with $\displaystyle A x_i = b_i$ for each $\displaystyle i=1,\ldots, k$. Let $\displaystyle X$ be the matrix whose columns are $\displaystyle x_1,x_2,\ldots, x_k$. Let $\displaystyle B$ be the matrix whose columns are $\displaystyle b_1,b_2,\ldots, b_k$. Then $\displaystyle AX = B$. I recommend trying that out in the 2x2 case for $\displaystyle A,X,$ and $\displaystyle B$. It follows almost trivially from how matrix multiplication works.
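Here is the suggested 2x2 experiment as a short script (the matrices and helper names are my own example): compute each $\displaystyle b_i = A x_i$ separately, stack the $\displaystyle x_i$ and $\displaystyle b_i$ as columns, and confirm the products agree column by column.

```python
# 2x2 check of the column rule: if A x_i = b_i for each i, then A X = B,
# where X and B have the x_i and b_i as columns. Example values are my own.

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def cols_to_matrix(cols):
    """Build a matrix (list of rows) from a list of column vectors."""
    return [list(row) for row in zip(*cols)]

A = [[2, 1],
     [0, 3]]
x1, x2 = [1, 0], [1, 1]

# Compute b_i = A x_i one column at a time
b1 = [sum(A[i][k] * x1[k] for k in range(2)) for i in range(2)]
b2 = [sum(A[i][k] * x2[k] for k in range(2)) for i in range(2)]

X = cols_to_matrix([x1, x2])
B = cols_to_matrix([b1, b2])

print(matmul(A, X) == B)  # True: the product AX matches B column by column
```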
So, $\displaystyle P$ is just a matrix whose columns are $\displaystyle v_1,v_2,\ldots, v_n$. So, the multiplication $\displaystyle C_pP$ will give $\displaystyle \begin{bmatrix}C_p v_1 & C_p v_2 & \cdots & C_p v_n\end{bmatrix}$. The $\displaystyle i$-th column is just the result of the matrix $\displaystyle C_p$ times the $\displaystyle i$-th column of $\displaystyle P$. That is how matrix multiplication is defined.
Now, consider the $\displaystyle i$-th column. It is $\displaystyle C_p v_i$. Because $\displaystyle v_i$ is an eigenvector, $\displaystyle C_p v_i = \lambda_i v_i$. So, we can rewrite the matrix:
$\displaystyle \begin{bmatrix} C_p v_1 & C_p v_2 & \cdots & C_p v_n\end{bmatrix} = \begin{bmatrix} \lambda_1 v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n\end{bmatrix}$
Now, consider $\displaystyle PD = \begin{bmatrix} v_1 & v_2 & \cdots & v_n\end{bmatrix}\begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n\end{bmatrix}$. Break it down column by column.
$\displaystyle P\begin{bmatrix}\lambda_1 \\ 0 \\ \vdots \\ 0\end{bmatrix} = \lambda_1 v_1 + 0\cdot v_2 + \cdots + 0\cdot v_n = \lambda_1v_1$
$\displaystyle P\begin{bmatrix} 0 \\ \vdots \\ \lambda_i \\ \vdots \\ 0\end{bmatrix} = 0\cdot v_1 + \cdots + \lambda_i v_i + \cdots + 0\cdot v_n = \lambda_i v_i$
So, $\displaystyle PD = \begin{bmatrix}\lambda_1v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n\end{bmatrix}$.
Hence, $\displaystyle C_p P = PD$ as desired.