if x is an eigenvector for the matrix A, then Ax = λ_{1}x, for some scalar λ_{1}.

if x also satisfies Ax = λ_{2}x, then we have: 0 = Ax - Ax = λ_{1}x - λ_{2}x = (λ_{1}-λ_{2})x.

now eigenvectors cannot be 0 (by definition), so (λ_{1}-λ_{2})x = 0 with x ≠ 0 forces λ_{1}-λ_{2} = 0 (a scalar times a nonzero vector is 0 only if the scalar is 0): that is, λ_{1} = λ_{2}.

so an eigenvector belongs to exactly ONE eigenvalue.

it is possible, however, for an eigenvalue to have TWO (or more) linearly independent eigenvectors.

consider the matrix

I = [1 0]
    [0 1].

it should be clear that both (1,0) and (0,1) are eigenvectors with the same eigenvalue, 1, and that these two vectors are linearly independent.
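as a quick numeric sanity check of the identity-matrix example (a minimal sketch, assuming numpy is available):

```python
import numpy as np

I = np.eye(2)  # the 2x2 identity matrix

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# both (1,0) and (0,1) are eigenvectors with eigenvalue 1:
# I @ e equals 1 * e for each of them
print(np.allclose(I @ e1, 1 * e1))  # True
print(np.allclose(I @ e2, 1 * e2))  # True

# the two eigenvectors are linearly independent: the matrix with
# e1, e2 as columns has nonzero determinant
print(np.linalg.det(np.column_stack([e1, e2])))  # 1.0
```

so here one eigenvalue (1) has two linearly independent eigenvectors, consistent with the claim above.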