Verifying Subsets as Subspaces: the Scalar Multiplication Test

I am having trouble with this example in the book (Intro to Linear Algebra by Johnson, Riess, and Arnold, fifth edition, p. 174):

"Let W be the subset of R^2 defined by W = {x : x = [x1, x2], x1 and x2 any integers}. Demonstrate that W is not a subspace of R^2."

W passes the zero-vector test and the x+y (closure under addition) test. However, it fails the scalar multiplication test. The book says that if a = 1/2, then x is in W but ax is not. I don't understand this. Does that mean that if a = 3/4, then ax is not in W either? What if a = 3 — is ax in W then? Is ax in W only when a = 1?
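To see what happens for different scalars, I wrote a small check of my own (not from the book): a vector is in W exactly when both of its components are integers, so we can test whether scaling keeps us inside W.

```python
def in_W(v):
    """A vector is in W if both components are integers."""
    return all(float(c).is_integer() for c in v)

def scale(a, v):
    """Scalar multiple a*v, computed componentwise."""
    return [a * c for c in v]

x = [1, 1]  # x is in W: both components are integers

print(in_W(x))                   # True
print(in_W(scale(0.5, x)))       # False: [0.5, 0.5] has non-integer components
print(in_W(scale(3, x)))         # True: [3, 3] happens to have integer components
print(in_W(scale(0.5, [2, 2])))  # True: for SOME x, even a = 1/2 lands back in W
```

The point this seems to illustrate is that closure requires ax to be in W for every scalar a and every x in W; a single failing pair (here a = 1/2 with x = [1, 1]) is enough to rule W out.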

Why can we say that if a = 1/2, then x is in W? I'm not sure how they came to that conclusion.

Thanks very much :)