To check whether X and Y are independent random variables:

If h(xi, yj) = f(xi) * g(yj) for every pair of indices i and j, then X and Y are independent;

otherwise, X and Y are not independent random variables,

where h(xi, yj) is the joint distribution and i and j index the possible values,

f(xi) is the probability distribution of X, and

g(yj) is the probability distribution of Y.
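As a sketch of this criterion, the check below builds the marginals f and g from a small hypothetical joint table h and tests whether h factors as their product for every pair (the table here is made up for illustration):

```python
from itertools import product

# Hypothetical joint distribution h(xi, yj); values chosen so X and Y
# happen to be independent.
h = {(x, y): 0.25 for x, y in product([0, 1], [0, 1])}

# Marginals: f(xi) = sum over j of h(xi, yj); g(yj) = sum over i of h(xi, yj)
f, g = {}, {}
for (x, y), p in h.items():
    f[x] = f.get(x, 0.0) + p
    g[y] = g.get(y, 0.0) + p

# Independence criterion: h(xi, yj) == f(xi) * g(yj) for every pair
independent = all(abs(h[(x, y)] - f[x] * g[y]) < 1e-12 for (x, y) in h)
print(independent)  # True for this particular table
```

Changing any single entry of h (while keeping the total at 1) would generally break the factorization and make the check print False.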

The distribution of X+Y is obtained by summing h(xi, yj) over all pairs with xi + yj = s, for each possible value s of the sum.
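That summation can be sketched directly: for each pair in a (hypothetical) joint table, accumulate its probability into the bucket for xi + yj:

```python
# Hypothetical joint table h(xi, yj) for two fair binary variables
h = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

# P(X + Y = s) = sum of h(xi, yj) over all pairs with xi + yj = s
dist_sum = {}
for (x, y), p in h.items():
    s = x + y
    dist_sum[s] = dist_sum.get(s, 0.0) + p

print(dist_sum)  # {0: 0.25, 1: 0.5, 2: 0.25}
```

Note that the pairs (0, 1) and (1, 0) both land in the s = 1 bucket, which is why that probability is twice the others.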

I do not quite get your question about 1 being independent of X+Y, as I have not come across that before.

If 1 appears as a probability in a distribution, then that distribution must be concentrated on a single value, since 0 <= P <= 1 and the probabilities sum to 1.

If this is the case, then P(X = xi) = 1 and P(Y = yj) = 1, so P(X = xi) * P(Y = yj) = h(xi, yj) = 1, and the factorization holds trivially.
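One way to see the degenerate case concretely: if Y is the constant 1 (it takes the value 1 with probability 1), then the event {X = xi, Y = 1} is just {X = xi}, so h(xi, 1) = f(xi) while g(1) = 1, and the factorization h = f * g holds automatically. A sketch with a made-up distribution for X:

```python
f = {0: 0.3, 1: 0.7}   # hypothetical distribution of X
g = {1: 1.0}           # Y is the constant 1: P(Y = 1) = 1

# Since Y is always 1, the joint probability of (xi, 1) is just f(xi)
h = {(x, 1): f[x] for x in f}

# The factorization h(xi, yj) == f(xi) * g(yj) holds for every pair
factorizes = all(abs(h[(x, y)] - f[x] * g[y]) < 1e-12 for (x, y) in h)
print(factorizes)  # True: a constant variable is independent of any X
```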