At the start of their derivation, note that the sum runs from $k = 0$ to $n$. That's why they take $x_k = k$. The part with $x_k$ is just the general definition.
After that, the first term is the $k = 0$ term, because they are considering values of $k$ from $0$ to $n$.
Wikipedia: Algebraic derivation of mean says: "The first term of the series (with index k = 0) has value 0 since the first factor, k, is zero."

Sure, $X$ is "a discrete random variable", but does that inevitably mean that $x_k = k$? Couldn't the $x_k$ all be, say, multiples of 3 ($x_k = 3k$)?

And in the next step they use this to say $\operatorname{E}[X] = \sum_{k=0}^{n} k \binom{n}{k} p^k (1-p)^{n-k}$, which is an essential part of proving that $\operatorname{E}[X] = np$ if $x_k = k$ (which is why I'd like to understand why $x_k$ should be $k$).
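A quick numeric sanity check (my own Python sketch, not from the Wikipedia page; the values of $n$ and $p$ are arbitrary): plugging $x_k = k$ into the general definition $\operatorname{E}[X] = \sum_k x_k \Pr(X = x_k)$ reproduces $np$, while choosing the values $3k$ instead gives $3np$, i.e. the mean of a different variable $Y = 3X$, not of the success count itself:

```python
from math import comb

n, p = 10, 0.3  # arbitrary example values, not from the thread

def binom_pmf(k, n, p):
    # Pr(X = k) for a binomial(n, p) random variable
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# General definition E[X] = sum of x_k * Pr(X = x_k), with x_k = k:
mean = sum(k * binom_pmf(k, n, p) for k in range(n + 1))
assert abs(mean - n * p) < 1e-9  # matches np = 3.0 up to rounding

# If the values were multiples of 3 (x_k = 3k), the same definition
# yields the mean of Y = 3X, which is 3np -- a different variable.
mean_3k = sum(3 * k * binom_pmf(k, n, p) for k in range(n + 1))
assert abs(mean_3k - 3 * n * p) < 1e-9
```

So the definition works for any choice of values; $x_k = k$ is simply what $X$ actually takes when it counts successes.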
Look back at the Wikipedia page. You're focusing on the equations after the line saying "We apply the definition...". Look at the paragraph before this, starting with "We derive these quantities from first principles...". The first equation after that text is a sum from $k = 0$ to $n$. That's what defines this derivation.
Let me see: $k$ here is "how many times an event happens", $p^k (1-p)^{n-k}$ is the probability of one particular arrangement of $k$ successes, and $\binom{n}{k}$ gives all arrangements. This is correct, right?
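That reading can be checked numerically (a sketch of mine, with arbitrary $n$ and $p$): summing $\binom{n}{k} p^k (1-p)^{n-k}$ over all $k$ must give 1, since the $\binom{n}{k}$ arrangements for each $k$ together cover every possible outcome:

```python
from math import comb

n, p = 10, 0.3  # arbitrary example values

# p**k * (1 - p)**(n - k) is the probability of ONE particular
# arrangement of k successes among n trials; comb(n, k) counts
# how many such arrangements exist.
total = sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1))
assert abs(total - 1.0) < 1e-12  # all arrangements exhaust the sample space
```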
Now, "we apply the definition of the expected value of a discrete random variable to the binomial distribution". ....
... I see now. The case of, say, a die with six sides doesn't apply to the binomial distribution (which deals only with "yes/no" type outcomes).
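One way to see the distinction (a simulation sketch of mine, not from the thread): the face shown by a six-sided die is not a yes/no variable, but the question "did this roll show a six?" is, so the count of sixes across many rolls is binomial with $p = 1/6$:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible
n_rolls = 10_000

# Each roll is reduced to a yes/no event ("is it a six?"), which is
# what makes the COUNT of sixes binomial(n_rolls, 1/6), even though
# the die itself has six outcomes.
sixes = sum(1 for _ in range(n_rolls) if random.randint(1, 6) == 6)
print(sixes / n_rolls)  # should be close to 1/6
```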
Have I got it now?