It looks like a variation of the Riemann integral to me.
Let me give an example.
Say we want to find the quadratic variation of f(t) = t on the interval [0,1].
We partition the interval (just as in the Riemann integral); I am going to use left endpoints because that is what they (Wikipedia) use.
So divide [0,1] into n parts, each of width 1/n.
The initial point is x_0 = 0, then x_1 = 1/n, x_2 = 2/n, and so on. You get the idea:
x_k = k/n
(because of left endpoints).
Now, we sum the squared increments:
<f>_1 = SUM_{k=1}^{n} [f(x_k) - f(x_{k-1})]^2 = 1/n^2 + 1/n^2 + ... + 1/n^2 = n*(1/n^2) = 1/n
Now, we take the limit as n --> infinity.
This clearly goes to zero.
Thus, the quadratic variation of f(t) = t at 1 (meaning on the interval [0,1]) is zero.
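If you want to see this numerically, here is a quick sketch (my own, not from the article; the helper name quad_variation_sum is just something I made up). It computes the sum of squared increments over a uniform n-part partition of [0, 1] and shows it shrinking for f(t) = t:

```python
def quad_variation_sum(f, n):
    """Sum of squared increments of f over a uniform n-part partition of [0, 1].

    (Hypothetical helper for illustration; not part of the original post.)
    """
    return sum((f(k / n) - f((k - 1) / n)) ** 2 for k in range(1, n + 1))

for n in (10, 100, 1000):
    # For f(t) = t, each increment is exactly 1/n, so the sum is n*(1/n)^2 = 1/n.
    print(n, quad_variation_sum(lambda t: t, n))
```

As n grows, the printed sums march toward zero like 1/n, matching the limit above.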
Now, they mention an interesting theorem.
If f(t) is continuously differentiable, then its quadratic variation is zero.
Precisely what we got: zero.
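The same kind of numerical check (again my own sketch, with the assumed helper name quad_variation_sum) illustrates the theorem on another smooth function, sin(t): each increment is roughly f'(t)*(1/n), so the n squared increments sum to something on the order of 1/n, which vanishes.

```python
import math

def quad_variation_sum(f, n):
    """Sum of squared increments of f over a uniform n-part partition of [0, 1].

    (Hypothetical helper for illustration; not part of the original post.)
    """
    return sum((f(k / n) - f((k - 1) / n)) ** 2 for k in range(1, n + 1))

for n in (10, 100, 1000):
    # Each term is about (cos(t)/n)^2, so the whole sum is O(1/n).
    print(n, quad_variation_sum(math.sin, n))
```

Refining the partition by a factor of 10 shrinks the sum by roughly the same factor, just as the theorem predicts for a differentiable function.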
I hope this helps somehow.