Hey everyone

I'm trying to determine the complexity of the following algorithm:

Code:

    A = 0:n;
    for i = 0:n-1
        s = 0;
        for j = 1:i
            s = s + X(j);
        end
        A(i+1) = s / (i+1);
    end
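To sanity-check what the loop does, here is a rough Python port of the snippet above (my translation, not the original code), instrumented to count how many times the inner-loop body `s = s + X(j)` actually executes:

```python
def prefix_averages(X):
    n = len(X)
    A = [0.0] * n
    ops = 0                 # counts executions of the inner statement s = s + X(j)
    for i in range(n):      # MATLAB's  i = 0 : n-1
        s = 0.0
        for j in range(i):  # MATLAB's  j = 1 : i  (the body runs i times)
            s += X[j]
            ops += 1
        A[i] = s / (i + 1)  # MATLAB's  A(i+1) = s / (i+1)
    return A, ops

A, ops = prefix_averages([1, 2, 3, 4, 5])
# The inner body runs 0 + 1 + 2 + ... + (n-1) = n(n-1)/2 times;
# for n = 5 that is 5*4/2 = 10, and ops is indeed 10.
```

Counting the inner-loop executions directly like this is one way to compare the operation count against whatever closed-form expression you derived on paper.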

I came up with the following:

But something doesn't seem right here, since I've got a double for-loop. Shouldn't I be getting O(n^2)?

Also, I'm supposed to find the complexity of the algorithm both theoretically and practically. What exactly is the difference between analyzing the algorithm on paper and measuring it after I've coded it? The instructions are exactly the same either way.
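For the "practical" part, one common reading is to time the implementation for growing input sizes and see how the runtime scales. A minimal sketch (in Python rather than MATLAB, with `run` as a hypothetical stand-in for the snippet above, run on an all-ones input):

```python
import time

def run(n):
    # Same double loop as the MATLAB snippet, on X = [1, 1, ..., 1].
    X = [1.0] * n
    A = [0.0] * n
    for i in range(n):
        s = 0.0
        for j in range(i):
            s += X[j]
        A[i] = s / (i + 1)
    return A

# Time the loop while doubling n; a quadratic algorithm should roughly
# quadruple in runtime at each step (modulo timer noise and constant overheads).
for n in (500, 1000, 2000):
    t0 = time.perf_counter()
    run(n)
    print(n, time.perf_counter() - t0)
```

Plotting these timings against n (or against n^2) would then let you compare the measured growth with the theoretical bound.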