A laser length tool has an average error of 0 cm and a standard deviation of error of 1 cm.

We are measuring the length of a table and repeat the measurement 20 times. What is the probability that the error is less than 0.5 cm?

I understand it like this:

The error X of a single laser measurement is normally distributed: X ~ N(0, 1), i.e. mean 0 cm and standard deviation 1 cm.

We take the average of the 20 measurements of the table; the mean error X̄ = (X₁ + … + X₂₀)/20 is then also normally distributed, with mean 0 cm and standard deviation 1/√20 ≈ 0.224 cm.

So I standardize with Y = √20 · X̄ ~ N(0, 1), look up the standard normal distribution table, and get P(|X̄| < 0.5) = P(|Y| < 0.5·√20) = 2Φ(2.24) − 1 ≈ 0.975.
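To sanity-check the arithmetic, here is a small numerical sketch (assuming the question is about the error of the averaged measurement): it computes the probability exactly via the error function and also estimates it by simulating many runs of 20 measurements.

```python
import math
import numpy as np

# Exact: the mean of 20 i.i.d. N(0, 1 cm) errors has std dev 1/sqrt(20) cm,
# so P(|mean error| < 0.5) = 2*Phi(0.5*sqrt(20)) - 1 = erf(0.5*sqrt(20)/sqrt(2)).
z = 0.5 * math.sqrt(20)
exact = math.erf(z / math.sqrt(2))  # about 0.975
print(f"exact:     {exact:.4f}")

# Monte Carlo check: simulate 200,000 experiments of 20 measurements each.
rng = np.random.default_rng(0)
mean_error = rng.normal(0.0, 1.0, size=(200_000, 20)).mean(axis=1)
simulated = np.mean(np.abs(mean_error) < 0.5)
print(f"simulated: {simulated:.4f}")
```

Both numbers should agree to two or three decimal places, which supports the table-lookup result.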

Is this correct, or have I missed something?