I'm not really sure where to post this...

Basically, I've used Maple to plot the Mandelbrot set.

The way it's done is that I have a 10000x10000 array filled with values between 0 and 1, where each value represents the colour of a pixel on a grid: 0 stands for a black pixel and is part of the set, 1 is white, and all other entries are a shade of grey depending on how many iterations they take to diverge.
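The scheme above can be sketched like this (a minimal Python version rather than my Maple code, on a much smaller grid so it runs quickly; the region $[-2, 0.5] \times [-1.25, 1.25]$ and the greyscale mapping are my assumptions about the details):

```python
def escape_shades(n=200, max_iter=100):
    # Returns an n x n grid of values in [0, 1]:
    #   0.0  -> never escaped within max_iter (counted as inside, black)
    #   1.0  -> escaped on the very first iteration (white)
    #   else -> grey shade, darker the longer the point takes to diverge
    shades = []
    for j in range(n):
        y = -1.25 + 2.5 * j / (n - 1)   # imaginary part of c
        row = []
        for i in range(n):
            x = -2.0 + 2.5 * i / (n - 1)   # real part of c
            zr = zi = 0.0
            k = 0   # 0 means "never escaped"
            for it in range(1, max_iter + 1):
                # z -> z^2 + c, done on real/imaginary parts
                zr, zi = zr * zr - zi * zi + x, 2 * zr * zi + y
                if zr * zr + zi * zi > 4.0:   # |z| > 2 guarantees divergence
                    k = it
                    break
            row.append(0.0 if k == 0 else 1.0 - (k - 1) / max_iter)
        shades.append(row)
    return shades
```

Any point whose orbit exceeds $|z| = 2$ is guaranteed to diverge, which is why the escape test is `zr*zr + zi*zi > 4`.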

To find the area I simply counted the number of 0's in the array and scaled by the area of a single pixel (the result differed from the accepted value by only 0.002%). How would I go about estimating the error in this? As in, I'm restricted by the number of iterations I can use, so some pixels counted as black may NOT actually be black if I had a powerful enough computer to increase the max number of iterations.
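The counting step might look like the following (again a small Python sketch, not my Maple code; the bounding region $[-2, 0.5] \times [-1.25, 1.25]$, which has area 6.25 and contains the whole set, is my assumption). One useful observation about the iteration error: on a fixed grid, raising `max_iter` can only ever move pixels from "inside" to "escaped", so the estimate decreases monotonically, i.e. the finite-iteration error is one-sided (an overestimate), and you can gauge its size by watching how the estimate shrinks as `max_iter` grows:

```python
def mandelbrot_area(n=200, max_iter=100):
    # Pixel-counting estimate: count the points that never escape
    # within max_iter, then multiply by the area of one pixel.
    inside = 0
    for j in range(n):
        y = -1.25 + 2.5 * j / (n - 1)
        for i in range(n):
            x = -2.0 + 2.5 * i / (n - 1)
            zr = zi = 0.0
            for _ in range(max_iter):
                zr, zi = zr * zr - zi * zi + x, 2 * zr * zi + y
                if zr * zr + zi * zi > 4.0:
                    break
            else:
                inside += 1   # never escaped: counted as in the set
    return inside * 6.25 / (n * n)   # 6.25 = area of the bounding region

# The estimate can only decrease as max_iter increases:
for m in (50, 100, 200):
    print(m, mandelbrot_area(150, m))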

The value calculated by pixel counting (using a massively more powerful computer than mine) is $1.50659177 \pm 0.00000008$.

The value I calculated was 1.506628382.