I'm making an application where I can automatically record rainfall using a very sensitive water pressure sensor at the bottom of a rain gauge. The pressure is directly proportional to the vertical height of the water, and because the rain gauge is conical, this is not directly proportional to the actual amount of rainfall.

I'm trying to work out how I could relate the pressure reading (which is essentially a number that increases linearly with the height of the water) to the actual rainfall, in mm.

I know that the volume of water filling the cone to depth h is V = (1/3)*pi*r^2*h, where r is the radius of the water surface at that depth (which itself grows linearly with h). By equating that to V = pi*R^2*d, where R is the radius of the top of the rain gauge, I can solve for d, the actual rainfall depth. The only quantity I actually measure is h, the vertical height of the water, which is given to me by the pressure reading (a scaled factor of the height).
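To sketch what I mean (this assumes the gauge is a full cone with its apex at the bottom; R, H, and the calibration factor k are my own symbols, where H is the full height of the cone and h = k * pressure):

```python
def rainfall_mm(pressure, k, H):
    """Convert a raw pressure reading to equivalent rainfall depth.

    Assumes a full cone with its apex at the bottom.
    pressure : raw sensor reading, proportional to water height
    k        : scale factor so that h = k * pressure (mm per count)
    H        : full height of the cone (mm)

    Water at depth h fills a similar cone whose surface radius is
    R*h/H, so its volume is
        V = (1/3) * pi * (R*h/H)**2 * h
    Spreading V over the top opening (area pi*R**2) gives
        d = V / (pi*R**2) = h**3 / (3*H**2)
    Note that R cancels, so only H and k are needed.
    """
    h = k * pressure
    return h**3 / (3 * H**2)
```

If this is right, the rainfall depends on the cube of the pressure reading with no quadratic or linear terms at all, which might be why a general polynomial fit in Excel behaved oddly.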

By calibrating the application initially, which would amount to recording the pressure readings at various known rainfall amounts, would I be able to come up with a quadratic (or cubic) function that relates the pressure reading to the actual rainfall amount? I thought this would be the case, but I couldn't get it to work out with some test numbers in an Excel spreadsheet.
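For the calibration step, if the relationship really is a pure cubic d = a*p^3 (no lower-order terms when the apex is at the bottom), then only a single coefficient needs fitting, which has a closed-form least-squares solution. Here's a sketch with made-up calibration pairs (numbers chosen to be consistent with H = 100 units and k = 1):

```python
# Hypothetical calibration data: (pressure reading, measured rainfall in mm).
calibration = [(10.0, 0.0333), (20.0, 0.2667), (30.0, 0.9), (40.0, 2.1333)]

# Least-squares fit of d = a * p**3 (single coefficient, closed form).
num = sum(d * p**3 for p, d in calibration)
den = sum(p**6 for p, _ in calibration)
a = num / den

def pressure_to_rainfall(pressure):
    """Rainfall depth in mm from a raw pressure reading."""
    return a * pressure**3
```

A general cubic fit (with quadratic, linear, and constant terms) would also work, but the extra terms should come out near zero, and forcing them to zero makes the fit much less sensitive to noise in the calibration data.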

I could of course sidestep the problem by using a cylindrical container, but I like the fact that the conical shape gives me better resolution at low rainfall amounts.

Any help is appreciated! Thanks.