In analog-to-digital conversion, an analog waveform is sampled, quantized, and coded. A quantizer is a function that assigns to each sample value x a value y from a generally finite set of predetermined values. Consider the quantizer defined by g(x) = [x] + 1, where [x] denotes the greatest integer less than or equal to x. Suppose that X has a standard normal distribution and put Y = g(X). Specify the distribution of Y. Ignore values of Y for which the probability is essentially zero.
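To make sure I understand the quantizer itself, here is a minimal sketch of g in Python (the function name `g` is just taken from the problem statement); since [x] is the floor function, g maps every x in the interval [n, n+1) to the integer n + 1:

```python
import math

def g(x):
    """Quantizer from the problem: g(x) = floor(x) + 1."""
    return math.floor(x) + 1

# A few sample values, to see the behavior on an interval:
# every x in [0, 1) maps to 1, every x in [-1, 0) maps to 0, etc.
print(g(0.3))   # x in [0, 1)  -> 1
print(g(-0.2))  # x in [-1, 0) -> 0
print(g(2.7))   # x in [2, 3)  -> 3
```

So g is constant on each unit interval, which is why it cannot be one-to-one.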

Going by how the book taught it, I would start this problem by computing the inverse of g(x). However, this function has no inverse, since it is not one-to-one. Any suggestions on how to proceed?