Hi all, I have a few parts to a question that I'm not too sure about, and I was wondering if anyone could help me out.
I'll start with one and add more in if I still need help.
I'm having trouble deriving $E(X)$ for the log-normal distribution.
I believe I should be using the moment generating function of the normal distribution and applying the transformation $X = e^Y$? But I'm not entirely sure how to use them both to derive it.
As you said, $X = e^Y$, where $Y$ follows a normal distribution $\mathcal{N}(\mu, \sigma^2)$. So we're looking for $E(X) = E(e^Y)$.
For more convenience, we can consider $Z \sim \mathcal{N}(0,1)$. And then, $Y$ follows the same distribution as $\mu + \sigma Z$.
We'll calculate $E(e^{\mu + \sigma Z})$.
And from the above, we'll have $E(X) = E(e^{\mu + \sigma Z}) = e^{\mu}\, E(e^{\sigma Z})$.
$Z$'s pdf is $f_Z(z) = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}$.
Using the law of the unconscious statistician (at last I can put a name on it!!!), we have:
$$E(e^{\sigma Z}) = \int_{-\infty}^{\infty} e^{\sigma z}\, \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz.$$
Complete the square: $\sigma z - \frac{z^2}{2} = \frac{\sigma^2}{2} - \frac{(z-\sigma)^2}{2}$, so
$$E(e^{\sigma Z}) = e^{\sigma^2/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-(z-\sigma)^2/2}\, dz.$$
But the thing in the integral is exactly the pdf of a normal distribution $\mathcal{N}(\sigma, 1)$!
Hence this integral is 1, and therefore $E(X) = e^{\mu + \sigma^2/2}$.
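Not part of the question, but if you want to convince yourself numerically that $E(X) = e^{\mu + \sigma^2/2}$, a quick Monte Carlo sanity check does it (the values of $\mu$ and $\sigma$ below are arbitrary, just for illustration):

```python
import math
import random

# Arbitrary example parameters for Y ~ N(mu, sigma^2)
mu, sigma = 0.5, 0.8
n = 200_000
random.seed(1)

# Simulate X = exp(Y) and average the draws
sample_mean = sum(math.exp(random.gauss(mu, sigma)) for _ in range(n)) / n

# Compare with the closed form exp(mu + sigma^2 / 2)
theory = math.exp(mu + sigma**2 / 2)
print(sample_mean, theory)  # the two should agree closely for large n
```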
As stated, this is the MGF: $E(e^{tY}) = \phi_Y(t)$, where $\phi$ is the MGF/characteristic fn (I don't know what people call them exactly).
The MGF is equivalent to the Laplace transform
The characteristic function is equivalent to the Fourier transform
The MGF of a normal is $M_Y(t) = e^{\mu t + \sigma^2 t^2/2}$. Since $E(X) = E(e^Y) = M_Y(1)$, we have $E(X) = e^{\mu + \sigma^2/2}$.
For the characteristic function, just substitute $t \to it$, which gives $\varphi_Y(t) = e^{i\mu t - \sigma^2 t^2/2}$.
You can find the definitions of the MGF and the characteristic function on Wikipedia, as well as the MGF and characteristic function of most standard distributions.
The problem is our lecture notes call it the characteristic function; last time Laurent pointed it out, and I saw it on Wikipedia too. I don't know why they called it the characteristic function in our lecture notes. Now it's confusing me a lot. Anyway, I will keep that in mind... Thanks!
Just joking... over 100? That's still a lot :s
By starting from the pdf, you have very little chance of getting it wrong lol! (except if, like I did at first, you forget the minus sign in $\exp(-t^2/2\sigma^2)$...)
Both characterize the distribution. And you can go from one to the other quite easily (though it requires justification if you want to be formal ^^).
That's still strange, though?
Thanks guys! Especially Moo.
Another question, about the variance. Would you then need to use MGFs to derive the variance? Using the second moment of the normal distribution to find $E(X^2)$ for the log-normal distribution, thus having $E(X^2) - m^2$ for the variance.
Though I can't get the right answer.
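For what it's worth, here is how the second-moment route works out with the MGF of the normal, $M_Y(t) = e^{\mu t + \sigma^2 t^2/2}$, so you can check your working against it:

```latex
E(X^2) = E(e^{2Y}) = M_Y(2) = e^{2\mu + 2\sigma^2},
\qquad
\operatorname{Var}(X) = E(X^2) - \big[E(X)\big]^2
= e^{2\mu + 2\sigma^2} - e^{2\mu + \sigma^2}
= e^{2\mu + \sigma^2}\left(e^{\sigma^2} - 1\right).
```

A common slip is squaring the mean as $e^{(\mu + \sigma^2/2)^2}$ instead of $\left(e^{\mu + \sigma^2/2}\right)^2 = e^{2\mu + \sigma^2}$.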
Thanks Moo, really helpful! I found my mistake.
Was able to do the next part too! Though I might have 2 more questions still.
If you have $\hat{\alpha} = \bar{X}$ as an estimator, how do you find the variance of that specific estimator? The estimator is unbiased, which was easy to show.
I figured it out for an easy estimator, but I'm now stuck on another one.
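Assuming $\bar{X}$ is the mean of $n$ iid observations $X_1, \dots, X_n$ (which is what I take from your description), its variance follows from independence:

```latex
\operatorname{Var}(\bar{X})
= \operatorname{Var}\!\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right)
= \frac{1}{n^2}\sum_{i=1}^{n} \operatorname{Var}(X_i)
= \frac{\operatorname{Var}(X)}{n}.
```

So you just plug in the variance of a single observation from the underlying distribution and divide by $n$.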
It's a bit too long to type, so I attached it. I'm not sure how you use large-$n$ approximations? Actually, I'm pretty confused about it in general. If someone could point me in the right direction to get started, that would be awesome!
Thanks so far, it's really helped.