## Maximum Likelihood Estimator

Hello to all! I have a multi-step question concerning MLEs I was hoping to get some help with:

Question 1)

Consider carrying out a sequence of Bernoulli trials where the probability of success in each trial is pi. We stop when the first success occurs. Let Y denote the number of trials carried out. It is known that Y follows a geometric distribution defined by:

P(Y = k) = pi * (1 - pi)^(k-1), k = 1, 2, ...,

a) Apply this distribution to an example of a game where a person tosses a balanced six-sided die repeatedly and "wins" the game when they roll a '1'. Calculate the probability they roll a "1" on the fourth toss. Identify pi and k first.

pi=1/6, k = 4

P(Y = 4) = (1/6) * (1 - 1/6)^3 = (1/6) * (5/6)^3 ≈ 0.0965.
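As a quick sanity check of the arithmetic, here is a one-line computation in Python (just a verification sketch, not part of the assignment):

```python
# Geometric probability P(Y = k) = pi * (1 - pi)**(k - 1)
pi = 1 / 6   # probability of rolling a '1' on a fair die
k = 4        # first success on the fourth toss

p = pi * (1 - pi) ** (k - 1)
print(round(p, 4))  # 0.0965
```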

b) Now assume pi is unknown. Derive the MLE of pi if a single observation Y = y is observed. Be sure to state any relevant conditions for the variables.

l(pi) = pi * (1 - pi)^(y-1), 0 <= pi <= 1, y = 1, 2, ...,

L(pi) = log(pi) + (y - 1) log(1 - pi), 0 < pi < 1, y = 1, 2, ...,

dL(pi)/dpi = 1/pi - (y - 1)/(1 - pi)

Setting this equal to zero and solving for pi, I get:

pi hat = 1/y
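To double-check the algebra numerically, one can maximize the log-likelihood over a fine grid (a small Python sketch under an arbitrary choice of y; not part of the assignment):

```python
# Numerically confirm that pi_hat = 1/y maximizes the log-likelihood
# L(pi) = log(pi) + (y - 1) * log(1 - pi) for a single observation y.
import math

def log_lik(pi, y):
    return math.log(pi) + (y - 1) * math.log(1 - pi)

y = 5                                        # arbitrary example observation
grid = [i / 1000 for i in range(1, 1000)]    # pi values in (0, 1)
best = max(grid, key=lambda pi: log_lik(pi, y))
print(best)  # 0.2, i.e. 1/y
```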

c) Calculate the MLE when Y = 3 using the formula you obtained in part b).

pi hat = 1/3

d) Determine and plot the relative likelihood function for the observed data Y = 3. Does the likelihood function attain its maximum at the MLE obtained in part c) ? Explain why or why not. Be sure to state any relevant conditions for the variables.

From our notes, the relative likelihood function, R(pi), is defined by:

R(pi) = l(pi) / l(pi hat)
= (pi * (1 - pi)^2) / ((1/3) * (2/3)^2)
= (27/4) * pi * (1 - pi)^2, 0 <= pi <= 1

When I plotted this in R, the relative likelihood is indeed maximized at pi = 1/3, so yes. My attempt at an explanation: R(pi) is just l(pi) divided by the positive constant l(pi hat), so it attains its maximum at exactly the same point as the likelihood itself, namely the MLE pi hat = 1/3 from part c). I'm not sure how else to explain this.
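For anyone without R handy, the same check can be done in Python by evaluating R(pi) on a grid (a verification sketch only; a plot would just need matplotlib on top of this):

```python
# Relative likelihood for the observed y = 3:
# R(pi) = (27/4) * pi * (1 - pi)**2, which should peak at pi = 1/3 with R = 1.
def R(pi):
    return 27 / 4 * pi * (1 - pi) ** 2

grid = [i / 10000 for i in range(1, 10000)]  # pi values in (0, 1)
pi_max = max(grid, key=R)
print(round(pi_max, 4), round(R(pi_max), 4))  # 0.3333 1.0
```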

Any thoughts?