# Thread: Statistics Help ASAP

1. ## Statistics Help ASAP

Hello everyone. I am wondering whether you can help me with these statistics problems. I am baffled by them, or maybe I am just thinking too hard. Thanks in advance. The problems are in an attachment. Could you explain them well? I really want to gain a good understanding of the material.

2. Originally Posted by Dream

The third question in the attachment is:

Suppose that $\displaystyle X_1, ~ X_2, ~ .... ~ X_n$ form a random sample from a Poisson distribution with unknown mean $\displaystyle \theta$, and let $\displaystyle Y = \sum_{i=1}^n X_i$. Determine the value of a constant $\displaystyle c$ such that the estimator $\displaystyle e^{-cY}$ is an unbiased estimator of $\displaystyle e^{-\theta}$.
I assume that the X's are independent ....?

Distribution of $\displaystyle Y$:

It's well known and easy to prove that the sum of $\displaystyle n$ independent Poisson random variables with parameters $\displaystyle \lambda_1, \, \ \lambda_2, \, .... \, \lambda_n$ is a Poisson random variable with parameter $\displaystyle \lambda_1 + \lambda_2 + .... + \lambda_n$.

Therefore $\displaystyle Y$ follows a Poisson distribution with parameter $\displaystyle n \theta$.
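If you want to convince yourself of this closure property numerically, here is a small Python sketch (not part of the proof; the parameters 2.0 and 3.5 and the truncation point are arbitrary test choices): it convolves two Poisson pmfs and compares the result with the Poisson pmf of the summed parameter.

```python
# Sanity check: the convolution of Poisson(lam1) and Poisson(lam2) pmfs
# matches the Poisson(lam1 + lam2) pmf. Parameters are illustrative.
from math import exp, factorial

def poisson_pmf(lam, k):
    """Poisson pmf P(X = k) with parameter lam."""
    return exp(-lam) * lam**k / factorial(k)

def check_poisson_sum(lam1, lam2, N=60):
    """Largest discrepancy on {0, ..., N} between the convolution of the
    two pmfs and the pmf with the summed parameter."""
    max_err = 0.0
    for k in range(N + 1):
        conv = sum(poisson_pmf(lam1, j) * poisson_pmf(lam2, k - j)
                   for j in range(k + 1))
        max_err = max(max_err, abs(conv - poisson_pmf(lam1 + lam2, k)))
    return max_err

print(check_poisson_sum(2.0, 3.5))  # essentially zero (rounding error only)
```

Applying this repeatedly gives the sum of $\displaystyle n$ independent Poisson variables, which is what the result above uses.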

Unbiased estimator:

You require $\displaystyle E\left( e^{-cY}\right) = e^{-\theta}$.

Calculation of $\displaystyle E\left( e^{-cY}\right)$:

The calculation follows exactly the same steps as the well known calculation for the moment generating function of a Poisson random variable.

It is well known that if $\displaystyle X$ follows a Poisson distribution with parameter $\displaystyle \lambda$ then the moment generating function of $\displaystyle X$ is $\displaystyle m(t) = E\left(e^{tX}\right) = \exp (\lambda (e^{t} - 1))$.

Substituting $\displaystyle t = -c$ and $\displaystyle \lambda = n \theta$ you get:

$\displaystyle E\left( e^{-cY}\right) = \exp (n \theta (e^{-c} - 1)) = \exp(- \theta [n(1 - e^{-c})])$.

It should be clear how to proceed from here (compare the exponent with $\displaystyle -\theta$ and solve for $\displaystyle c$).
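If you want to check the expectation formula itself, here is a Python sketch that sums the series $\displaystyle E\left(e^{-cY}\right) = \sum_y e^{-cy} \Pr(Y = y)$ directly and compares it with the closed form (the values of $n$, $\theta$ and $c$ below are arbitrary test choices, not part of the problem):

```python
# Sketch: verify E(e^{-cY}) = exp(n*theta*(e^{-c} - 1)) for Y ~ Poisson(n*theta)
# by summing the series term by term (truncated at N terms).
from math import exp

def expected_exp_neg_cY(n, theta, c, N=200):
    lam = n * theta          # Y ~ Poisson(n * theta)
    pmf = exp(-lam)          # P(Y = 0)
    total = 0.0
    for y in range(N + 1):
        total += exp(-c * y) * pmf
        pmf *= lam / (y + 1)  # Poisson recurrence: P(Y = y + 1)
    return total

n, theta, c = 5, 1.3, 0.4
lhs = expected_exp_neg_cY(n, theta, c)
rhs = exp(n * theta * (exp(-c) - 1))
print(abs(lhs - rhs))  # ~0 up to truncation and rounding
```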

3. Originally Posted by Dream

The first question in the attachment is:

Suppose that X and Y are two independent random variables whose moment generating functions are given by $\displaystyle M_X(t) = \frac{e^t}{2 - e^t}$ and $\displaystyle M_Y(t) = \left(\frac{2}{3} + \frac{1}{3} \, e^t\right)^2$. Find $\displaystyle P(X = 2Y)$.
$\displaystyle M_X(t) = \frac{e^t}{2 - e^t} = \frac{\frac{1}{2} e^t}{1 - \frac{1}{2} \, e^t}$ is the moment generating function of a geometric random variable with $\displaystyle p = \frac{1}{2}$.

$\displaystyle M_Y(t) = \left(\frac{2}{3} + \frac{1}{3} \, e^t\right)^2$ is the moment generating function of a binomial random variable with $\displaystyle n = 2$ and $\displaystyle p = \frac{1}{3}$. Therefore the possible values of $\displaystyle 2Y$ are .....

Therefore $\displaystyle \Pr(X = 2Y) = \, ....$
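Once you have listed the possible values of $\displaystyle 2Y$, the sum can be checked numerically. A Python sketch using the two distributions identified above (running it reveals the final number, so treat it as a check on your hand computation):

```python
# Sketch: P(X = 2Y) with X ~ Geometric(p = 1/2) on {1, 2, ...}
# and Y ~ Binomial(n = 2, p = 1/3), X and Y independent.
from math import comb

def geom_pmf(k, p=0.5):
    """P(X = k) for a geometric variable supported on {1, 2, ...}."""
    return p * (1 - p)**(k - 1) if k >= 1 else 0.0

def binom_pmf(k, n=2, p=1/3):
    """P(Y = k) for a Binomial(n, p) variable."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# By independence, P(X = 2Y) = sum over y of P(Y = y) * P(X = 2y);
# the y = 0 term vanishes because X never takes the value 0.
prob = sum(binom_pmf(y) * geom_pmf(2 * y) for y in range(3))
print(prob)
```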

4. Originally Posted by Dream

The second question in the attachment is:

A random variable $\displaystyle X$ is said to have the logarithmic series distribution with parameter $\displaystyle p$ if

$\displaystyle P(X = x) = \frac{-(1 - p)^x}{x \log p}, ~ x = 1, \, 2, \, ....$

where $\displaystyle 0 < p < 1$.

(a) Verify that this is a valid pmf. Hint: recall that, for $\displaystyle w \in (-1, 1], ~ \log (1 + w) = \sum_{k=0}^{\infty} \frac{(-1)^k w^{k+1}}{k+1}$.

(b) Find the expectation and variance of $\displaystyle X$.
(a) $\displaystyle \sum_{x=1}^{\infty} \Pr(X = x) = \sum_{x=1}^{\infty} \frac{-(1 - p)^x}{x \log p}$ $\displaystyle = \frac{1}{\log p} \, \sum_{x=1}^{\infty} \frac{(-1)^{x+1} (p-1)^x}{x}$

$\displaystyle = \frac{1}{\log p} \, \sum_{x={\color{red}0}}^{\infty} \frac{(-1)^{x} (p-1)^{x{\color{red}+1}}}{x{\color{red}+1}}$

Now substitute $\displaystyle p -1 = w$ and use the hint to get 1 as the answer.

Now all you have to do is check that $\displaystyle \Pr(X = x) > 0$ for $\displaystyle x = 1, \, 2, \, ....$
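You can also confirm part (a) numerically. A Python sketch (the choice $p = 0.3$ is an arbitrary value in $(0, 1)$, and the series is truncated, so the result is only approximately 1):

```python
# Sketch: the logarithmic series pmf P(X = x) = -(1 - p)^x / (x log p)
# should sum to 1 over x = 1, 2, ... and every term should be positive.
from math import log

def log_series_pmf(x, p):
    return -(1 - p)**x / (x * log(p))

p = 0.3
terms = [log_series_pmf(x, p) for x in range(1, 2001)]
print(min(terms) > 0)   # every probability is positive, since log p < 0
print(sum(terms))       # close to 1, up to truncation error
```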

(b) Left for you to attempt. Please show all your working and where you get stuck.

By the way, if you want members to take the time to help you, please have the courtesy to type your questions out (as I have done for you - this time only) instead of attaching a Word document containing scanned images.