# Word problem: don't know where to begin

• Jan 2nd 2009, 09:04 AM
Word problem: don't know where to begin
For an upcoming concert, each customer may purchase up to 3 child tickets and 3 adult tickets. Let C be the number of child tickets purchased by a single customer. The probability distribution of C is given below.

| C    | 0   | 1   | 2   | 3   |
|------|-----|-----|-----|-----|
| P(C) | 0.4 | 0.3 | 0.2 | 0.1 |

A) Compute the mean and standard deviation of C.

B) Suppose the mean and the standard deviation of the number of adult tickets purchased by a single customer are 2 and 1.2, respectively. Assume that the numbers of child tickets and adult tickets purchased are independent random variables. Compute the mean and the standard deviation of the total number of adult and child tickets purchased by a single customer.

C) Suppose each child ticket costs $15 and each adult ticket costs $25. Compute the mean and the standard deviation of the total amount spent per purchase.

Anyone want to help me on how to just start this problem? I'm sure I'll come back with more questions after.
• Jan 2nd 2009, 03:08 PM
nzmathman
a) The mean,
$\displaystyle E(C) = \sum_{i} C_i \times P(C = C_i)$
$\displaystyle = 0 \times 0.4 + 1 \times 0.3 .....$

$\displaystyle Var(C) = E(C^2) - [E(C)]^2$

where $\displaystyle E(C^2) = \sum_{i} (C_i)^2 \times P(C = C_i)$
$\displaystyle = 0^2 \times 0.4 + 1^2 \times 0.3 .....$

and remember the standard deviation, $\displaystyle \sigma_C = \sqrt{Var(C)}$
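As a quick numeric check of part (a) (a sketch in Python; the table values come straight from the question, and the variable names are mine):

```python
# Probability distribution of C (child tickets) from the question.
values = [0, 1, 2, 3]
probs = [0.4, 0.3, 0.2, 0.1]

# E(C): weight each value by its probability and add.
mean = sum(c * p for c, p in zip(values, probs))

# Var(C) = E(C^2) - [E(C)]^2
mean_sq = sum(c**2 * p for c, p in zip(values, probs))
var = mean_sq - mean**2

# Equivalent squared-deviation form; should agree with var above.
var_alt = sum(p * (c - mean) ** 2 for c, p in zip(values, probs))

sd = var**0.5  # the standard deviation is the square root of the variance

print(round(mean, 6), round(var, 6), round(sd, 6))  # 1.0 1.0 1.0
```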

b) For any two random variables X and Y, $\displaystyle E(X + Y) = E(X) + E(Y)$ and $\displaystyle Var(X + Y) = Var(X) + Var(Y)$

c) For this question, use the rules $\displaystyle E(aX + bY) = aE(X) + bE(Y)$ and $\displaystyle Var(aX + bY) = a^2Var(X) + b^2Var(Y)$
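Plugging the numbers from the question into these rules (a sketch in Python; the variable names are mine, and part (a)'s results are assumed):

```python
# Numbers from the question: C = child tickets, A = adult tickets (independent).
mean_C, sd_C = 1.0, 1.0    # from part (a)
mean_A, sd_A = 2.0, 1.2    # given in part (b)

# Part (b): total tickets T = C + A. Means add; for independent
# variables, variances (not standard deviations) add.
mean_T = mean_C + mean_A
sd_T = (sd_C**2 + sd_A**2) ** 0.5

# Part (c): total cost M = 15*C + 25*A dollars, using the a^2/b^2 rule.
mean_M = 15 * mean_C + 25 * mean_A
sd_M = (15**2 * sd_C**2 + 25**2 * sd_A**2) ** 0.5

print(mean_T, round(sd_T, 3))   # 3.0 1.562
print(mean_M, round(sd_M, 3))   # 65.0 33.541
```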
• Jan 2nd 2009, 03:44 PM
Last_Singularity
Quote:

Originally Posted by nzmathman
a) The mean,
b) For any two random variables X and Y, $\displaystyle E(X + Y) = E(X) + E(Y)$ and $\displaystyle Var(X + Y) = Var(X) + Var(Y)$

c) For this question, use the rules $\displaystyle E(aX + bY) = aE(X) + bE(Y)$ and $\displaystyle Var(aX + bY) = a^2Var(X) + b^2Var(Y)$

Well, more specifically, any two independent random variables X and Y. If they are not independent, there will be a leftover covariance term in there...
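To see why independence matters, here is a tiny made-up example (a sketch; the two-point distribution is invented for illustration) where Y = X, so the variables are perfectly dependent:

```python
# A made-up fair-coin distribution for X; take Y = X (fully dependent).
vals = [0, 1]
probs = [0.5, 0.5]

def expect(f):
    """Expected value of f(X) under the distribution above."""
    return sum(f(v) * p for v, p in zip(vals, probs))

var_X = expect(lambda v: v**2) - expect(lambda v: v) ** 2  # Var(X) = 0.25

# X + Y = 2X here, so the true Var(X + Y) = Var(2X) = 4 Var(X).
var_sum = expect(lambda v: (2 * v) ** 2) - expect(lambda v: 2 * v) ** 2

# The naive rule Var(X) + Var(Y) = 0.5 misses the 2*Cov(X, Y) = 2*Var(X) = 0.5 term.
print(var_sum, var_X + var_X)  # 1.0 0.5
```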
• Jan 2nd 2009, 04:03 PM
nzmathman
"Well, more specifically, any two independent random variables X and Y. If they are not independent, there will be a leftover covariance term in there..."

Yes, I'm aware of this, but didn't mention it, as part (b) of the question tells you the variables in this situation are independent random variables.
• Jan 2nd 2009, 08:39 PM
nz, it obviously looks like you know your stuff. However, my math teacher definitely has not taught us your method yet, nor have we learned whatever you did. I was wondering if there is a simpler way to do this, as I've really never seen that before.

Let me know, thanks!
• Jan 2nd 2009, 08:43 PM
mr fantastic
Quote:

nz, it obviously looks like you know your stuff. However, my math teacher definitely has not taught us your method yet, nor have we learned whatever you did. I was wondering if there is a simpler way to do this, as I've really never seen that before.

Let me know, thanks!

Which part have you not seen before and therefore need explained in a different way?
• Jan 2nd 2009, 09:02 PM
nzmathman
How can you do this question if you do not know all this? Unless you have a graphics calculator capable of calculating statistical values from a user-entered table of data?
• Jan 2nd 2009, 10:19 PM
Quote:

Originally Posted by nzmathman
How can you do this question if you do not know all this? Unless you have a graphics calculator capable of calculating statistical values from a user-entered table of data?

I think I remember doing 1-var stats with something. Is that possible with this method?
• Jan 3rd 2009, 05:05 AM
Last_Singularity
Quote:

On the other hand, surely you understand the concept of expected value? It is simply a weighted average of all the possible values a variable $\displaystyle X$ can take on. For example, if you have a 50% chance of earning $100 and a 50% chance of earning $200, then you EXPECT to earn, on average, 0.5(100) + 0.5(200) = $150.

In your case, you have four possible values for your variable instead of just two - just multiply each value by its probability and add all of those products up - that's your expected value.

nzmathman did forget one thing: there is another way to calculate variance. Instead of using $\displaystyle Var(X) = E(X^2) - (E(X))^2$, we can use $\displaystyle Var(X) = E\left[(X - E(X))^2\right]$.

Using this method, the variance is calculated by $\displaystyle Var(X) = \sum_{i=1}^n p(x_i) (\bar{x} - x_i)^2$, where the mean $\displaystyle \bar{x} = \sum_{i=1}^n p(x_i) x_i$.

So, first figure out $\displaystyle \bar{x} = 0 \times 0.4 + 1 \times 0.3 .....$. You should get 1. Then find the squared error of each $\displaystyle x_i$ from that mean and multiply those by their probabilities: $\displaystyle (0.4)(0-1)^2 + (0.3)(1-1)^2 + (0.2)(2-1)^2 + (0.1)(3-1)^2$. That is your variance. Remember to take the square root of that to get the standard deviation. You should get $\displaystyle 1$ for both the variance and the standard deviation.
• Jan 3rd 2009, 10:34 AM
Bradley55
Wait, I think I remember a way my teacher taught us now. Can't I just multiply 0 x 0.4, 1 x 0.3, 2 x 0.2, and 3 x 0.1? My notes are saying that will give me the expected value. Then I take each of those 4 values and, I think, multiply all 4 together?
• Jan 3rd 2009, 01:36 PM
mr fantastic
Quote:

Originally Posted by Bradley55
Wait, I think I remember a way my teacher taught us now. Can't I just multiply 0 x 0.4, 1 x 0.3, 2 x 0.2, and 3 x 0.1? My notes are saying that will give me the expected value. Then I take each of those 4 values and, I think, multiply all 4 together? Mr F says: Add them together, not multiply them together.
That's exactly what nzmathman said in post #2 (except s/he didn't make the mistake I've corrected you on). It looks like your notes could be a veritable goldmine of information. I suggest you study them very closely - I suspect all the answers to what you've asked following post #2 will be there .....
• Jan 3rd 2009, 03:18 PM
Bradley55
Okay, so after doing my calculations I got an expected value of 1. Question "a" asks for the mean and standard deviation of C. Isn't the mean 1.5? Just (3 + 2 + 1 + 0)/4 = 1.5? And my notes only reminded me of this, they are incomplete :( , so if I did any of this right, what should be my next step?
• Jan 3rd 2009, 03:56 PM
Last_Singularity
Quote:

Originally Posted by Bradley55
Okay, so after doing my calculations I got an expected value of 1. Question "a" asks for the mean and standard deviation of C. Isn't the mean 1.5? Just (3 + 2 + 1 + 0)/4 = 1.5? And my notes only reminded me of this, they are incomplete :( , so if I did any of this right, what should be my next step?

The expected value IS the mean. In other words: expected value = mean = 1. (The plain average (3 + 2 + 1 + 0)/4 = 1.5 would only be the mean if all four values were equally likely - here they are not.)

What you should do next has already been answered twice, once by nzmathman in post #2 and again by me in post #9... Please read!
• Jan 3rd 2009, 05:05 PM
Bradley55
Quote:

Originally Posted by Last_Singularity
Then find the squared error of each $\displaystyle x_i$ from that mean and multiply those by their probabilities: $\displaystyle (0.4)(0-1)^2 + (0.3)(1-1)^2 + (0.2)(2-1)^2 + (0.1)(3-1)^2$. That is your variance. Remember to take the square root of that to get the standard deviation. You should get $\displaystyle 1$ for both the variance and the standard deviation.

So I see now that 1 is the mean and ALSO 1 is the standard deviation, because after I square root the variance (1) I get 1 for the standard deviation? Now onto b, ???.
• Jan 3rd 2009, 06:25 PM
Last_Singularity
Again, post #2 by nzmathman...
Quote:

Originally Posted by nzmathman
b) For any two random variables X and Y, $\displaystyle E(X + Y) = E(X) + E(Y)$ and $\displaystyle Var(X + Y) = Var(X) + Var(Y)$
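For anyone who, like the original poster, is more comfortable with "1-var stats" than with the formulas, the part (b) answers can also be checked by simulation (a sketch; `random.choices` needs Python 3.6+, and the adult-ticket distribution below is invented purely so that it has mean 2 and SD 1.2 as stated in the question):

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

# Child-ticket distribution from the question.
child_vals, child_probs = [0, 1, 2, 3], [0.4, 0.3, 0.2, 0.1]
# Invented adult-ticket distribution with mean 2 and variance 1.44 (SD 1.2).
adult_vals, adult_probs = [0, 1, 2, 3], [0.22, 0.06, 0.22, 0.50]

n = 200_000
children = random.choices(child_vals, child_probs, k=n)
adults = random.choices(adult_vals, adult_probs, k=n)
totals = [c + a for c, a in zip(children, adults)]  # T = C + A per customer

mean_T = sum(totals) / n
sd_T = (sum((t - mean_T) ** 2 for t in totals) / n) ** 0.5

# Should land close to the exact answers: mean 3, SD sqrt(2.44) ≈ 1.562.
print(round(mean_T, 2), round(sd_T, 2))
```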