It would be great if you could show us what you have done so far.
I need help evaluating the following integral and two summations. I have the answer to each (from a calculator), but I must show my work. These are from an advanced statistics/probability course, so knowledge in this area may supply tricks for evaluating them.
1.
(the correct answer to this is -6)
2. (notice that in this problem and in #3, there is the "choose function")
(the correct answer is 4)
3.
(the correct answer is 24)
Thank you all in advance!
Both (2) and (3) come from the negative binomial distribution.
See Negative binomial distribution - Wikipedia, the free encyclopedia
NOW, that's not the one I teach, but it's fine.
The parameters are p=.5 and r=4.
So in (2), they want the mean, and in (3) they want the second moment, which equals Var(X) + [E(X)]^2.
And yes, you do get 4 and 24 for these two.
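As a sanity check (my own sketch, not part of the thread), summing against the negative binomial pmf in the Wikipedia parameterisation (k failures before the r-th success) reproduces both numbers; the truncation point 200 is an arbitrary choice that is more than enough for p = .5:

```python
import math

def nb_pmf(k, r, p):
    # P(K = k): probability of k failures before the r-th success
    # (the Wikipedia parameterisation referenced above)
    return math.comb(k + r - 1, k) * (1 - p) ** k * p ** r

r, p = 4, 0.5
mean = sum(k * nb_pmf(k, r, p) for k in range(200))
second_moment = sum(k ** 2 * nb_pmf(k, r, p) for k in range(200))
print(mean, second_moment)  # approximately 4 and 24
```

These match the closed forms r(1-p)/p = 4 and r(1-p)/p^2 + [r(1-p)/p]^2 = 8 + 16 = 24.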
matheagle, thank you! That did the trick! In class, we have solved tricky integrals by making substitutions so that the integral looks like a distribution with different parameters, or a different distribution entirely (since we know all distributions integrate to one), so that's what I tried. I didn't think to make the integrals/summations look like the forms for expected value or variance.
Again, thank you!
Even though I had never seen these before.
My speciality is working with distributions.
So, this took only a minute to figure out.
The weird one was this negative binomial.
I count all the trials so my neg bi is different from yours.
I did a search and found yours at Wikipedia.
And as a professor, I must correct this....
all distributions integrate/sum to one
sry
Thanks for catching my grammar slip... yes, all distributions sum to one!
I need to find the PDF of Y= Z/X, where X~Beta(a,b) and Z~Beta(b,a). Here's the integral:
After pulling out the constants (all of the gamma functions), I tried grouping the terms by similar exponents, but didn't seem to have luck there.
Again, your help is very much appreciated.
Ah... this assignment is driving me nuts! I want to make sure I have this one right.
Let X~Uniform(-1,1) and Z~Uniform(0,1/8) where X and Z are independent. Let Y = X^2 + Z. Then the joint PDF of X and Y is given by f(x,y) = 4, for -1<x<1 and x^2<y<x^2 + 1/8. Find the Cov(X,Y). I think the answer is 0, but I'm not positive.
Cov(X,Y) = E(XY) - E(X)E(Y)
E(XY) is the double integral of (xy)*f(x,y) = 4xy: I integrate with respect to y first, evaluating from x^2 to x^2 + 1/8, then integrate the result with respect to x from -1 to 1. I get 0! E(X) is the double integral of (x)*f(x,y) = 4x, done the same way: integrate y from x^2 to x^2 + 1/8, then x from -1 to 1. I get 0 here as well!
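A quick numeric cross-check of those integrals (my own sketch, not from the thread): evaluate the inner y-integrals in closed form and do a midpoint sum in x. It also confirms that the density 4 integrates to 1 over the region, and the odd integrands cancel:

```python
# Midpoint-rule check of the covariance computation.
# f(x,y) = 4 on -1 < x < 1, x^2 < y < x^2 + 1/8.
n = 20000
dx = 2.0 / n
total_mass = e_x = e_xy = 0.0
for i in range(n):
    x = -1 + (i + 0.5) * dx
    lo, hi = x * x, x * x + 0.125
    total_mass += 4 * (hi - lo) * dx        # inner integral of 4 dy
    e_x += 4 * x * (hi - lo) * dx           # inner integral of 4x dy
    e_xy += 2 * x * (hi ** 2 - lo ** 2) * dx  # inner integral of 4xy dy
print(total_mass, e_x, e_xy)  # approximately 1, 0, 0
```

So E(XY) = E(X) = 0 and Cov(X,Y) = 0, as claimed.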
LASTLY, the other problem that is being fickle:
Let X and Y be iid Uniform(-1,0). Note that X and Y can be considered a random sample of size 2. Find the PDF of Z = |Y-X|.
OK, I just finished dinner and I was planning at looking at your beta problem.
As for the other one, I have no idea why you would want to obtain the
joint density of X and Y. I would just find the covariance of the pair.
--------------------------------------------------------------------------------------
Again, if all you want is Cov(X,Y), well, that's easy.
Note that Y = X^2 + Z with Z independent of X, thus
Cov(X,Y) = Cov(X, X^2) = E(X^3) - E(X)E(X^2) = 0
by the symmetry of X's density.
Did your instructor really want you to get the joint density here?
As for the final one, well, draw it. Do not do any type of Jacobian change of variable.
Get the CDF of Z, by using geometry and complementary regions.
This one seems similar to the classic Z=X+Y.
Also, is that supposed to be Γ?
Z and X better be indep.
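Going back to the |Y - X| problem: the geometric-CDF suggestion can be brute-force checked (my own sketch; the closed form 1 - (1 - z)^2 for 0 ≤ z ≤ 1 is what the complementary-region argument should give):

```python
def cdf_Z(z, n=600):
    # P(|Y - X| <= z) for X, Y iid Uniform(-1, 0),
    # by a midpoint grid over the unit square of (x, y) values
    hits = 0
    for i in range(n):
        for j in range(n):
            x = -1 + (i + 0.5) / n
            y = -1 + (j + 0.5) / n
            if abs(y - x) <= z:
                hits += 1
    return hits / n ** 2

for z in (0.25, 0.5, 0.75):
    print(z, cdf_Z(z), 1 - (1 - z) ** 2)  # grid value vs. geometric formula
```

Differentiating 1 - (1 - z)^2 gives the triangular density f_Z(z) = 2(1 - z) on 0 < z < 1.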
Thanks for clarifying and confirming my answer to the covariance problem.
Sorry, I am a little confused as to which problem you are referring when making the comments. I'll do my best to answer your questions.
First, yes, that symbol was supposed to be the gamma function. For the integral with two beta distributions, the problem is: "Let X~Beta(a,b). Find the PDF of Y= (1-X)/X." I made Z= 1-X and found its distribution to be Z~Beta(b,a). Now I am at finding the PDF of Y= Z/X, which gives the integral with which I am struggling. Again, here it is:
Further, you asked if X and Z are independent... I'm not positive, but I assume so.
Thanks for your help, matheagle!
OMG
I knew there was something wrong.
I thought you were obtaining the pdf for no reason.
That you only had to find the mean or variance of one of these rvs.
This cannot be simplified.
I knew that, but I had to work on it to get to the same point to 'find' your error.
BUT now that you explained the problem to me I see the 'error'.
This is a one dimensional change.
You cannot let Z=1-X and assume 1-X is independent of X.
I had to assume Z and X were indep since you gave no further info.
AND Γ is typed as \Gamma.
As for the solution, well, just get the density of X.
Let Y = (1-X)/X = 1/X - 1.
Thus X = 1/(1+Y).
Then f_Y(y) = f_X(1/(1+y)) |d/dy [1/(1+y)]| = f_X(1/(1+y)) / (1+y)^2.
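A numeric sanity check of that density (my own sketch; the values a, b = 2, 3 are arbitrary test choices): substituting y = t/(1 - t) maps (0, 1) onto (0, ∞), and the candidate density should integrate to 1.

```python
import math

def beta_pdf(x, a, b):
    # Beta(a, b) density on (0, 1)
    return (math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
            * x ** (a - 1) * (1 - x) ** (b - 1))

def g(y, a, b):
    # candidate density of Y = (1 - X)/X: f_X(1/(1+y)) * |dx/dy|
    return beta_pdf(1 / (1 + y), a, b) / (1 + y) ** 2

a, b = 2.0, 3.0
n = 20000
total = 0.0
for i in range(n):
    t = (i + 0.5) / n            # midpoint rule on (0, 1)
    y = t / (1 - t)              # maps back to (0, infinity)
    total += g(y, a, b) / (1 - t) ** 2 / n   # extra factor is dy/dt
print(total)  # approximately 1
```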
I don't understand the question.
If you want the density of Y, a transformation of X, then you must obtain
f_Y(y) = f_X(g^{-1}(y)) |d g^{-1}(y)/dy|.
If you want just the mean of some function of X, then you can do either:
transform X via Y=G(X) and get E(Y)=E[G(X)],
or just get E[G(X)] directly from X's density, which is simpler.
BUT, in this problem I was asked to find the transformation, so finding just the
expectation is not sufficient.
AND what teacher could possibly say that
f_Y(y) = f_X(g^{-1}(y)) |d g^{-1}(y)/dy|
doesn't exist? Did he go to graduate school?
I proved it in another thread and I've seen it in every book.
You just differentiate the CDFs, to get this.
It's just calculus one, u-sub with an absolute value.
So, to clarify a solution, you wrote:
I know the PDF (density) of X, but I'm not so sure about the inverse. Do I have to do a transformation, like Z= 1/X, for example? Doing this yields:
f_Z(z) = [Γ(a+b)/(Γ(a)Γ(b))] (1/z)^{a-1} (1 - 1/z)^{b-1} (1/z^2), for z > 1,
which reduces to:
f_Z(z) = [Γ(a+b)/(Γ(a)Γ(b))] (z-1)^{b-1} / z^{a+b}.
And then, do I do another transformation to obtain Y: Y= Z-1? Again, this creates:
g(y) = [Γ(a+b)/(Γ(a)Γ(b))] y^{b-1} / (1+y)^{a+b}, for y > 0.
And if all of this is correct, can I make g(y) look like a well-known distribution?
Thank you, again, for your continued help!
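To the last question above: a density proportional to y^{b-1}(1+y)^{-(a+b)} on y > 0 is the beta prime distribution with parameters (b, a). A Monte Carlo sketch (mine, not from the thread; a, b = 2, 3 and the evaluation point y = 1 are arbitrary choices) compares the empirical CDF of Y = (1 - X)/X against the closed-form CDF obtained by integrating 12 y^2/(1+y)^5 for that case:

```python
import random

random.seed(1)
a, b = 2, 3
n = 200_000
count = 0
for _ in range(n):
    x = random.betavariate(a, b)     # X ~ Beta(2, 3)
    if (1 - x) / x <= 1.0:           # Y = (1 - X)/X, evaluated at y = 1
        count += 1
empirical = count / n

# closed-form CDF of Y at y = 1 for a=2, b=3
u = 1 + 1.0
closed_form = 1 + 12 * (-u ** -2 / 2 + 2 * u ** -3 / 3 - u ** -4 / 4)
print(empirical, closed_form)  # both near 0.3125
```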