# Thread: Confidence Interval Problem

1. ## Confidence Interval Problem

If $x_1$ and $x_2$ are values of a random sample of size 2 from a population having a uniform density with $\alpha = 0$ and $\beta = \theta$, find $k$ such that:
$0 < \theta < k(x_1+x_2)$

is a $(1- \alpha )100$% confidence interval for $\theta$ when:

(a) $\alpha < 0.5$
(b) $\alpha > 0.5$

This question is confusing... can someone explain what it is asking?

2. Originally Posted by chopet
If $x_1$ and $x_2$ are values of a random sample of size 2 from a population having a uniform density with $\alpha = 0$ and $\beta = \theta$, find $k$ such that:
$0 < \theta < k(x_1+x_2)$

is a $(1- \alpha )100$% confidence interval for $\theta$ when:

(a) $\alpha < 0.5$
(b) $\alpha > 0.5$

This question is confusing... can someone explain what it is asking?
Find $k$ such that:

$p(k(x_1+x_2)>\theta)=1-\alpha$

RonL
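
A sketch of where the two cases come from (using the triangular distribution of the sum of two independent uniforms; note that $\alpha$ is doing double duty in the problem statement, as the uniform's lower endpoint and as the confidence level): let $Y = X_1 + X_2$. For $X_1, X_2$ independent and uniform on $(0, \theta)$, the CDF of $Y$ is

$F_Y(y) = \begin{cases} \dfrac{y^2}{2\theta^2}, & 0 \le y \le \theta, \\ 1 - \dfrac{(2\theta - y)^2}{2\theta^2}, & \theta \le y \le 2\theta, \end{cases}$

so $P(k(X_1+X_2) > \theta) = P(Y > \theta/k) = 1 - F_Y(\theta/k)$. Setting this equal to $1 - \alpha$:

(a) if $k \ge 1$, then $\theta/k \le \theta$ and $1 - \frac{1}{2k^2} = 1 - \alpha$, giving $k = \frac{1}{\sqrt{2\alpha}}$; this is consistent with $k \ge 1$ precisely when $\alpha \le 0.5$.

(b) if $\frac{1}{2} < k < 1$, then $\theta < \theta/k < 2\theta$ and $\frac{(2 - 1/k)^2}{2} = 1 - \alpha$, giving $k = \frac{1}{2 - \sqrt{2(1-\alpha)}}$; this is consistent with $k < 1$ precisely when $\alpha > 0.5$.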

3. Originally Posted by CaptainBlack
Find $k$ such that:

$p(k(x_1+x_2)>\theta)=1-\alpha$

RonL
To the OP:

This thread will be relevant to your efforts: http://www.mathhelpforum.com/math-he...questions.html
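
If one works CaptainBlack's condition out via the triangular distribution of $X_1 + X_2$, the candidate values are $k = 1/\sqrt{2\alpha}$ when $\alpha < 0.5$ and $k = 1/(2 - \sqrt{2(1-\alpha)})$ when $\alpha > 0.5$ (these formulas are my own working, not stated in the thread). A quick Monte Carlo sanity check of the resulting coverage:

```python
import math
import random

def coverage(k, theta=1.0, trials=200_000):
    """Estimate P(k*(X1 + X2) > theta) for X1, X2 ~ Uniform(0, theta)."""
    hits = 0
    for _ in range(trials):
        y = random.uniform(0, theta) + random.uniform(0, theta)
        if k * y > theta:
            hits += 1
    return hits / trials

random.seed(0)

# Case (a): alpha < 0.5, candidate k = 1/sqrt(2*alpha)
alpha = 0.05
k_a = 1 / math.sqrt(2 * alpha)
print(coverage(k_a))  # should be close to 1 - alpha = 0.95

# Case (b): alpha > 0.5, candidate k = 1/(2 - sqrt(2*(1 - alpha)))
alpha = 0.7
k_b = 1 / (2 - math.sqrt(2 * (1 - alpha)))
print(coverage(k_b))  # should be close to 1 - alpha = 0.30
```

The simulation does not depend on the true $\theta$; rescaling both $x_i$ and $\theta$ by the same factor leaves the event $k(x_1+x_2) > \theta$ unchanged, so $\theta = 1$ is used without loss of generality.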