# Math Help - Applying expectations and variance formulae to a multifactor model....proof required

1. ## Applying expectations and variance formulae to a multifactor model....proof required

Hi all,

Just wondering if anyone can help me with a proof/solution on how to get from equation 2 to equation 3. It is for deriving the variance of portfolio return using a multifactor model.

Equation 1: $r = B_1 r_1 + B_2 r_2 + r_e$

Equation 2: $r^2 = B_1^2 r_1^2 + B_2^2 r_2^2 + 2B_1 B_2 r_1 r_2 + 2B_1 r_1 r_e + 2B_2 r_2 r_e + r_e^2$

Steps: Apply expectations on both sides. Use formulae for variance to get:

Equation 3: $\text{var}(r) = B_1^2\,\text{var}(r_1) + B_2^2\,\text{var}(r_2) + 2B_1 B_2\,\text{cov}(r_1,\ r_2) + \text{var}(r_e)$

I am having trouble with "apply expectations on both sides". Can someone please show me how to do this and what you end up with? I can then work from there to get Equation 3.

2. ## Re: Applying expectations and variance formulae to a multifactor model....proof required


It would help if you defined your variables a bit better and gave some information about what you already know about expectations. I have to GUESS that r1, r2, and re are random variables (not regression coefficients), that re is independent of r1 and r2, and that E(re) = 0. Because you do not say what you already know about expectations, I have to start at the beginning.

$P\ is\ a\ random\ variable\ \implies E(P) = expected\ value\ of\ P,\ and\ V(P) = variance\ of\ P = E\left(\left\{P - E(P)\right\}^2\right).$

$V(P) = E\left(\left\{P - E(P)\right\}^2\right)= E\left(P^2 - 2P * E(P) + \left\{E(P)\right\}^2\right) = E\left(P^2\right) - 2E(P)E(P) + \left\{E(P)\right\}^2 = E\left(P^2\right) - \left\{E(P)\right\}^2 \implies$

$EQUATION\ I:\ V(P) = E\left(P^2\right) - \left\{E(P)\right\}^2.$
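Equation I is easy to sanity-check numerically. The sketch below (not from the thread; the outcomes and probabilities are invented for illustration) computes the variance of a small discrete random variable both from the definition and from Equation I:

```python
# Sanity check of Equation I: V(P) = E(P^2) - {E(P)}^2,
# using a small discrete random variable with exact probabilities.
# All numbers are arbitrary illustrative values.

values = [1.0, 2.0, 5.0]   # possible outcomes of P
probs  = [0.2, 0.5, 0.3]   # their probabilities (sum to 1)

E_P  = sum(p * v for p, v in zip(probs, values))
E_P2 = sum(p * v * v for p, v in zip(probs, values))

# Variance from the definition E({P - E(P)}^2)
V_def = sum(p * (v - E_P) ** 2 for p, v in zip(probs, values))

# Variance from Equation I
V_eq1 = E_P2 - E_P ** 2

print(abs(V_def - V_eq1) < 1e-12)  # True
```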

$EQUATION\ II:\ E(jP) = jE(P).$

$V(jP) = E\left(\left\{jP - E(jP)\right\}^2\right) = E\left(\left\{jP - j * E(P)\right\}^2\right) = E\left(j^2 * \left\{P - E(P)\right\}^2\right) = j^2 * E\left(\left\{P - E(P)\right\}^2\right).$

$But\ V(P) = E\left(\left\{P - E(P)\right\}^2\right) \implies$

$EQUATION\ III:\ V\left(jP\right) = j^2V(P).$

$S\ and\ T\ are\ random\ variables \implies C(S,\ T) = covariance\ of\ S\ and\ T = E\left(\left\{S - E(S)\right\} * \left\{T - E(T)\right\}\right).$

$C(S,\ T) = E\left(\left\{S - E(S)\right\} * \left\{T - E(T)\right\}\right) = E\left(ST - S * E(T) - E(S) * T + \left\{E(S) * E(T)\right\}\right) = E(ST) - 2E(S)E(T) + E(S)E(T) \implies$

$EQUATION\ IV:\ C(S,\ T) = E(ST) - E(S)E(T).$
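Equation IV can be checked the same way on a small joint distribution of (S, T); again, the numbers below are made up purely for illustration:

```python
# Check of Equation IV: C(S, T) = E(ST) - E(S)E(T),
# on an arbitrary small joint distribution (illustrative numbers only).
joint = {(1.0, 0.0): 0.3, (1.0, 2.0): 0.2,
         (4.0, 0.0): 0.1, (4.0, 2.0): 0.4}

def E(f):
    """Expectation of f(s, t) under the joint distribution."""
    return sum(p * f(s, t) for (s, t), p in joint.items())

E_S, E_T = E(lambda s, t: s), E(lambda s, t: t)
cov_def = E(lambda s, t: (s - E_S) * (t - E_T))   # definition
cov_eq4 = E(lambda s, t: s * t) - E_S * E_T       # Equation IV

print(abs(cov_def - cov_eq4) < 1e-12)  # True
```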

$C(jS,\ kT) = E\left(\left\{jS - E(jS)\right\} * \left\{kT - E(kT)\right\}\right) = E\left(\left\{jS - jE(S)\right\} * \left\{kT - kE(T)\right\}\right) = jkE\left(\left\{S - E(S)\right\} * \left\{T - E(T)\right\}\right) \implies$

$EQUATION\ V:\ C(jS,\ kT) = jkC(S,\ T).$

$EQUATION\ VI:\ S\ and\ T\ are\ independent \implies C(S,\ T) = 0 \implies C(jS,\ kT) = 0.$

Basics are out of the way. Now for sum of two random variables.

$EQUATION\ VII:\ E(jS + kT) = jE(S) + kE(T).$

$Q = jS + kT.$

$V(jS + kT) = V(Q) = E\left(Q^2\right) - \left\{E(Q)\right\}^2.$

$E\left(Q^2\right) = E\left(\left\{jS + kT\right\}^2\right) = E\left(j^2S^2 + 2 * jS * kT + k^2T^2\right) = j^2E\left(S^2\right) + 2jkE(ST) + k^2E\left(T^2\right).$

$\left\{E(Q)\right\}^2 = \left\{E(jS + kT)\right\}^2 = \left\{jE\left(S\right) + kE\left(T\right)\right\}^2 = j^2\left\{E(S)\right\}^2 + 2jk\left\{E(S)E(T)\right\} + k^2\left\{E(T)\right\}^2.$

$\therefore V(Q) = E\left(Q^2\right) - \left\{E(Q)\right\}^2 = j^2\left\{E\left(S^2\right) - \left\{E(S)\right\}^2\right\} + 2jk\left\{E(ST) - E(S)E(T)\right\} + k^2\left\{E\left(T^2\right) - \left\{E(T)\right\}^2\right\} =$

$V(jS) + 2C(jS,\ kT) + V(kT) = j^2V(S) + 2jkC(S,\ T) + k^2V(T) \implies$

$EQUATION\ VIII:\ V(jS + kT) = j^2V(S) + k^2V(T) + 2jkC(S,\ T).$
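Here is a corresponding numerical check of Equation VIII on a small joint distribution of (S, T); the coefficients and probabilities are arbitrary illustrative values:

```python
# Check of Equation VIII: V(jS + kT) = j^2 V(S) + k^2 V(T) + 2jk C(S, T),
# on a small joint distribution of (S, T). All numbers are made up.
joint = {  # (s, t): probability
    (1.0, 2.0): 0.25,
    (1.0, 4.0): 0.25,
    (3.0, 2.0): 0.30,
    (3.0, 4.0): 0.20,
}
j, k = 0.6, 1.5

def E(f):
    """Expectation of f(s, t) under the joint distribution."""
    return sum(p * f(s, t) for (s, t), p in joint.items())

E_S, E_T = E(lambda s, t: s), E(lambda s, t: t)
V_S  = E(lambda s, t: (s - E_S) ** 2)
V_T  = E(lambda s, t: (t - E_T) ** 2)
C_ST = E(lambda s, t: (s - E_S) * (t - E_T))

# Left side: variance of Q = jS + kT computed from the definition
E_Q = E(lambda s, t: j * s + k * t)
lhs = E(lambda s, t: (j * s + k * t - E_Q) ** 2)

# Right side: Equation VIII
rhs = j ** 2 * V_S + k ** 2 * V_T + 2 * j * k * C_ST

print(abs(lhs - rhs) < 1e-12)  # True
```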

OK. Almost at our destination. Now for three random variables, X, Y, and Z.

$W = jX + kY.$

$C(jX + kY,\ mZ) = C(W,\ mZ) = E(W * mZ) - E(W)E(mZ) = mE(WZ) - mE(W)E(Z) = mE\left\{(jX + kY)Z\right\} - mE(jX + kY)E(Z) =$

$mE(jXZ + kYZ) - m\left(\left\{E(jX) + E(kY)\right\}E(Z)\right) = mE(jXZ) + mE(kYZ) - mE(jX)E(Z) - mE(kY)E(Z) =$

$jm\{E(XZ) - E(X)E(Z)\} + km\{E(YZ) - E(Y)E(Z)\} \implies$

$EQUATION\ IX:\ C(jX + kY,\ mZ) = jmC(X,\ Z) + kmC(Y,\ Z).$

$EQUATION\ X:\ E(jX + kY + mZ) = jE(X) + kE(Y) + mE(Z).$

$V(jX + kY + mZ) = V(W + mZ) = V(W) + m^2V(Z) + 2C(W,\ mZ) = V(jX + kY) + m^2V(Z) + 2C(jX + kY,\ mZ) =$

$\{j^2V(X) + k^2V(Y) + 2jkC(X,\ Y)\} + m^2V(Z) + 2\{jmC(X,\ Z) + kmC(Y,\ Z)\} \implies$

$EQUATION\ XI:\ V(jX + kY + mZ) = j^2V(X) + k^2V(Y) + m^2V(Z) + 2\{jkC(X,\ Y) + jmC(X,\ Z) + kmC(Y,\ Z)\}.$

Now if m = 1, E(Z) = 0, and Z is independent of X and Y then equations X and XI reduce to

$E(jX + kY + mZ) = jE(X) + kE(Y)$ and

$V(jX + kY + mZ) = j^2V(X) + k^2V(Y) + V(Z) + 2jkC(X,\ Y)$ respectively.

If I have understood your problem correctly, B1 = j, B2 = k, m = 1, r1 = X, r2 = Y, and re = Z. Just substitute your variables.
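To tie this back to the original question, here is a numerical check of Equation 3 under the same assumptions made above (re independent of r1 and r2, with E(re) = 0). The distributions and betas are invented purely for illustration:

```python
# Check of the OP's Equation 3:
# r = B1*r1 + B2*r2 + re, with re independent of (r1, r2) and E(re) = 0.
# All distributions and coefficients below are made-up illustrative numbers.

joint12 = {(0.05, 0.02): 0.4, (0.05, 0.08): 0.1,
           (0.12, 0.02): 0.2, (0.12, 0.08): 0.3}   # joint dist of (r1, r2)
dist_e = {-0.01: 0.5, 0.01: 0.5}                   # re: mean zero
B1, B2 = 0.7, 0.3

def E(f):
    """Expectation of f(r1, r2, re); re is independent of (r1, r2)."""
    return sum(p12 * pe * f(r1, r2, re)
               for (r1, r2), p12 in joint12.items()
               for re, pe in dist_e.items())

m1, m2 = E(lambda r1, r2, re: r1), E(lambda r1, r2, re: r2)
v1  = E(lambda r1, r2, re: (r1 - m1) ** 2)
v2  = E(lambda r1, r2, re: (r2 - m2) ** 2)
c12 = E(lambda r1, r2, re: (r1 - m1) * (r2 - m2))
ve  = E(lambda r1, r2, re: re ** 2)   # E(re) = 0, so var(re) = E(re^2)

# Left side: var(r) computed directly from the definition
mr = E(lambda r1, r2, re: B1 * r1 + B2 * r2 + re)
var_r = E(lambda r1, r2, re: (B1 * r1 + B2 * r2 + re - mr) ** 2)

# Right side: Equation 3
rhs = B1 ** 2 * v1 + B2 ** 2 * v2 + 2 * B1 * B2 * c12 + ve

print(abs(var_r - rhs) < 1e-12)  # True
```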

PS I don't guarantee that I have caught every typo. This is fussy work with LaTeX.