if $\displaystyle y = ax + n$ where y is a d-dimensional vector, a is a scalar (amplitude rescaling factor), and n is a d-dimensional vector drawn from a zero-mean gaussian, what is the MLE of $\displaystyle a$ given x and y?

the way i see it, this is equivalent to minimizing the sum of squared errors:

$\displaystyle \sum_{i=1}^d(y_i - ax_i)^2$. taking the derivative with respect to $\displaystyle a$ and setting it equal to zero gives $\displaystyle -2\sum_{i=1}^d x_i(y_i - ax_i) = 0$, and solving for $\displaystyle a$ i end up with $\displaystyle a = \frac{x^Ty}{x^Tx}$. for some reason this seems too simple, and intuitively it doesn't make a whole lot of sense. does this look right?
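a quick numerical sanity check of the closed form, using made-up data (the true amplitude, noise scale, and dimension below are arbitrary choices, not from the problem): compare $\displaystyle a = \frac{x^Ty}{x^Tx}$ against numpy's generic least-squares solver.

```python
import numpy as np

rng = np.random.default_rng(0)

# illustrative toy data: y = a_true * x + gaussian noise
d = 1000
a_true = 2.5
x = rng.normal(size=d)
y = a_true * x + rng.normal(scale=0.3, size=d)

# closed-form MLE: a = x^T y / x^T x
a_hat = x @ y / (x @ x)

# generic least-squares fit of the same one-parameter model
a_lstsq = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]

print(a_hat, a_lstsq)
```

the two estimates agree to machine precision and sit close to the true amplitude, so the closed form really is the least-squares (and hence ML) solution.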

edit: n is drawn from a zero-mean gaussian with covariance $\displaystyle vI$ ($v$ is a scalar, $I$ is the identity matrix).

i think it's the same, except you're minimizing the negative log-likelihood, which up to additive constants is $\displaystyle \frac{1}{2v}\sum_{i=1}^d(y_i - ax_i)^2$

edit 2: the factor $\displaystyle \frac{1}{2v}$ is a positive constant, so it doesn't change the minimizer, and $\displaystyle a = \frac{x^Ty}{x^Tx}$ as before.
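to see the cancellation concretely, here's a sketch (data and the choice of $v$ values are arbitrary): minimize $\displaystyle \frac{1}{2v}\sum_i(y_i - ax_i)^2$ over a grid of $a$ values for several $v$ and check that the minimizer never moves.

```python
import numpy as np

rng = np.random.default_rng(1)

# illustrative data (a_true and the noise scale are assumptions)
a_true = 1.7
x = rng.normal(size=200)
y = a_true * x + rng.normal(scale=0.5, size=200)

# closed-form MLE: a = x^T y / x^T x
a_closed = x @ y / (x @ x)

# sum of squared errors as a function of a, expanded so it can be
# evaluated on a whole grid of a values at once:
#   SSE(a) = y^T y - 2a x^T y + a^2 x^T x
grid = np.linspace(0.0, 4.0, 40001)
sse = y @ y - 2.0 * grid * (x @ y) + grid**2 * (x @ x)

# negative log-likelihood (up to constants) is SSE(a) / (2v);
# the grid minimizer is the same for every positive v
minimizers = [grid[np.argmin(sse / (2.0 * v))] for v in (0.1, 1.0, 10.0)]

print(minimizers, a_closed)
```

all three minimizers are the same grid point, matching the closed form: scaling an objective by a positive constant never moves its argmin, which is exactly why the noise variance drops out of the MLE.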