Originally Posted by **dopi**

I need to find an expression in terms of **v** for the minimum value of **(x^t)x** subject to the constraint **(v^t)x = k**.

v is a fixed vector in R^n

k is a real constant

I need to use Lagrange multipliers to solve this.

**My solution**

I let **L(x, lambda) = (x^t)x + lambda*(k - (v^t)x)**

so **grad L = 2x - lambda*v = 0**

so I let **x = (1/2)*lambda*v** and substitute this into the constraint to get

**(1/2)*lambda*(v^t)v = k**

so **lambda = 2k/((v^t)v)**, which gives **x = k*v/((v^t)v)** and minimum value **(x^t)x = k^2/((v^t)v)** in terms of **v**.

To verify this is the min, I differentiated grad L again to get the Hessian **2I**, which is positive definite, so the objective is convex and this stationary point is a global minimizer.
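As a quick numeric sanity check of the derivation above, the sketch below (assuming NumPy; the dimension, random data, and variable names are my own illustration) builds the closed-form point **x = k\*v/((v^t)v)**, confirms it satisfies the constraint, and compares its value of **(x^t)x** against other feasible points:

```python
import numpy as np

# Hypothetical example data: v a fixed vector in R^n, k a real constant.
rng = np.random.default_rng(0)
n = 5
v = rng.standard_normal(n)
k = 3.0

# From grad L = 2x - lambda*v = 0 and the constraint:
# lambda = 2k / (v^T v), so x* = (1/2)*lambda*v = k*v / (v^T v).
lam = 2 * k / (v @ v)
x_star = 0.5 * lam * v

# x* satisfies the constraint v^T x = k.
assert np.isclose(v @ x_star, k)

# Minimum value in terms of v: k^2 / (v^T v).
min_val = k**2 / (v @ v)
assert np.isclose(x_star @ x_star, min_val)

# Any other feasible point has a larger objective: take random points
# and shift them along v so they satisfy the constraint, then compare.
for _ in range(100):
    y = rng.standard_normal(n)
    y_feas = y + ((k - v @ y) / (v @ v)) * v   # now v^T y_feas = k
    assert y_feas @ y_feas >= min_val - 1e-12
```

This is only a spot check on random feasible points, not a proof, but it agrees with the convexity argument that the stationary point is the global minimizer.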

**Problem**

Can someone please check my solution? I was wondering whether to differentiate with respect to **v** as well, since it is a vector, but I wasn't too sure. If someone can give me some advice, that would be great, thanks.