Linear least squares with constraints

Hi,

I'm not completely familiar with these sorts of problems, so please bear with me.

Given a homogeneous system $\displaystyle Mw=0$

where $\displaystyle M$ is an $\displaystyle n \times n$ symmetric matrix:

From what I understand, the eigenvectors of $\displaystyle M$ associated with its smallest eigenvalues give the best-fitting (least-squares) solutions to the system above. However, I require a solution that is positive and constrained to a range.

Assuming that you have a vector of eigenvalues $\displaystyle d$ in descending order, with the corresponding eigenvectors as the columns of a matrix $\displaystyle E$, select an index $\displaystyle p$ such that $\displaystyle d_p > d_{p+1} = \ldots = d_n$. The fitting problem is then:
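To make sure I have the setup right, here is how I'm computing $d$, $E$ and $p$ in numpy (the matrix $M$ below is just a toy example I picked because its eigenstructure is easy to see; it is not from the paper):

```python
import numpy as np

# Toy symmetric M: the centering matrix, whose null space is spanned by the
# all-ones vector (chosen only so the eigenstructure is obvious).
n = 4
M = np.eye(n) - np.ones((n, n)) / n

# eigh returns eigenvalues in ascending order; flip to get descending d
d, E = np.linalg.eigh(M)
d, E = d[::-1], E[:, ::-1]

# Pick p so that d_p > d_{p+1} = ... = d_n.
# Here d = [1, 1, 1, 0], so (0-based) p = 3 and the trailing block is E[:, p:].
p = int(np.argmax(np.isclose(d, d[-1])))
```

(I'm using 0-based indexing in the code, so the trailing eigenvectors $E_{p+1},\ldots,E_n$ of the paper correspond to `E[:, p:]` here.)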

$\displaystyle \min_\beta||w-w_1||_2^2$

subject to $\displaystyle 0 < lb \leq w_i \leq ub$ for all $\displaystyle i$

where $\displaystyle w_1$ is a vector with all elements equal to 1

and $\displaystyle w=\sum_{i=p+1}^n\beta_iE_i$, where $\displaystyle E_i$ is the $\displaystyle i$-th column of $\displaystyle E$.

The way I'm interpreting the minimisation above is: minimise the square of the Euclidean norm of $\displaystyle w - w_1$, subject to every element of the solution $\displaystyle w$ lying in the range $\displaystyle [lb, ub]$. What's the significance of the subscript $\displaystyle \beta$ on the $\displaystyle \min$? I don't believe $\displaystyle \beta$ itself is constrained in any way.

How can I solve this?
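Here is my attempt so far, in case it clarifies what I'm after. Since the bounds are on $w = E\beta$ rather than on $\beta$ directly, I've treated them as linear constraints $lb \leq (B\beta)_i \leq ub$ and handed the problem to scipy's SLSQP (the matrix, `lb` and `ub` are placeholders I made up; the real $M$ would come from the fitting problem). Am I on the right track?

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

# Toy symmetric M whose null space is spanned by the all-ones vector,
# so the constrained problem below is guaranteed to be feasible.
n = 4
M = np.eye(n) - np.ones((n, n)) / n

d, E = np.linalg.eigh(M)            # ascending eigenvalues
d, E = d[::-1], E[:, ::-1]          # reorder to descending
p = int(np.argmax(np.isclose(d, d[-1])))
B = E[:, p:]                        # trailing eigenvectors (the sum over i = p+1..n)

ones = np.ones(n)
lb, ub = 0.5, 1.5                   # placeholder bounds

# minimise ||B beta - 1||^2 over beta, subject to lb <= (B beta)_i <= ub
def obj(beta):
    r = B @ beta - ones
    return r @ r

def grad(beta):
    return 2.0 * B.T @ (B @ beta - ones)

res = minimize(obj, np.zeros(B.shape[1]), jac=grad,
               constraints=[LinearConstraint(B, lb, ub)], method="SLSQP")
w = B @ res.x
print(w)  # for this toy M it should land on (or very near) the all-ones vector
```

For this toy matrix the trailing eigenspace actually contains the all-ones vector, so the minimiser recovers $w_1$ exactly; in general I'd expect the solver to return the closest feasible $w$ in the span of the trailing eigenvectors.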

Thanks.

P.S. In case anyone is wondering, these equations are from W. Ma and J.-P. Kruth, "NURBS curve and surface fitting for reverse engineering".