# Optimization problem

• Oct 26th 2010, 02:22 AM
Convexian
Hello

I'm trying to solve some problems in nonlinear optimization from the book "Nonlinear Optimization" by Andrzej Ruszczyński (2006).

It's Problem 3.8, which reads as follows.

The output variable $\displaystyle y \in \mathbb{R}$ of a system depends on the input variable $\displaystyle x \in \mathbb{R}$. We have collected observations $\displaystyle \tilde{y}_i, \, i=1, \dots , N$, for $\displaystyle N$ different settings $\displaystyle x_i$ of the input variable, with $\displaystyle x_1 < x_2 < \dots < x_N$.

(a) We know that the dependence of $\displaystyle y$ on $\displaystyle x$ should be nondecreasing. Formulate the problem of finding new values $\displaystyle y_i$ such that $\displaystyle y_1 \leq y_2 \leq \cdots \leq y_N$ and the sum of the squares of adjustments is minimized.
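Part (a) is the classical isotonic (monotone) least-squares problem. The exercise only asks for the formulation, but for intuition, here is a minimal sketch of the pool-adjacent-violators algorithm (PAVA), which solves exactly this quadratic program; the function name `pava` is mine, not from the book:

```python
import numpy as np

def pava(y_obs):
    """Minimize sum_i (y_i - y_obs_i)^2 subject to y_1 <= ... <= y_N
    via pool-adjacent-violators: merge neighboring blocks whose means
    violate monotonicity, replacing them by their weighted mean."""
    blocks = []  # list of [mean, weight] pairs
    for v in y_obs:
        blocks.append([float(v), 1.0])
        # merge backwards while the order constraint is violated
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            blocks.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2])
    # expand each block back to its individual fitted values
    out = []
    for m, w in blocks:
        out.extend([m] * int(w))
    return np.array(out)
```

For example, `pava([1, 3, 2, 4])` pools the violating pair (3, 2) into their mean, giving the nondecreasing fit `[1, 2.5, 2.5, 4]`.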

(b) We know that the dependence of $\displaystyle y$ on $\displaystyle x$ should be convex and nondecreasing. Formulate the problem of finding new values $\displaystyle y_i$ such that this is satisfied and the sum of the squares of adjustments is minimized.
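For part (b), "convex and nondecreasing" can be expressed through the successive slopes: the differences $(y_{i+1}-y_i)/(x_{i+1}-x_i)$ must be nonnegative (monotone) and themselves nondecreasing (convex). A numerical sketch of the resulting quadratic program, using `scipy.optimize.minimize` with SLSQP (this is my own illustration, not the book's intended paper-and-pencil formulation):

```python
import numpy as np
from scipy.optimize import minimize

def convex_isotonic_fit(x, y_obs):
    """Minimize sum_i (y_i - y_obs_i)^2 subject to
    (1) y_{i+1} >= y_i            (nondecreasing), and
    (2) successive slopes nondecreasing (convexity)."""
    x = np.asarray(x, dtype=float)
    y_obs = np.asarray(y_obs, dtype=float)
    n = len(x)
    cons = []
    for i in range(n - 1):  # monotonicity: y[i+1] - y[i] >= 0
        cons.append({'type': 'ineq',
                     'fun': lambda y, i=i: y[i + 1] - y[i]})
    for i in range(n - 2):  # convexity: slope[i+1] - slope[i] >= 0
        cons.append({'type': 'ineq',
                     'fun': lambda y, i=i:
                         (y[i + 2] - y[i + 1]) / (x[i + 2] - x[i + 1])
                         - (y[i + 1] - y[i]) / (x[i + 1] - x[i])})
    res = minimize(lambda y: np.sum((y - y_obs) ** 2),
                   y_obs, method='SLSQP', constraints=cons)
    return res.x
```

Since both constraint families are linear in $y$ and the objective is a convex quadratic, this is a convex QP and the solver finds the global minimum.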

I'm a bit puzzled about what he means by "the dependence of $\displaystyle y$ on $\displaystyle x$ should be nondecreasing" and "the dependence of $\displaystyle y$ on $\displaystyle x$ should be convex and nondecreasing".

In (a) I have formulated the optimization problem as:
$\displaystyle \min \left( \sum_{i=1}^N (\tilde{y}_i - y_i)^2 \right)$
subject to the constraints
$\displaystyle y_1 \leq y_2 \leq \cdots \leq y_N$
and
$\displaystyle y_1 \geq x_1, \, y_2 \geq x_2, \, \dots, \, y_N \geq x_N$.
But as I wrote, I'm a bit lost as to what he wants me to do.

Can anyone enlighten me?