# Thread: Contraction mapping problem in n-dimensional vector-valued function

1. ## Contraction mapping problem in n-dimensional vector-valued function

Let $\displaystyle S$ be a closed subset of $\displaystyle E^n$ which contains the entire line segment between any two of its points and let $\displaystyle f$ be a continuously differentiable map from an open subset of $\displaystyle E^n$ containing $\displaystyle S$ into $\displaystyle E^n$. Suppose that $\displaystyle f(S) \subset S$ and that there is a real number $\displaystyle k<1$ such that

$\displaystyle \sum_{i,j=1}^{n}(\frac{\partial f_i}{\partial x_j})^2 \leq k$

for all $\displaystyle x \in S$. Prove that the restriction of $\displaystyle f$ to $\displaystyle S$ is a contraction map, so that the fixed point theorem is applicable.
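Before diving into the proof, a numerical sanity check can make the claim concrete. The map below is my own made-up illustration (not from the problem): on $\displaystyle S = E^2$ its partial derivatives satisfy the hypothesis with $\displaystyle k = 0.18$, and both the contraction bound and the convergence of the fixed-point iteration can be observed directly.

```python
import math
import random

# Made-up example: f(x, y) = (0.3*cos(y), 0.3*sin(x)) on S = R^2.
# Its Jacobian is [[0, -0.3*sin(y)], [0.3*cos(x), 0]], so the sum of
# squared partials is 0.09*sin(y)**2 + 0.09*cos(x)**2 <= 0.18 = K < 1.
K = 0.18

def f(p):
    x, y = p
    return (0.3 * math.cos(y), 0.3 * math.sin(x))

def dist(p, q):
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

random.seed(0)
# The bound the problem asks for: d(f(a), f(b)) <= sqrt(K) * d(a, b).
for _ in range(1000):
    a = (random.uniform(-5, 5), random.uniform(-5, 5))
    b = (random.uniform(-5, 5), random.uniform(-5, 5))
    assert dist(f(a), f(b)) <= math.sqrt(K) * dist(a, b) + 1e-12

# Banach fixed-point iteration: iterates converge to the unique fixed point.
p = (1.0, 1.0)
for _ in range(100):
    p = f(p)
assert dist(p, f(p)) < 1e-10
```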

...

I have an idea, but I could not carry it through. I need to show that for any $\displaystyle a=(a_1,...,a_n) \in S$ and $\displaystyle b = (b_1,...,b_n) \in S$

$\displaystyle d(f(b), f(a)) \leq kd(a,b)$

where $\displaystyle f(x_1,...,x_n) = (f_1(x_1,...,x_n),...,f_n(x_1,...,x_n)), k<1$

$\displaystyle d(a,b) = \sqrt{(b_1 - a_1)^2 + ... + (b_n - a_n)^2}$

$\displaystyle d(f(b),f(a)) = \sqrt{(f_1(b)-f_1(a))^2 + ...+ (f_n(b) - f_n(a))^2}$

By the Mean Value Theorem for several variables, we have

$\displaystyle \sqrt{(\frac{\partial f_1}{\partial x_1}(c))^2(b_1 - a_1)^2 + ... + (\frac{\partial f_1}{\partial x_n}(c))^2(b_n - a_n)^2 + (\frac{\partial f_2}{\partial x_1}(c))^2(b_1 - a_1)^2 + ... + (\frac{\partial f_2}{\partial x_n}(c))^2(b_n - a_n)^2 + ... + (\frac{\partial f_n}{\partial x_1}(c))^2(b_1 - a_1)^2 + ... + (\frac{\partial f_n}{\partial x_n}(c))^2(b_n - a_n)^2}$

where $\displaystyle c$ is the point on the line segment between $\displaystyle a$ and $\displaystyle b$ which yields

$\displaystyle \sqrt{((\frac{\partial f_1}{\partial x_1}(c))^2 + .. +(\frac{\partial f_n}{\partial x_1}(c))^2)(b_1-a_1)^2 + .. + ((\frac{\partial f_1}{\partial x_n}(c))^2 + .. +(\frac{\partial f_n}{\partial x_n}(c))^2)(b_n-a_n)^2}$

From here, I have no idea how to proceed to get the desired inequality. There is a hint saying that I could use the preceding problem in the book, as follows:

If $\displaystyle f: [a, b] \rightarrow E^n$ is continuously differentiable, its length is $\displaystyle \int_{a}^{b} \sqrt{\left(\frac{df_1(x)}{dx}\right)^2 + ... + \left(\frac{df_n(x)}{dx}\right)^2}\,dx$

but I could not see how to use it. The assumption that $\displaystyle f(S) \subset S$ should be the hint that $\displaystyle d(f(b),f(a)) \leq d(a,b)$, and the fact that $\displaystyle S$ is closed ensures that the Cauchy sequence of iterates converges to a fixed point in $\displaystyle S$.

The Schwarz inequality seems to be the way to go, but I could not utilize it.

$\displaystyle |x_1y_1 +x_2y_2 + ... +x_ny_n| \leq \sqrt{x_{1}^{2} +x_{2}^{2} +...+x_{n}^{2}} \sqrt{y_{1}^{2} + y_{2}^{2} + ... + y_{n}^{2}}$ so that I could somehow use $\displaystyle \sum_{i,j=1}^{n}(\frac{\partial f_i}{\partial x_j})^2 \leq k$ to relate that

$\displaystyle d(f(b),f(a)) \leq (\sum_{i,j=1}^{n}(\frac{\partial f_i}{\partial x_j})^2) \sqrt{(b_1 - a_1)^2 + ... + (b_n - a_n)^2} \leq kd(b,a)$
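As a quick sanity check, the Schwarz inequality quoted above is easy to verify numerically on random vectors (made-up data, standard library only):

```python
import math
import random

# Numerical check of the Schwarz inequality:
# |x_1*y_1 + ... + x_n*y_n| <= ||x|| * ||y||.
def norm(v):
    return math.sqrt(sum(vi ** 2 for vi in v))

random.seed(1)
for _ in range(1000):
    n = random.randint(1, 8)
    x = [random.uniform(-10, 10) for _ in range(n)]
    y = [random.uniform(-10, 10) for _ in range(n)]
    dot = sum(xi * yi for xi, yi in zip(x, y))
    assert abs(dot) <= norm(x) * norm(y) + 1e-9
```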

Does anyone have an idea on how to proceed? Thanks.

EDITED: FORGET WHAT I WROTE ABOVE. I was careless. The Mean Value Theorem cannot be applied that way. I wrongly wrote $\displaystyle (f_i(b) - f_i(a))^2 = (\frac{\partial f_i}{\partial x_1}(c))^2(b_1 - a_1)^2 + ...+(\frac{\partial f_i}{\partial x_n}(c))^2(b_n - a_n)^2$, which is not correct. In fact $\displaystyle (f_i(b) - f_i(a))^2 = (\frac{\partial f_i}{\partial x_1}(c))^2(b_1 - a_1)^2 + ...+(\frac{\partial f_i}{\partial x_n}(c))^2(b_n - a_n)^2 + 2 \sum_{j<k} \frac{\partial f_i}{\partial x_j}(c)\frac{\partial f_i}{\partial x_k}(c)(b_j-a_j)(b_k-a_k)$
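The correction in the EDIT can be checked concretely. For a linear map with made-up coefficients (chosen only for illustration) the partials are constants, so the mean-value expansion is exact: the square of the increment equals the squared terms plus the cross terms, and generally not the squared terms alone.

```python
# Made-up constant coefficients playing the role of the partials df_i/dx_j
# of a linear f_i(x) = sum_j m_j * x_j, where the MVT expansion is exact.
m = [1.0, -0.5, 2.0, 0.25]
a = [0.0, 0.0, 0.0, 0.0]
b = [1.0, 1.0, 1.0, 1.0]
n = len(m)

fi = lambda x: sum(mj * xj for mj, xj in zip(m, x))   # f_i(x) = sum_j m_j x_j
d = [bj - aj for aj, bj in zip(a, b)]                 # differences b_j - a_j

squares = sum((m[j] * d[j]) ** 2 for j in range(n))
cross = 2 * sum(m[j] * m[k] * d[j] * d[k]
                for j in range(n) for k in range(j + 1, n))

lhs = (fi(b) - fi(a)) ** 2
assert abs(lhs - (squares + cross)) < 1e-12   # full expansion matches
assert abs(lhs - squares) > 1e-6              # squared terms alone do not
```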

2. THIS IS WRONG.

I have a further idea, but I am not sure whether it is correct.

From $\displaystyle \sqrt{((\frac{\partial f_1}{\partial x_1}(c))^2 + .. +(\frac{\partial f_n}{\partial x_1}(c))^2)(b_1-a_1)^2 + .. + ((\frac{\partial f_1}{\partial x_n}(c))^2 + .. +(\frac{\partial f_n}{\partial x_n}(c))^2)(b_n-a_n)^2}$, since

$\displaystyle \sum_{i,j=1}^{n}(\frac{\partial f_i}{\partial x_j}(c))^2 \leq k$,

it should be clear that $\displaystyle (\frac{\partial f_1}{\partial x_j}(c))^2 + .. +(\frac{\partial f_n}{\partial x_j}(c))^2 = q_j$ for any $\displaystyle j = 1,...,n$, where $\displaystyle q_j \geq 0$ and $\displaystyle q_1 + ... + q_n \leq k$

Hence I could write

$\displaystyle \sqrt{((\frac{\partial f_1}{\partial x_1}(c))^2 + .. +(\frac{\partial f_n}{\partial x_1}(c))^2)(b_1-a_1)^2 + .. + ((\frac{\partial f_1}{\partial x_n}(c))^2 + .. +(\frac{\partial f_n}{\partial x_n}(c))^2)(b_n-a_n)^2} = \sqrt{q_1(b_1-a_1)^2 + ... + q_n(b_n-a_n)^2}$

I think there should be some way to reason from the above that

$\displaystyle \sqrt{q_1(b_1-a_1)^2 + ... + q_n(b_n-a_n)^2} \leq \sqrt{k((b_1-a_1)^2 + ...+ (b_n - a_n)^2)}$
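Whatever the status of the mean-value step above, this last inequality does hold on its own: each $\displaystyle q_j$ is nonnegative and $\displaystyle q_j \leq q_1 + ... + q_n \leq k$, so $\displaystyle q_j(b_j-a_j)^2 \leq k(b_j-a_j)^2$ term by term. A quick check with made-up numbers:

```python
import math

# Made-up values: nonnegative q_j with q_1 + ... + q_n <= k, and an
# arbitrary difference vector d_j = b_j - a_j.  Since each q_j <= k,
# the weighted sum is bounded termwise: sum q_j*d_j**2 <= k*sum d_j**2.
k = 0.9
q = [0.4, 0.1, 0.3, 0.05]    # sums to 0.85 <= k
d = [1.5, -2.0, 0.5, 3.0]    # plays b_j - a_j

lhs = math.sqrt(sum(qj * dj ** 2 for qj, dj in zip(q, d)))
rhs = math.sqrt(k * sum(dj ** 2 for dj in d))
assert all(0 <= qj <= k for qj in q)
assert lhs <= rhs
```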

? ? ?

3. Sorry for double post. The system was delayed so I thought my message did not show up.

4. Write down the derivative at a point abstractly (i.e. the nxn matrix of partial derivatives), and consider how you would compute a directional derivative at that point (multiply by a norm-1 vector). What can you say about the norm of the directional derivative in any direction at any point?

5. Originally Posted by Tinyboss
Write down the derivative at a point abstractly (i.e. the nxn matrix of partial derivatives), and consider how you would compute a directional derivative at that point (multiply by a norm-1 vector). What can you say about the norm of the directional derivative in any direction at any point?
The norm of the directional derivative in the coordinate direction $\displaystyle e_j$ is

$\displaystyle \sqrt{\sum_{i=1}^{n}(\frac{\partial f_i(x)}{\partial x_j})^2}$

Then what should I do next?

(An explanation that avoids linear algebra would be helpful. I have not studied linear algebra or differential equations, so those subjects are arcane to me.)

Thank you very much.

6. You were right about the Schwarz inequality being the key here. For a matrix $\displaystyle A=(a_{ij})$, denote the operator norm by $\displaystyle \| A\|_{op} = \sup_{ \| x\|_2\leq 1 } \{ \| Ax\|_2 \}$ (where $\displaystyle \| \cdot \|_2$ is the usual Euclidean norm, i.e. the one you're using) and the Frobenius norm by $\displaystyle \| A\| _F =\left( \sum_{i,j} a_{ij}^2 \right)^{\frac{1}{2}}$.

Since $\displaystyle S$ is convex, the mean value theorem gives the bound $\displaystyle \| f(x)-f(y) \|_2 \leq \sup_{z\in [x,y]} \{ \| Df(z) \|_{op} \} \| x-y\|_2$ for any $\displaystyle x,y\in S$, where $\displaystyle [x,y]$ is the segment joining the two points. So it suffices to prove that $\displaystyle \| A \|_{op} \leq \| A\| _F$.

Let $\displaystyle z=\sum_{k=1}^n b_ke_k$ with $\displaystyle \| z\| _2 \leq 1$; then $\displaystyle \| Az \| _2 = \left( \sum_{j=1}^n \left( \sum_{i=1}^n b_ia_{ji} \right) ^2 \right) ^{\frac{1}{2}}$. Applying the Schwarz inequality to the inner sum gives $\displaystyle \left( \sum_{i=1}^n b_ia_{ji} \right) ^2 \leq \| z\| _2^2 \left( \sum_{i=1}^n a_{ji}^2 \right) \leq \sum_{i=1}^n a_{ji}^2$. Substituting into the previous equality gives $\displaystyle \| Az \| _2 \leq \left( \sum_{i,j} a_{ji}^2 \right) ^{\frac{1}{2}} = \| A\|_F$, and since the right side does not depend on $\displaystyle z$, we are done.
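A numerical sanity check of the key lemma in the answer above, namely that the operator norm is bounded by the Frobenius norm. This sketch assumes NumPy is available; `np.linalg.norm(A, 2)` returns the largest singular value (the operator norm) and `np.linalg.norm(A, 'fro')` the Frobenius norm.

```python
import numpy as np

# Check ||A||_op <= ||A||_F on random matrices of varying size.
rng = np.random.default_rng(0)
for _ in range(200):
    n = int(rng.integers(1, 7))
    A = rng.normal(size=(n, n))
    op = np.linalg.norm(A, 2)        # operator norm = largest singular value
    fro = np.linalg.norm(A, 'fro')   # Frobenius norm = sqrt(sum of squares)
    assert op <= fro + 1e-12
```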