
Math Help - Contraction mapping problem in n-dimensional vector-valued function

  1. #1 Junior Member (joined Apr 2009, 51 posts)

    Contraction mapping problem in n-dimensional vector-valued function

    Let S be a closed subset of E^n which contains the entire line segment between any two of its points and let f be a continuously differentiable map from an open subset of E^n containing S into E^n. Suppose that f(S) \subset S and that there is a real number k<1 such that

    \sum_{i,j=1}^{n}(\frac{\partial f_i}{\partial x_j})^2 \leq k

    for all x \in S. Prove that the restriction of f to S is a contraction map, so that the fixed point theorem is applicable.

    ...


    I have an idea, but I could not carry it through. I need to show that for any a=(a_1,...,a_n) \in S and b = (b_1,...,b_n) \in S

    d(f(b), f(a)) \leq kd(a,b)

    where f(x_1,...,x_n) = (f_1(x_1,...,x_n),...,f_n(x_1,...,x_n)), k<1

    d(a,b) = \sqrt{(b_1 - a_1)^2 + ... + (b_n - a_n)^2}



    d(f(b), f(a)) = \sqrt{(f_1(b)-f_1(a))^2 + ... + (f_n(b) - f_n(a))^2}

    By the Mean Value Theorem for several variables, we have

    \sqrt{(\frac{\partial f_1}{\partial x_1}(c))^2(b_1 - a_1)^2 + ... + (\frac{\partial f_1}{\partial x_n}(c))^2(b_n - a_n)^2 + (\frac{\partial f_2}{\partial x_1}(c))^2(b_1 - a_1)^2 + ... + (\frac{\partial f_2}{\partial x_n}(c))^2(b_n - a_n)^2 + ... + (\frac{\partial f_n}{\partial x_1}(c))^2(b_1 - a_1)^2 + ... + (\frac{\partial f_n}{\partial x_n}(c))^2(b_n - a_n)^2}

    where c is the point on the line segment between a and b which yields

    \sqrt{((\frac{\partial f_1}{\partial x_1}(c))^2 + .. +(\frac{\partial f_n}{\partial x_1}(c))^2)(b_1-a_1)^2 + .. + ((\frac{\partial f_1}{\partial x_n}(c))^2 + .. +(\frac{\partial f_n}{\partial x_n}(c))^2)(b_n-a_n)^2}

    From here, I have no idea how to proceed to get the desired inequality. There is a hint saying that I could use the preceding problem in the book, as follows:

    If f: [a, b] \rightarrow E^n, its length is \int_{a}^{b} \sqrt{(\frac{df_1(x)}{dx})^2 + ... + (\frac{df_n(x)}{dx})^2}\, dx

    but I could not see how to use it. The assumption that f(S) \subset S should be the hint that d(f(b),f(a)) \leq d(a,b), and the fact that S is closed ensures that the Cauchy sequence of iterates converges to a fixed point in S.

    The Schwarz inequality seems to be the way to go, but I could not utilize it.

    |x_1y_1 + x_2y_2 + ... + x_ny_n| \leq \sqrt{x_{1}^{2} + x_{2}^{2} + ... + x_{n}^{2}} \sqrt{y_{1}^{2} + y_{2}^{2} + ... + y_{n}^{2}}, so that I could somehow use \sum_{i,j=1}^{n}(\frac{\partial f_i}{\partial x_j})^2 \leq k to conclude that

    d(f(b), f(a)) \leq (\sum_{i,j=1}^{n}(\frac{\partial f_i}{\partial x_j})^2) \sqrt{(b_1 - a_1)^2 + ... + (b_n - a_n)^2} \leq kd(b,a)
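As a stdlib-only sanity check of the Schwarz inequality quoted above (the vectors x and y here are arbitrary random stand-ins, not anything from the problem):

```python
import math
import random

random.seed(1)
n = 6
x = [random.uniform(-2, 2) for _ in range(n)]
y = [random.uniform(-2, 2) for _ in range(n)]

# |x.y| on the left, product of Euclidean norms on the right
lhs = abs(sum(xi * yi for xi, yi in zip(x, y)))
rhs = math.sqrt(sum(xi * xi for xi in x)) * math.sqrt(sum(yi * yi for yi in y))
print(lhs <= rhs)  # True: the Schwarz inequality
```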


    Does anyone have an idea on how to proceed? Thanks.

    EDITED: FORGET WHAT I WROTE ABOVE. I was careless. The Mean Value Theorem cannot be applied that way. I wrongly wrote (f_i(b) - f_i(a))^2 = (\frac{\partial f_i}{\partial x_1}(c))^2(b_1 - a_1)^2 + ... + (\frac{\partial f_i}{\partial x_n}(c))^2(b_n - a_n)^2, which is not correct. In fact, (f_i(b) - f_i(a))^2 = (\frac{\partial f_i}{\partial x_1}(c))^2(b_1 - a_1)^2 + ... + (\frac{\partial f_i}{\partial x_n}(c))^2(b_n - a_n)^2 + 2 \sum_{j<k} \frac{\partial f_i}{\partial x_j}(c)\frac{\partial f_i}{\partial x_k}(c)(b_j-a_j)(b_k-a_k)
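The corrected expansion is just the algebraic identity (\sum_j x_j)^2 = \sum_j x_j^2 + 2\sum_{j<k} x_j x_k applied to x_j = \frac{\partial f_i}{\partial x_j}(c)(b_j - a_j). A quick stdlib check, with random values standing in for the partials and the differences b_j - a_j:

```python
import random

random.seed(0)
n = 5
# hypothetical stand-ins for (∂f_i/∂x_j)(c) and (b_j - a_j)
c = [random.uniform(-1, 1) for _ in range(n)]
d = [random.uniform(-1, 1) for _ in range(n)]

lhs = sum(c[j] * d[j] for j in range(n)) ** 2
rhs = (sum((c[j] * d[j]) ** 2 for j in range(n))
       + 2 * sum(c[j] * d[j] * c[k] * d[k]
                 for j in range(n) for k in range(j + 1, n)))
print(abs(lhs - rhs) < 1e-12)  # True: the cross terms make the identity exact
```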
    Last edited by armeros; March 14th 2011 at 07:27 PM.

  2. #2 Junior Member (joined Apr 2009, 51 posts)
    THIS IS WRONG.


    I have a further idea, but I am not sure whether it is correct.

    From \sqrt{((\frac{\partial f_1}{\partial x_1}(c))^2 + .. +(\frac{\partial f_n}{\partial x_1}(c))^2)(b_1-a_1)^2 + .. + ((\frac{\partial f_1}{\partial x_n}(c))^2 + .. +(\frac{\partial f_n}{\partial x_n}(c))^2)(b_n-a_n)^2}, write

    q_j = (\frac{\partial f_1}{\partial x_j}(c))^2 + .. + (\frac{\partial f_n}{\partial x_j}(c))^2 for each j = 1,...,n.

    Since \sum_{i,j=1}^{n}(\frac{\partial f_i}{\partial x_j})^2 \leq k, it should be clear that q_1 + ... + q_n \leq k.

    Hence I could write


    \sqrt{((\frac{\partial f_1}{\partial x_1}(c))^2 + .. +(\frac{\partial f_n}{\partial x_1}(c))^2)(b_1-a_1)^2 + .. + ((\frac{\partial f_1}{\partial x_n}(c))^2 + .. +(\frac{\partial f_n}{\partial x_n}(c))^2)(b_n-a_n)^2} = \sqrt{q_1(b_1-a_1)^2 + ... + q_n(b_n-a_n)^2}

    I think there should have to be some way to reason from the above that

    \sqrt{q_1(b_1-a_1)^2 + ... + q_n(b_n-a_n)^2} \leq \sqrt{k((b_1-a_1)^2 + ... + (b_n - a_n)^2)}

    ? ? ?
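For what it's worth, that last step does go through: each q_j is nonnegative and q_1 + ... + q_n \leq k, so q_j \leq k for every j, giving q_j(b_j-a_j)^2 \leq k(b_j-a_j)^2 term by term. A quick stdlib check with hypothetical q_j (rescaled so their sum is exactly k) and random differences standing in for b_j - a_j:

```python
import math
import random

random.seed(2)
n, k = 5, 0.9
# hypothetical nonnegative q_j, rescaled so q_1 + ... + q_n = k
q = [random.uniform(0, 1) for _ in range(n)]
s = sum(q)
q = [qi * k / s for qi in q]
d = [random.uniform(-1, 1) for _ in range(n)]  # stands in for b_j - a_j

lhs = math.sqrt(sum(qi * di * di for qi, di in zip(q, d)))
rhs = math.sqrt(k * sum(di * di for di in d))
print(lhs <= rhs)  # True: q_j <= k termwise forces the bound
```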
    Last edited by armeros; March 15th 2011 at 02:07 AM.

  3. #3 Junior Member (joined Apr 2009, 51 posts)
    Sorry for double post. The system was delayed so I thought my message did not show up.

  4. #4 Tinyboss, Senior Member (joined Jul 2008, 433 posts)
    Write down the derivative at a point abstractly (i.e. the n×n matrix of partial derivatives), and consider how you would compute a directional derivative at that point (multiply by a norm-1 vector). What can you say about the norm of the directional derivative in any direction at any point?
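This hint can be checked numerically: with a hypothetical n×n matrix a standing in for the Jacobian, the norm of the directional derivative a·u, for any norm-1 direction u, never exceeds the square root of the sum of all squared entries (which the problem bounds by k < 1):

```python
import math
import random

random.seed(3)
n = 4
# hypothetical Jacobian entries a[i][j], standing in for (∂f_i/∂x_j)(x)
a = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]

u = [random.uniform(-1, 1) for _ in range(n)]
norm_u = math.sqrt(sum(ui * ui for ui in u))
u = [ui / norm_u for ui in u]  # normalize to a norm-1 direction

av = [sum(a[i][j] * u[j] for j in range(n)) for i in range(n)]
dir_norm = math.sqrt(sum(v * v for v in av))
frob = math.sqrt(sum(a[i][j] ** 2 for i in range(n) for j in range(n)))
print(dir_norm <= frob)  # True: directional derivative bounded by root-sum-of-squares
```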

  5. #5 Junior Member (joined Apr 2009, 51 posts)
    Quote originally posted by Tinyboss:
    Write down the derivative at a point abstractly (i.e. the n×n matrix of partial derivatives), and consider how you would compute a directional derivative at that point (multiply by a norm-1 vector). What can you say about the norm of the directional derivative in any direction at any point?
    The norm of the directional derivative (taking the direction to be the j-th coordinate vector e_j) is

    \sqrt{\sum_{i=1}^{n}(\frac{\partial f_i(x)}{\partial x_j})^2}

    What should I do next?


    (It would be helpful if anyone could explain it without mentioning linear algebra. I have not studied linear algebra or differential equations, so those subjects are arcane to me.)

    Thank you very much.
    Last edited by armeros; March 15th 2011 at 02:15 AM.

  6. #6
    Super Member
    Joined
    Apr 2009
    From
    México
    Posts
    721
    You were right about the Schwarz inequality being the key here. For a matrix A = (a_{ij}), denote

    \| A \|_{op} = \sup_{\| x \|_2 \leq 1} \{ \| Ax \|_2 \}

    (where \| . \|_2 is the usual Euclidean norm, i.e. the one you're using) and

    \| A \|_F = \left( \sum_{i,j} a_{ij}^2 \right)^{\frac{1}{2}}.

    The mean value theorem gives the bound

    \| f(x) - f(y) \|_2 \leq \sup_{z \in [x,y]} \{ \| Df(z) \|_{op} \} \| x - y \|_2

    for any x, y \in S, taking [x,y] to be the segment joining both points (which lies in S since S is convex). So it suffices to prove that \| A \|_{op} \leq \| A \|_F.

    Let z = \sum_{k=1}^n b_k e_k with \| z \|_2 \leq 1. Then

    \| Az \|_2 = \left( \sum_{j=1}^n \left( \sum_{i=1}^n b_i a_{ji} \right)^2 \right)^{\frac{1}{2}}.

    Now applying the Schwarz inequality to the inner sum we get

    \left( \sum_{i=1}^n b_i a_{ji} \right)^2 \leq \| z \|_2^2 \left( \sum_{i=1}^n a_{ji}^2 \right) \leq \sum_{i=1}^n a_{ji}^2.

    Substituting into the previous equality gives \| Az \|_2 \leq \left( \sum_{i,j} a_{ji}^2 \right)^{\frac{1}{2}} = \| A \|_F, and since the right side does not depend on z, we are done.
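A rough numerical illustration of the key inequality \| A \|_{op} \leq \| A \|_F, using a random matrix and estimating the operator norm by sampling unit vectors (stdlib only; the sampled maximum can only undershoot the true supremum, so the comparison is still valid):

```python
import math
import random

random.seed(4)
n = 5
a = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
frob = math.sqrt(sum(a[i][j] ** 2 for i in range(n) for j in range(n)))

worst = 0.0
for _ in range(2000):  # crude sampling of unit vectors z
    z = [random.gauss(0, 1) for _ in range(n)]
    nz = math.sqrt(sum(zi * zi for zi in z))
    z = [zi / nz for zi in z]
    az = [sum(a[i][j] * z[j] for j in range(n)) for i in range(n)]
    worst = max(worst, math.sqrt(sum(v * v for v in az)))

print(worst <= frob)  # True: ||A||_op <= ||A||_F
```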
