
- March 19th 2009, 03:11 PM by Nusc: Unbounded Operators
L_+ = ker(A* - i) = [ran(A + i)]^⊥

L_- = ker(A* + i) = [ran(A - i)]^⊥

K_+ = {f ⊕ if : f ∈ L_+}

K_- = {g ⊕ (-ig) : g ∈ L_-}

Show that gra A, K_+ and K_- are pairwise orthogonal.

How do I do this? Please help.

- March 20th 2009, 09:15 AM by Opalg
This is a problem about extensions of a symmetric unbounded operator A on some Hilbert space H, and it's not hard provided that you understand all the definitions.

For a start, A has a domain D(A) (which is a dense subspace of H). It has an adjoint A* with domain D(A*), such that ⟨Af, g⟩ = ⟨f, A*g⟩ for all f in D(A) and g in D(A*). To say that A is symmetric means that D(A) ⊆ D(A*) and A*f = Af for all f in D(A).

The notation gra(A) means the graph of A, which is the set {f ⊕ Af : f ∈ D(A)} ⊆ H ⊕ H.

The spaces L_± are defined by L_+ = ker(A* - i) and L_- = ker(A* + i). The kernel of an adjoint operator is always the orthogonal complement of the range of the operator, so L_+ = [ran(A + i)]^⊥ (because A* - i is the adjoint of A + i), and similarly L_- = [ran(A - i)]^⊥.
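The kernel/range fact is a one-line check; with T = A + i (so that T* = A* - i) it reads as follows, written out for completeness:

```latex
\begin{aligned}
g \in \ker T^* &\implies \langle Tf, g\rangle = \langle f, T^*g\rangle = 0
  \ \ \forall f \in D(T), \text{ so } g \perp \operatorname{ran} T;\\
g \perp \operatorname{ran} T &\implies \langle Tf, g\rangle = 0 = \langle f, 0\rangle
  \ \ \forall f \in D(T), \text{ so } g \in D(T^*) \text{ and } T^*g = 0.
\end{aligned}
```

Taking T = A + i gives L_+ = ker(A* - i) = [ran(A + i)]^⊥, and T = A - i gives the statement for L_-.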

The spaces K_+ and K_- are by definition contained in the graph of A*. To see that they are orthogonal to the graph of A, let h ⊕ Ah ∈ gra(A) and f ⊕ if ∈ K_+. Then

⟨h ⊕ Ah, f ⊕ if⟩ = ⟨h, f⟩ + ⟨Ah, if⟩ (definition of inner product in the direct sum)

= ⟨h, f⟩ - i⟨Ah, f⟩ (conjugate-linearity of the inner product in its second variable)

= ⟨h, f⟩ - i⟨h, A*f⟩ = ⟨h, f⟩ - i⟨h, if⟩ = ⟨h, f⟩ - ⟨h, f⟩ = 0 (because A*f = if, f being in L_+ = ker(A* - i)).

A similar argument shows that gra(A) and K_- are orthogonal.

The fact that K_+ and K_- are orthogonal is more or less immediate, because if f ⊕ if ∈ K_+ and g ⊕ (-ig) ∈ K_- then

⟨f ⊕ if, g ⊕ (-ig)⟩ = ⟨f, g⟩ + ⟨if, -ig⟩ = ⟨f, g⟩ + i·i·⟨f, g⟩ = ⟨f, g⟩ - ⟨f, g⟩ = 0.

- March 20th 2009, 06:40 PM by Nusc
I see, thank you.

I'm not sure why it's clear that gra(A) ⊕ K_+ ⊕ K_- ⊆ gra(A*).

Then how would I show that this direct sum is dense in gra(A*)?

Thanks for your concern.

- March 21st 2009, 01:51 PM by Opalg
gra(A) is contained in gra(A*) because A is symmetric. (If you're unsure about that, go back and check the definition of a symmetric operator.)

K_+ and K_- are contained in gra(A*) by the way in which they are defined. (If you're unsure about that, go back and check the definitions of K_+ and K_-.)

The sum is a direct one because we have seen that these three spaces are all orthogonal to each other.
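As for why this orthogonal direct sum actually fills up gra(A*): a sketch of the standard argument (the one behind Conway's Lemma 2.13) runs as follows.

```latex
% Suppose h \oplus A^*h \in \operatorname{gra}(A^*) is orthogonal
% to gra(A), K_+ and K_-.
%
% Orthogonality to gra(A): for every f \in D(A),
%   0 = \langle f, h\rangle + \langle Af, A^*h\rangle,
% i.e. \langle Af, A^*h\rangle = \langle f, -h\rangle,
% so A^*h \in D(A^*) and A^*(A^*h) = -h. Hence
(A^* \mp i)(A^* \pm i)h = (A^{*2} + 1)h = 0,
% so (A^*+i)h \in \ker(A^*-i) = L_+ and (A^*-i)h \in \ker(A^*+i) = L_-.
%
% Orthogonality to K_+: for every g \in L_+,
%   0 = \langle h, g\rangle - i\langle A^*h, g\rangle
%     = -i\,\langle (A^*+i)h, g\rangle,
% so (A^*+i)h \perp L_+; being also an element of L_+, it is 0.
% Similarly (A^*-i)h = 0. Subtracting, 2ih = 0, so h = 0.
```

So the orthogonal complement of gra(A) ⊕ K_+ ⊕ K_- inside gra(A*) is zero, which is the density statement asked about above.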

This is proved in Lemma 2.13 of Conway's *Course in Functional Analysis*, Chapter X. The idea is that if gra(A) ⊕ K_+ ⊕ K_- is not the whole of gra(A*) then there will be a nonzero vector in its orthogonal complement. An element of gra(A*) is of the form h ⊕ A*h for some h in the domain of A*. So you take a vector h in D(A*) such that h ⊕ A*h is in that orthogonal complement, and you need to show that h = 0. Go slowly and carefully through the proof of Lemma 2.13, and you'll see how it works.

- March 21st 2009, 05:32 PM by Nusc
Example 2.21:

Let A be as in Example 1.11 (Af = if' on the absolutely continuous functions f with f' ∈ L²(0,1) and f(0) = f(1) = 0); so A is symmetric (how does this imply that?). The operator B of Example 1.12 (Bf = if' on the absolutely continuous functions with f' ∈ L²(0,1) and f(0) = f(1)) is a self-adjoint extension of A. Let us determine all self-adjoint extensions of A. To do this it is necessary to determine L_+ and L_-. Now f ∈ L_+ if and only if f ∈ D(A*) and if' = if, so f' = f. Hence L_+ = ℂe^x, and likewise L_- = ℂe^{-x}. Also, the isomorphisms of L_+ onto L_- are all of the form e^x ↦ λe^{1-x} where |λ| = 1.
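The determination of L_± and the form of the isometries can be verified directly; the norm computation explains why the image vector is e^{1-x} rather than e^{-x}:

```latex
\begin{aligned}
f \in L_+ = \ker(A^* - i) &\iff if' = if \iff f' = f \iff f(x) = ce^{x},\\
f \in L_- = \ker(A^* + i) &\iff if' = -if \iff f' = -f \iff f(x) = ce^{-x},\\
\|e^{x}\|^2 = \int_0^1 e^{2x}\,dx &= \tfrac{e^2 - 1}{2}
  = \int_0^1 e^{2(1-x)}\,dx = \|e^{1-x}\|^2.
\end{aligned}
```

So L_+ and L_- are both one-dimensional, and an isometry of L_+ onto L_- must send the unit vector in the direction of e^x to a unit vector in the direction of e^{-x}, i.e. e^x ↦ λe^{1-x} with |λ| = 1.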

If |λ| = 1, let A_λ be the extension of A with domain D(A_λ) = {f + c(e^x + λe^{1-x}) : f ∈ D(A), c ∈ ℂ}.

A_λ(f + c(e^x + λe^{1-x})) = if' + ice^x - icλe^{1-x},

if f ∈ D(A); that is, A_λ is the restriction of A* to D(A_λ).

What is the significance of this example? What was Conway trying to illustrate?

- March 22nd 2009, 12:53 PM by Opalg
To say that A is symmetric means that ⟨Af, g⟩ = ⟨f, Ag⟩ for all functions f, g in the domain of A (which should consist of all differentiable functions on [0,1] whose derivatives are in L²(0,1) and which vanish at 0 and 1). The operator A is defined by Af = if'.

The inner product ⟨Af, g⟩ is given by ∫₀¹ if'(x)·conj(g(x)) dx. To see that this is equal to ⟨f, Ag⟩ = ∫₀¹ f(x)·conj(ig'(x)) dx, you need to integrate by parts. I'll leave you to do that. Study the calculation very carefully, and you will see that only f (not g) needs to satisfy the condition f(0) = f(1) = 0. In fact, the adjoint of A is the operator A* given by the same formula A*g = ig' but on a larger domain, namely the set of all differentiable functions on [0,1] whose derivatives are in L²(0,1) (but which need not vanish at the endpoints).
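The integration by parts that is left to the reader goes like this (written out as a check):

```latex
\begin{aligned}
\langle Af, g\rangle
  &= \int_0^1 if'(x)\,\overline{g(x)}\,dx
   = i\Bigl[f(x)\overline{g(x)}\Bigr]_0^1
     - i\int_0^1 f(x)\,\overline{g}'(x)\,dx\\
  &= i\bigl(f(1)\overline{g(1)} - f(0)\overline{g(0)}\bigr)
     + \int_0^1 f(x)\,\overline{ig'(x)}\,dx ,
\end{aligned}
```

using that conj(ig') = -i·conj(g)'. The boundary term vanishes as soon as f(0) = f(1) = 0, whatever g does at the endpoints, which is exactly why only f needs the endpoint condition.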

The fundamental idea of von Neumann's theorem is that if A is symmetric (so that A* is an extension of A) then as you increase the domain of A, the domain of its adjoint decreases. If you're lucky, there is an intermediate domain which is the same for the operator and its adjoint. That operator is then self-adjoint.

In the case of the operator Af = if', that point occurs when the domain is the set of all differentiable functions f on [0,1] whose derivatives are in L²(0,1) and which satisfy f(0) = f(1). You can check that if f and g both satisfy that condition then the integration by parts calculation works. This self-adjoint extension of A is what Conway calls B in his Example 1.12.
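For this periodic-type condition the boundary term cancels rather than vanishing term by term; concretely:

```latex
% With f(0) = f(1) and g(0) = g(1), the boundary term produced by the
% integration by parts is
i\bigl(f(1)\overline{g(1)} - f(0)\overline{g(0)}\bigr)
  = i\bigl(f(0)\overline{g(0)} - f(0)\overline{g(0)}\bigr) = 0,
% so \langle Bf, g\rangle = \langle f, Bg\rangle on this larger domain.
```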

He is showing how von Neumann's theorem applies to this example. You get a self-adjoint extension of A by increasing its domain, or equivalently by increasing its graph through incorporating part of the graph of A*. It turns out that the way to do this is to add to the graph of A a subspace of K_+ ⊕ K_-, which in turn is defined in terms of an isometric map from a subspace of L_+ to a subspace of L_-. If this map takes the whole of L_+ to the whole of L_- then the extension will be self-adjoint. The condition for that to be possible is that L_+ and L_- have the same dimension.

- March 22nd 2009, 01:01 PM by Nusc
I see; von Neumann's theorem is not stated in Conway or on Wikipedia.

It says:

Let A be a symmetric operator and suppose that there exists a conjugation C with C: D(A)->D(A) and AC = CA. Then A has equal deficiency indices and therefore has self-adjoint extensions.

Do you know of a more precise way of writing that theorem?
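One standard precise formulation (following how the conjugation version of von Neumann's theorem is usually stated, e.g. in Reed and Simon's Methods of Modern Mathematical Physics, vol. II) is:

```latex
% Definition (conjugation). A map C : H \to H is a conjugation if it is
% antilinear, C^2 = I, and \langle Cf, Cg\rangle = \langle g, f\rangle
% for all f, g \in H (so in particular C is norm-preserving).
%
% Theorem (von Neumann). Let A be a symmetric operator on H and suppose
% there is a conjugation C with C(D(A)) \subseteq D(A) and ACf = CAf
% for all f \in D(A). Then
\dim \ker(A^* - i) = \dim \ker(A^* + i),
% i.e. the deficiency indices n_+(A) and n_-(A) are equal, and
% therefore A has self-adjoint extensions.
```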