1. Permuting vectors

Let $v=(x_1,\ldots,x_n)$ be an $n$-dimensional real vector. Find the dimension of the subspace of $\mathbb{R}^n$ spanned by $v$ and all vectors obtained by permuting the components of $v$.

2. The answer is "it depends". Consider, for example, the permutations of $(1,1,\ldots,1)$ and the permutations of $(1,0,\ldots,0)$.
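These two examples are easy to check numerically. Here is a small sketch (not from the thread; the helper name `perm_span_dim` is my own): the dimension of the span equals the rank of the matrix whose rows are all permutations of $v$.

```python
# Dimension of span{permutations of v} = rank of the matrix
# whose rows are all permutations of v.
from itertools import permutations

import numpy as np


def perm_span_dim(v):
    """Dimension of the subspace spanned by all permutations of v."""
    rows = np.array(list(permutations(v)), dtype=float)
    return int(np.linalg.matrix_rank(rows))


print(perm_span_dim([1, 1, 1, 1]))  # constant vector: dimension 1
print(perm_span_dim([1, 0, 0, 0]))  # dimension n = 4
```

Enumerating all $n!$ permutations is wasteful but fine for small $n$, and it makes the rank computation a one-liner.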

3. rgep's examples suggest this conjecture to me: the dimension of the subspace is $0$ if $v = (0,0, \ldots ,0)$, $1$ if $v = (a,a, \ldots ,a)$ for some $a \ne 0$, and $n$ otherwise. Suggestions for a proof, or counterexamples?

4. Well, that seems to be the case, and the proof is straightforward. Let $v$ have a nonzero entry $a$ in the $i$-th place: $v=(v_1,\ldots,a,\ldots,v_n)$. We may take $a=1$, since otherwise we can consider $(1/a)v$.

Let $V$ be the subspace spanned by $v$ and its permutations, and consider the $j$-th axis $\{(0,\ldots,x_j,\ldots,0)\}$ of $R^n$.

We consider a permutation $s$ of $\{1,2,\ldots,n\}$ that places the entry $1$ in the $j$-th position, and denote the resulting vector by $v^*=(v_{s(1)},\ldots,1,\ldots,v_{s(n)})$.

Then

$v^*=(v_{s(1)},\ldots,v_{s(j-1)},0,v_{s(j+1)},\ldots,v_{s(n)})+{\bf e}_j$

where ${\bf e}_j$ is the $j$-th Euclidean basis vector. And
$(0,\ldots,x_j,\ldots,0)=x_j{\bf e}_j.$

This proves $V$ contains the $j$-th axis of $R^n$. Since $j$ was arbitrary, $V$ contains every axis, and therefore $V=R^n$.

5. Generalizing what rgep said, it seems to me (though I was unable to prove it) that there are only two cases. Given $(x_1,x_2,\ldots,x_n)$: if $x_1=x_2=\cdots=x_n\ne 0$, then the dimension of this subspace is $1$.

Otherwise its dimension is $n$.

6. - Doctor, I feel people don't pay attention to me.

7. Sorry 'bout that, I did not wish to steal anyone's answers; I just found the problem interesting.

Originally Posted by JakeD
The dimension of the subspace is 0 if $v = (0,0, \ldots ,0)$
You sure about that? It looks like the dimension is one to me.

8. Originally Posted by ThePerfectHacker
Sorry 'bout that, I did not wish to steal anyone's answers; I just found the problem interesting.

You sure about that? It looks like the dimension is one to me.
No, the space consists of a single point: a subspace of $\mathbb{R}^n$ of dimension 0.

RonL

9. Sorry 'bout that, I did not wish to steal anyone's answers
Oh no no, I made no such claim.

I just felt invisible for a moment there.

10. Originally Posted by CaptainBlack
No the space consists of a single point - a subspace of $\mathbb{R}^n$ of dimension 0.

RonL
You are right, I see my mistake now. I was thinking of the set of all unit vectors, which form a linearly independent basis; my error was that those vectors are not permutations of the zero vector.

11. In working up a proof of my conjecture, I found a counterexample and had to revise the conjecture. (Rebesques, take note. :( ) Here is the revised conjecture and a proof.

Let $v = (x_1, x_2,\ldots, x_n).$ The dimension $d$ of the subspace $V$ spanned by $v$ and its permutations is

$\begin{array}{lll}
\mathsf{A.} &0 &\mathsf{if}\ v = (0,0,\ldots,0), \\
\mathsf{B.} &1 &\mathsf{if}\ v = (a,a,\ldots,a) \ \mathsf{for\ some}\ a \ne 0, \\
\mathsf{C.} &n-1 &\mathsf{if}\ v \ne (a,a,\ldots,a) \ \mathsf{for\ any}\ a
\ \mathsf{and}\ \sum_{i=1}^{n} x_i = 0, \\
\mathsf{D.} &n &\mathsf{if}\ v \ne (a,a,\ldots,a) \ \mathsf{for\ any}\ a
\ \mathsf{and}\ \sum_{i=1}^{n} x_i \ne 0. \\
\end{array}
$

As examples in $\mathbb{R}^2$, for $v = (1,1)\ \mathsf{or}\ (1,-1),\ d = 1,$ while for $v = (1,0),\ d = 2.$
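These $\mathbb{R}^2$ examples can be verified by brute force; the sketch below (my own helper name `perm_span_dim`, not from the thread) computes the rank of the matrix of all permutations of $v$:

```python
# Verify the R^2 examples: rank of the matrix whose rows are
# the permutations of v gives the dimension of their span.
from itertools import permutations

import numpy as np


def perm_span_dim(v):
    rows = np.array(list(permutations(v)), dtype=float)
    return int(np.linalg.matrix_rank(rows))


print(perm_span_dim([1, 1]))   # case B: d = 1
print(perm_span_dim([1, -1]))  # case C: d = n - 1 = 1
print(perm_span_dim([1, 0]))   # case D: d = n = 2
print(perm_span_dim([0, 0]))   # case A: d = 0
```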

Claims C and D are proven by showing that the given number is the maximum number of independent linear combinations of $v$ and its permutations. So assume $v = (x_1, x_2,\ldots, x_n) \ne (a,a,\ldots,a)$ for any $a.$ Under this assumption we may further assume wlog that
$x_1 \ne x_2$ since we can permute $v$ as we like. Let $\alpha = 1/(x_1 - x_2).$

We will use the following linear combinations of $v$ and its permutations, written out for $n = 5$; the general pattern is the same.

$\begin{array}{lll}
v_1 &=& \alpha (x_1,x_3,x_4,x_5,x_2) - \alpha (x_2,x_3,x_4,x_5,x_1) \\
&=& (1,0,0,0,-1), \\
v_2 &=& \alpha (x_3,x_1,x_4,x_5,x_2) - \alpha (x_3,x_2,x_4,x_5,x_1) \\
&=& (0,1,0,0,-1), \\
v_3 &=& \alpha (x_3,x_4,x_1,x_5,x_2) - \alpha (x_3,x_4,x_2,x_5,x_1) \\
&=& (0,0,1,0,-1), \\
v_4 &=& \alpha (x_3,x_4,x_5,x_1,x_2) - \alpha (x_3,x_4,x_5,x_2,x_1) \\
&=& (0,0,0,1,-1), \\
v_5 &=& v \\ &=& (x_1,x_2,x_3,x_4,x_5). \\
\end{array}$

Putting these vectors as the rows of the matrix $\begin{bmatrix}
1 & 0 & 0 & 0 & -1 \\
0 & 1 & 0 & 0 & -1 \\
0 & 0 & 1 & 0 & -1 \\
0 & 0 & 0 & 1 & -1 \\
x_1 & x_2 & x_3 & x_4 & x_5 \\
\end{bmatrix}
$

it is clear that the first $n -1 = 4$ vectors are linearly independent and their entries sum to zero. If the entries of the last vector do not sum to zero, it is independent of the first $n - 1$ vectors, so $V$ contains $n$ linearly independent vectors, the maximum possible. This proves claim D.

For claim C: if the entries of the last vector sum to zero, it is linearly dependent on the first $n - 1$ vectors, since those vectors span the hyperplane of vectors whose entries sum to zero. The last row could be any permutation of $v$, and every permutation of $v$ has entries summing to zero, so $V$ cannot contain more than $n - 1$ independent vectors; the first $n - 1$ rows show it contains exactly that many. This proves C and completes the proof.
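As a numerical sanity check on the revised conjecture (a sketch only; the helper names `perm_span_dim` and `predicted_dim` are mine), one can compare the predicted dimension in each of cases A through D against the computed rank for vectors in $\mathbb{R}^5$:

```python
# Compare the revised conjecture's predicted dimension (cases A-D)
# against the actual rank of the matrix of permutations, in R^5.
from itertools import permutations

import numpy as np


def perm_span_dim(v):
    rows = np.array(list(permutations(v)), dtype=float)
    return int(np.linalg.matrix_rank(rows))


def predicted_dim(v):
    n = len(v)
    if all(x == 0 for x in v):
        return 0                         # case A: zero vector
    if all(x == v[0] for x in v):
        return 1                         # case B: constant, nonzero
    return n - 1 if sum(v) == 0 else n   # cases C and D


# one example per case: A, B, C (sum zero), D (sum nonzero)
for v in [(0, 0, 0, 0, 0), (2, 2, 2, 2, 2), (1, 2, -3, 0, 0), (1, 2, 3, 4, 5)]:
    assert perm_span_dim(list(v)) == predicted_dim(v)
print("all four cases check out")
```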