Let $u_1, \ldots, u_n \in \{0,1\}^n$. Can we impose a condition on these vectors so that every vector $v \in \{0,1\}^n$ has a unique decomposition
$$v = a_1 u_1 \vee \cdots \vee a_n u_n,$$
where $a_1, \ldots, a_n \in \{0,1\}$, by definition $1 \cdot u = u$ and $0 \cdot u = (0, \ldots, 0)$, and $\vee$ denotes element-wise conjunction?
The one from linear algebra (a linear combination can be zero only trivially) doesn't work -- it's easy to give an example.
Sorry! I meant logical disjunction. It's my English. Disjunction is called something different in Polish (unless someone really wants to sound smart) and I get confused all the time.
I thought I'd add something. Let $e_i \in \{0,1\}^n$ for $i = 1, \ldots, n$ be defined in the usual way (a 1 in the $i$-th coordinate and 0 elsewhere). Every vector $v \in \{0,1\}^n$ has a unique decomposition
$$v = v_1 e_1 \vee \cdots \vee v_n e_n,$$
which is easily proven. So it's not that there are no such bases. This one does actually satisfy the linear-algebraic independence condition, but $\{(1,1,0), (1,0,1), (0,1,1)\}$ does too, in $\{0,1\}^3$, and yet it doesn't yield unique decompositions.
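To make the counterexample concrete, here is a brute-force check in Python (the set $\{(1,1,0),(1,0,1),(0,1,1)\}$ is my reading of the example discussed in this thread): it satisfies the linear-algebraic independence condition, yet $(1,1,1)$ decomposes in more than one way.

```python
from itertools import product

U = [(1, 1, 0), (1, 0, 1), (0, 1, 1)]

def join(vs):
    """Element-wise OR (join) of a collection of 0/1 vectors."""
    out = (0, 0, 0)
    for v in vs:
        out = tuple(a | b for a, b in zip(out, v))
    return out

# Linear-algebraic independence: no nonempty combination equals zero.
nontrivial_zero = any(
    join([u for u, c in zip(U, coeffs) if c]) == (0, 0, 0)
    for coeffs in product([0, 1], repeat=3) if any(coeffs)
)
print(nontrivial_zero)  # False, so the set is "independent"

# ...and yet (1,1,1) has several decompositions over U.
decomps = [
    coeffs for coeffs in product([0, 1], repeat=3)
    if join([u for u, c in zip(U, coeffs) if c]) == (1, 1, 1)
]
print(decomps)  # four coefficient tuples, so the decomposition is not unique
```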
If no one is replying because I wasn't clear enough, please tell me. I will try to make myself clearer.
Anyway, perhaps you could help me with my web search. I'm not sure if I'm using the correct terms. Is the structure defined above a semimodule over the semiring $(\{0,1\}, \vee, \wedge)$?
Is there an unambiguous name for this semiring? I think it's called a boolean algebra, but I understand this term is ambiguous.
hi,
first we must show that the space has dimension equal to n, and then:
we know that any n independent vectors form a basis for the space (a basis has the property that every vector of the space can be written uniquely in terms of the basis).
thus, the condition is: the vectors u(1), ..., u(n) must be independent.
It's not a vector space. (This is what I meant when I said that it is not a linear space.) A vector space is defined over a field. For example, if on $\{0,1\}$ you define addition as XOR (we denote it by $\oplus$) and multiplication as AND (we denote it by $\wedge$), you get the two-element field. But in our case we do not have a field: while the multiplication is still defined to be AND, the addition is defined to be OR (we denote it by $\vee$), not XOR, and with these operations we do not get a field.
But we can define a space of "vectors" over this structure anyway. We can define addition of the "vectors". Let $u$ and $v$ be in $\{0,1\}^n$. We define
$$u \vee v = (u_1 \vee v_1, \ldots, u_n \vee v_n).$$
We can also define multiplication by scalars in the following way. Let $a$ be in $\{0,1\}$ and let $u$ be in $\{0,1\}^n$. We define
$$a \cdot u = (a \wedge u_1, \ldots, a \wedge u_n).$$
So we just defined a "vector space" over our structure. It is not an actual vector space though, because our underlying structure isn't a field.
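A minimal Python sketch of the two operations just defined (the function names are mine):

```python
def vec_add(u, v):
    """Addition of "vectors": element-wise OR."""
    return tuple(a | b for a, b in zip(u, v))

def scalar_mul(a, u):
    """Multiplication by a scalar from {0,1}: element-wise AND."""
    return tuple(a & x for x in u)

print(vec_add((1, 0, 1), (0, 1, 1)))  # (1, 1, 1)
print(scalar_mul(1, (1, 0, 1)))       # (1, 0, 1)
print(scalar_mul(0, (1, 0, 1)))       # (0, 0, 0)
```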
as i think you have recognized ymar, the problem lies in the fact that you only have a boolean algebra, and not a vector space. in particular, {0,1} is not a group under $\vee$, because there is no inverse for 1. because of this peculiar behavior, linear independence is not a strong enough condition to guarantee the unique decomposition you seek.
i believe the structure you are after is "an idempotent semimodule", although "boolean semimodule" may also turn up some results for you.
a definition i saw of independence was: $X$ is independent if $x \notin \langle X \setminus \{x\} \rangle$ for every $x \in X$, where $\langle X \rangle$ is the set of linear combinations of $X$.
the trouble with your set is that $(1,0,0)$, for example, is not realizable as any linear combination: to get a 1 in the first coordinate we must use $(1,1,0)$ or $(1,0,1)$, but then we also get a 1 in the second or third coordinate.
this is in stark contrast to linear algebra where a 3-element linearly independent set necessarily spans a 3-dimensional space.
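for what it's worth, the span can be computed by brute force (a python sketch; the 3-element set is the one discussed above):

```python
from itertools import product

U = [(1, 1, 0), (1, 0, 1), (0, 1, 1)]

def join(vs):
    """element-wise OR of a collection of vectors (the empty join is zero)."""
    out = (0, 0, 0)
    for v in vs:
        out = tuple(a | b for a, b in zip(out, v))
    return out

# the span of U: joins of all subsets of U
span = {join([u for u, c in zip(U, coeffs) if c])
        for coeffs in product([0, 1], repeat=3)}
print(len(span))          # 5 -- only 5 of the 8 vectors of {0,1}^3
print((1, 0, 0) in span)  # False
```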
Thank you, Deveno. This was very helpful. Especially the names of the structure! Now I see that there are lots of papers about it. I was about to believe that no one has tried to do research on them! I'll be looking at them and doing some thinking, and if I have any problems, I'll post here again.
i think idempotent semimodule bases are unique up to scalability, so if the underlying semiring has only two elements, $\{e_1, \ldots, e_n\}$ should be the only possible basis for $\{0,1\}^n$.
i did not see a proof in the link you posted, just the statement of it.
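for n = 2, at least, the claim can be confirmed by exhaustive search (a python sketch; `is_basis` and the enumeration are my own):

```python
from itertools import product, combinations

n = 2
vectors = list(product([0, 1], repeat=n))
nonzero = [v for v in vectors if any(v)]

def join(vs):
    """element-wise OR of a collection of vectors."""
    out = (0,) * n
    for v in vs:
        out = tuple(a | b for a, b in zip(out, v))
    return out

def is_basis(U):
    """U is a basis iff every vector has exactly one decomposition over U."""
    for v in vectors:
        decomps = [c for c in product([0, 1], repeat=len(U))
                   if join([u for u, ci in zip(U, c) if ci]) == v]
        if len(decomps) != 1:
            return False
    return True

# try every subset of nonzero vectors as a candidate basis
bases = [U for r in range(len(nonzero) + 1)
         for U in combinations(nonzero, r) if is_basis(U)]
print(bases)  # [((0, 1), (1, 0))] -- only the standard basis
```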