Hey Maxim.

As a formal measure, look at the entropy and the distribution of the final data set.

Specifically, look at the theorems relating mutual information and joint entropy for a way to prove (or disprove) your result.
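As a starting point, here is a minimal sketch (my own illustration, not from your setup) of estimating entropy, joint entropy, and mutual information from sample frequencies, using the identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math
from collections import Counter

def entropy(xs):
    """Shannon entropy H(X) in bits, estimated from sample frequencies."""
    n = len(xs)
    return -sum((c / n) * math.log2(c / n) for c in Counter(xs).values())

def joint_entropy(xs, ys):
    """Joint entropy H(X, Y), computed by treating each pair as one symbol."""
    return entropy(list(zip(xs, ys)))

def mutual_information(xs, ys):
    """I(X; Y) = H(X) + H(Y) - H(X, Y)."""
    return entropy(xs) + entropy(ys) - joint_entropy(xs, ys)

# A variable shares all of its information with itself, and none with a constant.
xs = [0, 1, 0, 1, 0, 1, 0, 1]
print(mutual_information(xs, xs))       # equals H(X) = 1.0 bit
print(mutual_information(xs, [0] * 8))  # constant Y carries nothing: 0.0
```

These are plug-in estimates from empirical frequencies, so for small samples they are biased; they are fine for building intuition about whether your transformation preserves or destroys dependence.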

Intuitively, I think you are correct, but proving it will require information theory.

Note that "random" here means maximum entropy: a source is random with respect to a given alphabet when its probability distribution over that alphabet attains the maximum possible entropy.
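To make that concrete, here is a small demonstration (my own example) that over a fixed alphabet the uniform distribution attains the maximum entropy, log2 of the alphabet size, and any skew lowers it:

```python
import math

def dist_entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Over a 4-symbol alphabet the maximum is log2(4) = 2 bits.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.70, 0.10, 0.10, 0.10]
print(dist_entropy(uniform))  # 2.0
print(dist_entropy(skewed))   # about 1.36, strictly below 2.0
```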

If you need an extensive resource, a good book is Elements of Information Theory by Cover and Thomas.