Something tells me I made a similar post before, but I could not find it.

What is the rigorous mathematical definition of randomness? I was thinking along these lines: suppose we have an algorithm that produces numbers; we could define its output as random if the correlation coefficient approaches zero as $\displaystyle n \rightarrow \infty$.

I am talking about concepts (probability and algorithms) that I have never formally studied, so you might not follow what I am trying to say. Here is an example: suppose an algorithm is defined so that it produces only ones. We can create a plot with the number of times the algorithm has been run on the x-axis and its output on the y-axis. Then, by the correlation coefficient formula, the coefficient is always one, so the limit as the number of runs increases is 1 rather than zero, and hence that algorithm is not random. Does anyone understand what I am trying to ask?
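Here is a small sketch (in Python, just as an illustration of the idea) that computes the Pearson correlation between the trial number and the algorithm's output. One caveat worth noting: for an algorithm that only outputs ones, the Pearson formula divides by the output's standard deviation, which is zero, so the coefficient is strictly undefined (0/0) rather than one; the sketch returns `None` in that case. For a pseudorandom generator, the coefficient hovers near zero, matching the intuition in the question.

```python
import math
import random

def pearson(xs, ys):
    """Pearson correlation coefficient; None if either series is constant."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    if sxx == 0 or syy == 0:
        return None  # undefined when one variable never varies
    return sxy / math.sqrt(sxx * syy)

n = 10_000
trials = list(range(1, n + 1))          # x-axis: run number

# "Algorithm" that only ever outputs 1: the coefficient is undefined (0/0)
ones = [1] * n
print(pearson(trials, ones))            # None

# Pseudorandom output: the coefficient stays close to zero as n grows
random.seed(0)
rnd = [random.random() for _ in range(n)]
print(pearson(trials, rnd))             # small value near 0
```

Note that a near-zero correlation with the trial index only rules out a linear trend; a sequence like 1, 0, 1, 0, ... is perfectly predictable yet can still have correlation near zero, which is one reason rigorous definitions of randomness (e.g. Kolmogorov complexity or Martin-Löf randomness) go well beyond correlation tests.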

CaptainBlack, computability theory is what you know best; please help me.