I have spent a couple of years formulating the following theory:
There exists a highly sophisticated mathematical algorithm, possibly a set of algorithms, that the brain uses to convert millions or billions of pieces of sensory data into a single concept (e.g. "Grandma"). This algorithm requires interpreting data across multiple dimensions and simplifying it down to fewer dimensions in a rounding process. Binary is obviously one of the languages the brain uses, but it can't be assumed that binary is the only language in the algorithm, as languages involving more than two dimensions may have significant advantages for complex calculations. Cracking the code of how the brain interprets sensory data into concepts may help us understand great mysteries of the universe.
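To make the idea concrete, here is a minimal toy sketch of the two steps the theory describes: collapsing a high-dimensional sensory input down to fewer dimensions (a crude "rounding" projection), then mapping the reduced vector onto a stored concept. Everything here is hypothetical illustration, not a claim about how the brain actually works; the function names, the block-averaging projection, and the prototype labels are all invented for this example.

```python
import math

def reduce_dims(v, k):
    """Crudely project a high-dimensional vector down to k dimensions
    by averaging equal-sized blocks of coordinates -- a stand-in for
    the dimension-reducing 'rounding process' in the theory."""
    block = len(v) // k
    return [sum(v[i * block:(i + 1) * block]) / block for i in range(k)]

def nearest_concept(v, prototypes):
    """Map a reduced vector to the closest stored concept label."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(prototypes, key=lambda name: dist(v, prototypes[name]))

# Hypothetical concepts, already stored in the reduced 2-D space.
prototypes = {
    "grandma": [0.9, 0.1],
    "stranger": [0.1, 0.9],
}

# A noisy 8-dimensional "sensory" input that should land near "grandma".
sensory = [0.8, 1.0, 0.9, 0.9, 0.2, 0.1, 0.0, 0.1]
reduced = reduce_dims(sensory, 2)            # roughly [0.9, 0.1]
print(nearest_concept(reduced, prototypes))  # prints "grandma"
```

In real research this kind of pipeline shows up as dimensionality reduction (e.g. principal component analysis) followed by classification, which may be the nearest established vocabulary for the theory.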
The code involves the following attributes:
* Number of data points
* Number of dimensions to the data
* Number of sets of algorithms
* Number of numerical languages
* Existing numerical languages that the brain uses
It's such an abstract theory that I don't know who to talk to about it. I have done some initial modelling, but it seems too early to get caught up in the mathematics; instead I'd rather focus on which field of study would be most interested in building on it.