Thoughts on Neural Information Processing
Post date: 2023-04-01
I believe that mathematics is neither invented nor discovered, but rather emerges from human consciousness. If two planets were inhabited by isolated human civilizations with similar levels of scientific achievement, I believe many of their mathematical achievements could be shown to have transformational invariances; that is, many of their mathematical methods and conclusions would be similar. And if our civilization were to make contact with an alien civilization possessing a truly different form of consciousness or innate information processing, we might agree on certain conclusions, but the methods of abstraction and the axioms used to build or arrive at those conclusions would differ greatly from our understanding of mathematics. I would therefore like to postulate that mathematics is an emergent phenomenon, born of the emergence of the human brain.

This, I would argue, makes the information-theoretic study of the human brain an extremely important endeavor, one that potentially touches the very foundations of most sciences. This is not a unique conclusion: Gödel's incompleteness theorems raised similar questions about the nature of human thought. In a loose interpretation of those theorems, if there are mathematical truths that cannot be derived solely from within a formal system, how do humans come to understand or discover them? Extending this line of thought, human intelligence will likely always retain an intrinsic uniqueness: the ability to grasp certain truths that lie beyond the reach of formal or computational systems. This foreshadows a future in which complex challenges are met through the synergy of diverse intelligences, hopefully for the progress of humanity and all life on this planet.
Pioneering efforts in the second half of the 20th century to understand brain computation, and at times to replicate these rudimentary models, have had a profound impact on engineering. The "von Neumann architecture" - the foundation of modern computer systems, consisting of CPUs, I/O, buses, and RAM - was conceived by John von Neumann, who sought to loosely emulate the structure and organization of the brain, as it was understood in the 1940s, in a computing machine. The 1943 paper by Warren McCulloch and Walter Pitts, "A Logical Calculus of the Ideas Immanent in Nervous Activity," explored the idea of building computational networks based on the structure and function of biological neural networks. Frank Rosenblatt's introduction of the perceptron, and later developments such as convolutional neural networks and reinforcement learning, underscore the rich interplay between artificial intelligence and neuroscience. Predicting the future may be a precarious endeavor, but I would like to place my bets by participating in this dynamic interplay.
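For readers who have not met these two historical models, here is a minimal sketch of both in Python. The function names, the AND-gate example, and the learning rate are my own illustrative choices, not anything taken from the original papers; the point is only to show how small these early building blocks are.

```python
# Illustrative sketches of a McCulloch-Pitts unit and Rosenblatt's
# perceptron rule. All names and the AND-gate example are hypothetical
# choices for this post, not from the original papers.

def mcculloch_pitts_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: fires (1) iff the weighted sum of its
    binary inputs reaches a fixed threshold. No learning involved."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def train_perceptron(samples, labels, epochs=10, lr=1.0):
    """Rosenblatt's perceptron rule: nudge the weights toward examples
    the unit misclassifies. Converges when the data are linearly
    separable, as a two-input AND gate is."""
    n = len(samples[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0
            err = y - pred  # -1, 0, or +1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Hypothetical usage: learn a two-input AND gate.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]
w, b = train_perceptron(X, y)
print(w, b)  # a separating weight vector and bias for AND
print([mcculloch_pitts_neuron(x, (1, 1), 2) for x in X])  # fixed-weight AND
```

The contrast between the two functions is the historical point: the McCulloch-Pitts unit computes logic with weights fixed by hand, while the perceptron finds its own weights from examples.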