I walked in slightly late for a “Cosmic/Neuronal Slapdown” Saturday at the New York Academy of Medicine. I had never heard of such a thing, but this was an event sponsored by my friends Penny and Thomas (for this, AKA the Brandt Jackson Foundation), and the title of the entire day-long meeting was “Festival of Medical History and The Arts”, so I prepared myself for a fun morning. It was, and it even involved a banjo.

Rather than going through the details of the actual slapdown, I will just say that it was an inspiring competition of cosmological, astrophysical and neuronal imaging. Anyone who knows me also knows that these three topics, and even the banjo for that matter, are very relevant to my life. It was brilliantly orchestrated by Lawrence Weschler, and doing the slapping via Macs and projection were Carl Schoonover and Michael Benson. It was as remarkable as you could imagine it to be, with, as just one example, images of stained mouse brains on one side and Europa on the other looking extremely similar. Someone who had not seen these images before might not have known which were of the cosmos and which were of a brain. There were hundreds of these. Each of them was beautiful and revealing as both artistic and scientific imagery, and they were almost narrative in themselves as synapses and galaxies formed, revealing nature on scales that we can only see through enhanced and sophisticated tools.

The actual narrative, and the final words of the great film editor Walter Murch, spoke to these similarities and to the shared quest of art and science. More importantly, it revealed that the slapdown was not one of imagery but instead one of complexity. It was not about which photos, or even which explanations, were better or more glorious. It was really about the place of an animal (and, more specifically in the end, a human) brain in relation to the universe. Murch, and I would guess much of the audience, did something that I have noticed a lot in recent years.
They point out that despite humanity's small place in a vast and possibly infinite universe, this small material of our brains has more complexity than the universe that surrounds it. This counterintuitive idea has gained popularity for fairly good reasons. We have nearly 100 billion neurons forming more synapses than there are stars in the Milky Way (the popular version says more than all the stars in the known universe, which overstates it by many orders of magnitude, but the count is staggering either way). This certainly seems complex.
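The numbers game is worth actually writing down, because the popular comparison conflates two very different scales. A back-of-envelope sketch, using only order-of-magnitude figures (every value below is approximate):

```python
# Order-of-magnitude figures only; all values are rough public estimates.
neurons = 8.6e10                  # neurons in a human brain
synapses = neurons * 7e3          # ~10^3 to 10^4 synapses per neuron
milky_way_stars = 2e11            # stars in our galaxy
observable_universe_stars = 1e22  # a conservative low-end estimate

print(f"synapses:        ~{synapses:.0e}")
print(f"Milky Way stars: ~{milky_way_stars:.0e}")
print(f"universe stars:  ~{observable_universe_stars:.0e}")
```

So a single brain out-counts the stars of its home galaxy by a few thousand to one, but falls many orders of magnitude short of the stars in the known universe.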
Walter Murch made a nice comparison not only about complexity as a numbers game, but also about energy output. In terms of complexity he feels the brain is certainly more advanced than the sun, which is rather simple in his opinion, regardless of its importance. What he was curious to find, though, is that a volume of solar mass equal to a human brain emits surprisingly little energy by comparison. This is a very cool observation of course, and I completely see what he was getting at. Still, I think there is something going on in this type of discourse that is almost equivalent to the years of resistance to Copernicus and Galileo. Is it smart people (Ptolemy and Aristotle were very smart, of course) trying to keep us at the center of the universe? If not at the physical center, analogies such as these at least keep us anchored to a perception of superior complexity and energy density.
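Murch's volume-for-volume observation holds up against rough public figures (solar luminosity ≈ 3.8×10²⁶ W, solar radius ≈ 7×10⁸ m, brain ≈ 20 W in about 1.2 litres); a quick sketch of the arithmetic:

```python
import math

# Rough public figures; every value is an order-of-magnitude estimate.
SUN_LUMINOSITY_W = 3.8e26      # total solar power output
SUN_RADIUS_M = 7.0e8
BRAIN_POWER_W = 20.0           # metabolic budget of a human brain
BRAIN_VOLUME_M3 = 1.2e-3       # about 1.2 litres

sun_volume = (4.0 / 3.0) * math.pi * SUN_RADIUS_M ** 3
sun_density = SUN_LUMINOSITY_W / sun_volume      # W/m^3, whole-sun average
brain_density = BRAIN_POWER_W / BRAIN_VOLUME_M3  # W/m^3

print(f"sun:   {sun_density:.2f} W/m^3")
print(f"brain: {brain_density:.0f} W/m^3")
```

Averaged over its whole volume the sun puts out well under a watt per cubic metre, while the brain runs tens of thousands of times higher, which is exactly the counterintuitive point Murch was making.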
Before I refute these ideas, I want to admit that to a large extent I agree with the premise as it relates to our current state of knowledge. This is especially true if you are comparing pictures, as was done on Saturday. Nebulae resembled neurons so much that if the information contained in those images were all we knew or were ever to know, we would basically just count, and the brain would win this slapdown as being more complicated. I say complicated to distinguish it from complexity, which I think of differently (despite Google telling me that they are indeed synonyms). This distinction is why the brain's supposed supremacy is scientifically arguable. I actually wish that the images did tell the whole story, as I make instruments for taking images. I want those images to tell a story, and they do. The story they tell, however, is observational and involves imagination. We also make software for interpretation and classification so that we are not fooled by our eyes. That is perhaps the most important part, but without being able to scale deduction from these observations we are left with the simplistic idea that everything that looks complicated is inherently more complex. I have reduced this before, and this is where it gets tricky (or complicated and complex, I guess).
Complexity theory is of course a real thing, and involves modeling by some of the most creative computer scientists and statisticians I know. The Santa Fe Institute is dedicated to this study, and the sophistication of the complexity models used varies from those we may have an intuitive sense about to those that are completely non-intuitive. No matter what type of complexity is being addressed (agent-based modeling, toy models, etc.), there are two basic distinctions that are often made up front. It goes back to the late 1940s and the work of the mathematician Warren Weaver, who described complexity as either organized or disorganized. I am not in any way an expert on either, but in the age of big data we can think of disorganized complexity as something that traditionally involved statistical methods for understanding bulk behavior. The farther away we are from direct reduction, the more we rely on the randomness of disorganized systems. In other words, we need a lot of data points to make sense of them. Mathematics has made us better at such systems through the adoption of chaos theory, another enormously large discipline in mathematics that attempts to break down the complexity of disorganized systems into predictable components. Many people are familiar with chaos theory in a colloquial sense from the famous butterfly effect, where a single flap of a butterfly changes conditions everywhere. A chaotic and complex system is dynamic, and the farther we are from detailed observation the more we rely on complex models. An organized system would appear easier to cope with, but it assumes direct correlations. If randomness is eliminated and a new entity is observed, it is said to emerge. Emergence and quantum energy are two of the most overused and misunderstood concepts in physics. Complexity theorists use emergence in a way that allows for a new agglomerated data point, which can then be used in either an organized or a disorganized complexity model.
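The butterfly effect the colloquial version gestures at can be seen in something as small as the logistic map, a standard one-line chaotic system. A minimal sketch (the map and the chaotic parameter r = 4 are textbook; the specific starting values are just for illustration):

```python
def logistic_trajectory(x0, r=4.0, steps=80):
    """Iterate the logistic map x -> r*x*(1-x), chaotic at r = 4."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two trajectories whose starting points differ by one part in ten billion.
a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-10)

# The gap is invisible at first, then the trajectories decorrelate entirely.
print(abs(a[5] - b[5]))                                 # still tiny
print(max(abs(x - y) for x, y in zip(a[40:], b[40:])))  # grown to order one
```

A perfectly deterministic rule, yet an unmeasurably small difference in starting conditions swamps any prediction within a few dozen steps, which is why disorganized systems force us back onto statistics.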
Having said all of that, I will propose a narrow definition of complexity, and it is not my complete argument against the braincentric universe theory that I am describing. The brief survey of how others describe complexity here is only to say that perhaps real complexity resides where the mathematical necessities of disorganization and the observably predictable aspects of organization combine. This means that if we are using imaging, anything recorded by a sensor has basically equivalent amounts of data (yes, sensors can differ, but astrophysicists and neuroscientists use similar sensors and analyze bit values either way). So, while I won't discard the work of the smarter, dedicated complexity theorists, I will use my own definitions, which somehow fall between a disorganized and an organized system, as I think most do. Rather than talk about the brain specifically, though, I will propose some basic axioms, which I know borders on pretentious, but which I think is necessary to understand my point.
a) The complexity of a given system is measured by how hard its patterns are to predict. Therefore:
b) A single organized object that is small in relation to the larger disorganized space it occupies is more complex, because:
c) its potential locations within that larger space are far more numerous, and:
d) any aggregation of smaller objects into larger ones reduces complexity.
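Axioms (c) and (d) can be given a toy quantification: if pinning down an object's position, to within its own size, takes log₂((L/ℓ)³) bits, then smaller objects in the same space carry more positional information, and merging many small objects into one big one destroys most of it. This framing, and the function below, is my own illustrative construction rather than any standard complexity measure:

```python
import math

def position_bits(space_size, object_size, dim=3):
    """Bits needed to locate one object of size `object_size` inside a
    region of size `space_size`, to a precision of its own size."""
    return dim * math.log2(space_size / object_size)

# Axiom c: a micron-scale object in a 1 m box vs. a centimetre-scale one.
small = position_bits(1.0, 1e-6)   # ~59.8 bits
large = position_bits(1.0, 1e-2)   # ~19.9 bits

# Axiom d: merging 1000 micron-sized cubes into one 10-micron cube
# collapses thousands of independent positions into a single one.
separate = 1000 * position_bits(1.0, 1e-6)
merged = position_bits(1.0, 1e-5)
```

On this toy measure the unaggregated system carries orders of magnitude more positional information than the merged one, which is all axiom (d) asserts.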
This may seem both very easy to understand and very wrong at the same time, but I want to point out just two examples, neither of which jumps straight to the brain and the universe, which I am sorry for. Instead they come from experimental physics, which is at least closer to my field. They do take into account the Murch et al. idea of complexity as being about large numbers and energy.
a) One bit of science I have worked on is putting nanoparticles in polymers. This seems like a strange thing to do, but there are great practical reasons for such an experiment. In the case of what my colleagues and I have done, it was to look at how many of these nanoparticles you could put into the polymer matrix in order to achieve certain “complex” behaviors. Coincidentally (or not), we were looking for ways to generate electricity and material strength. The result is that if you put a small amount of the nanoparticle into a polymer, the system is very hard to deal with. There are laws of thermodynamics, polymer entanglements, weak polymer-filler bonds, etc., which make the location and the control of these particles very difficult to predict. So I call this complex. What we were after was something complex, but this level of complexity put the “mind” of the particle out of our control, so we needed to add more. As a bit of a side note, we did not think of this; we were just the first to experiment with it. The theory was that there was a near-perfect amount of complexity that we could achieve. This is called percolation, and refers to a theoretical loading level at which flow between particles can occur. In this case the flow of electricity actually takes place without direct contact, but through quantum tunneling. So the goal was to fill with as little of the particles as possible while still allowing percolation to occur. This provides a good picture of complexity, which reaches a level (the percolation point) at which the system becomes less complex with additional loading. So this is again an example of increased simplicity with greater numbers, and more energy.
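Percolation can be illustrated with a much simpler stand-in for the filled-polymer system: site percolation on a 2D lattice, where each site is "filled" with probability p and we ask whether a connected path of filled sites spans the sample. (The real composite is a 3D continuum problem with tunneling, so this is only an analogue; the square-lattice site threshold is known to be about p ≈ 0.593.)

```python
import random

def spans(grid):
    """True if filled sites connect the top row of the grid to the bottom."""
    n = len(grid)
    seen = {(0, c) for c in range(n) if grid[0][c]}
    stack = list(seen)
    while stack:
        r, c = stack.pop()
        if r == n - 1:
            return True
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < n and 0 <= nc < n and grid[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                stack.append((nr, nc))
    return False

def spanning_fraction(p, n=30, trials=20, seed=0):
    """Fraction of random n-by-n grids, filled with probability p,
    in which a connected path of filled sites spans top to bottom."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        grid = [[rng.random() < p for _ in range(n)] for _ in range(n)]
        hits += spans(grid)
    return hits / trials

# Below the threshold (~0.593) spanning is rare; above it, near-certain.
print(spanning_fraction(0.25), spanning_fraction(0.85))
```

The sharp jump in spanning probability around the threshold is the lattice version of the loading level we were chasing: below it the filler locations are unpredictable and disconnected, and just above it the system snaps into a simpler, conductive whole.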
b) My second example of complexity is something I am much less knowledgeable about, but still more knowledgeable about than neuroscience (which I know is supposed to be the point of my whole essay, but still…). This week I had two Walter Murch experiences, the first early in the week when I saw the brilliant, thought-provoking and personally moving film “Particle Fever” at the New York Film Festival. Walter edited this film, and since it contained footage taken over 7 years, his contribution to it must have been profound. “Particle Fever” is an almost existential journey into the lives of a few scientists involved with the LHC discovery of the Higgs boson. Readers here have certainly followed this highly publicized discovery, and the multibillion-dollar project that produced it. It was the largest experiment in history, with a primary aim of observing the presence of something very small and very important. The Higgs is a key discovery in unifying the Standard Model of particle physics, and even now, with it found, the implications remain truly complex. This is to say that with all of the resources of the world physics community, and a very large device (the LHC itself), the particle is more complex than the instrumentation, or even the people, that it took to produce it.
So after all of this, my point is actually incredibly simple (not complex at all). The human brain is tightly packed. There are a lot of relatively small things; billions, as we are reminded. These things are bounded within a very small area. This essentially makes the brain predictable, as we can theoretically map all of these points. The brain is not infinite in any direction. We can also use a very old way of placing each brain into a disorganized system, by taking the organization of each brain as an emergent unit. When we do this and compare the small population of brains on Earth to the galaxies alone, which move through a potentially infinite space, it is impossible to consider a single brain to be more complex. Again we find ourselves small and simple.