Chaos theory is widely studied within mathematics, particularly in the context of continuous-time dynamical systems. It is the study of apparently random or unpredictable behaviour in systems governed by deterministic, well-understood laws.
Put more precisely, the term "deterministic chaos" suggests a paradox: it connects two notions that are familiar yet commonly regarded as irreconcilable.
One of the common indicators of chaotic behaviour in a system is a power law. Power-law scalings are abundant in natural and man-made complex systems. The human brain, too, is often described as a system poised near a critical point, frequently showing avalanche-like cascades of activity. It walks a fine line between high and low activity, between order and disorder, which is an inherent property of chaotic systems.
Defining Power Law
A power law (also known as a scaling law) characterizes a special relationship between two quantities: a relative change in one quantity results in a proportional relative change in the other, independent of their initial sizes.
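The scale-independence described above can be sketched in a few lines. The constants `c` and `k` below are illustrative values, not taken from the article: for any power law y = c * x^k, scaling x by a factor r scales y by r^k, no matter where on the curve you start.

```python
# Minimal sketch: the relative change produced by a power law does not
# depend on the starting value of x (c and k are illustrative values).

def power_law(x, c=2.0, k=1.5):
    """Toy power law y = c * x**k."""
    return c * x ** k

# Doubling x multiplies y by 2**k at every scale:
ratio_small = power_law(2.0) / power_law(1.0)      # doubling at x = 1
ratio_large = power_law(200.0) / power_law(100.0)  # doubling at x = 100
print(ratio_small, ratio_large)  # both equal 2**1.5
```

This is exactly the "proportional relative change" property: the response to a relative change is the same at every scale, which is why power laws have no characteristic scale.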
The temporal dynamics of complex networks such as the worldwide Internet, interactions among people in society, protein and gene interactions in biological systems, communication between digital computers, and electricity transmission grids are characterized by a power-law scaling between the temporal mean and the spread of the signals at each network node.
Scientists have confirmed the conjecture that the temporal dynamics of brain networks fall into the same class of power-law behaviour. This wide-ranging finding may help weigh the relative effects of randomness, causality and external modulators on the dynamics of the brain's neural network.
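The mean-versus-spread scaling mentioned above can be illustrated with a small simulation. Everything here is an assumption for demonstration purposes: the node signals are synthetic Poisson counts (for which the spread grows as the square root of the mean), and the sketch only shows the fitting step, not any real network data.

```python
# Hedged sketch: compute the temporal mean and standard deviation of a
# synthetic signal at each node, then fit spread ~ mean**alpha on a
# log-log scale. All data are simulated, not real network measurements.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_steps = 50, 2000

# Node mean activities span three orders of magnitude; Poisson counts
# give a true scaling exponent of 0.5 (std = sqrt(mean)).
means = np.logspace(0, 3, n_nodes)
signals = rng.poisson(lam=means[:, None], size=(n_nodes, n_steps))

node_mean = signals.mean(axis=1)
node_std = signals.std(axis=1)

# The slope of log(std) vs log(mean) estimates the scaling exponent.
alpha, _ = np.polyfit(np.log(node_mean), np.log(node_std), 1)
print(f"estimated scaling exponent: {alpha:.2f}")
```

A straight line on a log-log plot of spread against mean is the usual diagnostic for this kind of scaling; the fitted slope is the exponent of the power law.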
Neuron Firing: How Information Is Transferred Inside the Brain
Communication between neurons takes place through neuronal firing: electrical impulses that travel along a neuron and trigger the release of chemical messengers called neurotransmitters at the junctions between cells.
In order to better appreciate this process, we need to look more closely at how the parts of a neuron (the soma, the dendrites and the axon) communicate.
The soma is often described as the "brain" of the brain cell: it processes the input arriving at the cell and determines whether the information is important enough to pass along to the next cell.
The dendrites are tree-like structures that receive and gather signals from other neurons and deliver them to the soma. There the information is processed and, as stated above, a determination is made as to whether it is important enough to pass along to a neighbouring neuron.
Axons work as pathways that pass this information from neuron to neuron. In layman's terms, an axon acts like the cables or wires in a house.
Just as the electrical wires in our homes are insulated, axons are insulated with a fatty substance called myelin. This helps preserve the electrical signal, keeping it strong and flowing in one direction.
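The threshold-gated behaviour described above, where the soma integrates incoming signals and fires only when the input is "important enough", can be sketched with a toy leaky integrate-and-fire model. This is a standard simplified neuron model used here as an illustrative assumption, not something the article itself specifies; the threshold and leak values are arbitrary.

```python
# Toy leaky integrate-and-fire sketch: the "soma" accumulates input
# current with some leak, and only when the accumulated voltage crosses
# a threshold does the cell fire a spike to be carried down the "axon".

def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Return the time steps at which the toy neuron fires."""
    v, spikes = 0.0, []
    for t, current in enumerate(inputs):
        v = v * leak + current      # leaky integration in the soma
        if v >= threshold:          # input deemed important enough
            spikes.append(t)        # fire: pass the signal along
            v = 0.0                 # reset after the spike
    return spikes

weak = simulate_lif([0.05] * 20)   # weak input never reaches threshold
strong = simulate_lif([0.4] * 20)  # strong input fires repeatedly
print(weak, strong)
```

With the weak input the voltage leaks away faster than it accumulates, so nothing is passed along; with the strong input the neuron fires at regular intervals, mirroring the soma's role as a gatekeeper.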
Scientists and researchers have found that the brain actually follows a power law when processing stimuli at the neural level.
To give a quick summary: sometimes it is better to miss a lot of detail, as low-dimensional neural activity does. It recognises broad patterns and misses detail, but is resistant to noise. High-dimensional neural activity, by contrast, is extremely sensitive and highly detailed.
So, when processing stimuli, the brain maintains a striking balance between low-dimensional and high-dimensional neural activity in order to receive the right amount of sensory information. Studies carried out on mice under strictly controlled laboratory conditions have led researchers to conclude that the relationship between dimensionality and neural response follows a sort of decaying power law.
“A few dimensions… captured most of the neural responses to visual stimuli”, and adding more and more dimensions “increased that predictive power only by smaller and smaller increments”.
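The quoted pattern, where a few dimensions capture most of the response and each extra dimension adds a smaller increment, is a direct consequence of a power-law decay in variance per dimension. The exponent below is chosen purely for illustration and is not a value reported by the studies the article describes.

```python
# Sketch of the "decaying power law": if the variance explained by the
# n-th neural dimension falls off as n**(-1.1) (illustrative exponent),
# the leading dimensions dominate and later ones add ever-smaller gains.
import numpy as np

n_dims = 1000
variance = np.arange(1, n_dims + 1) ** -1.1  # power-law spectrum
variance /= variance.sum()                   # normalize to fractions

cumulative = np.cumsum(variance)
print(f"first 10 dims: {cumulative[9]:.0%} of variance")
print(f"next 90 dims add: {cumulative[99] - cumulative[9]:.0%}")
```

Under this assumed spectrum, the first handful of dimensions carry far more variance than the next ninety combined, which is exactly the "smaller and smaller increments" behaviour the quote describes.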
This connects to the power laws taught in undergraduate and graduate courses in a very unexpected way. At first glance neural networks don't look like ordinary networks at all, and yet both are ultimately networks, and therefore behave in strangely similar ways despite being so deeply different in origin.