Aaron Frank
Abstract: Through uncovering and manipulating the patterns of nature, reductionist approaches have been remarkably effective across a variety of domains in science and engineering. Because of this success, the paradigm’s mechanistic sensibilities have extended beyond practical application and increasingly shape the cognitive frames that inform our ideas of truth about reality itself. By conflating predictive power with metaphysical ‘truth,’ this paradigm proposes a view of existence as a static, rigid machine of cause and effect, a view that overlooks the inherently nonlinear and chaotic movements of nature. Drawing on breakthroughs in fields including complexity science, quantum mechanics, and formal logic, this essay integrates the spiritual worldview of yoga and P.R. Sarkar’s Neohumanism to offer an understanding of nature as a dynamic, living system, one that redefines our relationship to concepts of ‘truth’. Rather than a single universal truth, many truths exist and must be situated in their appropriate context.
At the annual TED Conference in 2022, surrounded by an audience of high-profile scientists, entrepreneurs, and intellectuals, Elon Musk was asked by his interviewer to describe the motivation fueling his activities. The response, perhaps surprising in the context of a discussion about space rockets and electric cars, was a philosophical self-assessment of his innate desire to understand the foundational truth of reality:
“Whatever condition I had, I was just absolutely obsessed with truth. So, the obsession with truth is why I studied physics, because physics attempts to understand the truth of the universe. Physics is just, ‘what are the provable truths of the universe’, and truths that have predictive power,” he told his audience (TED, 2022).
This description provides insight into Musk’s metaphysical worldview and his relationship to ‘truth’. By proposing that physics is the discipline best situated to uncover it, Musk is giving expression to a widespread epistemic approach to ‘truth-seeking’ grounded in a materialist/reductionist paradigm which has operated as the dominant modality of western culture since at least the sixteenth century (Lent, 2017).
The rationale for viewing truth through the lens of physics and predictive power typically goes as follows:
Because nature operates according to consistent and discoverable laws, these laws must exist in the fundamental structure of reality itself. It follows that there are external, mind-independent objective truths, such as mathematical statements, which are universal in nature and do not exist merely as a result of being perceived by the human mind. These truths, which exist irrespective of a subjective knower, indicate that math itself is discovered and not invented. In this view, the work of physics is to develop models that align with some fundamental truth about the nature of reality, and the success of a model is a test of how closely we’re able to reflect something intrinsically ‘true’. Therefore, reality must be objectively ‘real’: separate from the observer, independent of mind or perception, and capable of providing feedback to the observer in the form of predictive success.
In this worldview, something is “true” to the extent that it helps us predict future outcomes. Truth is also universal, singular, and the scientific journey of progress is a climb up a fixed hierarchy of discovering truer and truer models (Eriksen, 2024). It is also common within this framing to view this as a project of learning to “command nature” (Eriksen, 2024).
When intellectuals describe truth in this way, what they are describing is a machine. Prediction works because reality is presumed to be a fixed, rigid, and stable system of cause and effect.
In many respects, this reductionist approach is certainly useful. Discovering the consistent rule-based principles guiding nature has allowed us to improve society in areas including engineering, agriculture, medicine, astronomy, and computing. Not satisfied to let useful models be considered merely ‘useful’, however, many intellectuals have been persuaded by their ability to ‘command nature’ that these models are also a proxy for the fundamental truth of reality. As a result, reductionism, with its mechanistic sensibilities, has come to dominate not only our scientific methodologies but also our conceptual frames for understanding existence itself.
As cognitive historian Jeremy Lent demonstrates in his work, “the cognitive frames through which different cultures perceive reality have a tremendous impact on guiding their historical direction” (Lent, 2017). Lent’s argument builds on philosopher Stephen Pepper’s concept of “root metaphors”: foundational assumptions about the nature of reality which operate as hidden assertions underpinning a culture’s knowledge system (Pepper, 1935). Those concepts of truth then inform answers to questions pertaining to ‘the meaning of existence,’ ‘our purpose,’ and ‘how we should conduct ourselves in the world.’ The structures of thinking guided by a society’s root metaphors shape its cultural values and how it makes socio-political choices within its environment.
Therefore, it’s worth taking Elon Musk’s machine-centric views of ‘truth’ quite seriously. More than 200 million people follow his account on X, and given his ventures in space technology and social media, it’s difficult to point toward an individual more influential in shaping our collective cognitive frames and subsequently steering the direction of human activity on earth and in our solar system.
His worldview builds on a variety of conceptual premises that often go unquestioned. They postulate a metaphysics, a term I use to refer to a system of thought regarding the fundamental nature of reality, defined by assumptions situated within a context of scientific materialism, ontological reductionism, and philosophical realism.
As this essay explores, some of the most important discoveries of the last century in fields including environmental science, quantum mechanics, and formal logic are piecing together a view of reality suggesting a metaphysics quite different from today’s dominant worldview. From Kurt Gödel’s incompleteness theorems to Edward Lorenz’s discovery of deterministic chaos, western science is uncovering a nature governed by fundamental limits to the “universality” of our models, by non-linearity and unpredictable change, and by relationships between parts that are far more revealing than any one isolated piece of nature.
And we’re uncovering a universe filled with a plurality of truths, each suited to its appropriate context.
Scientific Materialism
At the core of Musk’s worldview, as is true for many influential thinkers today, is the idea that all phenomena in the universe, including consciousness, can be explained within a materialist paradigm. It’s certainly not a new idea that physical matter is the fundamental substrate of reality (Stoljar, 2024), or that consciousness somehow emerges from complex biochemical activity in the brain. Many scientists simply take the fact that the universe is composed of matter, at a foundational level, as self-evident.
Elon Musk has repeatedly stressed his desire to build rockets that can take us to Mars in order to ‘maximize the probable lifespan of consciousness’ (Musk, 2024), an indication of his foundational materialism. In this understanding, human bodies are consciousness-producing machines, and so some number of them must be relocated to Mars, now a celestial safety deposit box, in order to preserve the existence of consciousness in our universe.
Materialism as a paradigm for explaining fundamental reality, however, is reaching its limitations even within physics itself. Nima Arkani-Hamed, a theoretical physicist at the Institute for Advanced Study in Princeton, has argued that “spacetime is doomed” (Arkani-Hamed, 2017), meaning that spacetime does not appear to be fundamental reality but rather an emergent phenomenon built on something deeper. Arkani-Hamed’s idea is supported by his work introducing ‘the amplituhedron’ (Arkani-Hamed & Trnka, 2014), a geometric structure that simplifies calculations of particle interactions without relying on spacetime or locality (the assumption that particles only affect those close to them), both of which are core assumptions in traditional physics.
By showing that physics can be formulated without reference to spacetime, this work both challenges the deeply held view that spacetime is the arena in which all phenomena occur and suggests that spacetime emerges from something more fundamental, something which may not be the ground from which our conscious experience arises.
At a minimum, Arkani-Hamed’s work suggests we’ll need to reconfigure many of the core assumptions that govern our dominant views of reality and the metaphysics they inform.
Reductionism
Materialists correctly point out that physical reality is guided by a collection of consistent rules, but they conflate the usefulness of discovering these principles with answers to fundamental questions of truth. The pursuit of a ‘theory of everything’, which we now know doesn’t exist (Bischoff, 2024), is fueled by the alluring promise of perfect knowledge and the total control of nature.
To achieve this, a given system must be ‘reduced’ to some defined set of fundamental components to analyze and measure its behavior. In this paradigm, truth can only be known about that which is quantified, modeled, and predicted or can be formalized within a symbolic system like math or language. While reductionism as a methodology is certainly a useful tool, it fails as an ontology for accessing ‘truth’ in several significant ways.
Humanities scholar and AI researcher Alix Rübsaam points out that reductionism assumes that objects can be demarcated into symbols in a universally defined or singular way, a mode of thinking which has only accelerated in the age of digital computation. Rübsaam notes that the process of formalizing the world into datasets ready for analysis is always a culturally embedded practice which can vary greatly. It is therefore a subjective process, contingent on a perceiver making choices about how to structure their taxonomies and modes of analysis (Rübsaam, 2020). This challenges the views of universality which pervade reductionist ontologies.
Complex systems scientist Carlos Gershenson has pointed out that “reductionism is contingent on separation and so ignores interactions between parts. If interactions are relevant, then reductionism fails as a tool for studying, analyzing, and understanding that phenomena or system” (Gershenson, 2011).
And finally, even within physics, reductionism makes frequent use of approximation. Because of the complexity of the real world, simplifying a problem allows physicists to obtain approximate solutions that capture the essential behavior of a system. One example is the ‘ideal gas law’, an equation which assumes certain properties that no gas actually possesses in the real world. The equation nevertheless allows for the study of gases, since the behavior of real gases is described closely by it (Britannica, 2024). Here, reductionism obscures fundamental truths about the properties of a gas, a feature and not a bug of the methodology, in order to be useful. As we’ll explore, classical Newtonian mechanics is built entirely on this approach, because real numbers with infinite precision cannot be collected experimentally in the real world (Volovich, 2011).
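To make the approximation concrete, here is a minimal sketch (my own illustration, not drawn from the cited sources) comparing the ideal gas law with the van der Waals equation, which adds correction terms for molecular volume and attraction; the constants a and b are approximate textbook values for CO2 and serve purely as an example.

```python
R = 8.314                     # gas constant, J/(mol*K)
n, T, V = 1.0, 300.0, 0.001   # 1 mol of gas at 300 K in one litre (0.001 m^3)

# Ideal gas law: PV = nRT. No real gas has the properties this assumes.
p_ideal = n * R * T / V

# Van der Waals equation: corrects for attraction (a) and molecular volume (b).
# Approximate textbook values for CO2, used purely for illustration.
a, b = 0.364, 4.27e-5         # Pa*m^6/mol^2 and m^3/mol
p_vdw = n * R * T / (V - n * b) - a * (n / V) ** 2

print(f"ideal gas law:  {p_ideal / 1e5:.1f} bar")
print(f"van der Waals:  {p_vdw / 1e5:.1f} bar")
# The 'ideal' answer deliberately discards details of the real gas yet lands
# in the same ballpark: approximation as a feature, not a bug.
```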
Reductionist physics in the context of Elon Musk’s views of ‘truth’, therefore, is far more about finding useful tools to control nature than it is about defining fundamental ‘truth’.
Philosophical Realism
Central to Musk’s metaphysics, rooted in philosophical realism, is the idea that an objective reality exists independently from a perceiver. In this view, a separation is assumed to exist between a subjective knower and the objective known. The idea that the truth of reality is mind or perception independent, however, establishes exactly the type of subject/object duality being increasingly undermined by recent developments in fields like quantum mechanics.
While realism is complex territory within philosophy and selectively applied by philosophers depending on the subject matter (Miller, 2024), technologist intellectuals like Musk tend to default to realist views when discussing ‘truth’ and predictive power.
The discoveries of quantum mechanics during the 20th century raise serious doubt that there is a separation between the observer and the observed, a core assertion of realist thought. One of the implications of quantum mechanics is that ‘a quantum system behaves differently when we observe it than how it behaves when we are not observing it’ (Richheimer, 2021). Though this is still debated, with many finding ways of preserving materialist interpretations, Stanford-trained scientist Steven Richheimer points out that the implication seems to be that an independent objective reality doesn’t exist. “Somehow observation is fundamental to how reality manifests” (Richheimer, 2021), which compromises the mind-independence and subject/object duality held by realism. Later, this essay will address the apparent mind-independence and objectivity of math.
Rübsaam, in her work, points out that the philosophical context underpinning reductionist approaches to science in the last few centuries, especially in the west, often held that if a perceiving subject is merely ‘reasonable’, a culturally embedded standard defined and socially reinforced by those in positions of authority, then reality can simply be perceived ‘as it really is’ and is therefore objective (Rübsaam, 2020). This sleight of hand, often invisible to realist thinking, turns a perceiving subject, assumed to be free of the foundational assumptions underpinning their methodology of inquiry, into a neutral observer of an objective reality.
Even within formal logic and mathematics, the idea of being an assumption-free neutral observer fell apart with the discovery of Kurt Gödel’s incompleteness theorems, discussed later, which point out that “what mathematicians can prove depends on their starting assumptions, not on any fundamental ground truth from which all answers spring” (Wolchover, 2020).
Therefore, any formal model of reality is built on foundational assumptions which themselves cannot be proved from within that model. Making choices about assumptions is necessarily the starting point, so scientific, mathematical, or philosophical inquiry into the nature of reality must involve an inseparable link between a subject making choices about the method and structure of inquiry and the object being analyzed.
The view of nature as a machine, while useful, has extended far beyond its appropriate domain and is fueling a global culture built on the principles of separation and unconscious, linear, machine-like repetition. If culture shapes our values, and those values shape history (Lent, 2017), then we are embedded in a society whose metaphysics tells us that we, as conscious subjects, are separated from some fundamental source of reality, one that could be described as a static, rigid, and lifeless machine.
This, in turn, is driving catastrophic outcomes on our planet.
These views need revision to account for a new and rapidly emerging paradigm of complexity science and systems thinking. This emerging modality suggests that reality may in fact hold the intrinsic qualities of being dynamic, interconnected, and in some sense even ‘alive’.
The Emerging Paradigm of Complex Systems
To understand what complexity science is, it’s helpful to visit its origin in the lab of MIT meteorologist Edward Lorenz in 1961, as recounted in James Gleick’s Chaos. Fueled by the deterministic promise of Newton’s laws, Lorenz hoped to use the data-processing horsepower offered by computers, then a breakthrough technology, to reveal the rule-based activities of weather much like astronomers had uncovered the movement of our planets.
At first, Lorenz’s results looked promising. Though computational limits forced him to shape his model into a relatively simple collection of rules, Lorenz mesmerized his colleagues, who would gather around the printout of his god-like prediction machine. Over time, familiar patterns emerged which mimicked the behavior of observable weather in the world. The assumption, then, was that the difference between forecasting weather and predicting the movement of our planets was simply one of data-processing workload. Once computers became capable of handling the increased number of calculations involved in meteorology, forecasting would surely become as exact as charting the movements of the cosmos.
That dream collapsed entirely by accident one morning in the winter of 1961.
Hoping to examine one of his models over a longer stretch of time inside a graphical interface he was developing, Lorenz decided to re-run a simulation he’d previously conducted. To lighten his workload, he gave the second simulation’s computer its initial conditions from a printout of the first simulation, taken at the midway point of its analysis. The second run should have matched the output of the first, yet when Lorenz returned to his office and discovered a model which had quickly and wildly diverged, his first instinct was to assume the computer had malfunctioned.
When Lorenz discovered the true culprit of the discrepancy, however, the insight would, in the words of the committee that would later award him the Kyoto Prize in basic sciences, “[bring] about one of the most dramatic changes in mankind’s view of nature since Sir Isaac Newton.” (Chang, 2008).
The computer that ran his models stored data out to six decimal places, but to save space on his printed results, the numbers were shortened to only three. So an initial condition of .506127 in the first simulation became .506 in the second. Lorenz had assumed that the rounding, a difference of one part in a thousand, was inconsequential (Gleick, 1987).
His discovery laid bare an idea core to understanding complex systems: the future outcomes of systems like weather are tremendously sensitive to their initial conditions. Though the system may be deterministic, perfect prediction is an impossibility. Lorenz’s discovery pointed out that scientists ‘marching under Newton’s banner’ of mechanistic determinism “always made one small compromise, a compromise so small working scientists often forgot it was there lurking in a corner of their philosophies like an unpaid bill. Measurements can never be perfect.” (Gleick, 1987).
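A small numerical sketch makes this sensitivity concrete. The code below is not Lorenz’s original weather model; it uses the now-standard Lorenz-63 equations with simple Euler integration, and starts two runs from states that differ only by the kind of rounding described above (0.506127 versus 0.506).

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz-63 equations one step with simple Euler integration."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

run_a = (0.506127, 1.0, 1.0)  # full-precision initial condition
run_b = (0.506, 1.0, 1.0)     # same state, rounded to three decimal places

for step in range(1, 6001):
    run_a = lorenz_step(run_a)
    run_b = lorenz_step(run_b)
    if step in (250, 1000, 3000, 6000):
        print(f"step {step:5d}   A: x={run_a[0]:8.3f}   B: x={run_b[0]:8.3f}")

# Early on the two runs track each other closely; eventually the difference of
# one part in a thousand grows until the trajectories bear no resemblance,
# which is why perfect long-range forecasting of such a system is impossible.
```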
Infinitely precise measurements don’t exist. Astronomers don’t achieve perfection, but calculations of planetary motion were so accurate that people forgot they were ‘forecasts’ (Gleick, 1987).
This faith in approximation, a foundational pillar of both Newtonian physics and western reductionist science, holds that predictive models can ignore features that have small effects. When predicting the arrival of a comet, for example, “if approximately accurate inputs give approximately accurate outputs, a tiny discrepancy can remain invisible for millions of years” (Gleick, 1987).
The success of the reductive use of approximation in many domains over the past several centuries, Musk’s orbiting Starlink satellites included, has resulted in a scientific landscape which now overuses the tool of reduction beyond the boundaries where it remains appropriate or useful.
Oxford researcher Brian Klaas has pointed out that in the social sciences, researchers in fields like economics, psychology, and political science have come to depend on the reductionist tool of linear regression (Klaas, 2024). By analyzing historical data, linear regression models seek out simplified cause-and-effect relationships to determine which variables drive change in an environment. This approach presumes to convert the messy and dynamic behaviors of nonlinear systems into the cold, predictable machinery of cause and effect.
“By smoothing over near-infinite complexity, linear regressions make our nonlinear world appear to follow the comforting progression of a single ordered line. This is a conjuring trick. And to complete it successfully, scientists need to purge whatever doesn’t fit. They need to detect the ‘signal’ and delete the ‘noise’. But in chaotic systems, the noise matters.” (Klaas, 2024).
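To see the ‘conjuring trick’ in miniature (this is my own toy example, not Klaas’s analysis), the sketch below fits an ordinary least-squares line to a series generated by the chaotic logistic map. The fitted line reports almost no trend, and nearly all of the variance is relegated to the residual, the very ‘noise’ that carries the system’s dynamics.

```python
import numpy as np

# Generate a chaotic series from the logistic map x_{t+1} = r * x_t * (1 - x_t).
r, x = 4.0, 0.2
series = []
for _ in range(500):
    x = r * x * (1 - x)
    series.append(x)

t = np.arange(len(series))
slope, intercept = np.polyfit(t, series, 1)      # best-fit straight line
trend = slope * t + intercept
residual_var = np.var(np.array(series) - trend)  # everything the line discards

print(f"fitted slope:             {slope:+.6f}")
print(f"variance of the data:     {np.var(series):.4f}")
print(f"variance left as 'noise': {residual_var:.4f}")
# The straight line explains almost none of the variation: the behaviour is
# generated by feedback, not by a single ordered trend.
```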
The attribute of nonlinearity inside complex systems means that interactions occurring between parts generate new information not present in the initial conditions (Gershenson, 2011). This concept of feedback loops is core to the inner workings of complex systems.
“When modelling the movement of a hockey puck sliding on ice, for example, you cannot assign a constant to the importance of friction because its importance depends on speed. The speed, however, depends on the friction. The act of playing the game has a way of changing the rules” (Gleick, 1987).
“A complex system is thus not only one whose behaviors are incredibly sensitive to initial conditions, where nonlinearity produces novel information mid experiment, but also one in which the number of independent interacting components is large” (Ladyman et al., 2012).
The paradox, however, is that complex systems also follow “lawlike and causal regularities, and various kinds of symmetry, order and periodic behavior” (Ladyman et al., 2012). That is, they follow rules and patterns. Therefore, complex systems require us to navigate both the rigidity of machine-like repetition and the flexibility of chaotic variability. And much like our universe itself, which we’ll come to, complex systems are adaptive in that they can respond dynamically to changes in their environment, are self-organizing, and function without regard to an external control.
At the conceptual heart of this new paradigm of science, therefore, is the core principle that the dynamic relationships between components in a system are as important as any one isolated part. No longer a talking point exclusive to the tree-hugger types, the claim that reality is intrinsically interconnected as one single harmonizing system is being voiced by science more loudly than ever.
A New Approach to Scientific Inquiry
Approaching ‘complexity science’ as a discrete area of study under its own domain misunderstands the implications of its core idea. Philosopher James Ladyman and colleagues ask “whether there is such a thing as complexity science, rather than merely branches of different sciences, each of which have to deal with their own examples of complex systems” (Ladyman et al., 2012).
Systems thinking, as it’s often called today, is a way of approaching the questions we ask and solutions we build rather than its own field of study. To highlight how this paradigm shift is reorienting entire branches of the scientific landscape, it’s worth exploring the life sciences as one example of a domain currently experiencing a remarkable transformation.
For several centuries, the conceptual frames of reductionism have permeated views of biology, and in the digital age there is a tendency to equate living cells with computers. As an expression of this, Craig Venter, the scientist who led the first team to fully sequence the human genome, is entirely mechanistic in his understanding of biology. He says, “life is a DNA software system. All living things are solely reducible to DNA and the cellular apparatus it uses to run on” (Corbyn, 2013).
This view, a form of genetic determinism, is reckless in its overstatement. Inside a cell, nonlinearity is on full display as billions of interacting molecules change their behaviors from one environment to the next (McCarty, 2024). We now know that the ingredients of life at the molecular level are constantly shifting their activity based on what is happening around them.
Jeremy Lent writes, “since the discovery of DNA in 1953, we’ve come to learn that proteins act directly on the DNA of the cell, specifying which genes in the DNA should be activated. What this means is that there is no such thing as a ‘gene for something’ but rather genes are expressed within the cell because of what is going on around them” (Lent, 2021).
As a dramatic example he highlights the case of grasshoppers and locusts, commonly understood as being different insects which even look quite distinct. Yet there is no taxonomic difference because they have the exact same DNA (Dobbs, 2013).
Lent writes, “When certain kinds of grasshoppers sense its environment changing, either from food scarcity or overcrowding, it can transform itself within hours into an aggressive locust. Its cells switch on different genes within its DNA; it begins shrinking its legs and wings, changes its coloring, even grows its brain to deal with the social complexities of the swarm. Later on, when the environment improves, its cells again switch their DNA settings, and the locust magically transforms back into a grasshopper” (Lent, 2021).
Reflecting Rübsaam’s point about the difficulty of categorizing the world for analysis, a grasshopper and a locust are, genetically speaking, identical. The way in which we articulate a distinction says nothing about their ‘DNA software system’ and is entirely contingent on a relationship of interactions occurring between their genes and the environment.
Perhaps instead of thinking of DNA as a deterministic machine, one emerging metaphor is to consider it like a piano keyboard, where each key accounts for a segment of DNA capable of expressing a certain trait (Miller, 2012). In that sense, genes are not our destiny; rather, DNA sets the boundaries of which notes might be played by the environmental piano player.
Many research teams within the life sciences are certainly aware of the shift in thinking about the nature of biology and many now take a systems approach to their work. However, our cognitive frames have yet to realign to these new understandings. Unlike what reductionism suggests, nature cannot be understood by way of freezing it in place. As Alan Watts writes, “our universe, including ourselves, is thoroughly wiggly” (Watts, 1966).
The Spiritual Worldview: Seeing the Universe as a Complex Adaptive System
Rather than demanding an entirely new metaphysics, developments in complexity science align perfectly well with the spiritual philosophies of yoga. The teachings of P.R. Sarkar, on which the yogic ideas presented here are based, are remarkable not for their novelty but for their synthesis, combining Vedic and Tantric concepts already several thousand years old. Sarkar has mainly reinterpreted these ancient ideas within the context of modern understandings in physics, human physiology, and bio-psychology.
If, as Gödel’s second incompleteness theorem indicates, models of reality rest on unprovable foundational assumptions, the key distinction between realist materialism and the spiritual worldview of yoga can be understood as conflicting claims about the fundamental substrate of reality. Where materialism assumes it is physical matter, the spiritual worldview asserts that consciousness is the substance of existence. The nature of consciousness, if it can be referred to as a thing at all, is an all-pervading field of awareness or fundamental sense of “I”. It is an undefinable formlessness capable of expressing itself as the energetic waves which comprise the material universe and the physical matter we experience.
Yoga tells us that reality is more an imagining mind than a programmed machine.
Within this metaphysics, existence can be thought of as one infinite, vibrating, and wiggling “self”. Indivisible wholeness is entirely counterintuitive to our personal experience as subjective perceivers of separateness, since we experience reality through the perspective of a finite ‘unit self’. The cosmic or absolute self refers to the underlying infinite consciousness in its totality or the singular wholeness of existence (Sarkar, 1955). Yoga, meaning ‘union’ in Sanskrit, not only puts forward these claims through a philosophical knowledge system, but also comprises an embodied set of practices aimed at mediating the relationship between our unit self and its desire to seek union with the indivisible whole. When a unification occurs, the unit consciousness ceases to experience a separate identity (Sarkar, 1979).
In the language of quantum physics, “the wave function, not matter, is fundamental reality” (Richheimer, 2021). A wave function is simply a collection of probabilities about the state of a quantum system, and only becomes ‘real’ in the experienced sense when observed. “The wave function that describes the entire universe is fundamental reality, and from the spiritual point of view is called cosmic mind” (Richheimer, 2021). An important feature of the wave function is that it cannot be expressed as a collection of separate parts, but as an interconnected web of possibilities (Richheimer, 2021).
Thus, what yoga and complexity science share is both a primary focus on interconnected relationships rather than isolated parts, as well as an integration of consistent principles governing a system and the unpredictable ways it can express itself.
The yogic system, though composed of rigid guidelines, also includes the concept of ‘time, place, and person’ (Sarkar, 1957). The moral laws of Yama and Niyama, for example, a core pillar of yogic philosophy, are built on foundational principles that are universal in nature yet cannot be mechanistically codified, due to the inherent flexibility needed to adhere to them. Following Yama and Niyama may require one set of behaviors in one context yet seemingly opposite behaviors in another. It is for this reason that yoga places so much emphasis on developing intuition through meditation as a sensemaking tool, and why Sarkar’s concept of Neohumanism calls for an integration of intuition with the rational logic of conceptual reasoning (Sarkar, 1982).
He writes that, “society is not a static entity, but a dynamic one and so no single economic, political or religious structure can be the permanent answer to humanity’s needs. This is because theories are born in a particular temporal, spatial and economic environment. It may be that something which is quite useful for a particular time, place and person is totally worthless for a different time, place or person. After observing the effectiveness of a theory in a particular context, short-sighted people begin to believe in its eternal effectiveness. This is a total illusion” (Sarkar, 1957).
The idea that the world is a machine or that nature can be conquered with the tools of prediction, simply doesn’t map to this view of existence. Much like in a complex system where dynamic changes in the environment can generate new information not present in the initial conditions, the concept of ‘time, place, and person’ allows for yogic systems to adapt to changes in society. It is this element of yoga which maintains unchanging universal principles, while adapting to the ways that “playing the game changes the rules.”
Complexity science and the spiritual worldview of yoga both invite us to see the intrinsic connectedness of existence and the irreducibility of dynamic systems. We are not merely separate parts but also participants in a unified and connected whole, and for western cultures, these cognitive frames are still foreign to our reductionist patterns of thinking (Lent, 2017).
The Systems Approach to Truth: An Ecology of Contexts
At the core of a metaphysics which equates truth with predictive power is the idea that truth is something inherently universal, unchanging, and singular in nature.
While complexity science doesn’t inherently refute realist philosophy or materialism, it certainly invalidates reductionism as the tool for accessing ‘truth’. Complexity science tells us that perfect prediction is an impossibility; therefore any epistemology which equates truth with predictive power must either concede that absolute truth does not exist, or that it is inherently unknowable.
Yogic philosophy proposes that an absolute truth of indivisible wholeness exists, is tightly coupled with the concept of infinity, but is beyond the scope of the material world. Our physical and conceptual sensemaking systems cannot formally conceive of absolute truth through cognition, logic, or reasoning. Infinity, by definition, cannot be reduced to any definable conceptual or symbolic representation. This infinite wholeness can be directly ‘realized’ as a sort of embodied experience, but it cannot be formalized in any symbolic system of math or language.
Therefore, yoga proposes, and complexity science seems to agree, that there is no one singular ‘truth’. Truths are always contextual, and those contexts have limits to the boundaries within which things can be said to be ‘true’.
To oppose this claim, a Platonist within realist philosophy will point toward mathematical statements such as 1+1=2. The apparent universality of such a fact would indicate that math cannot be a subjective invention of the human mind, that math is discovered and not invented, and that mathematical truths are universal in scope. They argue that this serves as evidence for the existence of external objective truths.
However, rather than confirming an external ground truth, the power of mathematics in its explanatory capability is just as likely an expression of a well calibrated relationship between a particular system of perception and its environment (or the unit self and an aspect of the larger absolute).
A yogic perspective would make space for more nominalist views in pointing out that mathematical objects like numbers, rather than being objectively real entities, are symbols used to reflect back what we perceive in the world. A two isn’t objectively real in some Platonic realm, but rather a symbol humans use to capture the perceptual experience of seeing a ‘pair’ of things, just like the word ‘rock’ is a symbol which encodes a certain object we encounter.
The ‘truth’ of mathematical statements that result from manipulating these symbols is also relative to a particular system’s adopted rules and axioms. For example, when most people think of math where 1+1=2, they are building from a set of axioms, or starting assumptions, put forth by Giuseppe Peano, the developer of ‘Peano arithmetic’ (Hosch, 2024). This system, which forms the foundation of the math taught in grade school, formalized the number theory which underpins everything from basic algebra to the algorithms used by most computers today.
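As a brief illustration of what ‘building from a set of axioms’ means here, the following is a standard sketch (formulations vary by presentation) of how 1+1=2 is derived in Peano-style arithmetic, where S is the successor function, 1 is defined as S(0), 2 as S(S(0)), and addition is defined by a+0=a and a+S(b)=S(a+b):

```latex
\begin{align*}
1 + 1 &= 1 + S(0)      && \text{definition of } 1 \\
      &= S(1 + 0)      && \text{addition axiom } a + S(b) = S(a + b) \\
      &= S(1)          && \text{addition axiom } a + 0 = a \\
      &= S(S(0)) = 2   && \text{definitions of } 1 \text{ and } 2
\end{align*}
```

Every step leans on an adopted definition or axiom; change the starting assumptions and the derivation, and even what the symbols mean, changes with them.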
Published in 1931, Gödel’s two incompleteness theorems shocked the world of formal logic by showing that even Peano arithmetic is incomplete and incapable of proving its own consistency (Sautoy, 2021). Gödel achieved this by encoding self-referential statements into arithmetic using a formal method, demonstrating that there are certain true statements which cannot be proved from within the system.
Though Gödel’s theorems apply specifically to formal systems capable of describing their own rules, their philosophical implications are significant. First, there is no such thing as a ‘theory of everything’, as there will always be some unprovable yet true statements within any symbolic system of logic, contingent on the starting assumptions. Second, it means that the usefulness of even Peano arithmetic, which is certainly considerable, will always be constrained by theoretical incompleteness and will always remain particular to a certain context or domain, even if one large enough to appear to us as universal.
Though Gödel attempted to resolve the implications of his breakthrough within his own realist worldview (Raatikainen, 2022), his theorems indicate that what appears as objective truth in mathematics does not require some independent and separate existence. It simply reflects the internal consistency of a particular formal system. Much like the rules of chess are invented yet, once established, players can “discover” the best moves within that system (Weir, 2024), mathematics is a constructed framework in which discoveries follow from invented rules.
The invented rules of mathematics are a result of things that appear so fundamentally self-evident to our perceptual systems that most people assume they are a representation of an ‘objective’ external truth. And the success of mathematics in science and engineering is due to its effectiveness as a tool to engage with the nature we observe. In a practical sense, the bounded territory within which certain mathematical statements are true may be so large as to reasonably refer to them as ‘universally true’, yet it requires an act of faith to presume that those true statements exist as an objective realm of truth independent from a subjective observer.
Additionally, the label ‘universal’ has a poor track record of remaining permanent.
As Jeremy Lent writes, “even within mathematics, laws once viewed as universally true are sometimes later found to describe a more constrained set of circumstances. For example, Euclid’s laws of geometry were considered universally true until the nineteenth century, when a series of breakthroughs led to the conceptualization of geometry in curved space following different laws, which became known as non-Euclidean geometry. Similarly, Newton’s laws were viewed as universally applicable until Einstein demonstrated they were not valid in certain circumstances. In neither case was Euclid or Newton proved wrong, rather the scope of their laws, once thought to be universal, was constrained by new findings.”
What happened to Newton now appears certain to happen to Einstein and the standard model of cosmology, which currently operates as the dominant ‘universal’ explanation for understanding our cosmos (Dominant Model of the Universe is Creaking, 2024). Scientists had presumed that the universe was expanding at a constant accelerating rate everywhere, based on the assumption that the density of dark energy has been the same since the universe began. New research, recently supported by James Webb Space Telescope observations (Ouellette, 2024), now suggests that the features of dark energy are not constant across spacetime as previously thought, a picture with the distinct odor of a dynamic, feedback-driven, non-linear system. This means that the standard model may hold true within our corner of the universe, but the universe may not be accelerating away from itself in all places. This aligns with a core concept of Sarkar’s teachings, that the idea of a ‘heat death’ of our universe is a myth (Sarkar, 1959), and it looks increasingly plausible to say the universe is itself a complex adaptive system.
Just as appears to be the case in cosmology, and as yogic philosophy argues, truths like Newton’s laws, Einstein’s relativity, the standard model of cosmology, and whatever is coming next will remain contextual rather than universal. Rather than a scientific journey up a fixed hierarchy of truer and truer ideas, as many intellectuals today propose (Eriksen, 2024), both yoga and systems thinking show us that discovering truth is far more about matching a particular truth to its appropriate environment.
Many of the ideas proposed here can feel destabilizing, disorienting and uncomfortable, as if arguing a nihilism in which there is no ground truth to stand on. Taking a yogic view, however, points out that a ground truth, relative to our personal experience of separateness, does in fact exist. But rather than being some separate external realm, ground truth exists in relationship to another part of the indivisible ‘self’ of existence. In that sense, nature, which we’re seeking to understand through the symbolic systems developed from our perceptions of it, is an extension of ourselves.
Reality, then, can be understood as a singular infinity expressing itself as a multitude of finite forms exploring itself through various conscious perspectives. In that sense, the human endeavor of philosophy, science, religion, or any domain seeking absolute ‘truth’, is an expression of infinite consciousness developing finite versions of itself which then work to discover what it, itself, is. This as a process, by definition, will never reach completion. And it is exactly this yogic perspective which helps resolve the apparent self-referencing paradox at the heart of Gödel’s results.
Those who hold reductionist views of a singular objective truth, and who even seem to understand Gödel’s theorem, often dismiss the “self-referential trickery” (Eriksen, 2024) as irrelevant to the project of finding truer models, as if it’s an annoying fly to shoo away. In fact, the self-referential statement in his theorem is pointing at exactly the deeper truth of ‘self-realization’. Much like a finger can never point at itself, it can only be itself, the yogic goal of ‘self-realization’ and union with an indivisible whole is an absolute truth which cannot be conceptually known with logic but only experienced directly.
Conclusion: Grounding Truth Within Neohumanism and the Practical Reality of the Material World
At its core, the project of yoga is one of aligning both the mental and physical patterns of the individual (unit self) and the collective movement of society with the energetic wave signatures of the cosmic mind. When an individual’s mental and physical energetic patterns merge with the thought projections of the cosmic mind, a distinction between the two no longer exists, resulting in the experience of ‘union’ to which the word yoga refers.
Neohumanism, as a philosophical orientation built on these spiritual ideas, comprises a variety of tenets intended to align a society with the deeper principles intrinsic to the cosmic mind. It is therefore challenging to place Sarkar’s views cleanly into the opposing categories of realist and idealist philosophy. His views may be classified as a form of spiritual monism, situated within an idealist view of consciousness as fundamental, yet they maintain elements more typically associated with realist and Platonist thinking.
This is in part because Sarkar rejects the idea, common in some varieties of spiritual thinking, that the material world is somehow just an illusion. Though derived from the cosmic mind as a thought projection of infinite consciousness (idealism), in a practical sense the material world is real and deserving of our full participation, grounded in ethical principles universal to humanity.
Therefore, while his views are certainly relativist, he rejects versions of relativist thinking which propose that no universal benchmarks exist with which to analyze moral behavior or structure society (Sarkar, 1957). Much like systems thinking attempts to reconcile rule-based patterns with the unpredictability of variable expression, Neohumanism proposes that there are deeper principles of morality intrinsic to the cosmic mind and therefore universal across humanity. Sarkar has criticized forms of moral relativism from the past which have resulted in confusion within society (Sarkar, 1957).
While Sarkar does maintain that a subjectivity exists at the heart of even the most objective-seeming truths, his Neohumanism grounds these truths as practical elements to be used in the material world. Rübsaam, in her work, uses the phrase ‘operational truths’ to account for the usefulness of concepts perceived as ‘objective’ through most reasonable sensemaking systems.
This inherent subjectivity, together with the fluidity of the boundaries of what is considered ‘reasonable’, underpins Neohumanism’s call to integrate conceptual reasoning (rationalist approaches) with the development of intuition through structured practices of meditation. Intuition developed through meditation is simply an alternative methodology of inquiry into the nature of existence, and one that should be integrated with the tools of logical reasoning.
Neohumanism is also rooted in a practicality that would find it unnecessary to debate whether mathematical concepts platonically exist as some aspect of existence or are invented through human conception. It is more concerned with the capacity of math as a practical tool in the service of promoting welfare for society based on deeper principles universal to humanity.
Therefore, questioning the limits of reductionism as a mechanism for discovering truth is in no way an indictment of its ability to uncover useful models. What both systems thinking and yogic perspectives tell us is that defining the correct methodology or system of inquiry, whether logic, intuition, or something else, requires situating it within the appropriate context.
Measured against the backdrop of several centuries of reductionist thinking in the west, science is only just awakening to the new paradigms of systems thinking. Our mainstream thought structures are still saturated by centuries of language and metaphors equating nature with a machine (and more recently with computers) and it will take time for a paradigm shift to penetrate our metaphysical perceptions of reality.
Reductionist systems of analysis will continue to offer views into the patterns of nature which allow us to control our environment. Our cognitive frames, however, need time to absorb the lessons of complexity science asking us to see reality, not as a rigid machine to be controlled with a fixed universal truth, but as a dynamic living system full of contextual truths to align with in harmony.
Aaron Frank is a researcher, writer, and consultant who has spent over a decade in Silicon Valley advising senior leaders on issues related to emerging digital technology. He currently serves as a Global Fellow at Singularity University where he teaches about the impact of accelerating technological change on business, society, and culture. He routinely advises companies, startups, and government organizations with clients including several national governments, Ernst & Young, and many others. As a writer, his articles have appeared in Vice, Wired UK, Forbes, and Venturebeat. He also guest lectures on technology innovation at Oxford University’s Saïd Business School where he completed his MBA.
References
Arkani-Hamed, N. (2017, December 1). The Doom of Spacetime—Why It Must Dissolve Into More Fundamental Structures—Nima Arkani-Hamed. PSW Science. https://pswscience.org/meeting/the-doom-of-spacetime/
Arkani-Hamed, N., & Trnka, J. (2014). The Amplituhedron. Journal of High Energy Physics, 2014(10), 30. https://doi.org/10.1007/JHEP10(2014)030
Bischoff, M. (2024, November 29). Math and Physics Can’t Prove All Truths. Scientific American. https://www.scientificamerican.com/article/math-and-physics-cant-prove-all-truths/
Britannica. (2024, October 25). Ideal gas law | Definition, Formula, & Facts | Britannica. https://www.britannica.com/science/ideal-gas-law
Chang, K. (2008, April 17). Edward N. Lorenz, a Meteorologist and a Father of Chaos Theory, Dies at 90. The New York Times. https://www.nytimes.com/2008/04/17/us/17lorenz.html
Corbyn, Z. (2013, October 13). Craig Venter: “This isn’t a fantasy look at the future. We are doing the future.” The Observer. https://www.theguardian.com/science/2013/oct/13/craig-ventner-mars
Devon Eriksen [@Devon_Eriksen_]. (2024, April 17). None of that matters, because truth is objective, and can be tested, and has one simple, objective definition. Truth is predictive power. That’s it, that’s all, that’s everything. A statement is “true” to the extent that it helps us predict the future, and the outcomes, in that future, of choices. [Tweet]. Twitter. https://x.com/Devon_Eriksen_/status/1780708940076159444
Dobbs, D. (2013, December 3). The selfish gene is a great meme. Too bad it’s so wrong | Aeon Essays. Aeon. https://aeon.co/essays/the-selfish-gene-is-a-great-meme-too-bad-it-s-so-wrong
Elon Musk [@elonmusk]. (2024, November 21). To be clear, I have not done any media interviews and this is not actually my checklist. I am trying to make life multiplanetary to maximize the probable lifespan of consciousness. Some of the items below are needed for that. [Tweet]. Twitter. https://x.com/elonmusk/status/1859605758091894795
Gershenson, C. (2011). Complexity (arXiv:1109.0214). arXiv. https://doi.org/10.48550/arXiv.1109.0214
Gleick, J. (1987). Chaos: Making a New Science. Viking Penguin.
Hosch, W. (2024, November 22). Natural number | Definition & Facts | Britannica. https://www.britannica.com/science/Peano-axioms
Ladyman, J., Lambert, J., & Wiesner, K. (2012). What is a Complex System?
Lent, J. (2017). The Patterning Instinct: A Cultural History of Humanity’s Search for Meaning. Prometheus Books.
Lent, J. (2021). The Web of Meaning. Profile Books.
McCarty, N. (2024, September 2). A Holistic View of the Cell. The Latecomer. https://latecomermag.com/article/a-holistic-view-of-the-cell/
Miller, P. (2012, January 1). A Thing or Two About Twins. National Geographic Magazine. https://www.nationalgeographic.com/magazine/article/identical-twins-science-dna-portraits
Ouellette, J. (2024, December 9). Latest James Webb data hints at new physics in Universe’s expansion. Ars Technica. https://arstechnica.com/science/2024/12/latest-james-webb-data-hints-at-new-physics-in-universes-expansion/
Pepper, S. C. (1935). The Root Metaphor Theory of Metaphysics. The Journal of Philosophy, 32(14), 365–374. https://doi.org/10.2307/2016759
Raatikainen, P. (2022). Gödel’s Incompleteness Theorems. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Spring 2022). Metaphysics Research Lab, Stanford University. https://plato.stanford.edu/archives/spr2022/entries/goedel-incompleteness/
Richheimer, S. (2021). The Fallacy of Materialism: How Consciousness Creates the Material World and Why It Matters.
Rübsaam, A. (Director). (2017). Critical Thinking, Decision Making, and Ethics [Video recording].
Sarkar, P. R. (1955). What is this World?
Sarkar, P. R. (1957). The Great Universe: Discourses on Society. In A Guide to Human Conduct.
Sarkar, P. R. (1959). Idea and Ideology.
Sarkar, P. R. (1979). A Yogi Must Certainly Be a Theist. In Yoga Sádhaná.
Sarkar, P. R. (1982). Liberation of Intellect: Neo-Humanism.
Sautoy, M. du. (2021, July). Marcus du Sautoy: The paradox at the heart of mathematics: Gödel’s Incompleteness Theorem | TED Talk. https://www.ted.com/talks/marcus_du_sautoy_the_paradox_at_the_heart_of_mathematics_godel_s_incompleteness_theorem
Stoljar, D. (2024). Physicalism. In E. N. Zalta & U. Nodelman (Eds.), The Stanford Encyclopedia of Philosophy (Spring 2024). Metaphysics Research Lab, Stanford University. https://plato.stanford.edu/archives/spr2024/entries/physicalism/
TED. (2022, April 14). Elon Musk talks Twitter, Tesla and how his brain works—Live at TED2022. https://www.youtube.com/watch?v=cdZZpaB2kDM
The dominant model of the universe is creaking. (2024, June 19). The Economist. https://www.economist.com/science-and-technology/2024/06/19/the-dominant-model-of-the-universe-is-creaking
Volovich, I. V. (2011). Randomness in Classical Mechanics and Quantum Mechanics. Foundations of Physics, 41(3), 516–528. https://doi.org/10.1007/s10701-010-9450-2
Watts, A. (1966). The Book: On the Taboo Against Knowing Who You Are. Vintage Books.
Weir, A. (2024). Formalism in the Philosophy of Mathematics. In E. N. Zalta & U. Nodelman (Eds.), The Stanford Encyclopedia of Philosophy (Spring 2024). Metaphysics Research Lab, Stanford University. https://plato.stanford.edu/archives/spr2024/entries/formalism-mathematics/
Wolchover, N. (2020, July 14). How Gödel’s Proof Works. Quanta Magazine. https://www.quantamagazine.org/how-godels-proof-works-20200714/