Another installment of my
serialized work about the “meaning” of biological complexity and what it has to
say about our collective understanding of the natural world. This part examines
the methods we use to conceptualize our world, and some of their built-in
shortcomings…the ways we limit ourselves by seeing life as a product of
material processes and nothing more. Following this section I’ll take readers
on a tour of the cell—a look into the inner workings of every living thing.
IV. Materialism’s Two-Edged Sword
The problem is to construct
a third view, one that sees the entire world neither as an indissoluble whole
nor with the equally incorrect, but currently dominant, view that at every
level the world is made up of bits and pieces that can be isolated and that
have properties that can be studied in isolation.… In the end, they prevent a
rich understanding of nature and prevent us from solving the problems to which
science is supposed to apply itself.
Richard Lewontin, Biology as Ideology: The
Doctrine of DNA
We can still offer thanks to that brilliant
17th-century French mathematician René Descartes for turning the entire universe
into an arena open to rational investigation. Materialistic naturalism (or materialism)—a
product of the Cartesian worldview—is
responsible for the basic premises of the scientific method and its inherent
assumption of our world’s intelligibility. The ostentatious-sounding name
simply denotes a philosophic belief that physical matter is the only reality
and that everything in the universe (including thought, feeling, mind, and
will) can be explained solely in terms of physical law. And, by extension, that
all phenomena—being the result of purely natural causes—need no explanation
involving any sort of moral, spiritual, or supernatural influence. Materialism, by way of its incomparable capacity to describe the world we experience and, through its discoveries, to make predictions that allow the control of nature, has made possible most of the great advances that led directly to our
current way of life. From medicines and improved food crops to scanning
electron microscopes and Doppler radar, science’s creative productions surround
us at all times.
Very little was known about genes before the discovery of DNA
confirmed their material nature. Many experimental doors were thrown open once
it was perceived that the double helix acted as an actual bearer of hereditary
information. After that historic shift, impressive successes within the
fledgling field of molecular biology (thanks to brand-new technologies used to analyze and manipulate enzymes and nucleic acids) provided confidence
that unscrambling the code hidden within the physical substance of heredity
would reveal evolution’s secrets. This led to an attitude of unreserved
certainty that life, just like other aspects of the material universe, would
eventually be explained purely on the basis of its chemical nature and
molecular parts. Riding the wave, Francis Crick (a steadfast physicalist)
wrote, “The ultimate aim of the modern movement in biology is to explain all biology in terms of physics and
chemistry.”
Not long after its inauguration, the modern Darwinian synthesis
came to be recognized as the virtual
foundation of biology and, to this day, remains the dominant paradigm of evolutionary thought. However,
biochemical research conducted during the last few decades, plus findings from entirely new lines of inquiry (such as the discovery of quorum sensing in bacterial consortia,
whereby colonies of microbes interact socially via chemical signaling) have
revealed that all biochemical processes are far more complex
than was ever imagined and sometimes—as in the case of the “behavior” of DNA
repair enzymes—appear to go beyond the dictates of basic chemistry. As would be
expected, molecular biologists and biochemists, long committed to a materialist
stance, are unwilling to even consider that living things function by means other
than straightforward chemical processes.
The institution of science features a genuine and powerful
taboo against entertaining hypotheses that might open the door to any sort of
supernatural explanations. However, for the same reasons that our conceptions of what life is deserve a fresh look,
newly revealed types of biological complexity invite a re-evaluation of the way
we perceive all natural processes, including current evolutionary theory and
theories of mind.
This shift is actually well underway despite a lack of media
attention (which nowadays correlates to
a lack of public awareness). Currently, the Gaia hypothesis—the concept that Earth itself can be regarded as a
colossal organism responding to stimuli via feedback mechanisms—has gained
increasing support. Another fascinating viewpoint, which never attracted wide
attention in the West, was that of Russian scientist Vladimir Vernadsky.[1]
His theory is related to the Gaia hypothesis but preceded it by decades. Unlike his early 20th-century contemporaries, Vernadsky saw life as a geological force that had a profound effect on the world by chemically and physically altering its surface and by transforming and transporting materials around the globe. In line with the Gaia concept, he saw life as a global phenomenon—but
more as part of a larger process. He didn’t
regard it independently from other aspects of nature; it was just “living matter.” Furthermore, he pictured
the entire biosphere—including both animate and inanimate matter—as being
alive. By rejecting accepted
principles and manners of categorizing and labeling, Vernadsky was able to
formulate a new and conceptually coherent world model.
In contrast to such notions, one derivative of materialism was
the cultivation of a powerful idea: that all entities are best studied and can
be understood by reducing them to their parts. This is known as reductionism or reductive thinking, an essential tool in gaining empirical knowledge.
How could anatomy be tackled except through the study of individual organs,
their tissues, and their tissues’ cells? In biology, this time-honored approach has continually proven its worth
and has been key to most advances.
However, there is currently a burgeoning
movement away from reductionism in the life sciences toward a more inclusive,
wide-ranging “systems” approach, which a growing number of scientists regard as
the way of the future. Reductionist methods serve admirably to understand, say,
how circulatory systems work or how nerve impulses are conducted, but ultimately
break down when considering all the layers of complexity in their state of
seamless integration. As part of this long-needed reappraisal of
biological thinking, reductionism has taken on a suspect standing in certain
areas where its shortcomings are seen to hinder further progress (such as those
fields dealing with pattern and form, or ones where intricate relationships
resist quantification). Many of the criticisms leveled against reductive
thinking are, in fact, directed toward what has been termed radical reductionism—the assumption
that larger-scale phenomena can be explained entirely by what occurs at smaller
scales. (As in, Biology is nothing more
than chemistry.)
One of the main limitations of reductive
thinking is seen in its tendency to downplay or ignore so-called emergent properties—patterns,
behaviors, or traits that can't be deduced from either lower or higher levels of organization. (The deliberate, organismic behavior of DNA helper molecules
being a fine example.) Nonetheless, despite recognized limitations,
reductionism will doubtless continue to occupy its central role in all
scientific endeavor.
Another unforeseen spinoff of endless
scientific triumphs has been an ever-increasing tendency to fragment knowledge
into discrete areas of specialization. Biology professor and science journalist
Rob Dunn writes of the resulting quandary:
For
individuals, it has become more difficult to have a broad perspective. The
scientists of each field have developed more and more specific words and
concepts for their quarry. It is now difficult for a neurobiologist to
understand a nephrologist and vice versa, but it is even difficult for
different neurobiologists to understand each other. The average individual’s
ability to understand other scientific realms has become limited…. The more
divided into tiny parts a field is, the less likely some types of big
discoveries become.… [V]ery few individuals are standing far enough back from
what they are looking at to be able to make big conceptual breakthroughs.
Entirely new sub-sub-disciplines emerge
continuously—such as paleomicrobiology
or biogeochemistry—and the
compartmentalization of knowledge leads to an overly narrow focus on complex
issues (such as what can result if a scientist devotes their career to studying
one facet of one variety of cell in one type of organism). This almost inevitably results in skewed perspectives,
even within someone’s own discipline. The same can happen on a larger scale; in
one example pertinent to this narrative, only a century ago, the intellectual
isolation of several interrelated fields led to a sort of “scientific
provincialism” that created a need for the modern synthesis.
Neo-Darwinism rose to become the dominant working model of evolutionary pathways by virtue of that succession of breathtaking new discoveries in molecular biology. Due in part to the lucid and entertaining writings of authors such
as Richard Dawkins, Carl Zimmer, and Sean B. Carroll, the elegant simplicity of
neo-Darwinian precepts leads scientifically literate people to believe that any
unexplained mysteries of evolution have by and large already been solved…or
soon will be.
Among our well-educated populace, whose image of reality is now
effectively defined by science, a mind-set is often on display that could be
characterized as a self-assured but naïve conviction that this worldview
represents an objective description of the “real” world. Indeed: the people of
any given era and culture share a generally agreed-upon overall worldview;
those inhabiting a given period feel sure that their view of reality—having superseded the mistaken and antiquated beliefs of generations past—is entirely
accurate. History reveals this certainty to be a recurring cultural illusion, though people almost invariably ignore this truism. During a speech at Cambridge in 1923, J.B.S. Haldane (the foremost popularizer of science in his day) said, “Science
is in its infancy, and we can foretell little of the future save that the thing
that has not been is the thing that shall be; that no beliefs, no values, no
institutions are safe.”
This was a prescient statement in Haldane's time, but the
observation is no less true today. So…what’s our excuse? Living in the 21st century at (or near)
civilization’s zenith, having seen so
much, one would think we would have perceived this historical pattern and humbly acknowledged that at least some things
fervently believed to be irrefutable facts will become obsolete anachronisms.
Alas: History is made…and then ignored.
As it has always been, our grandchildren will look back on their ancestors’
quaint and primitive ways, marvel at the hilariously crude machines, and long
for simpler times.
Another thing: religion, for many educated people, has been
supplanted by science as their reality-defining milieu. Faith in God has been
exchanged for faith in the scientific approach to such an extent that there’s a
term, in use since the mid-1800s, for what has been recognized as a philosophy
or even a quasi-religion: scientism.[2]
As such, it represents a wholly materialistic standpoint, insisting that only
empirical methods are capable of providing accurate views of reality and that
all other modes of thought, if not just plain wrong, lack substance. Those
unacquainted with the nitty-gritty of actual research or how things work in
academia often seem rather unaware of how messy the scientific arena can be,
with incessant struggles for funding, researchers’ sometimes slipshod work, the
bitter rivalries and envy—even the occasional outright frauds.
Alan Lightman, an astronomy and physics researcher, put it this way: “[O]ne must distinguish
between science and the practice of
science. Science is an ideal, a conception of logical laws acting in the world
and a set of tools for discovering those laws. By contrast, the practice of
science is a human affair, complicated by all the bedraggled but marvelous
psychology that makes us human.” Stephen Jay Gould further emphasizes that
practitioners of science often fall prey to cultural predispositions:
Our ways of learning about the world are strongly
influenced by the social preconceptions and biased modes of thinking that each
scientist must apply to any problem. The stereotype of a fully rational and
objective “scientific method” with individual scientists as logical (and
interchangeable) robots is self-serving mythology.
The truth behind
Gould’s words is exposed by another striking historic pattern: phenomena being
accounted for in language linked to a particular era’s latest promising
discovery or leading technology. In
times past, the biological vital force was ascribed to fire, magnetism,
electricity…even radioactivity. And thus has it been the “fashion” to describe
observable fact using a succession of terminology borrowed from clock making,
steam power, radio technology, and electrical engineering. In the last century,
life processes have been considered in terms of quantum dynamic effects,
computer science, game theory, and non-linear dynamics. (Presently we are
undergoing a shift toward various biological topics being thought of in terms
of information theory.) Recognizing the consequence of such influences reveals
the subtle effect culture has on the practice of science. Tellingly, it’s
impossible to even imagine what technology our next contextual aids might be
borrowed from.
In our time, there
is a widely held conviction that scientists, in order to be considered scientists, are limited
exclusively to materialist explanations for all phenomena. Materialistic
naturalism grants no basic worth or import to nature’s rich pageantry, not to mention the pre-eminence of mind.
Disregarding the mysterious nature of life,
a materialistic approach pays no heed
to the reality of things beyond its reach—at the extreme end, going so far as
to argue that concepts like beauty and morality are illusions serving no
constructive purpose…that all living things, including humans, are the result
of chance events…that life,
ultimately, has no object or underlying significance.
This is the stance
of well-known evolutionary scientists Jacques Monod, George C. Williams, and
Richard Dawkins—each of whom has rebuked (in some instances quite harshly)
those who make the cardinal error of inserting subjectivity into matters that
lie within science’s domain. Their position is entirely justifiable in the
context of science’s dealing exclusively with matters within reach of external
verification, but not things that can
only be experienced. However, those authors go beyond simply reminding their
readers that science is powerless—is the wrong tool—to make judgments about
concerns like morality or beauty; they unswervingly insist that the products of
our minds and sense organs have no objective value per se, aside from how they
might contribute to genetic success. Dawkins informs us that life “is just
bytes and bytes and bytes of digital information”…that life’s sole “purpose” is
for DNA to make more DNA. (Oddly, he
never bothers to ask why this should be so.)
Once again: we don't even know what life is, much less what its “purpose,” if any, might be. Even if the
public isn’t aware of it, there are yet a host of unanswered questions. And
others that haven’t yet been asked. Science, taken as a whole, is likely
humanity’s greatest innovation and our bequest to—hopefully—a bright future
(or, in the ever-astute Gould’s more incisive words, “whatever forever we allow
ourselves”). Lifting us out of darker ages, the work of all those individuals,
building on that of their predecessors, made our way of life possible—another oft-overlooked
detail. But we still aren’t close to knowing precisely how our senses work,
what dreams are for, or where memories reside. Consciousness, the greatest of
all mysteries, remains a variegated enigma. Lacking humility, we persist in
taking as a given our ability to perceive, evaluate, and act with consistent
propriety…or restraint. Which leads to problems.
The physicist Richard Feynman, widely
considered in his day to be one of the most brilliant people alive, said in an
interview:
I
can live with doubt and uncertainty and not knowing. I think it’s much more
interesting to live not knowing than to have answers which might be wrong. I
have approximate answers and possible beliefs and different degrees of
certainty about different things but I’m not absolutely sure of anything and
there are many things I don’t know anything about such as whether it means
anything to ask why we’re here…. I don’t have to know an answer. I don’t feel
frightened by not knowing things….
While it’s generally assumed that scientists
are objective and impartial in their views, such intellectual bravery and
humility as this is vanishingly rare. For Richard Feynman (who suffered from terminal
curiosity) it always came down to the pure joy of discovery. Charles Darwin also displayed
this quality in spades; he had the courage to question his own views, and
openly invited others to challenge his cherished theories.
So why this tendency to feel such certainty, such resolute assurance, that the wondrous things all
around us ultimately have no importance…or are of no particular consequence?
Why the need for such staunch conviction? And what exactly does this say about
our culture, that we denigrate life so? Even if philosophy and religion are
left out of the picture entirely, individuals will still seek meaning and
purpose in the world, invite beauty into their lives, and go on living by a
moral code based on what they believe
to be objective values. Such things are still
fundamental, inescapable aspects of what it is to be human. For mystics,
atheists, and rational materialists alike.
Yes, we eternally owe a debt of gratitude to those great minds
that made our modern way of life possible. But, in an essay—almost a
manifesto—co-written by Dorion Sagan, Lynn Margulis, and Ricardo Guerrero, we
are reminded of another way:
Perhaps Descartes did not dare admit the
celebratory sensuality of life’s exuberance. He negated that the will to live
and grow emanating from all live beings, human and nonhuman, is declared by
their simple presence. He ignored the existence of nonhuman sensuality. His
legacy of denial has led to mechanistic unstated assumptions. Nearly all our
scientific colleagues still seek “mechanisms” to “explain” matter, and they
expect laws to emerge amenable to mathematical analysis. We demur; we should
shed Descartes’ legacy that surrounds us still and replace it with a deeper
understanding of life’s sentience. In [Samuel] Butler’s terms, it is time to
put the life back into biology.
Promoting this way
of thinking is my intent. But to take it even farther—not simply to put the
life back in biology, but to replace it with a greater concept of life. While fully cognizant of the sheer
unlikelihood that a non-scientist could perceive things about the natural world
that have somehow been overlooked by untold numbers of highly trained
professionals, I have been unable to shake this powerful conviction: There is a much deeper reality behind the
way we currently perceive nature. As with DNA, our views of the way cells
work show how little we credit the power of Natural Design.
©2016 by Tim Forsell. Draft, 25 Feb 2016.
[1] Vladimir Ivanovich Vernadsky (1863–1945) is considered one of the founders of geochemistry, biogeochemistry, and radiogeology, and
also popularized the concept of the noösphere. “In Vernadsky’s theory of the
Earth’s development, the noösphere
[human and technological] is the third stage in the earth’s development, after
the geosphere (inanimate matter) and
the biosphere (biological life).
Just as the emergence of life fundamentally transformed the geosphere, the
emergence of human cognition will fundamentally transform the biosphere. In
this theory, the principles of both life and cognition are essential features
of the Earth’s evolution, and must have been implicit in the earth all along.…
Vernadsky was an important pioneer of the scientific bases for the
environmental sciences.”
[2] The term is often used pejoratively by those who insist that scientism results in an impoverished worldview.