Cybernetics in the 3rd Millennium (C3M) -- Volume 2 Number 2, Feb. 2003
Alan B. Scrivener --- www.well.com/~abs --- mailto:email@example.com
Biological Computing: The Next Big Thing?
As a youth one of my main heroes was Leonardo da Vinci, and I wanted
to be a Renaissance Man like him. His sketches of helicopters
and parachutes appealed to me the most; his anatomy studies the least.
But the last "life sciences" class I took (with one exception which
I'll mention later) was sophomore biology in high school, taught by
the football coach, and infamous because of its "health" (code for
"sex education") segment, and for the frog dissecting that seemed
designed to discourage most of us from wanting to be doctors. So
I had set a course to become a so-called Renaissance Man with little
grounding in the life sciences.
In 12th grade I happened upon the "Whole Earth Catalog," which
seemed like the kind of thing a Renaissance Man would read.
( www.amazon.com/exec/obidos/ASIN/1892907054/hip-20 )
I quickly discovered that it was filled with glib, counter-culture
prognostications like: "ecology and education are going to come
together, and the schools don't know it." I found that its editor,
Stewart Brand, had been a biology major, and his search for an
understanding of whole systems came with a life sciences perspective.
It was from the Whole Earth Catalog that I learned both the word and
the ideas behind cybernetics. The most interesting things about
Norbert Wiener's work were his mathematical tools for function
prediction and his ideas for feedback mechanisms; the least
interesting were his collaboration with physician Arturo Rosenblueth
and his detailed examinations of biological systems.
One of the amazing things the Catalog pointed me at was the
computer-simulated apocalypse in "The Limits to Growth: A Report
for the Club of Rome's Project on the Predicament of Mankind"
(1972) by Donella and Dennis Meadows.
( www.amazon.com/exec/obidos/ASIN/0451136950/hip-20 )
It used computer models to predict that in the absence of mitigating
factors our technological civilization would "crash" somewhere around
2000 to 2010 (depending on some assumptions). It got my attention.
Clearly, we needed to hop on this right away, and begin producing
more detailed models and studying these preliminary results very
carefully.
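To give a feel for the kind of modeling involved, here is a minimal
sketch in Python (my own invention, with made-up parameters -- it is
nothing like the actual World3 model the Meadows team used) of a
population that grows while drawing down a finite resource,
overshoots, and then declines:

    # A minimal sketch of a "Limits to Growth"-style feedback model:
    # population grows while it consumes a finite resource, and goes
    # into decline once the resource is depleted.  All numbers are
    # invented for illustration.

    def simulate(years=200, pop=1.0, resource=1000.0,
                 growth=0.03, use_per_capita=0.5):
        history = []
        for year in range(years):
            # fraction of the original resource still remaining;
            # growth falls and decline rises as it is used up
            abundance = resource / 1000.0
            births = pop * growth * abundance
            deaths = pop * growth * (1.0 - abundance)
            pop = max(pop + births - deaths, 0.0)
            resource = max(resource - pop * use_per_capita, 0.0)
            history.append((year, pop, resource))
        return history

    if __name__ == "__main__":
        for year, pop, resource in simulate()[::20]:
            print(f"year {year:3d}  population {pop:7.2f}  "
                  f"resource {resource:7.1f}")
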
Meanwhile the Whole Earth Catalog had its own apocalypse to contend
with. It was losing money with every issue, so Brand decided to shut
it down, publishing the "Last Whole Earth Catalog" in 1972. It became
a surprise New York Times bestseller, and two years later Brand
revived it as the "Whole Earth Epilog: Access to Tools" (1974)
( www.amazon.com/exec/obidos/ASIN/0140039503/hip-20 )
This new edition highlighted the ideas of Gregory Bateson, who
coincidentally lived four doors down from me on the campus of
the University of California at Santa Cruz. Through a series of
events I describe elsewhere, he became my mentor, and among the
important things he taught me were:
- the best way to develop a good intuition for complex
systems (and to approach a state of wisdom) is to study
natural history, that is, the record of what life has done
- in any biological system, no variable can grow with
an unlimited exponential curve -- something will "crash"
His most recent book at the time was his collection of short works:
"Steps to an Ecology of Mind: Collected Essays in Anthropology,
Psychiatry, Evolution, and Epistemology" (1972)
by Gregory Bateson
The hardcover edition included a short 1970 essay cut from the
paperback edition, "On Emptyheadedness Among Biologists and State
Boards of Education," which asserted:
"Evolution has long been badly taught. In particular,
students -- and even professional biologists -- acquire
theories of evolution without any deep understanding of
what problem these theories attempt to solve. They
learn but little of the evolution of evolutionary theory."
He observed that in attempting to resist the pseudo-scientific
agenda of creationists, evolutionists were becoming inflexible
and dogmatic themselves.
Soon afterwards I got Bateson to sponsor me in a student-directed
seminar called "Understanding Whole Systems," in which I attempted
to organize all knowledge the way a Renaissance Man should.
When I got to the session on biology, I drew a blank; I still
didn't feel I knew enough to say anything non-trivial.
A few weeks later Stewart Brand came to town and I got him
to address my seminar. It was a pleasant enough session
(I had him grappling with the not-very-fruitful question, "What
is reality?") but he also addressed a Cybernetics class offered
by the Information Science department, and I sat in on it.
Brand was singing the praises of a book he was reading, and had
not even finished, called "Perspectives in Ecological Theory" (1968)
by Ramon Margalef.
He called it a "tough little book," but insisted it was the
most cybernetic thinking on ecology he'd read. He was impressed
that Margalef had done most of his work with ocean plankton, but
had drawn conclusions that seemed to apply to most ecosystems.
A few months after that Brand reprinted an abridged form of the
book in his magazine The CoEvolution Quarterly (Summer 1975) --
the same issue that was the first non-specialist publication to
carry the Gaia Hypothesis by Lynn Margulis and James Lovelock.
Having read the whole book, I'd actually say it was improved
by the editing (a little less detail on plankton, thank you) as
well as by the charming but informative diagrams by Peter Warshall
and Carol Kramer. The book begins with a look at the "classic"
approach to systems: model a set of variables connected by Ordinary
Differential Equations (ODEs), and then stand back and say: "B-but
we can't solve these analytically." Then he goes on to identify
"heuristics" he's developed for "rule of thumb" analysis where
rigor fails. This is quite like the approach taken by
Ludwig von Bertalanffy in "General System Theory" (1968).
But Margalef has some better heuristics. Starting with the bald
fact that predators tend to remember more than their prey (the
prey only learn when the predation attempt fails), he goes on
to develop a powerful abstraction of the idea of "exploitation."
The next year I finally took my one college course in a life
science: it was an interdisciplinary seminar taught by Gregory
Bateson (an inter-disciplinarian in his own right, whose father,
after all, had coined the word "genetics") and Robert Edgar (who
was lured away from Caltech to found Kresge College at UCSC;
I'm told that if he had stayed, and finished mapping the DNA of
the tobacco mosaic virus -- the first organism whose genome was
completely mapped -- he would have certainly gotten the Nobel Prize).
The course was called "The Evolutionary Idea," and the name was a
three-way pun which referred to:
- the idea of evolution (evolution as mental process)
- the evolution of ideas (minds and thought processes as
an analog to natural selection)
- the evolution of evolution (how the theory of evolution
has itself evolved over time)
One of the problems we wrestled with was the "punctuated equilibrium"
in the fossil record. Nowhere do we actually see a species being
created, despite Darwin's claim to have explained "The Origin of
Species" with the theory of natural selection.
Instead, we see species remaining constant for millions of years,
except mostly for changes in the sizes of parts. The argument made is
usually that the creation of species happens too fast for the fossil
record to catch. But biologists have struggled to explain why. I
argued for the idea that DNA could somehow
detect "stress" in the phenotype (organism bodies) and increase its
own rate of mutation of the genotype in response, so that DNA could
control its own rate of change. They were aghast -- not because they
thought I was wrong (they actually sort of liked the idea) but because
it went against the "central dogma" of Watson and Crick, and no such
research could get funded in the current academic climate. I had to
ask myself why scientific researchers who were supposed to be
searching for truth would ever use the word "dogma" to describe
their preliminary results.
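Just to illustrate what I was arguing, here is a toy sketch in
Python (purely an illustration, not a biological model -- every
number is invented): a lineage whose per-bit mutation rate rises
with "stress" (mismatch between phenotype and environment) shows
long stasis while it is well adapted, then a burst of change after
the environment shifts:

    import random

    BITS, POP = 32, 100
    TARGET_A = [1] * BITS                             # before the shift
    TARGET_B = [0] * (BITS // 2) + [1] * (BITS // 2)  # after the shift

    def fitness(genome, target):
        return sum(g == t for g, t in zip(genome, target))

    def mutate(genome, target):
        # "stress" is the mismatch with the environment; the per-bit
        # mutation rate scales up with it
        stress = 1.0 - fitness(genome, target) / BITS
        rate = 0.001 + 0.1 * stress
        return [1 - g if random.random() < rate else g for g in genome]

    population = [list(TARGET_A) for _ in range(POP)]  # well adapted
    for generation in range(300):
        target = TARGET_A if generation < 150 else TARGET_B
        # selection: keep the better half, refill with mutated copies
        population.sort(key=lambda g: fitness(g, target), reverse=True)
        parents = population[:POP // 2]
        population = [mutate(random.choice(parents), target)
                      for _ in range(POP)]
        if generation % 50 == 0:
            best = max(fitness(g, target) for g in population)
            print(f"generation {generation:3d}: "
                  f"best fitness {best}/{BITS}")
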
At one of the last class meetings Bob Edgar had just returned
from a conference at which it had been revealed that DNA had the
power to edit itself prior to being transcribed into RNA, forming
complex structures that included loops. (I think this may
have been an early hint of the research later published as
Paulson, J. R. and Laemmli, U. K. (1977), "The structure of
histone-depleted metaphase chromosomes," Cell 12, 817-828.)
My final paper for the class wrestled with the question of the
minimum computational requirements for intelligence without much
success. (Conclusion: less than the universe but more than a
single binary switch was required.) As a parting gift, Bob Edgar
gave me a copy of the classic "On Growth and Form" by D'Arcy
Wentworth Thompson (1917).
(I remember Bateson complaining that Thompson seemed to have scoured
the natural world for examples of growth and form that could be
explained with pure physics, without any cybernetic thinking, but
-- hey -- it was 1917, and he was trying to show skeptics that
biology could be a quantitative science.) He also gave me a
well-worn copy of "Molecular Biology of the Gene (2nd Edition)"
(1970) by James D. Watson, on condition that I read both books.
I tried, Bob, I really did, but Thompson is such a polymath
that you need to know French, Latin and the Greek myths just
to read his introduction, and Watson's work is so full of
biochemical detail that I never made it very far.
FYI: Watson's "Molecular Biology of the Gene (4th Edition)" (2001)
is still in print.
Well, it was a long, strange trip from the seventies to the nineties.
I mostly concentrated on earning a living, and so got sucked into the
microcomputer revolution along with a lot of other folks. Every
now and then I tried studying some biology. One book that shall
remain nameless, which attempted to integrate cybernetics and
cell biology, tackled the question of how much information a
cell "contained." Well, I knew from studying Shannon's original
work on communication theory that information is defined only as
a measure of data communicated, not data "in" something.
See "Mathematical Theory of Communication" (1950),
by Claude E. Shannon, Warren Weaver.
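For readers who want the flavor of Shannon's definition, here is a
minimal sketch in Python: the measure applies to a source of
messages (a probability distribution over symbols), in bits per
symbol, not to any single object. The probabilities below are just
examples:

    from math import log2

    def entropy(probabilities):
        """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
        return -sum(p * log2(p) for p in probabilities if p > 0)

    # A fair coin communicates one bit per toss...
    print(entropy([0.5, 0.5]))                # 1.0
    # ...a heavily biased coin communicates much less...
    print(entropy([0.99, 0.01]))              # about 0.08
    # ...and a uniform choice among four symbols communicates two bits.
    print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
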
But in the late 1970s and the 1980s, cybernetics was dying on
the vine. Biologists and social scientists seemed the only people
paying it lip service -- goodness knows the information science
community completely ignored it. But what came along in the
mid-1980s to keep the young-uns excited were fractals and chaos.
(Also, in science fiction, "cyberpunk" appeared, with its dark
biotech-influenced future, but that's another story.)
I find it interesting that one of the big breakthroughs in
chaos, May's discovery of chaotic dynamics in the logistic
equation, came in the context of environmental system modeling.
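For the curious, here is a minimal sketch in Python of the logistic
map May studied, x_next = r * x * (1 - x), with parameter values
chosen just for illustration; at low growth rates the population
settles down, and near r = 4 it never repeats:

    def logistic_orbit(r, x=0.2, skip=500, keep=8):
        for _ in range(skip):              # let transients die out
            x = r * x * (1.0 - x)
        orbit = []
        for _ in range(keep):
            x = r * x * (1.0 - x)
            orbit.append(round(x, 4))
        return orbit

    for r in (2.8, 3.2, 3.5, 3.9):
        print(f"r = {r}: {logistic_orbit(r)}")
    # r = 2.8 gives a single fixed value, 3.2 a 2-cycle, 3.5 a 4-cycle,
    # and 3.9 a sequence that never settles into a repeating pattern.
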
Today, you can see chaos in the popular software toy "sharks
and fishes," available in several Java versions:
My own interest in chaos, as the latest keeper of the "systems theory"
flame, led me to the Society for Industrial and Applied Mathematics
(SIAM) Conference on Dynamical Systems in Orlando, FL, May 7-11, 1990.
I was there along with my friend Dave Warner, who was an MD/PhD
candidate at Loma Linda University at the time, and his advisor,
Dr. Douglas Will. Our conversations concerned mostly chaos in the
nervous system. It was also on that trip that I first experienced
"The Wonders of Life" pavilion at Walt Disney World's EPCOT theme
park, and its motion-base ride "Body Wars."
It is arguably based loosely on the movie "Fantastic Voyage" (1966),
in that you are "shrunk" to a tiny size and then voyage through a
human body.
During this trip I experienced a rare apotheosis, in which I
realized that the 1990s would be a decade of breakthroughs
in human health and "life extension" technology, and this was
going to impact my mental development and career a great deal.
I still remember exactly where I was at that moment, while
walking for exercise, at the corner of East Buena Vista Drive and
Hotel Plaza Blvd., Lake Buena Vista, FL:
(Continuing interest brought me to another SIAM conference on
dynamical systems at Snowbird, UT, in October of 1992.
They continue to this day; the next one is the "SIAM Conference
on Applications of Dynamical Systems," May 27-31, 2003,
Snowbird Ski and Summer Resort, Snowbird, UT,
sponsored by SIAM Activity Group on Dynamical Systems:
There is also a new "SIAM Journal on Applied Dynamical Systems"
which just published its first issue.)
One of the things I did with my new "aha" was to begin a quest.
I was working for a family of companies involved in scientific
computing and simulation, going by the names Stellar, Ardent,
Stardent, Kubota and Advanced Visual Systems. (Through acquisitions
and spin-offs I kept getting new business cards but keeping the same
customers.) Initially most of our customers had been aerospace
companies, military weapons labs and Department of Energy (DOE) labs
which were famous for building the first nuclear weapons. Because
this seemed like such a waste of technology to me, and
because the Berlin Wall had just fallen and the Soviet Union was
about to as well, my quest was to "beat swords into plowshares"
and convert the use of this technology (high-speed computing
and 3D visualization) to medical research.
To this end, one of the things I did -- at the urging of Dave Warner
-- was to begin attending and presenting at a new conference called
"Medicine Meets Virtual Reality -- Discovering Applications
for 3-D Multi-Media Technology in the Health Sciences."
For the first conference I presented a paper, "The Somascope:
A Tool for Guided Self-Healing Using Medical Imaging" (1992).
For the second conference I presented "The Impact of Visual
Programming in Medical Research" (1994).
By this time I was at AVS, Inc., and I was both trying to push the
medical world into using AVS's technology and trying to get AVS to
recognize this new market. Ultimately, I succeeded. The largest
sale in the company's history, which I was instrumental in closing,
went to ADAC Laboratories, a manufacturer of Gamma Cameras -- medical
scanners that photographed gamma rays inside patients who'd swallowed
isotope "cocktails," in order to image metabolism in process.
(ADAC is now owned by iotech; the ADAC web site redirects there.)
I remember around 1994 or 5 sitting in a session at Med VR
while a video was being shown of endoscopic surgery. In this
breakthrough technique, instead of opening up a patient's abdomen
like a Dutch door, a small incision is made and a cable much like
a plumber's "snake" with a camera and scalpel on the end is inserted,
allowing remote control surgery and, of course, reduced recovery
time. It was grossing me out. I remembered again how, after the
frog dissecting in high school, I knew I didn't want to be a
doctor. But my wife, sitting next to me, who had been trained
as an Animal Health Technician (AHT) and had assisted in many
veterinary surgeries, said, "Look at that pink, healthy tissue!"
It was beautiful to her.
But I found myself attracted more towards the use of computers
to simulate living things. My friend Phil Mercurio returned
from a trip to New Mexico in 1992 talking about a conference
he had attended there: "Artificial Life III." The proceedings
are still available; see "Proceedings of the Workshop on Artificial
Life Held June, 1992 in Santa Fe, New Mexico" (Santa Fe Institute)
Christopher G. Langton, editor.
One of the things he liked the most was a prototype robot construction
kit that later evolved into the Lego Mindstorms programmable Lego
robots for kids. See "The Mini Board Technical Reference" by Fred
Martin.
(There is now an International Society for Artificial Life, with a
web site helping to promote and carry on this important work.)
I still remember jotting some notes in the mid-1970s for a genetic
simulation program which I never implemented (shame on me).
My ideas did make it into a short story I wrote for "The Journal of
It was called "Sleazy Weasels," and you can read it on-line:
The best guide for the lay reader on artificial life is "Artificial
Life: A Report from the Frontier Where Computers Meet Biology"
(1993) by Steven Levy.
There are two things I found especially interesting about this book:
- In simulations of evolution -- which arguably were actually
EXAMPLES of evolution -- it had turned out that parasites were
vital to genetic health. First, parasites had appeared where
they weren't expected, and then, it turned out that artificial
organisms with parasites were more adaptive, since they were
always playing "cat and mouse" games with the parasites. When
their external environment changed, they could adapt more quickly
since they were used to change and hadn't "overadapted" to their
old environment. This shed some light on the "punctuated
equilibrium" issue. Even when the phenotype wasn't observed to
change, an organism with parasites had a constantly changing
genotype. (A toy sketch of this dynamic follows this list.)
- The people making most of these breakthroughs, doing the
research and attending the conferences, were mostly computer
programmers and other "information science" types. They were
having a hard time getting the biologists doing "wet lab" work
interested, and in convincing them that the A-Life research
was relevant to biology.
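Here is a toy sketch in Python of that host/parasite "cat and mouse"
dynamic -- nothing like Tierra or the actual experiments Levy
describes, just an illustration with invented numbers. Hosts and
parasites are bit strings; parasites are selected to match hosts,
hosts are selected to escape parasites, and so the gene pool keeps
moving even when nothing external changes:

    import random

    LENGTH = 16       # genome length in bits
    POP = 60          # population size on each side
    MUTATION = 0.02   # per-bit mutation probability

    def random_genome():
        return [random.randint(0, 1) for _ in range(LENGTH)]

    def mutate(genome):
        return [1 - bit if random.random() < MUTATION else bit
                for bit in genome]

    def match(parasite, host):
        # how well a parasite "fits" a host
        return sum(p == h for p, h in zip(parasite, host))

    def next_generation(population, score):
        ranked = sorted(population, key=score, reverse=True)
        survivors = ranked[:POP // 2]          # truncation selection
        return [mutate(random.choice(survivors)) for _ in range(POP)]

    hosts = [random_genome() for _ in range(POP)]
    parasites = [random_genome() for _ in range(POP)]

    for generation in range(200):
        # parasites are rewarded for matching hosts, hosts for escaping
        parasites = next_generation(
            parasites, lambda p: match(p, random.choice(hosts)))
        hosts = next_generation(
            hosts, lambda h: LENGTH - match(random.choice(parasites), h))
        if generation % 50 == 0:
            avg = sum(match(p, h)
                      for p, h in zip(parasites, hosts)) / POP
            print(f"generation {generation:3d}: "
                  f"average parasite match {avg:.1f}")
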
About the same time Windows 95 was released I found myself drifting
out of the scientific computing and visualization world and, partly
because I became a father, concentrating more on earning a living,
and so got sucked into the internet revolution along with a lot of
other folks. About six years later the dot com crash left me looking
for the Next Big Thing in the spring of 2001. One person I went
to see was Art Olson, at the Scripps Research Institute, who told
me "biological computing" was worth paying attention to.
That summer, at the urging of Dave Warner -- who always seems to be
the one lately who pushes me towards life sciences -- I went to the
Bio2001 conference at the San Diego convention center.
After seeing a bewildering array of exhibits by bio-tech companies
like Avecia Biotechnology, BioImmunPharma GmbH, Capsulution
Nanoscience AG, DYAX Corp., Elegene AG, FermPro Manufacturing LP,
Genaissance Pharmaceuticals, Hybrigenics, Infigen, Jostra
Medizintechnik AG, KoBioTech Co., Ltd., Lonza Biotechnology,
Medarex, Inc. and so on, I decided I needed a break. I left the
convention center and walked over to Seaport Village, an outdoor
fishing-village-themed shopping area, and stopped in at one of
my favorite combination bookstore/espresso bars. I picked up a
book off the "new arrivals" shelf: "Genome: The Autobiography of
a Species in 23 Chapters" by Matt Ridley (2000), which told the
story of the Human Genome Project and its implications.
In the foreword I read about the race between government-sponsored
researchers in the United States and England, and the private company
founded by scientist Craig Venter, to complete a map of the human
genome by June of 2001.
"On June 26, 200, President Bill Clinton in the White House and
Prime Minister Tony Blair in Downing Street simultaneously
announced that the rough draft was complete."
As I stood reading those words the date was June 26, 2001, the first
birthday of the completed rough draft.
At that moment I experienced another rare apotheosis, in which I
realized that the 2000s would be a decade of breakthroughs
in genomics and related bio-technologies, and this was
going to impact my mental development and career a great deal.
I still remember exactly where I was at that moment, at the
Upstart and Crow Bookstore and Coffeehouse:
The following month I attended the SIGGRAPH 2001 Conference at
the Los Angeles Convention Center, devoted to 3D graphics and
interactive techniques.
(This isn't germane to the main thrust of this essay, but it seems
as good a place as any to insert this plug: the SIGGRAPH conference
is coming to San Diego in July of 2003, and the San Diego chapter of
SIGGRAPH is producing an event called SIGKIDS to promote the use of
3D graphics and interactive technologies for children. See:
for more information.)
As I sat on a couch in a lounge I was joined by an associate
who said, "I'm still trying to figure out what the Next Big Thing
is this year." It was a year in which the computer graphics industry
was in decline -- Disney had consolidated its Feature Film Effects
group with Dream Quest, a company it had bought, to form The Secret
Lab, and then abruptly laid everyone off as a cost-saving
measure; other production groups had similar woes; most of the
3D vendors were suffering as well -- and a lot of people were
looking for the
Next Big Thing.
"I've got that right here," I replied, and pulled out a copy of
the newspaper "USA Today" which I'd found on a table. On the
front page of the "Money" section for August 13, 2001 was this story:
Pioneers shift gears -- from tech to biotech
By Del Jones, USA TODAY
Visionaries who ushered in the age of software, supercomputers
and the Internet are turning to a new passion: biotechnology.
* Danny Hillis -- pioneer of parallel computers, now the basis
for most supercomputers -- has moved into genetics and
neurobiology as chairman and chief technology officer of
Applied Minds.
* Jim Clark, founder of Silicon Graphics and Netscape, is a big
investor in and director of DNA Sciences, a gene-research
company in the heart of the Silicon Valley in Fremont, Calif.
* Internet pioneer Frank Moss sits on the advisory board of the
Harvard Medical School and is director of a biotech start-up he
would not identify.
"Over the next years, diseases will be cured. That's a hell of a
lot better than finding books, or other uses of the Internet we're
even less proud of," Moss says. "This might be bigger or smaller
than the Internet, but a lot more important."
President Bush gave his limited go-ahead to embryonic stem-cell
research Thursday. But many wealthy tech gurus left the starting
gate a long time ago. Bill Gates, Paul Allen and Larry Ellison,
three of the world's four richest men, are among the largest
individual investors in biotech. Ellison, CEO of Oracle, has a
particular interest in ventures trying to slow the aging process,
a potential gold mine.
Money follows visionaries. During the second 3 months of 2001,
14% of venture capital dollars went to medical/health/life science
companies, vs. 4% during the same period of 2000, says Venture
Economics. The share of capital going to Internet start-ups
declined from 48% to 28%.
"A tidal wave is about to come," Moss says. "Individuals are
looking for a new gig."
Nathan Myhrvold and Edward Jung, former chief technology officer
and chief software architect of Microsoft, are partners in a
biotech think tank. Myhrvold speaks at biotech conferences,
calling it the next "exponential industry" that will see fast
growth and falling prices, like computer chips.
The two have started a non-profit aimed at helping high school
students form biotech clubs, just as many of today's technology
billionaires have their entrepreneurial roots in the computer
clubs of the 1970s.
"Hey, we were part of a revolution that made a big difference,"
Jung says. "Here's the next revolution that's going to make a
Jung says much of biotech still uses "primitive computer tools,"
and tech experts are being pulled into the field as much as
they're pushing into it.
Tech veterans have other lessons to teach an industry embarking
on rapid growth. But there are cultural differences, Moss says.
Traditional drug companies develop products over 10 years. Six
months is a lifetime in the Internet.
"There's a lot of getting used to one another," Moss says.
Just one month later America went from a state of "deep peace"
to trying to defend itself against terror attacks, and for the first
time in its history was the victim of biological warfare. The old
heavy metal group "Anthrax" announced on their web site that they
were even considering changing their name to "basket full of puppies"
because they felt so bad about their ironic name being associated
with real suffering now that "the end of irony" had been heralded by
Time Magazine. In short order I found myself under contract to Dave
Warner's company, Mindtel LLC, to do research on ways to visualize
potential bio-terrorism threats in San Diego.
After many meetings and conferences that exposed me to bio-terrorism
preparedness planning, I came to the conclusion that it's a good thing
WE (Americans) are not the terrorists, because we are so resourceful.
There are frequent scenarios simulated with blue teams (good guys) and
red teams (bad guys), and I learned that in any city in America you
could kill millions and panic many millions more by simply -- ...but
maybe I shouldn't be spreading this around.
During this hysteria I re-read the novel "Jurassic Park" (1990)
by Michael Crichton, which is more brutal and cautionary than the
Spielberg movie it was made into.
Mathematician and chaos expert Ian Malcolm (played by Jeff Goldblum
in the movie) dies in the book, and in his final soliloquy he says:
...the great intellectual justification of science has vanished.
Ever since Newton and Descartes, science has explicitly offered us
the vision of total control. Science has claimed the power to
eventually control everything, through its understanding of
natural laws. But in the twentieth century that claim has been
shattered beyond repair. First, Heisenberg's uncertainty principle
set limits on what we could know about the subatomic world...
Then Godel's theorem set similar limits to mathematics, the
language of science... And now chaos theory proves that
unpredictability is built into our daily lives...
We are witnessing the end of the scientific era. Science, like
other outmoded systems, is destroying itself. As it gains in
power, it proves itself incapable of handling the power. Because
things are going very fast now. Fifty years ago, everyone was
gaga over the atomic bomb. That was power. No one could imagine
anything more. Yet, a bare decade after the bomb, we began to
have genetic power. And genetic power is far more potent than
atomic power. And it will be in everyone's hands. It will be
in kits for backyard gardeners. Experiments for schoolchildren.
Cheap labs for terrorists and dictators. And that will force
everyone to ask the same question -- What should I do with my
power? -- which is the very question science says it cannot answer.
Powerful prescience from a medical student turned best-selling author.
And while I'm on the subject, the best novel I've read about bio-
terrorism is "The Cobweb" by Stephen Bury (really Neal Stephenson
and his uncle), which takes place during the Gulf War in 1990-91.
Okay, fast forward one year. Now I'm writing this e-Zine. I'm finally
getting around to finishing reading "Out of Control: The New Biology of
Machines, Social Systems and the Economic World" (1994) by Kevin Kelly,
which I've been reading off and on for nine years.
This is a layman's book, by a member of the Whole Earth Catalog crowd,
that looks at a wide variety of things that he collects into "the rise
of neo-biological civilization." And I find myself wondering, what did
Art Olson mean by "biological computing" in the spring of 2001? I
thought I knew at the time, but Kelly's book has reminded me that,
as Shakespeare says in "Hamlet,"
There are more things in heaven and earth than are dreamt
of in your philosophy.
So I called him up and made an appointment. On December 5, 2002,
I sat down with Art to probe this question further.
TO BE CONTINUED...
Privacy Promise: Your email address will never be sold or given to
others. You will receive only the e-Zine C3M unless you opt-in to
receive occasional commercial offers directly from me, Alan Scrivener,
by sending email to firstname.lastname@example.org with the subject line "opt in" -- you
can always opt out again with the subject line "opt out" -- by default
you are opted out. To cancel the e-Zine entirely send the subject
line "unsubscribe" to me. I receive a commission on everything you
purchase during your session with Amazon.com after following one of my
links, which helps to support my research.
Copyright 2003 by Alan B. Scrivener