Cybernetics in the 3rd Millennium (C3M) -- Volume 2 Number 4, Apr. 2003
Alan B. Scrivener --- www.well.com/~abs --- mailto:firstname.lastname@example.org
What Ever Happened to Cybernetics?
"Is 'dis a system?"
-- R. Crumb, 1970
Since she could talk my daughter has always begged me to tell her
stories. In the last few years her taste has turned to science
fiction, and I find I'm running out of stories. I end up browsing
used paperback stores looking for sci-fi classics I can skim to pick
up plot lines. Recently I paid $1.10 for a faded, torn copy of
"I Robot" (1950) by Isaac Asimov, a collection of his first "robot
series" short stories.
( www.amazon.com/exec/obidos/ASIN/0553294385/hip-20 )
In the story called "Robbie," first published in 1948, Asimov
gives this biographical information about a woman robotics engineer
and magnate in the year 2057:
She obtained her bachelor's degree at Columbia in 2003 and
began graduate work in cybernetics.
Hmmm. Here it is, 55 years after Dr. Asimov wrote those words, we
really are in the year 2003 and yet I don't think you can currently
do graduate work in cybernetics anywhere on Earth, let alone at
Columbia or any other Ivy League school in the United States.
What ever happened to cybernetics? It started out so well.
In 1942 the prestigious Macy Foundation began their
"Conferences on Feedback" later renamed "Conferences on
Cybernetics" and sent the participants out to spread the word.
Wiener had just coined the word and written the book "Cybernetics"
in 1948, so science-fiction writer Asimov was quick to pick up on it.
( www.amazon.com/exec/obidos/ASIN/026273009X/hip-20 )
Some of those participants included Gregory Bateson, Margaret Mead,
John von Neumann, Warren McCulloch, and Heinz von Foerster.
But it didn't catch on. No cybernetics chair was endowed at
Cambridge. UCLA didn't hastily create a Cybernetics Department
and hire conference attendees. At the time I was first
becoming aware of the field in 1972 it was already an anachronism.
Visitors to "Independence Hall" in Philadelphia, where many important
events in early U.S. history occurred, are told on the tour how the
building was called "the state house" and later "the old state house"
after a new one was built, and by the late 19th century was in
disrepair and some thought it should be demolished as a hazard.
But when Washington's old ally Lafayette visited from France, and
asked to see the "Hall of Independence" where the Declaration of
Independence was signed, his hosts realized they'd let a national
treasure almost fall into ruin, and it was restored and made a
shrine to democracy.
Perhaps a similar "shaming" of America into recognizing its
treasures in process right now. French author Jean Pierre Dupuy
has looked at the philosophical and linguistic implications of the
birth of cybernetics in "Aux origines des sciences cognitives" (1994)
( www.amazon.com/exec/obidos/ASIN/2707122009/hip-20 )
(apparently never translated into English), and more recently
"The Mechanization of the Mind" (2000) translated by M. B. DeBevoise.
( www.amazon.com/exec/obidos/ASIN/0691025746/hip-20 )
Dupuy analyzes the Macy conferences and surrounding events to shed
light on the evolution of concepts in cognitive science since then.
I mentioned last month that I finally finished reading
"Out of Control" (1994) by Kevin Kelly.
( www.amazon.com/exec/obidos/ASIN/0201483408/hip-20 )
You can find the full text of the book on-line.
Near the end, in chapter 23, "WHOLES, HOLES, AND SPACES," he asks
the same question I am asking in this column:
what ever happened to cybernetics?
The cybernetic group did not find answers as much as they prepared
an agenda for questions. Decades later scientists studying chaos,
complexity, artificial life, subsumption architecture, artificial
evolution, simulations, ecosystems, and bionic machines would find
a framework for their questions in cybernetics. A short-hand
synopsis of "Out of Control" would be to say it is an update on
the current state of cybernetic research.
But therein lies a curious puzzle. If this book is really about
cybernetics, why is the word "cybernetics" so absent from it?
Where are the earlier practitioners of such cutting-edge science
now? Why are the old gurus and their fine ideas not at the center
of this natural extension of their work? What ever happened to
cybernetics?
It was a mystery that perplexed me when I first started hanging
out with the young generation of systems pioneers. The
better-read were certainly aware of the early cybernetic
work, but there was almost no one from a cybernetic background
working with them. It was as if there was an entire lost
generation, a hole in the transmission of knowledge.
There are three theories about why the cybernetic movement died:
Cybernetics was starved to death by the siphoning away of its
funding to the hot-shot-but-stillborn field of artificial
intelligence. It was the failure of AI to produce usefulness
that did cybernetics in. AI was just one facet of cybernetics,
but while it got most of the government and university money,
the rest of cybernetics' vast agenda withered. The grad
students fled to AI, so the other fields dried up. Then,
AI itself stalled.
Cybernetics was a victim of batch-mode computing. For all its
great ideas, cybernetics was mostly talk. The kind of
experiments required to test its notions demanded many cycles
of a computer, at its full power, in a completely exploratory
mode. These were all the wrong things to ask of the priesthood
guarding the mainframe. Therefore, very little cybernetic theory
ever made it to experiment. When cheap personal computers hit
the world, universities were notoriously slow to adopt them. So
while high school kids had Apple IIs at home, the universities
were still using punch cards. Chris Langton started his first
a-life experiments on an Apple II. Doyne Farmer and friends
discovered chaos theory by making their own computer. Real-time
command of a complete universal computer was what traditional
cybernetics needed but never got.
Cybernetics was strangled by "putting the observer inside the
box." In 1960, Heinz von Foerster made the brilliant suggestion
that a refreshing view of social systems could be had by
including the observer of the system as part of a larger
metasystem. He framed his observation as Second Order
Cybernetics, or the system of observing systems. The insight
was useful in such fields as family therapy where the therapist
had to include him- or herself in a theory of the family they
were treating. But "putting the observer into the system" fell
into an infinite regress when therapists video-taped patients
and then sociologists taped therapists watching the tape of
the patients and then taped themselves watching the therapists
.... By the 1980s the rolls of the American Society of Cybernetics
were filled with therapists, sociologists, and political
scientists primarily interested in the effects of observing systems.
All three reasons conspired so that by the late 1970s cybernetics
had died of dry rot. Most of the work in cybernetics was at the
level of the book you are now reading: armchair attempts to weave
a coherent big picture together. Real researchers were bumping
their heads in frustration in AI labs, or working in obscure
institutes in Russia, where cybernetics did continue as a branch
of mathematics. I don't believe a single formal textbook on
cybernetics was ever written in English.
At this point I feel the necessity of jumping in and reminding folks
that W. Ross Ashby did write a textbook on cybernetics, called
"An Introduction to Cybernetics" (1964).
( www.amazon.com/exec/obidos/ASIN/0416683002/hip-20 )
I saw it used with some success as a textbook for a course in
"Cybernetics" given by the Information Science department at
the University of California at Santa Cruz in 1975.
And speaking of Ashby, I have said elsewhere:
I wish Ashby had written more. [Design for a Brain] and
[An Introduction To Cybernetics] are all we got from
this great, precise thinker.
Well, another collection appeared that I didn't know about:
"Mechanisms of Intelligence: Ashby's Writings on Cybernetics" (1981)
( www.amazon.com/exec/obidos/ASIN/0914105043/hip-20 )
and it's now out of print, but findable on Amazon anyway.
And I recently received an email from a member of Ashby's heirs,
telling me a new collection of writings from his notebooks may be
appearing soon, and if so I will be notified. Of course I will
pass the news on to you, the subscribers to this e-Zine.
But anyway. On the whole Kevin Kelly is right, cybernetics
by that name seems to have withered away. We seem to have a
number of fields that were supposed to converge into cybernetics,
each instead pursuing its own evolutionary pathway. Here is
a table of names for related concepts divided roughly into the
"Modern" (1919-1969) and "Postmodern" (1970-present) eras:
Modern (1919-1969)         Postmodern (1970-present)
------------------         -------------------------
control engineering        dynamical systems
electronics                information technology
signal filtering           digital signal processing
ergodic behavior           basins of attraction
graph theory               network topology
human engineering          ergonomics
psychology                 cognitive science
the social matrix          memetics
general system theory      complexity
limits to computability    intractable problems
miniaturization            large scale integration
chemical engineering       nanotechnology
operations research        management science
and so on. Every field reinvents itself from time to time, as
Public Relations (PR) requirements demand, and Grand Integrations
of all knowledge are discouraged. As Tom Wolfe points out in the
essay "Digibabble, Fairy Dust, and the Human Anthill," reprinted in
"Hooking Up" (2000)
( www.amazon.com/exec/obidos/ASIN/0312420234/hip-20 )
the Catholic Church discouraged Teilhard de Chardin from publishing
his theories of the convergence of all human knowledge, and academia
discouraged Marshall McLuhan in his notions of "the global village,"
and biologists discouraged Edward O. Wilson in his attempt to glean,
from his studies of insects and the genetic basis of their behavior, a
unification of all knowledge in the humanities and social sciences
into "specialized branches of biology."
Another thing that jumped out at me in the final pages of "Out of
Control" was his revisiting of the old controversy about the book
"The Limits to growth : a report for the Club of Rome's Project on
the Predicament of Mankind" (1970) by Donella H. Meadows, et. al.
( www.amazon.com/exec/obidos/ASIN/0451136950/hip-20 )
This was a report on an ambitious attempt to model the world's
resources, industries, pollution, economic systems, food supplies,
population, etc. in one huge interconnected simulation. Most of
the original work was done by Jay W. Forrester at MIT, who also
wrote the seminal paper on social system modeling, "Understanding
the counterintuitive behavior of social systems," (1971) in:
"Collected Papers of J. W. Forrester" (1975)
( www.amazon.com/exec/obidos/ASIN/1563271923/hip-20 )
(It's worth noting that he is still the Germeshausen Professor
Emeritus of Management at the Sloan School of Management, MIT.)
( sysdyn.clexchange.org/people/jay-forrester.html )
Again quoting Kelly's "Out of Control" we get the short version
of the "Limits to Growth" story.
In the computer labs of MIT, an unpretentious engineer cobbled
together the first global spreadsheet. Jay Forrester had been
dabbling in feedback loops since 1939, perfecting machinery-
steering servomechanisms. Together with Norbert Wiener, his
colleague at MIT, Forrester followed the logical path of
servomechanisms right into the birth of computers. As he
helped invent digital computers, Forrester applied the first
computing machines to an area outside of typical engineering
concerns. He created computer models to assist the management
of industrial firms and manufacturing processes. The usefulness
of these company models inspired Forrester to tackle a simulation
of a city, which he modeled with the help of a former mayor
of Boston. He intuitively, and quite correctly, felt that
cascading feedback loops -- impossible to track with paper and
pencil, but child's play for a computer -- were the only way to
approach the web of influences between wealth, population,
and resources. Why couldn't the whole world be modeled?
Sitting on an airplane on the way home from a conference on
"The Predicament of Mankind" held in Switzerland in 1970,
Forrester began to sketch out the first equations that would
form a model he called "World Dynamics."
It was rough. A thumbnail sketch. Forrester's crude model
mirrored the obvious loops and forces he intuitively felt
governed large economies. For data, he grabbed whatever was
handy as a quick estimate. The Club of Rome, the group that
had sponsored the conference, came to MIT to evaluate the
prototype Forrester had tinkered up. They were encouraged by
what they saw. They secured funding from the Volkswagen
Foundation to hire Forrester's associate, Dennis Meadows, to
develop the model to the next stage. For the rest of 1970,
Forrester and Meadows improved the World Dynamics model,
designing more sophisticated process loops and scouring the
world for current data.
Dennis Meadows, together with his wife Dana and two other
coauthors, published the souped-up model, now filled with
real data, as the "Limits to Growth." The simulation was
wildly successful as the first global spreadsheet. For the
first time, the planetary system of life, earthly resources,
and human culture were abstracted, embodied into a simulation,
and set free to roam into the future. The Limits to Growth
also succeeded as a global air raid siren, alerting the world
to the conclusions of the authors: that almost every extension
of humankind's current path led to civilization's collapse.
The result of the Limits to Growth model ignited thousands of
editorials, policy debates, and newspaper articles around the
world for many years following its release. "A Computer Looks
Ahead and Shudders" screamed one headline. The gist of the
model's discovery was this: "If the present growth trends in
world population, industrialization, pollution, food production,
and resource depletion continue unchanged, the limits to growth
on this planet will be reached sometime within the next 100
years." The modelers ran the simulation hundreds of times in
hundreds of slightly different scenarios. But no matter how
they made tradeoffs, almost all the simulations predicted
population and living standards either withering away or
bubbling up quickly to burst shortly thereafter.
Primarily because the policy implications were stark, clear,
and unwelcome, the model was highly controversial and heavily
scrutinized. But it forever raised the discussion of resources
and human activity to the necessary planetary scale.
The Limits to Growth model was less successful in spawning
better predictive models, which the authors had hoped to
spark with their pioneer efforts. Instead, in the intervening
20 years, world models came to be mistrusted, in large part
because of the controversy of Limits to Growth. Ironically,
the only world model visible in the public eye now (two
decades later) is the Limits to Growth. The authors have
reissued it on its 20th anniversary, with only slight changes.
Forrester's own version of these events is given in a talk
I remember in the mid-1980s, when computers were finally so cheap
that even I owned one, a buddy and I decided we wanted to
re-implement the world model from "Limits to Growth" using a
spreadsheet program. We thought it should work. We got a copy of
"Principles of Systems" (1968) by Jay W. Forrester
( www.amazon.com/exec/obidos/ASIN/0915299879/hip-20 )
as well as a copy of "DYNAMO User's Manual, Fifth Edition" (1976)
by Pugh and Pugh (the sixth edition is still in print)
( www.amazon.com/exec/obidos/ASIN/0262660296/hip-20 )
so we could understand the notation the model was written in.
But we never got it together. Today I think the task would
be easier. The model is available in software form. I believe it
is included in the 1993 follow-up book, "Beyond the Limits:
Confronting Global Collapse, Envisioning a Sustainable Future"
by Donella H. Meadows et al.
( www.amazon.com/exec/obidos/ASIN/0930031628/hip-20 )
I'm told that versions are available for the modeling software
Stella from High Performance Systems,
( www.hps-inc.com/ )
as well as for the software Vensim, which is available in a free
edition.
( www.vensim.com )
(An overview of such simulation software packages and their use in
world modeling is also available on-line.)
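For anyone curious what such a model looks like under the hood, here is a sketch in Python of the stock-and-flow pattern DYNAMO uses: "levels" (stocks) updated each time step by "rates" (flows), wired into feedback loops. The equations and coefficients below are my own hypothetical illustrations, not Forrester's or the Meadows team's actual model:

```python
# A toy "World Dynamics"-style model in the DYNAMO stock-and-flow
# pattern.  All coefficients are hypothetical illustrations, NOT the
# actual "Limits to Growth" equations.

def run_model(steps=200, dt=1.0):
    population = 1.0      # level: population (arbitrary units)
    resources = 100.0     # level: nonrenewable resources
    history = []
    for _ in range(steps):
        # Rate equations: births slow as resources deplete,
        # deaths climb as resources grow scarce.
        births = 0.05 * population * (resources / 100.0)
        deaths = 0.002 * population * (100.0 / max(resources, 1e-6))
        usage = 0.1 * population
        # Level equations, DYNAMO-style: L = L + dt * (inflow - outflow)
        population = max(population + dt * (births - deaths), 0.0)
        resources = max(resources - dt * usage, 0.0)
        history.append((population, resources))
    return history

hist = run_model()
peak = max(pop for pop, res in hist)
print(f"peak population {peak:.1f}; final population {hist[-1][0]:.1f}")
```

Even this crude loop reproduces the qualitative "Limits to Growth" result: growth, overshoot, and collapse once the resource level runs out.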
Sadly, Donella "Dana" Meadows passed away on February 20, 2001.
Energy policy expert Amory Lovins gave her a tribute at a memorial
service that put her life in perspective; a transcript is on-line.
( www.rmi.org/sitepages/art1127.php )
So, over 30 years ago an earnest and open attempt at a world
model predicted some gloom and doom, but the world didn't end,
the modelers were vilified, nobody else wanted to be vilified
too (plus it wasn't anybody's job to solve this problem), and so
nobody continued the research and attempted to improve and refine
it. Which would be fine except we're talking about the END OF THE
WORLD here; wouldn't it be a good idea to figure this out one way
or another? Eventually the original group did a little more
advancement of the model, but now their front person is dead.
I remember that Bucky Fuller, in "Critical Path" as well as
some other places, talked about the phenomenon of Thomas Malthus
in the 19th century concluding that most of humanity was destined
to starve because food supply is always outstripped by population.
( www.amazon.com/exec/obidos/ASIN/0312174918/hip-20 )
Bucky explained that originally this result was classified top
secret. Even before Marx the "powers that be" were afraid that
the masses would revolt if they had this knowledge. (Of course
Bucky claimed Malthus was wrong, and that since 1969 we have had
the technology to take care of all earthlings.) Perhaps this
research continues with secret funding and results shared only
with a select elect. (Cue "X Files" theme music.)
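Malthus's argument was at bottom arithmetic: population compounds geometrically while food supply grows only arithmetically, so the first must eventually outstrip the second. The premise is easy to check with a toy calculation (the growth figures here are illustrative, not Malthus's numbers):

```python
# Malthus's premise in miniature: geometric population growth vs.
# arithmetic (linear) food growth.  Rates are illustrative only.
population, food = 100.0, 200.0
for generation in range(1, 21):
    population *= 1.3    # geometric: +30% per generation
    food += 60.0         # arithmetic: fixed increment per generation
    if population > food:
        print(f"population overtakes food supply at generation {generation}")
        break
```

Whether Malthus's premise about food supply was right is, of course, exactly what Bucky disputed; the arithmetic only tells you the consequence of the premise.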
But there are two issues that need to be pulled apart here. The
first question is "Should we panic? Is disaster at our doorstep?"
and the second question is "Should we build and refine computer
models to help us predict the world's resource and pollution
problems?" Bucky would say "no" and "yes." Me too. We can
always panic later, but we should be modeling more now. If
every school child played with world models, what an informed
electorate we'd have. (I thought "Sim City" was a step in the
right direction.)
( simcity.ea.com/ )
Some time in the last decade -- I've forgotten exactly when --
scientists using computer models predicted a dire global warming
catastrophe if we didn't reduce greenhouse gas emissions pronto.
Then they revised the model and said the situation was less dire.
An editor for a southern California newspaper (I think it was the
Long Beach Press-Telegram) wrote an editorial condemning the use
of computer models in deciding public policy and suggesting they
should be banned. I wondered: would it be okay to use calculators?
How about counting on fingers? How low-tech and unconscious do our
models need to be to be acceptable to this editor? He obviously
missed the fact that opinions based on ignorance (and stuff
"everybody knows") are models too -- just very bad ones.
The pioneer of fluid dynamics and weather prediction Lewis Fry
Richardson in the 1930s faced the same kind of resistance when
he attempted to introduce the quantitative study of social science.
Several of his books spend their early pages issuing rebuttals to
the people who invoked free will, complexity, unpredictability and
the specter of moral decay to object to the idea of measuring
human behavior for statistical analysis. But Richardson was intent:
he wanted to end war, and he thought you needed mathematical models
to do it.
I for one also think mathematical models (and computer simulations)
are important in almost every field if only to make sure you really
understand the logical consequences of your own premises.
This talk of world resource modeling brings me right up to the
controversy surrounding Bjorn Lomborg
( www.lomborg.com )
and his book, "The Skeptical Environmentalist: Measuring the
Real State of the World" (2001).
( www.amazon.com/exec/obidos/ASIN/0521010683/hip-20 )
The topic is fraught with passions so I don't want to "stick my
foot in my mouth" prematurely; I'm going to read the book and do
other research before I weigh in. (But if Lomborg's allegations
hold water I would find it mighty ironic that some governments in
the 19th century wanted to suppress news of doom while some
environmentalists in the 21st century want to suppress news of
hope.)
Okay, enough about the world modeling scene for now.
Back to the larger question of what ever happened to cybernetics.
One rather obvious explanation is that it was just too specialized
an idea, focusing as it did on feedback systems. That looked like
the best place to advance knowledge in the 1940s, but later other
stuff piled on not directly related to feedback, such as a more
general theory of systems that processed information instead of
or in addition to mass and energy. Bertalanffy pointed this out
in his "General System Theory" (1968).
( www.amazon.com/exec/obidos/ASIN/0807604534/hip-20 )
CYBERNETICS is a theory of control systems based on
communication (transfer of information) between system
and environment and within the system, and control (feedback)
of the system's function in regard to environment. As
mentioned and to be discussed further, the model is of wide
application but should not be identified with "systems theory"
in general. In biology and other basic sciences, the cybernetic
model is apt to describe the formal structure of regulatory
mechanisms, e.g., by block and flow diagrams. Thus the
regulatory structure can be recognized, even when the actual
mechanisms remain unknown and undescribed, and the system is
a "black box" defined only by input and output. For similar
reasons, the same cybernetic scheme may apply to hydraulic,
electric, physiological, etc. systems. The highly elaborate
and sophisticated theory of servomechanisms in technology has
been applied to natural systems only in a limited extent
(cf. Bayliss, ["Living Control Systems"] 1966; Kalmus,
["Regulation and Control in Living Systems"] 1966; Milsum,
["Biological Control Systems Analysis] 1966).
Bertalanffy makes a case here for looking at complex systems as
sets of interacting components, usually modeled by a set of
ordinary differential equations (ODEs), which taken together define
how all the variables inter-relate, in a network of influences.
Now you're doing systems theory. If you trace a few closed loops
through the network and analyze them apart from the rest of the
system, now you're doing cybernetics. So cybernetics is a branch
of systems theory, sez he.
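That distinction is easy to make concrete. Here is a minimal sketch in Python: the "system" is a room and thermostat written as one ordinary differential equation in a small network of influences, and the "cybernetics" is the closed negative-feedback loop traced through it. The coefficients are illustrative only, not from any of the texts above:

```python
# Systems theory vs. cybernetics, in miniature: the system is an ODE
# relating room temperature to heater and environment; the cybernetic
# view traces the closed negative-feedback loop (thermostat reads the
# temperature, the temperature responds to the thermostat).
# Coefficients are illustrative only.

def simulate_thermostat(setpoint=20.0, outside=5.0, steps=500, dt=0.1):
    temp = outside           # state variable: room temperature (C)
    k_heater = 0.5           # proportional feedback gain of controller
    k_loss = 0.1             # heat-loss coefficient to the environment
    for _ in range(steps):
        heater = k_heater * (setpoint - temp)   # the feedback loop
        # dT/dt = heating - loss: one ODE in the network of influences
        temp += dt * (heater - k_loss * (temp - outside))
    return temp

print(f"room settles near {simulate_thermostat():.1f} C")
```

There is a classic cybernetic moral hiding in the numbers: pure proportional feedback settles near 17.5 C, not the 20 C setpoint. The loop trades accuracy for stability, which is the kind of behavior you only notice by analyzing the loop rather than the parts.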
When I enrolled at UCSC they offered a major in "computer science."
Then in the space of four years it was changed to "information and
computer science," then just "information science." I asked one
of the faculty, John Cunningham, why this was. He answered, "You
don't call it chemistry and test-tube science, do you? The computer
is just a tool we use."
It was out of this "Information Science" department that the only
class (freshman level) on "Cybernetics" was offered.
So, arguably, "Cybernetics" has disappeared into "Information Science,"
which is the kind of safe, boring communications engineering work that
Claude Shannon was doing at Bell Labs when he invented the "bit."
I've given some thought to why this might be. I recall the saying:
"Bottlenecks are usually at the tops of bottles."
The people who decide which departments are created in universities
tend to be a mix of administrators and donors. Administrators tend
to be risk-averse and agenda-driven; donors tend to be entrepreneurial
and very practical in their orientation to problems. Neither seems likely
to vote for a new theory of everything. Even the Macy conferences
were funded by a non-profit foundation whose mandate was the very
practical study of human medicine -- only in the conferences on the
human nervous system did they launch into a new roadmap for knowledge.
The Macy Foundation didn't order that.
Plus, the name "Cybernetics" is an attempt to communicate some
subtle truths about the bi-directional nature of control and
communication. I don't think most administrators and donors are
in any mood to absorb new, potentially-humbling insights about
the nature of control and communication.
Bateson used to say people wanted to believe in A causing B because
they wanted to believe in the immortality of their own ego.
Feedback reminded them too much of death.
Also, quite paradoxically, "Cybernetics" has been tainted from
both sides by seeming at once too "establishment" and too
"counter-culture." Its roots in "Operations Research" reminds
me of the old saying, "You can kill more men with a pencil in
Whitehall than with a rifle in Flanders," or whatever it is.
It was probably the application of the principles of Operations
Research in World War Two that led to the Allied fire-bombing
of the civilian city of Dresden, to draw German fighters away
from military targets. Later, defense secretary Robert McNamara
under Lyndon Johnson was a big fan of a systems approach to waging
the Viet Nam War. His attempts to maximize enemy body count led
to soldiers digging up graves and shooting water buffalos and
putting it all in body bags to count as dead Viet Cong.
And yet, in my attendance at seminars involving Buckminster
Fuller in the 1980s, as well as in my reading of cybernetics literature
from the 1970s through the 1990s, I found in both groups a large
number of people who are exceedingly averse to precise thinking, and
are seeking a refuge from rigor in holism. "I don't know what it
all means but I think it's great," is a frequent comment, along
with lip service to "systems" and "feedback," with no actual specifics.
And then there's pop culture and the pop-consuming masses, bless 'em.
From "Psycho-Cybernetics" to "cyborgs" they've used the "cyber-" prefix
to mean affirmations, computers, robots, artificial intelligence, and
virtual reality. The BBC radio comedy "The Hitchhiker's Guide to the
Galaxy" had some products from the "Sirius Cybernetics Corporation"
which included an ingratiating elevator and a depressed, cynical robot.
( www.amazon.com/exec/obidos/ASIN/B000008NC1/hip-20 )
The science fiction genre originally called "mirrorshades" later was
known as "cyberpunk," and equated "cyber-" with direct neural
interfaces to a simulated 3D data world called "cyberspace," first
mentioned in "Neuromancer" by William Gibson (1984).
( www.amazon.com/exec/obidos/ASIN/0441569595/hip-20 )
The saga continues on May 15, 2003 with the release of the new
"Matrix Reloaded" in IMAX 3D, the long-awaited first sequel to
"The Matrix," the mainstream cyberspace mega-hit movie from 1999.
( www.filmstew.com/Content/DailyNews/Details.asp?ContentID=5731&Pg=1 )
And the mainstream press picked up "cybersex" as a term for internet-
mediated erotic communications. Visit the on-line magazine "Salon"
( www.salon.com )
and search for "cybersex," to get an idea of the amount of virtual
ink spilled on this topic.
Lately I have a new pet theory about what went wrong with cybernetics.
It wasn't just the name, or the focus, or the metaphors. I think that
some of the early breakthroughs occurred because mathematicians were
drafted (or volunteered) in World War Two and given what were really
engineering problems to solve, and some of the insights from the
engineering leaked over into the math. This continued in the early
days of commercial computers in the 1950s, when mathematicians were
hired to be programmers because there were no programmers.
But eventually there were professional programmers, and all the
mathematicians went back to their chalk boards, and our society failed
to institutionalize the transfer of insight from engineering to science
that was involved in this one event, the creation of cybernetics.
The story is told of John von Neumann at the Institute for Advanced
Study in Princeton and how he fought to be able to build a digital
computer in the basement (after all, it was a theoretical institute,
not an engineering school). This is the same resistance that Wolfram
faces today; he's been staying up all night looking at computer screens
of cellular automata instead of acting like a theoretical researcher.
But, you know what? I do believe cybernetics is coming back.
The increasing number of "hits" I see on my web page, "A Curriculum
for Cybernetics and Systems Theory"
( www.well.com/~abs/curriculum.html )
as well as increasing requests for this e-Zine, assures me of that.
UCLA finally has its cybernetics major, which, as a mixture of
information and computer science, chemistry, and biology, seems
perfect for a genomics researcher in the 21st century. And the
"pop" world continues to create interest in "cyber" stuff.
There is even more crosstalk from engineering to science now;
since the laptop Personal Computer (PC) became ubiquitous in
the 1990s, we are all now our own secretaries and systems
administrators. We are all looking at screens more, learning
about spam, computer viruses and denial of service attacks, feeding
a lot of natural history about computer-assisted memes into our minds.
Well, that's what I think. But I have my biases. I see the world
from the southern end of California. I know that I have subscribers
to this e-Zine around the world, in Australia, Brazil, Canada, Chile,
Egypt, France, Hungary, India, Iran, Israel, Mexico, Nepal,
Netherlands, Poland, South Africa, and the United States. Surely
there is more to the story that I have missed. What ever happened
to cybernetics? What do you think?
Postscript: Cybernetics Journals
I got the following email recently, and wanted to share the exchange
with any of you who might find this useful. Thanks to Bill Moulton
for keeping abreast of this stuff and sharing his knowledge.
Subject: Academic journals on cybernetics
Date: Tue, 11 Feb 2003 23:54:36 -0800
From: "K_____ D_____" <____@____>
Hello Mr. Scrivener
I am looking for articles on Cybernetics and Systems Thinking from
academic journals. Any ideas?
Alan B. Scrivener wrote:
It seems to me you're my nearest expert on this subject.
Hey hi thar Alan,
To wit, here are some of my cyber-favs, pass it on!
Principia Cybernetics Project
Biological Cybernetics - journal
Control & Cybernetics - journal
Advances in Complex Systems
JASS (Journal of Applied Systems Studies) Journal
Systems Research and Behavioral Science
IEEE Systems, Man and Cybernetics
Open Systems & Information Dynamics
Cybernetics and Systems
International eJournal of Abstracts for Cybernetics and Systems
The Evolution of Cybernetics Journal
Cybernetics & Human Knowing: A Journal of Second Order Cybernetics & Cybersemiotics
Journal of Sociocybernetics
American Society for Cybernetics Archives
Computer & Computational Sciences (LANL)
The Cybernetics Society: Proceedings
The Ross Ashby Papers
The Heinz Von Foerster Papers
American Society for Cybernetics Papers
BCL (Biological Computer Laboratory) Publications
Gregory Bateson Assembled Links Site
Gordon Pask Archive
Website on Systems Thinking as applied to organization
Institute for Study of Coherence and Emergence
Journal of Mind and Behavior
New England Complex Systems Institute
General Systems Bulletin
Journal of Memetics
The Primer project:
The Stafford Beer Project - Team Syntegrity Model
History of Cybernetics slide show
Privacy Promise: Your email address will never be sold or given to
others. You will receive only the e-Zine C3M unless you opt-in to
receive occasional commercial offers directly from me, Alan Scrivener,
by sending email to email@example.com with the subject line "opt in" -- you
can always opt out again with the subject line "opt out" -- by default
you are opted out. To cancel the e-Zine entirely send the subject
line "unsubscribe" to me. I receive a commission on everything you
purchase during your session with Amazon.com after following one of my
links, which helps to support my research.
Copyright 2003 by Alan B. Scrivener