Cybernetics in the 3rd Millennium (C3M) --- Volume 8 Number 1, Apr. 2009
Alan B. Scrivener --- www.well.com/~abs --- mailto:firstname.lastname@example.org
~ or ~
Architecture in Buildings and Software
[If you haven't read part one, see the archives, listed at the end.]
"We apologize for the previous apology."
-- Monty Python
Patient readers, I'm sorry you've had to wait 14 months for part two
of this 'Zine. I've been concentrating on my self-published book
"A Survival Guide for the Traveling Techie" which is ALMOST ready
for the print-on-demand folks. Meanwhile, the Beta is still available.
( travelingtechie.com )
"I call architecture frozen music."
In my haste to get the last issue out I left out two items: this
picture I drew around 1995 to help me sort out architecture trends:
( i164.photobucket.com/albums/u12/c3m_2007/architecture.jpg )
and this actual map of the LA Architecture Tour that we used to navigate:
( i164.photobucket.com/albums/u12/c3m_2007/LA_arch.jpg )
"hardware - the easy part of the system"
-- Stan Kelly-Bootle, 1981
"The Devil's DP Dictionary"
( www.amazon.com/exec/obidos/ASIN/0070340226/hip-20 )
The first part of this 'Zine was devoted to architecture, a word
that can be translated as "top tech."
In his polemic on Modern Architecture, "From Bauhaus to Our House" (1981)
( www.amazon.com/exec/obidos/ASIN/055338063X/hip-20 )
Tom Wolfe talks about how, unlike many of the arts, architecture
is hard to ignore. When I was a boy, it was pretty easy to ignore
software architecture, except for the occasional oddball story in
Reader's Digest about a billing computer that went bonkers and
sent somebody a gas bill for a million dollars.
Today this is no longer the case. I remember in the early 1990s
I used to say "a bad day in the year 2000 is when your newspaper,
coffeemaker, car and phone all don't work due to software problems."
Well, I pretty much nailed that one. Nowadays software problems
are everybody's business, and represent one of the most efficient
ways we know to destroy value in a hurry.
DOING MY THINKING FOR ME
"Take for example when you go to the movies these days.
They try to sell you this jumbo drink 8 extra ounces of
watered down Cherry Coke for an extra 25 cents, I don't
want it, I don't want that much organization in my life.
I don't want other people thinking for me. I want my
Junior Mints! Where did Junior Mints go in the movies?
I don't want a 12 lb. Nestles Crunch for 25 dollars.
I want Junior Mints!"
-- Jimmy Buffett, 1994
"Fruitcakes" (song) on "Fruitcakes" (CD)
( www.amazon.com/exec/obidos/ASIN/B000002OS8/hip-20 )
The genesis of this issue was a series of three things that occurred
in June of 2005, when I was traveling extensively on business.
1) The first was that I read the book "The Inmates Are Running the
Asylum: Why High Tech Products Drive Us Crazy and How to Restore
the Sanity" (1999) by Alan Cooper.
( www.amazon.com/exec/obidos/ASIN/0672326140/hip-20 )
Most things I read come highly recommended by somebody -- my
wife, a friend, or at least some random blogger -- but this one
I found on a neighborhood library shelf, while dusting off my
old trick of starting at Dewey Decimal number 000 and working
my way through the books on epistemology, library science,
the unexplained, Atlantis, UFOs, and cybernetics. These days
you'll also find software design covered in those first few shelves.
Cooper gave me several useful mental tools, including the
idea that you can classify people by which way they want to
go when boarding an airplane. If you just want to get to
your seat and have a cocktail, and relax while you're flown
someplace, you want to turn right. If you want to see the
cockpit and all of the glowing gizmos that are modern aircraft
instruments and controls, you want to turn left. Our entire
civilization consists of a majority who want to turn right,
depending on technology engineered by a minority who want to turn left.
He described some recent events in which a U.S. Navy battleship
and a number of Porsche sportscars were rendered "dead in the
water" due to software bugs. He pointed out that both the
U.S. Navy and the Porsche SE company are usually pretty good
at getting what they want from engineers, the former driven by
urgency of surviving naval battles, and the latter maintaining
a 75 plus years reputation for excellence, and yet they're just
as helpless as everybody else at getting what they want from programmers.
This leads him to the question, "What do you get when you cross
a computer and a [fill in the blank]?" What do you get when you
cross a computer and a battleship? A computer. What do you get
when you cross a computer and a sportscar? A computer. What
do you get when you cross a computer and a video recorder? A
computer. And so on. What he means by this is whenever you
introduce computer technology into these devices, you inherit
the computer's failure modes. Your highly redundant Space
Shuttle loses its ability to launch and land due to a software bug.
( en.wikipedia.org/wiki/STS-1 )
Cooper also explains how "software engineers" (a.k.a. programmers,
developers, and "the inmates") jigger time estimates to achieve
outcomes they want. In most software development organizations,
the marketing department ostensibly controls what new features
and/or bug fixes will be in each release. But if the developers
don't want to work on something (because it is "inelegant"
or "boring" or "stupid" perhaps), all they have to do is give it
a very long time estimate, and marketing will say they can't wait,
go on without it.
But the most vivid mental tool I got from this book was his
usability analysis of two items usually found in my pocket:
* a Swiss Army knife
* an automobile remote entry fob
The Swiss Army knife, though quite dangerous, rarely injures
its users, because the danger is obvious. The available
functionalities are also obvious.
In contrast, the fob has this red button with a horn icon.
What does it do? It sets off the horn in alarm mode.
The designers typically explain that this is a security
feature, especially for young urban women in parking garages.
But Cooper and I have both found that (a) people universally
ignore car alarms, (b) no young urban women ever use this
feature, and (c) the only time it goes off is by mistake,
becoming, in effect, an "I'm stupid" button.
2) The second thing that happened was that right after reading
Cooper's book I was sent on a business trip from San Diego,
California to Vestal, New York, (near Binghamton, "Twilight
Zone" creator Rod Serling's home town) to solve a customer
problem that neither I nor my boss could solve over the phone.
We'd sent this guy a Microsoft Access database program, packed
inside a "zip" archive, pretty standard stuff. The only potential
curveball was that we had Windows 2000 in our lab, but the
customer had the new Windows XP which we weren't running yet.
But it shouldn't be an issue, right?
Because of another appointment in central Massachusetts and the
difficulty of lining up puddle-jumper flights that didn't take
all day, I ended up driving about five hours in a rented car
to Vestal, New York.
When I got there I solved the problem in a minute. Come to
find out, in XP the icon for a zipped archive isn't a little
filing cabinet in a vice anymore,
( i164.photobucket.com/albums/u12/c3m_2007/zip_2k.jpg )
it's a file folder with a zipper.
( i164.photobucket.com/albums/u12/c3m_2007/zip_xp.jpg )
When you double-click it, instead of unarchiving its contents, it
appears to OPEN THE FOLDER and show you what's "inside." There
was our database, still inside the zip archive. When you
double-clicked the database icon, it ACTUALLY RAN THE ACCESS
DATABASE, which did fine on all the data READS, but crashed the
first time it tried to WRITE data to the disk. Thank you
Microsoft, for trying to "help" me, adding half of a new feature.
3) The third thing that happened was on the way back from Vestal,
driving through the whole state of Massachusetts west to east
on the Mass. Pike, from Stockbridge to Concord, near Boston.
I took a detour to see the historic Wayside Inn in Sudbury.
( www.wayside.org )
It was the inspiration for Longfellow's "Tales of a Wayside Inn"
( www.amazon.com/exec/obidos/ASIN/1425519547/hip-20 )
and it has a beautiful old grist mill beside it, and I wanted
to take some pictures. I arrived at twilight and the light was
fading fast. I hurried to photograph the mill wheel and waterfall
( i164.photobucket.com/albums/u12/c3m_2007/Wayside_001.jpg )
down by the old mill stream,
( i164.photobucket.com/albums/u12/c3m_2007/Wayside_002.jpg )
and then I spotted an amusing sign in the colonial style that
said, "LONGFELLOW'S Wayside Inn FOOD, DRINK and LODGING for
MAN, WOMEN and BEAST" and I wanted to get one last picture
of it. Right then the batteries in my digital camera decided
to die. No worries, I had fresh batteries in the car.
After I replaced them the camera refused to go on until I set
the date and time. My watch had stopped, so I sat down
in the driver's seat, inserted the ignition key, and turned it
until the clock radio lit up, and noted the time. Then, unable
to clearly see the little tiny buttons on the camera, I got back
out of the car and put the camera on the roof so I could see
it better in the twilight. After I set the time I closed the
car door and walked over to the funny sign and got my picture.
( http://i164.photobucket.com/albums/u12/c3m_2007/Wayside_003.jpg )
Upon returning to the car I discovered that the car had decided
to "help" me by locking the doors. The key was in the ignition,
in the on position, and the doors were closed, so I MUST want
the door locked, right? Of course, my cell phone was inside the car.
( i164.photobucket.com/albums/u12/c3m_2007/Wayside_004.jpg )
It took walking over to the inn several times to beg the
use of their phone, calling triple-A repeatedly, and otherwise
waiting 90 minutes by the car (I knew tow truck drivers don't
look very long and hard for you) while mosquitos attacked --
it was their favorite time of day! -- before I got back into the car.
Once again I had been tripped up by engineers trying a little
too hard to be helpful but not getting it right. It was just
like Cooper's analogy of the Swiss Army knife and the key fob.
I began muttering the quote above from Jimmy Buffett about how
"I don't want other people thinking for me. I want my Junior Mints!"
And so this was the genesis for this issue. Despite having heard my whole
life that you should "write about what you know," this will only be the
third time I've composed an essay having anything to do with software
design, a field I have worked in for three decades. (The first essay
was on user interface design, in 1980, and the second was on visual
programming, in 1993.) But I've reached a threshold of being fed up
with how badly the whole thing is done, and feel I must protest the sins
of my own profession.
"On two occasions I have been asked, 'Pray, Mr. Babbage, if you put
into the machine wrong figures, will the right answers come out?'
I am not able rightly to apprehend the kind of confusion of ideas
that could provoke such a question."
-- Charles Babbage
In his books on the history of ideas, "The Idea of Nature" (1945)
( www.amazon.com/exec/obidos/ASIN/0195002172/hip-20 )
and "The Idea of History" (1928),
( www.amazon.com/exec/obidos/ASIN/0198239572/hip-20 )
R. G. Collingwood reminds us how difficult it is to imagine not having an
idea. Imagine not having the idea of "software," which is actually pretty
odd. Indeed, in the intellectual climate of the 19th century the basic
concepts of computer programming were very alien. It was into
this world that Charles Babbage attempted to introduce his concepts
of "the Difference Engine" and "the Analytical Engine," and his
friend and collaborator, Lord Byron's daughter, Lady Ada Lovelace,
was the first person to write software documentation, and also
invented the GOTO statement.
Babbage's analytical engine, which ran on steam and used gears to build
its logic, had the same architecture as electronic computers when they
were independently reinvented almost a century later. What he called
the MILL we call the PROCESSOR. What he called the STORE we call the MEMORY.
We look in vain in our history and mythology for guidance in managing
this new magic-like power of the mechanical mind, and find very little:
the Sorcerer's Apprentice, Hercules vs. the Hydra, and Jesus' Parable
of the Mustard Seed are all that come to my mind.
"What most experimenters take for granted before they begin
their experiments is infinitely more interesting than any
results to which their experiments lead."
-- Norbert Wiener
( thinkexist.com/quotes/norbert_wiener/ )
It is quite amazing how the mathematics, engineering and applications
of computers interacted in the "pre-history" days before we had the
modern programmable electronic computer. I have written before,
in C3M volume 5 number 2, "Even Better Than the Real Thing,"
( www.well.com/~abs/Cyb/4.669211660910299067185320382047/c3m_0502.txt )
of "A Computer Perspective" (1973),
[A] deep and thought-provoking museum exhibit and book of the same
name ... by the incomparable Charles and Ray Eames,
( www.amazon.com/exec/obidos/ASIN/0674156269/hip-20 )
that tells the story of computing devices before the computer, including
the weather-prediction work of Lewis Fry Richardson, the calculating
machines of Pascal, and the integrating analog computers of Lord Kelvin.
( en.wikipedia.org/wiki/William_Thomson,_1st_Baron_Kelvin )
I am particularly intrigued by the latter, which use disks, balls and
cylinders to find the integral of a rotating shaft's angular velocity.
( upload.wikimedia.org/wikipedia/commons/thumb/2/24/Harmonic_analyser_disc_and_sphere.jpg/800px-Harmonic_analyser_disc_and_sphere.jpg )
In the picture, the disk is spinning with an input velocity, the ball
is constrained to move only left and right, with a spring pushing left
and centrifugal force from the spinning disk pushing it right, and the
cylinder matching the ball's speed, as it traces a larger or smaller
path along the disk.
Once he had an integrator he could -- using Fourier's Integral -- build
a harmonic analyzer, precursor to today's oscilloscopes with waveform
capture and analysis software, such as the "Tektronix WaveStar."
( www.testequity.com/products/567 )
This also enabled a practical application that required harmonic
analysis: tide prediction.
( en.wikipedia.org/wiki/File:DSCN1739-thomson-tide-machine.jpg )
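Kelvin's machine did this synthesis mechanically, with pulleys and wires summing the motions of geared cranks. In modern terms the computation is just a sum of cosines. Here is a minimal Python sketch; the constituent amplitudes, speeds, and phases below are made up for illustration (real ones come from harmonic analysis of observed tides at a particular port):

```python
import math

# What Kelvin's tide predictor computed, in modern terms: the tide is a
# sum of cosine "constituents." The values below are MADE UP for
# illustration; real ones come from harmonic analysis of observations.
CONSTITUENTS = [
    # (amplitude in meters, speed in degrees per hour, phase in degrees)
    (1.20, 28.984, 40.0),   # a principal lunar semidiurnal (M2-like) term
    (0.45, 30.000, 75.0),   # a principal solar semidiurnal (S2-like) term
    (0.25, 15.041, 10.0),   # a lunisolar diurnal (K1-like) term
]

def tide_height(t_hours, mean_level=2.0):
    """Sum the harmonic constituents at time t (in hours)."""
    h = mean_level
    for amp, speed, phase in CONSTITUENTS:
        h += amp * math.cos(math.radians(speed * t_hours - phase))
    return h

# Tabulate one day of predicted heights, one value per hour -- the
# machine drew this same curve on a paper roll.
table = [round(tide_height(t), 2) for t in range(24)]
```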
Another fascinating history is "The Computer from Pascal to von Neumann"
(1972) by Herman H. Goldstine.
( www.amazon.com/exec/obidos/ASIN/0691023670/hip-20 )
Just a reading of his table of contents is quite illuminating.
( i164.photobucket.com/albums/u12/c3m_2007/Goldstine_toc1.jpg )
( i164.photobucket.com/albums/u12/c3m_2007/Goldstine_toc2.jpg )
Of course, the great theoretical breakthrough in computing came with
Alan M. Turing's 1936 paper in the Proceedings of the London Mathematical
Society, with the title nobody likes to cite, just saying "Turing 1936."
It is "On Computable Numbers, with an Application to the Entscheidungsproblem."
( en.wikipedia.org/wiki/Alan_Turing )
Turing managed to prove some remarkable things about "computability,"
creating a family of mathematical models called Turing machines, and then
proving that a Universal Turing Machine (UTM) existed that could EMULATE
all the other machines. Turing argued that anything we would consider
"computable" can be done by a Universal Turing Machine. From this
theory the term "Turing Complete" has entered the jargon of programming,
to describe any model, computer language, or mechanism that can do what
a UTM can do. In "A New Kind of Science" (2002)
( www.amazon.com/exec/obidos/ASIN/1579550088/hip-20 )
Stephen Wolfram published a proof (worked out by his assistant Matthew
Cook) that at least one of the 256 simplest cellular automaton models,
rule 110, is Turing Complete.
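These elementary cellular automata are simple enough to simulate in a few lines. A sketch in Python: each cell's next value is a bit of the rule number, selected by the three-cell neighborhood read as a number from 0 to 7.

```python
RULE = 110   # one of the 256 elementary cellular automaton rules

def step(cells, rule=RULE):
    """One update: each cell's new value is the bit of `rule` indexed
    by its (left, center, right) neighborhood, read as a 3-bit number."""
    n = len(cells)
    new = []
    for i in range(n):
        left = cells[(i - 1) % n]        # wrap around at the edges
        center = cells[i]
        right = cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right   # 0..7
        new.append((rule >> neighborhood) & 1)
    return new

# Start from a single live cell and run a few generations.
row = [0] * 16
row[8] = 1
history = [row]
for _ in range(8):
    history.append(step(history[-1]))
```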
Turing also gave us the "Halting Problem"
( en.wikipedia.org/wiki/Halting_problem )
which says that no Turing Machine can reliably tell -- in the general
case -- if another Turing Machine will finish a computation, or get
stuck in an infinite loop. This explains why a program's own estimate
of when it will finish is almost always wrong. The little blue
rectangles showing progress, they lie. Sometimes they even go backwards!
I've seen it!
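The core of Turing's argument can even be sketched in Python: given ANY claimed halting-decider, you can build a "contrarian" program that does the opposite of whatever the decider predicts about it, so the decider must be wrong somewhere. (The decider passed in below is a deliberately wrong stand-in; no correct one can exist, which is the point.)

```python
# A sketch of Turing's diagonal argument. Given any claimed
# halting-decider, build a program that defeats it: the program loops
# forever exactly when the decider says it halts, and vice versa.
def make_contrarian(claimed_halts):
    def contrarian():
        if claimed_halts(contrarian):
            while True:       # decider said "halts," so loop forever
                pass
        return "halted"       # decider said "loops," so halt at once
    return contrarian

# Try it on a (wrong) decider that pessimistically answers "never halts":
g = make_contrarian(lambda prog: False)
verdict = g()   # the contrarian promptly halts, refuting that decider
```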
(And, of course, Turing gave us the Turing Test for Artificial Intelligence.)
( en.wikipedia.org/wiki/Turing_test )
Turing's ideas are intimately intertwined with Kurt Godel's famous
incompleteness theorem, and the so-called "predicate calculus" which
eerily resembles a programming language though it was developed before
the computer. All of this is well-explained in "Godel, Escher, Bach:
An Eternal Golden Braid" (1979) by Douglas Hofstadter.
( www.amazon.com/exec/obidos/ASIN/0465026567/hip-20 )
What nobody seemed to realize until everything was declassified 25 years
later was that Turing actually BUILT some UTMs, during World War II, as
part of the secret cryptography work done by the British at Bletchley Park.
This is why William Gibson, in his cyberpunk novel "Count Zero" (1987),
( www.amazon.com/exec/obidos/ASIN/0441117732/hip-20 )
( project.cyberpunk.ru/lib/count_zero/ )
has one of his characters -- an expert on Intrusion Countermeasure
Electronics (ICE) -- say:
"I been in the trade forever. Way back. Before the war,
before there was any matrix, or anyway before people knew
there was one. ... I got a pair of shoes older than you are,
so what the #### should I expect you to know? There were cowboys
[i.e., system crackers] ever since there were computers. They
built the first computers to crack German ice. Right?
Codebreakers. So there was ice before computers, you wanna
look at it that way."
This is documented in the nonfiction book "The Codebreakers: The
Comprehensive History of Secret Communication from Ancient Times to
the Internet" (1996) by David Kahn
( www.amazon.com/exec/obidos/ASIN/0684831309/hip-20 )
and fictionalized nicely in the novel "Cryptonomicon" (1999) by Neal Stephenson.
( www.amazon.com/exec/obidos/ASIN/0060512806/hip-20 )
Meanwhile, at Los Alamos, the Manhattan Project scientists building the first
atom bombs had to do their complex implosion calculations using old-style
"computers" -- human beings operating adding machines, enthusiastically
lead by physicist Richard Feynman. (The Los Alamos guys wouldn't
get an electronic computer until after the war when John von Neumann went
to the Institute for Advanced Study in Princeton and got one built for them.)
THE BOOTSTRAPPING ERA
COUNT ZERO INTERRUPT -- On receiving an interrupt, decrement
the counter to zero.
He'd only had the Ono-Sendai deck for a month, but he already knew
he wanted to be more than some Barrytown hotdogger. Bobby Newmark,
aka Count Zero, but it was already over. Shows never ended this
way, not right at the beginning.
"You've probably seen one of these before," Beauvoir said, as the man
he called Lucas put the projection tank down on the table, having
methodically cleared a space for it.
"In school," Bobby said. ... "They used one to teach us our way
around in the matrix, how to access stuff from the print library,
"Well, then," Lucas said, straightening up and brushing nonexistent
dust from his big pink palms, "did you ever use it for that, to
access print books?" He'd removed his immaculate black suit coat;
his spotless white shirt was traversed by a pair of slender maroon
suspenders, and he'd loosened the knot of his plain black tie.
"I don't read too well," Bobby said. "I mean, I can, but it's work.
But yeah, I did. I looked at some real old books on the matrix and
"I thought you had," Lucas said, jacking some kind of small deck
into the console that formed the base of the tank. "Count Zero.
Count zero interrupt. Old programmer talk."
-- William Gibson, 1987
"Count Zero" (cyberpunk sci-fi novel)
( www.amazon.com/exec/obidos/ASIN/0441117732/hip-20 )
( project.cyberpunk.ru/lib/count_zero/ )
As I have studied the early history of electronic computers and
how they were programmed, one thing that stands out is how slow progress
seemed at first. Early models had their programs entered by connecting
patch cords like those used in the old telephone switchboards. The famous
"von Neumann Architecture" involved storing the instructions in the same
kind of memory as the data. The invention of the "assembly language" allowed
the use of mnemonics, little nick-names for the instructions like ADD
and JUMP instead of binary codes like 0101 and 0110. A special program
called "the assembler" converted the assembly language into the binary
codes, saving the programmer one step in the tedium.
After some false starts the concept of a "parameter" was developed,
a place to put numbers to be operated on, sort of like the "x"
in "f(x)", so that a sequence of instructions could be re-used.
The invention of "relocatable" code, which required "relative" jumps,
was another step in the direction of modularity. A new program
was created, the "loader," to load a relocatable program, later renamed
the "linker" as it was taught to thread bits of code together.
Then the idea of the "subroutine," which required a "call return stack"
got things really cooking. Then the "interrupt" was invented, which
required a bit of saving and restoring partially done calculations,
but which allowed a computer to temporarily pay attention to a transitory
problem, allowing the programmers to express problems in a way closer to
the way they thought.
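The subroutine's dependence on a call/return stack can be sketched with a toy machine (invented for illustration, not any real instruction set): each CALL pushes the address to come back to, so calls can nest and the same routine can be reused from anywhere.

```python
# A toy machine showing why subroutines need a call/return stack:
# each CALL saves the address to resume at, so calls can nest.
def run_program(program):
    pc = 0        # program counter: which instruction is next
    stack = []    # the call/return stack
    output = []
    while pc < len(program):
        op, arg = program[pc]
        if op == "CALL":
            stack.append(pc + 1)   # remember where to come back to
            pc = arg               # jump to the subroutine
        elif op == "RET":
            pc = stack.pop()       # resume the caller
        elif op == "PRINT":
            output.append(arg)
            pc += 1
        elif op == "HALT":
            break
    return output

# The main routine calls the subroutine at address 4 twice; the stack
# makes each RET return to the right place.
program = [
    ("CALL", 4), ("CALL", 4), ("PRINT", "done"), ("HALT", None),  # main
    ("PRINT", "sub"), ("RET", None),                              # subroutine
]
```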
The eternal quest was for "a calculus of intention," a term coined
by neuroscientist Warren McCulloch, according to Paul Ryan in the 1971
article "Cybernetic, Guerrilla Warfare" in "Radical Software" v. 1 n. 3
( www.radicalsoftware.org/e/volume1nr3.html )
We're still not there, searching for the magic "DWIM" instruction
("do what I mean!") that would avoid this old lament:
I really hate this damned machine
I wish that they would sell it.
It never does what I want
Only what I tell it.
-- Men's room wall at Berkeley
But in those early days, the programmer was almost always the
electronic engineer who built the hardware. The creation of
a separate programmer job hadn't happened yet.
The engineers were used to tedium. After painstakingly building
an electronic computer out of relays or vacuum tubes or even
transistors, the engineer would then painstakingly program it,
often by toggling in bits on front panel switches. When paper and
magnetic tape storage of programs came along, they had less to toggle in,
just the "bootstrap routine" that taught the computer how to read in
a tape, allowing it to "raise itself up by its bootstraps."
Something else that hadn't happened yet: the people programming
these computers had not yet realized how much of their time they
would spend finding and correcting their own errors, or "bugs."
In "The Airy tape: An Early Chapter in the History of Debugging"
Annals of the History of Computing, IEEE, Volume 14, Issue 4, 1992,
( ieeexplore.ieee.org/iel4/85/5016/00194051.pdf?arnumber=194051 )
M. Campbell-Kelly describes the discovery of a paper-tape relic consisting
of a draft (buggy) program written for the EDSAC computer in 1949, possibly
the first real, nontrivial application ever written for a stored-program computer:
...the Moore School of Electrical Engineering organized a summer
school entitled "Theory and Techniques for Design of Electronic
Digital Computers," which took place during July and August 1946.
A number of computer programs were given by the lecturers on the
course for expository purposes, but the fact that they contained
some obvious errors caused no special comment because, like
typographical errors in a mathematical proof, they did not affect
the general principles involved. In fact, no evidence has yet
emerged that anyone had conceived of the debugging problem until
programs were tried on real computers. ... Nowhere in the ...
150 pages of ... reports [from the Princeton group] is the
possibility that a program might not work the first time it was
presented to a computer so much as hinted at. Consequently, there
was a surprise in store for the first group that completed a
digital computer and attempted to run a nontrivial program on it.
This turned out to be the EDSAC computer group, in the summer of 1949.
He quotes the original author of the first EDSAC program, Maurice Wilkes,
as he recalls his discovery of the error-correction problem in his Memoirs:
By June 1949 people had begun to realize that it was not so easy
to get a program right as had at one time appeared. I well remember
when this realization first came on me with full force. The EDSAC
was on the top floor of the building and the tape-punching and
editing equipment one floor below on a gallery that ran round the
room in which the differential analyser was installed. I was trying
to get working my first non-trivial program, which was one for the
numerical integration of Airy's differential equation. It was on
one of my journeys between the EDSAC room and the punching equipment
that "hesitating at the angles of stairs" the realization came over
me with full force that a good part of the remainder of my life was
going to be spent in finding errors in my own programs.
The process of correcting the errors, or "bugs," in software was so new
that it didn't have a name yet.
WE'RE DEBUGGING IT
"I meant to do that."
-- Pee Wee Herman
There has been a bit of a flap as some Grace Hopper boosters have
credited her with inventing the term "bug" to describe a problem,
when that term dates back to the 19th century.
( www.xnetmedia.com/users/antibug/en/bug.html )
( en.wikipedia.org/wiki/Grace_Hopper )
According to her own account in "Anecdotes," Annals of the History of Computing:
In the summer of 1945 we were building Mark II; we had to build it in
an awful rush -- it was wartime -- out of components we could get our
hands on. We were working in a World War I temporary building. It was
a hot summer and there was no air-conditioning, so all the windows
were open. Mark II stopped, and we were trying to get her going. We
finally found the relay that had failed. Inside the relay -- and
these were large -- was a moth that had been beaten to death by the
relay. We got a pair of tweezers. Very carefully we took the moth
out of the relay, put it in the logbook, and put scotch tape over it.
Now, Commander Howard Aiken had a habit of coming into the room and
saying, "Are you making any numbers?" We had to have an excuse when
we weren't making any numbers. From then on if we weren't making any
numbers, we told him that we were debugging the computer.
( cs.anu.edu.au/Student/comp2100/lectures/FirstBugCorrections.html )
What Captain Hopper actually deserves credit for is coining the verb
"to debug," an activity that definitely needed a name. It has long been known,
for example, that mathematicians work backwards from their conclusions
to their premises, and then publish the forward sequence as if that's
how the work was done. The work is already debugged when it is published.
Mathematicians seem ashamed of the jerky paths they followed to reach
their results, and usually don't share them. This attitude spills
over into engineering sometimes, where it is even more dangerous.
With machines, failure has a mathematics all its own.
Herbert Simon in "Sciences of the Artificial" (1969)
( www.amazon.com/exec/obidos/ASIN/0262691914/hip-20 )
provides a rigorous definition of "mechanism," but that definition
applies only to a machine functioning correctly, so that it matches
the model it is based on. The model breaks down when a machine fails,
needing to be replaced with a far more complex model of a malfunctioning machine.
From a programmer's point of view, hiding failure is disastrous.
Many more things can go wrong after an initial failure that "hides
its tracks." There is even a programming principle, "fail early,"
which recommends grinding to a halt at the first sign of incongruity,
before more damage can be done (to a database, or a robot, or even a person).
The 10-hour drive to extract a zip archive which I described at the
beginning of this 'Zine could have been avoided if Microsoft's
programmers had followed the "fail early" principle more closely.
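The "fail early" principle is easy to illustrate. In this hypothetical Python sketch (the function and its validation limits are invented), bad input is rejected loudly at the boundary instead of being allowed to wander deeper into the system:

```python
# "Fail early" as a sketch: validate at the boundary and stop at the
# first incongruity, rather than letting bad data travel deeper into
# the system and corrupt something far from the real cause.
def set_gas_bill(amount_dollars):
    # A loud failure here is cheap; a million-dollar bill mailed to a
    # customer (or a corrupted database) is expensive to undo.
    if not (0 <= amount_dollars < 10_000):
        raise ValueError(f"implausible gas bill: ${amount_dollars}")
    return {"amount": amount_dollars, "status": "queued"}

bill = set_gas_bill(42.50)        # a plausible bill goes through
try:
    set_gas_bill(1_000_000)       # the bonkers bill is stopped at once
    problem = None
except ValueError as err:
    problem = str(err)
```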
THE MAINFRAME ERA
"Dear God I am so bored."
-- inscribed by an operator on the NORAD SAGE computer,
(in use from 1962 until 1983), watching for a Soviet
nuclear attack that never came; on display at the
Computer Museum in Boston from the late 1980s until 1999
( www.smecc.org/sage_a_n_fsq-7.htm )
The funny thing about "mainframes" is that nobody called them
that until they were obsolesced by minicomputers in the 1970s. We
just called them "computers."
I wrote my first program in 1967, at age 14, in FORTRAN, on a piece of
notebook paper, after reading a book about the language. It did long
division in decimal the way I had been taught in grade school. I wish
I still had it; I'd love to see how many bugs it has! The first
program I wrote that a computer actually RAN was in 1971, when I was 18,
in the ALGOL language, punched into cards for an IBM 360-30 mainframe,
for a college freshman programming class. We handed our decks through
a slot in a glass wall, and 15 to 120 minutes later got back a printout.
My final project for the class was a program that would play solitaire
with itself and tell me who won. When I found the card deck in my garage
a dozen years later, I fell down laughing at what a stupid concept that was.
But before all that I read everything I could get my hands on ABOUT
computers. Most of it was hogwash, anecdotes like "whenever it breaks
down we have to get that little red-headed programmer in here. She's the
only one it likes." Clearly the authors had no idea what they were
writing about. I learned that a computer could NOT RUN without software,
but not what software was.
Once I saw media critic Marshall McLuhan on TV and he was saying that
the biggest threat to privacy ever created was the computer. This
made no sense to me; I imagined a refrigerator-sized box wearing
sneakers and peeking in someone's bedroom window. Huh? (Of course
he meant that the ability to aggregate, sort and retrieve facts on
people quickly was a privacy threat, which I eventually realized many
years later.)
Finally I happened upon a book that explained it this way: a computer is
like a rail switching yard. There are zillions of side-tracks with short
trains of boxcars, each all black or all white. A little switcher engine
grabs these little trains one at a time and pulls them into a special
sidetrack called the "instruction register." Then a yard worker in a tower
looks down at the little train, and DEPENDING ON THE ORDER OF BLACK AND
WHITE BOXCARS, directs other switcher engines to move little trains of
boxcars around. And so on. Add an "arithmetic unit" to duplicate and
re-arrange some of the trains, and some I/O (input/output), and you've
got a computer! (It's even Turing-complete.) I got it, at about age ten.
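That switching-yard picture maps directly onto the fetch-decode-execute cycle. Here is a minimal Python sketch of a made-up machine (the opcodes and word layout are invented for illustration): each word of memory is a "train of boxcars" whose high bits are the opcode and low bits an address, and the "tower worker" is the decode step.

```python
# The switching-yard analogy as a fetch/decode/execute loop.
LOAD, ADD, STORE, HALT = 0, 1, 2, 3   # invented opcodes

def run_machine(memory):
    pc, acc = 0, 0                        # program counter, accumulator
    while True:
        word = memory[pc]                 # FETCH the next "train"
        op, addr = word >> 4, word & 0b1111   # DECODE: opcode + address
        pc += 1
        if op == LOAD:                    # EXECUTE...
            acc = memory[addr]
        elif op == ADD:
            acc += memory[addr]
        elif op == STORE:
            memory[addr] = acc
        elif op == HALT:
            return memory

# Program: load memory[8], add memory[9], store the sum in memory[10].
memory = [
    (LOAD << 4) | 8, (ADD << 4) | 9, (STORE << 4) | 10, (HALT << 4) | 0,
    0, 0, 0, 0,      # unused
    2, 3, 0,         # data: 2 + 3, with a slot for the result
]
result = run_machine(memory)[10]
```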
Later in life I found studying the right comics to be a great way to
learn about computers. Richard L. Didday's "Finite State Fantasies" (1976)
( www.free-soil.org/tour/history.html )
and Ted Nelson's "Computer Lib/Dream Machines" (1974) come to mind.
( www.amazon.com/exec/obidos/ASIN/0914845497/hip-20 )
( www.digibarn.com/collections/books/computer-lib )
Studying Babbage's design for the Analytical Engine in detail is also
a great boon to understanding computers. (There is now a Java applet
emulator of the Analytical Engine on the web.)
( www.fourmilab.ch/babbage )
Babbage got his funding from the Navy, who wanted to automatically
print ballistics tables. When Grace Hopper found the dead moth
in the relay nearly a century later, it was still Navy money paying
for the computing machines, and with the same application in mind.
With the end of the war and the beginning of commercial uses of computers,
the new biggest funding source was insurance companies. They had
accumulated statistical information from all of their field offices
on millions of people's lifespans, and now wanted to use this hard-won
data to calculate aggregate actuarial tables.
The company that made the most hay in the mainframe era was, of course,
International Business Machines, founded by Herman Hollerith
right across the Susquehanna River from Vestal, NY (remember Vestal?)
in Endicott. As the Wikipedia article on IBM explains:
( en.wikipedia.org/wiki/Ibm )
The company which became IBM was founded in 1896 as the Tabulating
Machine Company by Herman Hollerith, in Broome County, New York
(Endicott, New York (or Binghamton New York), where it still
maintains very limited operations). It was incorporated as
Computing Tabulating Recording Corporation (CTR) on June 16, 1911,
and was listed on the New York Stock Exchange in 1916.
Again, Binghamton was the prototype for Rod Serling's "Twilight Zone."
And of course, the great software management classic, "The Mythical
Man-Month" (1975) by Frederick P. Brooks,
( www.amazon.com/exec/obidos/ASIN/0201835959/hip-20 )
was a product of the mainframe era.
Some of the culture of this time is also to be found in a humorous lexicon
inspired by Ambrose Bierce's sarcastic classic "Devil's Dictionary" (1881),
( www.amazon.com/exec/obidos/ASIN/1595479406/hip-20 )
( classiclit.about.com/library/bl-etexts/abierce/bl-abierce-a.htm )
called "The Devil's DP Dictionary" (1981) by Stan Kelly-Bootle,
( www.amazon.com/exec/obidos/ASIN/0070340226/hip-20 )
which gave us classics such as:
implementation - n. The fruitless struggle by the talented and underpaid
to fulfill promises made by the rich and ignorant.
infinite loop - see loop, infinite
loop, infinite - see infinite loop
recursive - see recursive
A lot of what I've learned about the mainframe era (besides my own
late-night ALGOL-coding sessions in the 1970s) came when I had many
opportunities to hang out at the Computer Museum in Boston in the 1990s
(alas now defunct). One thing I thought was cute was that when these
insurance companies got their brand new computers they needed to hire
programmers for them, but there was no such job and almost no one had
any experience. So they put ads in newspapers for people with mathematics
degrees who also played chess, thinking this combination might do the
trick -- and apparently it did.
Another thing I learned, that I've thought about a lot since, had to
do with how the computers were programmed -- and debugged. There was
a console with a teletype, but that was for scheduling and monitoring
jobs. Programmers were expected to key their programs into cardpunch
machines and load the card decks into card readers connected to the
computer in order to run their programs. Typing programs in on the
console was possible, but frowned upon, because computer time was so
INCREDIBLY EXPENSIVE. But this was the favorite way for programmers
to debug their code, because they could make a change and re-run a program
quickly, speeding up the feedback loop between human and machine. Finally
management banned programmers from the console entirely, and hired special
"computer operators" who DIDN'T know how to program to run the jobs
and monitor them. But some of them SECRETLY learned to program, and
would sneak in debugging time, and so it went.
THE PESKY HUMAN FACTOR
"Mind in the loop."
-- motto of Mindtel Corp.
( www.mindtel.com )
Grace Hopper went on to pioneer the compiler; the language FORTRAN,
short for FORMULA TRANSLATOR, developed by John Backus's team at IBM,
became the first widely adopted "high level" language.
( en.wikipedia.org/wiki/Fortran )
The story is told how the FORTRAN group got the first compiler working,
and tested it with their own programs, and then went looking for a
"naive user" unconnected with their project to be a guinea pig. They
taught him FORTRAN and had him write a program. He punched it
in and gave it to the compiler, which spit out an error: "Missing
right parenthesis." At this point this early "software developer"
asked a question that has never been answered to my satisfaction:
"If the computer knew what was wrong, why didn't it fix it?"
The "human factor" in software design has consistently been the hardest
part to get right since the mainframe era. Having engineers even
think in terms of human psychology has been an uphill struggle, though
after the fact (and after Steve Jobs proved that good user interface
design can make you a billionaire) the "human engineering" approach has
been acknowledged as vital.
On several occasions I met an aviation pioneer named George Hoover,
( www.astronautix.com/astros/hooeorge.htm )
who claimed to have coined the term "human engineer" when he wanted
to hire a perceptual psychologist for cockpit design work.
When Walt Disney began designing the dark rides for Disneyland in
the early 1950s, he was able to hire aerospace engineers in the Los
Angeles area with "man-rated" experience working on aircraft cockpits.
Another famous "man-rated" device was the aircraft simulator with
motion base, which appeared in the form of the Link Trainer, in
Binghamton, New York in 1927. (George Hoover claimed the first design
was a failed attempt at a carnival ride, and it was HIS idea to make
it a trainer.) Today one of the early units is on display in the
baggage claim of the Binghamton Airport.
( en.wikipedia.org/wiki/Link_Trainer )
(Off topic: while I was working on this 'Zine the news had a story about
a laid-off IBM employee who shot up an office in Binghamton. While searching
Google News for that story just now I found this one instead:
"Binghamton earns Tree City USA title"
News 10 Now - 2 hours ago
BINGHAMTON, NY -- Binghamton celebrated the designation of
"Tree City USA" for the second year in a row...)
I remember the first time I visited Rockwell International's Space Systems
Division in Downey, California (an LA suburb), which was where space shuttles
were designed and built, as well as the Apollo capsules before them.
The engineers took me into a giant hangar and showed me a huge wall that
looked like a sideways model railroad. Miniature grass, lakes, trees,
buildings, roads and such covered a wall dozens of feet high, forming the
geography of Kennedy Space Center at Cape Canaveral, Florida. A little
video camera on an XYZ gantry, gimbaled for roll, pitch and yaw,
could "fly" over this wall and produce real-time video of what looked
like the actual cockpit view out of a landing space shuttle orbiter.
( en.wikipedia.org/wiki/Space_Shuttle_Orbiter )
A full-sized plywood mock-up of the orbiter cockpit had the instruments
in place, and TV screens for the windshields, showing the video from the
giant wall and interacting when you operated the cockpit controls correctly.
You could land it correctly, or crash it, depending on your skill.
It was in essence a simulator with an analog graphics system. So I asked
my escort, "Was this to train the astronauts?" No, there was another
simulator in Houston to do that. This system was built expressly and only
to test software the way it would be used in flight, with a "man in the loop."
That first visit was in 1983, when I was working for a company trying to
sell Rockwell computer graphics systems. Well, they bought a few, and later
I went to work for them programming one, which was kind of "instant
karma" since I'd designed the Application Program Interface (API).
While getting my first taste of the ramifications of my own design decisions,
I got to know my neighbors in the aerospace cube farm, including two
ladies who were called "human factors engineers." They were really
good at answering questions like "what are the longest and shortest arms
you'll find on 90% of the American male population?" They also designed
animated humans for us to test in our computer graphics world of space
shuttles and space stations.
Well, that was the only job I ever had where we had human factors
experts by name on the payroll. (Oh, wait, there was a woman who
showed up a few months before a dot-com imploded who was supposed
to be a UI expert, but she didn't really do much.)
Instead we expect the engineers to be the human engineers as well.
My experience is that this doesn't work so well. Organizations
seem to excel at filtering out customer feedback, passing it through
technical sales and product marketing before it gets to engineering,
garbling all the way like a game of "telephone."
The attitude towards users ranges from the bug-eyed marketing guy
in Dilbert comics
( www.amazon.com/exec/obidos/ASIN/0740772279/hip-20 )
who, when Dilbert points out his new keyboard mock-up has no "Q" key says,
"You sound just like our whiney customers," to the movie "Tron" (1982)
( www.amazon.com/exec/obidos/ASIN/B00005OCMR/hip-20 )
in which the critters inside the computer have made belief in the
"users" a banned religion.
The fact is, the human is the most flexible and yet the least understood
component in the feedback loop between human and machine. All those
who insist software design must be "logical" and "rational" miss this.
In my experience it is more useful for it to be metaphorical and intuitive.
And though Moore's Law continues to chug along giving us exponentially
faster, cheaper, more powerful, lower-energy-using computers, the real
breakthroughs occur when we get new I/O devices, and those are
remarkably rare. In the early 1990s I did some research on the
origins of some common human-machine interfaces, and included it in
a paper I presented at the conference "Medicine Meets Virtual Reality II
-- Interactive Technology and Healthcare: Visionary Applications for
Simulation, Visualization and Robotics," January 27-30, 1994, sponsored
by UCSD School of Medicine. It was called "The Impact of Visual
Programming in Medical Research."
( www.well.com/~abs/impact_vis_med.html )
Here is an excerpt:
...most types of computer input devices predate the computer itself,
and many date from the 19th century. The once-popular punch card
was developed in 1805 for the control of Jacquard looms. The
keyboard goes back to the manual typewriter, invented in 1829.
The joystick comes to us from the airplane, an 1890 innovation.
The buttons and paddles of today's video games are simply
electrical switches and potentiometers, understood since around
the 1820s. The audio speaker goes back to the 1876 telephone.
The cathode ray tube dates from 1855, and its use in television
is 1930s vintage. (In fact, so is the NTSC television broadcast
standard which we are still saddled with while government and
industry endlessly debate High-Definition TV). Even the high-
tech-sounding "asynchronous serial communications line" (as in
the still-popular RS-232 standard) and the near-universal ASCII
code (used for storing and sending alphanumeric text) both
originally come to us from an electro-mechanical device: the
teletypewriter. First invented in 1914, the old yellow-paper
"teletype" is still fondly remembered as the workhorse of
newspaper wire services, as well as for its more recent, somewhat
anachronistic use (well into the 1970's) as a cheap and reliable
hard-copy terminal for minicomputers.
In fact, I can only think of a handful of human interface
peripherals which were designed expressly for computers: the
light pen (almost never used any more), the track ball and its
upside-down counterpart, the mouse, the touch-sensitive screen,
the magnetic or mechanical 3D space tracker (such as those used
in virtual reality "data gloves" and "eyephones"), and a rather
obscure knob-like 3D positioning device called "the spaceball."
(Even futuristic eye-trackers, direct brainwave interfaces, and
electric field muscle motion detectors came first from the analog
electronics world.) It is easy to imagine that there are many
revolutionary human-machine interface devices waiting to be invented.
"Possibly the single most successful minicomputer design in history,
a favorite of hackers for many years, and the first major Unix
machine. The first PDP-11s (the 11/15 and 11/20) shipped in 1970
from DEC; the last (11/93 and 11/94) in 1990. Along the way, the
11 gave birth to the VAX, strongly influenced the design of
microprocessors such as the Motorola 68000 and Intel 386, and
left a permanent imprint on the C language (which has an odd
preference for octal embedded in its syntax because of the way
PDP-11 machine instructions were formatted)."
-- entry for "PDP-11" in "The Jargon File"
( www.catb.org/jargon/html/P/PDP-11.html )
In 1977, straight out of the redwoods of UC Santa Cruz, newly wed,
I took a job at a minicomputer manufacturer: Data General, in a
little New England town called Westborough, Massachusetts.
While driving east from California I read a book I'd come by somehow
that turned out to be the ideal introduction to my new job,
"Computer Lib/Dream Machines" (1974) by Ted Nelson.
( www.amazon.com/exec/obidos/ASIN/0893470023/hip-20 )
( www.digibarn.com/collections/books/computer-lib )
This amazing oversized book, as big as a "Whole Earth Catalog,"
( www.amazon.com/exec/obidos/ASIN/B001EPHEFY/hip-20 )
was actually two books in one: "Computer Lib," which sought to liberate
the "common man" from a fear of computers by offering a concise though
detailed and thoroughly understandable tutorial on the current state
of the art, and "Dream Machines," which offered a compelling vision
of the future of the human interface, including bit-mapped graphics,
smooth bit-wise scrolling, animated window opening and closing, and,
his greatest achievement, HYPERLINKS.
Upon arriving in Westborough we eventually ended up living at
1313 Summer Street, Apartment 13, with a second floor balcony that
overlooked the middle of town, a traffic circle (or, as the locals
called it, "a rotary"). Every Memorial Day the town fathers awoke
us at dawn firing cannon -- packed with powder, wadding and grass
clippings -- from the green at the center. I was reminded of an
old black-and-white TV show, "Window On Main Street" (1961),
( www.imdb.com/title/tt0054579 )
starring Robert Young of "Father Knows Best" and "Marcus Welby, M.D."
fame. I remembered it as about a journalist who moved back to his New
England home town to tell its tales. (Actually, it was set in the midwest.
Whatever.) Because of these flawed memories, it was a long-time goal
of mine to live overlooking the center of a little New England town.
So I got my wish, but it wasn't quite what I bargained for. Sure,
there was a certain segment of population that liked putting rooster
windvanes on their cupola roofs. But another segment, possibly larger,
liked breaking beer bottles on the headstones in the town cemetery.
But when I used my badge to open the security doors into the headquarters of
Data General Corporation, on an industrial park street called "Computer
Drive" (!), I entered a whole different world. Of course it had the
now-familiar fluorescent lights, acoustic-tile ceilings and cubicle farms.
Each cubicle had a teletype with yellow paper on a big roll producing
a constant hard copy record of all interactions with a timesharing
minicomputer in a machine room nearby. In college I'd had access to one
of these -- it was how I learned BASIC -- but I'd had to share it with
all of the other tens of thousands of students at the university, since
there was only one. But now I shared it with only one office mate.
(The way minicomputers leapfrogged mainframes on the technologies of
timesharing and virtual memory is a story in itself.)
And the most amazing things in that building were the people: from all
over America and all over the world they came to DG, because the company's
annual hiring needs just for programmers exceeded the annual number of
computer science graduates in the U.S. I had entered the world of HIGH
TECH, where I would stay virtually uninterrupted for the next 30 plus years.
I met DG founder Edson de Castro in June of 1977 at an open-bar reception
for the new-hires, and -- with bravado from a Tequila Sunrise -- told him
he needed to get into personal computers and computer graphics in a big way.
"Nah," he said, and assured me, "They're both going to be a flash in the pan."
One of the first things DG did was to send me down the road to Southborough
where they built the hardware (BUILT THE HARDWARE!), where I took a class
in assembly language programming. In four years of information and computer
science classes I had never heard the phrase "assembly language," which
was odd when I think about it now since I took a class in parsers, a
fundamental building block of compilers, but it was so THEORETICAL (dealing
with a formal model called LR(k) parsers)
( en.wikipedia.org/wiki/LR_parser )
that the subject of implementation never came up.
But now I was surrounded by people who wrote assembly language code every
day, and I was tasked with updating the three main manuals used by those
programmers. They taught me a lot technologically, but I also learned about their
subculture. It was during this era that I was first exposed to what
we now call "Colossal Cave Adventure"
( en.wikipedia.org/wiki/Colossal_Cave )
a text-only game that we just called "Adventure."
The main paradigm shift that occurred while I was there was the
introduction of Cathode Ray Tube (CRT) terminals to replace the
teletypes. While it meant we no longer had an automatic log of
every session, it made new types of software possible, such as
"screen editors" instead of the old "line editor."
As it turned out, during the approximate era I was at DG, a writer
was camped out with a secret hardware development project I never
even heard about until later (it was that secret). Some keen insights
into the techies of the time can be found in the resulting book,
"The Soul of a New Machine" (1981) by Tracy Kidder.
( www.amazon.com/exec/obidos/ASIN/0316491977/hip-20 )
( www.cs.clemson.edu/%7Emark/330/soulques.html )
One of the things that impressed me most about DG at the time was
its excellent Quality Assurance (QA) process. When a product was
ready to ship a product board would meet, and hear presentations
from hardware, software, documentation and QA. The first three
groups would have to attest that they had finished their jobs and
done them right, and the fourth group had to verify the other three.
Then, after sign-off, the product would be publicly announced and the
company would begin taking orders. Nothing was ever "pre-announced"
at DG. (This was a company started by an engineer.)
Our fiercest competitor was Digital Equipment Corporation (DEC), up
the brain belt in Maynard, makers of the famous PDP line, and about
to introduce their new VAX line and become the de facto monopolist
of the minicomputer market.
But we still competed for mainframe customers too, and IBM knew it.
They brought out their own mini, the Series/1, which was an abomination.
To celebrate DG's tenth anniversary in business, founder Edson de Castro
had an ad drawn up that, according to rumor, was refused by the industry
press. It said:
"They Say IBM's Entry Into Minicomputers Will Legitimatize
The Market. The Bastards Say, Welcome."
( oreilly.com/pub/a/oreilly/frank/legitimacy_1199.html )
Having just seen Star Wars (1977), I felt like part of the Rebel Alliance.
THE AFTERMATH OF THE GREAT UNBUNDLING
-- Thomas J. Watson
In 1969 IBM responded to an ongoing antitrust action by "unbundling"
its software; previously software had been "free" (or "bundled") with
the hardware purchase. Customers who had depended on IBM for all of
their software solutions now could -- or had to -- hire programmers
to write programs just for them. Thus began the first software industry
boom.
( en.wikipedia.org/wiki/History_of_IBM )
The second boom came only a few years later, when minicomputers
became cheap and powerful enough for many medium-sized institutions
to get them, including nonprofits and schools, often for small groups
of people to use.
Useful textbooks on programming began to appear. Inspired by the
English composition classic "Elements of Style" (1918,1958) by
Strunk and White,
( www.amazon.com/exec/obidos/ASIN/9562916464/hip-20 )
( www.bartleby.com/141 )
the marvelous "Elements of Programming Style" (1974) by Brian W. Kernighan
and P. J. Plauger
( www.amazon.com/exec/obidos/ASIN/0070342075/hip-20 )
is still useful today, even though it was written before objects
and has all of its examples in FORTRAN or PL/I. It was followed by
"Software Tools" (1976) by Kernighan & Plauger,
( www.amazon.com/exec/obidos/ASIN/020103669X/hip-20 )
which introduced me to the charms of what I now recognize as the UNIX
shell environment, with its pipes, redirects and simple tools; it also
gave instructions on how to duplicate some of that environment when you're
stuck on inferior systems.
This book's concrete advice helped me save a FORTRAN project gone bad
that I was hired as a consultant to fix in 1985.
I mentioned before that Ada Lovelace invented the "go to" instruction in
a computer program. According to the King James Bible, the Lord saw the
tower of Babel and said "Go to, let us go down and confound their language."
In any case, according to the then-provocative essay "Go To Statement
Considered Harmful" (1968) by Edsger Dijkstra,
( en.wikipedia.org/wiki/Considered_harmful )
the use of "goto" instructions in high level languages makes the code
difficult to read and therefore to debug. (These days we call it
"spaghetti code.") What replaces the "goto" are more structured looping
constructs such as "for/until" and "do/while" instructions. As K&P
said in "Elements of Programming Style" (op. cit.):
Everyone knows that debugging is twice as hard as writing a
program in the first place. So if you're as clever as you
can be when you write it, how will you ever debug it?
This was the beginning of a paradigm shift known as "structured programming."
In 1980 I saw a dramatic demonstration of its power by a colleague who
later became a friend, Bob B., who at the time managed a group of
programmers at Datagraphics. Using a methodology called Nassi-Shneiderman
Diagrams, Bob had his team design each of their software modules visually
and share the designs with the rest of the team in a "structured walk-through"
before he let them begin writing code. (One guy admitted he wrote his code
first and then diagrammed it -- but after the design review in his module's
walk-through he had to change it!)
( www.edrawsoft.com/Nassi-Schneiderman.php )
The resulting software -- to operate a minivan-sized laser printer purchased
from Siemens Corp. in Germany -- had no bugs. It did run a few orders of
magnitude too slow. The Vice President of Engineering, a blowhard if I ever
met one, bragged that he would recode the whole thing in assembly language
himself to "save" the project (and throw Bob under the bus). But Bob's team
profiled the code, and found a few routines ate up all the time. Those were
rewritten in assembly language and performance became acceptable. (This is
straight out of K&P.) The result was easy to debug and maintain code. (Alas
it was never used in production because, after Bob quit, the VP of Engineering
put a toady in to replace him, who accidentally deleted all the code while
trying to punish employees for keeping personal files on the computer.)
Somehow the process of software engineering takes the problems with
communications, egos, power struggles and culture wars that are always
present in engineering projects, and magnifies them while speeding them up.
One manifestation of this at the time was the struggle between proponents
of the languages PL/I and C. PL/I was designed by a committee to be all
things to all men and championed by IBM, and ended up being a bloated
abomination nobody could love; as far as I know it is now a dead
language.
( en.wikipedia.org/wiki/PL/I )
C was designed by one guy for his own use and was deliberately kept simple;
it gained a popular fan base early on and remains in wide use to this day;
I have even been paid to write C code in the last year.
( en.wikipedia.org/wiki/C_(programming_language) )
This same guy, Dennis Ritchie, teamed up with three other guys to invent
UNIX at Bell Labs (back when there was only one phone company), again for
their own use.
( en.wikipedia.org/wiki/Unix )
As I have been telling people since the 1980s, UNIX will win because it
is the only operating system not developed by a hardware vendor trying
to meet a marketing deadline. In fact, because of its "state-sponsored
monopoly" status in telephony, Bell was prohibited by law from selling any
computer hardware, software or devices. They finally got an exemption to
sell UNIX to universities as "surplus" materiel, which is how it got to
Berkeley for the next phase in its evolution, being ported to the VAX
and becoming the Berkeley Software Distribution (BSD).
Much of the culture of the minicomputer era is well-documented in
"The New Hacker's Dictionary" (3rd edition, 1996) by Eric S. Raymond.
( www.amazon.com/exec/obidos/ASIN/0262680920/hip-20 )
I'VE GOT THE PLIERS JAMMED INTO THE BOARD FRED
Spacewar! was a fairly good overall diagnostic of the PDP-1
computer and Type 30 Precision CRT Display, so DEC apparently
used it for factory testing and shipped PDP-1 computers to
customers with the Spacewar! program already loaded into the
core memory; this enabled field testing: when the PDP was
fully set up, the field representative could simultaneously
relax and do a final test of the PDP. Spacewar! was extremely
popular in the 1960s, and was widely ported to other systems.
As it required a graphical display, most of the early ports
were to other DEC platforms like the PDP-10 or PDP-11, or
various CDC machines.
-- Wikipedia entry on "Spacewar!"
( en.wikipedia.org/wiki/Spacewar! )
The minicomputer makers were mainly aiming to eat IBM's lunch in the
medium-sized business market, and they did, but they were also greeted
with enthusiasm by people who wanted to redefine the paradigm, expand
their minds and smash the state. These were people who emphatically
believed that personal computers and computer graphics would NOT be
"flashes in the pan." Ted Nelson, mentioned earlier, was proselytizing
about the mind-altering benefits of interactivity.
Douglas C. Engelbart and the researchers working with him in the
Augmentation Research Center at Stanford Research Institute (SRI),
in Menlo Park, CA, gave their famous demo in 1968, showing off
Engelbart's new "mouse" concept among other ideas,
( sloan.stanford.edu/MouseSite/1968Demo.html )
and continued in the 1970s to try to shift focus from Artificial
Intelligence (AI) to Intelligence Augmentation (IA) with computers.
The vanguard audio comedy quartet The Firesign Theatre were exposed
to a minicomputer (PDP-8?) running the popular psychiatrist simulator
"Eliza" over an acoustic modem connected to a teletype at a campus
gathering. People would type words into the teletype and Eliza would
respond with vaguely encouraging questions. It was a big hit at parties.
( en.wikipedia.org/wiki/ELIZA )
In the middle of the fun the Eliza program "crashed." The person playing
was confused by the system prompt, so the "techie" who'd instigated
this came over and typed a few commands to the operating system,
such as "SYSTAT" for system status, which returned something like:
SYSTAT: Uptime = 9:01; I have been awake for 9:32:47.
Log 5; 5 Jobs, Two Detached
When he was satisfied that everything was OK, the techie restarted
Eliza and the fun resumed.
The Firesigns were fascinated. They asked to take home the yellow TTY
paper at the end of the evening, and it became the first rough draft for
their album about a "Future Fair" which became known as "I Think We're
All Bozos On This Bus" (1971, comedy recording).
( www.amazon.com/exec/obidos/ASIN/B00005T7IT/hip-20 )
It contains this passage, when the hero reaches "Dr. Memory," the master
computer that controls the Future Fair:
SYSTAT: Uptime = 9:01; I have been awake for 9:32:47
Amylfax Shuffletime = less than 12% Freight Drain
Log 5; 5 Jobs, Two Detached
Minimum Entry Gate = 1
Total National Imbalance = 3456 Boxcars
( https://trac.ccs.neu.edu/trac/larceny/browser/tags/initial/larceny_lib/fortune.txt )
( pdp-10.trailing-edge.com/BB-PBDEB-BB_1990/01/10,7/system/systat.hlp )
Hardware hackers in colleges and universities built the first video
games such as "Spacewar!" using minicomputers. Because they cost less
than mainframes, it was easier to get permission to "mod" them (or it
was easier to get access, which was sometimes used instead of permission).
The first video gamers were sci-fi fans of E. E. "Doc" Smith's "Lensman"
series and wanted to experience spaceflight as he described it.
( www.amazon.com/exec/obidos/ASIN/0843959495/hip-20 )
In the wake of the "Summer of Love" the San Francisco Bay Area was a
wellspring of innovation in the early 1970s. One techno-utopian
group was the "Peoples Computer Company"
( en.wikipedia.org/wiki/People%27s_Computer_Company )
( www.digibarn.com/collections/newsletters/peoples-computer/index.html )
which sought to bring computer power to the people. If the secret to
social change was "Organize!" then maybe computers could help us do it.
A related project was the "Community Memory" project, which was the
first computer bulletin board system.
( en.wikipedia.org/wiki/Community_Memory )
Meanwhile, my friend Kim Levitt had moved to Washington, D.C.,
and was working for Housing and Urban Development (HUD) on a proposal
for an "information utility" that sounded a lot like today's internet,
with experiments in community bulletin boards as prototypes.
One embarrassing fact about these early democratizing experiments was that
they were ridiculously expensive. We were still waiting for Moore's Law
to make the computers cheap enough to be personal.
Many of these stories are told in "Hackers: Heroes of the Computer
Revolution" (1984) by Stephen Levy,
( www.amazon.com/exec/obidos/ASIN/0385191952/hip-20 )
( www.autistici.org/2000-maniax/texts/s.%20levy%20-%20hackers%20(full%20text).txt )
and "What the Dormouse Said: How the Sixties Counterculture Shaped the
Personal Computer Industry" (2006) by John Markoff.
( www.amazon.com/exec/obidos/ASIN/0143036769/hip-20 )
THE PERSONAL COMPUTER ERA
"There is no reason for any individual to have a computer in his home."
-- Ken Olsen, founder and CEO of DEC, 1977, who threatened
to fire anyone heard saying "personal computer" at the company
( www.snopes.com/quotes/kenolsen.asp )
In 1979 I moved back to California and continued to work with minicomputers,
as well as occasional mainframes, but I found ways to PLAY with the new,
almost toy-like, Personal Computers (PCs). This included the MITS
Altair with its 8080 chip and CP/M operating system, the Radio Shack
TRS-80 (we called it the "trash-80"), the Apple II (I bought one of
these for about $3000), the Atari 400 and 800, the Amiga, and of course
the IBM PC.
One thing I learned during these explorations was that if someone was
computer phobic, a good antidote was the old "Colossal Cave Adventure"
now ported from the minicomputers. This game came with virtually no
documentation. The only way to learn anything at all was to jump in
and make mistakes. And you couldn't hurt anything.
Here is a typical game fragment:
You are standing at the end of a road before a small brick building.
Around you is a forest. A small stream flows out of the building and
down a gully.
That's not something you can enter.
You can't see any such thing.
What do you want to go in?
That's not something you can enter.
And so on. (BTW, the game is now available for on-line play.)
( jerz.setonhill.edu/if/gallery/adventure/index.html )
Every PC I played with had some kind of text-based command line interface,
even if it was brain-numbingly simple. Little operating system interfaces
like Control Program for Microcomputers (CP/M), Microsoft Disk Operating
System (MS-DOS), the Apple II's meager Apple DOS, Amiga's Command Line
Interpreter (CLI), and so on, required memorized key presses to accomplish
tasks like copying a file or running a program. In this way they were
similar to minicomputers with their CLIs and mainframes with their Job
Control Languages (JCL).
But what was new is that these little "toys" could display COMPUTER
GRAPHICS! It was slow, it was low-res, it wasn't always in color,
but it was GRAPHICS!
(Prior to this, for most of us, the word "graphics" meant using
a non-proportional-spaced printer such as a "Daisy Wheel" or "Golf Ball"
to print out a Snoopy calendar, with the picture of Snoopy made of
punctuation marks, what we now call "ASCII art".)
( www.travelnotes.de/rays/fortran/fortran.htm )
Of course there was no internet back then, so we had to scrounge
information where we could. I bought books about my Apple II, and began
learning its 6502 assembly language. When my friend Bob S. got his first
Atari, he found a mimeographed "book" called "De Re Atari" (Latin for
"concerning Atari") that revealed its secrets. I subscribed to a few
magazines, including BYTE and Creative Computing. After some good
reviews I bought a little language for my Apple II called "GraFORTH,"
a version of the FORTH language that did interactive 3D (monochrome)
graphics.
( hopl.murdoch.edu.au/showlanguage.prx?exp=7000&language=GraFORTH )
I'd done similar work in college on a goofball Burroughs mini, but now I had
it in my own den, and could stay up all night animating space shuttle landings.
I read an article in Creative Computing about a guy who worked at an
electric utility. Their mainframe had a database of the locations of
all the buried power cables in their jurisdiction, but it didn't do
graphics. He had an Apple II at home that did pretty good graphics,
but had tiny floppy disks and couldn't store the database. So he
connected them with a modem, and wrote early versions of "client/server"
applications, and was able to show where not to dig on his home screen.
He did it on his own time as a proof of concept, and wrote it up for
hobbyists after his bosses were underwhelmed.
I took notice, and thought this must be the architecture of the future:
data in a high-powered, centralized database on the server, and a local
graphics engine to manipulate a small portion of the data on the client side.
Today many programs, including the popular application Google Earth,
use this same architecture.
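The split described above can be sketched in a few lines of modern Python.
This is a hypothetical toy, not the utility worker's actual program: the
"server" holds the full cable database, the "client" fetches only the small
bounding-box slice it needs and renders it locally. All names and data are
illustrative, and the modem link is simulated with a plain function call.

```python
# --- "Server" side: the authoritative database of buried cables ---
CABLE_DB = [
    {"id": 1, "x": 10, "y": 20},
    {"id": 2, "x": 55, "y": 60},
    {"id": 3, "x": 12, "y": 25},
]

def query_cables(xmin, ymin, xmax, ymax):
    """Return only the cables inside the requested bounding box,
    standing in for a query sent over the modem link."""
    return [c for c in CABLE_DB
            if xmin <= c["x"] <= xmax and ymin <= c["y"] <= ymax]

# --- "Client" side: a local graphics engine with limited storage ---
def render(cables):
    """Stand-in for the home computer's graphics: one line per cable."""
    return ["cable %d at (%d, %d)" % (c["id"], c["x"], c["y"])
            for c in cables]

# The client asks the server for just the neighborhood it is drawing,
# so it never has to store (or transfer) the whole database.
nearby = query_cables(0, 0, 30, 30)
for line in render(nearby):
    print(line)
```

The design point is the same one the article draws: the heavy data stays
centralized, and only a query result small enough for the client's storage
travels over the wire.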
TO BE CONTINUED...
In 2008 two new books were published about Gregory Bateson: Jesper Hoffmeyer's
"A Legacy for Living Systems: Gregory Bateson as Precursor to Biosemiotics,"
( www.amazon.com/exec/obidos/ASIN/1402067054/hip-20 )
and Noel G. Charlton's "Understanding Gregory Bateson: Mind, Beauty, and the
Sacred Earth (S U N Y Series in Environmental Philosophy and Ethics)"
( www.amazon.com/exec/obidos/ASIN/0791474526/hip-20 )
Bateson of course was a founder of the science of cybernetics.
Also as of 2008 the Ross Ashby Digital Archive is now on-line.
( www.rossashby.info )
Ashby, of course, is author of one of the best textbooks on cybernetics for
beginners, "An Introduction to Cybernetics" (1964).
( www.amazon.com/exec/obidos/ASIN/0416683002/hip-20 )
The February 2009 issue of "The International Journal of General Systems"
is a special issue about "The Intellectual Legacy of W. Ross Ashby".
( www.informaworld.com/smpp/title~content=t713642931~db=all )
Privacy Promise: Your email address will never be sold or given to
others. You will receive only the e-Zine C3M from me, Alan Scrivener,
at most once per month. It may contain commercial offers from me.
To cancel the e-Zine send the subject line "unsubscribe" to me.
I receive a commission on everything you purchase from Amazon.com after
following one of my links, which helps to support my research.
Copyright 2009 by Alan B. Scrivener