Cybernetics in the 3rd Millennium (C3M) --- Volume 8 Number 2, July 2009
Alan B. Scrivener --- www.well.com/~abs --- mailto:email@example.com
~ or ~
Architecture in Buildings and Software
[If you haven't read parts one and two, see the archives, listed at the end.]
THE WORKSTATION ERA
"As the UNIX system has spread, the fraction of its
users who are skilled in its application has decreased."
-- Brian W. Kernighan and Rob Pike, 1984
"The UNIX Programming Environment"
( www.amazon.com/exec/obidos/ASIN/013937681X/hip-20 )
In the early 1980s, while consumers and small businesses which had never
been able to afford computers before were discovering the benefits of IBM PCs
and dot matrix printers for generating invoices (especially when compared
to typewriters and carbon paper), the historic users of bigger computers,
many of whom had already switched from mainframes to minicomputers, began buying
little UNIX-like systems. Often they were built into the cases of DEC
VT-100 terminals, and used the new 68000 chip as CPU. The UNIX was usually
an unlicensed workalike system. One of the weekly computer newspapers
even published a satirical article on the avalanche of 68000-based UNIX
workalikes in VT-100 terminal enclosures, concluding by claiming there was
a high school class in Braintree, Massachusetts that was selling their own.
One advantage of these little critters was they came with all the UNIX
timesharing and networking tools in place. You could use it from the
"console" (the screen and keyboard of the VT-100 terminal it was crammed
into) but you could also hook up RS-232 "serial" cables to other terminals
and log in that way, or even hook up an Ethernet network (the one surviving
technology from Xerox PARC's experiments with the Alto and Star systems)
( en.wikipedia.org/wiki/Xerox_Alto )
and log in from another computer! But what these little critters didn't
have was any graphics (beyond the ubiquitous Snoopy calendar made of
printed out symbols).
Before this, UNIX had only run on minicomputers, and only a small fraction
of the buyers of the minis put UNIX on them -- most used the manufacturer's
operating system, such as the ubiquitous RSTS on DEC's PDP-11 models.
But these new critters came with fake UNIX by default. The user base
grew. And along with UNIX came a mind-set and a community. Networked
UNIX users created the email protocols and tools like the File Transfer
Protocol (FTP), as well as newsgroups such as rec.arts.sf (recreational ->
arts -> science fiction).
It was during this era that "The UNIX Programming Environment" (1984)
by Brian W. Kernighan and Rob Pike
( www.amazon.com/exec/obidos/ASIN/013937681X/hip-20 )
came out; I am currently re-reading it and it still is relevant to programmers
who use UNIX (with a slant towards the C language, but not exclusively).
It was also during this era that I began to learn the UNIX environment.
In 1983 I was a tech writer at a company called GTI (formerly Glass Tight
Industries) in San Diego, which bought a DEC VAX and put a copy of the
Berkeley Software Distribution (BSD) UNIX on it from a company called "Uniq,"
a boutique Berkeley UNIX reseller that took its name from a UNIX utility:
UNIQ(1) BSD General Commands Manual UNIQ(1)
uniq -- report or filter out repeated lines in a file
uniq [-c | -d | -u] [-f fields] [-s chars] [input_file [output_file]]
The uniq utility reads the standard input comparing adjacent lines,
and writes a copy of each unique input line to the standard output.
The second and succeeding copies of identical adjacent input lines
are not written. Repeated lines in the input will not be detected
if they are not adjacent, so it may be necessary to sort the files first.
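That last sentence is the practical gotcha: uniq only compares adjacent lines, so it is almost always paired with sort. A minimal sketch:

```shell
# uniq collapses only *adjacent* duplicates, so sort first;
# "uniq -c" prefixes each surviving line with its repeat count
printf 'pear\napple\npear\napple\napple\n' | sort | uniq -c
```

This prints a count of 3 for "apple" and 2 for "pear"; without the sort, the interleaved duplicates would all survive.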
I was fortunate that there were some patient people who helped me learn
the oral tradition that surrounds UNIX, including Dan E., Phil M. and
Jeff L. (who kept saying "I am not a manual" but helped me anyway).
One thing I learned was that commands you could type from the shell
had a pattern to their abbreviations. Because they evolved during
the teletype era (when printing each character took an appreciable fraction of a second,
"ca-thump"), commands were as short as possible. Not always, but often,
the abbreviation was two letters: the first letter followed by the next
consonant. For example:
ar - archive
as - assembler
at - run a job at a given time
cc - C compiler
cp - copy
ed - edit
ln - link
ls - list
mv - move
rm - remove
sh - shell
vi - visual editor
I have never seen this written down anywhere.
On a less trivial level, I learned that a vital design principle
of UNIX is that programs are small and designed to work together.
A program never knows whether its inputs and outputs come from and
go to a file, another program, or a human, nor should it care.
This makes it easy to do things like:
du -a | sort -nr | head
which shows the ten biggest files and directories, using "disk utilization"
with "a" (all files) selected, piped through "sort" doing a "n" (numeric)
sort in "r" (reverse) order, and then piped through "head", which by default
shows the first ten lines of its input.
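That indifference to where the bytes come from is easy to demonstrate: a filter like "sort" behaves identically whether it reads a named file or a pipe. A small sketch (the file name is invented):

```shell
# the same "sort -nr" works on a file argument or on a pipe
printf '12 beta\n3 gamma\n400 alpha\n' > sizes.txt
sort -nr sizes.txt            # reads the named file
cat sizes.txt | sort -nr      # reads standard input; same output
rm sizes.txt
```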
THE W.I.M.P. ERA
"Windows '95 = Macintosh '84"
-- Geek graffiti, 1995
"If it's soft and hard to find, it's Wimpy's!"
-- The Firesign Theater, 1985
"Eat Or Be Eaten"
( www.amazon.com/exec/obidos/ASIN/B001GBYJZO/hip-20 )
User interface professionals use the acronym W.I.M.P., for windows-icons-
mouse-pointing, to refer to the Graphical User Interface we now see on most
computers. The tale is oft-told of the development of this interface.
How detailed your story is depends on how far back you begin. A joke goes:
As an example of how to profit from research and development (R&D),
look at graphical computer interfaces: Xerox did the research,
Apple did the development, and Microsoft made the profit.
Indeed, later, after Pepsi executive John Sculley was recruited to Apple by
Steve Jobs and then forced Jobs out, his big idea for making money was
to sue Microsoft for ripping off the "look and feel" of the Mac. As
the Wikipedia article
( en.wikipedia.org/wiki/Apple_Computer,_Inc._v._Microsoft_Corporation )
explains:
Midway through the suit, Xerox filed a lawsuit against Apple,
claiming Apple had infringed copyrights Xerox held on its GUIs.
Xerox had invested in Apple and had invited the Macintosh design
team to view their GUI computers at the PARC research lab; these
visits had been very influential on the development of the Macintosh
GUI. Xerox's lawsuit appeared to be a defensive move to ensure
that if Apple v. Microsoft established that "look and feel" was
copyrightable, then Xerox would be the primary beneficiary, rather
than Apple. The Xerox case was dismissed because the three year
statute of limitations had passed.
But of course you can begin the story farther back, with Doug Engelbart,
( en.wikipedia.org/wiki/Doug_Engelbart )
or with Ted Nelson,
( en.wikipedia.org/wiki/Ted_Nelson )
or even with Vannevar Bush.
( en.wikipedia.org/wiki/Vannevar_Bush )
But there's no denying that Steve Jobs' Apple, followed by Bill Gates'
Microsoft, brought W.I.M.P. to the masses. And therein lies the other,
invisible half of the story (says the cyberneticist). People had to
LEARN how to use this interface, usually by being shown by someone
else. We all learned quickly, and forgot that we did, but it happened.
Once in the late eighties I was visiting my friend Tony Bove, who had
just produced an interactive CD-ROM
( en.wikipedia.org/wiki/CD-ROM )
called "Haight Ashbury in the Sixties."
( www.rockument.com/ha.html )
I showed up to visit him in at his high-tech compound -- in the redwoods
of Gualala, north of San Francisco on the Pacific coast -- and he was in
the middle of giving a demo of the CD-ROM to his neighbor. I patiently
watched as well, since I hadn't seen it yet and was interested. About
a half-hour after I arrived, the neighbor said, "Well I'll be danged!
I just noticed that when you move that little handle there [the mouse]
that little arrow on the screen [the mouse cursor] moves the same way!"
Now, I'm not trying to make fun of hicks from the sticks (well, not
too much) because this was when most people didn't have computers,
and most of those who did didn't have mice. Nowadays the mouse cursor
is such a familiar symbol you can get a kite shaped like it.
( www.windfiredesigns.com/timbofolio_pages/PointerKite.html )
The point is that these things must be learned, and a generation of users
that has been learning all along sometimes needs to pass the culture
on to newbies.
The anthropology memoir "Oh, What a Blow That Phantom Gave Me!" (1973) by
Edmund Carpenter
( www.amazon.com/exec/obidos/ASIN/B000EG5S9S/hip-20 )
chronicles a number of events in which stone-age tribes had their
first contact with high-tech media, such as instant photos and motion
picture cameras and projectors. He described how tribesmen in New
Guinea had to be taught to read a Polaroid photo of a face.
Stone axes were still in use when we arrived; cameras and recorders
were absolutely unknown.
We gave each person a Polaroid shot of himself. At first there was no
understanding. The photographs were black & white, flat, static, odorless
-- far removed from any reality they knew. They had to be taught to
"read" them. I pointed to a nose in a picture, then touched the real
nose, etc. Often one or more boys would intrude into the scene, peering
intently from picture to subject, then shout, "It's you!"
Recognition gradually came into the subject's face. And fear.
Suddenly he covered his mouth, ducked his head & turned his body
away. After this first startled response, often repeated several
times, he either stood transfixed, staring at his image, only his
stomach muscles betraying tension, or he retreated from the group,
pressing his photograph against his chest, showing it to no one,
slipping away to study it in solitude.
Over the 11 years from Macintosh '84 to Windows '95, a significant
portion of the population of the English-speaking world (and eventually
many others) learned the W.I.M.P. paradigm.
Apple sold every reporter a PowerBook notebook computer,
( en.wikipedia.org/wiki/Powerbook )
and later sold every pro graphic artist, publisher and prepress house a
high-end Mac with ColorSync,
( en.wikipedia.org/wiki/Colorsync )
( developer.apple.com/technotes/tn/tn2035.html )
and still later sold every professional video editor a high-end Mac
with a huge disk farm and a copy of Final Cut Pro.
( en.wikipedia.org/wiki/Final_Cut_Pro )
Microsoft sold every gamer a souped-up PC with a graphics video card,
( en.wikipedia.org/wiki/Graphics_card )
and every office worker a copy of some form of Windows (mostly '95),
( en.wikipedia.org/wiki/Windows_95 )
starting them on the upgrade path that would lead them to Windows 98,
NT, ME, 2000, XP and/or Vista.
And so, computers finally came to the masses.
THE GRAPHICAL WORKSTATION ERA
"X [Windows] is primarily a protocol and graphics primitives
definition and it deliberately contains no specification for
application user interface design, such as button, menu, or
window title bar styles. Instead, application software -- such
as window managers, GUI widget toolkits and desktop environments,
or application-specific graphical user interfaces - define and
provide such details. As a result, there is no typical X interface
and several desktop environments have been popular among users."
-- Wikipedia article on X Windows
( en.wikipedia.org/wiki/X_windows )
I think it was Sun Microsystems that introduced the first graphical
UNIX workstation, with bitmap graphics not unlike the Mac; but though
they claimed to be based on standards, and offered a standard UNIX, they
built their own proprietary graphical windowing system.
( en.wikipedia.org/wiki/Sun_microsystems )
The early days of Sun are described in "Sunburst: The Ascent of Sun
Microsystems" (1991) by Mark Hall and John Barry.
( www.amazon.com/exec/obidos/ASIN/0809239892/hip-20 )
But it didn't take long for a standard to emerge. Some researchers, tired
of having better graphics on their $3000 home computers than on their
$100,000 minicomputers at work, banded together and created a new
windowing standard. The way I heard it was that DEC and some others
were funding research on climate change, and they wanted to make their
collaborative research easier, so they took a detour to create a new
cross-platform networked windowing system, with a group called Project
Athena to accomplish it.
( en.wikipedia.org/wiki/Project_Athena )
They came up with the W window system, followed by X, versions 1
through 11, the last of which became the de facto standard, still in use
today on Linux systems and elsewhere: X11.
( en.wikipedia.org/wiki/X_Window_System )
New graphics workstation vendors began to enter the market, such as
Silicon Graphics, Apollo, Hewlett-Packard, IBM and even DEC, which embraced
X11 but not the underlying UNIX networking protocol TCP/IP, instead
using their own proprietary DECNet (which contributed to their downfall).
The momentum of X11 was great enough to get Sun to come around, and chuck
their original windowing system.
The problem with X11 is that it defines a graphical windowing interface
from the machine's point of view, but not from the human's. As the
above epigraph indicates, the "marketplace of ideas" was supposed to pick
the winning windowing paradigm by some sort of Darwinian natural selection.
I worked for a company that sold software for all these different graphical
workstations, and I did demos; I can't remember how many different key
bindings I had to memorize, involving Alt, Control, Shift, and the Left,
Middle and Right mouse buttons, in order to raise, lower, iconify, de-iconify,
move, resize, and close windows. The war over operating systems and then
windowing systems was repeated yet again with Window Managers. This chaos
survives to this day in the Linux world, in the competition between the GNOME
( en.wikipedia.org/wiki/GNOME )
and KDE desktops.
( en.wikipedia.org/wiki/Kde )
THE OBJECT-ORIENTED REVOLUTION
"Class Object is the root of the class hierarchy.
Every class has Object as a superclass. All objects,
including arrays, implement the methods of this class."
-- Java 2 Platform Standard Ed. 5.0 documentation page for
java.lang Class Object (java.lang.Object)
( java.sun.com/j2se/1.5.0/docs/api/java/lang/Object.html )
If the above is gibberish to you, don't worry; it is to most people.
This is the main marketing problem that the "Object-Oriented" revolution
has had from the get-go. Programmers who barely understood the concepts
tried to sell them to managers who were baffled. The real problem
here is that programmers are routinely called upon to create abstractions
that have never existed before in the history of civilization. Sometimes
they run out of creative "juice" and give these abstractions uninformative
or even misleading names. That appears to have happened here. I am
reminded of the physicists who've given their subatomic particles properties
such as color, charm and strangeness.
I explained to my wife once what objects were all about. I told her
about encapsulation and data hiding,
( en.wikipedia.org/wiki/Encapsulation_(computer_science%29 )
and inheritance and the power of defaults,
( en.wikipedia.org/wiki/Inheritance_(computer_science%29 )
and polymorphism and how it simplifies the name space,
and she said it all sounded logical to her. "How did it used to work,
before objects?" she asked. Her expression grew more and more
horrified as I described cut-and-paste coding, functions with
lots of switch/case blocks to handle lots of differences in details,
global variable abuse and confusion, and the problem of fixing a bug
in one place but not others. None of these problems is automatically
solved by Objects, but the paradigm makes it easier to do things right.
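The switch/case style she found so horrifying can even be sketched in shell: every operation on a "shape" repeats the same hand-rolled type dispatch, and adding a new shape means hunting down every case block. The shapes below are invented for illustration:

```shell
# pre-OO style: each function re-tests a "type" tag by hand;
# polymorphism moves this dispatch into the language itself
area() {
  case "$1" in
    square) echo $(( $2 * $2 )) ;;   # side * side
    rect)   echo $(( $2 * $3 )) ;;   # width * height
    *)      echo "unknown shape: $1" >&2; return 1 ;;
  esac
}
area square 4    # 16
area rect 3 5    # 15
```

Every other operation (perimeter, draw, ...) would need its own copy of that case block, which is exactly the fix-it-here-but-not-there bug factory described above.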
The biggest cultural problem with Objects was that after initially
resisting the approach, some managers embraced it as a "Magic Bullet"
that would improve the productivity of expensive programmers or --
even better -- allow them to be replaced with cheaper ones.
This and other idiocies are described in my friend Bruce Webster's
"Pitfalls of Object-Oriented Development" (1995).
( www.amazon.com/exec/obidos/ASIN/1558513973/hip-20 )
The first Object-Oriented (OO) language I learned was C++, which I
thought wasn't OO enough. The old ways of doing things, in the plain old
non-OO C language, were still around, tempting the programmer to avoid
making the conceptual leap.
The next OO language I learned was Smalltalk, which was the original
"fully-OO" computer language, from academia. I found it to be TOO MUCH
in the "pure OO" camp. You can override the methods that define an integer's
behavior and make 1+1=3. WTF? Why hang yourself?
Eventually Sun managed to pull the Java language out of a hat. It had
originally been a language called Oak, made for TV-set-top boxes. The
project imploded, but Sun renamed, repurposed, and remarketed Java as
"the language of the internet" (more on that later). It had a zippy
name, OF COURSE it had objects (which came in under the radar for
most managers) and it was the NEXT BIG THING!
It has even survived to (finally) deserve much of its original hype.
If you want to learn the Java language, the best book I've found (after
rejecting several) is "Java By Example" (1996) by Clayton Walnum.
( www.amazon.com/exec/obidos/ASIN/0789708140/hip-20 )
I used it as a guide to re-implement some of the old programs from
Kernighan & Plauger's "Software Tools" (1976).
( www.amazon.com/exec/obidos/ASIN/020103669X/hip-20 )
The OO revolution, and Java in particular, was instrumental in tilting
certain advantages away from Microsoft and towards the graphics workstations.
Microsoft recognized this and fought back with a plan to:
"Kill cross-platform Java by grow[ing] the polluted Java market."
This came out in evidence when Sun sued them, according to Wikipedia.
( en.wikipedia.org/wiki/Microsoft_Java )
THE EVIL EMPIRE VS THE JEDIS?
"It's not about pop culture, and it's not about fooling people,
and it's not about convincing people that they want something
they don't. We figure out what we want. And I think we're pretty
good at having the right discipline to think through whether a
lot of other people are going to want it, too. That's what we
get paid to do."
-- Steve Jobs
( mariosundar.wordpress.com/2008/08/14/top-10-steve-jobs-quotes )
I am often asked if I am a "Mac guy" or a "PC guy," as if all people could
ever drink is Coke or Pepsi. ("I could've had a V8!") I usually tell them
this story: In the first decade of my career I wrote a lot of code that is
lost to me now; some because it was proprietary to my employer, but most
because I lost access to the environment (language compiler, libraries,
operating system) required for the programs to run. Starting in 1983
I began writing C code for UNIX. (That year I was given a fake car license
plate modeled after New Hampshire's, that said "Live free or die! UNIX.")
To my astonishment this code has survived; Linus Torvalds' port of UNIX to
the PC, the one they call Linux, made this possible. Since then some other
cross-platform solutions have come along, such as Java, Perl, and ports of
UNIX shell and utilities to other platforms (such as Cygwin for Windows),
( www.cygnus.com )
and I have been able to branch out. (Good old Moore's Law finally made the
chips fast enough to run INTERPRETED languages, and the scripting revolution
is off and running with Perl, Ruby, Python, and so on.) But on or near my
desk I have a PC running an old Windows (2000), a Mac running OS X, an older
Mac running OS 9, and a repurposed PC running Red Hat Linux. I also have
access to a work laptop running Windows XP. I routinely test my code on all
of them. Thanks to Java Applets I can deploy web-based code on all five,
as well as locally run programs. I'm betting on all the horses at this
point. I'm a "keep my code alive" kind of guy.
But if you badger me enough you can get me to admit I don't like Microsoft.
Why? Because they don't like me. They have never been innovators and they
bought every significant technology they ever deployed -- except Clippy
the Paper Clip, remember him? They've gotten ahead by being bullies.
Their business strategies have included a "DR-DOS killer" (look it up),
a "Visicalc killer" (Excel), a "Novell killer" (Windows for Workgroups 3.11),
a "Netscape killer" (Internet Explorer), a "RealMedia killer" (Windows Media),
a "Google killer" (first MSN, then Live Search, now Bing), and an
"iPod killer" (the ever-so-hip Zune music player). They've punished
partners, end-users, even big customers, for ever using any microcomputer
software that Microsoft didn't sell. An enterprising bloke at the
European Union, who's been paying attention for a LONG TIME, put together
this stunning summary of their anti-competitive shenanigans:
( www.ecis.eu/documents/Finalversion_Consumerchoicepaper.pdf )
There's a lot of stuff at Groklaw too:
( www.groklaw.net/staticpages/index.php?page=2005010107100653 )
Don't get me started.
And on top of that they make crappy products too. I have a friend named
Chuck who's a lawyer. In the late 1990s his law partner brought in new
PCs for the office running the Windows NT operating system and an Excel
"database," both from Microsoft. On several occasions they got the
"blue screen of death" and lost all of their data, which made it kind of
hard to bill for those billable hours, etc.
( www.iconocast.com/EB000000000000099/V1/News1_1.jpg )
"You're a computer guy," he said, "Why are these things so unreliable?"
"They're not all unreliable," I replied. "Some Linux systems have been
up for years." And then I asked him if he and his partner had gotten
advice from any COMPUTER professionals (not other lawyers) on what to
buy. They hadn't. "Well, as a computer professional, I advise you
to convert to Linux, and get the free StarOffice suite from Sun."
He said he'd talk to his partner about it. (The partner decided to
stick with the devil he knew.)
If YOU decide to try going cold turkey on those turkeys in Redmond, my
old buddy Tony Bove has just the book for you, "Just Say No to Microsoft:
How to Ditch Microsoft and Why It's Not as Hard as You Think" (2005).
( www.amazon.com/exec/obidos/ASIN/159327064X/hip-20 )
So after enough of this ranting people ask me, "So, you're an Apple guy,
right?" That's when I have to explain to them that if Apple had won the
personal computing wars, and they had a 90% monopoly market share, things
would be EVEN WORSE.
True, Apple makes better computers, but they also cost more and are even
more closed than Microsoft, because they control the hardware AND software.
Their products are better because of this control, and because they pay
more than lip service to usability, and usability testing, and because
Steve Jobs personally tries everything before it ships and delays it until
it's right by HIS standards. (As we say in the biz, "he eats his own
dogfood.") I swear, this guy seems like the only high-tech CEO in the
business who can get his engineers to do what he wants. Gates sure
couldn't -- read the leaked memos.
( gizmodo.com/5019516/classic-clips-bill-gates-chews-out-microsoft-over-xp )
But just because Apple is user-friendly doesn't mean it's the user's friend.
Always watch your wallet when dealing with a vendor, "Caveat emptor," let
the buyer beware, that's what I say.
Excruciatingly detailed accounts of the personal computing wars can be found
in "Accidental Empires: How the Boys of Silicon Valley Make Their Millions,
Battle Foreign Competition, and Still Can't Get a Date" (1996) by
Robert X. Cringely,
( www.amazon.com/exec/obidos/ASIN/0887308554/hip-20 )
"Microserfs" (novel, 1996) by Douglas Coupland,
( www.amazon.com/exec/obidos/ASIN/0060987049/hip-20 )
"Insanely Great: The Life and Times of Macintosh, the Computer That
Changed Everything" (2000) by Steven Levy
( www.amazon.com/exec/obidos/ASIN/0140291776/hip-20 )
and "iCon Steve Jobs: The Greatest Second Act in the History of
Business" (2006) by Jeffrey S. Young and William L. Simon.
( www.amazon.com/exec/obidos/ASIN/0471787841/hip-20 )
THE TRUTH ABOUT PLAIN TEXT
Imagine a crossroads where four competing auto dealerships
are situated. One of them (Microsoft) is much, much bigger
than the others. It started out years ago selling three-speed
bicycles (MS-DOS); these were not perfect, but they worked,
and when they broke you could easily fix them.
There was a competing bicycle dealership next door (Apple)
that one day began selling motorized vehicles--expensive but
attractively styled cars with their innards hermetically
sealed, so that how they worked was something of a mystery.
The big dealership responded by rushing a moped upgrade kit
(the original Windows) onto the market. This was a Rube Goldberg
contraption that, when bolted onto a three-speed bicycle, enabled
it to keep up, just barely, with Apple-cars. The users had to wear
goggles and were always picking bugs out of their teeth while
Apple owners sped along in hermetically sealed comfort, sneering
out the windows. But the Micro-mopeds were cheap, and easy to fix
compared with the Apple-cars, and their market share waxed.
Eventually the big dealership came out with a full-fledged car:
a colossal station wagon (Windows 95). It had all the aesthetic
appeal of a Soviet worker housing block, it leaked oil and blew
gaskets, and it was an enormous success. A little later, they
also came out with a hulking off-road vehicle intended for
industrial users (Windows NT) which was no more beautiful than
the station wagon, and only a little more reliable.
Since then there has been a lot of noise and shouting, but little
has changed. The smaller dealership continues to sell sleek
Euro-styled sedans and to spend a lot of money on advertising
campaigns. They have had GOING OUT OF BUSINESS! signs taped up
in their windows for so long that they have gotten all yellow
and curly. The big one keeps making bigger and bigger station
wagons and ORVs.
On the other side of the road are two competitors that have come
along more recently.
One of them (Be, Inc.) is selling fully operational Batmobiles
(the BeOS). They are more beautiful and stylish even than the
Euro-sedans, better designed, more technologically advanced, and
at least as reliable as anything else on the market -- and yet
cheaper than the others.
With one exception, that is: Linux, which is right next door,
and which is not a business at all. It's a bunch of RVs, yurts,
tepees, and geodesic domes set up in a field and organized by
consensus. The people who live there are making tanks. These
are not old-fashioned, cast-iron Soviet tanks; these are more
like the M1 tanks of the U.S. Army, made of space-age materials
and jammed with sophisticated technology from one end to the
other. But they are better than Army tanks. They've been modified
in such a way that they never, ever break down, are light and
maneuverable enough to use on ordinary streets, and use no more
fuel than a subcompact car. These tanks are being cranked out,
on the spot, at a terrific pace, and a vast number of them are
lined up along the edge of the road with keys in the ignition.
Anyone who wants can simply climb into one and drive it away.
Customers come to this crossroads in throngs, day and night.
Ninety percent of them go straight to the biggest dealership
and buy station wagons or off-road vehicles. They do not even
look at the other dealerships.
Of the remaining ten percent, most go and buy a sleek Euro-sedan,
pausing only to turn up their noses at the philistines going to buy
the station wagons and ORVs. If they even notice the people on the
opposite side of the road, selling the cheaper, technically
superior vehicles, these customers deride them as cranks and half-wits.
The Batmobile outlet sells a few vehicles to the occasional car nut
who wants a second vehicle to go with his station wagon, but seems
to accept, at least for now, that it's a fringe player.
The group giving away the free tanks only stays alive because it is
staffed by volunteers, who are lined up at the edge of the street with
bullhorns, trying to draw customers' attention to this incredible
situation. A typical conversation goes something like this:
Hacker with bullhorn: "Save your money! Accept one of our free tanks!
It is invulnerable, and can drive across rocks and swamps at ninety
miles an hour while getting a hundred miles to the gallon!"
Prospective station wagon buyer: "I know what you say is true...but...
er...I don't know how to maintain a tank!"
Bullhorn: "You don't know how to maintain a station wagon either!"
Buyer: "But this dealership has mechanics on staff. If something
goes wrong with my station wagon, I can take a day off work, bring
it here, and pay them to work on it while I sit in the waiting room
for hours, listening to elevator music."
Bullhorn: "But if you accept one of our free tanks we will send
volunteers to your house to fix it for free while you sleep!"
Buyer: "Stay away from my house, you freak!"
Buyer: "Can't you see that everyone is buying station wagons?"
-- Neal Stephenson, 1999
"In the Beginning Was the Command Line"
( www.amazon.com/exec/obidos/ASIN/0380815931/hip-20 )
( www.cryptonomicon.com/beginning.html )
( artlung.com/smorgasborg/C_R_Y_P_T_O_N_O_M_I_C_O_N.shtml )
This excerpt is from a non-fiction book by post-cyberpunk sci-fi author
Neal Stephenson,
( wapedia.mobi/en/Neal_Stephenson )
famed for "Snow Crash" (1992),
( www.amazon.com/exec/obidos/ASIN/B001I98XAQ/hip-20 )
"Interface" (1994), with J. Frederick George,
( www.amazon.com/exec/obidos/ASIN/0553383434/hip-20 )
"The Diamond Age: Or, a Young Lady's Illustrated Primer" (1995),
( www.amazon.com/exec/obidos/ASIN/0553380966/hip-20 )
( www.amazon.com/exec/obidos/ASIN/0060512806/hip-20 )
and "The Baroque Cycle" (2004),
( www.amazon.com/exec/obidos/ASIN/0060593083/hip-20 )
among others. Unlike Gibson, who started on an electric typewriter,
Stephenson is a coder, and brings a hacker's sensibilities to his ideas.
The key idea of this screed is that if you want to be a code wizard,
at some point you have to move the Graphical User Interface (GUI)
aside, and interact with the computer MORE DIRECTLY with a keyboard,
because underneath every GUI is a Command Line Interface (CLI).
The ten-year-old fable quoted above needs updating, because the OS
landscape has changed, but the big picture remains much the same.
As the Wikipedia article
( en.wikipedia.org/wiki/In_The_Beginning_Was_The_Command_Line )
notes:
The essay was written before the advent of Mac OS X. In a "Slashdot"
interview on 2004-10-20, he remarked:
I embraced OS X as soon as it was available and have never
looked back. So a lot of "In the Beginning...was the Command
Line" is now obsolete. I keep meaning to update it, but if I'm
honest with myself, I have to say this is unlikely.
I too now use OS X -- "like UNIX with training wheels by Armani" as
John Perry Barlow called the NeXT OS, its predecessor -- but I still
like dropping down into a shell. That's where I am now, merrily using
the vi editor to create this document.
( en.wikipedia.org/wiki/Vi )
Now, don't get me wrong; there are certain applications that require
a Graphical User Interface (GUI), such as a What You See Is What You Get
(WYSIWYG) editor, a paint program, or a Computer Aided Design (CAD)
package. But in terms of sheer bandwidth it is fastest to communicate
your intentions to the computer through text, be it source code in a compiled
language, or an interpreted script in a scripting language.
In fact, even film effects (F/X) done with computer, so-called Computer-
Generated Images (CGI), require a scripting interface. In the mid-nineties
I was going to Los Angeles SIGGRAPH meetings
( la.siggraph.org )
and I heard Technical Directors (TD) complaining how difficult it was
to get the software vendors to wise up to this. Sure, it was cool
to do something to a frame with the mouse, but they had 24 frames a second
for hours on end, literally hundreds of thousands of frames, that they had
to do the same thing to. They needed a scripting interface or else an
Application Program Interface (API), also called a "binding," from a
programming language.
In his 1988 book "The Media Lab: Inventing the Future at M. I. T."
( www.amazon.com/exec/obidos/ASIN/0140097015/hip-20 )
Stewart Brand described a demo called "Put That There" which used speech
recognition and gesture recognition to allow a user to stand before
a screen where objects appeared, and say "put that" (pointing at object),
"there" (pointing at blank spot), and the object would move to the spot.
Imagine using this interface while gagged, and you see the problem with
modern GUI. The shell interface allowed you to type commands, structured
like an imperative sentence (don't you remember diagramming sentences
like "Hold the bus!" and "Eat your liver" in the Imperative Mood?),
( en.wikipedia.org/wiki/Imperative_mood )
in which the "subject verb object" form has an implied "you" for the
subject. In a shell you can express things formed like "verb noun"
or even "verb noun with noun" and more complex constructs. For example:
print hisfile to printer1
etc. But in a GUI we can only point at things and grunt like cavemen:
"Ughh," or "Ughh ughh" if we click twice.
When I was asked by organizer and chair Benjamin Bratton to be on
his panel at the ACM SIGGRAPH 1993 conference, called "Electronic
Image and Popular Discourse," I chose the topic "Hypertext or Game
Boy: We Are At a Fork In the Road" and made the same general argument
I'm making here.
( www.siggraph.org/conference )
(That must be why I like Stephenson's book so much -- I already
agreed with it before I read it.)
In fact, now that I think about it, I realize that throughout my career
there have been a number of occasions when a field engineer hacked
together a scripting interface to a GUI-based system, and the
engineers refused to support it, considering it "inelegant."
But since I was usually the guy who put the demos together, and ported
them to each new version of the software, I found the scripting
interfaces indispensable.
One of the hardest things for companies to do was take their darling
technology, written in a fabulous language like Smalltalk or Objective
C or Java, and provide an Ada interface to make the military guys happy.
But a scripting interface made that trivial -- the Ada code could just
spit out a script. Boy did the engineers despise this approach. They
hated plain text and they hated Ada. (I guess only a True Object was
worthy to interface with their software.)
Lately plain text has been reinvented as the eXtensible Markup
Language (XML), and now it seems OK with the engineers. (Perhaps
that's because it can now be endlessly made complicated.)
Now, don't get me wrong. I have a mouse and I use it. The GUI has
its place. But I think it should be a fundamental design principle
of software interfaces that, for each functionality, there is a way
to do it through a GUI, and a way to do it with a script in some plain
text scripting language, and an API for doing it. It's easy: code the
API first, and have the other two call it.
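That layering can be sketched in a few lines. All the names below are
hypothetical; the point is only that the script and GUI layers contain
no logic of their own, just calls into the one API.

```python
# Sketch of the "code the API first" principle: one core function,
# with the scripting interface and the GUI callback reduced to thin
# wrappers that call it. Names are invented for illustration.

def convert_image(path, fmt):
    """The API: the one place the real work happens."""
    return f"{path} converted to {fmt}"     # stand-in for real work

def cli(argv):
    """Scripting interface: a thin wrapper over the API."""
    path, fmt = argv
    return convert_image(path, fmt)

def gui_button_clicked(selected_file, chosen_format):
    """GUI callback: another thin wrapper over the same API."""
    return convert_image(selected_file, chosen_format)

# Both interfaces behave identically because neither contains logic:
assert cli(["photo.tif", "png"]) == gui_button_clicked("photo.tif", "png")
```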
One of the first excellent implementations to use this principle was
a project that I think began at UCSD, where Gary Perlman developed
"Menunix" -- a GUI interface to a UNIX shell. Since UNIX already
had a kernel with the API and shell with the plain text script, all he
had to add was a GUI layer, in this case merely keyboard-driven drop-
down menus, like on the old DOS systems. These were driven by a plain
text file that defined each menu's layout, what they said, what the key
commands were, and what really got done by a UNIX shell when you made a
selection. If I recall, primitive dialog boxes and file browsers made
choices easy. But the big innovation was that when you made a menu choice,
on the bottom line of the screen Menunix showed you the exact shell command
it was executing for you. If you selected "Directory Listing" from the menu,
the bottom line might say:
    ls -C
(list all files in columns, formatted). The software EMPOWERED you.
If you wanted more information you could go into a shell and type:
    man ls
(manual entry for ls command) and see:
LS(1) BSD General Commands Manual LS(1)
ls -- list directory contents
ls [-ABCFGHLPRTWZabcdefghiklmnopqrstuwx1] [file ...]
For each operand that names a file of a type other than
directory, ls displays its name as well as any requested,
associated information. For each operand that names a
file of type directory, ls displays the names of files
contained within that directory, as well as any requested,
and so on. As you used Menunix it encouraged you to learn the shell, and
worked at "weaning" you from the menus.
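The Menunix idea -- a menu that both runs a shell command and shows you
exactly what it ran -- can be sketched in a few lines. (The entries
below are made up; Perlman's actual menu-definition format surely
differed.)

```python
# A Menunix-style menu: each entry maps a menu label to the shell
# command that actually does the work. Entries are illustrative only.
MENU = {
    "d": ("Directory Listing", "ls -C"),
    "w": ("Who Is Logged In",  "who"),
    "s": ("Disk Space",        "df"),
}

def choose(key):
    """Return (what the user sees, the shell command to be echoed)."""
    label, command = MENU[key]
    # The real Menunix would run the command via the shell here, e.g.
    #   subprocess.run(command, shell=True)
    # and print it on the bottom line, "weaning" the user toward the shell.
    return label, command

label, command = choose("d")
print(f"You chose: {label}")
print(f"Executing: {command}")   # the EMPOWERING bottom line
```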
For more information on Menunix, see "The Design of an Interface to a
Programming System and MENUNIX: A Menu-Based Interface to UNIX (User
Manual): Two Papers in Cognitive Engineering," Technical Report No. 8105
(ED212289), by Gary Perlman, November 1981.
This report consists of two papers on MENUNIX, an experimental
interface to the approximately 300 programs and files available
on the Berkeley UNIX 4.0 version of the UNIX operating system.
The first paper discusses some of the psychological concerns
involved in the design of MENUNIX; the second is a tutorial
user manual for MENUNIX, in which the features of the program
are more fully explained. It is pointed out that the goal of
MENUNIX is to provide novice users with information about what
commands are available and how they are used, while providing
experts with an environment for efficiently executing commands.
In short, MENUNIX provides a friendly user-interface to UNIX
programs for users of all levels of expertise.
( www.eric.ed.gov/ERICWebPortal/custom/portlets/recordDetails/detailmini.jsp?_nfpb=true&_&ERICExtSearch_SearchValue_0=ED212289&ERICExtSearch_SearchType_0=no&accno=ED212289 )
( www.stormingmedia.us/92/9298/A929801.html )
IN PRAISE OF OBSOLETE SYSTEMS
"Give me your tires, your power,
Your metal chassis turning to debris,
and I'll bill you later."
These words, sprayed on the base of
Our Beloved Lady With a Torch,
remain eternal, because no individual,
all alone in a democracy,
has the right to remove them.
-- Proctor & Bergman, 1973
"TV Or Not TV: A Video Vaudeville In Two Acts"
( www.amazon.com/exec/obidos/ASIN/B0000AAZY1/hip-20 )
People laugh at me sometimes, but I like using obsolete technology
because it has stopped changing. I have seen several companies spend
upwards of a million dollars trying to keep up with the software
"upgrades" (redesigns) from their tool vendors. It's a Catch-22:
everybody knows these tools suck but the user base is screwed if the
vendor really fixes them. Examples: Sun's transition from SunOS
to Solaris, NeXT's transition from NextStep to OpenStep, Apple's
transition of the Macintosh operating system from OS 9 (homegrown)
to OS X (based on NeXT's Mach kernel and OpenStep environment),
and Microsoft's transition from Windows XP to Vista.
Once the vendors move on, the obsolete technologies they leave behind
are usually pretty good, tried and true, and STABLE: no upgrades for
users to burn money adapting to.
In the mid 1990s I was visiting McClellan Air Force Base near Sacramento
( www.globalsecurity.org/military/facility/mcclellan.htm )
(now closed), as part of the Silicon Graphics, Inc. (SGI)
( en.wikipedia.org/wiki/Silicon_Graphics,_Inc. )
touring tech show called the "Magic Bus." For you youngsters, SGI
was the hippest computer company, with a post-modern name, a 3D
logo, and products with names like "Reality Engine." Their best demo
was a 3D snowboarding game that was rad, dude. They even had Jurassic
Park bubblegum cards! (SGI computers had rendered the 3D effects.)
And the Magic Bus had all that coolness showing up (usually) in tech
company parking lots. As one randomly-chosen press release put it:
( www.prnewswire.com/cgi-bin/stories.pl?ACCT=104&STORY=/www/story/06-08-1998/0000676926&EDATE= )
Magic Bus Also Open to Visitors
Silicon Graphics also announced today it is bringing its
"Magic Bus" to the party -- a specially outfitted, 18-wheel
tractor-trailer that offers hands-on demo experience with the
latest Silicon Graphics technology to hundreds of companies,
universities and other organizations throughout North America.
The Magic Bus will be on display and open to the public on
June 13th outside of the Milwaukee Art Museum.
So I was "on the bus" at McClellan AFB, giving demos of AVS software
( www.avs.com )
on Indigo2 hardware
( en.wikipedia.org/wiki/SGI_Indigo )
when one of the Air Force guys invited me to take a break and come
into a hangar to see the world's second largest robot. It was a device
for taking X-ray pictures of fighter jet wings, basically a three-axis
gantry or crane with a three-axis X-ray camera mounted on it, that
could move to any position in the hangar and point in any direction
(so-called 6 degrees of freedom). "And here is the computer that
controls it," my host said, pointing to a 386 running DOS.
"Why don't you use Windows 95?" I asked.
"Less stable and reliable, needs a more powerful computer, has some funky
bugs, represents an expense that doesn't buy us anything" was the essence
of the answer.
(I could've seen the largest robot in the world. It was in the next
hangar and looked about the same. But it used alpha radiation to do its
imaging, and I would've had to wear a dosimeter, and it seemed pointless.)
In the early years of the 21st century I did a bunch of consulting
producing 3D graphics on two obsolete computers literally rescued
from dumpsters, using free or donated software.
( aero.mindtel.com/odb-dgc-osd-nii/04-04/HIP_images/_image_master_index.html )
I was proving a point. And when my wife's eight-year-old PC died suddenly
(thankfully well backed up), I said, "Replace it with a 7-year-old one and
it will seem better and cost $200." It worked.
I am inspired by the work of the Long Now Foundation
( www.longnow.org )
on the "Clock of the Long Now," which will have a series of chimes, including
one which only sounds every 10,000 years.
( www.longnow.org/projects/clock/prototype1 )
They're asking the very interesting question, "How do you build a machine
to last 10,000 years?" If history is any guide, the greatest threat
to it will be humans. Therefore, it is designed to make it obvious
from visual inspection how it works, so those who stumble on it in the
future won't be too tempted to dismantle it.
I love these kinds of questions. Bateson used to say they represent
an approach to wisdom. My own favorite is "If you were designing a
spacecraft to leave Earth and explore, and return in 500 years with
some kind of physical record of the data collected, what format would
you use?" It's a great question to ask a bunch of techies drinking
beers in a bar. (And while we're at it, how many companies can't
read their own financial data from 10 years ago?)
Well, having created such a challenge, I feel I should respond to it.
I would use a plastic tape encoded the same way as paper tape, only with
many more, smaller holes. I would use 7-bit ASCII code. I would use
plain text and an obvious data format. To help the future folk confirm
the encoding, I would begin with words like UNITED STATES OF AMERICA,
LIBERTY, and 1776, which appear frequently on our coins and monuments.
(Upper case is fine; keep it simple.) So the data might appear like this:
UNITED STATES OF AMERICA, ESTABLISHED IN LIBERTY 1776-JUL-04.
LIBERTY SPACECRAFT LAUNCHED 2016-JUN-01.
TIME: 23:23 ZULU
TEMPERATURE: 05 CELSIUS
RADIATION 0-100 ANGSTROMS: 14 JOULES
RADIATION 100-300 ANGSTROMS: 21 JOULES
* * * * * *
and so on. Notice the use of international metric units and time format,
and the avoidance of decimal fractions.
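The proposed encoding is simple enough to sketch: each character
becomes one row of seven holes, and the scheme round-trips. (Physical
details like hole spacing and tape material are of course unspecified
here.)

```python
# Encode text as 7-bit-ASCII "paper tape": one row of holes per
# character, 'O' for a hole (1 bit) and '.' for no hole (0 bit).

def punch(text):
    """Return one 7-hole row per character, high bit first."""
    rows = []
    for ch in text:
        code = ord(ch)
        assert code < 128, "7-bit ASCII only"
        rows.append("".join("O" if code & (1 << b) else "."
                            for b in range(6, -1, -1)))
    return rows

def read_tape(rows):
    """Decode rows of holes back into text."""
    return "".join(chr(int(row.replace("O", "1").replace(".", "0"), 2))
                   for row in rows)

tape = punch("LIBERTY 1776")
assert read_tape(tape) == "LIBERTY 1776"   # the encoding round-trips
```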
THE INTERNET ERA
Emulation refers to the ability of a computer program or
electronic device to imitate another program or device.
Many printers, for example, are designed to emulate
Hewlett-Packard LaserJet printers because so much software
is written for HP printers. By emulating an HP printer, a
printer can work with any software written for a real HP
printer. Emulation "tricks" the running software into believing
that a device is really some other device.
In a theoretical sense, the Church-Turing thesis implies that
any operating environment can be emulated within any other.
However, in practice, it can be quite difficult, particularly
when the exact behavior of the system to be emulated is not
documented and has to be deduced through reverse engineering.
It also says nothing about timing constraints; if the emulator
does not perform as quickly as the original hardware, the
emulated software may run much more slowly than it would have
on the original hardware, possibly triggering time interrupts
to alter performance.
-- Wikipedia article on "emulator"
( en.wikipedia.org/wiki/Emulator )
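The Church-Turing point -- that any machine can in principle be
imitated by a program -- is easy to demonstrate in miniature with a
made-up three-instruction machine. (The instruction set below is
invented for illustration; real emulators do exactly this, only for
thousands of opcodes, where the hard part is matching undocumented
behavior and timing, as the quote notes.)

```python
# A miniature emulator for an invented three-instruction machine:
#   ("LOAD", n)  put n in the accumulator
#   ("ADD", n)   add n to the accumulator
#   ("HALT",)    stop, returning the accumulator

def emulate(program):
    acc = 0
    pc = 0                       # program counter
    while True:
        op = program[pc]
        if op[0] == "LOAD":
            acc = op[1]
        elif op[0] == "ADD":
            acc += op[1]
        elif op[0] == "HALT":
            return acc
        pc += 1

result = emulate([("LOAD", 40), ("ADD", 2), ("HALT",)])
print(result)   # 42
```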
"Amazon is the world's most aggressively marketed beta product."
-- Mike Daisey, 2002
"21 Dog Years: Doing Time @ Amazon.Com"
( www.amazon.com/exec/obidos/ASIN/0743225805/hip-20 )
Back when I was a tech writer for Data General in the late 1970s I
documented a new networking system they developed to run on their
new Eclipse minicomputers (VAX-class), based on the new X.25 protocol.
( en.wikipedia.org/wiki/X.25 )
The system was called XODIAC, and it was another proprietary networking
system like DECnet, and it never went anywhere. But even earlier,
in the late 1960s, the Advanced Research Projects Agency (ARPA) of the
Department of Defense (DoD) was funding engineers at BBN
( en.wikipedia.org/wiki/BBN_Technologies )
and researchers at places like UCLA, the Stanford Research Institute (SRI),
UC Santa Barbara, and University of Utah to cobble together ARPANET,
( www.dei.isep.ipp.pt/~acc/docs/arpa.html )
the computer network which became DARPANET, then NSFNET, then finally
just "the Internet."
What made this possible was the proposal and adoption of protocols,
including the so-called TCP/IP stack, which define three of the seven layers
of a network:
Layer 1: Physical Layer      Ethernet cable
Layer 2: Data Link Layer     WAN protocol architecture,
                             IEEE 802 LAN architecture
Layer 3: Network Layer       Internet Protocol (IP)
Layer 4: Transport Layer     Transmission Control Protocol (TCP)
Layer 5: Session Layer       usually handled by TCP
Layer 6: Presentation Layer  encryption, ASCII and/or XML coding
Layer 7: Application Layer   your browser, mail program, IM client,
                             internet radio player, etc.
For greater detail see the Wikipedia article on this set of layers, the Open
Systems Interconnection Reference Model (OSI Reference Model or OSI Model).
( en.wikipedia.org/wiki/OSI_model )
Aside: I've noticed that when I need to figure out whether someone in
high-tech is in engineering or "other" -- management, marketing, legal,
accounting, etc. -- I ask them what IP stands for. If they say "Internet
Protocol,"
( en.wikipedia.org/wiki/Internet_protocol )
they're an engineer; if they say "Intellectual Property" they're "other."
( en.wikipedia.org/wiki/Intellectual_property )
When I was documenting that proprietary network stack at Data General,
I came up with a diagram to explain the relationships of the levels.
Originally I doodled it on my desk pad with colored pencils for my
own edification. It was an onion, cut in half, with the onion's layers
being the network protocol layers and each half residing on one of two
computers communicating point-to-point. My boss loved it, and insisted I
put it in the manual. Here is my final drawing, as I delivered it to
the art department to be professionally re-drawn, of the "XODIAC Onion."
( i164.photobucket.com/albums/u12/c3m_2007/Xonion.jpg )
The solid lines show the transport of information down through the
layers, across the physical channel, and back up to the destination
resource. The dotted lines show the "virtual" flow of information --
each layer is free to ignore the layers below, and pretend it has a
direct link with the corresponding layer on the other system.
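The onion's two flows can be sketched as nested wrapping: each layer
adds its header on the way down and strips it on the way up, so each
layer gets back exactly what its peer sent. (Layer names follow the
table above; the bracketed header syntax is invented for illustration.)

```python
# Encapsulation down the protocol stack and back up, after the
# "XODIAC Onion": the solid lines are the wrap/unwrap below; the
# dotted "virtual" link is the fact that each layer recovers exactly
# its peer layer's payload.

LAYERS = ["transport", "network", "datalink"]  # wrap the payload in order

def send_down(payload):
    """Wrap the payload in one header per layer, top to bottom."""
    for layer in LAYERS:
        payload = f"{layer}[{payload}]"
    return payload               # what crosses the physical channel

def receive_up(frame):
    """Strip headers bottom to top, recovering the peer's payload."""
    for layer in reversed(LAYERS):
        assert frame.startswith(layer + "[") and frame.endswith("]")
        frame = frame[len(layer) + 1:-1]
    return frame

wire = send_down("GET /index.html")          # down through the layers
assert receive_up(wire) == "GET /index.html" # the "virtual" direct link
```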
Of course XODIAC was a false start, but it helped pave the way for
today's internet as users and engineers learned from each iteration.
The story is told in detail in "Where Wizards Stay Up Late: The Origins
Of The Internet" (1996) by Katie Hafner and Matthew Lyon.
( www.amazon.com/exec/obidos/ASIN/0684832674/hip-20 )
From the hindsight of history we can see that the juggernaut of
Moore's Law -- chip efficiency doubles every 18 months -- fixes
performance problems but not compatibility problems.
An old timer will tell you that emulation drives innovation. From
the old serial port modems and their definitions of Data Terminal
Equipment (DTE) and Data Communications Equipment (DCE), to the
universal emulation of DEC VT-100 terminals, to the widespread
emulation of Hayes Smartmodems, to printers that emulate Hewlett Packard's
models, to today's operating system "virtualization," emulation has allowed
new technologies to "plug in" to "sockets" designed for older uses.
Indeed, the internet protocols represent a new breed of "magic words"
which enable distances to shrink and a global network of computers
to become what McLuhan called in the 1970s a "global village,"
or what we call today a "social network."
But that wasn't obvious in the beginning. One of the first problems
after ARPANET was deployed -- a network of routers built by BBN connected
to each other and host computers at each site -- was figuring out what
it was good for. The routers were welded shut so the users wouldn't
reprogram them for other tasks, ensuring that the network could stay
up much of the time, unlike the old acoustic modems that had to "dial up."
What good was that level of connectivity?
The obvious ideas were tried first. Remote shells like 'rsh' made
it possible to operate another computer as if by remote control.
File transfer programs like 'ftp' made it easy to move files around.
Email was popular from the start. Some users cleverly launched what
became the USENET newsgroups.
( en.wikipedia.org/wiki/Usenet )
According to the legend I heard, these users realized that they couldn't
sneak in a recreational use for a network of government-funded researchers,
and so they applied for a grant to study network traffic, and argued they
needed "real" data, and so the newsgroups were born to provide that traffic.
Another profoundly important use for the early network was to enhance
the technical definitions of the network. Requests for Comment (RFCs)
were the vehicle for establishing new protocols, enabling new applications.
And in the process, a community was forming. It is worth noting that
since all users were Federal grant recipients, and many had security
clearances, ARPANET was viewed as a "trusted" network, and internal
security (protecting one user from another) was a low-priority issue.
This approach was to have profound implications later.
Another clever thing to do with networked computers was to make their
assorted disk drives look like one big shared file system. XODIAC did this,
and eventually Sun's Network File System (NFS) did the same thing.
(Sun would later boast, "We put the dot in dot com!")
I have been fascinated by the way each evolutionary step in the
development of the internet has made the next steps easier. Before
the World Wide Web (WWW) people used FTP to download clients for the
Wide Area Information Server (WAIS) search engine,
( en.wikipedia.org/wiki/Wide_area_information_server )
and the similar Archie, an index of FTP sites, with its spin-offs
Jughead and Veronica,
( en.wikipedia.org/wiki/Archie_search_engine )
along with Gopher, a data fetching service that layered on top of other tools.
( en.wikipedia.org/wiki/Gopher_(protocol) )
All of these were going strong before Web surfing was invented, enough
that O'Reilly and Associates was able to publish "The Whole Internet
User's Guide and Catalog" (1992), which became their biggest hit ever
and put them on the map.
( www.amazon.com/exec/obidos/ASIN/B000HNQ9T6/hip-20 )
It was Tim Berners-Lee, in 1989,
( en.wikipedia.org/wiki/Tim_Berners-Lee )
an English researcher at the CERN atom smasher in Geneva, Switzerland,
who invented the World Wide Web, including its two standards: Hypertext
Transfer Protocol (HTTP),
( en.wikipedia.org/wiki/Http )
and the Hypertext Markup Language (HTML).
( en.wikipedia.org/wiki/Html )
He and a grad student got it running (on a NeXT box!) by 1990.
It was Marc Andreessen,
( en.wikipedia.org/wiki/Marc_Andreessen )
an American student at the famous National Center for Supercomputing
Applications (NCSA), at the University of Illinois at Urbana-Champaign,
( www.ncsa.illinois.edu )
who added the innovation that pictures could appear ON THE PAGES
instead of always in separate links, like in a comic book or photo-novella.
The last piece fell into place when the National Science Foundation
lifted their restrictions on the commercial use of the internet,
which was in 1991 or 1992, depending on who you believe. One source
says it was an act of congress on 6/9/92, Congressman Rick Boucher's
amendment to the NSF Act of 1950, which G. H. W. Bush signed on 11/23/92.
Another source claims it happened in an NSF committee in 1991. I think
it odd that this event remains in such obscurity -- I'd like it to be a
national holiday. It was a day that the government caused something
extremely good to happen by STOPPING something it was doing. (As my
friend Dr. Dave would say, "Reward that behavior!")
I have found that most "history of the internet" articles and even books
begin AFTER the above occurred; usually the Netscape IPO is cited as the
"beginning" of the story.
I think it was 1994 when my friend Steve Price told me Mosaic was
the Next Big Thing.
( www.sjgames.com/svtarot/net/nextbigthing.html )
A few days later I was visiting a customer at UCLA and he mentioned
that he had Mosaic, and I begged him to show it to me. I was fascinated.
Shortly afterwards I got it for the Silicon Graphics Indigo that I
had in my office in Irvine, CA, and used it over a Point-to-Point Protocol
(PPP) connection to a server in Waltham, MA, and discovered the wonders
of URLoulette, a site that sent you to a random link.
I knew this was going to be big. In the mid-1990s, disgusted with the media
coverage of the second OJ trial, my wife and I decided to get rid of
the cable TV and spend the money on internet access instead, which we
did for the next five years. One of the best decisions we ever made.
The remarkable thing about a web browser is how fault tolerant it is.
A web page typically has no documentation, no training, is "stateless"
by default (and must be to respond correctly to the "Back" button),
and disconnects the network link after every one-way transaction.
It completely hides the incompatibilities in different operating
systems in line termination (line feed and/or carriage return?).
Missed packets are usually not reported and then pages are rendered
with missing data, oh well. Errors in HTML are also ignored or
corrected with guesswork. It's like the polar opposite of the
old mainframe operating systems that would give a verbose error
message if you typed a Carriage Return by itself (i.e., with no command).
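Python's standard html.parser shows this tolerance in miniature: fed
malformed markup with an unclosed tag and a bogus tag, it extracts
what it can and raises no error. (A small demonstration of lenient
parsing, not a claim about how any particular browser recovers.)

```python
# Browser-style fault tolerance: Python's stdlib HTML parser accepts
# broken markup (unclosed <b>, unknown <blink>) and simply reports
# what it can recognize, rather than erroring out.
from html.parser import HTMLParser

class TextCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.text = []
    def handle_data(self, data):
        self.text.append(data)

broken = "<html><b>Hello <blink>world</html>"   # errors galore
p = TextCollector()
p.feed(broken)                  # no exception raised
print("".join(p.text))          # prints "Hello world"
```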
Being nearly idiot-proof, the Netscape browser became big. Microsoft ripped
it off with Internet Explorer and began giving it away. Then around 1999
or so, EVERYBODY started saying "this is going to be HUGE!" and buying
every internet stock they could find. When everybody's saying it, it's
a bubble. Pop!
More details on the dot com boom can be found in "Bamboozled
at the Revolution: How Big Media Lost Billions in the Battle
for the Internet" (book, 2002) by John Motavalli,
( www.amazon.com/exec/obidos/ASIN/0670899801/hip-20 )
and "In Search of Stupidity: Over Twenty Years of High Tech Marketing
Disasters" (book, 2005) by Merrill R. (Rick) Chapman,
( www.amazon.com/exec/obidos/ASIN/1590597214/hip-20 )
and on the comedy album "Boom Dot Bust" (1999) by the Firesign Theatre.
( www.amazon.com/exec/obidos/ASIN/B00001WRKQ/hip-20 )
One of the best windows into Internet Culture I have found
is the on-line comic "User Friendly" about the folks who work at
a fictional Internet Service Provider (ISP) in Canada, called
Columbia Internet, with the hot shot programmer, sleazy sales guy,
Russian hacker, overwhelmed phone support guy, old UNIX dude,
and nervous boss, along with all of their foibles and squabbles.
( www.userfriendly.org )
"The Pointy-Haired Boss (often abbreviated to just PHB) is
Dilbert's boss in the Dilbert comic strip. He is notable for
his micromanagement, gross incompetence and unawareness of
his surroundings, yet somehow retains power in the workplace.
... The phrase 'pointy-haired boss' has acquired a generic
usage to refer to incompetent managers."
-- Wikipedia entry for "Pointy-Haired Boss"
( en.wikipedia.org/wiki/Pointy-Haired_Boss )
And speaking of comics, let's not forget Dilbert, who brought the world
of big company engineering to the masses.
By the 1990s most people had experienced some sort of bad software.
When this stuff was written by teenagers in garages (circa 1982)
it was easy to blame bad programmers, but by Windows 95 the scale was
so large that bad software managers had to be involved as well.
In fact, as we close out the first decade of the new millennium,
I would say we have a full-blown software management crisis.
If you read about the 2001 i2 and Nike fiasco and litigation,
( news.cnet.com/i2-Nike-fallout-a-cautionary-tale/2100-1017_3-253829.html )
or the 2005 FBI $170 million "virtual casebook" fiasco,
( www.windley.com/archives/2005/01/fbis_virtual_ca.shtml )
it becomes evident that mistakes continue to be made.
My friend Bruce Webster, in his blog "And Still I Persist,"
( and-still-i-persist.com/index.php?s=coincidently )
suggested that whatever is wrong in software management is starting
to leak back into the "real world."
Gerry Weinberg, who has been a leading light in software
engineering for nearly 40 years, once famously remarked,
"If builders build buildings the way programmers wrote
programs, then the first woodpecker that came along would
Up in Boston, it appears that the woodpeckers have struck,
with tragic results.
* * * * * *
I have been following the Big Dig off and on over the past
two decades, and the various problems, setbacks, and cost
overruns have smacked far more of a badly-run, large-scale
software project than a typical construction project.
He also, on another occasion, suggested that deploying bad software
without adequate testing is a lot like the way most laws are passed
and implemented, with zero quality control or follow-up.
One has to ask, "Why?" What is it about software project management
that makes it so easy to do badly?
One problem I noticed long ago is that you can usually tell by a visual
inspection if a ditch is half-dug. This makes ditch-digging pretty easy
to manage (or so I would imagine). But it is very difficult -- nigh
on impossible -- to tell if a program is half-written.
Another problem is that in case of project difficulties upper management
doesn't dare fire the programmers -- typically only those who wrote the
code understand it (if even they do), and chaos would ensue if they left for
any reason. So middle managers are fired or shuffled, and there is
often supervisory turbulence in otherwise stable teams. Of course,
a new manager wants to make changes, to show they are "doing something,"
and often these changes are disruptive. Efficiency and morale suffer.
Rinse and repeat.
Another dangerous thing that bosses can do is search for the "magic bullet"
that will make programmers work more efficiently, or maybe even make
it possible to replace them with cheaper, less senior coders. Software
tool vendors are always promising these kinds of results when they
pitch management. I know; I used to be on these types of teams. We were
always trying to find ways AROUND the people with technical know-how,
to get to the "big picture" upper managers who were presumably easier
to hypnotize. (Not that I ever saw this strategy work...)
One of the most common vendor pitches involves an Integrated Development
Environment, or IDE. I agree with Andrew Hunt and David Thomas in
"The Pragmatic Programmer: From Journeyman to Master" (1999),
( www.amazon.com/exec/obidos/ASIN/020161622X/hip-20 )
in which they say:
A lot of programming is tedious and requires you to construct tools.
Parsers are the canonical example of such tedium. The need to invent
little tools hasn't gone away and people are building little and not
so little tools all the time. But, the large majority of programmers
are addicted to their IDEs, and haven't a clue about building any
tools. The chase for the silver bullet that addresses all of one's
need is constantly on.
I think that frequently a boss's own engineers make them feel stupid,
while that friendly IDE vendor keeps telling them how smart they are.
A major problem a non-technical person faces is what I call the
Expert Paradox. If you don't understand a technology then you
need an expert to advise you. In fact, you need a GOOD expert.
But without knowledge of the technology, you can't tell a good
expert from a bad one. You need someone to advise you; another
expert. But figuring out how to select THEM puts you right back
in the same quandary.
This is how an owner's nephew often ends up in charge of the e-commerce
site for a small company. They may not know anything, but they are trusted.
Another insidious problem is what I call the Brooks Paradox.
Anyone who has done even the most rudimentary research on software
management will know that the granddaddy of books on the topic is
"The Mythical Man-Month" (1975) by Frederick P. Brooks,
( www.amazon.com/exec/obidos/ASIN/0201835959/hip-20 )
as I mentioned in the last issue. And yet there are many software
managers who not only haven't read it, but haven't heard of it.
The paradox as I have observed it is that just about anyone who has
read this book didn't really need to. The kind of managers who've
sought out this classic typically are the kind that seek the counsel
of wiser ones, and strive for continual improvement, and engage in
self-criticism. Conversely, the type that seeks no counsel, and
thinks they need no improvement, and can't bear any criticism, let
alone self-criticism, in most cases has never heard of the book.
(In a previous 'Zine I proposed calling this "Scrivener's Law," but
I'm feeling less arrogant today.)
I've heard it said that there are two types of software development
organizations, "What spec?" and "What's a spec?" The first contains
overworked, abused pros, and the second overworked, abused amateurs.
One of the biggest disconnects in nearly every technical organization
is the perception of risk at different management levels. Cal-Tech's
Nobel-prize-winning physicist Richard Feynman noted this when he was
invited to join the Rogers Commission investigating the Space Shuttle
Challenger accident.
( en.wikipedia.org/wiki/Rogers_Commission )
In his minority report which Rogers tried to suppress,
( science.ksc.nasa.gov/shuttle/missions/51-l/docs/rogers-commission/Appendix-F.txt )
Feynman describes the result of months of interviews with engineers
and managers, both before and after the accident (the "before" for a
Range Safety Officer report):
It appears that there are enormous differences of opinion
as to the probability of a failure with loss of vehicle and
of human life. The estimates range from roughly 1 in 100 to
1 in 100,000. The higher figures come from the working engineers,
and the very low figures from management. What are the causes
and consequences of this lack of agreement? Since 1 part in
100,000 would imply that one could put a Shuttle up each day
for 300 years expecting to lose only one, we could properly
ask "What is the cause of management's fantastic faith in the
* * * * * *
It would appear that, for whatever purpose, be it for internal
or external consumption, the management of NASA exaggerates the
reliability of its product, to the point of fantasy.
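Feynman's arithmetic is easy to check: at management's 1-in-100,000
figure, a launch every day for 300 years would indeed be expected to
lose only about one vehicle.

```python
# Checking Feynman's back-of-envelope figure from Appendix F.
launches = 300 * 365            # one launch per day for 300 years
p_loss = 1 / 100_000            # management's failure estimate
expected_losses = launches * p_loss
print(launches, expected_losses)   # 109500 launches, ~1.1 expected losses
```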
Another take on this popped up recently in the blogosphere; some blogger
who I can't seem to find again -- possibly at Information Week
( www.informationweek.com )
or InfoWorld --
( www.infoworld.com )
talked about how technical managers often have unrealistically high
estimates of the probability of success of software projects. Their
techies will give the same projects much lower success probabilities,
in other words, higher risk.
I've often noticed an attitude of stingy managers: when their employees
ask for extra tools to avert calamities (like, oh, backup systems and QA
systems) the managers think they're just lazy, kind of a "give them a
pillow and they'll ask for a goose-down pillow" attitude. What I realized
recently is that this not only allows the managers to save money and be
A-holes, it also lets them STAY IN DENIAL, which is probably the most
important payoff of all.
A chronicle of how software management can go horribly wrong is in the
instant classic "Death March" (1999) by Edward Yourdon,
( www.amazon.com/exec/obidos/ASIN/013143635X/hip-20 )
and by happy coincidence my aforementioned friend Bruce Webster has
just gotten a new book review on Slashdot:
( books.slashdot.org/story/09/07/15/1255258/Why-New-Systems-Fail?art_pos=1 )
about the new book "Why New Systems Fail: Theory and Practice Collide"
(2009) by Phil Simon,
( www.amazon.com/exec/obidos/ASIN/1438944241/hip-20 )
which looks at the buy side of the build-or-buy decision and its pitfalls.
Another wonderful source of software development failure data, from
the field in raw form, is The Daily WTF:
( thedailywtf.com )
Now, I can't claim to know the best way to manage programmers,
but I know what DOESN'T work. Drawing an "architecture diagram"
on a napkin, transferring it to a Power Point, and assigning a programmer
to each box, with no real spec and no definition of how the boxes
communicate (protocol? API?), and then preventing the engineers from
working this all out by having an insecure manager who must run all
meetings, and won't allow conversations he or she doesn't understand,
until the engineers are covertly meeting... THAT doesn't work.
(See also Conway's Law:
The rule that the organization of the software and the
organization of the software team will be congruent;
commonly stated as "If you have four groups working on
a compiler, you'll get a 4-pass compiler". The original
statement was more general, "Organizations which design
systems are constrained to produce designs which are
copies of the communication structures of these organizations."
This first appeared in the April 1968 issue of Datamation.)
( www.elsewhere.org/jargon/jargon.html#Conway%27s%20Law )
Given what doesn't work, I have found that just about ANY methodology
would be better than the above. The traditional "waterfall" with specs
at the beginning and testing at the end,
( en.wikipedia.org/wiki/Waterfall_model )
(though as a cyberneticist I must protest the lack of feedback channels
in the model),
the approach of "Design Patterns,"
( en.wikipedia.org/wiki/Design_pattern_(computer_science%29 )
(more on them later), and the newfangled "Extreme Programming"
( en.wikipedia.org/wiki/Extreme_programming )
all work better than what I call the "Napkin Approach." As long as the
methodology is applied at the beginning, and not slapped on to "fix" a
broken project, as long as there is buy-in by the engineers, and as
long as there isn't the expectation that the methodology will be a
"magic bullet," they mostly all work.
Someday, in my "copious free time," I want to write a humorous book
called "How To Drive a High-Tech Company Into the Ground," detailing
the important roles played by investors, management, engineering,
marketing, sales, customer support and even legal and finance in
building a perfect bonfire of the Venture Capital (VC) money.
But I guess it's fair to ask me what I think constitutes "best
practices." On the technical side, I favor the approach usually
called "Design to test." I worked with a guy once who coined
"Emmons Law" -- "If it isn't tested, it doesn't work." I have
found this to be true virtually always. If you assume this,
and design things to be tested at every level, it will be easier
to fix all those problems you didn't anticipate.
This is the approach described in "The Pragmatic Programmer: From
Journeyman to Master" (1999) by Andrew Hunt and David Thomas.
( www.amazon.com/exec/obidos/ASIN/020161622X/hip-20 )
They talk about something I'd figured out but hadn't named, what they
called "tracer bullets." In an interview
( www.artima.com/intv/tracer.html )
the authors explain:
Bill Venners: In your book, The Pragmatic Programmer, you suggest
that programmers fire "tracer bullets." What are tracer bullets?
And what am I trading off by using tracer bullets versus
"specifying the system to death," as you put it in the book?
Dave Thomas: The idea of tracer bullets comes obviously from
gunnery artillery. In the heavy artillery days, you would
take your gun position, your target position, the wind,
temperature, elevation, and other factors, and feed that
into a firing table. You would get a solution that said to
aim your gun at this angle and elevation, and fire. And you'd
fire your gun and hope that your shell landed somewhere close
to your target.
An alternative to that approach is to use tracer bullets.
If your target is moving, or if you don't know all the factors,
you use tracer bullets -- little phosphorous rounds intermixed
with real rounds in your gun. As you fire, you can actually see
the tracer bullets. And where they are landing is where the
actual bullets are landing. If you're not quite on target --
because you can see if you're not on target -- you can adjust
Andy Hunt: Dynamically, in real time, under real conditions.
Dave Thomas: The software analog to firing heavy artillery by
calculating everything up front is saying, "I'm going to
specify everything up front, feed that to the coders, and
hope what comes out the other end is close to my target."
Instead, the tracer bullet analogy says, "Let's try and
produce something really early on that we can actually
give to the user to see how close we will be to the target.
As time goes on, we can adjust our aim slightly by seeing
where we are in relation to our user's target." You're
looking at small iterations, skeleton code, which is
non-functional, but enough of an application to show
people how it's going to hang together.
Basically, it all comes down to feedback. The more quickly
you can get feedback, the less change you need to get back on target.
Okay, that's on the technical side. On the group dynamic side,
I have a plan that I think has seldom if ever been tried. I would
have a bullpen of programmers and pay them a modest base salary to
sit around playing with new technologies and doing experimental coding.
When a project needed doing, an outside project manager would post a spec,
and then the QA group would produce and code a test suite. In the
bullpen, two team captains would choose up two teams to attack it.
Each group gets to see the other's work, but not attend their meetings.
The first group to turn in bug-free code that meets the spec gets a
large bonus; if the other group also finishes in the time limit they
get a small bonus.
Annually replace the bottom 10% of earners, and anyone who's a pain.
The biggest objection to this approach is that it is "wasteful"
because you have 100% redundancy. Ever been on a project that
went in excess of 100% over budget and/or schedule? Flush, there goes
the sound of money that could've bought redundancy. Instead, you
get a "Death March" with workers and management essentially blackmailing
each other.
Another approach I would love to take is to invest in researching the
psychology of perception. (See Bateson's essays on "Experimental
Epistemology.") My intuition tells me there's some cherry-picking
to be done there in interface design. I am intrigued by some of the
research of Richard Mark Friedhoff, author of "Visualization: The Second
Computer Revolution" (1989).
( www.amazon.com/exec/obidos/ASIN/0716722313/hip-20 )
About the Author
RICHARD MARK FRIEDHOFF is a scientist and technologist working in
the areas of computer science and human perception. He has been
associated with many corporations and institutions, including the
Polaroid Corporation; Silicon Graphics, Inc.; the University of
California; Dolby Laboratories; and the Rowland Institute for Science.
His research has focused on human/computer interactions and computer
models of color perception. He is coauthor with William Benzon of
the widely praised Visualization: The Second Computer Revolution
(W. H. Freeman and Company, 1991)
I would also love to finally make it to SIGCHI, the ACM's Special
Interest Group (SIG) on Computer-Human Interaction (CHI).
( sigchi.org/chi2009/Default.html )
Here are the top-awarded papers from 2009:
CHI 2009 BEST PAPERS, AWARDED BY SIGCHI
"Predicting Tie Strength With Social Media"
Eric Gilbert, Karrie Karahalios,
University of Illinois at Urbana-Champaign, USA
"Undo and Erase Events as Indicators of Usability Problems"
David Akers, Stanford University, USA
Matthew Simpson, Robin Jeffries, Google, Inc., USA
Terry Winograd, Stanford University, USA
"From Interaction to Trajectories: Designing Coherent
Journeys Through User Experiences"
Steve Benford, The University of Nottingham, UK
Gabriella Giannachi, The University of Exeter, UK
Boriana Koleva, Tom Rodden, The University of Nottingham, UK
"Musink: Composing Music through Augmented Drawing"
Theophanis Tsandilas, Catherine Letondal, Wendy E. Mackay,
INRIA / Université Paris-Sud, France
"Sizing the Horizon: The Effects of Chart Size and Layering
on the Graphical Perception of Time Series Visualizations"
Jeffrey Heer, Stanford University, USA
Nicholas Kong, Maneesh Agrawala, University of California, Berkeley, USA
"Social Immersive Media: Pursuing Best Practices for Multi-
user Interactive Camera/Projector Exhibits"
Scott S. Snibbe, Sona Research, USA
Hayes S. Raffle, Massachusetts Institute of Technology, USA
"Ephemeral Adaptation: The Use of Gradual Onset to
Improve Menu Selection Performance"
Leah Findlater, Karyn Moffatt, Joanna McGrenere,
Jessica Dawson, University of British Columbia, Canada
Actually, though, in my opinion there have been remarkably few
deliberate attempts to push the edge of the envelope in computer-human
interfaces, especially with the goal of QUANTIFYING, MEASURING AND
INCREASING THE BANDWIDTH in each direction.
Of course the perennial Xerox Palo Alto Research Center (PARC) keeps
coming up with good ideas that Xerox can't figure out how to monetize.
( en.wikipedia.org/wiki/Xerox_Parc )
When I visited in the mid-1990s they were working on a dry-cleaner
metaphor, with your data numbered and hanging on a chain that runs
through the virtual building. They were achieving remarkably high
densities of discernible data per pixel with this approach.
The U.S. Army was funding something for a while called the Light
Helicopter Experimental (LHX) program,
( en.wikipedia.org/wiki/LHX )
which was based on the observation that engineers couldn't make
helicopters any lighter until they found a way to combine the
two-person crew -- pilot and gunner -- into one, and this is a
USER INTERFACE problem. (I was offered a job at Hughes Aircraft
Company working on user interface design for LHX in 1986, but I
went instead to work at Rockwell on computer graphics of space systems.)
During the end of the dot com boom there were some day traders
throwing wads of money at their user interface problem: how to increase
the bandwidth of intelligible stock data.
Of course the forward looking folks at Mindtel have worked on the
"Grok Box" and other projects designed to measure and maximize interface
bandwidth, as documented in a survey paper I wrote in 2005:
( www.well.com/~abs/HIP/Mindtel/VPHT2.html )
As I said, I'd love to do work on all these fronts. But if you REALLY
want to see results, I'd recommend some type of copy of the DARPA Grand
Challenge -- a contest with a big cash prize for the most quantifiable
improvement in computer-human bandwidth, CAREFULLY DEFINED.
THE DIGITAL GHOST-TOWN
Wrestling with stress puppies in the data swamp?
Speed skating with wolves on the glass ceiling?
Beating off the rat-race with a mouse?
Hey Sage, you're too busy to lose the kind of money you're making.
It's time to put our strong hand in your pocket.
Turn it over, give it up, submit, to:
Fly in on a boom -- drive home on a bus.
a platform-agnostic, browser-blind,
big bubble bit-broker from
U.S. ("What gate?") Plus.
* * * * * *
Feel like you're drinking from a half-empty dribble glass?
Haunted by the bull you buried in the basement?
See yourself standing at the wrong end of the Cheese Stamp line?
Hey Sid, it's not worth saving the kind of money you're losing.
It's time to cup our strong hand under the hole in your pocket.
Let it go, hand it over, submit, to:
Flown in on a broom, swept out with the dust.
A usurer-friendly, randomly-managed, ethically indifferent,
cash-back unit from
U.S. ("What gate?") Plus.
-- The Firesign Theater, 1999
"Boom Dot Bust"
( www.amazon.com/exec/obidos/ASIN/B00001WRKQ/hip-20 )
( www.dizzler.com/music/Firesign_Theatre )
After the bust, a lot of people sat around feeling sorry for themselves,
and the book "Who Moved My Cheese?: An Amazing Way to Deal with Change
in Your Work and in Your Life" (1998) by Spencer Johnson,
( www.amazon.com/exec/obidos/ASIN/0399144463/hip-20 )
suddenly became a hit. But it didn't seem like much of a recession
to me. You still couldn't get a cab in Vegas. So now in this new
recession, when you can't get a parking space at the mall (I know,
I know, "they're each spending less"), I'm thinking back to that old
recession. I remember when we reached the point where 50% of households
had VCR players in their homes, what was it, 1988? Suddenly when you went
to the Dodge dealer they gave you a videotape about the new Neon (or
whatever); they knew you'd be likely to be able to play it at home.
So I got to wondering when did we get to 50% of America having internet
access? I found figures for North America,
( www.internetworldstats.com/stats.htm )
and it looks like it was about the first quarter of 2003. Gee, I thought,
that was about bottom of market, wasn't it? Another web search showed
that yes, it was.
( stockcharts.com/charts/historical/djia2000.html )
Hmmm. "Green shoots," as they say. (I also remember digital camera
sales going through the roof -- now it's smart phones.)
This is UTV, for You, the Viewer.
-- Proctor & Bergman, 1973
"TV Or Not TV, A Video Vaudeville" (comedy CD)
( www.amazon.com/exec/obidos/ASIN/B0000AAZY1/hip-20 )
I think the most prescient book I read as a teenager was "Future Shock"
(1971) by Alvin Toffler.
( www.amazon.com/exec/obidos/ASIN/0553277375/hip-20 )
He said change due to technological progress would keep accelerating.
In a sense he presaged the "singularity" of 2012 or whatever.
( en.wikipedia.org/wiki/2012_doomsday_prediction )
And so far it has. Even a bad economy can't slow it much. (And if
you remember what we learned from the "Limits To Growth" models,
we have to keep innovating ourselves out of a systemic population-
resources crash, so this is a good thing.)
All the while, through the peaks and valleys, the web has been evolving.
The big problem from the get-go was the lack of interactivity in a browser.
Heck, you couldn't even play Tetris on one. Techies would explain about
page refreshes and network latency, but users' attitudes were "Why is
this my problem? I can play Tetris on a Game Boy, why not a Browser
on a real computer?" So various solutions were found, including the
open standard Java,
( en.wikipedia.org/wiki/Java_(software_platform%29 )
which Microsoft has found ways to slow; the proprietary solution
of plug-ins, especially Flash, the grand-daddy of them all;
( en.wikipedia.org/wiki/Adobe_Flash )
and the new AJAX,
( en.wikipedia.org/wiki/Ajax_(programming%29 )
a mash-up of JavaScript -- pioneered by Netscape before it lost the
Browser War to Microsoft -- and XML, an open standard for
program-to-program communication which web server and web browser
can use. With AJAX, nearly magical effects can appear in the
browser window, such as we see in Google Maps.
( en.wikipedia.org/wiki/Google_maps )
AJAX is also the only technology of the three available on the iPhone
to date, and so Apple is leaning on the scales that way.
But I don't think they've quite made it possible to play Tetris in
AJAX yet. So this is all still a step backwards from GameBoy.
But at least it's a step forward for the internet. Tim O'Reilly,
tech book magnate, has declared Web 2.0,
( en.wikipedia.org/wiki/Web_2.0 )
the social-networking internet, in which we Facebook and MySpace
and YouTube and Twitter up our own content, and it defines us and our social
matrix, or whatever.
People keep asking me (like I'm the Computer Guy) what's up with this
"Twitter" phenomenon. Well, I tell them, consider this:
* What makes this powerful is it is limited, so you must
hit send and move on, encouraging more frequent use, but
is also universal; it aggregates web, email, text messaging
and custom PC and smartphone apps for both input and output.
* The original "killer app" that I kept mentioning was the
Army wife whose husband is deployed to Iraq or somewhere.
She tweets throughout the day about the grocery bag
ripping and the kids fighting at the doctor, and when he
gets internet access he can read them all and catch up,
and then when he finally gets to call her he feels more
connected, and can continue the conversation.
* The next "killer app" I identified was in natural disasters, such
as the 2007 wildfires here in San Diego County. Some twittering
went on there, and helped coordinate information.
* Sure enough, when there was the big 8.0 quake in the Sichuan Province
of China last year (May 12, 2008),
( en.wikipedia.org/wiki/Sichuan_Earthquake )
it appeared on Twitter a full minute before it appeared in the United
States Geological Survey (USGS) web site which is AUTOMATICALLY
UPDATED from earthquake sensors.
* When the Suns traded Shaq he found out about it on Twitter.
* Demi Moore used her Twitter account to prevent a suicide.
* Twitter delayed a network upgrade during the post-election
demonstrations in Iran to allow Iranians to keep using Twitter
as a pro-democracy tool, at the request of the Obama administration.
And so it goes. But as some blogger pointed out, it's the social networks
that are important, not the software and systems they use.
A recent on-line edition of the Onion
( www.theonion.com/content/news_briefs/twitter_creator_on_iran_i?utm_source=b-section )
had this fake news story:
Twitter Creator On Iran: 'I Never Intended For Twitter To Be Useful'
JUNE 24, 2009
SAN FRANCISCO -- Creator Jack Dorsey was shocked and saddened this
week after learning that his social networking device, Twitter, was
being used to disseminate pertinent and timely information during
the recent civil unrest in Iran. "Twitter was intended to be a way
for vacant, self-absorbed egotists to share their most banal and
idiotic thoughts with anyone pathetic enough to read them," said
a visibly confused Dorsey, claiming that Twitter is at its most
powerful when it makes an already attention-starved populace even
more needy for constant affirmation. "When I heard how Iranians were
using my beloved creation for their own means -- such as organizing
a political movement and informing the outside world of the actions
of a repressive regime -- I couldn't believe they'd ruined
something so beautiful, simple, and absolutely pointless."
Dorsey said he is already working on a new website that will be
so mind-numbingly useless that Iranians will not even be able to
figure out how to operate it.
Once again I recall that much of this territory has been covered by
"Smart Mobs: The Next Social Revolution" (2003) by Howard Rheingold.
( www.amazon.com/exec/obidos/ASIN/0738208612/hip-20 )
Another peek into the social network's dynamics is found in the novel
"Eastern Standard Tribe" (2004) by Cory Doctorow.
( www.amazon.com/exec/obidos/ASIN/0765310457/hip-20 )
And in a world where the individual strives to make a difference,
ex-MTV Vee-Jay Adam Curry -- the one with the big Norse hair that
everyone thought was an airhead --
( en.wikipedia.org/wiki/Adam_Curry )
went on to invent the PodCast, and make it into a reality and a
de facto standard.
( en.wikipedia.org/wiki/Podcast )
And in a world of fast and cheap laptops (and even Netbooks, oh my),
I expect to see more of the customizations and "case mods" predicted
in stories such as "Idoru" (1996) by William Gibson,
( www.amazon.com/exec/obidos/ASIN/0425190455/hip-20 )
and demonstrated by the Steampunks, amongst others.
( en.wikipedia.org/wiki/Steampunk )
FROM INTROSPECTION TO ALGORITHM
[After Ray spills a box of toothpicks on the floor of a diner.]
Raymond: 82, 82, 82.
Charlie: 82 what?
Raymond: Toothpicks.
Charlie: There's a lot more than 82 toothpicks, Ray.
Raymond: 246 total.
Charlie: How many?
Waitress: [Reading the box.] 250.
Charlie: Pretty close.
Waitress: There's four left in the box.
-- "Rain Man" (1988 movie)
( www.amazon.com/exec/obidos/ASIN/B0000YEEGM/hip-20 )
An old chestnut in Computer Science comes from Niklaus Wirth (it's
the title of his 1976 book):
ALGORITHMS + DATA STRUCTURES = PROGRAMS
Knuth's "Art of Computer Programming" (volumes 1-3, 1968 + etc.)
( www.amazon.com/exec/obidos/ASIN/0201485419/hip-20 )
( en.wikipedia.org/wiki/The_Art_of_Computer_Programming )
has the most efficient sort routines of their time. But no Magnum
Opus can solve all programming problems. Sooner or later the programmer
faces a New Requirement, the elusive Something They've Never Done Before,
which might even be Something No One Has Done Before. That's when a good
programmer pulls out that seldom-used tool, INTROSPECTION.
A recent series of events in my life, intersecting my interests in
math education and recreational programming, shows what I mean.
I am always on the lookout for puzzles and games to exercise my
daughter's mind (and my own, and sometimes my friends'). Towards this
end I sometimes buy, or borrow from the library, mathematical puzzle
books by classic authors such as Martin Gardner and Raymond Smullyan.
One day in a Gardner book I ran across his description of pentominoes.
( en.wikipedia.org/wiki/Pentomino )
It mentioned that a toy company used to sell the puzzles, so on a hunch
I looked on eBay and found some. I still carry them around in a little
bag along with the shrunk down chapter about them in Gardner's book
"Mathematical Puzzles and Diversions" (1959).
( www.amazon.com/exec/obidos/ASIN/B0010QA3X2/hip-20 )
My daughter never did get too interested (she went for some other puzzles
and games), but I became fascinated with the pentominoes. It typically
took me several days to solve a given challenge. Since I was constantly
starting over, how was I able to "make progress" I wondered.
I researched the puzzle on the internet, and eventually ran across an
essay by good old Knuth called "Dancing Links," which talked about
problems that require a lot of backtracking, and used pentominoes as an
example. As one pentomino fan's web site
( www.yucs.org/~gnivasch/pentomino )
described the essay:
"Dancing Links." Donald E. Knuth, Stanford University.
Available as PostScript at:
( www-cs-faculty.stanford.edu/~knuth/papers/dancing-color.ps.gz )
In this article Knuth uses the Exact Cover algorithm to solve many
additional problems, involving polyominoes, polyiamonds, polysticks,
queens on a chessboard, and more...
I learned that often the human may solve a puzzle one way, but it can be
more efficient for the machine to solve it another way. There is a basic
problem in information science: given a binary matrix (only 1s and 0s)
delete some of the rows so that the remaining matrix has exactly one 1 in
each column. For example, this matrix:
1 0 0 0 0
0 1 0 0 0
0 0 1 0 0
0 0 0 1 0
0 0 0 0 1
1 0 0 0 0
can be made to fit the criterion by deleting the top row OR the bottom row.
Efficient solvers exist for this "exact cover" problem (Knuth's
"Algorithm X," sped up by his dancing-links technique), and so if you
can MAP another problem onto this form you've solved it. Knuth showed
how to map the pentomino puzzle onto a binary matrix, and "presto" --
he'd found an efficient solution.
This hat trick was still fresh in my mind when I went on a road trip
with my friends Mike P. and Wayne H. We stopped at a country-style
restaurant called Po' Folks in Buena Park, CA,
( maps.google.com/maps?client=safari&oe=UTF-8&ie=UTF8&q=po+folks+buena+park&fb=1&split=1&gl=us&ei=lVtjSp7LA5KcswP3w63xAQ&cid=17572086991595257885&li=lmd&ll=33.852954,-117.998056&spn=0.008732,0.01354&t=h&z=16&iwloc=A )
and they had an amusement at the table called "The Original IQ Tester."
( www.pegame.com/sol_orig.html )
I worked at solving it in the restaurant, and then bought a copy of the
puzzle in the gift shop and took it home. I couldn't seem to solve one
particular variation, and decided to write a computer program to solve it.
This required that I engage in some INTROSPECTION: just how did I solve
this puzzle? Ultimately I concluded that I randomly tried legal moves.
A legal move was a "jump" like in checkers, one piece jumping another
and removing it, landing on an empty space, like so:
X X _
_ _ X
So I wrote a program that would randomly pick a jump, and do it if it
was legal. Part of the creative process of this was finding a way
to map the triangular board into a rectangular data structure that
C understands, and how to represent possible and legal moves.
Knuth's mapping inspired me to seek elegant solutions to these
mapping problems.
If you have a UNIX environment with a C compiler you should be able
to drop this self-extracting shell file into an empty directory.
( www.well.com/~abs/swdev/C/ALL.iqtester )
This process of introspection is a vital part of the programming process,
but the textbooks seldom discuss it. Programming is often taught as
if all you do is apply known algorithms to well-understood problems.
One of the reasons I went with a random algorithm for the IQ Tester was
that I didn't understand fully how to iterate through the complicated
tree of moves in the game without missing any. As a result, my program
didn't prove that one particular configuration was unsolvable. Instead
it just made it seem quite likely, since it usually found a solution in
a few minutes, and couldn't solve this one for days. But I wasn't SURE.
(Wikipedia claims it has been proved unsolvable.)
And, well, wouldn't you know, a few months later in my professional
life I encountered the same problem of iterating over a complicated
tree of options, and I realize now this is an area where I need to
improve.
Come to find out, while I wasn't paying attention, Knuth just published
Volume 4 of his "Art of Computer Programming" (2009)
( www.amazon.com/exec/obidos/ASIN/0321637135/hip-20 )
which deals, among other things, with combinatorics -- the class of
problems I need to learn to solve in the general case. Clearly I need
more study, and more introspection.
But then, don't we all. As the "Pragmatic Programmer" guys say, we
have an ongoing problem because "our abstractions leak." This is a subtle
problem that is tricky to explain. I remember one of the "dummies" books,
I think it was "Windows 3.1 For Dummies," had a quote I photocopied and
taped to my wall (I'm quoting from memory):
The problem with the Windows 3.1 File Browser is that it's like
a cross between a car and a bathtub. People are confused when they
open the car door and all the water runs out.
If you've ever used the Windows 3.1 File Browser you'll know just what
I mean; the others are probably asking, "Huh?"
Okay, I'll try again. In one report decades ago from Xerox PARC they talked
about the use of METAPHOR AND MAGIC in user interfaces. Throw something
in the recycle bin? Metaphor. Open a folder? Metaphor. Change your
wallpaper? Metaphor. Double-click a "Microsoft Word" document and
bring up Word? Magic. Install software with a "Wizard" program? Magic.
Uninstall a driver and reinstall it to fix a bug with your sound card?
High level wizard magic.
When we talk about making a program or web site "training-free" it usually
means using the right metaphor, and reducing the magic to as little as
But sooner or later we want to do something that goes beyond the metaphor,
and that's when they "leak." We have to keep re-thinking what we're
doing as requirements change.
WHAT'S THE CONNECTION?
"A tool is something with a use on one end and a grasp on the other."
-- Stewart Brand, 1974
quoted in "Cybernetics of Cybernetics"
edited by Heinz von Foerster
( www.amazon.com/exec/obidos/ASIN/0964704412/hip-20 )
So here's the BIG WRAP-UP: I promised I'd relate Software Architecture
back to Architecture. Well, they both can ruin your day in the 21st Century.
I have in this century worked in buildings that got too hot in the summer
due to stupid architecture and unintended "Greenhouse Effect," and also worked
with stupid software tools that violated the most fundamental principles
of software engineering, like easy source code control, compilation
management and code generation. Aargghh! But we need to look deeper.
As I mentioned before, architectural historian Christopher Alexander
wrote his two-volume set on "A Pattern Language" for "A Timeless Way
of Building" and the architectural world yawned. What fun was it to
use the old wheel instead of re-inventing a new one?
In my semi-random explorations into Alexander I've found delightfully
obvious wisdom such as every room should have light coming from at
least two different directions, and you need lots of storage out of the
way unseen by guests for things like skis and BBQs off-season, etc.
But in the software world a group inspired by Alexander began to promote
the use of "software design patterns." Their book
( www.amazon.com/exec/obidos/ASIN/0201633612/hip-20 )
is described by Wikipedia:
( en.wikipedia.org/wiki/Object-oriented )
in the "Object-Oriented" article:
"Design Patterns: Elements of Reusable Object-Oriented Software" is
an influential book published in 1995 by Erich Gamma, Richard Helm,
Ralph Johnson, and John Vlissides, sometimes casually called the
"Gang of Four". Along with exploring the capabilities and pitfalls
of object-oriented programming, it describes 23 common programming
problems and patterns for solving them. As of April 2007, the book
was in its 36th printing. Typical design patterns are as follows:
Creational patterns (5): Factory Pattern, Abstract Factory
Pattern, Singleton Pattern, Builder Pattern, Prototype Pattern
Structural patterns (7): Adapter Pattern, Bridge Pattern,
Composite Pattern, Decorator Pattern, Facade Pattern, Flyweight
Pattern, Proxy Pattern
Behavioral patterns (11): Chain of Responsibility Pattern,
Command Pattern, Interpreter Pattern, Iterator Pattern,
Mediator Pattern, Memento Pattern, Observer Pattern, State
pattern, Strategy Pattern, Template Pattern, Visitor Pattern
Well, this stuff all sounds great, and I wish I had more time to study it,
and I'm sure anything that gets people talking about their design
decisions early on has got to be a good thing, but of course, by the
Brooks Paradox, any time I've seen a really MESSED UP software
development project and asked the project manager if they've even
HEARD OF software design patterns, the answer is... any guesses?
Why is it always the same hands? Of course, no, they've never heard of them.
Some of the best design advice I've ever read anywhere was in the best
book on architecture I ever read, "How Buildings Learn: What Happens
After They're Built" (book, 1994) by Stewart Brand.
( www.amazon.com/exec/obidos/ASIN/0140139966/hip-20 )
He advocates "scenario planning" (what we in software sometimes call
"use case scenarios") in the design of buildings. What are the best
case (A), worst case (F), and in-between (B-D) scenarios for the future
of this business for the next thirty years? Brand suggests you design
to at least passably adapt to ALL of them.
Likewise, we programmers can benefit, and our customers can benefit,
from telling each other stories about typical customer uses, and
documenting that conversation. It's that simple. What do users
need to do hourly? Daily? Weekly? Monthly? Quarterly? In a
disaster? In an audit? Etc. This comes before the Design Spec (which
is a bunch of screen shots from marketing) and the Functional Spec
(which is a software architecture from engineering), and it's in a
language everyone can understand.
In my favorite all-time Dilbert cartoon, Dilbert tells his customer
"As soon as you give me your requirements I can start designing," and
the customer says "Why don't you just build something and then I'll
tell my management it doesn't meet my requirements." Dilbert responds,
"You know, you're going to have to do some actual work on this project."
The user gets all bug-eyed and says "That's crazy talk!"
But it's what's needed. Programmers need to stop refusing to do the
back-filling jobs that are boring (though that's actually what interns
can be used for if you can get them), and users need to start working harder
on articulating their use-case scenarios, while managers need to stop
trying to shoe-horn in "magic bullets." We all need to raise our game.
(And building architects could start listening to their clients more too.)
Privacy Promise: Your email address will never be sold or given to
others. You will receive only the e-Zine C3M from me, Alan Scrivener,
at most once per month. It may contain commercial offers from me.
To cancel the e-Zine send the subject line "unsubscribe" to me.
I receive a commission on everything you purchase from Amazon.com after
following one of my links, which helps to support my research.
Copyright 2009 by Alan B. Scrivener