Both sides - privacy advocates and law enforcement officials - are absolutely correct.
On one hand, the rapid extension of massive data-gathering and databasing techniques to new fields means that more and more facts about your life are becoming available to the proper authorities, to any hacker that takes an interest in you, and often to anyone who wants to pay for a list with your name on it. Depending on the country in which you live, these facts may include your sexual and lifestyle preferences, medical and pharmaceutical records, tax records, psychological records, educational evaluations, and even private email. The outlines of your biography - whether you are a left-handed liberal in litigation or a prominent prelate with a Prozac problem - are becoming more available to the powerful, the media, and sometimes the merely curious.
On the other hand, it's getting a lot easier to keep a secret, if you want to and if you know how. Robust new tools, and the very nature of some new technologies, are making it easier to keep any particular secret away from prying eyes, whether it's a criminal transaction, a declaration of illicit love, or industrial secrets filched from your employer. This is great news if you're a dissident in an authoritarian country or an international drug kingpin. It's terrible news if you're a reporter hoping for the next Charles-and-Camilla-type indiscretion, if you're a three-letter spook (FBI, MI5, CIA, NSA, DIA) or a local cop, or if you're just a snoop who likes to tune in to private conversations on his radio scanner.
The rules of privacy are being massively re-written around the world. The hand that finally writes those rules will play a major part in sculpting our future.
Cellular phones (which are just small, high-frequency radios) have been a regular feeding trough for private detectives, reporters, and nosy neighbors alike, as Prince Charles learned to his chagrin. Cheap scanners from Radio Shack, which simply search all available frequencies for signals strong enough to read, could pick up most traffic - old American TVs with UHF channels could even be hacked to listen in to adulterous cooings and legal haggles. But new digital standards for cellular telephony, designed mainly to cram more phone calls into the available bandwidth, will make that impossible. The most sophisticated, CDMA ("code division multiple access"), spreads the signal over 12.5 megahertz of frequencies. Originally devised for battlefield use, to allow an isolated unit to communicate with headquarters without giving away its position, the CDMA signal cannot be discovered even as background hiss by a traditional receiver. Sophisticated test equipment could pick the signal out of the air and decode it - given several hours on a Cray supercomputer to search the roughly 40 billion possible code combinations. Again, the most effective listening device is a court order delivered to the cellular company, which turns the talk into a standard analog signal in the process of handling the call.
Make that two points for privacy.
But hardware and software keep getting cheaper and easier to use. Want a gigabyte of memory (a thousand million bytes)? That'll cost about US$1400, mail order. As massive databases (e.g. all HIV-positive people in California, all members of a certain organization, everybody on the Internet interested in Jupiter, porn, or adoption) become easy to gather and cheap to manage, privacy problems pop up in the strangest contexts. In British Columbia earlier this year, pharmacists were all set to start up Pharmanet, a system that would give any pharmacist access to your complete pharmaceutical record. To fill a routine prescription for antibiotics for, say, an ear infection, you would have to give your access code to the pharmacist (and to anyone else within hearing who could get to the pharmacy's computer). In many small towns, the pharmacist (or the clerk) might be your neighbor, your employer's golfing buddy, or on a church committee with your mother. And they would be able to see that you were HIV positive, had been treated for depression, or had a secret abortion - even if you had taken care to have it in another town, away from curious eyes. As Kelly Bert Manning of the Victoria, B.C. "FreeNet" computer conference observed, "A password that has been given to another person is useless for all practical purposes." British Columbia's powerful "privacy czar," Information and Privacy Commissioner David Flaherty, raised objections to the plan, and it was withdrawn for a re-design.
Such concerns can trip up the most legitimate research. In medicine, for instance, sophisticated new programs now make it possible to track every medical case as it happens in hospitals, clinics, and doctors' offices, turning entire national health systems into vast laboratories on the effectiveness of every drug, every therapy, and every treatment plan - most of which have never been rigorously studied in the field. In a non-smoking male over 60 with pleurisy, should the dosage of drug x be 5 cc every four hours or 10 cc a day? How often should we change the filters in heart-lung machines? What is really more effective for dealing with clogged heart arteries: coronary artery bypass grafts or programs of diet and exercise? Under what conditions? To find the answer, researchers will be able to query the database-as-big-as-the-world, the scores of millions of real cases gathered in the "outcomes research" systems that are rapidly becoming common in the United States. But in Europe, similar efforts, and even basic epidemiological research, are running afoul of the rapid growth and redefinition of privacy laws in the European Community and its member countries. (New Scientist, 11 December 1993)
The big concern is the old one: the ability of computers and nets to amass large amounts of personal information and analyze it. Use a credit card in a store that sells, say, sexually outré merchandise - and now there is a computer record of the fact that you like that kind of stuff. Commercial databases are becoming more and more refined, cross-referencing credit-card purchases, mailing codes, telephone prefixes, club memberships, and magazine subscriptions to divide hundreds of millions of names into finer and finer groups. If you want to send a political letter, or an ad, only to former Tories over 60 with Macintosh computers, you can do it.
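The mechanics behind that kind of targeting are nothing exotic. Stripped of the marketing gloss, a cross-reference is just a join and a filter; here is a toy sketch, with every name, field, and record invented purely for illustration:

```python
# Toy sketch of commercial cross-referencing: combine records from different
# sources keyed on a name, then filter down to the target group. All field
# names and records here are made up for illustration.
subscriptions = {"A. Smith": ["MacUser"], "B. Jones": ["The Economist"]}
party_rolls   = {"A. Smith": {"party": "Conservative", "age": 64},
                 "B. Jones": {"party": "Labour", "age": 45}}
purchases     = {"A. Smith": ["Macintosh LC"], "B. Jones": ["tea"]}

targets = [
    name
    for name, record in party_rolls.items()
    if record["party"] == "Conservative"
    and record["age"] > 60
    and any("Mac" in item
            for item in purchases.get(name, []) + subscriptions.get(name, []))
]
print(targets)   # ['A. Smith'] - a mailing list of one, ready for the ad
```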
Of course, there are plenty of more questionable ways to build massive databases. It is trivially easy to assemble huge lists of people with particular interests - if they express that interest online. Simply cruise the Internet, BitNet, UseNet, Compuserve, Prodigy, America OnLine, EUNet, and other networks, and capture the login IDs (the online identifiers, such as bbear@well.com) of everyone who posts in conferences that have to do with the subject. You can then have a program "finger" them: the program, on a computer connected to the Internet, would take each login ID and type, for instance, "finger bbear@well.com." (The "bbear" part is me; the rest is an address.) The computer at the system where I have my account, the Well ("well.com") in Sausalito, California, would return my ".plan" file, my basic identification file. This almost always includes the user's real name ("Joe Flower"), and often other information, including their phone number and address.
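For the curious, the finger protocol itself is almost embarrassingly simple: open a connection to port 79 on the target machine, send the login name, and read back whatever the remote system volunteers. The sketch below is not anyone's actual harvesting tool - just an illustration of how little code the job takes; the login IDs fed to it are examples.

```python
# Minimal sketch of an automated "finger" sweep. The finger protocol is a
# plain TCP conversation on port 79: send the user name and a CRLF, then
# read whatever the remote host returns (real name, often phone and address).
import socket

def finger(user, host, port=79, timeout=10):
    """Return the raw text the remote finger daemon sends back."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(user.encode("ascii") + b"\r\n")   # the entire "query"
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("ascii", errors="replace")

if __name__ == "__main__":
    # login IDs harvested from newsgroup postings (example list)
    for login_id in ["bbear@well.com"]:
        user, host = login_id.split("@")
        try:
            print(finger(user, host))
        except OSError as err:
            print(f"{login_id}: no answer ({err})")
```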
Hackers go a step further, sometimes many steps. For instance, if I were in San Antonio, Texas, and wanted to visit with my friends on the Well in Sausalito, I could dial a local phone number to connect to another computer on the Internet, type "telnet well," and be connected through the Internet to the Well. But I would have to send my password, and it would ricochet, unencrypted, through the Internet in a complex path that might include dozens of separate Internet nodes (the computers that make up the network). If someone has hacked one of those nodes (managed to sneak past its protections) and set up a "sniffer" program, they can capture that password. Once a "sniffer" has taken up residence in a hacked Internet node, it can examine the address header of each message that passes through that node, copy the login ID, search the message for anything that looks like a password, and send the lists of login IDs and passwords back to the hacker.
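What the sniffer does with the traffic it sees is equally unglamorous. The toy sketch below assumes the captured telnet session is already sitting in memory as raw bytes (a real sniffer pulls packets off the network interface itself), and the prompt strings it searches for are assumptions - but the logic is the whole trick: find a login prompt, grab what follows, do the same for the password.

```python
# Toy sketch of the password-harvesting logic a "sniffer" applies once it can
# see traffic passing through a compromised node. A real sniffer reads raw
# packets off the wire; here the captured, unencrypted telnet session is
# assumed to be available as plain bytes, and the prompt strings are assumptions.
import re

LOGIN_PROMPT = re.compile(rb"login:\s*(\S+)", re.IGNORECASE)
PASS_PROMPT = re.compile(rb"password:\s*(\S+)", re.IGNORECASE)

def harvest(session_bytes):
    """Return (login, password) pairs found in a captured session."""
    logins = LOGIN_PROMPT.findall(session_bytes)
    passwords = PASS_PROMPT.findall(session_bytes)
    return list(zip(logins, passwords))

# Example: a fragment of a captured, unencrypted telnet login.
captured = b"well.com login: bbear\r\nPassword: hunter2\r\n$ mail\r\n"
for user, password in harvest(captured):
    print(user.decode(), password.decode())   # what gets mailed back to the hacker
```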
By capturing system administrators' passwords, hackers with "sniffers" can grab the keys to whole systems - often the next site down the line, where the first thing they will do is install another "sniffer." If you're after a particular person, just install a sniffer that ignores every password except the one attached to a particular ID. The Computer Emergency Response Team, a quasi-governmental U.S. organization attached to the Internet, issued a net-wide warning about sniffers in February of this year [1994]. Computer security experts guess that over a million Internet passwords were stolen in the first six months of this year alone. According to Peter Neumann of SRI International in Palo Alto, California, "There are probably no secure systems on the Internet."
Other sniffers can track everyone who reads particular newsgroups or conferences, even if they never post to them. If you ever glanced at alt.sex.bondage, you could be on a list. The sniffer-planting hackers might be mischievous pranksters. Or industrial spies seeking another company's secrets. Or a government agent. Or an employee of your ex-husband's attorney.
For some, the most bothersome privacy boundary is closer to home. Many employees treat company email the way they have always treated the phone, as a relatively private communication that's good not only for business but for gossip, romance, skullduggery, and even talking to your next employer. But email leaves a trail that your boss can read - or that a hacker on your local system can crack.
Many employers, arguing that they own the computers, the lines that connect them, and the employees' time, capture every email transaction and look at the messages whenever they suspect a problem.
Commercial systems are not immune to email pilferage. The Well, an independent computer conferencing system sometimes considered the Greenwich Village of cyberspace, was recently plunged into controversy when it was revealed that, by changing one line in one file, any of the system's conference hosts could capture the entire email queue of anyone who visited that conference or (for instance) sought a file in the conference's library. The Well has changed its policies to close this loophole. Few of the Well's several hundred volunteer hosts knew of the security hole, and none had taken advantage of it, but the incident illustrates the vulnerability of even well-established systems.
The cybersphere holds its own anonymous and secret paths. Want to send someone anonymous email? Or participate in an online sex fantasy group without danger of revealing your identity? Simply send it out on the Internet or Usenet through an "anonymous remailer" such as the famous anon.penet.fi in Finland, or the Cypherpunk remailer network home-paged at http://monet.vis.colostate.edu/~nate/mailer.html. Each anonymous remailer is a computer server set up for a very simple task: it takes each email message that comes in, strips off its header (its "return address") and sends it on its way.
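The heart of a remailer is simple enough to sketch in a few lines. The sketch below assumes Python's standard mail-handling modules, a simple single-part message, and the "X-Anon-To" addressing convention; the host names are invented.

```python
# Minimal sketch of what an anonymous remailer does with each incoming
# message: throw away everything that identifies the sender, then forward
# the body to wherever the writer asked it to go. Host names are invented,
# and a single-part message is assumed.
import smtplib
from email import message_from_string
from email.message import EmailMessage

REMAILER_ADDR = "nobody@remailer.example"          # hypothetical remailer address

def remail(raw_message, next_hop="smtp.example"):
    incoming = message_from_string(raw_message)
    outgoing = EmailMessage()
    outgoing["From"] = REMAILER_ADDR               # the sender's identity is gone
    outgoing["To"] = incoming["X-Anon-To"]         # where the writer wants it sent
    outgoing["Subject"] = incoming.get("Subject", "")
    outgoing.set_content(incoming.get_payload())   # the body passes through untouched
    with smtplib.SMTP(next_hop) as server:
        server.send_message(outgoing)
```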
For computer communications, virtually uncrackable "public key" encryption has put into the hands of the average citizen a method of cryptography far beyond the powers of the CIA at the height of the Cold War. It's cheap and easy. In fact, it's free: anyone with a personal computer, a modem, and Internet access can download powerful, easy-to-use cryptography programs such as PGP ("Pretty Good Privacy") and RIPEM. Indeed, this reporter just did it. Here is a sentence of the notes for this story, rendered by RIPEM in my own personalized code: "tpN7y6AqOtoLa+WUz67CgEq144I/j9hWAiFe8fpaPYH4OzReOhiU1XOb7oJswhWAvyCbRruXgEISWJ8Qb1/LXxl9+tR5BM4j/YOx6dHiiYw+KWi6Pt93tayMAakRl/2"
If someone wants to communicate with me in a way that only I can read, they can send me the message in code. In classic cryptography, I have to have the key to decode the message - the same series of letters and numbers, and the same mathematical operation, that turned the message into code will now turn the code back into a message. Anyone who has the key can decode the message. But somehow the sender has to get the key to me, and on the way it can be stolen or copied - or, with enough computing power, it can be deduced from the encoded message.
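A toy example makes the weakness plain: in a single-key cipher, the very thing that lets me scramble the message lets anyone who obtains it unscramble the message. (The cipher here is a simple XOR, chosen for brevity, not strength.)

```python
# Toy single-key ("classic") cipher: the same key and the same operation
# (XOR, here) both scramble and unscramble the message. Whoever gets hold
# of the key can read everything.
key = b"SECRETKEYSECRETKEYSECR"      # must somehow reach the recipient intact
msg = b"meet me at the harbour"

encoded = bytes(m ^ k for m, k in zip(msg, key))
decoded = bytes(c ^ k for c, k in zip(encoded, key))
assert decoded == msg
```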
"Public key" codes, in contrast, have two keys. Your "public key" is for encoding a message to you. You decode the message by using an unrelated "private key." It doesn't matter if someone steals your public key (in fact, many people publish theirs on the Internet). The private key, which decrypts the message, cannot be discovered from the message itself or from the "public" key, no matter how great the computer resources applied. And that key never has to go anywhere. It stays in your computer.
Still, that leaves you vulnerable, because you can be identified as someone who is sending and receiving obviously encrypted messages. A court order - or a little coercion - could get you to reveal your private key.
The newest methods get around even that problem. Suppose I send you a Michael Jackson tape, or tell you about a piece of music you might want to download from my computer. Or a picture - say, a snapshot from my vacation in Spain. You download the music and play it, or look at the pictures. Everything seems completely normal. But if you know that you are looking for something, and if I have used your public key, you can use your private key to extract, out of those millions of bytes of digitized information, my hidden message - text, pictures, sound, anything from love notes to complex financial transactions. No one could ever find it, or even suspect it, by looking at the picture, hearing the music, or even examining the CD or tape that I have put it into. Even someone who knew the technique and analyzed the data stream, cryptographic experts believe, would find no pattern distinct enough for the message to be detected, let alone decrypted. Such "steganography," hiding messages in pictures or text, is a very old technique. In the era of computers, it has become cheap, easy, and undetectable. Like other encryption software, "stego" software is available for free on the Internet.
Ordinarily, if I wanted to compress a photo of Harrison Ford to send it over the Internet, I would use a program called JPEG. To squeeze a love letter (let's call it "kissies.doc") into the photo, I would hack the JPEG program, find the command line that said "cjpeg," add "-steg kissies.doc," and then run the program. JPEG runs in two stages. After the first stage, "steg" would compress my love letter and substitute it for certain of the hundreds of thousands of bits of information used to compress Harrison. At the other end, my sweetheart, armed with a similar addition to the "djpeg" line of her program, would receive my love letter. Anyone else would just get a picture of Harrison Ford. But she would have to choose: taking the message out will destroy the picture.
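The jsteg patch works inside the guts of JPEG compression, but the flavor of the trick can be shown with a cruder cousin: overwrite the least significant bit of each byte of raw image data with one bit of the hidden message. The sketch below is that cruder variant - not the jsteg method just described - and the pixel data is a stand-in.

```python
# Crude bit-substitution steganography (not the jsteg/JPEG method): replace
# the least significant bit of each byte of image data with one bit of the
# secret. To the eye the picture is unchanged; knowing the technique (and
# the message length) lets the recipient read it back out.
def hide(pixels: bytearray, secret: bytes) -> bytearray:
    bits = [(byte >> i) & 1 for byte in secret for i in range(7, -1, -1)]
    assert len(bits) <= len(pixels), "cover image too small"
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit          # overwrite the lowest bit only
    return out

def reveal(pixels: bytes, length: int) -> bytes:
    bits = [pixels[i] & 1 for i in range(length * 8)]
    return bytes(
        sum(bit << (7 - j) for j, bit in enumerate(bits[k:k + 8]))
        for k in range(0, len(bits), 8)
    )

cover = bytearray(range(256)) * 10              # stand-in for raw pixel bytes
stego = hide(cover, b"kissies")
assert reveal(stego, len(b"kissies")) == b"kissies"
```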
Strong, cheap, widely available encryption systems make possible something else: digital signatures. This is the other half of privacy: if I receive a message from you, how do I know you sent it, rather than some hacker who snatched your password out of the Internet stream? A digital signature is a piece of code that could only have come from your private key: given this particular message, only that key could have produced this particular "hash," but anyone can confirm that the hash is correct. With such signatures, it would be possible to set up a banking system in cyberspace, with email "checks," complete with code that verifies whether or not they have been "cashed." But for fans of privacy, the anonymous form of cyberspace money, digital cash, is even more interesting: digital checks sent through a third party that simply matches order codes with the checks, stripped of any personal identifiers. Just like paper cash, digital cash would allow anonymous transactions: you can sell me something and have no idea who I am or where I live. An alternative economy could grow in cyberspace, independent of national governments, bypassing tax authorities and reporting requirements, hidden from customs officers and bank regulators, with trillions of "cyberdollars" flowing back and forth disguised as Metallica recordings or photos of someone's dog.
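A toy sketch shows the shape of the idea: the signature is the message's hash transformed with the signer's private key, and anyone holding the matching public key can reverse the transformation and compare. The key pair below uses absurdly small primes for readability; real schemes add padding and far larger keys.

```python
# Toy RSA-style signature: the signature is the message hash raised to the
# PRIVATE exponent; verification needs only the public exponent. The primes
# are absurdly small for readability; real schemes use padding and huge keys.
import hashlib

p, q = 61, 53
n = p * q                             # 3233, public modulus
e = 17                                # public exponent
d = pow(e, -1, (p - 1) * (q - 1))     # 2753, private exponent

def sign(message: bytes) -> int:
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)                   # only the private key can do this

def verify(message: bytes, signature: int) -> bool:
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest      # anyone can check it

check = b"Pay J. Flower 100 cyberdollars"
sig = sign(check)
print(verify(check, sig))                                     # True
print(verify(b"Pay J. Flower 1,000,000 cyberdollars", sig))   # almost certainly False
```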
No one knows how likely this is, but the forces that could make it happen are powerful: for both legal and illegal reasons, it would be very useful for one individual to be able to transfer money directly to another through cyberspace. No new technology would be needed. All that is needed is the organization - someone actually doing it. And, like all forms of money, it requires some history - money only has value because people believe it does.
Private cryptography makes this headache much worse, and governments have eagerly sought to limit it. Earlier this year the Dutch government attempted simply to ban it outright, making criminals out of anyone who had downloaded PGP or similar systems. It withdrew that proposal in July, but other governments, including that of France, now require that all cryptography be licensed, that the licensee have a reason for using it that the government will consider legitimate, and that a government department hold a copy of the private key.
The U.S. government took a different tack: cryptography is fine, it decided, as long as the government held the keys to a "trapdoor" in every system. It sponsored the "Clipper" computer chip as an industry standard for telephone cryptography (and its cousin, Capstone, for computers). The Clipper would encrypt the phone conversation, all right, but it would include in it a separate encrypted data stream called a "Law Enforcement Access Field" (LEAF), a personal identification number for that phone. Legal snoops could tap the phone and discover the LEAF. Armed with that, they could get access to the key to the code, which would be held in two pieces in two separate government departments. Privacy advocates raised their eyebrows to their hairlines about the human element that this introduced into a technological fix. Electronic industry watchers predicted that the attempt would simply force a lot of cryptographic phone business out of the United States - few foreign governments or companies would be interested in a technology to which the U.S. government held the keys. Yet it seemed like a done deal. After all, industry experts had tried and failed to crack the Clipper, and had declared it "safe." But recently 33-year-old Matthew Blaze, a Bell Labs crypto whiz, started fooling around with Capstone, which generates a LEAF of its own; Blaze worked with it in the form of a Tessera card. He didn't try to crack the encryption. He tried to jam the LEAF. He succeeded. It took him 28 minutes on a common workstation to render the "trapdoor" unusable.
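Why is a trapdoor field jammable at all? In Blaze's published analysis the LEAF turns out to be protected by only a 16-bit checksum, so a forged field has a one-in-65,536 chance of being accepted on any given try. The sketch below substitutes a made-up check function for the real, classified LEAF internals; it only shows how quickly such odds fall to brute force.

```python
# Illustration of brute-forcing a 16-bit integrity check. The check function
# here is a stand-in (the real LEAF checksum is classified); with only 65,536
# possible values, random candidates pass after about 65,536 tries on average.
import itertools
import os
import zlib

def fake_checksum(field: bytes) -> int:
    """Stand-in 16-bit check - NOT the real Skipjack/LEAF algorithm."""
    return zlib.crc32(field) & 0xFFFF

target = fake_checksum(b"a legitimate LEAF would go here")

for attempts in itertools.count(1):
    bogus = os.urandom(16)                  # random garbage in place of a real LEAF
    if fake_checksum(bogus) == target:
        print(f"bogus field accepted after {attempts:,} tries")
        break
```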
Publicly, AT&T and the NSA pooh-poohed the news, saying that no one would want to wait 28 minutes to make a phone call or send a piece of email. Besides, they said, this was the Capstone chip, not the Clipper chip. Privately, though, one NSA spook admitted to Brock Meeks for his online gadfly publication CyberWire Dispatch that the government knew all along that the idea was flawed.
The Net soon came alive with hackers willing to opine that it wouldn't take anything like 28 minutes with a chip designed to defeat the Capstone, and that Capstone was a Clipper clone. Said Net pundit Jeanne DeVoto: "This is just arm-waving. Clipper is dead. It has gone the way of the dodo."
By late July, in fact, the National Security Agency and the National Institute of Standards and Technology were seeking an alternative to Clipper - yet they were still dedicated to the idea of developing "wiretap-ready" technology.
In response to concerns about the privacy of new technologies, the Electronic Frontier Foundation has sprung up in the United States, and kindred groups have appeared in Australia, Canada, Ireland, Italy, Japan, and Norway. Just within the past year, these nascent organizations, involving sometimes a single person and sometimes hundreds, have begun tracking problems in their own countries, lobbying for and against new legislation, and bringing press attention to this strange new world of online privacy.
This dance between the law and technology will not end anytime soon. Thousands of eager "cypherpunks" explore each new technological twist, while the price of computing power, memory, and bandwidth continues to free-fall, and lumbering government departments try desperately to get back the technical power to listen in on private conversations. The long war between the whisper and the ear goes on. For the moment, the whisper is winning, but the ear is gaining fast.