Message-ID: <010302Z03101993@anon.penet.fi>
Path: ncar!asuvax!cs.utexas.edu!uunet!pipex!sunic!trane.uninett.no!news.eunet.no!nuug!news.eunet.fi!anon.penet.fi
Newsgroups: talk.politics.crypto
From: an12070@anon.penet.fi (S. Boxx)
X-Anonymously-To: talk.politics.crypto
Organization: Anonymous contact service
Reply-To: an12070@anon.penet.fi
Date: Sun, 3 Oct 1993 00:54:16 UTC
Subject: Zimmermann's PGP: "A Cure for the Common Code"
Lines: 555

Denver Westword, Vol. 17 Number 5, Sept. 29 1993

Cover Story: Secrets Agent
The Government wants to break him, but Boulder's prince of privacy remains cryptic.

Contents: A Cure for the Common Code, p.12
Worried about your privacy? Your secret is safe with this guy.

by ERIC DEXHEIMER

This post brought to you by
Information Liberation Front (ILF)
Cyberspatial Reality Advancement Movement (CRAM)
BlackNet
cypherpunks

===

Late last month, much to the satisfaction of sheriff's deputies in Sacramento County, California, William Steen began serving 68 months in prison for trafficking in child pornography over computers and then attempting to hire a man to kill one of the teenagers who had testified against him.

Detectives who worked on the case say the sentence represents an almost entirely gratifying end to the two-year-old effort to track down and convict Steen. The prosecution was not quite perfect, though. Police were unable to nail any of Steen's network of child porn associates, which officials suspect was extensive. Neither were Sacramento County law enforcement officers-- nor outside computer experts, for that matter-- able to read Steen's computer diary, which police think may contain the names of his other teenage victims.

The reason is that Steen, of Santa Clara, California, had installed a powerful code on his computer to electronically scramble what he had written. Although experts were quickly able to determine the name of the encoding program-- called Pretty Good Privacy, or PGP-- efforts to break it failed miserably. "The task was given to us to decrypt this stuff," recalls William Sternow, a California computer-crime expert called in on the case. "And to this day we have not been able to do it."

Sternow and the other experts-- including the Los Angeles Police Department, which tried to dismantle PGP as well-- probably shouldn't hold their breaths waiting for a breakthrough. It is unlikely that they will crack Steen's diaries anytime soon, probably not in their lifetimes. Forget your cereal-box decoder rings. Pretty Good Privacy, a computer program designed by a short, slightly round Boulder programmer named Philip Zimmermann, is, as far as the current technology is concerned, about as accessible as Fort Knox.

While PGP has frustrated the California cops, it has done wonders for its inventor's reputation among a thriving underground network of electronic cowboys. In the two years since he published Pretty Good Privacy, the program has propelled Zimmermann from a struggling Colorado software author missing mortgage payments to something of a folk hero among hackers, both in the U.S. and across the world, where the program has been translated into nearly a dozen languages. "I can go anywhere in Europe," boasts Zimmermann, "and not have to buy lunch."

Not everyone wants to feed Phil Zimmermann. Count among his enemies the U.S. Customs Service, which is investigating him for violating export laws. Add RSA Data Security, a Redwood City, California, company that says it is considering taking him to court for swiping its encoding technology.
And of course, top off the list with any number of frustrated law enforcement agencies, from the supersecret National Security Agency (NSA) all the way down to the Sacramento sheriff's department.

"Phil Zimmermann? He's a dirtbag," spits out Brian Kennedy, the detective who headed up the Steen investigation. "He's an irresponsible person who takes credit for his invention without taking responsibility for its effect. He's protected people who are preying on children. I hope that someday he'll get what he deserves."

===

What Phil Zimmermann deserves more than anything this gray morning is a few more hours of sleep. "I was up until four this morning working on the computer," he grumbles with not-very-well-disguised irritation. "Give me 45 minutes to become human."

One hour later, this is what Phil Zimmermann looks like, human: a short guy, a little paunchy. He wears large aviator glasses, a heavy beard and an easy elfin grin. Today he is also wearing beige pants, a green shirt, and blue Etonic sneakers. Although separately none of the parts looks askew, for some reason the package still looks rumpled.

His living room feels small and is crammed with books, a respectable percentage of which are bona fide, Noam Chomsky-certified leftist tracts. The back room of the north Boulder house serves as Zimmermann's computer lab. Three machines are on-line. Outside light is denied entrance by shaded windows. Books and magazines-- _The_ _Journal_ _of_ _Cryptology_-- carpet the floor in no discernible order.

In the southwest corner of the room lies a small mattress, where for the past several days a Toronto college student has slept. The student, whose name is Colin Plumb, learned about the Boulder programmer about a year ago after plucking PGP off a computer network. He composed a letter to Zimmermann expressing admiration for the encrypting software, one of the thousands of pieces of fan mail that have poured into Zimmermann's mailbox and computer since June 1991, when PGP was first published. Now Plumb is here for two weeks as a volunteer assistant, helping Zimmermann update Pretty Good Privacy.

He is not the first admirer to make the hajj to Boulder. "I get people here all the time," says Zimmermann. "A month ago I got a visit from a guy from Brazil. He used PGP back in Rio de Janeiro, and he was touring the country and he wanted to meet the guy who invented it."

Zimmermann continues: "I get mail from people in the Eastern Bloc saying how much they appreciate PGP-- you know, 'Thanks for doing it.' When I'm talking to Americans about this, a lot of them don't understand why I'd be so paranoid about the government. But people in police states, you don't have to explain it to them. They already get it. And they don't understand why we don't."

What we don't understand, at least according to an explanation of Pretty Good Privacy that accompanies the software, is this: "You may be planning a political campaign, discussing your taxes, or having an illicit affair. Or you may be doing something that you feel shouldn't be illegal, but is. Whatever it is, you don't want your private electronic mail or confidential documents read by anyone else. There's nothing wrong with asserting your privacy. Privacy is as apple-pie as the Constitution."

Simple stuff. But Zimmermann and PGP have done more than provide an electronic cloak for the steamy computer messages of a few straying husbands.
In fact, the publication of Pretty Good Privacy has probably done more than any other single event to shove the arcane-- and, until recently, almost exclusively government-controlled-- science and art of cryptology into the public consciousness.

Much of that is inevitable. The explosion of electronic mail and other computer messaging systems begs a megabyte of privacy questions. While a 1986 federal law prevents people from snooping into computer mail without legal authorization, the fact remains that electronic eavesdropping is relatively simple to do. To an experienced hacker, unprotected computer communications are like so many postcards, free for the reading. Encryption systems simply put those postcards inside secure electronic envelopes.

This may sound innocuous. But it is highly distressing to those branches of the government that say they occasionally need to listen in to what citizens are saying. In recent public debates in Congress and in private meetings, representatives of the FBI and the NSA have argued vigorously that they need high-tech tools to provide for the public and national security. They contend that this includes the capability to read any and all encoded messages that whip across the ether. To these computocops, widely available encryption in general-- and specifically, PGP-- is dangerous. "PGP," warns Dorothy Denning, a Georgetown University professor who has worked closely with the National Security Agency, "could potentially become a widespread problem."

To those who increasingly rely on the swelling network of computer superhighways to send, receive, and store everything from business memos to medical records to political mailing lists, however, the idea of a CIA spook or sheriff's department flunky listening in to their conversations and peeking at their mail is chilling. They fear that without basic privacy protection, the promise of the Information Age also carries with it the unprecedented threat of an electronic Big Brother more powerful than anything ever imagined by George Orwell.

===

When Phil Zimmermann moved to Boulder from Florida in 1978, he had every intention of earning a master's degree in computer science. Instead he went to work for a local software company. And he began fighting the good fight against big bombs.

"In the early 1980s it looked like things were going to go badly," he recalls. "There was talk of the Evil Empire. Reagan was going berserk with the military budget. Things looked pretty hopeless. So my wife and I began preparing to move to New Zealand. By 1982 we had our passports and traveling papers. That year, though, the national nuclear freeze campaign had their conference in Denver. We attended, and by the time the conference was over we'd decided to stay and fight."

He attended meetings. He gave speeches. He marched on nuclear test sites in Nevada. ("I've been in jail with Carl Sagan and Daniel Ellsberg," he says. "Daniel Ellsberg twice.") He taught a course out of the Boulder Teachers' Catalogue called "Get Smart on the Arms Race." ("The class is not anti-U.S.; it is anti-war," a course summary in the 1986 catalogue explains.)

In the snatches of free time between nuke battles, Zimmermann continued feeding a lifelong fascination with secret codes. "I've always been interested in cryptology, ever since I was a kid," he says. "I read _Codes_ _and_ _Secret_ _Writings_ by Herbert Zim, which showed you how to make invisible ink out of lemon juice. It was pretty cool."
"When I got to college I discovered that you could use computers to encode things. I started writing codes, and I thought they were so cool and impossible to break. I know they were trivial and extremely easy to break." For Zimmermann, who is 39 years old, writing and breaking codes had always been just a hobby, albeit an increasingly intensive one. Up until 1976, that is, when his hobby became an obsession that would absorb the next fifteen years of his life. That's because, like everyone else who had been dabbling in encryption at the time, Phil Zimmermann was swept away by the revolutionary concept of public-key cryptography and the RSA algorithms. === Secret codes have been used for thousands of years, but they have always operated on the same principle: The words or letters of the message to be encoded-- called the "plaintext"-- are replaced by other words, letters, numbers and symbols. These are then shuffled, rendering the communication incomprehensible. As spies and other secretive sorts began to use computers, the basic idea remained the same. But the substitution and shuffling became increasingly complex. (Just how complex is difficult to grasp. This summer a panel of experts met to evaluate the NSA's most recent encryption system. They concluded that it would take a Cray supercomputer 400 billion years of continuous operation to exhaust all the possible substitutions.) Yet even with the most scrambled substitutions, encryption always suffered from a glaring weakness: A code is only as secure as the channel over which it travels. What this has meant practically is that messages-- whether flown by pigeon or broadcast over a shortwave-- could always be intercepted by the enemy. This was particularly dangerous when it came time to share the code's "key." Traditionally, codes were always encrypted by a key that would garble, say, plain English into unreadable gobbledygook. The encoded message would then be sent to the recipient, who would use the same key to translate the message back into English. The problem with this, of course, is: How do you get the key from one place to another without danger of its being intercepted? After all, once a key is swiped by the bad guys, the entire code is rendered useless. Worse yet, what if you had no idea the key had been stolen, and your enemies continued to freely read messages you thought were protected? This is especially troublesome when you're trying to maintain a large network of secret sharers. Surprisingly, this ancient glitch was not cleared up until the spring of 1975. That's when a Stanford computer junkie named Whitfield Diffie created a crypto-revolution called public-key cryptology, a system simple in theory-- but complicated in practice-- that effectively solved the problem of key sharing. What Diffie did was imagine a system with two mathematically related keys, one public and one private. The public key could be as public as a published address. The private key would not be shared with anyone. The connection was that a message encoded with one key could be decoded by the other. To understand how this works, imagine the keys as public and private telephone numbers. The sender garbles a message with the receiver's public key, obtained from the computer equivalent of a phone book. Once sent, the only way the message can be decoded is with the receiver's mathematically related private key. Since each receiver has his own private key, no one has to share keys, and there is no danger of having the solution to the code intercepted. 
Equally important, each encoded message could bear the unique signature of its sender. (The sender encodes the message with his private key. The receiver affirms the message's authenticity by using the sender's mathematically related public key to unscramble the communication.) This eliminates the potential for some meddling third party to send a false message.

Diffie's idea of two keys instead of one ignited a bomb among the burgeoning community of computer hackers and academic math types, who immediately began toying with public-key encryption. Not surprisingly, it didn't take long for the theory to be applied to real-life codemaking. In 1977 three MIT scientists named Ronald Rivest, Adi Shamir and Leonard Adleman constructed a series of algorithms, or mathematical instructions, that put Diffie's idea into practice.

The three men named their public-key encryption system RSA, after their initials. They patented the algorithms and formed a company, RSA Data Security. Today the company practically enjoys a monopoly on public-key encryption. It puts out an eye-catching advertising pamphlet ("RSA. Because some things are better left unread.") and sells millions of dollars' worth of encoding packages (one example: BSAFE 2.0).

RSA's president is D. James Bidzos. He is not lining up to buy lunch for Phil Zimmermann. In fact, he claims that Zimmermann is little more than a poseur whose only real contribution to cryptology was to swipe RSA's technology. "Phil seems very eager to let people believe what he wants them to believe," complains Bidzos. "He likes to perpetuate the idea of his being a folk hero."

===

Phil Zimmermann says that while he became fascinated with public-key encryption in the mid-1970s, he didn't begin seriously contemplating designing a useful application until 1984, when he was researching an article about the subject for a technical magazine. In 1986 he began fiddling with the RSA algorithms-- what he describes as "RSA in a petri dish." He says he enjoyed some mathematical successes, but that his work was still a far cry from any program that could be used to encode information.

After dabbling in crypto-math and computers for four years, Zimmermann decided at the end of 1990 to construct a workable encoding package. In December, he says, he began working twelve-hour days exclusively on what was to become Pretty Good Privacy. The work took its toll-- he neglected his software consulting business and missed five payments on his house-- but by the middle of 1991, the program was ready to go.

In June Pretty Good Privacy was released over the Internet as software free for the taking. It was faster and simpler to use than other public-key encryption programs on the market, and the price was right. The feedback was almost instantaneous. Thousands of people quickly downloaded PGP and began using it to encrypt their own messages.

Although PGP didn't contribute a lot to the theory of encryption, it did make cryptology usable and available to the average computer jock, says David Banisar, an analyst for the nonprofit Computer Professionals for Social Responsibility in Washington, D.C. "Phil didn't invent the engine," he says, "but he did fit it inside the Ford."

Indeed, the father of public-key cryptology himself says Zimmermann's proletarian privacy program is the closest thing yet to what he had in mind when he invented public-key encryption nearly two decades ago-- a nongovernment encoding system that would give the average computer user the means to communicate without fear.
"PGP has done a good deal for the practice of cryptology," says Whitfield Diffie, who now works for Sun Microsystems near San Francisco. "It's close to my heart because it's close to my original objectives." In perhaps the greatest testimony to Zimmermann's program, even those who condemn the programmer for irresponsibly releasing PGP continue to use his software. "It's a great program," concedes Sacramento computer expert Sternow. "We recommend in our training to cops that they use it to encrypt their stuff." Sternow estimates that more than 500 law enforcement officers currently use PGP. PGP also spurred a loose-knit California-based group of computer users with a passion for cryptology to form a new organization to carry the torch. The group, whose members call themselves the Cypherpunks, espouses an unabashed libertarian philosophy when it comes to electronic privacy-- specifically, that privacy is far too crucial a civil right to be left to the governments of the world, and that the best way to head off government control of cryptology is to spread the capability to shroud messages to everyone. "Phil showed that an ordinary guy just reading the papers that already existed could put together an encryption system that the Nation Security Agency could break," says John Gilmore, one of three founders o the Silicon Valley-based Cypherpunks. "It took a certain amount of bravery to put this out, because at the time the government was talking about restrictions on cryptography." James Bidzos failed to see Zimmermann's courage, however. In fact, all he saw was theft. after concluding that Pretty Good Privacy was based on RSA's patented algorithms, he placed a call to Boulder. Basically," he recalls, "we said, 'What the fuck?' " Bidzos also contends that Zimmermann hardly wrote the program out of altruism, even through Pretty Good Privacy is technically free. "The documentation he distributes with PGP is misleading," he says. "It does give the impression that Zimmermann is a hero hell-bent on saving you from the evil government and an evil corporation. Gee, strike a blow for freedom." Yet, Bidzos continues, "he did this with every intention of making money. It was clearly to make money, no doubt about it. He told me just before he released it, 'Hey, I've been working on it for six years, I've put my whole life into it, I'm behind on my mortgage payments and I need to get something out of it." Bidzos says he approached Zimmermann again several months later after PGP was published and it was clear the free privacy program was not going to go away anytime soon. "We told him that if he stopped distributing PGP, we wouldn't sue, and he signed an agreement," Bidzos recalls. "He was very quick to sign it. But he's been violating the agreement ever since he signed it." Zimmermann replies that at one time he did entertain the idea of making some money off PGP. But he insists he gave that up before the software package was published. "I decided to give PGP away in the interests of changing society, which it is now doing," he says. "The whole reason I got involved was politics. I did not miss mortgage payments in the hopes of getting rich. Just look at my bookshelf. I'm a politically committed person with a history of political activism." Zimmermann adds he's uncertain whether he's violated any of RSA's patents, but he contends that if he did, the law doesn't make much sense to him. "I respect copyrights," he says. "But what we're talking about there is a patent on a math formula. 
It's like Isaac Newton patenting Force = Mass x Acceleration. You'd have to pay a royalty every time you threw a baseball."

He also acknowledges that he signed a nondistribution agreement with RSA Data Security for Pretty Good Privacy. But he insists that he has abided by it-- although admittedly only in the strictest legal sense. For example, while Zimmermann says he doesn't update or distribute PGP himself, he concedes that he freely gives direction to a worldwide "cadre of volunteers," who then implement the advice.

The legal problems stemming from Zimmermann's invention don't end with James Bidzos and RSA. In February two agents from the U.S. Customs Service flew to Boulder to meet with Zimmermann and his lawyer, Phil Dubois. According to Dubois, the two agents said they were investigating how PGP had found its way overseas, a violation of U.S. law forbidding the export of encryption systems.

Contacted at their San Jose office, the agents declined to comment on the investigation. Yet there is little doubt as to the agency's intent. On September 14, Leonard Mikus, the president of ViaCrypt, an Arizona company that recently signed a deal with Zimmermann to distribute a PGP-like encryption package, received a grand jury subpoena asking him to turn over to the U.S. Attorney's office any documents related to PGP and Phil Zimmermann.

Two days later the Austin, Texas, publisher of "Moby Crypto," a software encryption collection that includes PGP, received a similar subpoena. The subpoena demanded that the company, Austin Codeworks, turn over all documents related to the international distribution of "Moby Crypto," as well as "any other commercial product related to PGP."

The San Jose-based assistant U.S. attorney who signed the subpoenas, William Keane, acknowledges only that since subpoenas have been issued, a federal grand jury investigation is in process. Beyond that, he says, "I can't comment on the investigation."

Zimmermann acknowledges that with thousands of people copying and distributing PGP, it was inevitable the program would make its way to Europe and Asia. But he adds that he had nothing to do with exporting Pretty Good Privacy-- and says he couldn't have prevented it if he tried. "When thousands and thousands of people have access to it, how could it not be exported?" he asks.

Adds Dubois: "The law just can't keep up with the technology. Somebody in Palo Alto learns something, and pretty soon somebody in Moscow is going to know about the same thing. There's nothing you can do about it."

===

Not that the U.S. government hasn't made a very serious effort to do something about the spread of unofficial encryption systems. Indeed, until very recently, governments have enjoyed what amounted to an exclusive franchise for the science of codes and codebreaking. Advances have been made in fits and starts, with much activity occurring during times of national tension and war. In the past forty years, Washington's attraction to encryption has been kept humming by the spy-fest of the Cold War.

Because the government has always controlled the medium of codes, it has controlled the message as well. In _The_ _Codebreakers_, a 1967 book widely considered the definitive history of cryptology, David Kahn wrote that the U.S. government hasn't been shy about exercising censorship and grand-scale privacy invasions in the name of breaking enemy codes, perceived or real.

Fearful of encoded messages slipping to and from traitors, for instance, the U.S.
government by the end of World War II had constructed a censorship office that employed nearly 15,000 people and occupied 90 buildings throughout the country. These censors opened a million pieces of overseas mail a day, listened in on telephone conversations and cast a suspicious eye on movies and magazine articles that flooded across their desks.

The code watchdogs were not content simply with intercepting and examining communications, though. Officials also found reason to ban some communications even before they could be written. Incomplete crossword puzzles were pulled from letters in case their answers contained some secret code. Chess games by mail were stopped for fear they concealed directions to spies. Knitting instructions, whose numbers might hide some security-threatening message, were intercepted.

The government's interest in controlling secret codes did not evaporate with the end of World War II, or even with the thawing of the Cold War. RSA Data Security's Bidzos says the inventors of the RSA algorithms were approached by the NSA in the mid-1970s and discouraged from publishing their discovery. And Washington still classifies encoding systems as munitions, right alongside tanks and missiles. As a result, the export of any encryption system is against the law, considered a breach of the national security.

As technology has surged forward, lawmakers have tried to maintain a grip on encryption through legislation. In 1991 a version of the U.S. Senate's Omnibus Crime Bill contained a provision that would have effectively mandated that any private encoding system contain a "back door" that law enforcement agencies could enter if they suspected any misdeeds by the sender or receiver of a message. The clause was pulled after an uproar from computer users, data security companies and civil liberty organizations.

Despite the failure of the 1991 bill (as well as a 1992 FBI-sponsored version that would have outlawed the use of tap-proof cryptology over digital phone systems), the government has not given up on its attempt to control encryption. Rather, it has simply shifted strategy.

Six months ago the Clinton administration announced plans to flood the market with the government's own public-key electronic voice-encoding system, called, alternatively, "Clipper" or "Skipjack". The catch: An as-yet unnamed federal agency or agencies would hold the private keys in case any legally appropriate eavesdropping was necessary.

The administration has stopped short of saying it will outlaw private encoding devices and mandate the use of the new Clipper system. "The standard would be voluntary," assures Jan Kosko, a spokeswoman for the National Institute of Standards and Technology in Maryland, which teamed up with the NSA to develop the system.

That said, officials acknowledge that the federal government will smile on those companies that choose Clipper over other, private encryption systems. If, for example, a private company is seeking to do business with a federal government agency requiring encoding, that company would be well advised to use Clipper if it wants to win contracts. "A manufacturer not using it," Kosko points out, "could not compete very well" for federal contracts.

On the same day the administration revealed its intention to implement Clipper, AT&T announced it would use the system in its new secure-telephone product line, thereby becoming the first company to agree to spread the government's encryption throughout the country.
And, while AT&T will continue to sell other, non-government-approved encoding devices for its phones, the new Clipper model will sell for less than half the price of AT&T's in-house encryption model, according to David Arneke, a spokesman for the company's Secure Communications System division in North Carolina. He says the first models-- which with a price tag of $1,200 will appeal mostly to law enforcement agencies and businesses hoping to keep their industrial secrets secret-- should hit the shelves by the end of the year.

-------------------------------------------------------------------------
To find out more about the anon service, send mail to help@anon.penet.fi.
Due to the double-blind, any mail replies to this message will be anonymized, and an anonymous id will be allocated automatically. You have been warned.
Please report any problems, inappropriate use etc. to admin@anon.penet.fi.