Encryption is one of those words in the contemporary lexicon that is freely tossed about to describe something that is part mystery and part act of faith. For millions of computer users, encryption is little more than a hopeful abstraction, steadfastly assumed to protect our most confidential data.
Few people outside of those directly concerned with data security need to understand how encryption works. Fewer still know much about its origins or of the colossal obstacles surmounted by brilliant, if largely anonymous, mathematicians. But to study the history of encryption is to enter a world of intrigue, espionage, and mathematical wizardry. For thousands of years, people have found it necessary to disguise their communications. And for an equal number of years, those with patience and skill have found ways of deciphering hidden messages. Encryption is almost as old as secrets, and its evolution is superbly chronicled by Simon Singh in The Code Book: The Science of Secrecy from Ancient Egypt to Quantum Cryptography.

The earliest techniques for sending confidential messages were more a matter of cleverness than coding. Singh recounts an incident chronicled by the Greek historian Herodotus in which a messenger's head was shaved and the message written on his scalp. The sender then patiently waited for the messenger's hair to regrow before dispatching him with the missive. This was, as Singh notes, clearly a period of history that tolerated a certain lack of urgency.
Giving new meaning to the word urgency, the Chinese inscribed messages on fine silk, which was then rolled into a tiny ball and covered with wax. Messengers were obliged to swallow the container before swiftly setting off to deliver its contents. Although Singh delicately avoids speculation, this system had some obvious retrieval drawbacks.
But hiding a message, no matter how adroitly, leaves the sender vulnerable to discovery or betrayal. Cryptography was developed to disguise not the existence but the meaning of a message, so that even if it were intercepted, it could not be understood.
Disguising communications employs two principal methods: transposition and substitution. Transposition involves rearranging the order of the letters that make up a message. Its virtue is simplicity combined with a staggering number of possible arrangements, even for a very short message. For example, the message "Meet Paul at the train station at midnight" contains only 35 letters, but, as Singh notes, there are more than 50,000,000,000,000,000,000,000,000,000,000 distinct arrangements of them. The world's entire population working day and night would require more than a thousand times the lifetime of the universe to check all the arrangements. That, of course, was before computers significantly shortened the task.
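To make the idea concrete, here is a minimal sketch in Python of one of the simplest transposition schemes, a two-row rail fence; the function name and the choice of scheme are illustrative, not taken from Singh's text.

```python
def rail_fence_encrypt(message: str) -> str:
    """Two-row 'rail fence' transposition: alternate letters go to an upper
    and a lower row, and the ciphertext is the upper row followed by the lower."""
    letters = [c for c in message.lower() if c.isalpha()]
    upper = letters[0::2]   # 1st, 3rd, 5th ... letters
    lower = letters[1::2]   # 2nd, 4th, 6th ... letters
    return "".join(upper + lower)

print(rail_fence_encrypt("Meet Paul at the train station at midnight"))
# -> mepuaterisaintingtetalthtanttoamdih
```

No letter is changed, only moved; without knowing which rearrangement was used, an interceptor faces the astronomical search Singh describes.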
Substitution entails replacing each letter of a text with a different letter, so that "meet Paul" may appear as "bvvq Rmwd." Jumbling the letters of the alphabet can create a ciphertext of even greater variety than transposition. But since certain letters are used more often than others, educated guesses can be made according to the frequency and positioning of characters. When transposition and substitution are used in tandem, however, they make a formidable cipher. The encryption procedure itself is called the algorithm; how the algorithm is applied becomes the key. As the name suggests, the key is the essential component for unscrambling an encrypted communication.
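A comparable sketch of substitution, again in Python with illustrative names, shows both sides of the coin: the cipher is trivial to apply, yet the letter frequencies that survive in the ciphertext are precisely the foothold a code-breaker exploits.

```python
import random
import string
from collections import Counter

def make_key(seed: int = 0) -> dict:
    """A monoalphabetic substitution key: each plaintext letter maps to
    exactly one ciphertext letter."""
    rng = random.Random(seed)
    shuffled = rng.sample(string.ascii_lowercase, 26)
    return dict(zip(string.ascii_lowercase, shuffled))

def substitute(message: str, key: dict) -> str:
    return "".join(key.get(c, c) for c in message.lower())

key = make_key()
ciphertext = substitute("meet paul at the train station at midnight", key)
print(ciphertext)

# Frequency analysis: the most common ciphertext letters correspond to the
# most common plaintext letters (in English, usually e, t, a ...).
print(Counter(c for c in ciphertext if c.isalpha()).most_common(3))
```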
Before World War II, the Germans developed a brilliant encryption device called Enigma, an electromechanical rotor machine that scrambled each letter through successive layers of substitution and was believed (erroneously) to be unbreakable. But one of the difficulties of such complex encryption was that both the sender and the receiver had to use the same key to encode and decipher a message. Every Enigma operator had to be issued a monthly code book that contained the daily encryption keys. During the war, with troops constantly on the move and ships and submarines patrolling the sea lanes, distributing those code books became a logistical nightmare.
In cryptography circles, this is known as the problem of key distribution. The dilemma is how to transmit a key to the recipient without compromising the message. If the key is transmitted along with the message, it can be intercepted and the message decoded. One option is to deliver the key separately, perhaps by messenger, a viable solution if transmissions are infrequent and recipients are nearby. But as transmissions multiply and recipients become geographically dispersed, dispatching messengers becomes highly inefficient and cost-prohibitive.
With the growth of business computing, the problem of key distribution intensified. Banks, Singh recounts, would dispatch "riders...across the world with padlocked briefcases" to deliver decryption keys. But the reliance on third parties to deliver the keys to private communications was rightly seen as both oxymoronic and a security risk. For decades, mathematicians labored unsuccessfully to solve the quandary of key distribution, and many concluded it was unsolvable.
Nonetheless, a solution was eventually devised in the mid-1970s, and Singh describes it as the greatest cryptographic achievement in over 2,000 years. It employed so-called one-way mathematical functions, operations that are simple to perform but difficult or impossible to reverse. Singh likens these functions to mixing two colors of paint: easy to do, hopeless to undo. Imagine that the sender and the recipient both have a gallon of white paint, to which each adds a secret color; the sender adds orange and the recipient adds purple. They exchange mixtures, to which each then adds his own secret color. They now both have identical mixtures. The end product is the key by which a message can be both encrypted and decrypted. Even if an unscrupulous party learned that white was the base color, or was able to intercept the exchange of mixtures, it would do him no good without knowing the two secret colors.
This process, however, had a limitation: the sender still had to contact the recipient so that the two could exchange the results of mathematical calculations which, when processed through a series of modular arithmetic functions, would yield the key. The breakthrough was that this information could be exchanged remotely, over an unsecured phone line, for example, and would be meaningless if intercepted.
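Here is a minimal sketch, with deliberately tiny numbers, of the modular arithmetic behind the paint analogy; the scheme is the one now known as Diffie-Hellman key exchange, and the specific values are illustrative only.

```python
# The public "gallon of white paint" is the agreed pair (p, g); the secret
# "colors" are the private numbers a and b, which are never transmitted.
p, g = 23, 5                   # tiny public modulus and base, for illustration

a = 6                          # sender's secret number
b = 15                         # recipient's secret number

A = pow(g, a, p)               # sender's "mixture", sent over the open line
B = pow(g, b, p)               # recipient's "mixture", sent over the open line

key_sender    = pow(B, a, p)   # sender adds his secret to the received mixture
key_recipient = pow(A, b, p)   # recipient does the same with hers

assert key_sender == key_recipient   # both arrive at the same key (here, 2)
```

An eavesdropper who sees p, g, A, and B would still have to recover a or b, which is the hard-to-reverse step; with realistically large numbers, that is computationally infeasible.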
The method, although revolutionary, was labor-intensive and could not hope to sustain the volume of encrypted data that now flows across the Internet. Imagine having to contact every recipient to work out an encryption/decryption scheme each time you wanted to send encrypted emails or a credit card number to a vendor.
The full solution required an asymmetric cipher, one that did not require both parties to hold an identical key and therefore eliminated the need to exchange information in advance. Such a scheme had been proposed in theory by others, but Ron Rivest, a computer scientist, had a revelation one night after drinking significant amounts of...wine. His insight made public key cryptography practical: a person could post a public encryption key available to anyone wishing to send him a coded message, while the private decryption key remained known only to him.
The system requires multiplying together two prime numbers (the larger the better); their product becomes the basis of the public key, while the original primes serve as the private decoding key.
On the surface, reversing the multiplication of two numbers seems a fairly trivial task. But factoring a large number back into its prime components is, as Singh points out, very time consuming. Just how difficult is it? When the cipher was introduced in a Scientific American article in 1977, the magazine offered a $100 prize to anyone who could decode a ciphertext of modest length. The author even provided the public key used to encrypt it. Singh reports that the challenge stood for 17 years before a team of some 600 volunteers finally broke the cipher.
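A toy version of the scheme (what became known as RSA), with absurdly small primes and illustrative values, makes the asymmetry visible; real keys use primes hundreds of digits long.

```python
p, q = 61, 53                  # the two secret primes
n = p * q                      # 3233: published as part of the public key
e = 17                         # public encryption exponent

# Knowing p and q makes the private exponent easy to compute; knowing only
# n would mean factoring it first, which is the hard step.
phi = (p - 1) * (q - 1)        # 3120
d = pow(e, -1, phi)            # 2753: private decryption exponent (Python 3.8+)

message = 65                   # a message encoded as a number smaller than n
ciphertext = pow(message, e, n)      # anyone can encrypt with (n, e)
recovered  = pow(ciphertext, d, n)   # only the holder of d can decrypt

print(ciphertext, recovered)   # 2790 65
```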
Public key cryptography is the basis for the fully automated encryption systems now available for use on the Internet. One system that achieved a level of notoriety was PGP, developed in the early 1990s, which modestly stands for Pretty Good Privacy. Actually, the level of privacy was so good that the National Security Agency, which employs more mathematicians and computers than any other agency in the world, demanded a limit on key lengths, fearing it had insufficient horsepower to crack intercepted ciphers.
From a national security perspective, algorithms were becoming annoyingly sophisticated even before PGP. In 1976, a cipher developed at IBM was adopted as the Data Encryption Standard (DES), which caused an uproar in security agencies. Singh likens its encryption process to kneading a slab of dough:
First, the message is translated into a long string of binary digits. Second, the string is split into blocks of 64 digits, and encryption is performed separately on each of the blocks. Third, focusing on just one block, the 64 digits are shuffled, and then split into two half-blocks of 32, labeled Left0 and Right0. The digits in Right0 are then put through a mangler function, which changes the digits according to a complex substitution. The mangled Right0 is then added to Left0 to create a new half-block of 32 digits called Right1. The original Right0 is relabeled Left1. This set of operations is called a round.... This process is repeated until there have been 16 rounds in total.
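A stripped-down sketch of that round structure, with a stand-in for the mangler function and dummy round keys (real DES derives sixteen subkeys from the secret key and uses carefully designed substitution tables), might look like this:

```python
MASK32 = 0xFFFFFFFF

def mangler(half_block: int, round_key: int) -> int:
    # Stand-in for DES's complex substitution; any deterministic scramble
    # will do to illustrate the structure.
    return ((half_block ^ round_key) * 0x9E3779B1) & MASK32

def feistel_encrypt(block64: int, round_keys: list) -> int:
    left = (block64 >> 32) & MASK32       # Left0: upper 32 digits
    right = block64 & MASK32              # Right0: lower 32 digits
    for key in round_keys:                # one pass per round, 16 in DES
        # "Added to" in the description above is addition modulo 2, i.e. XOR.
        left, right = right, left ^ mangler(right, key)
    return (left << 32) | right

def feistel_decrypt(block64: int, round_keys: list) -> int:
    left = (block64 >> 32) & MASK32
    right = block64 & MASK32
    for key in reversed(round_keys):      # same structure, keys in reverse order
        left, right = right ^ mangler(left, key), left
    return (left << 32) | right

keys = list(range(1, 17))                 # 16 dummy round keys
block = 0x0123456789ABCDEF
assert feistel_decrypt(feistel_encrypt(block, keys), keys) == block
```

The elegance of the arrangement is that the mangler function never needs to be reversible; running the rounds with the keys in the opposite order undoes the scramble.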
Algorithms became even more complex in the ensuing years. But the long battle between code-makers and code-breakers is heading toward a climax. In the near future, the state of computing may reach an impasse at which no existing cipher would remain unbreakable, while all future encryption would be absolutely undecipherable. A quantum computer would be capable of massive, nearly instantaneous parallel processing, and no conventional cipher would be complex enough to confound it. And to encrypt messages, quantum cryptography would use polarized photons, whose behavior is inherently unpredictable and therefore impossible to divine.
Governments naturally fear total, inviolate privacy. Without the ability to snoop, governments lose control of their citizens. A quantum computer, Singh predicts, would jeopardize the stability of the world. But if stability comes at the cost of privacy, perhaps it's overrated.