Neil J.A. Sloane has spent much of the past two decades contemplating the best way to stack oranges on a fruit stand. He isn't a grocer with an obsession for order but a researcher at AT&T Bell Laboratories who specializes in a major branch of mathematics called number theory. That means he doesn't look at numbers the way most people do.
To many of us, for instance, computer data are just characters on a screen. To programmers, they're the ones and zeros that represent information. Sloane chops these streams of ones and zeros into chunks and turns them into points in space. Around each one he draws an imaginary sphere, like an orange. Then, he rearranges his oranges to find ways of stacking them closer together. Just recently, Sloane's exercise led to a practical advance: He invented a way to squeeze so much more data into a transmission that it promises to slash the cost of transmitting data by more than 10%, without compromising reliability, accuracy, or speed.
MORTGAGE MIGRAINE. This kind of something-for-nothing gain is typical of the rising impact that number theorists are having in industries as diverse as computers, communications, agriculture, and pharmaceuticals. Simply by juggling numbers, they are coming up with better ways to encrypt information and crack codes. They're also devising shortcuts to solving problems once thought too complex to compute quickly--everything from figuring the value of mortgage-backed securities to calculating precisely the right mix of ingredients for chicken feed. And they're helping engineers slash the cost of testing industrial processes to make sure they'll work just right. "People have always thought of number theory as the purest form of mathematics," says Ronald L. Graham, a Bell Labs mathematician. "Now, it is the most applied of all."
Number theory, which is simply the science of inventing new ways to manipulate whole numbers, originated in ancient Greece. For centuries, it provided grist for intellectual games, but little in the way of better mousetraps. That is, until the advent of computers that communicate in the language of whole numbers. The spread of ever-more-powerful computers and networks raises possibilities--and poses daunting challenges--that only number theory can address.
Take derivatives, financial products whose value is tied to fluctuating financial indicators. Mathematically, pricing derivatives is as big a challenge as landing a Space Shuttle. In fact, clients of investment banks often have to wait overnight to get a quote on what it costs to invest in a derivative such as collateralized mortgage obligations (CMOs), whose value is linked to mortgage interest rates. And that calculation can be obsolete in an instant if rates change. The problem is that calculating the value of each share in a pool entails solving an equation with many variables, such as thousands of monthly mortgage payments and data on how rapidly people refinance or pay off mortgages. Such computations require millions of calculations that tie up computers for hours.
That's why Goldman, Sachs & Co. and other investment firms took notice when number theorist Joseph F. Traub found a shorter way of calculating derivatives, which are based on so-called "multidimensional integral" equations. Traub, a professor of computer science at Columbia University, can approximate a solution for these equations by solving a tiny subset of all the elements--only 10,000, vs. 1 million or so for previous techniques. In tests on the Street, a program developed by one of Traub's graduate students, Spassimir H. Paskov, has calculated the price of CMOs up to 100 times faster than the method Goldman was using, Paskov says. Goldman won't discuss the trial, but Citibank reports similar results. The program could apply to other complicated financial products, and Paskov already has one job offer and expects others before he finishes his PhD in the fall.
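The article doesn't spell out Traub's shortcut, but the approach it describes is what mathematicians call quasi-Monte Carlo integration: replacing pseudo-random sample points with a "low-discrepancy" sequence that fills high-dimensional space more evenly, so far fewer points are needed. The sketch below illustrates the idea with a Halton sequence and a toy integrand whose exact answer is known; the specific sequence and test function are illustrative assumptions, not the method Traub and Paskov actually implemented.

```python
# Sketch of the quasi-Monte Carlo idea: estimate a multidimensional
# integral by averaging the integrand over sample points. Low-discrepancy
# (Halton) points cover the unit cube more evenly than pseudo-random ones.
# The Halton sequence here is an illustrative choice, not necessarily
# the one used on Wall Street.
import random

PRIMES = [2, 3, 5, 7, 11]

def radical_inverse(i, base):
    """Reflect the base-`base` digits of i about the radix point."""
    inv, f = 0.0, 1.0 / base
    while i > 0:
        inv += (i % base) * f
        i //= base
        f /= base
    return inv

def halton_point(i, dim):
    """The i-th point (i >= 1) of the Halton sequence in `dim` dimensions."""
    return [radical_inverse(i, PRIMES[d]) for d in range(dim)]

def f(x):
    # Toy integrand over the unit cube; its exact integral is 1.0,
    # since each factor 12*(t - 0.5)**2 integrates to 1 on [0, 1].
    prod = 1.0
    for t in x:
        prod *= 12.0 * (t - 0.5) ** 2
    return prod

def estimate(points):
    return sum(f(p) for p in points) / len(points)

dim, n = 5, 4096
qmc = estimate([halton_point(i, dim) for i in range(1, n + 1)])
rng = random.Random(0)
mc = estimate([[rng.random() for _ in range(dim)] for _ in range(n)])
print(f"quasi-Monte Carlo: {qmc:.4f}   plain Monte Carlo: {mc:.4f}   exact: 1.0")
```

With only thousands of evenly spread points instead of a million random ones, the estimate typically lands closer to the true value, which is the source of the speedup the article reports.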
Coding theory, one of several branches of number theory, is the key to another advance: new methods of transmitting computer data efficiently and free of error (table). Errors--unintended changes in the data--seem inevitable during transmission. Coding technology prevents them by crunching sequences of data with an algorithm, a step-by-step process for manipulating numbers that is used to add a few extra ones to the data sequence. Upon receipt, these extra numbers are used to check the rest of the sequence to make sure it was transmitted correctly. If not, the transmission is repeated.
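The encode-and-check mechanism the article describes can be seen in miniature with the simplest possible code, a single parity bit. Real transmission codes, including Sloane's, are far more sophisticated, but the workflow is the same: the sender appends extra check data, and the receiver uses it to decide whether to request a resend.

```python
# A minimal illustration of the check-digit idea: append one parity
# bit so the receiver can detect any single flipped bit. Real codes
# add several check digits and can correct errors, not just detect them.

def encode(bits):
    """Append one extra bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def check(received):
    """True if the sequence passes the parity check."""
    return sum(received) % 2 == 0

data = [1, 0, 1, 1, 0, 1]
sent = encode(data)
assert check(sent)            # clean transmission passes

corrupted = sent[:]
corrupted[2] ^= 1             # one bit flipped in transit
assert not check(corrupted)   # receiver detects it and asks for a resend
```

The engineering trade-off the article goes on to discuss is exactly here: more check digits catch more errors but waste transmission capacity, and better codes catch the same errors with fewer extra digits.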
NEW CONVERTS. Trouble is, the early methods for doing this have become obsolete, thanks to faster transmission and new types of interference peculiar to wireless and cellular systems. So "there has been a renaissance in coding theory in the past five years as people realized that the algorithms developed in the '50s and '60s are no longer sufficient," says Bell Labs mathematician Jeff Lagarias. What's needed now are algorithms that overcome the new interference but don't require lengthy calculations at either end of the transmission. Neil Sloane achieved that goal with his method for packing "oranges." It enabled him to invent a code that keeps the calculation at either end short, while adding fewer extra numbers for each sequence of data. That means he can squeeze 12.5% more data into a transmission. Hughes Aircraft Co., whose researchers also hit on the technique, plans to exploit those savings by using the code in communications equipment.
The same mathematics also enables engineers to pare down the number of times they test industrial processes. To save money, engineers want to run as few tests as possible. But developing the best process requires finding the optimum combination of half a dozen or more variables--different chemicals and temperatures used in making silicon chips, for instance. Typically, engineers use mathematical formulas to figure out which combinations of variables they need to test. To speed up this effort, Sloane incorporated his packing mathematics into a computer program called Gosset. Using it, engineers can prove out a process with 23% to 50% fewer tests, according to the Experiment Strategies Foundation, a Cleveland-based consultant in testing procedures.
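The principle of testing fewer combinations can be shown with a classical "fractional factorial" design; Gosset's actual designs come from Sloane's packing mathematics, so treat this as a simpler stand-in, not Gosset itself. For three factors each set high or low, a carefully chosen half of the eight combinations still lets engineers estimate each factor's main effect.

```python
# A classical half-fraction experimental design (a stand-in for
# Gosset's packing-based method): for three two-level factors,
# run only the 4 of 8 combinations whose levels multiply to +1.
# The chosen half is balanced, so each factor's effect can still
# be estimated from the reduced set of runs.
from itertools import product

full = list(product([-1, +1], repeat=3))   # all 8 possible test runs
half = [run for run in full if run[0] * run[1] * run[2] == +1]

print(f"full factorial: {len(full)} runs, half fraction: {len(half)} runs")
for run in half:
    print(run)
```

Halving eight runs saves little; with a dozen factors, the same idea cuts thousands of tests down to a manageable handful, which is where the 23% to 50% savings come from.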
Such results are winning converts: Novus International uses the program to come up with better mixes of chicken feed, Eastman Kodak to help produce the coatings on photographic film, and Monsanto in drug manufacturing. Annette Hagewiesche, a researcher at Genentech Inc., uses Gosset to help design tests of the ability of new processes to purify protein-based drugs, such as Activase (TPA). "It makes it much faster to screen different tests, which saves time and money," she says.
Number theory is also the basis for writing and cracking encryption codes, which make data unintelligible to electronic snoopers. Public encryption schemes use a system called RSA to encode messages. It involves feeding a message into an algorithm, which churns out the scrambled version. Inside each encryption algorithm is a number, called a modulus, that does the scrambling. Like a numerical blender, it produces gibberish.
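The modulus's scrambling role can be shown with numbers small enough to fit on a page; real RSA moduli run to well over a hundred digits. In the toy sketch below, the primes, exponents, and message are the standard textbook-sized choices, not anything used in practice.

```python
# Toy RSA sketch with tiny primes (real moduli are 100+ digits).
# The modulus n = p*q does the "scrambling": raising a message to a
# power mod n turns it into gibberish that only the matching private
# exponent can undo.
p, q = 61, 53
n = p * q      # 3233, the modulus -- published openly
e = 17         # public exponent, also published
d = 2753       # private exponent: e*d leaves remainder 1 mod (p-1)*(q-1)

message = 65
ciphertext = pow(message, e, n)      # scramble: message**e mod n
recovered = pow(ciphertext, d, n)    # unscramble: ciphertext**d mod n
print(f"message {message} -> ciphertext {ciphertext} -> recovered {recovered}")
```

Anyone may use n and e to scramble a message, but unscrambling requires d, and computing d requires knowing the primes p and q hidden inside the modulus.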
The modulus is the Achilles' heel of RSA encryption schemes. A snoop intent on cracking the code must first factor the modulus--that is, find two prime numbers that, when multiplied together, produce the modulus. Since anyone with high-school math knows how to factor, the security of a code relies on making the modulus so large that an unrealistic amount of computer time is needed to calculate the two primes.
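The code-breaker's task looks trivial at toy scale. The sketch below factors a small modulus by trial division, the high-school method the article mentions; real attacks like Lenstra's use vastly cleverer sieve algorithms, but the work still explodes as the modulus grows, which is exactly what RSA's security rests on.

```python
# The snoop's job in miniature: recover the two primes hidden in the
# modulus. Trial division checks candidate divisors up to sqrt(n);
# for a 155-digit modulus, that loop would outlast the universe,
# which is why attackers need sieve algorithms -- and why even those
# take months of computer time.

def factor_modulus(n):
    """Return (p, q) with p * q == n, found by trial division."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    raise ValueError("n is prime -- not a valid RSA modulus")

print(factor_modulus(3233))   # the toy modulus from above falls instantly
```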
In April, though, number theorist Arjen K. Lenstra and his colleagues at Bell Communications Research Inc. made a leap in code-cracking. Using a recent and unusually efficient algorithm for determining prime factors, they broke a 129-digit code in just eight months. Since most encryption systems now use 155 digits, most data are still safe from theft for a while. But to test the limits of number theory, Lenstra's team intends to try a better code-breaking algorithm, called a number field sieve. Lenstra says it will take roughly nine months to crack 155-digit codes.
Mathematicians fear that even faster methods of factoring may soon be invented. Whether that happens will depend ultimately on whether there is a limit to how easily prime factors can be found. Most theorists believe that factoring is inherently complex--that the number of steps it takes to find prime factors will always grow exponentially with the size of the number. But no one knows for sure. "The complexity of factoring is key to encryption," says Enrico Bombieri, a mathematician at the Institute for Advanced Study in Princeton, N.J. If factoring is inherently complex, small increases in the number of digits in the modulus would so increase the difficulty of cracking it that security could be protected. If not, says Bombieri, "someone could come up with a fast way of breaking all of the codes."
RANDOM MISCHIEF. While number theorists probe that possibility, they're also helping make computer simulations more accurate. Computer programs known as number generators spew out series of what are called pseudo-random numbers, which engineers, scientists, and economists plug into computer models to simulate random events. For instance, they may want to check out the occurrence of calls on a telephone network to determine how well a given network configuration handles various levels of traffic. Engineers also use random numbers to approximate solutions to problems too complex to calculate in a straightforward way.
There's one hitch, however. A few years ago, mathematicians realized that the numbers coming out of these generators weren't random: They formed definite patterns that could invalidate calculations or simulations. George Marsaglia, a professor of statistics at Florida State University at Tallahassee, fell victim last summer when his own random-number algorithm was caught skewing the results of experiments simulating the magnetism of materials.
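The kind of hidden pattern that can skew a simulation is easiest to see in the infamous RANDU generator (an IBM routine from the 1960s, not Marsaglia's own algorithm): every three consecutive outputs satisfy a fixed linear relation, so points built from its "random" triples all fall on just 15 planes inside the unit cube.

```python
# RANDU, a linear congruential generator once shipped with IBM
# mainframes, illustrates the patterns that plague bad generators:
# because the multiplier is 65539 = 2**16 + 3, every run of three
# outputs obeys x[k+2] = 6*x[k+1] - 9*x[k] (mod 2**31). Simulations
# that assume independent triples can be badly skewed by this.
M = 2 ** 31

def randu(seed, count):
    x, out = seed, []
    for _ in range(count):
        x = (65539 * x) % M
        out.append(x)
    return out

xs = randu(1, 1000)
violations = sum(
    (6 * xs[k + 1] - 9 * xs[k] - xs[k + 2]) % M != 0
    for k in range(len(xs) - 2)
)
print(f"triples violating the linear relation: {violations} of {len(xs) - 2}")
```

Every one of the 998 consecutive triples satisfies the relation exactly; a genuinely random stream would satisfy it essentially never.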
To all appearances, says Persi Diaconis, a professor of mathematics at Harvard University, there was no way around the problem. Then recently, Fan R.K. Chung, a Bellcore fellow, hit upon a way of testing whether a series of numbers is truly random. While working on better ways to find connections between two points in a network, a key in designing efficient communications systems, Chung stumbled across a mathematical relationship between these networks and random-number generators. Her discovery could eventually improve the number generators, which, in turn, could help make computer problem-solving more reliable.
The continued manipulation of numbers promises more practical dividends in the years ahead--in everything from creating new materials to designing new computers. Peter Sarnak, a professor of mathematics at Princeton University, recently concocted a technique that shows how to interconnect some 10 billion phone lines. It's far too complex to apply to telecommunications networks. But experts speculate that the trick might come in handy for designing computers that mimic the human brain. As the world goes digital, it seems, the musings of number theorists will only become more relevant.
NUMBER THEORY GETS PRACTICAL

Number theory, once the esoteric study of what happens when whole numbers are manipulated in various ways, is becoming a vital practical science that is helping solve tough business problems. Here are some of the ways:

RANDOM NUMBERS The study of how to come up with numbers that can be used to simulate random events, such as the flow of foot traffic through an airport terminal. Scientists also use random numbers to attack problems that are too complex to solve precisely, such as exactly how air flows over an airplane wing.

PRIME NUMBERS Number theorists are increasingly obsessed with coming up with quick ways of finding prime numbers, which can be divided only by themselves or one and still yield a whole number. These are the basis for, among other things, cryptography--encoding computer data to protect it from theft.

COMPLEXITY THEORY To solve problems, mathematicians often invent algorithms, or step-by-step procedures, that are repeated many times until a solution pops out. Complexity theory is the study of how many times a given procedure must be repeated to get an answer. This is useful in designing software that will let computers do their jobs more quickly.

CODING THEORY This aspect of number theory determines how sequences of numbers can best be used to represent information. By coming up with new ways of manipulating these sequences, mathematicians are devising more efficient computer codes to transmit computer data with fewer errors or glitches.