*[Image caption: It’s a bit more complicated than this]*
Forget your times tables – mathematicians have found a new, faster way to multiply two numbers together. The method, which works only for whole numbers, is a landmark result in computer science. “This is big news,” says Joshua Cooper at the University of South Carolina.
To understand the new technique, which was devised by David Harvey at the University of New South Wales, Australia, and Joris van der Hoeven at the École Polytechnique near Paris, France, it helps to think back to the longhand multiplication you learned at school.
We write down two numbers, one on top of the other, and then painstakingly multiply each digit of one by each digit of the other, before adding all the results together. “This is an ancient algorithm,” says Cooper.
If your two numbers each have n digits, this way of multiplying will require roughly n^2 individual calculations. “The question is, can you do better?” says Cooper.
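To see where the n^2 comes from, here is a minimal Python sketch of the schoolbook method (my own illustration, not code from the paper; the function name and decimal-string representation are arbitrary choices). With two n-digit inputs, the nested loops perform n × n single-digit multiplications before the carries are tidied up.

```python
def schoolbook_multiply(a: str, b: str) -> str:
    """Pencil-and-paper multiplication of two non-negative decimal strings.
    Illustrative sketch: every digit of a meets every digit of b,
    so n-digit inputs cost roughly n^2 single-digit products."""
    result = [0] * (len(a) + len(b))          # room for all carries
    for i, da in enumerate(reversed(a)):      # n iterations...
        for j, db in enumerate(reversed(b)):  # ...times n iterations
            result[i + j] += int(da) * int(db)
    for k in range(len(result) - 1):          # propagate carries
        result[k + 1] += result[k] // 10
        result[k] %= 10
    return "".join(map(str, reversed(result))).lstrip("0") or "0"

print(schoolbook_multiply("1234", "5678"))    # 7006652
```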
Lots of logs
Starting in the 1960s, mathematicians began to prove that they could. First Anatoly Karatsuba found an algorithm that could turn out an answer in no more than n^1.58 steps, and in 1971, Arnold Schönhage and Volker Strassen found a way to peg the number of steps to the complicated expression n*log(n)*log(log(n)) – here “log” is short for logarithm.
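Karatsuba’s insight, sketched below in Python (an illustrative reconstruction of the well-known algorithm, not the authors’ code), is that after splitting each number in half, the middle “cross” terms can be recovered from a single extra product. Three half-size multiplications instead of four works out to about n^(log2 3) ≈ n^1.58 steps overall.

```python
def karatsuba(x: int, y: int) -> int:
    """Karatsuba multiplication: three recursive products instead of four.
    Illustrative sketch of the classic 1960s algorithm."""
    if x < 10 or y < 10:                  # single-digit base case
        return x * y
    half = max(len(str(x)), len(str(y))) // 2
    p = 10 ** half
    x_hi, x_lo = divmod(x, p)             # x = x_hi * 10^half + x_lo
    y_hi, y_lo = divmod(y, p)
    hi = karatsuba(x_hi, y_hi)
    lo = karatsuba(x_lo, y_lo)
    # One product of the half-sums yields both cross terms at once.
    mid = karatsuba(x_hi + x_lo, y_hi + y_lo) - hi - lo
    return hi * p * p + mid * p + lo

assert karatsuba(123456, 789012) == 123456 * 789012
```

Schönhage and Strassen pushed the same divide-and-conquer idea much further by using fast Fourier transforms, which is where the logarithms in their bound come from.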
These advances had a major impact on computing. Whereas a computer using the longhand multiplication method would take about six months to multiply two billion-digit numbers together, says Harvey, the Schönhage-Strassen algorithm can do it in 26 seconds.
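A back-of-the-envelope comparison (my own illustration; it counts abstract digit operations and ignores the constant factors that dominate real running times) shows the scale of that gap for billion-digit inputs:

```python
import math

n = 10 ** 9                # one billion digits
longhand = n ** 2          # schoolbook: ~10^18 digit operations
fft_based = n * math.log2(n) * math.log2(math.log2(n))  # Schönhage-Strassen bound

print(f"longhand:           {longhand:.1e} operations")
print(f"Schönhage-Strassen: {fft_based:.1e} operations")
print(f"speed-up factor:    {longhand / fft_based:,.0f}")
```

Ignoring constants, the operation counts differ by a factor in the millions – the same rough scale as the months-to-seconds speed-up Harvey describes.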
The landmark 1971 paper also suggested a possible improvement, a tantalising prediction that multiplication might one day be possible in no more than n*log(n) steps. Now Harvey and van der Hoeven appear to have proved this is the case. “It finally appears to be possible,” says Cooper. “It passes the smell test.”
“If the result is correct, it’s a major achievement in computational complexity theory,” says Fredrik Johansson at INRIA, the French research institute for digital sciences, in Bordeaux. “The new ideas in this work are likely to inspire further research and could lead to practical improvements down the road.”
Cooper also praises the originality of the research, although he stresses the complexity of the mathematics involved. “You think, jeez, I’m just multiplying two integers, how complicated can it get?” he says. “But boy, it gets complicated.”
So, will this make calculating your tax returns any easier? “For human beings working with pencil and paper, absolutely not,” says Harvey. Indeed, their version of the proof only works for numbers with more than 10 to the power of 200 trillion trillion trillion digits. “The word ‘astronomical’ falls comically short in trying to describe this number,” says Harvey.
While future improvements to the algorithm may extend the proof to more humdrum numbers only a few trillion digits long, Cooper thinks its real value lies elsewhere: from a theoretical perspective, he says, this work allows programmers to provide a definitive guarantee of how long a certain algorithm will take. The authors, for their part, haven’t ruled out practical gains. “We are optimistic that our new paper will allow us to achieve further practical speed-ups,” says van der Hoeven.
Harvey thinks this may well be the end of the story, with no future algorithm capable of beating n*log(n). “I would be extremely surprised if this turned out to be wrong,” he says, “but stranger things have happened.”
*Article credit: Gilead Amit*