The Theory That Would Not Die: Yale publishes new book on Bayes’ Theorem

The Theory That Would Not Die

What do Google’s prototype robotic cars, email spam filters and military code cracking have in common? They all utilise Bayes’ Theorem, a seemingly simple formula invented in the 18th century that has since been adopted by scientists, codebreakers and computer programmers alike. Earlier this month Yale University Press published The Theory That Would Not Die, a book that traces the history of this influential and controversial formula.

Bayes’ formula, named after the 18th-century Presbyterian minister and amateur mathematician Thomas Bayes, appears to be a straightforward, one-line theorem: by updating our initial beliefs with objective new information, we get a new and improved belief. Or, to write it as an equation:

Initial Beliefs + Recent Objective Data = A New and Improved Belief
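In probability terms, that informal equation is Bayes’ rule: P(H|D) = P(D|H) × P(H) / P(D). A minimal sketch of how a spam filter might apply it is below; the numbers are invented for illustration, not taken from any real filter.

```python
def bayes_update(prior, likelihood, evidence):
    """Bayes' rule: P(H|D) = P(D|H) * P(H) / P(D)."""
    return likelihood * prior / evidence

# Hypothetical numbers: 40% of all mail is spam; the word "prize"
# appears in 25% of spam messages but only 1% of legitimate mail.
p_spam = 0.40
p_word_given_spam = 0.25
p_word_given_ham = 0.01

# Total probability of seeing the word at all (the "evidence" term).
p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)

# Initial belief + new data (the word appeared) -> improved belief.
posterior = bayes_update(p_spam, p_word_given_spam, p_word)
print(f"P(spam | word) = {posterior:.3f}")  # roughly 0.943
```

Seeing the word raises the filter’s belief that the message is spam from 40% to about 94%; the next piece of evidence would start from that new figure, which is exactly the “learning from experience” the book describes.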

Thomas Bayes (1702–1761)

To its adherents, it is an elegant statement about learning from experience. To its opponents, it is subjectivity run amok. Despite this apparent subjectivity, the theorem’s champions insist that it brings clarity when information is scarce and outcomes uncertain. The French mathematician Pierre-Simon Laplace, who developed Bayes’ initial result into the modern form we use today, put it to the test: he applied the equation to ascertain why slightly more boys than girls were born in Paris in the late 18th century. Recalculating as he gathered some 30 years of demographic data from around the world, Laplace concluded that the slight excess of boys is universal to humankind, not unique to Paris.
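Laplace’s procedure can be sketched as a modern conjugate update: start from a uniform prior over the probability that a birth is a boy, feed in the christening counts, and read off the posterior. The counts below are round figures of the magnitude Laplace worked with from Paris registers, not his exact records.

```python
# Uniform Beta(1, 1) prior over p, the probability a birth is a boy.
# With binomial count data, the posterior is Beta(boys + 1, girls + 1),
# whose mean is (boys + 1) / (n + 2) -- Laplace's "rule of succession".
boys, girls = 251_527, 241_945   # illustrative christening-register counts
n = boys + girls

posterior_mean = (boys + 1) / (n + 2)
print(f"posterior mean of P(boy) = {posterior_mean:.4f}")  # about 0.5097
```

With counts this large the posterior is pinned tightly just above one half, which is why Laplace could be so confident the excess of boys was real rather than a quirk of the Paris registers.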

Since then, Bayes’ theorem has found countless modern applications: codebreakers used it to break Germany’s Enigma code during the Second World War, the US Navy used it to plot the paths of Soviet nuclear submarines during the Cold War, and today the theorem drives email spam filters, helps operate Google’s prototype robotic cars, and enables biologists to decode DNA.

Google's Robotic Cars

Despite these applications, the theorem attracted intense criticism and scepticism. This is addressed in The Theory That Would Not Die: How Bayes’ Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy (published this month), a riveting account of how Bayes’ theorem ignited one of the greatest scientific controversies of all time. In the first-ever account of Bayes’ rule for general readers, Sharon Bertsch McGrayne explores this controversial theorem and the human obsessions surrounding it. She traces its discovery by Bayes in the 1740s through its development into its modern form by Laplace. She reveals why respected statisticians rendered it professionally taboo for 150 years, even as practitioners relied on it to solve crises involving great uncertainty and scanty information, and she explains how the advent of off-the-shelf computer technology in the 1980s proved to be a game-changer. The Theory That Would Not Die provides a vivid account of the generations-long dispute over one of the greatest breakthroughs in the history of applied mathematics and statistics. A must-read for those interested in the history of technology and science.

The Theory That Would Not Die is available now from Yale University Press

Related Links:

Official Book Page

Sharon Bertsch McGrayne’s website

1 Comment

    Charles Freeman, May 19, 2011

    I understand that much of the defence of Christianity put forward by the Oxford theologian/philosopher Richard Swinburne is based on the use of Bayes’ theorem. I also understand that Swinburne has been strongly criticised for his misuse of the theorem. It would be interesting to know if the author, or any other readers of this post, can address this problem or have strong views on it. I found Swinburne’s conclusion that the physical resurrection of Jesus was 93 per cent certain totally unconvincing! Charles Freeman.