Who Invented the Computer? Alan Turing

Who Invented the Computer? This is the sixth installment in our ongoing series.


Alan Turing played a key (and tragic) role in the development of computer science.

Born in 1912 in London, England, Alan Turing showed signs of uncommon intelligence at a young age. Unfortunately for him, not everyone recognized his mathematical acumen; his prep-school headmaster went so far as to write to Turing's parents that, when it came to the realm of numbers, young Alan "underperformed" and seemed "absolutely uninterested in the subject."

It turns out that Turing's disinterest was in doing math on the easy setting; he was enthralled by high-level concepts. He enjoyed grappling with advanced problems and was especially good at reducing difficult concepts to easily understandable explanations.

At age 15, when most young men are more interested in sports and leisure, Turing wrote a summary of Einstein's theory of relativity for his mother, boiling down concepts as vast as time and space into a concise and easily graspable format.

In 1931 Turing entered King's College (University of Cambridge) on a scholarship and immediately set about making a name for himself. His 1934 dissertation proved the Central Limit Theorem, one of the key results in probability theory (independently, as it turned out: unbeknownst to Turing, the theorem had already been proven years earlier).

Upon graduation, Turing accepted a fellowship at the school. Appointment to the prestigious post at such a young age — he was just 22 — led to a ditty oft repeated among the student body and faculty, "Turing must have been alluring, to get made a Don so early on."


Decision master


In addition to his regular school duties, Turing found time to answer one of the 20th century's most important unsolved mathematical challenges: the Decision Problem ("Entscheidungsproblem" in German).

The brainchild of German mathematician David Hilbert, the Decision Problem asks: does an algorithm exist that can take as input any statement written in formal logic and produce a "yes" or "no" answer that is always accurate? Hilbert's own opinion was that there were no unsolvable problems.

This question was extremely important for the field of mathematics: if such an algorithm existed and could be discovered, then even the most advanced mathematical questions could, in principle, be settled mechanically.

Working independently on the other side of the pond, American mathematician Alonzo Church answered Hilbert's query in the negative via a formal system he developed and named the lambda calculus. The downside to the lambda calculus was that, for most people, it was difficult to apply and understand.
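
To get a small taste of why Church's system felt so abstract, here is a sketch of his "Church numerals" rendered in Python's lambda syntax (an illustration of the idea only; Church worked on paper, decades before programming languages existed). Numbers, and arithmetic on them, are built from nothing but single-argument functions.

```python
# A number n is represented as "apply a function f, n times."
ZERO = lambda f: lambda x: x                      # f applied zero times
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))   # one more application
PLUS = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(numeral):
    # Convert back to an ordinary integer by counting applications.
    return numeral(lambda k: k + 1)(0)

ONE = SUCC(ZERO)
TWO = SUCC(ONE)
print(to_int(PLUS(TWO)(TWO)))   # -> 4
```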


In a 1936 paper, On Computable Numbers, with an Application to the Entscheidungsproblem, Turing independently arrived at the same conclusion, but via a much simpler and more understandable method: he conceptualized a hypothetical machine capable of computing anything that is computable.
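
Turing's machine is simple enough to simulate in a few lines of modern code. The sketch below (our own minimal illustration, not anything from Turing's paper) has all three ingredients he described: a tape of symbols, a read/write head, and a finite table of rules. The example rule table adds 1 to a binary number.

```python
# A minimal Turing machine: rules map (state, symbol) -> (write, move, next state).
def run_turing_machine(rules, tape, state="start", blank="_"):
    cells = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example program: increment a binary number. Walk right to the end of the
# input, then carry 1s leftward until a 0 (or a blank) absorbs the carry.
INCREMENT = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "R", "halt"),
    ("carry", "_"): ("1", "R", "halt"),
}

print(run_turing_machine(INCREMENT, "1011"))   # -> 1100 (11 + 1 = 12)
```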


This imagined device, now known as the Turing machine, is considered one of the 20th century's most influential mathematical abstractions: the precursor to the modern computer and a cornerstone of computer science. After completing a Ph.D. in mathematics and cryptology at Princeton University in New Jersey, Turing returned to Cambridge, resuming his regular duties and working part-time at the Government Code and Cypher School (GC&CS).

Fighting the Nazis


What seemed destined to be a quiet career in mathematics changed dramatically when Nazi Germany invaded Poland in 1939, setting off World War II. When war broke out, Turing immediately volunteered to work at Bletchley Park, the GC&CS's wartime station, where he became one of its most renowned figures for his work in cracking the codes of Germany's famous Enigma machine.


Enigma, the most advanced encryption device then in existence, was developed in Germany at the end of the First World War in 1918. In the years since, it had been adopted in Germany for commercial, diplomatic and, ultimately, military communications.

The Enigma device's operation was simple and ingenious: plain text typed into the machine was encrypted via electromechanical rotors and a plugboard. Every letter entered was scrambled into a different letter on an attached lampboard, and because the rotors advanced with each keypress, the substitution changed from one letter to the next. To decipher a message, one needed to know the exact settings of each rotor and the plugboard connections.
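
For the curious, here is a drastically simplified, single-rotor Enigma in Python (the real machine used three or four rotors plus ring settings; the wirings below are the historical rotor I and reflector B). It demonstrates the two properties that matter for this story: running a message through twice with the same settings recovers the original, and no letter is ever encrypted as itself.

```python
import string

ALPHABET = string.ascii_uppercase
ROTOR = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"       # historical rotor I wiring
REFLECTOR = "YRUHQSLDPXNGOKMIEBFZCWVJAT"   # reflector B: pairs letters, no fixed points

def enigma(text, rotor_start=0, plug_pairs=("AB", "CD")):
    # Plugboard: swap a few letter pairs before and after the rotor.
    plug = {c: c for c in ALPHABET}
    for a, b in plug_pairs:
        plug[a], plug[b] = b, a

    out, step = [], rotor_start
    for ch in text:
        step += 1                  # the rotor advances before each letter
        c = plug[ch]
        # Forward through the rotor at its current offset...
        c = ROTOR[(ALPHABET.index(c) + step) % 26]
        c = ALPHABET[(ALPHABET.index(c) - step) % 26]
        # ...bounce off the reflector...
        c = REFLECTOR[ALPHABET.index(c)]
        # ...then back through the rotor in reverse, and out the plugboard.
        c = ALPHABET[(ALPHABET.index(c) + step) % 26]
        c = ALPHABET[(ROTOR.index(c) - step) % 26]
        out.append(plug[c])
    return "".join(out)

secret = enigma("ATTACKATDAWN")
print(secret)            # scrambled, and no letter stands for itself
print(enigma(secret))    # the same settings decrypt it: ATTACKATDAWN
```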


The device was so sophisticated that the number of possible settings for the rotors and plugboard on any given day exceeded 150 quintillion (a 15 followed by 19 zeroes). Without the daily settings, German communiques were effectively impossible to decipher.
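
That figure is straightforward to check. The short calculation below uses the commonly cited configuration of the German army's three-rotor Enigma: three rotors chosen in order from a box of five, each starting at any of 26 positions, plus ten plugboard cables pairing up 20 of the 26 letters (ring settings are usually left out of this count).

```python
from math import factorial

rotor_orders = 5 * 4 * 3    # 60 ways to pick and arrange three of five rotors
rotor_positions = 26 ** 3   # 17,576 starting positions
# Ways to pair 20 of 26 letters with ten cables: 26! / (6! * 10! * 2^10)
plugboard = factorial(26) // (factorial(6) * factorial(10) * 2**10)

total = rotor_orders * rotor_positions * plugboard
print(f"{total:,}")   # 158,962,555,217,826,360,000 (about 1.6 x 10^20)
```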


By exploiting a flaw in Enigma's design, however (no letter could ever be encrypted as itself), Turing's team constructed an electromechanical machine capable of quickly determining Enigma's daily settings. They named their creation the Bombe. Decoding German naval messages enabled the Allies to better defend against U-boat attacks on the food and munitions convoys coming from North America.
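
A sketch of how that flaw was weaponized: codebreakers guessed a probable word in a message (a "crib") and slid it along the intercepted ciphertext. Any alignment in which a letter lined up with itself was impossible and could be discarded outright, and the Bombe automated contradiction-hunting of this kind on a massive scale. Both strings below are invented purely for illustration.

```python
def possible_alignments(ciphertext, crib):
    # Keep only the offsets where no crib letter matches the ciphertext
    # letter above it; Enigma never encrypts a letter as itself.
    hits = []
    for offset in range(len(ciphertext) - len(crib) + 1):
        window = ciphertext[offset : offset + len(crib)]
        if all(c != p for c, p in zip(window, crib)):
            hits.append(offset)
    return hits

cipher = "TEWWTRETBREWTTER"   # made-up intercept
crib = "WETTER"               # German for "weather": weather reports made ideal cribs
print(possible_alignments(cipher, crib))   # -> [7, 8]: eleven candidates cut to two
```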


Not only did the Bombe enable Britain to keep her people fed, but experts generally agree that cracking Enigma's codes shortened the war by at least two years, saving hundreds of thousands of lives on both sides of the conflict. For his efforts, Turing was appointed an Officer of the Order of the British Empire (OBE) by the government.

After the war, Turing continued to make significant contributions to the advancement of computing. Most noteworthy among them, perhaps, was his development of the concept of artificial intelligence (AI) in his 1950 paper, Computing Machinery and Intelligence.

The advent of electronic computers led to the design and construction of ever more powerful machines. Aware of their increasing power, Turing conceived of a machine capable of simulating the mental processes of a human brain.

He believed that what the brain accomplished via neurons, a machine could do through electronic circuitry. To put the idea to the test, he proposed a famous experiment, the Turing Test: if a human interrogator conversing by text with both a machine and a person cannot reliably tell which is which, the machine can be said to exhibit intelligent behavior.

Turing's concept of a machine that could think like a human opened the door for many of the era's leading technology proponents to explore and debate the possibilities. It also scared a lot of people who feared the consequences of independent machines that could learn. (It's an idea that still makes us queasy today, as the unending stream of books, movies, and television shows about malevolent computers will attest.)

A life cut short


Sadly, Turing did not live to see his conception of a thinking machine brought to fruition. He died on June 7, 1954, after biting into an apple laced with cyanide. His suicide is believed to have been brought on by a complicated and barbaric prosecution for "gross indecency."

Though his years at Bletchley Park had included a marriage proposal and a short-lived engagement to a colleague and fellow mathematician, Joan Clarke (later Murray), Turing was a gay man at a time when gay men were not just discriminated against but actively criminalized under British law. Turing's loving and sexual relationship with a younger man was uncovered by British authorities during the investigation of a break-in at his home.

The 39-year-old Turing's disclosure of his relationship with 19-year-old Arnold Murray eventually led to criminal proceedings against both men. Acting on both legal and familial counsel, Turing entered a guilty plea and, following his conviction, accepted probation instead of imprisonment. As a condition of his probation, he was forced to undergo a series of hormone injections intended to eliminate his sexual urges.

The treatment was carried out over an entire year. About a year after completing his probation, Turing took his own life. Biographers later speculated that his chosen method was intended to re-enact the poisoned-apple scene from the Walt Disney film Snow White and the Seven Dwarfs, a favorite of Turing's.

In the course of his all-too-brief career, Alan Turing significantly advanced the field of computer science and the concept of AI. Generally accepted as the father of both computer science and artificial intelligence, Turing left a legacy draped in honors:

- In June 2007, a life-size statue of him was unveiled at Bletchley Park.

- The Princeton University Alumni Weekly named Turing Princeton's second most significant alumnus, behind James Madison.

- Time magazine named him one of its "100 Most Important People of the 20th Century."

- In 2002, a BBC nationwide poll ranked him 21st on its list of the "100 Greatest Britons."

- In 2019, the Bank of England announced that Turing would appear on the United Kingdom's new £50 note. He was chosen from a list of nearly 1,000 candidates nominated by the general public, beating out such fellow luminaries as theoretical physicist Stephen Hawking and mathematician Ada Lovelace.


Turing once predicted that, by the year 2000, AI would be advanced enough to pass the Turing Test. While there have been a number of well-documented experiments, to date no AI has convincingly done so. (Setting aside the character played by Alicia Vikander in the 2015 film Ex Machina.)

MORE HISTORIC HACKS
Would you like more insight into the history of hacking? Check out Calvin's other articles about historical hackery.
About the Author
Calvin Harper

Calvin Harper is a writer, editor, and publisher who has covered a variety of topics across more than two decades in media. Calvin is a former GoCertify associate editor.