Who Invented the Computer? Dennis Ritchie and C
This is the thirty-fifth installment in our ongoing series.
In 1843, Augusta Ada King, Countess of Lovelace, sat down at her desk and scratched out an algorithm, along with detailed instructions for how Charles Babbage's proposed Analytical Engine could compute Bernoulli numbers. Although Babbage's machine never left the drawing board, Lovelace's algorithm has gained lasting fame as the first-ever computer program.
Unfortunately, since actual programmable computers would not exist for more than a century, the concept of programming languages lay dormant. That changed in the late 1940s with the advent of assembly language, a low-level form of programming used on early machines such as the Electronic Delay Storage Automatic Calculator (EDSAC), which first ran in 1949 and was built to solve problems such as large systems of differential equations.
Assembly language was complex and difficult to learn. Furthermore, because it sat so close to the hardware, large amounts of code were required to make a computer perform even the simplest operations. Computer users quickly recognized the need for more efficient, easier-to-use programming languages.
During the next two decades, several new programming languages were developed. In 1952, British engineers developed Autocode, a simplified coding system for digital computers. Five years later, IBM took a big step forward in programming languages when it released FORTRAN, a general-purpose language that was especially suited to numerical computation and scientific computing. FORTRAN proved so effective that it is still in use today.
Rear Admiral Grace Hopper earned the nickname "Grandma COBOL" in 1959, when a Department of Defense committee used FLOW-MATIC, a programming language Hopper had previously created, as the basis for the Common Business Oriented Language, or COBOL. COBOL was an early high-level programming language designed to run on many different types of computers.
In 1964, in an effort to make computing accessible to students in nonscientific fields, Dartmouth professors John Kemeny and Thomas Kurtz created BASIC, a language built from simple English words and phrases that greatly simplified the process of writing computer programs.
Each of these languages served its niche well, but there was still a need for a high-level language that combined general-purpose power with a comprehensive set of features. The search for such a language would end with Dennis Ritchie (1941-2011), a computer scientist working at AT&T's Bell Labs.
UNIX and B Programming Language
In the early 1970s, Ritchie and his colleagues were developing the multi-user operating system UNIX. While writing the operating system, the team used assembly language along with another language called "B," which Ken Thompson had created earlier with contributions from Ritchie.
As a high-level programming language, B enabled Ritchie's team to progress rapidly, since they did not need to write excessive amounts of code. But B was also typeless: every value was simply a machine word. That made it flexible enough to hold any kind of data, but it also made programs unreliable, because the compiler had no way to catch the kinds of mistakes a type system would flag.
Frustrated with B, Ritchie decided to give the language a few tweaks. He retained most of B's syntax and structure but added two important features: data types, a classification system that gives meaning to the raw bits stored by the computer, and the capability to represent structured data.
The tweaking effort took Ritchie two years and was extensive enough that in the process he created an entirely new programming language, "C," which improved on its predecessor B in nearly every respect.
As a general purpose programming language, C was much closer to human language than machine code and proved ideal for the UNIX operating system as well as all other UNIX applications. C would go on to prove its value in system administration, language compilers, databases, programming networks, and for creating embedded software.
Surprisingly, while users found C to be a simple yet powerful tool for programming, the language wasn't formally documented until 1978, when Ritchie teamed up with Brian Kernighan to write and release a book, The C Programming Language. The book became the de facto reference for programmers using C and continues to be a bestseller.
While more powerful programming languages have been developed, C continues to be one of the most popular. Although it has significant limitations, including the absence of constructors and destructors, no run-time type checking, and fully manual memory management, there are still plenty of reasons for new programmers to invest the time required to learn C.
Some of C's more useful characteristics include:
Portability — C works well on multiple platforms, including macOS, Windows, Linux, and Android.
Comprehensibility — With relatively few keywords and symbols to learn, C is easier for new programmers to understand. Knowing C also makes it much easier to learn its descendants, such as C++ and C#.
Readability — C's small, consistent syntax makes programs easy to read, so errors are easier to spot and correct.
Faster Execution — C compiles to efficient machine code with little run-time overhead, so programs tend to run quickly.
Dynamic Memory Allocation — C programs can request and release memory while they run, so programmers don't need to reserve fixed amounts of memory before an application executes.
Also, because C has never gone out of fashion or fallen into disuse, it has widespread and robust (if informal) support. There is a large and active community of developers who continue creating new tools and libraries for C.
While the computing world has undergone massive change during the last 50 years, the C programming language has weathered every shift and remains comfortably ensconced in its niche. Its versatility makes it the right tool for many programming tasks. Nothing lasts forever, but C does not look likely to disappear anytime soon.