Alfred Aho and Jeffrey Ullman have received the Turing Award, often described as the Nobel Prize of computer science, for their work on programming languages.
Aho and Ullman co-authored a series of seminal textbooks on programming languages, data, and algorithms that shaped the thinking of a generation of computer scientists over more than three decades of collaboration. They are best known for their work on compilers: software that translates instructions written in a high-level programming language, such as SQL, into machine code that a computer can execute.
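To make the idea concrete, here is a toy sketch, purely illustrative and not drawn from Aho and Ullman's own work, of what a compiler does: it takes a small arithmetic expression, parses it (borrowing Python's built-in parser for the front end), and emits instructions for a simple stack machine, which a second function then "executes" the way hardware executes machine code.

```python
# Toy illustration of compilation: translate an arithmetic expression into
# instructions for a simple stack machine. This is only a sketch, not the
# techniques from Aho and Ullman's books.
import ast

def compile_expr(source):
    """Translate an expression like '2 + 3 * 4' into stack-machine instructions."""
    ops = {ast.Add: "ADD", ast.Sub: "SUB", ast.Mult: "MUL", ast.Div: "DIV"}

    def emit(node):
        if isinstance(node, ast.BinOp):
            # Compile both operands, then apply the operator to their results.
            return emit(node.left) + emit(node.right) + [(ops[type(node.op)],)]
        if isinstance(node, ast.Constant):
            return [("PUSH", node.value)]
        raise ValueError(f"unsupported syntax: {node!r}")

    return emit(ast.parse(source, mode="eval").body)

def run(instructions):
    """A tiny 'machine' that executes the compiled instructions."""
    stack = []
    for instr in instructions:
        if instr[0] == "PUSH":
            stack.append(instr[1])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append({"ADD": a + b, "SUB": a - b,
                          "MUL": a * b, "DIV": a / b}[instr[0]])
    return stack.pop()

code = compile_expr("2 + 3 * 4")
print(code)       # [('PUSH', 2), ('PUSH', 3), ('PUSH', 4), ('MUL',), ('ADD',)]
print(run(code))  # 14
```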
Most programmers today don't have to think about how a computer's circuits are actually programmed or how information is routed through the system, thanks to theories Aho, now 79, and Ullman, 78, helped build.
“Aho and Ullman established bedrock ideas about algorithms, formal languages, compilers, and databases, which were instrumental in the development of today's programming and software landscape,” said Jeff Dean, a long-serving Google engineer and executive who is now the company's senior fellow and senior vice president of Google Research and Google Health.
The Turing Award is bestowed annually by the Association for Computing Machinery (ACM). The winners will split a $1 million prize, to which Google is contributing. The award is named for Alan Turing, the British mathematician who laid the foundations of modern computing.
After receiving their Ph.D.s from Princeton University, this year's Turing Award winners began working together at Bell Labs in the late 1960s. Initially, they focused on improving algorithms and on translating programming languages. Ullman left Bell Labs in 1969 to return to academia, eventually settling at Stanford University, where he is now a professor emeritus, but the two continued to collaborate on books.
They published The Design and Analysis of Computer Algorithms in 1974, which became the standard textbook for algorithms courses for more than a decade. The book was instrumental in grouping individual algorithms into more general design paradigms, and it has had a lasting impact on the field.
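To show what such a design paradigm looks like in practice, here is a short example of divide-and-conquer, one classic paradigm (the example is ours, not taken from the book): split the input, solve each half recursively, and combine the partial answers.

```python
# Divide-and-conquer, one classic algorithm-design paradigm, illustrated with
# merge sort: split the list, sort each half recursively, then merge.
def merge_sort(items):
    if len(items) <= 1:               # base case: already sorted
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # divide: sort each half independently
    right = merge_sort(items[mid:])
    merged, i, j = [], 0, 0           # combine: merge the two sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```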
Three years later, Aho and Ullman published Principles of Compiler Design, a classic that has taught generations of students how to write compilers and introduced them to computer language theory. Students nicknamed it "the Dragon Book" because of the dragon on its cover, rather like a magical tome Harry Potter might carry around Hogwarts, except that Harry Potter hadn't been invented yet. Ullman says, "I am constantly told that placing this amusing cover on the Dragon Book attracted students to study computer science."

Aho, who spent more than 30 years at Bell Labs and is now a computer science professor emeritus at Columbia University, says that while there, he saw first-hand the value of developing programming languages that would work well for people who wanted computers to do work in a particular area, such as mathematics, chemistry, or typesetting, without having to be experts in those fields.
The art of programming languages, according to Ullman, is “enabling the programmer to tell as little as possible while making as much happen as possible.”
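A toy comparison, with data and a query invented purely for illustration, shows what he means: the imperative version spells out every step, while the declarative SQL version states only the desired result and lets the query engine decide how to compute it.

```python
# Two ways to total each customer's orders: an explicit loop that says how,
# step by step, versus one declarative SQL query that only says what is wanted.
# The table and data are made up for illustration.
import sqlite3

orders = [("alice", 30), ("bob", 20), ("alice", 25), ("bob", 5)]

# Imperative version: describe every step of the computation.
totals = {}
for customer, amount in orders:
    totals[customer] = totals.get(customer, 0) + amount
print(totals)  # {'alice': 55, 'bob': 25}

# Declarative version: state the result; the query engine figures out the rest.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer TEXT, amount INTEGER)")
db.executemany("INSERT INTO orders VALUES (?, ?)", orders)
print(db.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer ORDER BY customer"
).fetchall())  # [('alice', 55), ('bob', 25)]
```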
Both Aho and Ullman agree that the most rewarding aspect of their careers has been the influence they have had on the students they have taught and mentored, several of whom now hold senior positions at major technology firms and have developed programming languages of their own. Google co-founder Sergey Brin was one of Ullman's Ph.D. students.
According to Aho, it was critical to give students a foundation in the theory of programming languages and algorithms, in part because programming languages go in and out of fashion. He observes that "the first programming languages taught to students in academia shift all the time. It used to be C or C++, then Pascal, and now Python appears to be the most common programming language. Who knows what will happen in ten years, a hundred years, or a thousand years?" "We both assume fundamentals and abstractions have more staying power than modern technology," he says.
________________________________________________________________________
Source: Fortune