
Paul McLellan

Jack Dongarra Receives Turing Award

5 Apr 2022 • 4 minute read

Last week, the ACM announced that Jack Dongarra has been honored with the 2021 Turing Award, considered the equivalent of the Nobel Prize for Computer Science, and carrying the same million-dollar prize. His name might not be that familiar, but the software libraries he was responsible for are better known: LINPACK, BLAS, LAPACK, ScaLAPACK, PLASMA, MAGMA, and SLATE. LINPACK, in particular, is well known even to people who have never used it: it is almost 50 years old, and it is the basis for the benchmarks used to rank supercomputers. It was originally written in Fortran (the first programming language I learned). Wikipedia describes it as:

LINPACK is a software library for performing numerical linear algebra on digital computers. It was written in Fortran by Jack Dongarra, Jim Bunch, Cleve Moler, and Gilbert Stewart and was intended for use on supercomputers in the 1970s and early 1980s.
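To give a flavor of what a LINPACK-style routine does, here is a toy dense solver of my own (emphatically not Dongarra's code, which is carefully blocked and tuned): Gaussian elimination with partial pivoting to solve Ax = b, the bread-and-butter operation of these libraries.

```python
# Toy sketch of a LINPACK-style dense solve: Gaussian elimination
# with partial pivoting. Real library code is far more sophisticated.
def solve(A, b):
    n = len(A)
    # Work on an augmented copy so the caller's data is untouched.
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        # Partial pivoting: bring the largest remaining pivot to row k.
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        # Eliminate column k below the pivot.
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    # Back substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for i in reversed(range(n)):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

print(solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0]))  # → [0.8, 1.4]
```

The point of BLAS/LAPACK and their descendants is that everyone's applications can call one well-tuned implementation of operations like this instead of each rolling their own.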

I have written a few times about supercomputers:

  • Supercomputers (mostly about TaihuLight)
  • Deep Learning and the Cloud (about Summit)
  • Japanese Arm-Powered Supercomputer Takes the TOP500 Crown (about Fugaku)

Supercomputers are considered to have started with the Cray-1, designed by legendary computer architect Seymour Cray (the photo is the one in the Computer History Museum). I have no idea why Cray never won the Turing Award, maybe because he was too much of a computer architect and not enough of a computer scientist. But that didn't stop Hennessy and Patterson from being honored for creating RISC computing. See my post, Hennessy and Patterson Receive the 2018 Turing Award. Actually, that title is wrong since the award, usually announced in March, is numbered for the previous year, so it was technically the 2017 Turing Award. But Cray's computers of that era are nothing by today's standards. Nowhere has the progress of Moore's Law been more apparent:

More than 80 Cray-1s were sold over the next few years at a price tag between $5 million and $8 million. This behemoth of a computer offered 80 MHz processing speed, a vector processor, and integrated circuits. By comparison, one of the earliest iPhones, the 3GS, offered 600 MHz processing and sold unlocked for $699.

Although I am (or at least was) a programmer, I have no experience in programming supercomputers. Here's a paragraph from a description of the Summit supercomputer:

Each node in Summit has two 22-core IBM Power9 CPUs and six NVIDIA Tesla V100 accelerators. But there are 4,608 of these nodes (that number turns out to be 4,096 + 512, so it's not quite as weird as it looks to computer scientists who only count in powers of 2). So I make that 202,752 Power9 cores and 27,648 NVIDIA Volta GPUs. There's also a little memory...10 petabytes, along with 250 petabytes of storage. Peak performance is 200 petaflops. The DoE is planning an exaflops level machine for 2021.
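The core and GPU counts quoted above are easy to check; here is the arithmetic spelled out:

```python
# Verifying Summit's headline numbers from the per-node figures.
nodes = 4608
cores_per_node = 2 * 22      # two 22-core IBM Power9 CPUs per node
gpus_per_node = 6            # six NVIDIA Tesla V100s per node

print(nodes == 4096 + 512)         # True: 2**12 + 2**9
print(nodes * cores_per_node)      # 202752 Power9 cores
print(nodes * gpus_per_node)       # 27648 V100 GPUs
```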

Obviously, one of the challenges is that supercomputers gain their power from having enormous numbers of CPUs and, these days, GPUs with their own multiple processors. So taking some computational task like weather forecasting—or signal integrity analysis—involves making it work on such a fabric. Here's a quote from the ACM's announcement:

Dongarra’s major contribution was in creating open-source software libraries and standards which employ linear algebra as an intermediate language that can be used by a wide variety of applications. These libraries have been written for single processors, parallel computers, multicore nodes, and multiple GPUs per node. Dongarra’s libraries also introduced many important innovations, including autotuning, mixed precision arithmetic, and batch computations.
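Mixed-precision arithmetic, one of the innovations the ACM cites, is worth a small illustration. The idea is to do the expensive solve in cheap low precision, then recover full accuracy with residual corrections computed in high precision. Here is a toy sketch (mine, not from any of Dongarra's libraries), where "low precision" is simulated by rounding Python's doubles to float32 via the struct module:

```python
# Toy sketch of mixed-precision iterative refinement on a 2x2 system.
import struct

def f32(x):
    # Round a Python double to the nearest float32.
    return struct.unpack('f', struct.pack('f', x))[0]

def solve_low(A, b):
    # Cramer's rule with every operation rounded to float32:
    # this stands in for the cheap low-precision factorization/solve.
    det = f32(f32(A[0][0] * A[1][1]) - f32(A[0][1] * A[1][0]))
    x0 = f32(f32(f32(A[1][1] * b[0]) - f32(A[0][1] * b[1])) / det)
    x1 = f32(f32(f32(A[0][0] * b[1]) - f32(A[1][0] * b[0])) / det)
    return [x0, x1]

def refine(A, b, iters=3):
    x = solve_low(A, b)                     # cheap, ~7-digit accuracy
    for _ in range(iters):
        # Residual computed in full double precision.
        r = [b[i] - sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]
        d = solve_low(A, r)                 # correction, again low precision
        x = [x[i] + d[i] for i in range(2)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = refine(A, b)   # accurate to double precision despite float32 solves
```

On hardware where low precision is much faster (as on the V100's tensor cores), most of the flops happen at the cheap precision while the answer still comes out at the expensive one.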

Summit is the supercomputer at Oak Ridge National Laboratory. The announcement of the award says that Jack "also holds appointments with Oak Ridge National Laboratory," so I guess he is heavily involved with it. His day job is University Distinguished Professor of Computer Science in the Electrical Engineering and Computer Science Department at the University of Tennessee.

One quote from the ACM's announcement is:

Moore’s Law produced exponential growth in hardware performance. During that same time, while most software failed to keep pace with these hardware advances, high-performance numerical software did—in large part due to Dongarra’s algorithms, optimization techniques, and production-quality software implementations.

I would like to challenge that on behalf of the EDA industry. Without EDA software from Cadence (and others) "keeping pace," there wouldn't even be any of these "hardware advances." When I started my career, a large chip was 10,000 gates, say 40,000 transistors. Today, it is over 100 billion transistors. And we don't even use supercomputers. When I started in EDA, there probably weren't enough memory chips in the entire world to load up a large modern design.

Gabriele Kotsis, ACM President, announcing the award, said:

Today’s fastest supercomputers draw headlines in the media and excite public interest by performing mind-boggling feats of a quadrillion calculations in a second. But beyond the understandable interest in new records being broken, high-performance computing has been a major instrument of scientific discovery. HPC innovations have also spilled over into many different areas of computing and moved our entire field forward. Jack Dongarra played a central part in directing the successful trajectory of this field. His trailblazing work stretches back to 1979, and he remains one of the foremost and actively engaged leaders in the HPC community. His career certainly exemplifies the Turing Award’s recognition of ‘major contributions of lasting importance.’

Learn More

Read the announcement on the ACM website.

Sign up for Sunday Brunch, the weekly Breakfast Bytes email.
