Programming is not what it used to be
W Daniel Hillis, Scientist

Now basically, almost every big data computation on the Web or in the Cloud is done with this MapReduce paradigm, so any time you process big data, you use this method. It's a very simple method of parallel programming. The more general method of parallel programming that we developed did more complicated things than reduce. It did things like scan, where you got the partial products, and so we actually developed a bunch of parallel programming techniques for the Connection Machine, many of which have been kind of forgotten. So it's sort of strange seeing technology go backwards, but on the Connection Machine you could take a FORTRAN program that was written as a sequential program and automatically compile it so that it would run in parallel on 64,000 processors. And as far as I know, that really doesn't exist now on the Cloud. So in some sense, the technology of parallel programming has kind of gone backwards.

One thing that's happened is that in the early days of computers, the same people who designed the hardware did the programming. So there were many computers where I designed everything down to the transistor, actually the chip; I wrote the assembler, I wrote the compiler, I wrote the operating system. So I kind of knew everything in the machine. And so when you programmed something, you knew exactly what operations were happening and how long they took. An awful lot of people who program these days don't, because the machines have gotten more complicated and they're designed by lots of different people: the people who designed the compilers are not the same people who designed the assembler, who are not the same people who designed the hardware, who are not the same people writing the programs. None of them really has full knowledge of exactly what's going on at the next level. So a lot of programmers these days don't have a really clear sense of exactly how their programs are getting executed, which is one reason why many computer programs haven't gotten that much faster, even though computers have gotten millions of times faster.
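The two primitives contrasted above can be sketched in a few lines of Python (a modern stand-in for illustration, not the Connection Machine's own languages): reduce collapses a collection to a single value, while scan keeps every intermediate result along the way.

```python
from functools import reduce
from itertools import accumulate
from operator import add

# Map: apply a function to every element independently; each application
# could in principle run on its own processor.
words = ["big", "data", "big", "cloud", "data", "big"]
counts = list(map(lambda w: 1, words))

# Reduce: combine everything with an associative operator into one result.
total = reduce(add, counts)

# Scan (parallel prefix): like reduce, but keep all the partial results,
# the "partial products" mentioned in the transcript.
partials = list(accumulate(counts, add))

print(total)     # 6
print(partials)  # [1, 2, 3, 4, 5, 6]
```

Because both operators are associative, a parallel machine can evaluate the reduce and the scan in a logarithmic number of combining steps rather than one element at a time.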

So for instance, take a word processor. When computers did less than a million instructions a second, I used to use a word processor and it was generally fast enough, but occasionally it would be annoyingly slow when I wanted to move a lot of text or something like that. Now computers are thousands of times faster, if not millions of times faster, than they were in those days. And yet the programs that I use are often just a little bit too slow. And most of that is not because they're doing anything more complicated, but because the programming has gotten much less efficient. So somebody who writes a program in a high-level language may use an instruction that does something extremely complicated that they're not aware of, in terms of how it's translated into the operations of the machine. And in fact, often the programmers aren't even aware of which computer is actually running the instruction. They just have some API and it does something, and they don't really know how it does it.

And so programming is no longer what it used to be. What it used to be was total control of the system: although you didn't always think it through in detail, you always could think it through in detail. Whereas now it's more like magic, in a sense. I mean, the programmers know certain incantations and they know they have this effect, and so they sort of cast spells without really a detailed understanding of what makes that spell work. And it sort of has to be that way, because the machines have gotten so complicated that no single human can understand them anymore.
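The hidden cost described here, a simple-looking instruction concealing a lot of machine work, is easy to reproduce. This is a generic Python illustration of the idea, not any program mentioned in the transcript:

```python
# Two ways to build the same list; both look equally simple in source code.

def build_front(n):
    """Insert each new element at the front of the list."""
    xs = []
    for i in range(n):
        xs.insert(0, i)  # hidden cost: shifts every existing element, O(n)
    return xs            # total work is O(n^2)

def build_back(n):
    """Append, then reverse once at the end."""
    xs = []
    for i in range(n):
        xs.append(i)     # amortized O(1) per call
    xs.reverse()         # one O(n) pass
    return xs            # total work is O(n)

# Same result, very different amount of machine work underneath.
assert build_front(1000) == build_back(1000)
```

A programmer who only sees the two one-liners `insert(0, i)` and `append(i)` has no reason to suspect that one of them quietly turns a linear loop into a quadratic one.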

W Daniel Hillis (b. 1956) is an American inventor, scientist, author and engineer. While doing his doctoral work at MIT under artificial intelligence pioneer Marvin Minsky, he invented the concept of parallel computers that is now the basis for most supercomputers. He also co-founded the famous parallel computing company Thinking Machines in 1983, which marked a new era in computing. In 1996, Hillis left MIT for California, where he spent time leading Disney's Imagineers. He developed new technologies and business strategies for Disney's theme parks, television, motion pictures, Internet and consumer product businesses. More recently, Hillis co-founded an engineering and design company, Applied Minds, and several start-ups, among them Applied Proteomics in San Diego, MetaWeb Technologies (acquired by Google) in San Francisco, and his current passion, Applied Invention in Cambridge, MA, which 'partners with clients to create innovative products and services'. He holds over 100 US patents, covering parallel computers, disk arrays, forgery prevention methods, and various electronic and mechanical devices (including a 10,000-year mechanical clock), and has recently moved into working on problems in medicine. In recognition of his work, Hillis has won many awards, including the Dan David Prize.

Listeners: Christopher Sykes, George Dyson

Christopher Sykes is an independent documentary producer who has made a number of films about science and scientists for BBC TV, Channel Four, and PBS.

Tags: parallel programming, Connection Machine, FORTRAN, Cloud, word processor, computer programming, operating system, technology

Duration: 4 minutes, 32 seconds

Date story recorded: October 2016

Date story went live: 05 July 2017