While people have doubted the continuation of Moore's law before - and been wrong - I do think it's different this time, exactly because of how close we're getting to feature sizes where we basically measure things in individual atoms. If Moore's law slows down and then stops, that will seriously change the computer industry.
At the same time, I think the thing that has really driven technical innovation has always been communication, and how it enables people to more easily combine and accumulate information. So while electronics manufacturing and the current computer industry may be facing a hiccup, I suspect it will mean that other areas will take up the slack. Because if there is something we've been getting better at lately, it's exactly that "communication" part. Linux is obviously one example of that, and I don't think we're facing any fundamental slowdown.
Q. Where do you see technology innovations today and in the near future?
Computing is still a big deal, and I think the fact that it still (at least for a while) is just getting cheaper and more powerful will drive a lot of innovation. Just the fact that you can economically do form factors that just weren't realistic before (small, energy-efficient, connected and mobile) really does mean that there's potential for fairly smart devices everywhere.
One thing I personally love, and that is starting to happen, is how having all of those cheap mobile devices ends up actually giving us lots of real-time information. You can already get traffic information that isn't from some centralised city "measure the number of cars going through a particular intersection" system, but that is aggregated from the data of people driving with their cellphones.
There are obviously all the privacy issues, but this is the kind of "everyday boring information" that can actually affect your life, and that used to be hard to get in real time. The fact that we are now starting to have navigation devices that take traffic information into account, and it really works (it's not new, but it's actually starting to work in practice these days), is just the first step. I think there's a lot of this kind of "ambient data" that can drive interesting applications.
It won't result in anything flashy like jetpacks or flying cars, but it will be the small details in how you interact with your environment (and vice versa). And a lot of it will really be about humans, not the technology itself. The technology just makes things possible.
It's not just that you're carrying a smartphone around: the fact that you carry a versatile and interactive device in your pocket ends up also raising the bar for what you expect from the technology around you. ATMs with ugly text and bad touch screens just look clunky to you, don't they? And why is the radio in your car so stupid? And why am I standing here waiting for the light to turn green in order to cross the road, even though there are no cars anywhere?
So there are a lot of places where the spread of cheap computing can make these kinds of small, subtle changes; it just takes a long time to really permeate everywhere.
That said, if I went to university today, I would still probably go into computer science, just because I think computers are fun. But from a "big innovation" standpoint, computing these days is more of a tool than, necessarily, the main subject. Computing is a big tool for things like really understanding biochemistry, and driving personalised medicine, for example.
I'm also still hoping for major steps in energy. Maybe they'll finally prove the old joke wrong: "[nuclear] fusion is twenty years in the future, and always will be".
Linus Torvalds was in conversation with Technology Academy Finland, the body that awards the Millennium Technology Prize, which he won this year.