Q What is the biggest challenge in computing?
A An easy answer would be quantum computing, but we have so much to learn before that is possible. A far more pressing challenge is electricity.
In 2008, I spent time with IBM's top scientists in California. For them, electricity was the problem. How can we make chips use less of it? How can we tame the bills hitting IT departments? How can we transmit it, store it, and avoid wasting it?
Our insatiable demand for processing power shows no sign of abating, and the IT industry needs to respond. Each year we run more devices, faster chips, more connections, more applications. Demand keeps growing, but the electricity supply does not.
At a small scale, we experience the challenge with our handsets: recharging them constantly, and the unpleasant heat after long calls. We dare not leave Wi-Fi on, and screens dim after a few minutes to save power. Manufacturers would love to pack in more features and gadgets, but the batteries would drain too fast.
The annual cost of piping electrons into big data centres already exceeds the cost of buying and maintaining the equipment. Some big IT companies are being barred from drawing more power from the grid. We are hitting hard limits that change the economics of computing, and they overlap with our big environmental challenges. Clever people are shifting computer centres to cold countries such as Iceland, and combining them with urban heating schemes.
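The economics behind that claim can be sketched with a back-of-envelope calculation. All the figures below (power draw, cooling overhead, electricity price) are illustrative assumptions, not numbers from the column; the point is simply that a machine running around the clock racks up a bill that, over a few years, can rival its purchase price.

```python
# Back-of-envelope annual electricity cost of one always-on server.
# Every figure here is an illustrative assumption, not a sourced number.

def annual_energy_cost(avg_power_watts, price_per_kwh):
    """Cost of running a device 24/7 for one year."""
    hours_per_year = 24 * 365
    kwh = avg_power_watts / 1000 * hours_per_year
    return kwh * price_per_kwh

# Assume a server drawing 400 W on average, with cooling and power
# distribution roughly doubling the draw, at an assumed $0.15/kWh:
cost = annual_energy_cost(400 * 2, 0.15)
print(f"${cost:,.0f} per year")  # roughly $1,051 per year
```

At that (assumed) rate, three or four years of electricity is in the same ballpark as the price of the server itself, which is why operators chase every watt.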
Electricity is also why computer centres are shifting to "the cloud". Distributing the load across millions of computers over the internet keeps the machinery busier and therefore more efficient, delivering more bang per buck. Or, indeed, more bang per watt.
■ Response by Dr Bruce McCabe, a novelist who lectures on the future and advises companies on how to survive it. He will speak at National Science Week on August 18 at the ANU.
Brought to you by the Fuzzy Logic Science Show, 11.30am Sundays on 2XX 98.3FM. Send your questions to AskFuzzy@Zoho.com.