Can energy usage data tell us anything about the quality of our programming languages? Last year, a team of six researchers from three different Portuguese universities decided to investigate this question, ultimately releasing a paper titled “Energy Efficiency Across Programming Languages.” They ran solutions to 10 programming problems written in 27 different languages,…
It turns out to be a hard question to answer – yet it matters whenever software runs on battery-operated portable devices.
The paper took a hard look at the common assumption that a faster program will always use less energy, pointing out that it’s not as simple as the physical law E(nergy) = T(ime) × P(ower).
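To see why E = T × P alone doesn’t settle the question, consider a toy calculation. The figures below are hypothetical, chosen only to illustrate the trade-off: a faster program that drives the CPU at higher average power can still consume more total energy than a slower, lower-power one.

```python
# Toy illustration of E = T * P: runtime alone does not determine energy.
# All figures are made-up for illustration, not measurements.

def energy_joules(runtime_s, avg_power_w):
    """Energy = time * average power (E = T * P)."""
    return runtime_s * avg_power_w

# Hypothetical program A: fast, but keeps every core busy at high power.
fast_program = energy_joules(runtime_s=2.0, avg_power_w=30.0)   # 60 J

# Hypothetical program B: 50% slower, but draws half the power.
slow_program = energy_joules(runtime_s=3.0, avg_power_w=15.0)   # 45 J

print(fast_program, slow_program)  # the slower program wins on energy
```

The point is not the specific numbers but the shape of the trade-off: speeding a program up only saves energy if the extra power it draws doesn’t outweigh the time it saves.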
I proved this in my Master’s in Software Engineering thesis eight years ago. Software developers have long operated on the belief that a “fast,” efficient program will use less energy. But ultimately it depends on how their high-level code is translated into compiled code or pseudo-code, or interpreted, and on the underlying implementation of any virtual machine that executes the pseudo-code.
Paradoxically, less efficient algorithms can indeed use less energy – it ultimately depends on how the hardware is put to use. And in the case of pseudo-code, this means a power-optimized app running on one Android phone might be more – or less – efficient when run on a phone from a different manufacturer that uses a different virtual machine to execute the pseudo-code!
There is a complex and non-obvious trade-off between algorithm efficiency, memory usage, and power consumption – and it varies by language and by device.
On phones, the big power users tend to be the display, the GPS, and the camera – and, of course, the CPU. Most power-reduction strategies work to keep hardware turned off, or in a low-energy state, as much as possible.
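The “keep hardware asleep” idea can be sketched with the same E = T × P arithmetic. In the duty-cycling sketch below, every power figure is hypothetical, picked only to show the shape of the savings: waking a component briefly and returning it to a low-power state cuts its energy roughly in proportion to its active time.

```python
# Toy duty-cycle model for a single component (say, a GPS radio) that has
# a high active power and a low sleep power. Figures are hypothetical.

def duty_cycle_energy(total_s, active_fraction, active_w, sleep_w):
    """Energy over total_s seconds, active for active_fraction of that time."""
    active_s = total_s * active_fraction
    sleep_s = total_s - active_s
    return active_s * active_w + sleep_s * sleep_w

# Left on continuously for a minute vs. woken 10% of the time.
always_on  = duty_cycle_energy(60.0, 1.0, active_w=0.5, sleep_w=0.01)  # 30.0 J
duty_cycled = duty_cycle_energy(60.0, 0.1, active_w=0.5, sleep_w=0.01) # ~3.5 J
```

This is why batching sensor reads and network requests is such a common tactic: the savings come from the long stretches the hardware spends in its sleep state, not from making any single operation faster.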