2+2? Approximately 4

Researchers at Queen's University Belfast have been awarded one million pounds to develop an approximate computer, a machine that, when asked how much 2+2 is, would say "somewhere in the range of 4".
Actually, it is a bit more complex than that. A good portion of computer processing is dedicated to making sure that 2+2 always makes 4; people won’t accept anything less. Components in a computer can run at different speeds, like your car: you press the pedal and the car goes faster, but at the same time it becomes a little less reliable, so you are better off not flooring it. In a computer you can open the throttle by increasing the voltage. It will run faster (well, there is a bit more tweaking needed, but you get the gist), but it will also become less reliable.
Additionally, at the hardware level (in the chip itself) there are error-correcting codes that detect and fix errors to keep results reliable, but this also comes at a cost, since it eats into performance.
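To get a feel for why correctness costs performance, here is a toy sketch (my own illustration, not how chip-level codes actually work): a deliberately faulty adder made reliable by triple modular redundancy, a classic technique that buys correctness with three times the work.

```python
import random

def unreliable_add(a, b, error_rate=0.01):
    """A hypothetical faulty adder: occasionally flips the lowest bit."""
    result = a + b
    if random.random() < error_rate:
        result ^= 1  # inject a single-bit error
    return result

def reliable_add(a, b):
    """Triple modular redundancy: compute three times, take the majority vote.
    Reliability is bought with 3x the work."""
    votes = [unreliable_add(a, b) for _ in range(3)]
    return max(set(votes), key=votes.count)
```

Real hardware uses far cheaper error-correcting codes than brute triplication, but the trade-off is the same: every check or redundant bit is silicon and time not spent on the actual computation.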
There are, however, several problems where absolute correctness is not needed; what matters most is the ability to crunch huge amounts of data. Think about modelling the climate, or modelling the flow of blood in your veins. Getting the whereabouts of a single air molecule or blood cell wrong won’t make any difference to the overall picture. But still, computers are "exact". Teaching them to be approximate is a whole new game, and it is what several researchers are trying to do. By the way, our brain is an example of approximate processing: you don’t know the exact size of that door, but you still manage to walk through it.
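The idea can be sketched in a few lines of Python. This is an illustrative toy (one technique from the approximate-computing literature, loop perforation; it is not the SERT project's method): skip most of the data and accept an answer that is merely "somewhere in the range of" the truth.

```python
import random

def mean_exact(data):
    """Touch every element: the 'always exactly 4' answer."""
    return sum(data) / len(data)

def mean_approx(data, stride=10):
    """Loop perforation: look at only every `stride`-th element.
    Roughly 10x less work, for an answer close to the truth."""
    sample = data[::stride]
    return sum(sample) / len(sample)

# A large, noisy dataset (a stand-in for climate or blood-flow measurements)
random.seed(42)
data = [random.gauss(20.0, 5.0) for _ in range(100_000)]
```

On data like this, the perforated mean lands within a fraction of a percent of the exact one while doing a tenth of the work, which is exactly the bargain an approximate computer offers.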
The Scalable, Energy-efficient, Resilient and Transparent software adaptation (SERT) project at Queen's University Belfast aims at doing exactly this. Researchers believe they will be able to increase the processing capacity of today’s supercomputers without having to change the hardware. Our brain can learn to process data faster, and it does that through software (accumulated experience derived from processing signals); in the brain's case, however, the software rewires the hardware itself. And it has been shown that the smarter (faster) it gets, the less effort it spends processing the signals. An fMRI of my brain and one of a professional basketball player, both engaged in throwing the ball through the hoop, would show that my brain is working much harder than the professional's (and yet with worse results!).
The SERT project is starting in March 2015, so it is too early to tell whether it will deliver. It is interesting, though, to see how software can be used to change hardware performance at the core level.

About Roberto Saracco

Roberto Saracco fell in love with technology and its implications a long time ago. His background is in math and computer science. Until April 2017 he led the EIT Digital Italian Node and then was head of the Industrial Doctoral School of EIT Digital up to September 2018. Previously, up to December 2011, he was the Director of the Telecom Italia Future Centre in Venice, looking at the interplay of technology evolution, economics and society. At the turn of the century he led a World Bank-Infodev project to stimulate entrepreneurship in Latin America. He is a senior member of IEEE, where he leads the New Initiative Committee and co-chairs the Digital Reality Initiative. He is a member of the IEEE in 2050 Ad Hoc Committee. He teaches a Master course on Technology Forecasting and Market Impact at the University of Trento. He has published over 100 papers in journals and magazines and 14 books.