I just ran into an interesting "flashback" article on the evolution of storage and processing, celebrating 50 years of Moore's Law.
Although it does not contain anything new (it is about history, not the future), seeing the whole past flow by in just two minutes (the time it took to read the article and look at the infographics) is somehow fascinating.
Back in 1956, that is almost 60 years ago, IBM managed to create a huge storage capacity for its RAMAC computer (Random Access Method of Accounting and Control): 5 MB. It was built by stacking 50 magnetic disks, each 24 inches in diameter, with an access time of 600 ms (over half a second!) at a cost of $165,000.
Now, in 2015, you can buy a 6 TB hard disk for your home computer at a cost of a few hundred dollars, and it fits in the palm of your hand. If you take into account capacity, size, speed and cost, the performance increase is on the order of a trillion times (thousands of billions).
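A quick back-of-the-envelope sketch of where those huge factors come from, using the figures quoted above; the $300 modern price and the 10 ms modern seek time are my own assumptions standing in for "a few hundred dollars" and a typical consumer drive:

```python
# Rough comparison of the 1956 IBM RAMAC drive with a 2015 consumer hard disk.
# Figures for the RAMAC come from the article; the modern price ($300) and
# seek time (10 ms) are assumed ballpark values, not quoted ones.

ramac_capacity_mb = 5            # 5 MB
ramac_cost_usd = 165_000
ramac_access_ms = 600

modern_capacity_mb = 6_000_000   # 6 TB expressed in MB
modern_cost_usd = 300            # assumed: "a few hundred dollars"
modern_access_ms = 10            # assumed typical HDD seek time

capacity_ratio = modern_capacity_mb / ramac_capacity_mb
cost_per_mb_ratio = (ramac_cost_usd / ramac_capacity_mb) / (modern_cost_usd / modern_capacity_mb)
speed_ratio = ramac_access_ms / modern_access_ms

print(f"capacity:    {capacity_ratio:,.0f}x")     # 1,200,000x
print(f"cost per MB: {cost_per_mb_ratio:,.0f}x")  # ~660,000,000x
print(f"access time: {speed_ratio:.0f}x")         # 60x
```

Capacity alone improved by a factor of about a million; fold in cost per megabyte, access speed, and physical size, and the combined improvement lands in the trillion-times territory the article describes.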
Take a look at the infographic telling the story of the increase in processing power. Here the representation points to the "computers" that reached certain capacity thresholds, measured in floating point operations per second.
Clearly the increase in processing power has been staggering, from thousands of FLOPS to quadrillions of FLOPS (an increase by a factor of 100 billion), but what caught my curiosity is the spread of devices that have "acquired" processing power, and how much of it they have. An Apple Watch has a processing power of some 4 billion FLOPS: it rests on your wrist and crunches 4,000,000,000 floating point operations per second!
Again, no real news here, but seeing it all from different perspectives makes you think about the future we have created and the future still to be invented.