Crunching Big Data

Processing speed has been increasing with amazing regularity, following Moore's law, and today we can crunch on a laptop what required a supercomputer just 20 years ago. However, the amount of data is growing even faster, and extracting meaning from these data, the Big Data, requires a crunching capacity that is growing faster than processing speed!
Hence the interest in this news of a team of researchers at MIT who have found a way to leapfrog today's processing capacity in the area of crunching Big Data.
If data are stored on conventional hard drives, the access time is measured in ms. If, on the other hand, data are stored on flash memory, the access time is on the order of µs, that is, 1,000 times faster. This is what the researchers have exploited. Each flash data store is connected, on the same board, to a Field Programmable Gate Array (FPGA) chip that can retrieve data and perform some operations on them.
The boards are connected to one another through a very fast, low-latency serial network, letting data move from one node to another in ns.
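The gap between those three timescales is what makes the approach pay off. As a back-of-the-envelope sketch, the snippet below compares the total time for a large batch of random reads on disk versus on networked flash; the specific latency values (5 ms, 5 µs, 500 ns) are illustrative assumptions consistent with the orders of magnitude above, not measurements from the MIT prototype.

```python
# Illustrative access times (assumptions, not measured values).
HDD_ACCESS_S = 5e-3      # conventional hard drive: a few milliseconds
FLASH_ACCESS_S = 5e-6    # flash memory: a few microseconds
NETWORK_HOP_S = 500e-9   # serial link between boards: hundreds of ns

def total_access_time(n_reads, medium_latency_s, hops=0):
    """Time for n random reads, each crossing 'hops' board-to-board links."""
    return n_reads * (medium_latency_s + hops * NETWORK_HOP_S)

n = 1_000_000  # one million random reads, typical of a Big Data scan
hdd = total_access_time(n, HDD_ACCESS_S)
flash_remote = total_access_time(n, FLASH_ACCESS_S, hops=1)

print(f"HDD:          {hdd:,.0f} s")      # 5,000 s
print(f"Remote flash: {flash_remote:.1f} s")  # 5.5 s
print(f"Speed-up:     {hdd / flash_remote:.0f}x")
```

Even after paying a network hop to reach flash on another board, the job finishes roughly three orders of magnitude sooner, which is why keeping the data on flash next to the FPGAs is worthwhile.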
They have developed a prototype storage network with 16 nodes and a capacity of up to 32 TB. That would provide good crunching support to people having to analyse Big Data. The idea is that you harvest data from several sources, store them in this storage-crunching device for your analyses and then, once done, move them to more conventional storage. It is a bit like using a cache in a computer for faster processing. Indeed, we are already seeing this kind of approach in mass-market products, like the latest Mac Pro from Apple.
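The harvest, analyse, archive workflow can be sketched as a simple two-tier store. This is a minimal illustration of the idea, not the prototype's actual interface; the class and method names here are hypothetical.

```python
class TieredStore:
    """Toy model of a fast analysis tier in front of conventional storage."""

    def __init__(self):
        self.flash_tier = {}    # fast storage-crunching tier (the prototype's role)
        self.archive_tier = {}  # cheaper conventional storage

    def harvest(self, source, records):
        """Load freshly gathered data into the fast tier for analysis."""
        self.flash_tier[source] = records

    def analyse(self, source, fn):
        """Run an analysis while the data sits in the fast tier."""
        return fn(self.flash_tier[source])

    def archive(self, source):
        """Once the analysis is done, demote the data to conventional storage."""
        self.archive_tier[source] = self.flash_tier.pop(source)

store = TieredStore()
store.harvest("sensors", [3, 1, 4, 1, 5])
total = store.analyse("sensors", sum)  # analysis runs against the fast tier
store.archive("sensors")               # then the data moves to cheap storage
```

The point of the sketch is the lifecycle: data are only resident in the expensive fast tier while they are actively being crunched, exactly as with a cache.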
The catch, obviously, is cost. Magnetic disc storage is far cheaper than flash memory storage, but we know that technology evolution is bringing the cost of flash down very rapidly.

About Roberto Saracco

Roberto Saracco fell in love with technology and its implications a long time ago. His background is in math and computer science. Until April 2017 he led the EIT Digital Italian Node and then was head of the Industrial Doctoral School of EIT Digital up to September 2018. Previously, up to December 2011, he was the Director of the Telecom Italia Future Centre in Venice, looking at the interplay of technology evolution, economics and society. At the turn of the century he led a World Bank-Infodev project to stimulate entrepreneurship in Latin America. He is a senior member of IEEE, where he leads the New Initiative Committee and co-chairs the Digital Reality Initiative. He is a member of the IEEE in 2050 Ad Hoc Committee. He teaches a Master course on Technology Forecasting and Market Impact at the University of Trento. He has published over 100 papers in journals and magazines and 14 books.