Exploiting deep links

Have you ever thought about the roots of the progression from 1.0 to 2.0, 3.0 and so on? It seems to apply to several areas, and it is not about creating a new version of something (as it was in the past, although back then we did not use this kind of labelling to name a new version) but rather about growing from one paradigm to another that somehow embeds the previous one, taking it a step "higher".
What is happening, at least as I see it, is that we are creating ever more powerful abstractions. This is very "human-like", and very powerful in energy-saving terms.
Rather than dealing with all the subtleties of a complex environment, we abstract a few relevant aspects and work on these as our primary building blocks. It is what happens, as far as we can tell, in our brain, where the workings of millions of neurones lead to an emerging property that becomes the signal considered at the higher level of abstraction.
The reason we did not take this approach in the last century is that the complexity of our systems was not high enough to generate these emerging behaviours. This is now happening more and more.
Big data is a clear example: data generated at the sensor level, once aggregated and connected, create an emerging behaviour that in turn can be further aggregated.
A similar thing is happening at the level of services. They used to be just a few, but now, with the avalanche of Apps, they number in the millions, and their usage brings them into the Big Data domain. No wonder, then, that somebody starts thinking about how to leverage this complexity through abstraction, a process that decreases complexity and, along with it, increases efficiency.
Apps have changed the way we look at the Web and interact with it. Rather than searching for data, we rely on Apps to create information out of the data available on the Web. This both narrows the Web we actually access and creates a new Web. It is something companies are trying to exploit (why else would Booking prompt you to use their App rather than their website…), and Google is watching carefully, since this is basically chewing on its turf. Google has started to work with App developers, offering them deep links to Google so that it can keep providing customised ads.
We are seeing Apps that work as an interface, delivering a service or piece of information to you by accessing other Apps, thus creating an abstraction.
Quixey is one of these. It has put $74 million and 150 people into delivering a new approach to searching for services and information.
Quixey creates an index of information/data sources based on the Apps available on the app stores (all of them) and accesses them using "deep links", special entry points that developers build into their applications. These are a sort of hyperlink providing access to a specific function inside an application.
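To make the mechanism concrete, here is a minimal sketch of how a deep link might be handled on Android, written in Kotlin. The myapp:// scheme, the ProductActivity class and the showProduct helper are hypothetical names used for illustration, and the sketch assumes a matching <intent-filter> declaring that scheme in the App's manifest.

```kotlin
import android.app.Activity
import android.os.Bundle

// Hypothetical Activity registered (via an <intent-filter> in the manifest)
// to handle deep links such as myapp://product/42.
class ProductActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // The deep-link URI arrives in the Intent that launched the Activity.
        val uri = intent?.data               // e.g. myapp://product/42
        val productId = uri?.lastPathSegment // "42"
        // Jump straight to the requested screen instead of the App's home page.
        showProduct(productId)
    }

    private fun showProduct(id: String?) {
        // Render the product screen for the given id (details omitted).
    }
}
```

An index like Quixey's can then point directly at URIs such as myapp://product/42, sending the user to a specific function inside an App just as a hyperlink sends a browser to a specific page.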
The level of effort and investment clearly tells us this is not "yet another app".
What intrigues me is this abstraction process, made possible by an ever more complex environment in cyberspace, one that will eventually create functions mimicking our own intelligence.

About Roberto Saracco

Roberto Saracco fell in love with technology and its implications a long time ago. His background is in math and computer science. Until April 2017 he led the EIT Digital Italian Node and then was head of the Industrial Doctoral School of EIT Digital up to September 2018. Previously, up to December 2011, he was the Director of the Telecom Italia Future Centre in Venice, looking at the interplay of technology evolution, economics and society. At the turn of the century he led a World Bank-Infodev project to stimulate entrepreneurship in Latin America. He is a senior member of IEEE, where he leads the New Initiative Committee and co-chairs the Digital Reality Initiative. He is a member of the IEEE in 2050 Ad Hoc Committee. He teaches a Master course on Technology Forecasting and Market impact at the University of Trento. He has published over 100 papers in journals and magazines and 14 books.