TTM 2014: Future of Networks

As engineers in telecommunications, and more generally in networks, we often face requests from friends to “fix” their connections, to help get rid of spam, or to deal with new “pieces” of equipment that refuse to connect to their existing appliances.
This clearly shows how much “networks” have become part of everyday life. By 2035 the Internet is likely to be a utility, much like the other utilities we take for granted: water, energy, transportation…
Indeed, all these utilities are crucial infrastructures and they have a lot in common. You seldom start from a green field; rather, you upgrade them by adding new parts through interfaces that remain stable over very long spans of time. Think of the interface you use to access the electrical power infrastructure: the socket and the plug.
Some infrastructures can see local upgrades and replacements, like powering your home with solar panels; others have to be managed globally, like transportation and, so far, communications.
The Internet has been recognized in the US, and in several other countries, as a core civil infrastructure.
A crucial aspect, as for any other infrastructure, is the interface, and in telecommunications interfaces evolve over long spans of time: a new wireless system is deployed over a period of 10-15 years. But the Internet exists both at the physical layer (cables) and at the bit layer, and here the evolution of interfaces may be faster, 1 to 3 years (for successful apps), or may be as slow as at the physical layer for network-wide interfaces/protocols (IPv6 started in 1992 and has reached 10-25% deployment after 20 years) and at the market level (smartphones appeared in the early 1990s, the iPhone in 2007).
Another aspect of evolution over the last century has been the convergence of industries that have found in the Internet a common medium for business communications, entertainment and residential voice communications. In the last 25 years we have seen another important evolution, from fixed to nomadic to mobile. And all this is now coalescing into a much flatter network, where the edges are no longer so different from the core and where “everything” can be on the Internet.
Indeed, sensors, be they public, semi-private or private, are becoming part of the network.
In parallel we have seen an increase in expectations, from “does it work?” to “how fast?” to “how reliable and safe?” to “how easy?” and “how cheap?”.
In this growth one can ask where the end is, what the boundary constraints are. Looking at broadband, we can say that the usefulness of increased capacity is limited by human perception. 4k video seems to be at the edge of human perception and requires about 15-20 Mbps (taking 3D into account) – this was the figure given by the speaker. Actually, good 4k requires bandwidth on the order of 50 Mbps; stronger compression results in perceivable quality loss. Considering 5 hours of consumption per day, this means 2 TB per month per household; compare that with today's GB: a 1,000-fold increase – again, these were the figures given by the speaker. If you do the math, it yields 7.5-10.8 TB per month, hence a 3,750-5,000 fold increase; however, the point still holds.
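The arithmetic above can be sketched in a few lines. This is only a back-of-envelope check; the 30-day month, decimal terabytes (1 TB = 10^12 bytes) and the 2-3 concurrent streams per household are my own assumptions, used to reconcile the single-stream numbers with the multi-terabyte household figures quoted above:

```python
# Back-of-envelope check of monthly household video traffic.
# Assumptions (mine, not the speaker's): 30-day month, decimal units,
# and 2-3 concurrent 4k streams per household for the multi-viewer case.

BYTES_PER_MBPS = 1_000_000 / 8   # bytes per second for each Mbps
HOURS_PER_DAY = 5
DAYS_PER_MONTH = 30

def monthly_tb(mbps: float, streams: int = 1) -> float:
    """Monthly household traffic in TB for a given per-stream bitrate."""
    bytes_per_month = (mbps * BYTES_PER_MBPS
                       * HOURS_PER_DAY * 3600
                       * DAYS_PER_MONTH * streams)
    return bytes_per_month / 1e12

# Speaker's 20 Mbps figure, single stream: ~1.35 TB (roughly the "2 TB" quoted)
print(monthly_tb(20))
# 50 Mbps for good 4k, with 2 or 3 streams per household: ~6.8 TB and ~10.1 TB
print(monthly_tb(50, 2))
print(monthly_tb(50, 3))
```

With two to three simultaneous 50 Mbps streams per household the result lands in the 7-10 TB range, consistent with the order-of-magnitude argument even if the exact figures differ.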
How can we upgrade today's networks to scale by three orders of magnitude? Generally speaking, all new transmission technologies work best over short distances, so the trick is to shrink the “last mile” to the “last hundred yards” and less.
In wireless we have seen a new “G” every 10 years. We are now looking at 5G and that, whatever it might mean, brings us to the next decade, whilst 6G will see the light in the 2030s.
5G is likely to push the speed envelope to 100 Mbps (speeds in wireless have to be taken with a grain of salt: 4G can provide 100 Mbps in theory; in practice it does not), although this is unlikely to enable a different class of applications (we do not see any mass-market applications that need more than what 4G can provide).
The overall capacity over short range will be increased, and that means more people can access 4G-type applications. For this, frequencies on the order of 24 GHz will be used, and that means much smaller cells. In a way, as another speaker observed, 5G is another hot-spot system; it will flank 4G. So far we haven't seen a technological change on the horizon comparable to the ones we had in previous generations (TDMA, WCDMA, OFDMA). 5G is therefore more about the aggregation of existing technologies than the deployment of something radically new. In my opinion the disruption coming with 5G is not rooted in technology (although the linear evolution of smartphone processors makes possible a much greater control of the access by the terminal) but in the market and its players, with the emergence of new, not yet established, players. Notice that aggregation does not stand for convergence; actually, diversity will be driving the evolution, and even more so in 6G. The point will be to provide a seamless experience to users independently of the technologies being used.
As application functions have become decoupled from the telecom operators in the last 5 years, we are likely to see network functions becoming decoupled from the network operators (SDN and NFV may prove to be a double-edged sword).
By 2035 the situation in the US will see a stratification tied to the willingness to pay, in three tiers: 1 Gbps, 50 Mbps and 5 Mbps. This is a change from the usual subdivision between urban and rural environments as the discriminant for bandwidth.
At present there is little incentive to pay for more than 20 Mbps of bandwidth.
The deployment and enhancement of infrastructures to deliver that three-orders-of-magnitude increase in capacity is likely to be achieved in different ways in different parts of the world, with different mixes of private and public investment. In turn, this may impact affordability and reliability in different areas. As the speaker said, we hope that the Internet will not be viable just for showing pictures of your cat, but that it can indeed support economic progress and critical information and services. In general the feeling is that, just as a person from 1994 would be at ease today, a person of today would have no problem recognizing what there will be in 2035. However, see my considerations in the first post of this series and notice the many unexpected twists we have seen over the last 20 years.
As we have seen a rebalancing of functionality between the edges (the terminals) and the network in the past, we can expect such rebalancing to continue over the coming 20 years. However, the driver will no longer be the cost efficiency of doing something centralized or decentralized; rather, it will be social and psychological considerations.

About Roberto Saracco

Roberto Saracco fell in love with technology and its implications a long time ago. His background is in math and computer science. Until April 2017 he led the EIT Digital Italian Node, and then was head of the Industrial Doctoral School of EIT Digital up to September 2018. Previously, up to December 2011, he was the Director of the Telecom Italia Future Centre in Venice, looking at the interplay of technology evolution, economics and society. At the turn of the century he led a World Bank-Infodev project to stimulate entrepreneurship in Latin America. He is a senior member of IEEE, where he leads the New Initiative Committee and co-chairs the Digital Reality Initiative. He is a member of the IEEE in 2050 Ad Hoc Committee. He teaches a Master course on Technology Forecasting and Market Impact at the University of Trento. He has published over 100 papers in journals and magazines and 14 books.