
Tech for Autonomous Systems – Self-Evolving Capabilities V

One of the cubes that can self-assemble to create a replicating robot. Credit: Cornell University

Self-replication Technologies

Life is, by definition, a self-replicating technology. From viruses and bacteria to complex organisms, each form of life exists because it has found a way to replicate itself (and, in the process, to become fitter for its environment).

Inanimate objects, like artefacts, do not have the capability to replicate themselves. The distinction between a living organism and an object gets fuzzier as we approach the molecular level. Here the difference between a virus and a complex molecule, like a protein, is not overwhelming. Indeed, just as a virus has to leverage an external environment (a cell, a bacterium) to replicate, so a protein needs to leverage an external mechanism to replicate. This is what happens over and over in our cells.

Here the point is not to discuss whether a protein may be considered "alive" or not, but rather whether the same mechanisms that lead to protein replication may be applied to more complex objects, giving them replication capabilities. Notice, however, that this creates the ethical issue of deciding whether a self-replicating machine should be considered alive or not.

At the molecular level researchers have already created replicating strings of DNA, and this has led, as an example, to self-replicating DNA computers. The sequencing of the whole genome is itself replication at work, since DNA strands are amplified, i.e. replicated, in the process.

Smart materials are being studied to offer replication capabilities. Self-replicating robots have been designed since the last decade, and significant work in this area is steered by space exploration, where self-replication is considered very important.

Work at Cornell University on replicating robots made of cubes that can self-assemble is pointing to the rise of a new science, that of self-replication, based on measuring the level of information replicated. If one were able to replicate 100% of the information (cloning), one would have replicated completely; however, this is not necessarily what is needed or wanted. As an example, a robot replicating itself may want to maintain its own identity and create a robot that is almost like itself but with a different identity.
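The idea of measuring replication as a fraction of information copied can be made concrete with a toy sketch. Assuming (hypothetically) that a machine's design and identity are encoded as a byte string, the fidelity of a replica is the share of bits it has in common with the original:

```python
def replication_fidelity(original: bytes, replica: bytes) -> float:
    """Toy metric: fraction of bits the replica shares with the original."""
    if len(original) != len(replica):
        raise ValueError("compare equal-length descriptions")
    matching = sum(
        8 - bin(a ^ b).count("1")  # bits that agree within this byte pair
        for a, b in zip(original, replica)
    )
    return matching / (8 * len(original))

# Hypothetical encodings: same design, but the offspring gets a new identity.
blueprint = b"robot-v1:identity=alpha"
clone     = b"robot-v1:identity=alpha"   # 100% of the information: cloning
offspring = b"robot-v1:identity=bravo"   # almost a clone, different identity

print(replication_fidelity(blueprint, clone))      # 1.0
print(replication_fidelity(blueprint, offspring))  # high, but below 1.0
```

In this framing, cloning scores exactly 1.0, while the identity-preserving replication described above deliberately scores just below it.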

Notice that self-replication does not imply the capability to harvest the materials required for replication. Clearly a supply of those materials should be available, but it can be provided by a third party (as is the case for living beings, which often work in symbiotic relationships with others to become self-sustaining).

The acknowledgement that self-replication may lead to a new being that is not 100% identical to the original opens up the point of evolution through replication, as has happened to life on Earth. However, it should be noted that evolution as we know it has required eons and a multitude of random variations, which is not the case for a self-replicating machine. Here evolution can happen through replication by design, and indeed researchers are working to capitalise on the experience of a machine to improve its self-generated offspring.
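Evolution through replication by design can be illustrated with a minimal sketch, under the assumption (hypothetical here) that a machine's behaviour is captured by a few numeric parameters and a fitness function. Each generation the machine copies itself with small designed variations and keeps an offspring only if it performs better, so improvement does not need eons of random variation:

```python
import random

random.seed(42)  # deterministic run for the sketch

def fitness(params):
    # Hypothetical task: get the parameters close to a target configuration.
    target = [0.7, 0.2, 0.9]
    return -sum((p - t) ** 2 for p, t in zip(params, target))

def replicate(parent, mutation=0.1):
    """Offspring = a copy of the parent with small, designed variations."""
    return [p + random.uniform(-mutation, mutation) for p in parent]

machine = [0.5, 0.5, 0.5]          # the original machine's parameters
for generation in range(50):
    child = replicate(machine)
    if fitness(child) > fitness(machine):
        machine = child            # keep only improved offspring
```

This is simple hill-climbing rather than natural selection: the designed acceptance rule replaces eons of random trial and error.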

The need for an adequate supply chain to fuel a self-replicating machine also limits replication: in general, the more complex the replicating organism, the more time is required for replication, taking into account the need to create and maintain an adequate supply chain.

So far this discussion has taken as a given that the machines are made of atoms. Actually, there is a new class of machines, made of bits, that needs to be considered, and for these replication takes on a different flavour and is subject to far fewer constraints.

Cloning of software, in the sense of activating several instances of an application, is "normal" and is not considered replication. On the contrary, the creation of software bots that can replicate and roam the web is a form of replication. (Note that "soft robots" are robots made of atoms, using silicone-like substances that make them soft; those fall under the previous category of atom-based bots.)

Today's software bots are based on weak AI, in the sense that they can be very smart but only in some very narrow endeavour. DeepMind's AlphaGo has proven to be extremely good at playing Go (hence very smart, smarter than the human world champion), but that's (basically) it. You cannot converse with AlphaGo as you would with a friend, nor even as you would with Siri (which is another software bot, specialised in a different area).
Work is going on to reach strong AI, an artificial intelligence whose breadth compares to human intelligence. Technology is not close to reaching strong AI, and even once it is reached, its implementation at the level of software bots is not a given.

An area of research is swarm intelligence, and within swarms it is easier to envisage self-replication of the individual components of the swarm. The collective intelligence can actually steer the replication of some or all of the swarm's components.
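A toy sketch can show what it means for the collective, rather than the individual, to steer replication. All names here are hypothetical: each agent has an energy budget and an individual precondition for replicating, but the trigger comes from a swarm-level signal (too much work per member):

```python
class Agent:
    def __init__(self, energy: float):
        self.energy = energy

    def can_replicate(self) -> bool:
        return self.energy >= 2.0      # individual precondition

def swarm_step(swarm, task_load: float):
    """Replicate members only when the collective is under-staffed."""
    per_agent = task_load / len(swarm)
    if per_agent > 1.0:                # collective signal: too few agents
        offspring = [Agent(a.energy / 2) for a in swarm if a.can_replicate()]
        for a in swarm:
            if a.can_replicate():
                a.energy /= 2          # parent pays the replication cost
        swarm.extend(offspring)
    return swarm

swarm = [Agent(4.0), Agent(4.0), Agent(1.0)]
swarm = swarm_step(swarm, task_load=6.0)
print(len(swarm))  # the two energetic agents replicated -> 5 agents
```

The decision to replicate belongs to the swarm's state (load per agent), while which members actually replicate depends on each individual's resources, mirroring the "some/all of the components" point above.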

Interestingly, self-replication technologies are creating new legal issues. Manipulation of the genome to obtain crops with specific characteristics is protected by law (the genetic modification can be patented). The issue is what happens to the second generation of the crop. Here opinions diverge. Monsanto won a US Supreme Court ruling in 2013 enforcing its rights on modified seeds (soya beans) even after self-replication.

Even if rulings for crops go one way, rulings over software bots or replicating robots that may change and evolve may be different. It is a new area of study, beyond technology.

About Roberto Saracco

Roberto Saracco fell in love with technology and its implications a long time ago. His background is in math and computer science. He's currently Head of the Industrial Doctoral School of EIT Digital and co-chair of the Symbiotic Autonomous Systems Initiative of IEEE-FDC. Until April 2017 he led the EIT Digital Italian Node. Previously, up to December 2011, he was the Director of the Telecom Italia Future Centre in Venice, looking at the interplay of technology evolution, economics and society. At the turn of the century he led a World Bank-Infodev project to stimulate entrepreneurship in Latin America. He is a senior member of IEEE, where he leads the Industry Advisory Board within the Future Directions Committee. He teaches a Master course on Technology Forecasting and Market Impact at the University of Trento. He has published over 100 papers in journals and magazines and 14 books. He writes a daily blog, http://sites.ieee.org/futuredirections/category/blog/, with commentary on innovation in various technology and market areas.
