by Roberto Verdone
It was in the early afternoon of an otherwise unremarkable Monday that Apophis 99942 hit the Moon. The asteroid, nearly 320 metres in diameter, struck the ground close to the South Pole on August 6, 2035, releasing an energy of 2,500 megatons: enough to destroy, within a few seconds, any human infrastructure within a range of 5 km.
Apophis had been known since 2004, when it was first observed and classified in Arizona and the chances of a collision with the Earth were estimated. Within a few years, models predicted a close pass by our planet in 2029, without any precise assessment of the consequences of that event for the asteroid’s future trajectory. It was only after 2029 that new computations made it possible to predict that a Big Crash would occur in 2035, near the Moon’s South Pole.
In the meantime, we had started populating the Moon in 2027. The first settlers established themselves in the South Pole region, characterised by stable temperatures and abundant water resources.
By the turn of the decade, one thousand people were living safely and permanently on the Moon; among them were many scientists, but also entrepreneurs driven by the desire to exploit the natural resources of our Satellite. Because of the predicted Big Crash, most investments were made in areas far from the one Apophis was expected to hit. Every week, sky transporters brought tens of new adventurers, tourists, startuppers and managers on the expensive three-day journey from the Earth to the Satellite.
In August 2035, about 5,000 people were living on the Moon. The area of the Big Crash had been instrumented with sensors designed to sample the event and transmit the data in real time for scientific purposes. A few people stayed close to the site, waiting for the Crash; they remained there to perform measurements and to witness the event, in a camp far enough away not to be destroyed, yet sufficiently close to the site, whose exact location had in the meantime been computed to a precision of 1 km.
From the moment the first humans settled on the Moon, its surface was equipped with the infrastructure needed to support people’s daily life. This included IT networks with extensive use of radio communication links, thanks to the absence of any constraints on the installation of tall base stations. Intelligent, disruptive networking approaches were adopted to get rid of the many limitations of old-fashioned Internet technologies, and to make the Brain available to all users of the IT infrastructure. The Brain was a distributed storage and computing entity designed during the ’20s; it could respond to any complex set of inputs in less than 100 microseconds. It was trained to behave like humans, allowing robots to interact with people by understanding their expressions, emotions, fears and desires.
For human and machine communications, 5.5G radio technologies were used initially; after 2030, 6G arrived and was widely adopted to serve both humans and the billion connected things populating the Moon. In 2035, there was no stone in the 9,000 square kilometres of the Satellite populated by Man (an area more than three times the size of Luxembourg) that could not take advantage of a 6G radio connection to the Brain: a 1 Tbit/s link at THz frequencies, with Extra-Reliable Very-Low-Latency Communication (ERVLLC) services guaranteeing that the Brain could be reached in less than 0.5 ms from a distance of 100 km.
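A back-of-the-envelope calculation, assuming free-space propagation at the speed of light, shows how tight that 0.5 ms budget over 100 km really is:

```python
# Sanity check of the 0.5 ms latency target for a 100 km ERVLLC link:
# light-speed propagation alone consumes most of the budget.
C = 299_792_458          # speed of light in vacuum, m/s
DISTANCE_M = 100_000     # 100 km link to the Brain
BUDGET_S = 0.5e-3        # 0.5 ms end-to-end latency target

propagation_s = DISTANCE_M / C          # one-way propagation delay
remaining_s = BUDGET_S - propagation_s  # left for queuing + processing

print(f"propagation:      {propagation_s * 1e3:.3f} ms")
print(f"remaining budget: {remaining_s * 1e3:.3f} ms")
```

Propagation alone takes about 0.33 ms, leaving under 0.17 ms for everything else along the path; this is why the Brain itself had to respond within 100 microseconds.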
Among those who were still close to the site of the Crash in March 2035 was Peter, a 35-year-old physics researcher who dared to challenge Apophis; he was waiting for the asteroid at a base located 10 km from the expected location of the Crash. He was accompanied by a small group of scientists, thrilled by the chance to witness such a rare event. It was a very dangerous choice, given the not entirely reliable prediction model used to assess the risks of remaining there. One hour before the Big Crash, Peter made an FE Call to Sandra, his girlfriend. Sandra was nearly 50 km away, working as a telecommunications engineer at a safe base with a much larger population.
Full-Experience (FE) Calls had been in use since the advent of 6G communication technologies, both on the Earth and on the Moon. By 2035 they already made it possible to transfer information related to all five senses of human perception over long distances, supported by the cognitive features of the Brain. It was this technology that allowed Peter to spend the last minutes before the Crash immersed in an FE Call with Sandra.
At 2 pm, they both put on their FE gloves and glasses and sat at their respective kitchen tables. The hologram of their partner appeared in front of each of them, projected by the glasses. The 3D audio delivered by the 6G network provided a deep sense of presence. They started talking about the time when they had met at the University, sixteen years before; technology (then at the stage of 4.5G networking) still limited communication among people to the two main human senses: hearing and sight. Now, through their gloves, they could hold each other’s hands. Sandra could feel Peter’s bearded skin on her fingers while caressing his hologram; Peter could smell the perfume she was wearing, which he had given her for her birthday a few weeks before. The FE glasses measured the electromagnetic emissions generated by their bodies under the action of their empathic synergy; the almost instantaneous transmission of those emissions over distance allowed them to feel each other’s presence as if they were actually sitting at the same table, with no perceivable intermediary. They had lunch together, sharing every single course and tasting the same flavours. They could live a full, face-to-face, physical human experience while sitting 50 km apart, thanks to the FE feature offered by 6G communication technologies.
Time seemed suspended during the lunch. But it was not: the hour passed. Suddenly, the alarm blared in Peter’s room and they both realised that within seconds the Crash would happen.
Indeed, a few tens of the sensors distributed on the lunar surface detected the presence of the asteroid a few seconds before the Crash. By the time Apophis hit the surface, thousands of sensors had already sampled the physical event, transmitting their readings to the Brain within milliseconds. The land shook violently; ground waves spread all around. In a few seconds, all structures built within a distance of a few kilometres were totally destroyed. Clouds of dust rose and obscured the view. It was night, even by space standards. The Brain reacted almost in real time: within half a second of Apophis hitting the ground, the actuators embedded in the structures built over the previous years protected the lives of the few people staying inside. Including Peter. The objects in his room fell to the floor and the building shook; Peter’s robotic bodyguard, under the control of the Brain, wound up within a few hundred milliseconds and formed an iron shell protecting his head from falling objects. At the same time, Sandra could check and feel that Peter was safe, despite the violence of the event; the communication link remained resiliently stable.
The day after, the geography of the Moon’s South Pole had dramatically changed; a new crater, 5 km in diameter, was home to hundreds of new sensors deployed by Peter’s colleagues. Sandra had joined him, and they were working together to examine the results of the analysis performed by the Brain, which was suggesting how to re-instrument the area with new IT infrastructure. The Big Crash had come and gone with limited consequences, thanks to the shielding support of the new generation of structures built over the previous years, and to 6G radio technologies.
The Sixth Generation of communication networks, which in 2035 permitted remote Human-to-Human Full Experience (H2FE) and Physical-World-to-Human Interaction (PWHI), might have seemed fanciful and improbable sixteen years before, in 2019, when Peter and Sandra were entering the University; at that time 5G was not yet available, and interactions among humans and with things were still limited by Internet technologies and constrained to the two main senses of human perception. In 2019, they could not even have imagined that they would one day live through the adventure of the Big Crash on the Moon, protected by the Brain, its robotic actuators, and new technologies offering full-experience human interaction. Likewise, artificial intelligence might have seemed fanciful and improbable sixteen years earlier, in 2003, to their parents, who were then experiencing their first (low-quality) video call from a wireless device over the newly deployed 3G network; the year before, they had still been using the old 2G technology, offering just voice calls and machine-to-machine communication at a data rate 100,000,000 times smaller than the one 6G would offer their children at the same age, 32 years later. Sixteen years before that, in 1987, the dawn of digital mobile communications had just begun, with the finalisation of the first GSM standard.
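That factor of 100,000,000 can be checked with a quick calculation, taking as the 2G baseline the 9.6 kbit/s circuit-switched data service of early GSM (an assumption here; the story itself does not specify the 2G rate):

```python
# Rough check of the generational data-rate gap: early GSM's
# 9.6 kbit/s data service versus the 1 Tbit/s 6G link of the story.
GSM_2G_BPS = 9_600   # early GSM circuit-switched data rate, bit/s (assumed baseline)
SIXG_BPS = 1e12      # 1 Tbit/s 6G link, bit/s

ratio = SIXG_BPS / GSM_2G_BPS
print(f"6G / 2G data-rate ratio: {ratio:.2e}")
```

The ratio comes out at roughly 1.04 × 10⁸, consistent with the "100,000,000 times" quoted in the text.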
Author’s note: on August 6, 2035, I will be 70 and will retire from the University. In 1987, I took my first university exam in telecommunications (Signals and Systems). This short story is freely inspired by “Sensor Networks: A Bridge to the Physical World” by Jeremy Elson and Deborah Estrin (2003).