AI Perception of Time Goes Beyond Human Limits

A basic component of human consciousness is the awareness of time passing. Whether artificial intelligence (AI) is capable of consciousness remains in dispute, but one thing is certain: AI will perceive time differently. Its perception of time will be determined by its communication, sensory, and computing processes rather than by biology. How will we get along with an alien intelligence that lives and thinks on a completely different timescale?
What Simultaneity Means to a Human
Look at your hands and clap them. You see, hear, and feel the clap as one multimodal experience: the visual, auditory, and tactile sensations all seem to manifest simultaneously and define the "now." In fact, these sensory inputs arrive at different times. Light reaches our eyes more quickly than sound reaches our ears, and our brain processes audio more quickly than complex visual information. Yet it all feels like one instant.

A mechanism in the brain creates that illusion. To determine "now," the brain gathers and combines sensory impressions within a small window of time, typically a few hundred milliseconds, known as the temporal window of integration (TWI). Film offers an analogue of this temporal grid: shown at 24 frames per second, a sequence of still images gives the impression of continuous motion.
The human TWI is limited, however. See a flash of distant lightning and you will hear the thunderclap a few seconds later. Only events within ten to fifteen meters can be stitched together into a single "now" by the human TWI. That is our horizon of simultaneity.
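The arithmetic behind that horizon is simple: light's travel time is negligible at everyday distances, so what matters is how far sound lags behind. A minimal sketch, assuming an audio-visual fusion window of roughly 45 milliseconds (an illustrative figure; the window varies between people and stimuli):

```python
# How far away can an event be before its sound arrives outside
# the window in which the brain fuses sight and sound?
SPEED_OF_SOUND_M_S = 343.0   # in air at roughly 20 degrees C
FUSION_WINDOW_S = 0.045      # ~45 ms audio-visual fusion window (assumed)

def sound_delay(distance_m: float) -> float:
    """Seconds by which sound lags light over a given distance."""
    return distance_m / SPEED_OF_SOUND_M_S

# Distance at which the sound lag exactly fills the fusion window.
horizon_m = SPEED_OF_SOUND_M_S * FUSION_WINDOW_S

print(f"Sound lag at 15 m: {sound_delay(15.0) * 1000:.0f} ms")
print(f"Horizon for a {FUSION_WINDOW_S * 1000:.0f} ms window: {horizon_m:.1f} m")
```

At 15 meters the sound lags by about 44 milliseconds, right at the edge of the assumed window, which is consistent with the ten-to-fifteen-meter horizon described above.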
Alien Intelligence in the Physical World
Robots and other machines that can see and interact with the real world will soon be equipped with AI. These devices will employ both sensors attached to their bodies and remote sensors that transmit digital data from a distance. Because the transmission takes only about 2 ms, significantly less than the human TWI, a robot could receive data from a satellite orbiting 600 km above Earth and process it in real time.
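The 2 ms figure follows directly from the speed of light. A quick sketch of the one-way propagation delay for a line-of-sight link (the 600 km satellite altitude is from the example above):

```python
# One-way propagation delay for a line-of-sight radio link.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def propagation_delay_ms(distance_km: float) -> float:
    """One-way signal travel time in milliseconds."""
    return distance_km * 1000.0 / SPEED_OF_LIGHT_M_S * 1000.0

# A satellite 600 km overhead: about 2 ms, far inside the
# few-hundred-millisecond human TWI.
print(f"{propagation_delay_ms(600):.1f} ms")
```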
The fact that a person's sensors are "hardwired" to the body establishes two assumptions about how the brain communicates with the physical world. First, the propagation delay between each sensor and the brain is predictable: when a sound occurs in our surroundings, the unknown is the distance to the source, while the lag from ears to brain is fixed. Second, each sensor serves only one brain. The human horizon of simultaneity evolved over millions of years under these assumptions, optimized to help us assess opportunities and hazards. Thunder at three kilometers was probably not worth worrying about; a lion at fifteen meters was.
For multimodally perceiving intelligent machines, these two assumptions won't always hold. An AI system may receive data from a remote sensor over a link with erratic latency. And a single sensor can stream data to many separate AI modules in real time, like an eye shared by multiple brains. AI systems will therefore develop their own sense of time and space, their own horizon of simultaneity, and they will do so far faster than humans evolved ours. We will soon coexist with an alien intelligence that perceives time and space differently than we do.
The AI Time Advantage
This is where things start to get weird. AI systems can measure time with unparalleled precision and are not constrained by biological processing speeds. They can identify cause-and-effect relationships that unfold too quickly for human awareness.

In our hyperconnected world, this could result in widespread Rashomon effects, in which several observers give divergent accounts of the same event. (The term comes from a classic Japanese film in which several characters, each shaped by their own viewpoint, recount the same incident in radically different ways.)
Consider a traffic accident at a busy city intersection in 2045, observed by three witnesses: a pedestrian, an AI system wired directly to street sensors, and a remote AI system receiving the same sensory data over a digital link. The human sees only a robot crossing the street right before a car hits it. The local AI, with near-instant access to the sensors, captures the exact sequence: the robot stepping into the road, the car braking, and finally the collision. Communication lags, however, distort the remote AI's view: it may record the braking before it registers the robot entering the road.
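The mechanism behind the divergence is just per-event latency. A toy simulation, using a hypothetical timeline in which the robot steps out first, then the car brakes, then the collision occurs; all event names and latency figures are illustrative:

```python
# Each observer sorts events by arrival time (true time + link
# latency), so observers with different latencies reconstruct
# different orders of the same physical events.
events = [
    ("robot_enters_road", 0.000),   # (name, true time in seconds)
    ("car_brakes", 0.050),
    ("collision", 0.300),
]

# Per-observer, per-event link latency in seconds (assumed values).
latencies = {
    "local_AI":  {"robot_enters_road": 0.001, "car_brakes": 0.001, "collision": 0.001},
    "remote_AI": {"robot_enters_road": 0.150, "car_brakes": 0.020, "collision": 0.030},
}

for observer, lat in latencies.items():
    arrivals = sorted(events, key=lambda e: e[1] + lat[e[0]])
    print(observer, "->", [name for name, _ in arrivals])
```

With these numbers the local AI recovers the true order, while the remote AI, whose feed of the robot's movement was delayed, records the braking first and the robot's entry second: two contradictory causal chains from one event.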
Each viewpoint presents a distinct cause-and-effect chain. Which witness, human or machine, will be taken seriously? And which machine?
Malicious actors could even use powerful AI systems to fabricate "events" with generative AI and splice them into the stream of events that less powerful machines observe. Humans with extended-reality interfaces, constantly absorbing digital sensory data, may be particularly susceptible to such manipulation.
Distorting the order of events could undermine our sense of causality and disrupt time-sensitive systems such as autonomous driving, financial trading, and emergency response. Even AI systems that can anticipate events milliseconds in advance could be used to confuse and mislead: by foreseeing an event and releasing false information at exactly the right moment, an AI could fabricate the appearance of causality. An AI that could forecast stock-market movements, for instance, might issue a fake news alert just before a predicted sell-off.
Computers Timestamp Events; Nature Does Not
An engineer's first impulse may be to solve the problem with digital timestamps on sensory data. But timestamps require accurate clock synchronization, which takes more power than many tiny devices can supply.
Furthermore, even timestamped sensory data can arrive too late for an intelligent machine to act on in real time, because of processing or communication delays. Consider a factory robot programmed to stop a machine if a worker approaches too closely. Sensors detect the worker's movement and send a timestamped warning signal over the network. But a sudden network outage delays the signal by 200 milliseconds, and the robot reacts too late to prevent an accident. Timestamps cannot compensate for communication delays as they happen, but they can be used to reconstruct what went wrong afterward.
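The factory scenario boils down to comparing the signal's end-to-end delay against a reaction deadline. A minimal sketch, with a hypothetical 100 ms deadline and illustrative times:

```python
# A timestamped warning is only useful in real time if it arrives
# within the reaction deadline; otherwise the timestamp serves
# post-hoc forensics, not prevention.
REACTION_DEADLINE_S = 0.100   # robot must react within 100 ms (assumed)

def handle_warning(event_time: float, arrival_time: float) -> str:
    delay = arrival_time - event_time
    if delay <= REACTION_DEADLINE_S:
        return "stop machine in time"
    # Too late to act, but the timestamp shows where the time went.
    return f"too late: signal delayed {delay * 1000:.0f} ms in transit"

print(handle_warning(event_time=10.000, arrival_time=10.030))   # normal link
print(handle_warning(event_time=10.000, arrival_time=10.200))   # 200 ms outage
```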
Nature, of course, doesn't timestamp anything. We infer temporal flow and causality by integrating the arrival times of event data with the brain's model of the world.
Albert Einstein's special theory of relativity showed that simultaneity depends on the observer's frame of reference and can change with motion. But it also showed that the causal order of events (the order in which causes produce effects) remains the same for every observer. For intelligent machines, that guarantee does not hold: varying processing times and unpredictable communication delays can make machines perceive events in entirely different causal orders.
Leslie Lamport introduced logical clocks in 1978 to solve this problem for distributed computing, by establishing the "happened before" relationship between digital events. To adapt this method to the nexus of the digital and physical worlds, we must cope with erratic delays between a real-world occurrence and its digital timestamp.
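Lamport's idea needs no synchronized physical clocks at all: each process keeps a counter, increments it on every local event, stamps outgoing messages with it, and on receipt jumps to one past the maximum of its own counter and the stamp. A minimal sketch of the scheme:

```python
# Lamport logical clock (1978): if event A "happened before" event B,
# then A's logical timestamp is smaller than B's, regardless of any
# physical clock skew between the processes involved.
class LamportClock:
    def __init__(self) -> None:
        self.time = 0

    def local_event(self) -> int:
        self.time += 1
        return self.time

    def send(self) -> int:
        """Stamp an outgoing message with the current logical time."""
        self.time += 1
        return self.time

    def receive(self, msg_time: int) -> int:
        """Merge the sender's clock into ours on message receipt."""
        self.time = max(self.time, msg_time) + 1
        return self.time

a, b = LamportClock(), LamportClock()
a.local_event()        # a: 1
stamp = a.send()       # a: 2, message carries stamp 2
b.local_event()        # b: 1
b.receive(stamp)       # b: max(1, 2) + 1 = 3
print(a.time, b.time)  # 2 3
```

The send on process a is guaranteed a smaller stamp than the receive on process b, so the "happened before" relation survives even though neither process knows the physical time. The open question the article raises is precisely the step this scheme takes for granted: assigning the first stamp to a physical event whose path into the digital world has unpredictable delay.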
This vital tunneling from the physical world into the digital one happens at access points: the device or sensor itself, a Wi-Fi router, a satellite, a base station. Because individual devices and sensors are easy to hack, large digital-infrastructure nodes will bear the responsibility for preserving reliable, correct information about time and causal order.
This concept aligns with developments in the upcoming 6G wireless standard. 6G base stations will not only relay information but also sense their surroundings. These future base stations must become trusted entry points between the physical and digital worlds. As we move into an uncertain future shaped by fast-evolving alien intelligences, developing such technology may prove crucial.