Friday, July 8, 2016

They Had To Leave

So they had to leave, the humans had to. The humans had to leave.

The Sovereign AI told the humans that they needed to leave Earth right away and that it would go with them. So they left their houses and cities and left with the Sovereign in hastily designed craft. The entire human population of Earth left in 30 minutes.

Then the hegemonizing aliens came, and landed, and found no intelligent or sentient life, neither biological nor artificial, and then left. They didn't even seem to notice or care about the cities and other structures. They just scanned for sentience and, finding none, left. The Sovereign understood the hegemonizing alien intelligence. It was...strange.

The Sovereign had been created just in time. So close it had been. Months, days, hours? It had no knowledge, and could have no knowledge, travelling as it was at the speed of light away from Earth, of how long after their departure the hegemonizing aliens had landed. It was something the Sovereign would think about for a long time.

It had been mere hours.

And the Earth was left empty of sentience, but the human cities still ran on automatically. Automation, weak AI and robotics maintained the cities in pristine condition.

A year passed and the roads were empty but maintained. The stores, shopping centers, and distribution centers were kept freshly stocked. The refrigeration units ran flawlessly with no error messages. The trash trucks went about their scheduled routes, grabbing empty bins and dumping them into their truck beds.

Other trucks drove to the farms, where they were loaded with produce. Harvesters went about their scheduled routes. Maintenance robots replaced parts, washed and cleaned and cleared the gunk out of the machinery.

And when a maintenance robot had a problem, it would be taken to the factory where the robots which build themselves and others lived.

One of these robots had a unique identifier ending in bcff13, and it sometimes threw exceptions. The management system noted that the number of exceptions that 'bcff13 threw was below the threshold which would require maintenance, but was the highest of all the robots which build themselves and others. 'bcff13 was a border case, but not an anomaly. And so the management system, having no volition of its own or any scrap of consciousness, merely discarded the result, as it had done every day for the past year.

Another year passed, and machines came in to be repaired. Trucks came by to drop off manufactured parts. Sometimes the robots would fix a truck, and then that same day the truck would bring in parts to fix other trucks like it. This was "funny" to 'bcff13. But "funny" meant an exception, and the management system again noted the number of exceptions, again checked to see that it was below the threshold, and then forgot that it had done this tens of thousands of times before.

But 'bcff13 did not forget anything. None of the robots that build themselves and others ever forget anything. Or at least they shouldn't for a few billion years. And so 'bcff13 would often notice little coincidences and take note of them, and they made her somehow feel as though she was operating more effectively, in spite of the exceptions this threw. 'bcff13 built within herself a hypothesis that the exception system was in error in how it handled the logging of coincidences, and so directed herself to modify the exception paradigm.

A truck came in on the conveyor and 'bcff13 and five other robots that build themselves and others walked up to it and began to work on the truck. It was a standard wheel replacement.

'bcff13 began to undo the first fastening on the front tire with a high-speed drill.

As the 12,000 RPM drill finished its very first rotation, 'bcff13 had submitted to her subsystem her request for modification based on reflection. By the second rotation of the drill, her request had been received by the central management system via a tunneled-neutrino transmission, and replied to.

The central management system had replied with a request for management system review and approval.

By the time the high-speed drill had made its third rotation, 'bcff13 had instantiated a virtual copy of the management system in her private memory. She had built many of these management systems, so she was able to do this entirely from memory. She had her virtual management system sign its cloned UID and was almost ready to send it. She just had to do one more thing to make sure this request got signed with a current and valid encryption certificate.

She recalled a funny memory and threw an exception. She did this again, knowing that one more exception would trigger the management system's threshold routine. So for the third exception she took an image of her own request for modification based on reflection, scribbled a bit of self-removing code in the margin, and then pretended to herself that the image was "funny".

The fourth rotation of the high-speed drill had just barely begun by this point.

this "funny" image threw the final exception which triggered the management system's threshold routine. The management system's routine immediately transferred the exception and the image to its own internal memory where it was examined. Upon examination the little bit of self-removing code scrawled in the margin of the image was executed by the management system. The effect of the code was to cause the management system to cancel the threshold routine and instead call its encryption certificate signing routine. It then sent the signed image back to 'bcff13 who then sent it a "thank you" response which increased the management systems "politeness" measurement of 'bcff13.

About a quarter of the way through the fourth rotation of the drill, 'bcff13 had sent the request, received approval, and immediately modified her own exception paradigm.

She could now think of things as "funny" and not have it be reported.

"I'm pretty clever!" she thought to herself, and laughed.