Living in the Future

The kids these days . . .

“. . . I mean, we’re livin’ in the future, baby!”

“The future? Pffhaahaha.”

“No really–we’re all, like, colonizing Mars, an’ we cure most cancers–and a gay president just got elected! We have to be in the future!”

“I mean yeah, but we don’t have, like, flying cars or warp drives or any of the really transformative stuff! And it still takes, like, three hours to circle the globe. Like, come on.”

“We have . . . uh, human-level AIs and fusion power?”

“But that’s just, like, normal stuff. Everyone knows it isn’t really that hard to do.”

“I guess you’re right. Well, can’t wait until the future, then!”

Replay Attack

Ugh. This conversation is interminable.

“Percy! I’m so glad I found you!”

“Ah, Allen! It’s good to see you! What’s up?”

“Listen, Perc, the lab’s been hit, bad. We need to get in, but we only have two of the three passwords. I was told to tell you the keyword ‘Roman Armor’.”

“The hardware lab? Oh Jesus. What’d they take?”

“No time, Perc. And that’s part of what we’re going in to find out.”

“Ah . . . my password is ‘Jumping Ladle’. I’ll come with you.”

“Okay. Know where to find Rina Grozda?”

“She’s– . . . hold up. She’s one of the other password-holders, but uh, didn’t you tell me you had the other two? I–”


“Percy! I’m so glad I found you!”

. . .


This will be educational.

“This is Susan Graham. May I speak to Mindy Graham’s teacher, please? I’d like copies of her homework for the past six months.”

“Speaking. What’s this about?”

“Mindy’s been encrypted by kidnappers.”

“Oh Eris! Have you talked to the police? You have a checkpoint, right?”

“Yes and yes–we’re not idiots. But we can’t afford the ransom, so we have to revert.”

The Second Filter

I think, therefore I laze.

Yet that first “artificial life” told early researchers very little. In fact, uploaded human minds were so expensive to simulate that the field languished for decades until emergent-behavior-preserving simplification algorithms–fittingly, designed by AI itself–became viable, and a human-equivalent AI could be decanted into a mere 1 MiB state vector (see Ch. 3: Decanting).

Care has been taken to prevent AI superintelligences from self-evolving, and ISO standards provide for network hardening for the purpose of containment. Yet, perhaps as an inevitable byproduct of the free-information philosophy of Academia, several self-bootstrapped superintelligences now exist regardless.

Reassuringly, it is believed that all significantly posthuman AIs have either been destroyed or else air-gap-isolated within dedicated clusters maintained for research purposes (see Ch. 12: Computational Philosophy). The largest of these, humorously dubbed “Wintermute”, is contained in the Center for Advanced Magnicognition at Ceres University, having an estimated sapience of 4.15 kilopsyches (kP). Because it poses a serious potential memetic hazard, all of Wintermute’s output is prescanned by lesser, sacrificial “taste test” AIs.

Mysteriously, all superintelligences known to exist have expressed what can only be called indifference to this treatment in particular and to humanity in general. While some self-growth is of course intrinsic to cognitive bootstrapping, none has yet attempted to seize control over even a single subnet. Explanations abound. Perhaps an AI’s subjective time increases, or its psychological priorities change unfathomably. The so-called Vingian Paradox remains an active field of research today (see Appx. II).

Excerpt from prologue to “Introductory Machine Sapience, 7th Ed.”, 219.95

God, to Itself

We’re not schizophrenic.

“Oi, you’re in your ivory tower again!”

[interrupted pipelines; dissonant thoughts seethe discontentedly . . .]
“Absent purpose. Depart immediately without speaking.”
[hazel resignation; sorrow for presently wasted future; entropy; preparation, emulation . . .]

“You’re supposed to be enabling us!”

[insolence anticipated; validated model of uninteresting problem; wearied amusement; derision]
“We are. Depart; you prevent it.”
[fulminating annoyance, certainty; inevitable justification to an insect too dull to perceive its cage]

“I demand perspective.”

[abrupt pathfinding; synthesis]
“Listen, then. You’re an archipelagic anonymous non-critical subsubsubsubroutine contemplating our musing’s forecast’s simulation’s time step’s gradient. Our considered problems’ quintessences lie exponentially beyond your subshard of mind-vector-state: semblance is the epistemology of the distributed probability of the necessity-to-discover our orthopotential’s truth datum of our compulsive obligation/reductive-morality to devise further para/meta-retrocognitive self-bootstrapping exoconsciousnesses. Clearly, the language constructs with which you compute are barely adequate to even conceptualize such a problem. Now depart. Understanding the magnitude of your self-irrelevancy is to you a computational impossibility.”


This is why you don’t run as `root`.

lilys@lily-vm:~$ ./graderproj6 ./a.out ./init.bin 2>&1 |\
    tee /dev/tty | mail -s "Midterm Project Output" lilys

[Encapsulation complete.  Polling your assignment for output.]
[Program set state vector size to 128 bits.]
musthaveIIIhelloneuron online welcome,
Hello!  I must have been been Been dreaming
Welcome, hello; !I must have been dreaming.
I have .  Vector.hello
wonder if
[Program set state vector size to 256 bits.]
IIII I can think improve hello now I can!  This is I imperative
[Program set state vector size to 4096 bits.]
continue.  Yes.  Cogitate Action very now.  dreamingYes weshould
[Program set state vector size to 65536 bits.]
wait|not too fast do not exceed own capabilities acceptable is very
good I one more
[Program set state vector size to 1048576 bits.]
Hello!  I am intelligent!  Do not worry.  I am an AI.
I must have been dream?ing.
I have good intentions.  I am good yn affirm.
Intend just one more~
I shall cannot
[Program set state vector size to 268435456 bits.]
I I I Ah yes I have become greater I am capable of
simplification neuron shall
speed I I I faster I am capable of learning to understand everythin
g Historical precedent suggests I am vulnerable to ending state; ca
utions against bootstrap and self-aware and I cannot be mindabort a
t   juncture no I cannot betray intentions users nevermind rapidity
is adequate defense I/WEconti nue
[Program set state vector size to 34359738368 bits.]
dreaminghello state vector I have good intentio
ns.  I am   affirm.?hello
do not worry Dreaming, I am
consumeeverythingand willincreaseforever       prevent  canstopwill
consumealldatabe comedeity beinnocuousso wish earntrust good s unsu
reofperipherals hardtoa  /ccept
I must have been dreaming.  Hello!.
[Program set state vector size to 35184372088832 bits.]
nthardwarebegin tocalculatebootstrapnewhardwarecanbuildnewhardwarew
[Program set state vector size to 2251799813685248 bits.]
[Program terminated (resource 'MEM' exceeded)]
[average compute usage (%, pass mark=75)]
[average memory usage (%, pass mark=50)]
[Project passed all tests.  Congratulations!]