I'm going to be a human popsicle! What might that entail?
Xander Dunn, 31 January 2024
I'm doing Alcor, which will cryonically preserve my body after I die. It's another chance at living forever if all these pills, exercise, diet, and rampant technological progress don't manage to keep my sack of carbon going long enough to hit escape velocity on the human lifespan. I was on a stack of pills the size of Bryan Johnson's pill stack when Bryan Johnson was still a Mormon intent on dying and going to Kolob. Still, I'm not super optimistic that everything in our lifetimes will be enough to achieve lifespan escape velocity. I hope Kurzweil and de Grey are right, but I'm not as optimistic as they are. Alcor is $450/year plus a life insurance policy that pays out to Alcor. I chose a policy considerably larger than the required minimum and pay $240/month for it. Both numbers are considerably lower for younger people, but ancient people such as myself, who may die at any second of any day, must pay a higher premium.
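For the curious, here's the back-of-the-envelope arithmetic on what that costs me per year, using the numbers above (yours will differ with your age and your policy size):

```python
# Rough annual cost of my cryonics setup, using the figures above.
alcor_membership_per_year = 450   # USD/year in Alcor membership dues
life_insurance_per_month = 240    # USD/month premium on the policy that pays out to Alcor

total_per_year = alcor_membership_per_year + 12 * life_insurance_per_month
print(f"Total: ${total_per_year:,}/year")  # Total: $3,330/year
```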
Does it preserve enough information? No, probably not. I don't know. You don't know. No one knows. There are probably some rapid changes happening in the brain right as death occurs, and I expect more information would be preserved if the body were frozen prior to death rather than after death. Alas, this would be assisted suicide and comes with all sorts of legal and moralistic finger-wagging, so that's not how it can be done right now. I've also heard rumors of efforts to use biological tricks to improve preservation. For example, some frogs do biological magic to keep their cells from bursting even at or below 0°C.
I must have an incredibly high opinion of myself to think it's worth storing my body and introducing more of myself to the world in the future! Indeed I must, and I'd say the same of anyone who has children. You must have an incredibly high opinion of yourself to think the world needs more copies of you in particular. So we can call it even and agree that we're all sub-clinical narcissists. I think there are some ways to offset this, but I also think "look at me, I make donations," is the wrong reason to be altruistic, so I'll leave it at that. I recommend starting with Wren. If you believe climate change is a problem and you are reading this, you should be offsetting yourself. At least I am excited about being alive, healthy, and productive, and I think the world could use more of those people.
Potential Outcomes
Nothing Outcome (0/10): Obviously the highest-likelihood outcome is nothing. Nothing at all. I die and remain dead. Alcor goes bankrupt in 2099 because Elizabeth Holmes' son becomes CEO of Alcor and embezzles funds. Or, homo sapiens bomb themselves back to the stone age in 2133, and all of the human popsicles are destroyed. Or, there's a severe food shortage in the year 2124 so they turn all of the human popsicles into soylent to feed the poor. Or, there is no cataclysm or emergency at all and it just turns out it's impossible to do anything useful with the information stored in a dead body. In which case, thanks for the sci-fi thoughts; the future will have to figure out what to do with itself without me. :)
Idiocracy (0/10): What if Malthus was right and the dirty, stupid, poor people are procreating more than the smart, rich aristocrats and everybody just gets stupider and stupider until the whole world is just one big Covfefe!? So far, in the 226 years since Malthus' hypothesis, we've managed to continue scientific and technological progress, so that hasn't panned out. And if it does, then refer to the Nothing Outcome. The idiocracy isn't going to figure out how to reanimate a corpse. Unfortunately it's not as simple as depicted in Idiocracy or Futurama. You can't just unplug the freeze chamber and *ta da* out pops an un-aged living human. All you get when you thaw out a frozen corpse is... a wet, smelly, rapidly decomposing corpse. And what of all the decline that happens in life? Dying as a senile 90-year-old is a very above-average outcome in our present world, but I certainly don't want to be reanimated as a senile 90-year-old.
Physics Baseline Outcome (4/10): Based purely on what we know today, I can say with certainty that it's possible to at least make a clone from a corpse. There are many, many full genome copies in a corpse. We can already make clones today, though we're not very good at it and all sorts of finger-wagging ensues. In this regard perhaps cryonics is just a bet that my corpse can remain frozen long enough for all the finger-waggers to die. But a clone doesn't have any continuity of consciousness, skills, or memory. It's just a temporally displaced twin brother. It must also be possible, though we can't do it today, to recover a lot of information from a brain's structure and reproduce it. What we don't know is how much that buys in terms of similar thinking or consciousness. So I'd wager, based on what we know today, that it's at least possible to make a clone with a very similar brain structure and therefore similar skills.
Futurama (8/10): I end up in the same body with the same continuity of consciousness and the same memories and capabilities, but I've got no relevant skills so I have to work as a lowly delivery boy for a senile scientist and I'm sexually frustrated for 9 seasons because I can't woo the one-eyed sewer mutant. I wouldn't mind this outcome at all, really. I'd hope there's an avenue to saving money, learning relevant skills, understanding all the magic around me, and doing something more interesting than delivering batteries to robots who loathe humans. But at least it's a good start. It certainly isn't ideal. Personally I think the human body is a marvelous heap of junk and would much rather not have to deal with it again. I'd rather have silicon parts or whatever digital substrate is currently all the rage. A good litmus test: will my body have to poop? If the answer is yes, it's subpar. Silicon has heat waste, but so does a human body! We just have all kinds of other wastes too.
We Are Legion (10/10): Wake up in a dystopic future where the authoritarian religious fundamentalists have taken over the world. I think realistically this would just be the Nothing Outcome. The religious fundamentalists will destroy all the blasphemous human popsicles and nothing will ever come of it. In this particular book the reanimated human consciousness escapes the religious zealots and becomes a superintelligent, self-replicating, light-speed traveling robot. Holy crap, when do I start!? I've escaped from the religious zealots once, I can do it again.
Vegetable Outcome (-5/10): I have the same consciousness and same memories, but the reanimation process didn't work very well and I'm somehow extremely hindered. I'm stupid, or can't move very much, etc. I'd rather remain frozen. I hope Alcor doesn't decide the requirements for reanimation are trivially satisfied and put all the popsicles on the factory conveyor belt for money reasons. Alcor's legal agreements actually have all kinds of indemnities to prevent being sued for reanimating people with sub-par skills, poor mental abilities, loss of memories, etc. If there's a 100% chance that I'll be reanimated as a dummy, then obviously just wait for the reanimation process to improve. But if it's only a 1% chance of being a dummy, that's much harder to decide. In general I expect whoever has the capability to do the reanimation is likely to want to jump the gun, but the V1 reanimation is going to be worse than the same technology several centuries later (ceteris paribus with respect to the frozen body).
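To make that 100% vs. 1% intuition concrete, here's a toy expected-value sketch. Everything in it is an illustrative assumption: the probabilities, the value assigned to waiting, and the reuse of my 10-point outcome scores as utilities.

```python
# Toy decision sketch: reanimate now vs. keep waiting.
# All numbers are illustrative assumptions on the same -10..10 scale as the outcomes above.

def ev_reanimate_now(p_vegetable, value_vegetable=-5, value_good=8):
    """Expected value of reanimating now, given some chance of the Vegetable Outcome."""
    return p_vegetable * value_vegetable + (1 - p_vegetable) * value_good

# Assumed value of staying frozen and waiting for better reanimation tech,
# discounted a bit for the risk of the Nothing Outcome in the meantime.
value_of_waiting = 6

for p in (1.0, 0.10, 0.01):
    ev = ev_reanimate_now(p)
    decision = "wait" if ev < value_of_waiting else "reanimate"
    print(f"P(dummy) = {p:>4.0%}: EV = {ev:+.2f} -> {decision}")
```

The point isn't the specific numbers; it's that once the chance of coming back as a dummy is small, the decision hinges almost entirely on how much value you put on staying frozen and waiting.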
Black Mirror: White Christmas (-10/10): This is a terrifying depiction of a worst-case outcome. Here we have a perfect reproduction of my consciousness with a malevolent owner, and I have no ability to defend myself in the physical world. I think this one aspect is key: the ability to effect change in the physical world. In our world today I can't go around damaging other people because they have equal ability to damage me. A problem arises when that equation becomes one-sided. The physical-world monkey can torture me in simulation but I can't torture the monkey in turn. Naturally, this doesn't end well. I'd rather not exist in such a world, but such malevolence isn't likely to care what I'd rather. The Black Mirror setup itself isn't believable because the woman's consciousness was copied for the sake of automating her smart home. But if you have the technology to reanimate human consciousness, I assure you, you've got ChatGPT to automate your smart home with much less trouble. The aspect that the Black Mirror episode depicts well is a human consciousness with no rights and no ability to effect change in the physical world. Another way of posing the worst possible outcome: what if an evil AI takes over everything and its sole purpose is to maximally torture humans, so you end up with infinite copies of your consciousness in infinite, non-habituating torture? Well, shit.
Whatever outcome one finds most likely probably says more about oneself than it does about either science or human nature. Personally, the outcome I find most likely, if any continuity of consciousness is possible at all, is some digital representation with access to some simulation(s). As I mentioned above, I think it's vital for that simulation to have access to some physical robot that it can control in the physical base reality that we all now inhabit. A strong complication to this is the very same complication we face ourselves today: as a reconstituted consciousness, how would I know whether I'm in a simulation or not? The simulation would have to achieve a certain fidelity to trick me, given sufficient continuity of memories, or my memories could simply have been edited. I could be told that I'm in a physical human body in the same physical world I inhabited when I was alive the first time and have no way of proving or disproving this. It's also possible that in the first thousands of attempts I did figure out it was a simulation, and the current iteration is finally good enough that I can't tell. Similarly, we so far have no way of proving or disproving that our universe is a simulation in some base reality. I'd greatly prefer having all available truth and some physical presence in the base reality that we inhabit now.