Some thoughts on why I would kill myself in order to teleport

Imagine a future society with teleportation technology. Instead of having to spend all day traveling to get from Orlando to L.A., you can now step into a teleporter booth, hit a glowing green button, and be more or less instantly transported anywhere on the planet. Here’s how it works: the machine scans your full atomic structure, stores the pattern, then beams it to another teleporter, where a matter-assembler puts you back together again from a stockpile of atoms. You have used this machine many times with no qualms whatsoever. Now, imagine that one day you step into the booth and press the green button, but nothing happens. You are then told in a polite, robotic voice, “I’m sorry, traveler, but something went wrong. Although we successfully scanned your body and reassembled you in L.A., the disintegration process failed. Would you please press the purple button in order to finish the disintegration process?”

Horrified, you run out of the booth, because there is no way you are going to commit suicide by pressing the purple button. And suicide is exactly what pressing the purple button amounts to. This twist in the teleporter story is meant to alert us to deep issues in the philosophy of personal identity. The purple button case is supposed to pump our intuitions to the effect that any sane person would refuse to use the teleporter at all. After all, the only difference between normal operation and the purple button case is that in the latter you failed to die. This is supposed to show that there is no continuity between the original person and the duplicate made in the matter-assembler. If there really were continuity, why would you not press the purple button? If you sincerely thought your personal identity would be preserved in the reassembly process, and you knew the machine had already reassembled you elsewhere, what reason could you have to refuse? The upshot of this analysis is supposed to be that using a normal teleporter is akin to suicide. Identity is not preserved.

I’m not convinced. I would use a teleporter if I knew there was a 99.999999999999999999…% chance of it working properly. But I would not press the purple button. This is not a contradiction. There is in fact a crucial difference between the green button and the purple button. Only a fool would press the purple button, but only a fool would refuse to press the green button if it was working properly. What’s the difference? The green button is useful. It does something, namely, allow you to efficiently travel from location A to B. The purple button, however, does nothing. It kills you. Therefore, any rational agent should have no qualms pressing the green button but would be a fool to press the purple button (unless they wanted to commit suicide). But let’s say you are the traveler in the purple button situation. You know there is a near physical duplicate of you out there somewhere. You know that the duplicate will fulfill whatever responsibilities you have in L.A. What should you do? Well, whatever you want. If you want to travel to New Zealand, you can press the green button and utilize that amazing technology to achieve your desires. The fact that you have a duplicate does not matter. You have no obligation to kill yourself. Why commit suicide when you can travel the world by simply pressing a button? It would be foolish not to use something that had a 99.999999999999999999…% success rate in doing something so incredibly useful.

Thus, I propose that any rational agent, knowing the extreme usefulness of the teleporter and its normal success rate, should use the teleporter but shouldn’t press the purple button (unless the agent actually wants to die). It would be quite irrational to refuse to use the teleporter out of fear that your identity wouldn’t be preserved. The reassembled clone is atom-by-atom identical to the you that pressed the green button. You can’t get any better, in terms of continuity of identity, than an atom-by-atom preservation. But suicide is irrational unless you want to die. If your atom-by-atom duplicate wanted to use the machine to travel, then presumably that person did not want to commit suicide (unless they were teleporting to an ideal place to commit suicide). Therefore, given that you would have identical desires, it would be strange to want to commit suicide by pressing the purple button when you just previously had a traveling mindset. There is thus no contradiction between using a working teleporter and not pressing the purple button.



Filed under Consciousness, Philosophy

5 responses to “Some thoughts on why I would kill myself in order to teleport”

  1. How very odd. When my kids were small, nearly 20 years ago, I would make up stories on the fly at bedtime, and during one of them I had the main character warn them never to press the purple button. They thought it was the funniest thing ever, but none of us can remember what the story was about. We just remember the “don’t press the purple button!” part. And then you come along with this. To make matters more interesting, I wrote a piece on teleportation myself you may find interesting. Perhaps all of this is related somehow…

  2. Hi, sorry to comment on an even older post, but it’s only because I find them interesting…

    I had a twofold remark, but in the elaboration, the second fold got really long, too long for a comment, so I posted it on my blog, if ever you’re interested. I also added a comment on your more recent post, because they somehow connect.

    Nice blog !

  3. Jake

    Sorry to comment on an old post, but this question fascinates me. It seems like it really is an inconsistency to be willing to press the green button but not the purple button, because in both cases the machine destroys the body that steps into the teleporter. The exact same thing is done to the original body in both cases; all this thought problem does is separate the steps. You’re saying, “if it will kill and duplicate me at the same time, I’ll do it, but if it duplicates me before it kills me, I won’t.” Either way, it kills you.

    You don’t “travel” anywhere; you don’t get to experience what’s on the other side of the machine. Your existence ceases when you push the green button. Therefore, the machine is not “useful” at all. It kills you and creates a copy somewhere else. How is it of any benefit to me that a copy of me is running around somewhere else? I have no access to that copy’s experience because I’ve ceased to exist completely.

    Instead of teleporters, we ought to call them suicide machines or “destroying and duplicating” machines. The person that steps into the machine dies; the person on the other end is a newborn. There’s no conscious continuity between the two; you cease to exist when you step into the machine.

    To understand this, think about what the machine is doing whether it is “working properly” or not; it takes the body and brain apart all the way to the atomic level. This is essentially a very precise form of slicing. If the machine, instead of destroying you on an extremely thorough level, scanned you, duplicated you, and then sliced you up with rotating blades (slicing you into larger chunks), then you would have no problem believing it killed you. It’s bloody, messy, painful, and it obviously destroys the body and brain. It’s only because the painless destruction of the physical form doesn’t feel or look like death that we are convinced we can survive it at all. We can’t deal with or understand an “abstract” kind of death.

    If you’re a materialist, you ought to favor preemptively banning the teleporter before millions of people are tricked into committing suicide on their way to work or vacations.

    • Gary Williams

      Hi Jake, thanks for the very interesting comment.

      I don’t deny that the “teleporter” actually does kill you (hence the title of the post). However, I believe this is no obstacle to actually using the machine.

      To see why, consider a case of destruction and re-assembly that occurs in the same spatial location. In this case, the machine scans you, destroys you completely, and then reconstructs you in the very same location you occupied before it “killed you”. Let’s imagine this whole process occurs very, very fast (in a few nanoseconds). It seems intuitive (to me) to think that what *actually matters* for consciousness has been completely transferred. Although you suffered physical death, you were “functionally resurrected”. New body, exact same mind. And since I only care about my mind, not my body per se, I would consent to this process of destruction-reconstruction so long as I was 99.99% sure it would work. The teleportation case is the same, but with a longer time delay for reconstruction.

      I think there is already precedent for this “functional resurrection” in the medical literature: so-called brain-dead people that come back to life. Let’s take an imagined case of brain death in which brain area X completely dies, the person dies, but the medical team holds the person in a state of frozen animation. So the person is now dead. But now the medical team implants a functionally identical silicon replacement for brain area X, unfreezes the person, and the person comes back to life. Do we really want to say that the person who wakes up from the surgery is a completely different person? Or would we want to say it’s the same person? Their brain function (and hence mental function) is identical pre-death and post-death. So my intuition is to say that what matters for identity is preserved.

      Likewise for the teleportation case.

  4. shane

    Very clever point made, Gary, but if I was given all the information in order to make an informed decision before entering the teleportation machine, given your first point about pressing the purple button, I would not enter. Who is to say that dead people brought back to life are the same people anyway? I’m sure some people would use the teleportation machine. Some people used asbestos in their homes and some people take drugs, but there came a time when it could be proven that they were bad for you. Perhaps a teleporter would be bad for you too. I did enjoy reading your post, so felt compelled to add another comment.
