Tuesday, September 9, 2008
Dinner tonight: Hamburgers. With Perrier.
Because I wanted hamburgers. And I didn't feel like just having water, but I didn't want a soda or alcohol, so fizzy water was the ideal solution, OK!?
Wednesday, August 13, 2008
On living forever with technology and continuity of consciousness: Ramblings in Transhumanism
Disclaimer: I do not know a whole lot about transhumanist philosophy. If all of you have long ago solved this problem and I look like a dumbass, so be it. This is an idea that creeps me the fuck out, and I can't think about transhumanism without eventually settling on it, and it's my blog, so I'm sharing.
Disclaimer part two: This contains a lot of semi-rambling scientific theorizing and philosophy. If that's not your thing, move on.
Assume that it's possible to create a fully functional synthetic brain, comparable in size to a normal brain. It replicates the actions of neurons in the brain perfectly, such that if you were to copy the current states of all neurons in a living brain, it would function identically: same memories, same thoughts, and so on. Obviously it could (and probably would) diverge once the copy was done, but it starts out the same.
Obviously, if I copy my brain in this fashion, my self, my consciousness, which is as best we can tell just the activity of neurons in my brain, is still in my body, and ONLY in my body. The guy on the desk over there thinks like me, and is probably pretty surprised to not be in a body anymore, but he's not really me. The continuity of my consciousness remains in my physical body. If you replace my brain with the metal one, the consciousness that's in the meat dies. This seems fairly obvious.
Now, suppose that as a corollary to our ability to create the synthetic brain, we're also able to create individual neuronal units, and use them to replace individual organic neurons in a living brain. In theory, if we replace one of my neurons with a synthetic one, I shouldn't notice the difference. It'll act just like the organic neuron it replaced, giving the same inputs and outputs, so the rest of the brain should treat it just like any other neuron. This assumes that it can change state the way the organic neuron did, but I'm treating that as an assumption of our ability to create a fully functional synthetic brain.
Anyway, it would seem as though replacing more and more organic neurons with synthetic ones shouldn't present a problem for my consciousness. They act the same as the old neurons, and if we change them out when I'm unconscious, there shouldn't be any neuronal confusion, since the replacements go in as exact copies of the parts they replace (presuming nothing relevant is changing while I'm unconscious). So it would seem like I should never know the difference. Following this to its logical conclusion, if we replace the organic neurons bit by bit, eventually we should be able to replace the whole squishy thing with hardware. Since the continuity of my consciousness was maintained throughout the gradual changeover, in theory "I", the sense of self that was in the meat, should now be in the hardware, running my body just like always, except now I can be transplanted into a robot body or a car or whatever.
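Just to make the changeover concrete, here's a toy sketch in Python (entirely made up, obviously; the Neuron and SyntheticNeuron classes are my own illustration, not anything real). Every replacement copies the old unit's state and behavior exactly, so the network's output never changes no matter how many units have been swapped:

```python
import random

class Neuron:
    """A toy 'organic' neuron: internal state plus a fixed response rule."""
    def __init__(self, state):
        self.state = state

    def fire(self, inputs):
        # Stand-in for a real transfer function; the details don't matter,
        # only that output depends on inputs plus internal state.
        return (sum(inputs) + self.state) % 2

class SyntheticNeuron(Neuron):
    """A synthetic unit built as an exact copy of an organic neuron.
    Same fire() behavior, same copied state: the rest of the network
    can't tell the difference."""
    def __init__(self, original):
        super().__init__(original.state)

def brain_output(neurons, inputs):
    # Crude "whole brain" readout: every neuron fires on the same inputs.
    return [n.fire(inputs) for n in neurons]

random.seed(0)
brain = [Neuron(random.randint(0, 9)) for _ in range(100)]
test_inputs = [1, 0, 1]
baseline = brain_output(brain, test_inputs)

# The gradual changeover: swap one neuron at a time, checking after
# every swap that behavior is unchanged.
for i in range(len(brain)):
    brain[i] = SyntheticNeuron(brain[i])
    assert brain_output(brain, test_inputs) == baseline

print("All neurons replaced; the outputs never changed once.")
```

The catch, of course, is that the assert only checks behavior from the outside, which is exactly what the philosophical zombie problem below says isn't enough.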
But obviously, the first option, where we copy everything all at once, killed "me," even though there's a damn good copy running around. If we copy my brain all at once and replace it all at once with the synthetic one, "I" am dead. But what's the difference? Consciousness is the activity of the neurons of the brain; why should replacing them bit by bit be any different from replacing the whole thing? How gradual do we need to make the change? If I replace one half, then the other, am I still alive? Can I replace all but one neuron? Two? Do I have to replace them only one at a time, or else parts of my self start winking out? And how would they wink out? The synthetics are communicating with the organics just fine, so why should my experience of them be any different?
Obviously the answer to this is something we can't have yet, because we don't know nearly enough about how consciousness actually works. It might turn out that you just can't perform the above thought experiment, although I'd have to wonder why not. But the worst part is the problem of the philosophical zombie. To the outside observer, there's no difference between "real" me, the me that's been in this organic body my whole life, and synthetic me, who was copied from my meat brain into circuits. If I change over gradually through a series of neuronal replacement surgeries, everyone sees me acting exactly as they'd expect me to. But if "I" die, and the synthetic, perfect replacement of my consciousness is the one in my body now, nobody can tell the difference. He's a perfect copy, he acts just like me, and does exactly what I'd expect me to do. We can't set up a code word to see if it's "really" me, because he remembers that. The only way to set up the code word is to tell me after the copy is made, but that might screw things up. Either the neurons we're installing aren't exactly the same as the ones we're replacing (because they changed in response to the memory), which might be just enough to disrupt my continuity of consciousness (or hey, if we replace all of the neurons containing the memory, I don't have it anymore, even if I'm still me), or they didn't change, and the synthetic me can read the code word out of the meat memory.
So, when the first organic-to-synthetic brain changeover happens, I can't see a way for those of us watching from the outside to know whether the person who wakes up is actually the person who went to sleep, or just a perfect copy, which makes figuring out how to do the whole thing safely, without killing the original consciousness, somewhat problematic to say the least. But the worst part is if I do it. Ideally, I'd love to transfer my consciousness to a synthetic brain, but only if "I" am actually changing over, not just making a copy.
And I don't know how anyone else could tell if it actually worked that way. The consciousness that wakes up from the operation certainly thinks of itself as me; it has all of my memories, it thinks like I do, and it's probably pretty happy that this fear of mine was unfounded, because there it is, alive and whole, just like it remembers going to sleep. But that doesn't mean that the consciousness that went to sleep is the one that woke up; "I" might lie down into oblivion, and nobody I know would have any idea that I'd killed myself hoping to live forever.
Friday, August 1, 2008
Extraordinary amounts of pain
In the event that you're not someone who already knows me, this post should shed some light. So, the other night I slept on my back wrong and strained a muscle going from my right shoulder up my neck. I do this every few months (always the same muscle), usually in response to stress. Considering the unusual amounts of stress I've been having lately, both personal and professional, this was perhaps understandable. So ow ow, neck hurt etc etc. Moving on.
Quite separately, I decided that I'd try the hundred pushups challenge. It's a fitness program designed to get you doing a hundred pushups in six weeks. Step one is an initial test: just do as many pushups as you can, and use that to determine how you're going to do the challenge. I decided, in my infinite wisdom, that it'd be a good idea to do this initial test last night. I thought it'd give me a few days to recover, so I could start fresh on Monday morning, and I didn't think the thing in my neck would be an issue. And hey, I did my test (and actually pleasantly surprised myself), and my neck didn't hurt any more than normal.
Until about 5 am, at which time my world was pain. I don't know about you, but when deep muscle pain wakes me from sleep, I am like a wounded animal. I'm in pain, I'm VERY confused, and a little scared, and I don't know how to make it stop. Somehow Google (yes, the company) got mixed in with the half-dream state I was in.
So today I am in even more pain than I was a couple days ago, which is a big win for me. I've been trying these Icy Hot patches, which are 5% menthol and supposed to relieve the pain. In practice, this means that in addition to being very stiff and sore, my neck is now also chemically warm and cold at the same time. Somehow this is considered a benefit worth spending actual money on. Can't say I understand that.