Saturday, 2 July 2011

Artificial BIID

Charlie Stross, in his recent blog post "Three arguments against the singularity", made an interesting comment on how to stop advanced AIs going all Skynet on humanity, an idea he attributes to Karl Schroeder:

Consciousness seems to be a mechanism for recursively modeling internal states within a body. In most humans, it reflexively applies to the human being's own person: but some people who have suffered neurological damage (due to cancer or traumatic injury) project their sense of identity onto an external object. Or they are convinced that they are dead, even though they know their body is physically alive and moving around.

If the subject of consciousness is not intrinsically pinned to the conscious platform, but can be arbitrarily re-targeted, then we may want AIs that focus reflexively on the needs of the humans they are assigned to — in other words, their sense of self is focussed on us, rather than internally. They perceive our needs as being their needs, with no internal sense of self to compete with our requirements. While such an AI might accidentally jeopardize its human's well-being, it's no more likely to deliberately turn on its external "self" than you or I are to shoot ourselves in the head. And it's no more likely to try to bootstrap itself to a higher level of intelligence that has different motivational parameters than your right hand is likely to grow a motorcycle and go zooming off to explore the world around it without you.
I’d not come across this idea before, and when I first read it I thought "cool, that’s an interesting solution to the problem". But the idea kept pecking away at my brain, and after a while I began to feel very uneasy about it. Not because of the AIs, but because of us.

[Image: Paul Martu, Early Tools, hammer]
There's a condition called Body Integrity Identity Disorder, or BIID (which I thought was called body dysmorphia before I checked, but that seems to be somewhat different), where people want to have a limb amputated because it does not "belong" to them. The current thinking is that the limb is somehow "missing" from the brain's "map" of the body, and so really isn't part of that person's body identity.

Well, what happens to my body-map when I have all those AIs with their sense of self safely focussed on me? Now, the human brain is remarkably plastic, and we do manage to incorporate many external devices into our body-map quite freely. Canes, tools, cars, whatever, can quickly come to feel like part of our bodies. But these are all under the direct control of the very brain where the body-map resides. What of more autonomous tools with their own brains (albeit focussed on me)? What if for some reason those focussed AIs don't smoothly mesh into my body-map? Will I want to "amputate" them? And will what today seems like a natural desire sometime in the future be treated as a mental disorder on my part?
