Thursday, September 6, 2012

What will happen to humanity after we upload our brains?

By Annalee Newitz and George Dvorsky

What will happen to humanity after we upload our brains? Some futurists and science fiction writers predict that centuries from now, humans will be able to upload their minds into computers. These "uploads" could exist in a virtual reality world created by software, or be downloaded back into other bodies — biological or robotic.

But what will we do after we've become uploads? That's a matter of debate. And we're going to debate it.

Annalee: University of Oxford futurist Nick Bostrom is a fairly passionate believer in the possibility of uploads. He suggests that eventually our ultra-smart uploaded minds will figure out how to convert all matter in the galaxy (or even the universe) to a technological substrate that hosts our romping brains. And the plots of several science fiction tales, from Iain M. Banks' Culture series to Battlestar Galactica, rely on the idea that minds can be uploaded and transferred into new bodies.

My question is, what should we do once we've uploaded our brains? Because I think there's a real ethical difference between eating all matter in the universe to create our happy brain farm, and using uploads as a kind of storage device while we're between bodies (maybe while we're traveling in space).

George: This is a very challenging question to answer, as it's a kind of 'what do we want to be when we grow up?' sort of question. It's made all the more difficult by virtue of our attempt to predict the needs and values of uploaded humans, or what will really be posthumans at this point. A rather safe assumption, however, is that uploaded minds will be augmented to a considerable degree and capable of substantially more than us run-of-the-mill humans. It's very likely, therefore, that an uploaded civilization will have an insatiable need for computational power. They may feel that, in order for individuals and society to develop and engage in advanced life, they will have to continue to expand their capacities by converting more and more inert material into so-called computronium. Venturing out into space may truly be the only way to extend the frontier, so to speak, even if it's a digital one.

Annalee: You make these augmented, superintelligent uploaded minds sound like they have the ethical development of five-year-olds, or maybe Darth Vader. They have an "insatiable need for computational power," and yet they don't have the basic understanding that eating somebody else's planet or star to get computational power is a terrible, immoral act. Given that these speculative uploads exist in material reality, even if they are virtual, we have to assume that they would understand that their actions affect other life forms in the solar system, galaxy, and even the universe. Plus, they would have to understand the ecosystems of space in order to mine for materials to make their ever-expanding brainscape.

So we have to assume that these uploads realize their "insatiable need" could destroy civilizations and don't care — or that their superintelligence would endow them with enough ethical sense that they would try to exist sustainably within the galactic environment. No eating stars that nourish other life forms.

In other words, an ethical superintelligent upload would have to impose limits on its own intelligence and expansion.

George: Sure, to us it sounds like they have the "ethical development of five-year-olds," but to them it could represent the highest ethical calling for an advanced civilization. For example, they may adopt a universal utilitarian imperative in which they've concluded that the most important thing in the universe is the spawning of as much life as possible. And in order to do so, they would have to rearrange all that useless matter out there into giant, happy brains. If this is the case, advanced civilizations could spawn more life and more worthwhile existences than what's naturally possible — and by orders of magnitude.

As for the possibility that this violates a kind of Prime Directive, or cosmological ecological balance, that's clearly something that our descendants will have to address. They may very well choose to leave certain planets, civilizations, and large swaths of space intact; and indeed, a quick survey of the night sky reveals a galaxy that's completely unperturbed — the so-called Great Silence. And you may be right in suggesting that even this kind of civilization would have to impose limits. Regardless of their technological sophistication, physics will still be physics — a quality of the universe that's as entrenched as the desire for self-preservation.

As an aside, it's important to mention another possibility — that of a machine intelligence run amok. Not all greater-than-human AIs will be conscious, self-reflective, or "ethical." Some will simply serve as hyper-expert systems working unreflexively to fulfil the goal they've been given. If that task is to convert the universe into something other than the state it's in now, the machine intelligence would essentially go into Terminator mode and become a force that simply cannot be reasoned or bargained with.

What do you think?
