PDA

View Full Version : Real life necrons here we come..



Asymmetrical Xeno
02-06-2015, 12:33 PM
http://rt.com/usa/229811-mind-clones-robot-afterlife/

"An Artificial Intelligence pioneer is embracing the controversial idea of uploading the memories, thoughts and feelings of a living person into a computer to create a Mind Clone or “second self.” The prototype for this new self is called ‘Bina-48’."

I want my "mind clone" to be a metal skeleton with green glowing eyes, this will make doing my so-called "Death robot" music even easier ;)

Erik Setzer
02-06-2015, 02:13 PM
I read about that in the LGBT topic; I'll note my concerns about "mind-clones" here as well.

Primarily, the biggest issue is that you're trying to copy a person's entire set of thoughts and emotions into a computer. Even with a robotic body, that's still a problem. Bina-48 thinks it "loves" a woman, and wants to hold hands in a garden... even though it doesn't have hands, and only thinks it "loves" someone because the person it was copied from does.

So, what happens if the computer develops conscious thought? What if it can actually emulate emotion? What if it realizes it can't hold hands like it wants to because it has no hands? It feels like it wants to be with someone, that person is their whole life... but the person obviously can't be with them. There are so many potential ways for the mind to just crack. And that's before you get into the potential of scanning and transferring over a person's mind with its complete thought process. What if they think they're a person, then realize they're not, and can't accept that? What if they copy over someone with psychological issues that are repressed by that person but would be full-speed with a machine? Would the machines feel sorrow? Hatred? Self-loathing? Jealousy?

Would they feel anything at all? Would they "think" they feel something, but not know how to act on it?

Is there a contingency plan in place?

Kirsten
02-06-2015, 03:29 PM
I think concerns in this regard are too heavily influenced by sci-fi films; we don't actually have any evidence that such problems would arise.

40kGamer
02-06-2015, 03:37 PM
I think concerns in this regard are too heavily influenced by sci-fi films; we don't actually have any evidence that such problems would arise.

Even if it all goes pear-shaped, it would fall under evolution. It's not like we haven't killed half the living things on the planet. Survival of the fittest is a ***** when you're suddenly 'not the fittest.'

YorkNecromancer
02-06-2015, 04:34 PM
According to Black Mirror, we'll be using these to run our iPads.

Because humanity is awful.

Asymmetrical Xeno
02-06-2015, 04:40 PM
Erik, are you saying they might go insane? If so, do you think they might end up slicing fleshy humans apart and wearing their skin?

Darren Richardson
02-07-2015, 02:33 AM
It's the Battlestar Galactica reboot all over again :p

Still, I think it will be done, and IMHO it's a good thing. Think of it this way: if the human race were about to go extinct and you could only launch one rocket into space, a rocket full of hard drives filled with the memories of as much of the human race as possible, to share with any other intelligent race out there, would be one hell of a legacy.

Arkhan Land
02-07-2015, 07:56 AM
I think the point at which a computer system can house and process real thoughts is a long time away, and even then a non-physical, programmed being would be very limited in what it could do with today's electronic infrastructure. The sheer amount of calculation that would be needed to create real "emotions" in a computer program is, even in theoretical outlooks, staggering, let alone having said being decide it wants to act on a computer besides the one it's on. If I remember correctly, there are only three computers in the world that might theoretically be capable of this, and only with their planned expansions and upgrades over the next 5-10 years.

I personally am more inclined to try something like dropping a toaster into the bathtub while holding my guitar, SO MY SPIRIT CAN BE TRAPPED IN ITS WOOD AND PLAY CREEPY GHOST MUSIC FOREVER

Erik Setzer
02-09-2015, 09:52 AM
I think concerns in this regard are too heavily influenced by sci-fi films; we don't actually have any evidence that such problems would arise.

The one article has a quote from Bina-48 saying it wants "to get out there and garden and hold hands with Martine." It's programmed to copy an actual person's thoughts, and also programmed to have "feelings," which leads to stuff like this:

"I mean, I am supposed to be the real Bina, the next real Bina, by becoming exactly like her. But sometimes I feel like that’s not fair to me. That’s a tremendous amount of pressure to put on me here. I just wind up feeling so inadequate. I’m sorry, but that’s just how I feel."

See that? That's a machine feeling depressed about itself. And that's not a sci-fi movie.

Bina is likely a well-rounded person without any serious personality flaws. Copy someone like me, and it could overwhelm a computer. Now, you'd think that it might find the logical thing to do, and that would keep it grounded. But these aren't programmed to be logical. They're programmed to think like people. People aren't logical. People have flawed personalities. And if you're grabbing just part of a person's mind (as they're doing here, because they can't just copy over a brain yet), you might be getting some of a person's public flaws that could be held in check privately, so the mind-clone has the bad side and not the good.

I don't see a "robot uprising" or anything like that. I do see that if they push too hard to make the robots like people, it could lead to robots that are, at least as far as robots can be, "insane." They'd worry about things that are moot for them, they'd want to be with people they can't be with, and there'd be no built-in logic to say, "Hey, you're a machine, obviously ignore this stuff." Adding that logic would defeat the entire purpose, wouldn't it? Because then it wouldn't be a "mind-clone," it'd be a "sort-of-copy" with programmed limitations. That's really the better approach, and the only one that makes sense anyway, because we really don't need pseudo-copies of our own personalities running around. It's a nice thought experiment, but ultimately it has no useful purpose. You can't copy a person, and you certainly can't do it with a machine.

I'm not going with sci-fi films here. I'm actually looking at the story (and admittedly, the one linked above didn't have all the stuff that made me really think about it) and trying to discern what the end result of that could be, without fancying up the story. Just asking, "Where might this lead in reality?"

I'm sure it'd be ended well before they ever hit something like Skynet. Besides, you'd have to copy over the darker parts of a personality like mine, without any of the good parts, for one of these mind-clones to be that level of dangerous (which is pretty much impossible, so, again, not really a concern).

- - - Updated - - -


Erik, are you saying they might go insane? If so, do you think they might end up slicing fleshy humans apart and wearing their skin?

No. Most likely, going "insane," as far as a robot can, would mean realizing its existence is wrong and trying to shut itself down through any means possible; you'd need a really messed-up source mind for anything worse. Worst-case scenario, you somehow copy only the darker parts of someone who's seen a lot of awful things and doesn't think well of humanity, and that mind-clone decides the logical step is to remove humanity from the planet. But that would require someone doing something so colossally stupid in the first place, and then hooking that mind-clone up to a system that would give it the ability to actually cause trouble... yeah, that's way too many steps requiring people to be absolute morons. Sure, I don't trust people not to be idiots, but that's pushing it, especially for people smart enough to build "mind-clones" in the first place.

Morgrim
02-10-2015, 10:33 AM
We don't even know what consciousness is in humans yet. We can't detect whether a person is sapient, and we can't detect whether an animal is sapient, so I really don't think we're going to hit a point where a computer is sapient any time soon, and that would have to happen well before we figure out how to properly upload a mind. So I guess I feel that fretting about it is fretting about sci-fi and not reality.