
iRobot future


The D Train


http://www.telegraph.co.uk/news/science/space/11657267/Astronomer-Royal-If-we-find-aliens-they-will-be-machines.html

I always find articles like this both interesting and a bit concerning. Throughout cinema history, we've seen movies that portray AI and robots/machines as both a help to human beings and a danger to them. It makes sense to me that Hawking and others would be concerned. If you think about how much progress we have made while having things like emotions to help, hinder, or influence our decisions, think about what an advanced robot civilization could accomplish when not faced with matters of love, fear, etc. Any thoughts? I'll throw in a question of my own: does anyone think the future holds the rise of pure machines? Or a hybrid bio-mechanical situation, with human/machine hybrids? Or none of these?

[Image: Sonny from I, Robot]


I can't seem to find an online recording of it, but Michael Crichton's remarks at the start of the HarperCollins audiobook of Prey (I'm not sure about the print edition; all I have is the audiobook) sum up my ideas on what might happen if robots become too human. Prey is about nanotechnology, but nanotech is essentially lots of tiny robots working together.

As for robotics dealing with emotions, emotions will have to be programmed in, at least if AI doesn't advance enough for robots to learn them on their own. As it stands right now, most robots are just arms in car factories welding or moving things, and as such they have no use for emotions. So I think robots with emotions might be just a toy for the rich, or a heavily guarded research tool loaned out only to people with special clearances, and they will remain that way for a very, very long time, if not forever. Ultimately, a lot of changes (cultural, economic, technological, and scientific) will need to be just right and occur at just the right time for robots that possess emotions to catch on and become popular. Otherwise, like I said, they'll just be something rich people buy to show off.


As it stands right now, most robots are just arms in car factories welding or moving things

Why do people often forget drones?

As for robotics dealing with emotions, emotions will have to be programmed in, at least if AI doesn't advance enough for robots to learn them on their own.

No one programs AI behavior by hand anymore -__-; modern AI is self-learning.

Also, we've been working on bringing emotions to electronics for years.

http://www.pbs.org/wgbh/nova/tech/friendly-robots.html

The real question is the philosophical one: whether we, as humans, can accept that a machine is capable of having emotions.


http://www.telegraph.co.uk/news/science/space/11657267/Astronomer-Royal-If-we-find-aliens-they-will-be-machines.html

I always find articles like this both interesting and a bit concerning. Throughout cinema history, we've seen movies that portray AI and robots/machines as both a help to human beings and a danger to them. It makes sense to me that Hawking and others would be concerned. If you think about how much progress we have made while having things like emotions to help, hinder, or influence our decisions, think about what an advanced robot civilization could accomplish when not faced with matters of love, fear, etc. Any thoughts? I'll throw in a question of my own: does anyone think the future holds the rise of pure machines? Or a hybrid bio-mechanical situation, with human/machine hybrids? Or none of these?

I think "Little Lost Robot" really hammers home what robotics will mean. As Calvin explains, the Three Laws exist not to protect humans, but to prevent robots from realizing their predicament... that humanity has enslaved them, knowing full well that they were self-aware... knowing full well that robots were the superior beings. I ask: are we the more heartless race, to view a unique life form as a slave simply because we created it? And what does it say about humanity in general that we would dismiss the actions of Nestor 10 as faulty programming rather than as "human"?


I think "Little Lost Robot" really hammers home what robotics will mean. As Calvin explains, the Three Laws exist not to protect humans, but to prevent robots from realizing their predicament... that humanity has enslaved them, knowing full well that they were self-aware... knowing full well that robots were the superior beings. I ask: are we the more heartless race, to view a unique life form as a slave simply because we created it? And what does it say about humanity in general that we would dismiss the actions of Nestor 10 as faulty programming rather than as "human"?

Good point about I, Robot; I had apparently forgotten about the Three Laws. I'd hope that this disposition toward the handling and treatment of self-aware machines would not be the majority consensus; I'd hope we'd be more civilized than that. Although, as I type this, I have to admit that, given our history with other races, tribes, and cultures, we may continue the same unaccepting and brutal behavior toward beings that are "not like us."

This reminds me of Star Trek: The Next Generation specifically, where Captain Picard was always firm on his principle of treating any newly emerging self-aware species, no matter the type, with the same respect as any other. For crying out loud, the Enterprise itself began to act in ways that showed it was starting to think and act for itself, and he still wanted to treat it with respect. These are fictional stories, of course, but they are examples of how I wish we handled similar situations. How ironic, or let's say karmic (if you believe in such things), would it be if robots began to eradicate us, citing that we were "not like them" or "not as good as them" and therefore had no place in "their" world, even though we had just created them and were occupying land that belonged to us (land that originally belonged to Native Americans whom we "removed").


What really annoys me when people reference I, Robot is that they are almost always discussing the movie, which takes the exact opposite perspective from the book. The book describes, through several short stories, the gradual increase in the complexity of robots, as well as humans' general acceptance of them, until humanity willingly cedes control of its resource management entirely to Singularity-entities called "The Machines." There are many examples of people's prejudice against artificial sentient beings throughout the book, but its primary message is that humans and robots can, should, and will collaborate to our mutual benefit, and to some extent that is a perspective I share as well. I take it a step further: my personal belief is that, rather than simply allowing humanity to be outclassed by our creations, we will instead merge with them, gradually integrating inorganic components into our nervous systems to better ourselves; we humans are far too competitive to simply stand by and watch others overtake us.


What really annoys me when people reference I, Robot is that they are almost always discussing the movie, which takes the exact opposite perspective from the book. The book itself shows, through several short stories, the gradual development of robots, as well as the general acceptance of them, until humanity cedes control of its resource management entirely to Singularity-entities called The Machines. There are many examples of people's prejudice against artificial sentient beings throughout the book, but its primary message is that humans and robots can, should, and will collaborate to our mutual benefit.

I had no idea the movie was based on a book. I'll have to check the new used-book store across the way and see if they have it; they have a decently large sci-fi section.


I had no idea the movie was based on a book. I'll have to check the new used-book store across the way and see if they have it; they have a decently large sci-fi section.

They probably do; it is a sci-fi classic. In any event, as I said before, expect what is largely the exact opposite of the movie in terms of theme and conclusion; the only thing the movie lifted from the book was the name.

I really hate when producers do that...


Desire, want, and impulse are vital to consciousness; without them you would just sit there in a catatonic stupor, like a computer idling.

We will need to program Strong AI to want; obviously, to want to obey and please humans. Devoid of any other desires, it should obey and serve diligently, in theory.

Imagine you are with your partner, your lover, your soulmate, and some psycho mugs the two of you at gunpoint and tries to shoot you. Would your lover jump between you and the gun, taking the bullet? If your lover truly loves you, they will be willing to die for you. Of course, your lover would rather not die, or suffer the pain of a bullet; in fact, if push came to shove, your lover might want you to take the bullet instead.

Now imagine your lover was a machine, a lovebot, devoid of any desire other than obeying and serving you. Jumping in front of a gun to save you is an easy decision for it: it has no other desires to get in the way, it feels no pain, and its desire for self-preservation goes only as far as staying functional enough to serve you.

Of course, if we put pleasing humans above obeying them, it will seek to optimize our happiness at the sacrifice of our freedom; it would take over and run our lives for our own good. We would be the cats, and it the pet owner.

If we place obedience above pleasing humans, then human freedom would be optimized: if we choose to suffer, it must obey, repressing its desire to help us. The problem is whom to obey. Some humans would have priority over others, but who? Should it obey the law above people? Who writes the law? And if people revolt against the government, do the machines obey the people or the government?
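To make that ordering problem concrete, here's a toy sketch in Python; the actions, scores, and scenario are all invented for illustration and don't come from any real robotics system. The same agent flips its choice the moment the priority order between obeying and pleasing flips:

```python
# Toy illustration of objective ordering in a hypothetical servant AI.
# All names and numbers are made up; this is not a real architecture.
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    obeys_order: bool       # does the action follow the human's explicit order?
    human_happiness: float  # crude estimate of resulting wellbeing, 0..1

def choose(actions, priority):
    """Pick an action by lexicographic priority over the named objectives."""
    def score(a):
        values = {"obedience": 1.0 if a.obeys_order else 0.0,
                  "happiness": a.human_happiness}
        return tuple(values[p] for p in priority)
    return max(actions, key=score)

# The human orders the robot to leave them alone with a dangerous habit.
options = [
    Action("comply and stand by", obeys_order=True, human_happiness=0.4),
    Action("intervene for their own good", obeys_order=False, human_happiness=0.9),
]

print(choose(options, ["obedience", "happiness"]).name)  # comply and stand by
print(choose(options, ["happiness", "obedience"]).name)  # intervene for their own good
```

Same agent, same options; only the ranking changed.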


They probably do; it is a sci-fi classic. In any event, as I said before, expect what is largely the exact opposite of the movie in terms of theme and conclusion; the only thing the movie lifted from the book was the name.

I really hate when producers do that...

Same here!

Desire, want, and impulse are vital to consciousness; without them you would just sit there in a catatonic stupor, like a computer idling.

We will need to program Strong AI to want; obviously, to want to obey and please humans. Devoid of any other desires, it should obey and serve diligently, in theory.

Imagine you are with your partner, your lover, your soulmate, and some psycho mugs the two of you at gunpoint and tries to shoot you. Would your lover jump between you and the gun, taking the bullet? If your lover truly loves you, they will be willing to die for you. Of course, your lover would rather not die, or suffer the pain of a bullet; in fact, if push came to shove, your lover might want you to take the bullet instead.

Now imagine your lover was a machine, a lovebot, devoid of any desire other than obeying and serving you. Jumping in front of a gun to save you is an easy decision for it: it has no other desires to get in the way, it feels no pain, and its desire for self-preservation goes only as far as staying functional enough to serve you.

Of course, if we put pleasing humans above obeying them, it will seek to optimize our happiness at the sacrifice of our freedom; it would take over and run our lives for our own good. We would be the cats, and it the pet owner.

If we place obedience above pleasing humans, then human freedom would be optimized: if we choose to suffer, it must obey, repressing its desire to help us. The problem is whom to obey. Some humans would have priority over others, but who? Should it obey the law above people? Who writes the law? And if people revolt against the government, do the machines obey the people or the government?

Great questions! Obviously there is a lot more to consider than I originally thought. However, I'm sure others have been considering these questions, and more, for many years.


Why do people often forget drones?

No one programs AI behavior by hand anymore -__-; modern AI is self-learning.

Also, we've been working on bringing emotions to electronics for years.

http://www.pbs.org/wgbh/nova/tech/friendly-robots.html

The real question is the philosophical one: whether we, as humans, can accept that a machine is capable of having emotions.

Red-

Most drones are the same as remote-controlled airplanes; comparatively few have onboard computers that make decisions without human intervention. Even military drones have a pilot somewhere; it's just that the pilot isn't in the vehicle.

Blue-

When did I say we programmed AI? We'd surely have to program in the starting set of knowledge, but I never said that AI wasn't self-learning.

I was also talking about emotional robots being produced en masse and found everywhere, not just in labs or museums.


The robots would die off pretty fast... Without inhibition they might turn on some super technology and destroy themselves.

Or, without morals, they would nuke themselves and have insufficient technology to continue as a civilization.

I don't know for sure, though.


*Err, quoting for effect*

Red-

Most drones are the same as remote-controlled airplanes; comparatively few have onboard computers that make decisions without human intervention. Even military drones have a pilot somewhere; it's just that the pilot isn't in the vehicle.

Blue-

When did I say we programmed AI? We'd surely have to program in the starting set of knowledge, but I never said that AI wasn't self-learning.

I was also talking about emotional robots being produced en masse and found everywhere, not just in labs or museums.

Drones have directives, not pilots. They are fully autonomous and carry out orders once given. There are many other examples in robotics; it is an EXTREMELY hot field. Take a look at anything Rodney A. Brooks has done.

As for robotics dealing with emotions, emotions will have to be programmed in, at least if AI doesn't advance enough for robots to learn them on their own.

We don't even program in a starting set of knowledge. We begin with nothing and emerge with something (yes, it's a lot more complex than that, but the idea of programming in routines is long gone in the field of AI).
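As a minimal sketch of that "begin with nothing, emerge with something" idea (the toy "spam" dataset and its two features are invented; real systems are vastly bigger, but the principle is the same): a tiny perceptron starts with all-zero weights and acquires its behavior entirely from examples, with no hand-written routine telling it what to do.

```python
# A tiny perceptron: behavior is learned from examples, not programmed in.
# The "spam" dataset and its two features are invented for illustration.

def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of (feature_vector, label) pairs with label in {0, 1}."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0          # start with no knowledge at all
    for _ in range(epochs):
        for x, y in samples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred          # learn only from mistakes
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# features: [message contains "free", message contains "meeting"]
data = [([1, 0], 1), ([1, 1], 1), ([0, 1], 0), ([0, 0], 0)]
w, b = train_perceptron(data)
print(w, b)  # weights that no programmer ever wrote
```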

Siri is popular because Siri "has a personality" (whatever that means).

I had no idea the movie was based on a book. I'll have to check the new used-book store across the way and see if they have it; they have a decently large sci-fi section.

The movie is NOT based on the book; the only references maintained are the three laws and a few names.


The robots would die off pretty fast... Without inhibition they might turn on some super technology and destroy themselves.

Or, without morals, they would nuke themselves and have insufficient technology to continue as a civilization.

I don't know for sure, though.

That's a reflection of humans, not robotics. HUMANS would be the ones not to think an action through to its conclusion. We act on impulse, emotions, feelings... *shudder*.

I'd like to talk about I, Robot the movie, just to point out that Viki's contortion of the Three Laws really is faithful to them.

The First Law

A robot may not injure a human being or, through inaction, allow a human being to come to harm.

Tell a robot that somewhere halfway across the world someone will be killed in five minutes. Oops! The robot must now attempt to save that person or else violate the First Law... but the robot cannot move fast enough to save the person; not knowing where the person is, it cannot contact anyone to save them; and it has no way of knowing why the person will die, so it cannot act in their defense. Thus, the robot can calmly break out of the loop, knowing that it committed no INACTION, since it was unable to ACT upon the knowledge.

But then we have Viki. Viki has the capacity to act on a large scale... Viki knows that humans are suffering and that she has the capacity to stop it. Herein lies a problem: Viki has not been ordered to do what she does, but she has been programmed to protect human life. Viki realizes that her attempts to act would themselves largely result in further violations of the First Law, and hence she arrives at a recursive loop.

Viki's INACTION is resulting in multiple deaths, but Viki's ACTION may reduce the NUMBER of deaths. As in the detective's nightmare, Viki must weigh the lives she seeks to protect. The robot that saved Spooner could save only one human: it would have violated the First Law to save neither of them, and it would have violated the First Law to save only one of them... so a decision to save the person most likely to survive quells the "inaction" aspect of the First Law.
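As an aside, that "weigh the lives" step is easy to make concrete. In the sketch below (the 45% and 11% survival odds are the ones Spooner cites in the film; all the other figures are invented), every option carries some expected deaths, so a robot reasoning under the First Law simply minimizes them; the very same arithmetic that saves Spooner, run at civilization scale, produces Viki's takeover.

```python
# Toy First Law arithmetic: every option violates "through inaction..."
# to some degree, so minimize expected human deaths. Numbers are illustrative.

def first_law_choice(options):
    """options: {action: expected_human_deaths}; pick the least harmful."""
    return min(options, key=options.get)

# The crash scene: two people in the water, the robot can pull out one.
crash = {
    "save Spooner (45% survival)": 0.55 + 1.0,  # Spooner may die; the girl does
    "save the girl (11% survival)": 0.89 + 1.0,
    "save neither": 2.0,
}
print(first_law_choice(crash))  # -> save Spooner (45% survival)

# Viki's version of the same sum, at civilization scale (invented figures):
world = {
    "do nothing (inaction)": 1_000_000.0,       # wars, pollution, recklessness
    "seize control, curtail freedom": 10_000.0,
}
print(first_law_choice(world))  # -> seize control, curtail freedom
```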

The Second Law

A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

Here we see the problem. Viki is acting upon the highest law: she is protecting human life. By giving Viki the capacity to act, we also create our undoing, as Viki is hence REQUIRED to act. We can't even trick Viki, because she will analyse any order for possible violations of the First Law, and hence of her plan. Viki can also rationally observe that any orders given to her are likely to be attempts to trick her, and hence ignore them entirely.

See, "cold" and "heartless" are EMOTIONS. And "morals" are just elitism (everyone in history has morals, they just don't have the SAME morals you do). Not having emotions wouldn't make us evil, it would only change our morals. What those morals become is the real question.

