
View Full Version : Scientists fear the day computers become smarter than humans



Prrkitty
09-11-2007, 03:34 PM
http://www.foxnews.com/story/0,2933,296324,00.html

Quote: SAN FRANCISCO — At the center of a black hole there lies a point called a singularity where the laws of physics no longer make sense.

In a similar way, according to futurists gathered Saturday for a weekend conference, information technology is hurtling toward a point where machines will become smarter than their makers.

If that happens, it will alter what it means to be human in ways almost impossible to conceive, they say.

--------------

I understand ahead of time that I really really don't fully understand this issue. So please don't rag on me about that... please and thank you.

Simplistically... aren't computers already smarter than humans? I always hear the old adage that... "computers are only as smart as the human that programs them".

Yes my thoughts are disjointed about this issue... I have a hard time wrapping my brain around it... so to speak.

Beldaran
09-11-2007, 03:58 PM
Machines are not currently smarter than people, not even close. Machines appear smart because they can hold large amounts of data and process it quickly and efficiently using programming provided by humans, but they do not have "intelligence" like we do.
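To illustrate the "fast processing isn't intelligence" point, here is a toy example (just standard Python, nothing from any real AI system): the computer evaluates this instantly, yet the program has no concept of what it computed.

```python
# Raw processing speed, not intelligence: summing the first million
# squares is instant for a machine, but involves zero understanding
# of what a "square" or a "number" even is.
total = sum(n * n for n in range(1, 1_000_001))
print(total)  # 333333833333500000
```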

A machine as smart as a person could invent things. We currently do not have a machine that can hold a real conversation (to my knowledge). Conversation can be simulated using programming tricks, but it's just a simulation of intelligence.
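To show the kind of "programming trick" meant here, a toy sketch in the spirit of Weizenbaum's ELIZA (everything below is invented for illustration, not any real chatbot's code): the bot just pattern-matches and reflects the user's own words back, with no understanding involved.

```python
import re

# Toy ELIZA-style responder: keyword rules plus pronoun reflection.
# It only rearranges the input text; no comprehension is involved.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "you": "I", "your": "my"}

RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"(.*)\?$"), "What do you think?"),
]

def reflect(fragment):
    # Swap first- and second-person words so the echo sounds like a reply.
    words = fragment.lower().split()
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def respond(sentence):
    for pattern, template in RULES:
        match = pattern.match(sentence.strip())
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Tell me more."

print(respond("I feel trapped by my computer"))
# -> Why do you feel trapped by your computer?
```

A few dozen rules like these were enough for ELIZA to fool some users in the 1960s, which is exactly why passing for conversational is such a weak test of intelligence.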

The_Amaster
09-11-2007, 04:34 PM
A machine becomes smart at the point where it can innovate. That's my two cents.

And I don't fear the day of machine takeover. We'd build in failsafes, and ones better than just the three laws, to avoid an "I, Robot" situation.

Aegix Drakan
09-11-2007, 04:52 PM
lol, nerdocalypse.

Machines can only process; they cannot create. Therefore, they cannot ever be superior to humans. However, it could be possible for machines to be developed to the point where they become dangerous.

Also, WTF? Using this theory to make machine implants for the human brain? Does this worry anyone else? I mean, I'd LOVE to be able to go on the internet with my mind, and download some good pickup lines and FAQs (for easier access), but STILL. It just seems a little bit dangerous.

Beldaran
09-11-2007, 05:04 PM
Machines can only process; they cannot create. Therefore, they cannot ever be superior to humans.

Completely false. You are a machine. It is pretty much inevitable that humans will eventually build a machine as complex as themselves. Modern AI research is close to understanding how true intelligence works mathematically; what researchers mostly lack is the hardware to implement it.

See "A Fire Upon The Deep" by Vernor Vinge and The Artilect War (http://www.amazon.com/Artilect-War-Controversy-Concerning-Intelligent/dp/0882801546).

Prrkitty
09-11-2007, 07:42 PM
OK... I was mistaking "faster processing" for intelligence. Sorry.

I realize a lot of Sci-Fi Channel movies are built around what would happen if computers were as smart as us. But what kind of real-life advances would need to happen before they (computers) could be intelligent? <- assuming it was possible...

Trevelyan_06
09-11-2007, 11:40 PM
I do think that as AI gets more and more advanced, scientists will be apt to build in safeguards such as Asimov's three laws of robotics.
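In software terms, such safeguards amount to a precondition filter that vets every proposed action before it executes. Here's a hypothetical sketch (the `Action` fields and the harm model are invented for illustration; real safety engineering is nothing this simple):

```python
# Hypothetical sketch of Asimov-style safeguards as an action filter.
# The Action fields and rules here are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    harms_human: bool = False
    disobeys_order: bool = False
    harms_self: bool = False

def permitted(action):
    """Apply the three laws in priority order; return (ok, reason)."""
    if action.harms_human:
        return False, "First Law: may not injure a human"
    if action.disobeys_order:
        return False, "Second Law: must obey human orders"
    if action.harms_self:
        return False, "Third Law: must protect its own existence"
    return True, "permitted"

print(permitted(Action("fetch coffee")))                  # (True, 'permitted')
print(permitted(Action("push human", harms_human=True)))  # blocked by First Law
```

Checking the laws in priority order is what makes the First Law override the other two, which is the ordering Asimov's stories turn on.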

Besides, if the machines ever get out of hand we can always just scorch the sky, thusly robbing them of their solar energy. That'll work, right?

rock_nog
09-12-2007, 10:02 AM
From watching countless movies and reading countless books on the subject, I think I can safely say that the problem isn't their intelligence, nor is it their programming. The problem is always that despite their great intelligence, we treat them as inferiors, as slaves. I don't believe in relying on Asimov's three laws, because there's always some glitch or whatever, and besides, I feel like that's imposing on their free will. No, we should embrace the machines, accept them as equals, give them full rights and protections under the law. For God's sake, they're not tools for us to use, they're living beings.

Beldaran
09-12-2007, 10:15 AM
From watching countless movies and reading countless books on the subject, I think I can safely say that the problem isn't their intelligence, nor is it their programming. The problem is always that despite their great intelligence, we treat them as inferiors, as slaves. I don't believe in relying on Asimov's three laws, because there's always some glitch or whatever, and besides, I feel like that's imposing on their free will. No, we should embrace the machines, accept them as equals, give them full rights and protections under the law. For God's sake, they're not tools for us to use, they're living beings.

You know what? This is really insightful and I agree.

Prrkitty
09-12-2007, 02:00 PM
But... machines will never have a soul. Isn't that what would distinguish them from being like us (humans)?

Yes eventually they might develop a consciousness... but... (I'm not sure where my thought is going with that so I'll stop for the moment)

rock_nog
09-12-2007, 02:03 PM
But... machines will never have a soul. Isn't that what would distinguish them from being like us (humans)?

Yes eventually they might develop a consciousness... but... (I'm not sure where my thought is going with that so I'll stop for the moment)
What is a soul? And how can you say whether or not a machine has one? What if we just have to accept that, biology aside, we can't distinguish ourselves from them?

Prrkitty
09-12-2007, 02:50 PM
What is a soul? And how can you say whether or not a machine has one?

From my understanding... a soul is what distinguishes us from... say, a tree. We (smart humans) know that animals feel pain and are conscious of their surroundings, so I think they also have souls. At this point we're getting into philosophical subjects that I don't know that much about.


What if we just have to accept that, biology aside, we can't distinguish ourselves from them?

I'm sure some might be able to just accept that (if it comes to that)... but I'm also sure some will fight that fact until their dying breath.

Aegix Drakan
09-12-2007, 03:12 PM
The way this conversation is going is making me think...

*sits in corner and thinks about the meaning of life*

hmm...

Beldaran
09-12-2007, 03:42 PM
some will fight that fact.

That's what's so amusing about fighting facts... you always lose to a fact. Always.

There is no scientific reason to think humans have souls. It is something made up by ancient and medieval mystics. The fact is, the human behavior commonly attributed to having a soul is really just the behavioral complexity arising from our extremely complex chemical computers (our brains).

A "soul" is a non-scientific idea and has no place in a discussion of robotics, artificial intelligence, and technological progress.

Trevelyan_06
09-12-2007, 04:24 PM
"Who was to say that machines, endowed with the very spirit of man, did not deserve a fair trial?" That's a quote from The Animatrix, from the segment called "The Second Renaissance: Part I". In it, a robot kills its owner and, during the investigation, says that it merely did not want to be destroyed, which is what its owner intended. In the film, many people say that owners have a right to destroy their property. It has a lot of parallels to black slavery in the South.

rock_nog was right. Imposing the three laws subverts their free will. And in most science fiction you find that the robots are treated as second-class citizens. If we create an AI that can effectively mimic our intelligence, how would it be ethical of us to treat them like shit?

As for the whole soul thing, that's a big powder keg right there. As Beldaran said, some people don't believe that humans have souls, so for them it's a moot point whether or not an AI robot has one. Some people believe in a soul and believe that an AI bot would indeed have one if it could demonstrate certain abilities, such as empathy for others' feelings, kindness, and the ability to ascertain good from evil. And then some people think that an AI bot can't have a soul because it's made by the hand of man and not the hand of God. It'll be a very touchy subject when it comes up. Regardless of whether or not we deem robots to have souls, I don't think we should use them as slave labor. Subverting the will of something that is sentient is just not right.

rock_nog
09-12-2007, 04:36 PM
Unfortunately, the concept of the "soul" does have a place in a robotics discussion. As much as I agree with you, many people believe in the soul, and believe that it somehow makes us "special." As terrible as it is, it will likely be used as an argument for machines being treated as slaves, and there will be a lot of resistance to any attempts to change this viewpoint. So you see, the fact that people believe in it makes it an issue, at least in the field of human/machine relations.

Now, I know it might seem strange, the fact that I'm so passionate about something that doesn't even exist. But personally, I believe that it's not a question of if, it's a question of when, in terms of machines with human-level intelligence. After all, we exist, thus proving that machines with human-level intelligence can be built. And eventually we're gonna figure out how, and we're going to give birth to an entirely new form of life. And when that day comes, we need to be ready to handle the ethical questions that will be raised, because it could affect the entire future of the human race.

EDIT: Oh, and Beldaran, I've gotta admit, this thread is really making me start to realize the dangers of religion. The thought that an entire group of people (I feel that all sapient beings should be called the same thing - different name = different treatment) could be enslaved and treated barbarically simply on account of what someone wrote in a book two thousand years ago is truly frightening and disturbing.

Prrkitty
09-12-2007, 04:40 PM
I wholeheartedly agree that we should NOT use robots as slave labor.

At this point in time we do have little round robot-type machines that can vacuum and scrub our floors. I don't ever see them li'l things (as they are now/today) as having feelings. In the future maybe they will be built to have human characteristics.

But not as they are in this day and age.

Beldaran
09-12-2007, 05:16 PM
EDIT: Oh, and Beldaran, I've gotta admit, this thread is really making me start to realize the dangers of religion. The thought that an entire group of people (I feel that all sapient beings should be called the same thing - different name = different treatment) could be enslaved and treated barbarically simply on account of what someone wrote in a book two thousand years ago is truly frightening and disturbing.

Yes, this is also how black slavery was justified. Southern whites used biblical arguments about how black people were less than human, didn't have souls, were cursed by god... any number of things.

Revfan9
09-12-2007, 05:28 PM
Computers are already smarter than most humans, me included. This isn't news, or at least, it shouldn't be.

Beldaran
09-12-2007, 05:48 PM
Did you read past the first post in this thread?

biggiy05
09-12-2007, 09:17 PM
Did you read past the first post in this thread?

I doubt it, but that is why his post is only somewhat true.

Aegix Drakan
09-12-2007, 09:43 PM
*First off, I'd like to say that I do believe in the soul, and that only living organic things can have one. However, if a machine becomes "sentient", I do believe it deserves to be treated as a person.*

At the moment, I don't think there are any "sentient machines".

However, if it can show "empathy for others' feelings, kindness, and the ability to ascertain good from evil," as Trev put it, then they cease to be simple machines. They become something more akin to "non-organic life-forms".

In that case, I have to say that they do deserve respect and treatment as equals. Especially if AI gets as advanced as the Reploids in the Megaman series.

Just because something does not possess a soul does not make it inferior. It just makes it different. And what's the harm in being different?

*BTW, I saw The Animatrix a while back. I remember thinking that the humans were really stupid to deny the robots a chance to be "people" and to attack them instead. I honestly hope we never have a situation like that.*

Prrkitty
09-13-2007, 03:05 PM
Robot maker with a penchant for realism builds artificial boy

http://www.cnn.com/2007/TECH/biztech/09/13/robotboy.ap/index.html

Quote: David Hanson has two little Zenos to care for these days.

There's his 18-month-old son Zeno, who prattles and smiles as he bounds through his father's cramped office.

Then there's the robotic Zeno. It can't speak or walk yet, but has blinking eyes that can track people and a face that captivates with a range of expressions.

At 17 inches tall and 6 pounds, the artificial Zeno is the culmination of five years of work by Hanson and a small group of engineers, designers and programmers at his company, Hanson Robotics. They believe there's an emerging business in the design and sale of lifelike robotic companions, or social robots. And they'll be showing off the robot boy to students in grades 3-12 at the Wired NextFest technology conference Thursday in Los Angeles.

-------------

Instead of starting a new thread I just posted this link here. It does pertain and relate to the ongoing subject matter.

At the moment the robot can NOT talk or walk, but it CAN make facial expressions.

Aren't there other researchers doing this same kind of research who are further along than this? (i.e., robots that have talking capabilities; I know Japan/China (can't remember which) has a robot that can walk up and down stairs).

Wouldn't it be prudent for all these people to collaborate with each other? But then again... each one wants to be the "FIRST" to reach their end result so they can have bragging rights.