I can't really imagine a machine having emotions, since emotions have a lot to do with hormones and chemicals in the brain (serotonin is the only one I can name). Machines can follow complicated logic patterns and therefore make conversation, learn, make decisions, etc... but how can you imitate an emotion?
Computers do what you tell them to do, therefore they are limited by what we can tell them to do. Computers and robots will be the greatest tool yet for understanding our own brains.
I don't see computers ever having the means necessary to house a soul, something we humans often take for granted.
There is no sense in creating robots who are "afraid" to die.
Now what if you take the mind, the actual consciousness of a human, and house it in a machine?
One could argue that all your emotions are just responses to the chemicals in your brain. In that sense, couldn't a machine have an emotional response?
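To push that argument one step further, here's a minimal Python sketch of the idea. Everything in it is invented for illustration (real neurochemistry is vastly more complex): it just treats "emotions" as chemical-like state variables that bias behavior, rather than as explicit rules about feelings.

```python
class Agent:
    """Toy model: 'emotions' as decaying chemical levels that bias behavior."""

    def __init__(self):
        # Simulated neurochemical levels, normalized to the range 0..1.
        self.serotonin = 0.5   # higher -> more content
        self.adrenaline = 0.0  # higher -> more fearful/reactive

    def perceive(self, threat_level):
        # Stimuli shift the chemical state, the way events trigger
        # hormone release in a brain.
        self.adrenaline = min(1.0, self.adrenaline + threat_level)
        self.serotonin = max(0.0, self.serotonin - 0.2 * threat_level)

    def decay(self):
        # Levels drift back toward baseline over time.
        self.adrenaline *= 0.9
        self.serotonin += 0.05 * (0.5 - self.serotonin)

    def respond(self):
        # Behavior is a function of chemical state, not an explicit
        # rule that says "act afraid".
        if self.adrenaline > 0.7:
            return "flee"
        if self.serotonin > 0.6:
            return "explore"
        return "idle"


agent = Agent()
agent.perceive(threat_level=0.9)
print(agent.respond())  # "flee": the chemical state, not a fear rule, drove the choice

for _ in range(20):
    agent.decay()
print(agent.respond())  # "idle": the levels have drifted back toward baseline
```

The point isn't that this toy *is* an emotion, only that "a response to chemical levels" is, in principle, something a machine can implement.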
Originally posted by danoff
Pako just broached the heart of the matter.
*snip*
Today we have quite a lot of knowledge about how we function as humans: emotions, goal-orientation, cognition, and so on. But there is one big problem: we are individuals who cannot be described by a theory.
Computers try to mimic the brain. But I'm not sure it's accurate to say the brain is a computer. "Computer" is a very literal name. All they do is compute. It seems that our brain does more than that. Many human problems would not happen if our brains were computers. But much more would be lost.
Originally posted by Red Eye Racer
Our brains are computers... nothing more, nothing less... I believe it would be possible (in the far distant future) to make a machine think like a human... or a dog... or a cow... all we have to do is copy the brain and give it the opportunity to receive the same impulses we do. In a sense... give the machine a "soul". It doesn't seem logical at all... but if you see our brains as just complex computers... it seems to make sense. Think about it... our eyes are the video card, our vital organs are the power supply, and our nerves are the wires connecting them all!
Originally posted by milefile
Computers try to mimic the brain. But I'm not sure it's accurate to say the brain is a computer. "Computer" is a very literal name. All they do is compute. It seems that our brain does more than that. Many human problems would not happen if our brains were computers. But much more would be lost.
This is one of the few totally obvious statements, ever, that actually made me start to rethink something. You've got a good point.
Originally posted by Red Eye Racer
Like they say... we only use what?... 14% of it? If my computer was running at 14% efficiency it would have some problems too!
But much more would be lost.
danoff
Let's say hypothetically that we were able to create a robot that has the ability to solve problems and is aware of its own existence. It is aware of its mortality (in that it can be destroyed), it makes choices, and it is creative.
If we did that, would the robot have rights? If we created a robot capable of doing all of this, would it be unethical to kill it? Would it be unethical to perform tests on it? Can it own property? Enter into a contract?
My thought would be yes. That it has rights. That the robot should be free, that it could own property or enter into a contract.
But I wonder what the religious folks around here think... If it doesn't have a soul, doesn't that mean we can kill it?
Swift
That's a scary thought, Dan. What's more scary is that we're not too far from that now.
How do I feel about it? Well, that would be exactly the same as saying a clone should have rights, etc. So, I think no.
I do believe that man has the intelligence to get to that point, but I hope we never do.
Machines that are self-aware are a scary thing. Mainly because most machines that we create are more powerful than humans. I know I sound like a movie, but we all know that if we do get to that point it's going to be a mixture of I, Robot, Terminator, and Star Wars (droids)!
TwinTurboJay
Why does man have this self-destructive, insatiable desire to make AI and in effect endanger its own existence?
danoff
Let's say hypothetically that we were able to create a robot that has the ability to solve problems and is aware of its own existence. It is aware of its mortality (in that it can be destroyed), it makes choices, and it is creative.
If we did that, would the robot have rights? If we created a robot capable of doing all of this, would it be unethical to kill it? Would it be unethical to perform tests on it? Can it own property? Enter into a contract?
My thought would be yes. That it has rights. That the robot should be free, that it could own property or enter into a contract.
But I wonder what the religious folks around here think... If it doesn't have a soul, doesn't that mean we can kill it?
I am assuming you expect it would also fear its own mortality, because knowledge of mortality without fear is meaningless.
TwinTurboJay
Why does man have this self-destructive, insatiable desire to make AI and in effect endanger its own existence?
Because we all have this tiny God complex, even if subconsciously, and a desire to create life even if it means our own destruction. Fortunately, that same complex would lead someone to help prevent that destruction in order to become a hero and better than the creator of the AI.
///M-Spec
Swift
That's a scary thought, Dan. What's more scary is that we're not too far from that now.
How do I feel about it? Well, that would be exactly the same as saying a clone should have rights, etc. So, I think no.
Swift...I was just thinking of that episode too.