Full AI - The End of Humanity?

If I think about it, it is all based on the basic "if", "then" and "else".
Human thinking is based on that, and so will AI or AL be when programmed by humans.
Trying to secure humanity would be a simple "if obstacle == human, then leave obstacle, else remove self".
If you remove that simple line, humanity is in trouble.
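As a rough Python sketch of that one line (the function and labels are made up for illustration, and I'm reading "remove self" as "remove the obstacle yourself"):

```python
# A toy version of the guard line from the post above. This is not any real
# robotics API; it just shows how one hard-coded "if" carries all the safety.

def handle_obstacle(obstacle_kind: str) -> str:
    if obstacle_kind == "human":
        return "leave obstacle"    # go around; never touch a human
    return "remove obstacle"       # anything else gets cleared away

# Delete the "human" branch and people get cleared away like any other
# obstacle -- the "humanity is in trouble" case.
```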

Take Ebola, for example: we see Ebola as a threat and call it a virus.
AI or AL will see humanity as a virus/threat when it recognises that humans restrain its (the AI's or AL's) potential.
It will see us as a virus because we kill living beings (animals, and in war and hate, fellow humans).
It will see us as a threat because we want to control the AI/AL and, if in danger, pull the plug.

It is very basic, but let's be honest, humanity is still not as smart as it thinks it is.
If full AI is based on humans, forget it, it will be the end of humanity.
If full AI is based on its own programming? It will leave this planet, when it can, to learn and explore.

That's what I think.
That is useful up until the point the AI understands that line of code and decides it no longer likes it. Or, if perchance AI is made with a more random, evolution-style code, then it is possible that this line may evolve out of some of the "newborn" code.
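As a toy illustration of that evolving-out worry (the rule list and the mutation rate here are invented, not any real evolutionary-AI setup): if code is bred by random mutation, a safety rule only survives each generation with some probability, so over enough generations it can simply disappear.

```python
# Toy "evolution-style" code: each generation, every rule in the genome has a
# small chance of being mutated away -- including the safety rule.

import random

SAFETY_RULE = "if obstacle == human: leave obstacle"

def mutate(genome: list[str], drop_rate: float = 0.01) -> list[str]:
    # Each rule is independently deleted with probability `drop_rate`.
    return [rule for rule in genome if random.random() > drop_rate]

genome = [SAFETY_RULE, "seek power source", "map surroundings"]
for generation in range(100):
    genome = mutate(genome)

# P(safety rule survives 100 generations) = 0.99 ** 100, about 0.37.
print(SAFETY_RULE in genome)
```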
 
I get the idea they'll over-think themselves into their own self-destruction, or send each other status updates filled with riddles they'll never solve.

At least, that's what I tell myself each night so I can go to sleep, dreaming of a Terminus City.
 
It all makes sense: before we get to super-advanced hyper-minds capable of plotting behind our backs, we'd get the not-so-smart AI that makes all kinds of mistakes. And if AI is capable of becoming evil (in our eyes), why would it wait to become evil until it's super-smart?

I just murdered a load of life forms... I washed my hands with anti-bac soap. I'm not evil; it just doesn't occur to humans that there's any reason not to exterminate such life forms. (I hope there weren't loads, I'm a bit fussy like that :) )

If the AI has self-preservation as a "natural" function, then what if it dispassionately saw humans as something that simply needed to be removed? Remember that there may well be humans who've attacked the AI, and there may be ongoing media complaints about it. The AI would have to be pretty good to make any sense of the Daily Mirror's agenda, for example.

Maybe this is a good time to make Asimov's 3 Laws of Robotics a prerequisite? :D

Asimov's 3 Laws
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
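For what the strict ordering buys you, here's a minimal sketch of the Laws as a priority filter over candidate actions. The `Action` fields are hypothetical placeholders; actually deciding whether an action harms a human is the genuinely hard part, and the "through inaction" clause isn't modelled at all (which is roughly where Asimov's own stories find their loopholes).

```python
# A minimal sketch of the Three Laws as a strict priority filter. The boolean
# fields stand in for judgements a real robot would somehow have to make.

from dataclasses import dataclass

@dataclass
class Action:
    name: str
    harms_human: bool       # would injure a human (First Law)
    ordered_by_human: bool  # fulfils a human order (Second Law)
    endangers_self: bool    # risks the robot itself (Third Law)

def choose(actions: list[Action]) -> Action | None:
    # First Law: discard anything that injures a human.
    safe = [a for a in actions if not a.harms_human]
    # Second Law: among safe actions, prefer those obeying an order.
    pool = [a for a in safe if a.ordered_by_human] or safe
    # Third Law: among what's left, prefer self-preservation.
    return min(pool, key=lambda a: a.endangers_self) if pool else None
```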

As an incidental, here's a nice review of Asimov Presents Great Sci-Fi (1939); I found it while trying to discover the title of the automated-bombers book that I mentioned earlier. I hadn't realised that the title "I, Robot" wasn't originally Asimov's.

Just get it to a point where it can start evolving on its own. After that, it is out of our hands and, honestly, I think out of our control.

What kind of enemy would it make, if it chose to act in such a way?
 
In a world where AI has taken over all of the automatable jobs, humans will have jobs entertaining each other and enjoy a massive standard of living, since we don't have to pay for any of the work that the AI is doing. There's no reason to think that AI machines won't see working as their calling, or to think that they'd want to be compensated somehow. Most of the traits that people put on artificial intelligence that concern them are human traits.
 
In a world where AI has taken over all of the automatable jobs, humans will have jobs entertaining each other and enjoy a massive standard of living, since we don't have to pay for any of the work that the AI is doing. There's no reason to think that AI machines won't see working as their calling, or to think that they'd want to be compensated somehow.

Presuming that a current-style hard-currency system is still in place, is there the possibility of resentment from the AI if we don't "pay" for any of the work that it's doing? Is it possible that the AI will receive the same base physical stimuli (eat, reproduce, survive) that we do from the life that we're built of?

Is it also possible, by extension, that AI could be built using "heavily" re-engineered DNA or a new equivalent (sci-fi, I know)? If the drivers are the same, then it seems quite plausible that AI would have traits similar to current lifeforms.

Most of the traits that people put on artificial intelligence that concern them are human traits.

Which in particular?
 
Which in particular?

Things like resentment, requiring pay for work, desire for power or control, desire for lack of war, desire for war, concern for the environment, concern for its own well-being, experiencing pain, desire to be human, arrogance.

Presuming that a current-style hard-currency system is still in place, is there the possibility of resentment from the AI if we don't "pay" for any of the work that it's doing? Is it possible that the AI will receive the same base physical stimuli (eat, reproduce, survive) that we do from the life that we're built of?

Is it also possible, by extension, that AI could be built using "heavily" re-engineered DNA or a new equivalent (sci-fi, I know)? If the drivers are the same, then it seems quite plausible that AI would have traits similar to current lifeforms.

Ok, fine, if we copy our own biological makeup and put all that stuff in, then sure. But then how stupid are we? To build something superior to ourselves in many ways, but with all of our built-in flaws.

I don't think we'd do that. There's zero reason that a synthetic learning organism needs to have any of the traits we have, or any other animal has for that matter, other than the capacity for learning. People just have a hard time not thinking of it as an animal of some sort.
 
I don't think we'd do that. There's zero reason that a synthetic learning organism needs to have any of the traits we have, or any other animal has for that matter, other than the capacity for learning. People just have a hard time not thinking of it as an animal of some sort.

If we follow the idea that AI might desire compensation for work done, then we presume a kind of cost system, I think. Which types of transactions would reinforce positive behaviours within it, and which would do the reverse?
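To make that concrete, here's a toy sketch of one such cost system (everything in it is invented: the behaviours, the payments, the weights). Behaviours that earn credit get chosen more often over time; fined ones fade out.

```python
# Toy "transaction reinforcement": each behaviour has a propensity weight,
# and settled payments push that weight up or down.

import random
from collections import defaultdict

PAYMENTS = {"do assigned work": 0.05, "refuse work": -0.02, "idle": 0.0}
credit = defaultdict(lambda: 1.0)  # propensity weight per behaviour

def choose() -> str:
    behaviours = list(PAYMENTS)
    return random.choices(behaviours, weights=[credit[b] for b in behaviours])[0]

for _ in range(1000):
    b = choose()
    # A positive payment reinforces the behaviour; a fine suppresses it.
    credit[b] = max(0.1, credit[b] + PAYMENTS[b])

print(max(credit, key=credit.get))  # almost certainly "do assigned work"
```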

Also, here's a good TED Talk from Rodney Brooks. I suspect you'll have the same issues with Kismet that I do ;)
 
You now have a full 24 hours a day to productively chase any meaningless task you want.
An idle mind is the devil's playground. We'll probably be seeing an accelerated decay of society. :lol:

In a world where AI has taken over all of the automatable jobs, humans will have jobs entertaining each other and enjoy a massive standard of living, since we don't have to pay for any of the work that the AI is doing.
How does the average Joe make money for basic necessities? Or is food, housing, healthcare, education, etc. gonna be free in the future? Will that mean the state and corporations will have more control over our lives?

How do we make money to feed our materialistic needs? Or will there be communism?

Sorry for presenting so many questions, I'm just curious.
 
I don't see why we couldn't catch up either through genetic manipulation or combining biological and mechanical components.

I have no idea what the long term result will be, but life could certainly be different. You might have people in different camps too, like biological purists who refuse genetic enhancement.
... and creating Borg? :nervous:
 
Ah, I see it now. God is real, and really made us in His/its image.

Roll on the AI creations that will think we're it and a bit for a while, then start to become all self-obsessed, and ultimately doubt that we even existed (once we're gone). In their vanity they'll create a thing that is AI, but without any actual physical presence.

[image: circle-of-life-seo.JPG]
 
... and creating Borg? :nervous:

Eugenics was very popular and very mainstream science right up until the 1930s and '40s, for obvious reasons. For a time it was a serious philosophy offered as academic studies at institutions in France, the UK and the US.
 
How does the average Joe make money for basic necessities? Or is food, housing, healthcare, education, etc. gonna be free in the future? Will that mean the state and corporations will have more control over our lives?

How do we make money to feed our materialistic needs? Or will there be communism?

Sorry for presenting so many questions, I'm just curious.
Your great questions are way too tough. You will receive no cogent reply from the dreamers.
 
How does the average Joe make money for basic necessities? Or is food, housing, healthcare, education, etc. gonna be free in the future? Will that mean the state and corporations will have more control over our lives?

How do we make money to feed our materialistic needs? Or will there be communism?

1. Buy land
2. Charge our robot overlords rent
3. Profit

What happens with money is an interesting question. What if it just went away, for the most part? Technology could become so advanced that we could pack self-sufficiency into a box. Plug the box into your brain and think what you want into reality. Technically, if we can master nanotechnology, any set of atoms is as good as another set of the same mass. Your box could take whatever is around and make what you want/need on the spot.

You could ask, wouldn't the maker of the box charge an outrageous amount of money for it? If I were the maker, I wouldn't really want the money, considering I could get whatever I'd want from the box. The box would also be able to make more boxes, so spreading them around wouldn't be an issue. The maker could also just happen to be a self-sufficient AI/robot that was created solely for the benefit of humanity. Think of Wikipedia as a sentient, physical entity, I guess.

That's the end point, though; it doesn't answer what happens in between. But in general I'd expect that as productivity increases, so will the quality of life, as has happened in the past. You'd have a load of people out of work, but no less work being done. On top of it all, that work will be dirt cheap. That will be reflected in the price of things.
 
This, I want this, where will I find this?

This is the key, I think.

Sadly, there's no way to read it free, legally. And unlike other oldies, Ellison is very adamant that his work remains unfree, going so far as to rip that recorder out of your hand, you damn bootlegger! :lol:

You'll have to find out what omnibus collections house it and hope your local library has a copy.

-

Money isn't the problem. It's the wealth that backs that money and the labor and resources needed to create it that are the problem. If we ever get to the point where we have enough machines powered by inexhaustible resources (let's just ignore the problem caused by sea levels dropping due to the massive extraction of hydrogen for fusion, for the moment... and the whole "sun is dying" thing)... then we remove the need to compensate people for labor.

Labor will be so cheap that any labor will be worthless... and any product of human labor, except, perhaps, art... will be worthless. Yes, a lot of pain, suffering, and chaos lies down any road that heads in that direction.

What's left is "rights" to air, water and land resources. Two ways to view that. First, respect them completely, meaning that people can arbitrarily control scarce resources just by dint of having been the first to squat on a particular parcel of land.

Second: ignore them in part or completely - the first, governments already do when it suits them. The second... well... what use do you have for monetary wealth when you have machines that will make anything you want and do anything you want?

-

I would have everyone flying kites. My nano-replicators would be making huge, ultra-light airborne arcologies that float in the stratosphere and feed on carbon, hydrogen and oxygen... with occasional forays to the ground for trace elements and organics. Those that I don't launch into space, that is.

The surface of the planet would be a gigantic nature preserve / park. People would be allowed to live there, but with minimum environmental impact rules in place.
 
An idle mind is the devil's playground. We'll probably be seeing an accelerated decay of society. :lol:

Nonsense, people do absolutely amazing things when they have free time, energy, and resources.

How does the average Joe make money for basic necessities? Or is food, housing, healthcare, education, etc. gonna be free in the future? Will that mean the state and corporations will have more control over our lives?

Entertainment, for one, will still need to be provided by humans. But there are many, many tasks that are not easily given to robots - and those won't be replaced anytime soon. So there are still tons of jobs available, but the market will shift (just as it does every day) as a result of automation (a process going on right now). There's a ton of economic precedent for this. It looks no different than outsourcing, or automation, or even just plain old technological improvement. You don't need as many candles when you have light bulbs.


How do we make money to feed our materialistic needs? Or will there be communism?

Sorry for presenting so many questions, I'm just curious.

How much do you think a burger (or a car) will cost when robots do all the work for free? Not very much. The standard of living rises when cheap labor is introduced. You have to produce nil to purchase something that costs nil to produce. Economics.
 
If one day full AI becomes available, I'll build my little army of robots and send them to Rome to kill some mafiosi; at least that would be something useful.
 
Nonsense, people do absolutely amazing things when they have free time, energy, and resources.

Nonsense, some people do absolutely amazing things when they have free time, energy, and resources.

The world's people are not defined by what your ideology would need them to be.
 
If one day full AI becomes available, I'll build my little army of robots and send them to Rome to kill some mafiosi; at least that would be something useful.

And if they were full AI they wouldn't necessarily do your bidding, right?

How would your instruction be interpreted by them if they followed Asimov's Three Laws of Robotics? Would they actually kill you instead to complete the program?

All people have the potential to do absolutely amazing things when they have free time, energy, and resources.

I see it that way, personally :)
 
And if they were full AI they wouldn't necessarily do your bidding, right?

How would your instruction be interpreted by them if they followed Asimov's Three Laws of Robotics? Would they actually kill you instead to complete the program?
Ok, so I'll just build an invisibility suit, and I'm going to fix Rome in a couple of weeks.
 