Full AI - The End of Humanity?

Seeing the film Her got me thinking about AI again.

The world's got a lot of lonely people, and the internet in general multiplies that. On the one hand, it can help by giving someone an ear to talk to; on the other hand, it's pretty dangerous, especially for people with a mental illness.

Remember how zombie films were all the rage in the early to mid 2000s? Expect AI to be this decade's cinematic schlock. :lol:
 
I followed a link from the above article to one more directly focused on AI. While I understand concerns about job security, I really disagree with some of the positions taken against AI. Some quotes from the article:

"AI can’t write or rewrite literary material; can’t be used as source material; and MBA-covered material can’t be used to train AI."

Outside of the last one, I can obviously see why the AMPTP is pushing back. AI can provide the first two things faster and cheaper. This can be good for the writers. I won't pretend to know how things work in Hollywood, but I assume more shows means a bigger need for writers and editors. It seems like instead of adapting to the technology, they are trying to suppress it.

There was also this:

"With streamers' addiction to paying minimums, we can clearly see a future in which the first drafts are done by AI to circumvent the cost of a real first draft — leaving all of the creative heavy lifting to be done by writers for bottom dollar. Are the studios pursuing this? Not to my knowledge at the moment. But the fact that they did not flat out agree to it when we asked them not to is extraordinarily revelatory. If you ask the person you’re dating not to cheat on you and they refuse to promise not to, that’s what we call a big red flag."

The financial concerns are clear, but what does this have to do with cheating on people? The WGA is basically saying it's unfair to make use of advances in technology. Such a bizarre stance to take.

I'm not part of the discussion, so I freely admit that I might be missing details, but from what I am seeing I think the WGA is leaning too much toward complacency. They're not trying to work with AI for their own benefit. They're trying to keep the status quo out of fear.
 
"With streamers' addiction to paying minimums, we can clearly see a future in which the first drafts are done by AI to circumvent the cost of a real first draft — leaving all of the creative heavy lifting to be done by writers for bottom dollar. Are the studios pursuing this? Not to my knowledge at the moment. But the fact that they did not flat out agree to it when we asked them not to is extraordinarily revelatory. If you ask the person you’re dating not to cheat on you and they refuse to promise not to, that’s what we call a big red flag."

Financial concerns clear, but what does this have to do with cheating on people? WGA is basically saying it's unfair to make use of advances in technology. Such a bizarre stance to take.

I'm not part of the discussion, so I freely admit that I might be missing details, but from what I am seeing I think the WGA is leaning too much toward complacency. They're not trying to work with AI for their own benefit. They're trying to keep the status quo out of fear.
The reason they say "bottom dollar" for a rewrite is that, based on my understanding, writers typically do not get paid as much to "punch up" an existing script as they would to create something from scratch. They are concerned that they will be handed AI generated garbage and told to "punch it up" for rewrite pay, and have to do all of the work they typically would do anyway. It does seem like a legitimate arguing point.

It also seems to be easy to work around. The parties could stipulate that AI generated work does not constitute a draft that needs a "rewrite" or "punch up".
 
I agree: if nothing is changing for them, it makes sense that their pay stays the same. The problem here is less AI and more how compensation is handled. If anything can be used as a first pass regardless of quality, and modifications to it count as rewrites no matter how much work the rewrite takes, something is off. I'd be trying to fix that.
Yeah, there are a lot of options that don't involve barring AI.
 
For writers, sure. But actors? (The AI issue with SAG-AFTRA involves Hollywood wanting to scan the actors and keeping the rights to the scan for future AI use.)
 
Actors should definitely have control over their likeness. AI can fill roles in much the same way that stunt doubles or CGI animations do.
 
OpenAI just released DALL-E 3


Like previous versions, we’ve taken steps to limit DALL·E 3’s ability to generate violent, adult, or hateful content.

DALL·E 3 has mitigations to decline requests that ask for a public figure by name. We improved safety performance in risk areas like generation of public figures and harmful biases related to visual over/under-representation, in partnership with red teamers—domain experts who stress-test the model—to help inform our risk assessment and mitigation efforts in areas like propaganda and misinformation.

We’re also researching the best ways to help people identify when an image was created with AI. We’re experimenting with a provenance classifier—a new internal tool that can help us identify whether or not an image was generated by DALL·E 3—and hope to use this tool to better understand the ways generated images might be used. We’ll share more soon.

DALL·E 3 is designed to decline requests that ask for an image in the style of a living artist. Creators can now also opt their images out from training of our future image generation models.


It seems to have improved quite a bit, especially with text. Some examples from their site:

[Four example images attached]
 
I suspect unscrupulous parties will hack the software to remove the provenance features, but hopefully they'll find it harder to get around living creators (and ones like Jimi Hendrix, whose estate is actively managed) refusing to have their work used for training.
 
I find this technology, and where it's heading, very interesting. I don't imagine we're too far away from being able to create a 3D space in the same manner, one you can access in VR. The next step would be anybody with the right interface being able to create custom gaming environments with simple prompts - something not too far removed from Star Trek's holodecks. Where this technology is now was barely imaginable 15 years ago, so in another 15 or 20 years…
 
One issue that is already becoming apparent with AI is that it ain't going to be cheap to use.

ChatGPT Plus (GPT-4) is $20/mo; Midjourney ranges from $10 a month for a very limited package to $120/mo, plus $4/hr for extra GPU time (and there is no free trial on Midjourney any more); hosted Stable Diffusion plans run $9 / $49 / $149/mo, again with very limited use on the lower-end subscriptions.

Depending on who is using it and for what, prices could rise very significantly to the point where higher end functionality won't be affordable to the average person.

-

I've spent some time this morning exploring options for free AI generators, and Bing seems to be the best option at the moment.

Bing Chat is apparently powered by GPT-4, and also lets you use DALL-E 3 for free, albeit only 25 times per month without wait times. I've been playing with it for work (honestly!) this morning and it is quite amazing...

--

"Researcher speaking to a team of consultants to discuss ideas in a meeting called an Opportunity Audit":

[image attached]

... and a flyer based on it:

[image attached]

And another flyer:

[image attached]

A new logo for our team:

[image attached]

"a montage to summarise 'Health & Wellbeing'"

[image attached]


--

"A GT2 car in a forest"

[image attached]


This last one is mind-blowing, given that the prompt above was all it had to work with.
 
[Five images attached]
Bing is quite good at getting results quickly and easily without the added costs involved.

A few bases around a global map made using race cars.

The last being my favourite.
 
Hey @Sprite, please add your artwork to this thread: https://www.gtplanet.net/forum/threads/share-your-ai-art.418486/#post-14108041

I like the 4th one a lot!

Bing is great for playing with, and I also think it is a seriously valuable tool for work as well.

Just playing with it for a few hours has really triggered my imagination, which is highly ironic because I thought that AI image generation was all about letting AI be creative for you. That may be a risk for some, but for me it's currently like being a kid in a candy store.

-

I'm finding that most prompts that involve parts of the human body are being blocked... I can understand why, and I guess the reason is fairly self-evident, but it does make it quite difficult to create realistic images that involve people.
 
Last edited:
I thought that AI image generation was all about letting AI be creative for you. That may be a risk for some, but for me it's currently like being a kid in a candy store.
It really depends on what you want. I'm excited about AI not because I want it to think for me, but because I want it to work for me. I want to be the one in control and coming up with the ideas, while AI is the one that actually creates the end product. Even then, I can't deny the benefits for creativity. An AI can potentially study and take input from many, many times more images (or whatever it is one is trying to make) than I can, and avoid my own personal leanings, helping me consider ideas I might not have come up with on my own. That is then multiplied by its ability to produce outputs far faster than I can. In the end, maybe it isn't that surprising that it can boost one's own originality.
 

Or replace it.

I see the death of catalog photography, or at least see it relegated to the valueless level of stock work. Also, marketing firms can probably fire half the staff. Maybe 3/4. While we're cleaving things, banks and the like can probably fire half as well. Which is fine. Those people haven't really done much for years. More like middle-class welfare than an occupation. I'm sure insurance can do the same with their cubicle dwellers.

It will be interesting to see how this shakes out in about a decade.
 
Generic mass-produced content made by humans may end up going extinct, yes. I'm not sure that is a huge loss. It would allow people to focus on bigger, better projects, and with AI doing the heavy lifting, small niches that were not sustainable enough to be catered to may thrive.

The future is uncertain though, so I agree that we will still have to wait to get a better idea of where we're headed.
 
I mostly want to focus on creative work as far as AI is concerned.

All my life I have been a creative person. When I see all the AI art out there, I have always felt that making something through AI takes away from the originality, and from the achievement, of making something look that impressive yourself. It is sort of the reason why I stay away from filters for art or stories. This is also why I stayed away from AI... until recently.

I am starting to come around to the mindset that AI should HELP you and not REPLACE you, at least in regards to creative work. Even the most throwaway creative work done by a person can bear more originality and uniqueness than something impressive produced with AI. For example, anyone with any decent ability to 3D model can make a simple house. AI could make that simple house look like it should sell for north of $500K USD. Could you 3D model a house to look as impressive as an AI interpretation? Absolutely. However, what if you don't have those skills or the patience to make something impressive? That is where AI can help.

I guess my problem with AI in creative work is that it makes you feel bad that you can't make quality material on your own without the aid of computer technology. Where I am starting to think differently is that you still provide your own idea, and then the AI generator turns your idea(s) from simple to advanced. You are still supplying the ideas; AI just tries to make the simple look impressive. I am kind of thinking of using AI for concept design. I don't have the skills to make impressive digital art or 3D models, so I will look to AI to make the most true-to-life concepts possible.

Again- I am slowly changing my views regarding AI in creative work. Or at least, trying to...
 
Like many things, it can be used as a tool to help a skilled tradesperson do better work or to do the same work faster than they otherwise could. Or it can be abused by someone with no knowledge to produce a barely passable result.

If you're an artist simply looking to create something that suits your vision, there is no problem here. It's another tool that may or may not help you.

The problem mainly lies in the fact that it potentially automates the skilled work of a decently sized group of people, removing their ability to comfortably support themselves doing something they probably enjoy. But that's a problem with society. It has happened to other professions before, it will happen to other professions after. Throwing away this potentially useful thing isn't the solution any more than throwing clogs into steam powered looms was a solution. Fixing the system so that creatives aren't dependent on rigid control of the supply and "ownership" of art in order to survive seems much more sensible to me.
 
As a blogger, I have seen AI as possibly a way to generate some generic images to illustrate a certain topic without needing to hotlink images from other sites. Like, if I was to blog about car care, I could come up with a simple AI-generated image of a car being worked on to set the mood for the blog post. It's a safe way to get in some pictures without relying on other sites. You may need to run one prompt a few times to get a result you actually like, but I can see some guilt-free uses for AI.
 
Like many things, it can be used as a tool to help a skilled tradesperson do better work or to do the same work faster than they otherwise could. Or it can be abused by someone with no knowledge to produce a barely passable result.

If you're an artist simply looking to create something that suits your vision, there is no problem here. It's another tool that may or may not help you.

The problem mainly lies in the fact that it potentially automates the skilled work of a decently sized group of people, removing their ability to comfortably support themselves doing something they probably enjoy. But that's a problem with society. It has happened to other professions before, it will happen to other professions after. Throwing away this potentially useful thing isn't the solution any more than throwing clogs into steam powered looms was a solution. Fixing the system so that creatives aren't dependent on rigid control of the supply and "ownership" of art in order to survive seems much more sensible to me.

I actually agree with this. AI seems like it will do more good than harm. Humans from the dawn of civilisation have made mistakes: holy wars, cold-blooded murder. If anything, technological progress is what will help the human condition most in the long run, even if we suffer in the short run.
 
I recently held a series of meetings between senior academics and a team of consultants that I hired.

After the meetings, the consultants are supposed to write a detailed report, including recommendations and advice on a whole bunch of things, ranging from policy development to commercialisation of scientific research.

My boss has been clear that the consultants are to write the reports together. They get about 3 hours per report each, and will be charging around £100 an hour, so the whole report will cost us around £5000 to get done. My boss has also made it very clear that I am not to work on the report until the paid consultants have written theirs, but the consultants have asked me to summarise the presentations given by the senior academics before they get started.

I think this makes sense - but I can also see my boss's side. Even writing a summary of each half-hour presentation is a huge amount of work; it would take about 3 hours per person, or a total of around 24 hours, and that would mean me working on it solidly for at least 3 full days.

The solution: ChatGPT.

I took the text transcripts from Zoom and asked ChatGPT to write a one-page summary of each. I've not read them fully yet, but from what I can see, they are excellent and exactly what I wanted - something that is good enough to be used as a guide, but didn't take me a week of work to do. In total, between editing the transcripts, feeding them to ChatGPT and presenting them in one report, it's taken me about one hour to produce a 3,000-word summary of the entire series of meetings.
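For anyone who wants to reproduce this kind of workflow programmatically rather than by pasting transcripts into the chat window, a minimal sketch might look something like the following. To be clear, this is an illustration only, not the exact process described above: it assumes the `openai` Python package (v1.x) with an API key in the environment, and the model name, word target, `transcripts/` folder and `summarise_transcript` helper are placeholders.

```python
# Minimal sketch of scripting the transcript-summarisation workflow above.
# Assumes the `openai` Python package (v1.x) and OPENAI_API_KEY set in the
# environment. Model name, word target and folder layout are illustrative.
from pathlib import Path

from openai import OpenAI

client = OpenAI()


def summarise_transcript(path: Path, words: int = 500) -> str:
    """Return a roughly one-page summary of a single meeting transcript."""
    transcript = path.read_text(encoding="utf-8")
    # Note: very long transcripts may exceed the model's context window and
    # would need to be chunked and summarised in stages first.
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {
                "role": "system",
                "content": "You summarise meeting transcripts for internal reports.",
            },
            {
                "role": "user",
                "content": (
                    f"Summarise the following presentation transcript in about "
                    f"{words} words, focusing on key points and recommendations:\n\n"
                    f"{transcript}"
                ),
            },
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    # One edited Zoom transcript per file; print a combined report to stdout.
    for transcript_file in sorted(Path("transcripts").glob("*.txt")):
        print(f"=== {transcript_file.stem} ===\n")
        print(summarise_transcript(transcript_file))
        print()
```

Summarising one transcript per request keeps each call comfortably within the model's context window and makes it easy to check each summary against its source before anything goes into the final report.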
 
Oddly relevant.

You're aware that one of the top-ranked academic leaders in America had their career destroyed recently because AI identified some otherwise-irrelevant and completely overlooked plagiarism mistakes. Mistakes that any high school kid could make and probably has.

I think it's interesting that you've found an easy way to do your work but I can't get over the fact that you didn't do the work. I can't wait until somebody argues during a plagiarism trial that no plagiarism occurred because the author didn't actually write it, AI did, and therefore the mistake wasn't made by the writer. AI giveth and AI taketh away - it's smart enough to allow you to create good things quickly but it's also smart enough to destroy your career.

I think it's an absolutely terrible idea to be using such tech before it becomes part of your company's or your industry's SOP. Until it's officially sanctioned I wouldn't touch it. Technology that even its own experts don't fully understand has just as much potential to be dangerous as it does helpful, but this technology seems to take both of those possibilities to the extreme.

Apparently British judges are now allowed to use AI to create official judicial opinions? Have you heard about this? So now we've got an entire country's worth of constituents voting for elected officials who appoint judges who don't even write their own decisions. Incredible. Do you want computers to define the society you live in?
 
Surely human oversight is relevant here? It's not as if those judges just hit the generate button without reading the output and reviewing it for conformity to the rule of law.

Maybe I should credit Google Assistant for every time my Android keyboard corrects a tpyo or prunting error. Until the AIs start learning to fake those as well, lol.
 
I think it's interesting that you've found an easy way to do your work but I can't get over the fact that you didn't do the work.
Except, I did do the work. I set up the meetings, hired everyone involved, recorded the meetings with the consent of everyone who participated, and disseminated the unadulterated raw material to all participants to review in their own time/way.

I was requested to provide a brief summary of each meeting based on the transcripts, and I found an innovative way of doing it that saved me (and my employer) a week's time/wages for the same outcome.
I think it's an absolutely terrible idea to be using such tech before it becomes part of your company's or your industry's SOP. Until it's officially sanctioned I wouldn't touch it.
I'm glad, then, that my boss's boss was the one who first mentioned the idea of using ChatGPT to streamline certain tasks. It isn't part of our SOP, and I wouldn't use it to attempt to pass something off as an original piece of work, but in this case I used ChatGPT to summarise content that was my own intellectual property to begin with.
 
I think it's interesting that you've found an easy way to do your work but I can't get over the fact that you didn't do the work.
How much involvement does someone need for it to count as them doing the work? These days we have computers doing all the calculations for us that once would have had to be done by hand. We have machines helping us fabricate and transport things. We have machines for capturing images and video. You name it, there's a machine for it. Most of these at some point were probably considered "cheating", but now are normalised or even considered essential.

For example, who machines stuff by hand any more? For anything even moderately complex you get a CNC to do it, and the operator skill is in setting up the machine, providing it the data and materials, and checking the result afterwards to make sure it meets spec. Nobody considers that operating a CNC is somehow not doing the work.

Yet somehow writing and art is a bridge too far for a lot of people. It's okay to ask your secretary to write up a summary of a meeting, but asking your computer to do it is a problem?

I think the problem here is that people's idea of work is still very much tied to the idea of spending a lot of time doing something unpleasant. Without that, it doesn't feel like work to them. Whereas realistically if you're efficiently and effectively creating something that is useful then that is valuable work, and using efficient and effective tools in order to get the result you wanted just makes you smarter than the guy who spent 8 hours sitting down and doing it by hand.

Don't get me wrong, there's a lot of pleasure that can be derived from creating by hand and that will never go away. But if you're not interested in that and you're just trying to get to the end product, then getting there by the most efficient and effective method is not a negative thing. Nor is it not doing the work. The work got done, and you caused it to happen. It's just that some things that used to be difficult and time consuming are now kind of easy if you know what you're doing.
 
Those machines don't make decisions. They merely act on hard programming, that's it. The human expertise on the back end is in creating that programming, and on the front end is in managing it - making decisions on how best to utilize the machine. But sometimes they stop working, and every machinist friend I have can still fire up the old belt-driven Bridgeport and cut steel by hand if needed. The Bridgeport still works. It always works. This is proof that the humans still fully understand the what, why, and how of their machines' automation. Same for me in the plane. I understand its decision making thoroughly, but when it stops making decisions, my cables and accumulators help me keep making them.

AI isn't like that. Literal AI experts admit that they don't really understand what's going on, which is horrifying to say the least. If ChatGPT is simply scouring the internet for the next word or for formatting techniques then it's not AI, it's just a textbot. Fine. We know how textbots work, right? But people seem to want to take this in a direction that literally removes the decision making from the person, which from my perspective means the person isn't doing anything worth their money. Various forms of automation have already reduced highly educated and skilled humans to nothing but managers, but if we remove the management aspect then what are we, secretaries? Even less than that? And we'll get paid appropriately, I'm sure.

But you let me know when you'd like AI to operate the next flight you're on so I can get up and make me a coffee. It gets boring sitting up there watching the pretty colors go by.
 