Feb 26

I actually got heartbroken over AI art for a D&D campaign. I was so excited about a new campaign that I offered to draw everyone's characters for tokens and such. I started gathering everyone's references, and the DM flat out said, 'Don't worry, I got them all done with AI so you don't waste your time.'

That was my first red flag for that campaign. I'm no longer a part of it.

It's generally pretty garbage. Like, it's just ugly, it uses a lot of power, which hurts the environment, and it's filled the internet with trash. I've been on sites where it's just all AI art, and it's always ugly as heck.

On a more serious note, there is this thing that bothers me and that proponents of A.I. seem to ignore. Whenever someone points out that A.I. art is kinda shit, they always argue that right now it is imperfect, but it is only going to get better from here on out, and I don't believe it.

Yes, progress in recent years has been staggering, but it is a logical fallacy to assume that progress is constant and linear. Just take smartphones: yes, they technically get better every year, but there has been no real innovation in years. My first smartphone was my dad's old Galaxy S1 from 2010, and in terms of basic functions, it is the same as my current one. The point is, A.I. might not get better; it might just stagnate or even get worse.

A.I. advocates don't seem to understand the basics of what they proselytize. What is commonly called A.I. is actually a large language model. It is not what we imagine when we say A.I. (Skynet, GLaDOS, HAL 9000), but a more advanced, complicated version of typing something on your smartphone and having word suggestions appear at the top of the keyboard. These models take in large amounts of data, analyse it for trends, and spit out something they think the end user wants based on the previous input. That kind of tool can be useful in limited circumstances, but blindly applying it to everything might not work due to the inherent limits of the tool.

Simple next-word prediction works well (usually) because words and letters are easy for a computer to represent, and the structure of language makes patterns easy to notice. Things get more complicated when you take the same tool and ask it to do something harder.
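To make the keyboard-suggestion analogy concrete, here's a toy next-word predictor in Python. Everything here (the sample text, the bigram counting) is made up purely for illustration; real language models are vastly more sophisticated, but the "predict what usually comes next" idea is the same:

```python
from collections import Counter, defaultdict

# Toy "next word" predictor: count which word follows which in the
# training text, then always suggest the most frequent follower.
training_text = "the cat sat on the mat the cat ate the fish"
words = training_text.split()

followers = defaultdict(Counter)
for current, nxt in zip(words, words[1:]):
    followers[current][nxt] += 1

def suggest(word):
    # Return the most common word seen after `word`, like a keyboard suggestion.
    counts = followers[word]
    return counts.most_common(1)[0][0] if counts else None

print(suggest("the"))  # "cat", because "cat" follows "the" most often here
```

Note it has no idea what a cat is; it just notices a pattern in the data and repeats it.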

Image generation is a lot harder because there are a lot more data points, and the human brain is a lot better at identifying when something is "off". Take the example of hands: for a long time, image models had difficulty reproducing hands because they produce things that are close to the average without understanding what that average is. Objectively, human hands have on average slightly fewer than 5 fingers (some people are missing one), so the computer will spit out an image with something like 4.9 fingers on a hand, because it lacks a basic understanding of why the average is less than five. Take the same problem, multiply it by hands in every pose at every angle, and it becomes understandable why image generators had trouble creating hands for so long.
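The "4.9 fingers" point is just arithmetic, and a toy calculation shows it (the numbers below are invented for illustration, not real statistics):

```python
# Hypothetical finger counts across 100 photographed hands:
# most show five fingers, a few show four (injury, occlusion, etc.).
finger_counts = [5] * 95 + [4] * 5

average = sum(finger_counts) / len(finger_counts)
print(average)  # 4.95
```

A human knows the average is below five only because a few hands are missing a finger, never because any hand has a fractional one. A model that blindly targets the average has no way to know that.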

The way it was eventually fixed was with the amount of training data. There are a lot of images on the internet that feature at least one human hand, so models are now more competent at producing images of hands. The problem is that this does not fix the underlying issue: the tool does not understand what it produces, so it can only properly generate things for which it has sufficient training data.
There are a lot of things far more complicated than hands, with far less training data available, so I doubt generative models will ever reach the point where everything is right.

There is a common rule of thumb that the last 20% takes exponentially more work than the first 80%. A.I. companies have effectively used the entire internet, the collected sum of human knowledge and creation, to reach the 80% mark, which means they are running out of new training data to finish the last 20%. And all of that does not include the risk that the training data gets corrupted by shitty A.I. art and starts inbreeding with itself like the Habsburgs, or gets deliberately sabotaged by disgruntled artists/employees or rival governments/companies.
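The "inbreeding" worry can be sketched with a deliberately extreme toy: imagine a "model" so crude that all it can output is the average of its training data, and each generation is trained on the previous generation's outputs. This is a caricature, not how real systems work, but it shows how diversity can drain out of the data:

```python
# Deliberately crude "model" that can only reproduce the average
# of whatever it was trained on.
def train_and_generate(data, n_outputs):
    avg = sum(data) / len(data)
    return [avg] * n_outputs

data = [1.0, 2.0, 3.0, 10.0]  # varied, human-made "originals"
for generation in range(3):
    data = train_and_generate(data, 4)  # each generation trains on the last

print(data)  # [4.0, 4.0, 4.0, 4.0] -- all variety is gone after one pass
```

Real generative models lose diversity far more slowly than this caricature, but the direction of the problem is the same: training on your own outputs regresses toward the average.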

That is why I’m skeptical that “A.I.” will truly reach the potential its advocates insist it has.

Agreed. There is a permanent limitation in that prompters still don't have any taste or skill no matter what the quality of the output is. They can't distinguish good work from bad, and they don't have the skills to spot and fix any specific problems they might have. No amount of machine learning can fix that.

I hate how every time I try to search for photos of real-life animals, Google keeps showing me some horrid AI images.

The emergence of AI has seriously worsened the problem of fake news and fake photos.

In the past, creating fake photos required a certain level of Photoshop skills, but now anyone can easily produce fake photos.

Although many AI-generated images are obviously fake at first glance, for many people online, these fake photos are already convincing enough that they cannot distinguish them from real ones.:weary:

A.I. generated images are attempted murder on the human spirit. An attempt to kill art. It will not work. But it still scares me

I know! Science Fiction has shown us for decades what happens if we make AI, but still we continue with it! The human race is doomed!!!

The human race has only got to last another 30-40 years or so (which would put me between 83 and 93 years old)

I hate it. A.I. isn't art, it is a copy-paste generator using other people's hard work. It is a tool so the rich can be even richer without having to rely on people.

It's soulless, and ugly.

Guys, Gen AI isn't skynet, that's giving it waaay too much credit for what it actually is.

They may call it AI, but it is anything but intelligent. There's no thought, no reason behind the calculations. All it is is an extensive list of labeled inputs and an algorithm that can make an aggregate output based on them. It doesn't understand what it's doing the way a human does.

These things aren't suddenly going to "wake up" and decide to launch all the world's nukes. They're single-use machines that can only perform their one task. Pretending otherwise plays into the marketing spiel of these tech companies, who can then pretend they have something more on their hands than just a plagiarism machine.

Everything starts somewhere... a logically thinking, powerful machine will definitely think the human species is awful.

But on more practical concerns, the technology is always ahead of the ethics, and no one wants to slow down and think about them because it's a capitalist race. Like you're referring to, AI has the ability to plagiarise every piece of art produced before someone puts in laws to protect creators.
Or, without getting too heavily into social politics, there are the prejudices that have been found in face-recognition technologies, obviously programmed in from the society they're created in.


I can kinda accept it, if you need an idea and only use it for inspiration. It's still not good, because you detrain your brain from being creative... But okay.

Other than that, no thanks! My little sis experiments with AI-generated music, and yes, she puts a ton of work into it to get the right outcome... Still, it sounds so lifeless, I don't like to listen to it.

But what makes me angry is when people sell AI-generated stuff at little arts-and-crafts markets... Gosh, I was so pissed when I figured out I'd bought AI stuff! I GO to such markets SPECIFICALLY to help artists, not to buy lifeless art!

This :100:
I recently watched a video on why AI image generators currently can't show a wine glass filled to the rim: the model only knows what its training data shows it.

I think we need to sit this out. At some point AI is going to poison itself. But we will see.

Don't know if you people are aware of how AI art is actually created. For example, if the engine needs to draw a curve, it simply resorts to a mathematical function to do it. Computers can't draw at all; all you see is a complex set of analytic functions trying to mimic a form. It's common to see fractals and repeated patterns in that type of "ART". It's nothing but an illusion, it ain't art at all.
Real art is all about randomness and manual strokes, and computers totally suck at it.

Eeeh, I highly doubt it. It's not machines ruling over us that we should fear, rather the ones already in power using machines to make their control more absolute.

Why though? Why is it logical to think the human race is awful? There's nothing objective about it; even "logic" itself is not something that exists naturally in nature, it's a human invention. So a machine that would 'logically' conclude the human race is awful will do so based on the logic of the person or people that made it in the first place. And yeah, there are plenty of people out there (including the most powerful amongst us) that think humanity is awful.

So again, it's not the machines that need to be feared, but the people that wield them against us.

Pretty much this.

The way I see it, the dystopia we are potentially barreling towards is not that of Terminator, but a cyberpunk one.

I do get confused when some people seem to put a ton of work into prompt-engineering the AI to produce a desired outcome. That kind of defeats the supposed "convenience" the AI is supposed to bring, no? No one's forcing you to produce the result using AI. At that point, it might be a better use of time to just learn the skill yourself.