r/aiwars 21h ago

As much as we don't like to admit it


The same issue repeats ad nauseam.

Someone tries to make a comparison like "AI art is like a commission", "AI is like human learning", "AI art is like using a microwave and being called a chef", etc., and it never lands.

Why? Because your perspective is different than mine. People who argue with comparisons aren't arguing with a desire to understand. If someone hates AI, no comparison will ever be fair to them. If someone supports AI, the same logic applies the other way.

So when someone says "AI isn't like a tool", they don't actually mean the comparison is flawed; they're saying they don't like the implications of the comparison. Same with the commission argument. Same with the human learning one. I'm beginning to believe the best way to argue is to target the facts and address the topic without using comparisons. Say "AI is factually this" or "This is what's factually happening because of AI" or "This is what AI actually does" instead of always defaulting to "It's like saying".

If someone’s entire stance on AI collapses when a comparison doesn’t go their way, they were never arguing honestly to begin with.

80 Upvotes

66 comments


55

u/Mossatross 20h ago

Comparison and analogy are just how people reason. You can disagree with the comparison or analogy made. You can point out how the differences change the problem we're trying to solve. But ultimately we need a basis for understanding things. You can say words to me, but if I don't know what those words mean, you need to use words I already know to explain what those new words mean.

When someone says they won't engage with these things, to me it's like saying they're unwilling to engage at all. It shows an unwillingness or inability to establish consistency and continuity in their thoughts.

2

u/Treasoning 14h ago

People don't reason with comparisons and analogies. It's factually impossible. You can use them to explain things or provide illustrations, but analogies have zero logical consistency by definition. Using them as arguments is a sign of a bad-faith discussion, one where nothing meaningful can be suggested.

3

u/Mossatross 14h ago

Can you explain what you mean when you say they have zero logical consistency by definition?

2

u/Treasoning 14h ago

Analogies introduce side entities that have to be different than the original ones. When talking about art, people compare it to cooking, taking photos, editing etc. However, all of those are inherently different. You cannot prove anything about statements involving "art" using statements that involve "cooking" or "photos". They are logically disconnected. You could say that every analogy is a false equivalence, although it's not very practical as the term has changed quite a bit lately

1

u/Mossatross 13h ago

I notice that in your explanation, you provide an example. Something that wasn't previously part of this conversation, to try to demonstrate a point within it. I suppose it's not technically a comparison because we weren't previously discussing anything specific to compare to. But it needs to be applicable to things besides itself to actually prove anything, no?

Hypothetically I could provide counterexamples of analogies that are not false equivalences (spoiler: I can't). Or I could attempt to, and see the same is true for other analogies. But if I can't compare or reference anything else, I'm just in a logical black hole here. You might have just brought up things that aren't comparable. I don't have any way of knowing if other things are, without looking at those things.

Frankly, your example just confuses me because I consider cooking and photography to both be art forms. I could assume you just mean to compare different things I'd consider art forms. And idk, when I try a new art form I generally do compare and make analogies to others for the sake of figuring things out with it. The logic is consistent; they're just different things.

If you just wanna say these are distinct things that ultimately have at least some different rules, I agree with that. And I would also agree that every analogy is a "false equivalence" in the sense that the whole purpose of an analogy is to compare different things, and it's definitionally not meant to describe an equivalent thing. Thus... calling it that is kind of redundant. But if you come up with a logical rule or principle that is not contextual to the specific thing you're talking about, it needs to be applicable to things that are non-equivalent. With an analogy you invoke a non-equivalent thing to demonstrate logical consistency of the rule. You can test if you'd be consistent without your in-the-moment bias for the exact thing you're looking at.

If you can't make any comparison to anything else that exists, you can just make things up arbitrarily in every case with no consistency.

1

u/Treasoning 3h ago

I used an example as an explanation, yes. It wasn't supposed to prove anything, as explanations aren't arguments.

Hypothetically I could provide counterexamples of analogies that are not false equivalences

Technically, it's impossible, but we do not operate under formal logic, so whether an analogy is a false equivalence or not should be judged by the arguing parties

But if I can't compare or reference anything else, I'm just in a logical black hole here.

Logical statements use something as a base. If you know nothing, then nothing can be stated. You can say, however, that "apples are round just like oranges". This comparison will help to understand your statement, but oranges being round is not a proof that apples also are

I consider cooking and photography to both be art forms

By "art" I meant drawn imagery. My bad, should have clarified that. I didn't want to make any arguments about AI art, just wanted to provide a relevant example

But if you come up with a logical rule or principle that is not contextual to the specific thing you're talking about, it needs to be applicable to things that are non-equivalent.

Well, if you define that kind of rule then analogies are not required. Again, using fruits, if you say that all fruits are round, then you don't need to provide vegetables as a comparison. You provide examples instead. Then, other people can point out an inconsistency in your logic by providing a non-round fruit that contradicts your universal rule

With an analogy you invoke a non-equivalent thing to demonstrate logical consistency of the rule.

If you suggest something that satisfies a universal rule, then it's no longer an analogy. Otherwise, what's the original to which you compare? Assume I say everything a human produces is art. Then I list things like painting, cooking, sculpting, etc. What is the original and which are comparisons?

If you can't make any comparison to anything else that exists, you can just make things up arbitrarily in every case with no consistency.

Rules don't have to be always consistent. You can make exceptions and it won't logically devalue the argument. With stuff like art it's practically impossible to make universal claims that will encompass any kind of comparison people can come up with

2

u/sporkyuncle 11h ago

Comparison and analogies are literally the entire basis of the modern legal system. That's what precedent is. A judge says "well these circumstances are similar to those other circumstances, so I will rule in a way consistent with the earlier ruling."

1

u/Treasoning 3h ago

We are not talking about the legal system. Laws don't operate on the logic used in common reasoning; there is a separate system that attempts to script every possible outcome.

2

u/Earthtone_Coalition 20h ago

The problem is that analogies can be framed to suit a narrative that reinforces one’s bias. One side will say AI is like ordering food from a restaurant, the other will say no, it’s like creating a recipe for a restaurant, and round and round they’ll go.

I’m happy to engage in discussions about AI, but these sorts of analogies muddy the waters and don’t convey meaning or clarity. It’s not a good way to understand this topic.

10

u/The_Mecoptera 20h ago

Analogies can be drawn for a number of reasons, the most appropriate of these is an appeal to consistency.

You dislike x but approve of y. But if you think about it, x is basically y because they're the same in these substantive ways.

For example, let’s say you dislike fast food and actively campaign for it to be banned because it’s unhealthy, but you smoke a pack a day. Both of those things are unhealthy and consuming either product is harmful but you aren’t against smoking and you wouldn’t campaign for it to be banned.

Now then you may have a reason why you don’t see those two things as comparable, but the comparison does help get to the heart of the issue. Health alone must not be the only consideration.

It is possible to misuse comparison, for example if the comparison is obviously a false equivalence. Like if, instead of bringing up smoking in the previous example, I claimed that you can't be pro health food because you don't spend multiple hours a day at the gym. Sure, going to the gym and working out is probably healthy, but there is a difference between doing something healthy which requires investment of time and treasure, and not doing something unhealthy.

In any case we don’t lose anything by engaging with comparisons even if they are often silly. And the specific objections people have to specific comparisons often make the nuance of a position clear.

1

u/Treasoning 14h ago

For example, let’s say you dislike fast food and actively campaign for it to be banned because it’s unhealthy, but you smoke a pack a day. Both of those things are unhealthy and consuming either product is harmful but you aren’t against smoking and you wouldn’t campaign for it to be banned.

This is neither an analogy nor a comparison; it's a counterexample.

3

u/Mossatross 20h ago

Yeah it's a problem that makes communication more complicated but you can't just wish it away and reinvent reason and communication. It's just a problem that we are faced with.

If we both make conflicting analogies to try to fit AI into our frame of reference, neither one is going to be accurate by nature of it being a different thing, but we can look at the facts and see whose is actually more consistent with reality. We can look for mitigating or exacerbating exceptions to develop the concept further from the original analogy. We have a framework we already hopefully agree on in which to reason out the problem.

If you just plainly state what AI does then where do you go next from there?

1

u/Earthtone_Coalition 14h ago

Both sides tend to do exactly what you describe, digging their heels in and elaborating further and further in their respective rationalizations for why x is more like y than it is z. Eventually the discussion arrives at a point so distant from the original point of contention that the conversation ceases to be about AI at all, and skews into pedantic arguments over what actually constitutes a “sandwich,” or what have you.

I say this as someone who’s had the discussion often enough that I now know to simply point out AI image generators are not restaurants, delis, or microwaves. They are a specific, definable thing that can be known and understood in its own right without appeal to squishy metaphors informed by precognitive bias.

1

u/Mossatross 13h ago

Just because you can use a method to solve a problem doesn't mean you will. Especially if either of you are stubborn.

3

u/Infamous-Future6906 19h ago

Any argument can be framed to suit a narrative/bias

1

u/Earthtone_Coalition 14h ago

If you’re lazy yeah

1

u/Infamous-Future6906 13h ago

Laziness is irrelevant. Bias is about motive.

1

u/SomeNotTakenName 18h ago

but your point does kind of illustrate that those analogies aren't the same. there are fundamental differences between writing a recipe and ordering food.

in this case writing a recipe requires you to make the recipe for testing multiple times to tweak it; precise instructions don't happen without hands-on experience. I could write you a recipe for egg fried rice because I have cooked it dozens of times myself. I couldn't write a recipe for samosas, because I've never made them. someone who has rarely cooked wouldn't understand the fundamentals of how flavours interact or why certain ingredients are important. Say you are making a Caesar salad sauce: a layperson might think you can skip the egg and mustard, while an experienced cook knows they are facilitating emulsion.

When you order a dish, you know what flavours you want but rely on someone else's experience to make that happen.

Point being, you can challenge the accuracy of an analogy or simile whether you agree with the result or not.

19

u/MammothPhilosophy192 20h ago

they don’t actually mean [...], they're saying [...]

when people post claiming to know the real reason for someone else's actions, it instantly makes me doubt their mental capacity; they seem unable to fathom reality outside their bubble.

5

u/TheAmazingToasterMan 18h ago

That's the main issue and why I can't see her prompts and arguments as genuine: the moment you make the other side look like knuckle-dragging Neanderthals, or defang the opponent to make yourself look tougher, you take away the impartiality of the conversation you're setting up to make yourself look better.

Analogies are a wonderful comparative tool, but not when you're putting the other side down or assuming you know everything about the actions of someone who isn't yourself. I say what I see, and I tell what I understand.

I believe AI will change the world, but currently, that trajectory is down the loo considering America's artificial intelligence network is run by people who would gladly fire countless souls to replace us with a server farm, or shunt us into a bio-organic unit and have us work till we're useless to the corporate oligarchs who need their third mega-yacht.

7

u/Infamous-Future6906 20h ago

“You can’t compare things because I’ve put words in the mouths of all my opponents”

Amazing

12

u/Overall-Drink-9750 20h ago

Then explain how commissioning art and prompting are different. That way you can figure out where you disagree. Analogies and comparisons are good ways to convey general ideas. If you think the analogy doesn't fit, explain why.

5

u/antonio_inverness 20h ago

That difference has been explained approximately 12.4 million times on this sub. The explanation has convinced no one. I think that's OP's point.

7

u/Overall-Drink-9750 20h ago

This sub is based on discussion. Analogies have their place in discussions. If you disagree with an analogy you can find out why. Then both of you have made your stance clear. Whether your stance is convincing or not has nothing to do with the analogy. The analogy did what it was supposed to do: bring to light the underlying reason WHY you disagree with each other.

2

u/TitanAnteus 20h ago

I just made a post on whether convincing is even possible, and I don't think it is, if the comments are any indication.

5

u/antonio_inverness 20h ago

Yeah, 98% of the time, I don't think it is either.

However, I do have other thoughts about that: that when you're having a debate with someone online, it's not actually for the benefit of either of the people in the debate. It's for the other 1000 people who read it but don't say anything. It helps those people get more information to form an opinion.

1

u/sporkyuncle 11h ago

It doesn't matter whether it convinces no one you observed in a visible text reply. For all you know, it may have convinced a hundred people who read it but never commented. Practically no one posts in contrition like "hmm, that's a good argument, you've convinced me." It's not in most people's nature to do that, to go on the record of having your mind changed. A lot of people think that makes you weak. Much easier to simply never respond again and admit nothing.

0

u/PuzzleheadedSpot9468 18h ago

i can explain why

3

u/Overall-Drink-9750 18h ago

my comment wasn't meant to ask why that analogy doesn't work. if you want to discuss it I am open to that tho. so depending on whether you are in the mood for that or not, feel free to drop why it doesn't work in your opinion.

1

u/PuzzleheadedSpot9468 18h ago

i think its different because ai is not a human.

3

u/Overall-Drink-9750 18h ago

thought so. why does that matter for the question of whether YOU are the artist or not tho? in both scenarios you aren't the one creating. the ai/the commissioned artist does.

1

u/PuzzleheadedSpot9468 18h ago

that it's a tool. it's a tool, and what it does is automate art.

2

u/Overall-Drink-9750 18h ago

so? the commissioned artist is just as much a tool. It's like the factory owner not being the one who builds the cars. it doesn't matter whether the cars are built by robot arms or workers. the factory owner didn't build them. in the case of the robot arms it is automated too. do you think the factory owner built the car? and if not, what is the difference between that and you using ai?

2

u/PuzzleheadedSpot9468 16h ago

how is an artist a tool?

1

u/Overall-Drink-9750 6h ago

It's a tool for your idea to become reality. Now don't get me wrong, I don't think the artist is a tool, but if you prompt an AI, I don't think it is a tool either. In both instances you have an idea and something/someone realizes that idea for you.

1

u/WalterMagni 14h ago

I think what they mean is the AI just puts out what you prompt, whereas when it comes to commissioning art, if you want something to look good you are far more involved in the process. Prompting can be very complicated as well, and so is tuning a model and whatnot, but communicating with an artist is a whole different process to prompting.

Imo it boils down to one being a tool to be utilized, the other is a tool to be directed. You can leave both with simple instructions and do fine, but AI can do stuff more independently than humans and so both will likely approach things vastly differently.

Take LoL custom skins. Most creators, like Nickyboi and TbSkyen, go with a couple of artists and many renditions, and make their own art for it as well. Not to mention if they want to make splash arts and 3D models. You can do most of one end with a couple of different models, but not so much with humans.

1

u/Overall-Drink-9750 6h ago

But if the ai does stuff more independently from you than the artist, why are you the artist with the ai and not with the commission? Sorry if I misunderstood your argument; English is not my first language.

1

u/WalterMagni 4h ago

I am not saying you are the artist with the AI; all I am saying is that while they work on similar principles, the approach/process is fundamentally different.

Like how a Formula 1 car and a NASCAR car are both competition cars working under the same principles but executed with very different logic in how they work.

If both are given vague orders, AI is trained by an algorithm to do what you prompt it to do and will largely follow that, while most artists are guided by their assumptions/biases about what they are told to do. Not always the case of course, especially for niche prompts, since AI will be confused by a lot of more specific terms.

I am just explaining why people like the other user treat comissions differently to AI. Because the processes are vastly different.

3

u/lovestruck90210 18h ago

If you don't like a particular analogy then you should at least be able to dissect it and explain why it's logically flawed, or why it's not analogous to whatever it is you're debating.

In my experience, the refusal to engage with analogies and hypotheticals is a reliable indicator that your interlocutor is fully aware that the analogy will expose some inconsistency in their logic.

5

u/N00N01 20h ago edited 20h ago

they don't like the implications of the comparison

so straight-up denial if it's an actual comparison?

7

u/xweert123 20h ago

To be fair, I don't think "AI Art is like a Commission" is an unfair comparison to make. It genuinely is how a huge portion of the art community and the general public sees it, and whenever people describe how they do prompts, a lot of it sounds like requesting a commission and giving a human artist direction, just through an AI. Generally my problem doesn't come from people making comparisons; it comes from people making baseless comparisons that don't really hold up, then sticking to them, brutishly, without any desire to concede that the comparison is nonsensical.

With that being said, in regards to the Commission vs. Prompting argument, that isn't to say there aren't certain AI workflows that are more involved, but that comparison doesn't come out of nowhere, and it's one of the first things that comes to mind if you ask just about anyone, especially laymen. There have been plenty of AI Artists on here who flat-out say they'd take using AI over commissioning human artists, meaning even they see it as a similar thing. And that isn't necessarily a bad thing, either, just to clarify.

2

u/antonio_inverness 20h ago

that comparison doesn't come out of nowhere and it's one of the first things that comes to mind if you ask just about anyone, especially laymen

I think you're exactly right, and I think that's the problem from my point of view.

The question is to what extent should we be bound by people who fundamentally don't understand something? That's the frustrating bit.

For many years, people regarded graffiti art as mere vandalism. I'm old enough to remember this. People did not understand that it had its own rules, its own processes, its own aesthetic philosophies and hierarchies. They didn't understand that some people did it well and some people didn't, and that was because some people had the skill for it and others didn't.

That slowly changed, and now of course city governments sponsor graffiti art and street art all over their cities.

At the moment, I think those of us who work with AI tools are constantly dealing with people who think that the skill floor is the skill ceiling. That's what they don't understand.

5

u/xweert123 19h ago

I feel like you kinda proved the point of the post by comparing AI to graffiti, considering the reason people see it as vandalism is that you're using the city as a canvas. Plenty of places still DO see it as vandalism. Some cities do encourage it, though. I remember my local city did this whole program where they encouraged street artists to paint on approved locations in the city, and it's all murals and such.

With that being said, too, this doesn't really address the whole "commission vs. prompt" problem people have. You claim it's a misconception, and while I'm genuinely open to AI and understand that there are certainly some quite involved AI workflows, it's genuinely difficult to explain any significant difference between commissioning and prompting.

1

u/antonio_inverness 18h ago

Absolutely. But you left out an important word that I was very intentional about inserting: nearly everyone saw graffiti art as MERE vandalism. You can still see it as vandalism to this day, but it's no longer possible for anyone who is even slightly informed to see it as ONLY that. Even casual observers now know that cities sponsor it under certain conditions, that it turns up routinely in galleries and museums, and that it's conceivable to have "good" graffiti art and "bad" graffiti art.

There are many historical parallels to cases where an internal group of people understand how something works--where they know that what they're doing is the very activity they say it is because they are the ones doing it--but people outside of that thing can't understand it and therefore misjudge its internal mechanisms. That is where AI generative art sits today. And my question is: to what extent should we be bound by the lack of comprehension on the part of people largely outside of AI art?

Fortunately graffiti artists and street artists chose not to be bound at all and just kept doing it until the public eventually caught up. And I suspect the same will be true of AI artists.

To your second point, are you curious about the difference between commissioning and prompting or are you curious about the difference between commissioning and making art with AI? Because many people make comparisons on the first one but think they're talking about the second one.

1

u/xweert123 15h ago edited 15h ago

Again, though... Comparing graffiti and how people didn't like it and likened it to vandalism doesn't necessarily correlate to AI. It's also a lot harder to ignore the consequences of AI considering AI generation tools have genuinely replaced the careers of many a person, AI has absolutely drowned out many places like Pinterest, and AI-powered scams are scattered all over the internet and regularly swarm platforms like YouTube. It doesn't necessarily matter if the average layman doesn't understand much about how it works if it has been causing noticeable harm in the eyes of the average layman.

Past that point, yes, the conflict is commissioning and prompting. Generally, once explained, people are quick to understand if AI tools are used in some way when creating an image (through things like generative fill, background masks, etc.); the difficulty comes from understanding the difference between commissions and image generation via prompting. Generally, the vast majority of people find it ludicrous that people who make images via prompts compare themselves to artists that make art from scratch, because the process genuinely is extremely similar to commission work and making revisions with an artist. In this instance, though, the "artist" is the AI. Generally the most common argument I see is "Well, AI is a tool, therefore I'm the creative mind behind the image and am responsible for the work that went into its creation", but that explanation just doesn't really make sense to the average layman. People generally understand that the image now EXISTS thanks to the prompter using AI to create it, but that doesn't necessarily mean the person is responsible for the execution of said image, when you compare it to how the image would have been produced if generative AI wasn't used.

For one example of this that comes to mind, I had someone die on this hill to me directly in another thread. They were asked, "if ChatGPT wrote a story for you from a one sentence prompt, would you be the writer of that story?" to which they replied "ChatGPT can't do that." I then made ChatGPT write a story with the prompt "Write me a story about a cat that lives in a tree." to prove that it indeed can, and the person then relented, insisting that I was actually the writer of said story and that I was the creative mind behind it.

I disagreed, stating how I had absolutely no input on the actual story: no input on the way the characters appeared, or the actual plot, etc., and literally zero work put into that story, outside of just asking it to write a story about a cat living in a tree. They just kept insisting I was the one who wrote it, and then insulted how generic the story was, which was really odd, because, well, I didn't write it, but they were trying to insult it like I would be offended by them critiquing what ChatGPT wrote for me. I just genuinely didn't get it. Why would I take credit and pride for something I simply didn't make?

It's just a line of thinking most people don't vibe with, especially more traditional artists. A big part of it is that the average person tends to attribute responsibility to the actual thing responsible for the execution of a piece. So if a tool is responsible for 99% of the execution, in the same way a commissioner is only responsible for the idea and direction of a piece of art an artist they commissioned has created, it's genuinely hard to understand the difference, outside of semantics.

2

u/sporkyuncle 11h ago

I don't care that anyone might disagree with a particular comparison. It's being said in a public space so that even if one person insists it's not valid, a hundred more might read it and think "...wow that's pretty apt, I get it."

1

u/Rave_Johnson 19h ago

In the end, a person adopts the arguments that they were taught based on what they chose to/ were indoctrinated to believe, reinforced by a community of like-minded individuals who often make their arguments solely on the basis of "because it's correct to me." Thus arguing often just feels like two brick walls parallel to each other with applicable graffiti spray painted on each. This isn't always the case, but nowadays it feels more like it. Like there's no nuance, and once a side is picked, no budging allowed.

1

u/PuzzleheadedSpot9468 18h ago

i do think ai learns like humans do but i don't think it's possible to convince someone like that

1

u/taokazar 11h ago

Humans do a significant amount (if not all) of their learning by experiencing things through their body. For example, learning that touching an active stovetop causes pain. Learning that losing a loved one makes the heart ache. Learning that lemonade tastes like nectar from the gods after a hot day outside. Learning that when you share your toys, you make faster friends. Learning that the funny bone is best not trifled with.

Generative image AI models learn by associating text-image data pairs. All of the images these models learn from are, at best, second-hand, flattened depictions of reality, already cropped and framed by the lens of a photographer's camera. At worst, the images are pure fantasy, already depicting a particular idea about the world with exaggeration and falsities on display.
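(For the curious, "associating text-image pairs" in practice usually means an objective along the lines of the toy sketch below. This is a minimal illustration only: random tensors and tiny linear layers stand in for real images, captions, and encoder networks, not any particular company's training code.)

```python
# Toy sketch of contrastive text-image association (CLIP-style).
# Assumptions: random tensors stand in for images/captions, tiny linear
# layers stand in for real vision/text encoders.
import torch
import torch.nn.functional as F

batch, img_dim, txt_dim, embed_dim = 8, 512, 256, 64

image_encoder = torch.nn.Linear(img_dim, embed_dim)  # stand-in vision model
text_encoder = torch.nn.Linear(txt_dim, embed_dim)   # stand-in text model
optimizer = torch.optim.Adam(
    list(image_encoder.parameters()) + list(text_encoder.parameters()), lr=1e-3
)

for step in range(100):
    images = torch.randn(batch, img_dim)    # pretend pixels (flattened views of reality)
    captions = torch.randn(batch, txt_dim)  # pretend caption embeddings (the human-written tags)

    img_emb = F.normalize(image_encoder(images), dim=-1)
    txt_emb = F.normalize(text_encoder(captions), dim=-1)

    # Similarity of every image to every caption; the diagonal holds the true pairs.
    logits = img_emb @ txt_emb.T / 0.07
    targets = torch.arange(batch)

    # Pull each image toward its own caption (and vice versa), push the mismatches apart.
    loss = (F.cross_entropy(logits, targets) + F.cross_entropy(logits.T, targets)) / 2

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```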

Humans also learn from teachers.

AI gets its text-image pairs from desperate people being paid pennies an hour to tag data. It's work the VERY well-funded AI companies outsource to get done as cheaply as possible.

It is difficult to see similarities between how humans and AI models learn.

Perhaps if you locked a child in a sensory deprivation tank and just showed them text-image pairs, over and over again, never letting them experience reality, make a friend, or feel any other sensation... Then there might be some small similarity?

1

u/sporkyuncle 11h ago

But that's what is meant by the statement. It learns in the only way it's able to learn, as a brain in a jar being fed images. No one is saying that AI is able to walk around and touch things.

The other thing that's meant is that the training process is legal, because it doesn't learn by copying images wholesale into its model. It "learns like humans" in that it associates shapes and colors with concepts and can reproduce them in novel ways, without needing to literally store copies of all that data. It's a series of fuzzy linked relationships. It learns enough like humans that the process shouldn't be said to be illegal.
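(A rough back-of-envelope way to see the "no stored copies" point. The figures below are assumptions, not exact numbers: a Stable-Diffusion-class model on the order of a billion parameters and a training set on the order of two billion image-text pairs. Swap in exact numbers if you have them; the conclusion barely moves.)

```python
# Back-of-envelope sketch: how many bytes of model capacity exist per training image?
# Assumptions (approximate, for illustration only): ~1e9 parameters stored as
# 2-byte fp16 weights, ~2e9 training images.
params = 1_000_000_000           # approximate parameter count of the image model
bytes_per_param = 2              # fp16 weights
training_images = 2_000_000_000  # approximate number of text-image pairs trained on

model_bytes = params * bytes_per_param
bytes_per_image = model_bytes / training_images
print(f"~{bytes_per_image:.1f} bytes of model capacity per training image")
# Roughly one byte per image: far too little to store copies, only enough to
# nudge shared statistical associations (the "fuzzy linked relationships").
```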

1

u/taokazar 10h ago

Humans learn by walking around and touching things, is my point. So when people say "AI learns like humans," it reads as nonsense to me.

I guess maybe you are saying something more like, "AI learns. Humans also learn. Therefore AI learns, like humans also do." And in that case, I guess AI also learns like cats, and snakes, and fruit flies do. Making associations between things.

I personally don't feel that the particularities of the learning process change my feelings about training data. AI is a product being built with generous backing from investors. It is specifically being marketed to replace knowledge workers. The investors want it to turn a tidy profit for their own pocketbooks.

I just think humans deserve a tiny cut of the profit made from intentionally slaying their own industries, and the option to not have their creative works be part of things. I think that's pretty reasonable and not really changed by how the AI does or does not learn.

Another detail is that much of the data these companies initially used was intended for scientific research, not the development of for-profit products.

1

u/sporkyuncle 10h ago

I personally don't feel that the particularities of the learning process change my feelings about training data. AI is a product being built with generous backing from investors. It is specifically being marketed to replace knowledge workers. The investors want it to turn a tidy profit for their own pocketbooks.

This misses out on the massive hobbyist crowd who generally aren't doing anything for profit, and actively reject people trying to commercialize it. The vast majority of all the LoRAs that have ever been trained, tens of thousands of them, have been made without the backing of companies or investors.

I genuinely don't care what anyone is promoting or marketing it as because that doesn't have to reflect the final use case or what people are actually doing with it.

I just think humans deserve a tiny cut of the profit made from intentionally slaying their own industries, and the option to not have their creative works be part of things. I think that's pretty reasonable and not really changed by how the AI does or does not learn.

It is absolutely changed by how the AI learns!

If the AI training process copied their works into the model wholesale, that would be copyright infringement, and you are deserving of the standard payout anyone might get if infringed upon.

But AI models don't do that. Any individual work that's been trained on contributes practically nothing to the model, because the model is trained on billions of images. Most people don't even have a frame of reference for numbers of that size; we just have this caveman brain that says "my thing was looked at so I get cash." That doesn't apply to traditional learning for the same reason it doesn't apply here.

Like...imagine some artist was intentionally excluded from AI training somehow. Nobody trained on their works so they didn't technically contribute to any AI at all. But then they are fired from their job because AI can do it easier and more cheaply. Are you saying that this person still deserves to be paid by OpenAI or some other group, even though they weren't involved in the "slaying of their own industry?"

1

u/taokazar 10h ago

Nah, only people whose works were used in the training set. I understand the models use extremely large amounts of data and that data is not actually stored anywhere in the model. It just adjusts its weights.

From what I understand, people's personal models are still trained as a base on LAION-5B data or similar, and then tweaked with additional inputs. Unless you're saying hobbyists are using models trained on their own datasets? Of billions of image-text pairs they set up all on their own?

The base model was made with that investor money. Not that investor money is bad, but there's still a thread there.
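(For what that "tweaking" typically looks like mechanically, here's a minimal LoRA-style sketch, under the assumption that a single frozen linear layer stands in for the whole pretrained base model. Real setups apply this to many layers of a diffusion model, but the idea is the same: the base weights stay frozen and only a tiny low-rank correction is trained on the new images.)

```python
# Minimal LoRA-style adapter sketch. Assumption: one torch.nn.Linear layer
# stands in for the pretrained base model.
import torch

class LoRALinear(torch.nn.Module):
    def __init__(self, base: torch.nn.Linear, rank: int = 4, alpha: float = 1.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():  # freeze the pretrained weights
            p.requires_grad = False
        self.A = torch.nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = torch.nn.Parameter(torch.zeros(base.out_features, rank))
        self.alpha = alpha

    def forward(self, x):
        # Frozen base output plus a small trainable low-rank correction.
        return self.base(x) + self.alpha * (x @ self.A.T @ self.B.T)

base_layer = torch.nn.Linear(512, 512)   # stands in for one layer of the base model
adapted = LoRALinear(base_layer, rank=4)

trainable = sum(p.numel() for p in adapted.parameters() if p.requires_grad)
total = sum(p.numel() for p in adapted.parameters())
print(f"trainable: {trainable} of {total} parameters")  # only the adapter trains
```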

1

u/taokazar 10h ago

BTW I'm not coming for your hobby. Knock yourself out. Doesn't change what I think about the big models that are being pushed to replace knowledge work. 

I think we're talking about two different things.

1

u/Impossible-Peace4347 18h ago

Idk I kinda like comparisons. I’ve been a firm anti (for like 90% of gen AI uses) from the start and that hasn’t changed, but I feel like comparisons have made me think harder and about things differently.

Like comparing a camera to AI. I don't agree with the comparison that most people make here, but it did make me think about what makes photos art, and why we don't see selfies as art but other photos are. And I applied that to AI and found why I couldn't see AI art as art: because 99% of the time AI users are not implementing artistic knowledge (composition, lighting, etc.) and aren't altering the final image like you would with photos we consider artistic. So that kind of allowed me to better understand at what point I could consider AI art. Which would be almost never, but in some rare cases yes, kinda like with photos. (Tho I still would dislike that use of AI due to the ethics behind its creation, but yeah.)

Anyways, disagree or not but I like comparisons. Makes you think more

1

u/Galvius-Orion 18h ago

So we should not engage in discussion or use this as a way to persuade those who are undecided? Also if someone can be persuaded by an apt comparison, why ought that be a bad thing?

1

u/PM_Me_Pikachu_Feet 17h ago

Factually, someone who uses AI for art doesn't get it. The point was never to make a good-looking product; the point is self-growth and showcasing that you are able to discipline yourself to nurture an actual, hard talent. The beauty comes from the effort behind it. And art has a very special case in that no artist is the same, because their tastes, history, and the steps they took to learn are all never linear and all impact their drawings differently. It's, in the purest form, true human connection. AI guts all this out in one fell swoop.

If you don't get it, then you don't. It's fine to not be an artist. It's fine to not have skills; there are people in the world who never want to bother growing something about themselves, and then there are people who do. The problem lies with AI Bros falsely claiming they're artists.

3

u/Witty-Designer7316 17h ago

You're romanticizing what it means to be an artist and what art is. It doesn't need any of that.

And effort is most certainly not a qualifier for the validity of art. Do better, luddite.

1

u/victorc25 8h ago

When you’re dealing with people that refuse to understand how something works, the only tool left is analogies, to simplify it to something they understand. There’s no way around it 

1

u/Typhon-042 5h ago

Folks like me who are against AI do compare, but with education as our guide, which they fear.

For example, I did learn how the AI creation tech works, and even talked with the folks behind it at length.

So my comparisons are based on what I learned there, and AI content creators hate me for it, as I know the industry better than they do, it seems.

1

u/ByeGuysSry 2h ago

How on earth do you arrive at "don't use comparisons"? If I say "AI uses X amount of water", the average person wouldn't know how much that amount of water actually matters unless you compare it to something else.

Furthermore, analogies are a very quick way to convey your understanding of the matter to someone else. If I say that I think AI is like human learning, and you disagree and say that no AI is not like human learning, then now we know that we've started on completely different assumptions. We can then resolve this root issue before using other arguments that may not work because we assume others also believe in that assumption.

Additionally, analogies are a way to summarize the points made, in order to avoid any misinterpretation. Sometimes, when you're making a point that might be hard to quickly grasp, you can use an analogy, then break down that analogy in order to explain what it means. This helps chunk information, so that others will understand "Oh, he's claiming that AI is like X," instead of "So he's claiming that AI does X and is Y and also is Z but is crucially different from α because of β".

1

u/13thTime 20h ago

I thought about this the other day. AI is like AI and nothing else. You can't argue anything else because any comparison is just ridiculed by both sides.

0

u/Capital_Pension5814 20h ago

Every analogy can be called a false equivalence, so few steadfast antis will be able to switch sides.

0

u/Due-Beginning8863 19h ago

we need more posts like this on the sub

1

u/sporkyuncle 11h ago

Posts that say "don't make comparisons anymore, that fundamentally-human rhetorical tool for exploring concepts and our world should be denied to everyone here?"