(They/Them) I like TTRPGs, history, (audio and written) horror and the history of occultism.

  • 0 Posts
  • 13 Comments
Joined 29 days ago
Cake day: January 24th, 2025


  • Part of the problem is that sufficient wealth seems to destroy people’s understanding of consequence. They don’t experience them very often, and so reach a point where they can simply pursue whatever their feelings tell them to do and the world magically restructures itself to allow them to do so.

    Combine this with how the incentives of the social system result in the people who are most likely to pursue a selfish course being the most financially successful- you get a recipe for short-sighted, ignorant and self-important nonsense.


  • Hey, thank you so much for your contribution to this discussion. You presented me a really challenging thought and I have appreciated grappling with it for a few days. I think you’ve really shifted some bits of my perspective, and I think I understand now.

    I think there’s an ambiguity in my initial post here, and I wanted to check which of the following is the thing you read from it:

    • Generative AI art is inherently limited in these ways, even in the hands of skilled artists or those with technical expertise with it; or,
    • Generative AI art is inherently limited in these ways, because it will ultimately be used by soulless executives who don’t respect or understand art.



  • The university I went to had an unusually large art department for the state it was in, most likely because, due to a ridiculous chain of events and its unique history, it didn’t have any sports teams at all.

    I spent a lot of time there, because I had (and made) a lot of friends among the art students and enjoyed the company of weird, creative people. It was fun and beautiful and had a profound effect on how I look at art, craft and the people who make it.

    I mention this because I totally disagree with you on the subject of photography. It’s incredibly intentional in an entirely distinct but fundamentally related way: since you lack control over so many aspects of it, the things you can choose become all the more significant, personal and meaningful. I remember people comparing generative art and photography and it’s really… aggravating, honestly.

    The photography student I knew did a whole project as part of her final year that was a display of nude figures that did a lot of work with background, lighting, dramatic shadow and use of color, angle and deeply considered compositions. It’s a lot of work!

    I don’t mean here to imply you’re disparaging photography in any way, or that you don’t know enough about it. I can’t know that, so I’m just sharing my feelings about the subject and art form.

    A lot of generative art has very similar lighting and positioning because it’s drawing on stock photographs, which have a very standardized format. I think there’s a lot of difference between that and what someone who does photography as an art has to consider. Many of the people using generative art as tools lack the background skills that would allow them to use them properly. Without that, it’s hard to identify what makes a piece of visual art not work, or what needs to be changed to convey a mood or idea.

    In an ideal world, there would be no concern for loss of employment because no one would have to work to live. In that world, these tools would be a wonderful addition to the panoply of artistic implements modern artists enjoy.



  • I did close my post by saying capitalism is responsible for the problems, so I think we’re on the same page about why it’s unethical to engage with AI art.

    I’m not interested in a discourse about that (I am very firmly against the proliferation of AI because of its many and varied bad social implications), but I am interested in building better arguments against it.

    I have seen multiple people across the web arguing that AI art is bad not just because it will put artists out of work, but because the product is, itself, lacking in some vital and unnameable human spark or soul. That’s a bad argument, since it turns the debate into esoteric philosophy rather than the practical point: if we do nothing, art stops being professionally viable, killing many people and also crushing something beautiful and wonderful about life forever.

    Rich people ruin everything, is what I want the argument to be.

    So I’m really glad you’re making that argument! Thanks, honestly, it’s great to see it!


  • The question of whether AI art is art often fixates on some weird details that I either don’t care about or think are based on fallacious reasoning. Like, I don’t like AI art as a concept and I think it’s often going to be bad art (I’ll get into that later), but some of the arguments I see are centered on this strangely essentialist idea that AI art is worse because of an inherent lack of humanity as a central and undifferentiated concept. That it lacks an essential spark that makes it into art. I’m a materialist; I think it’s totally possible for a completely inhuman machine to make something deeply stirring and beautiful. The current trends are unlikely to reliably do that, but I don’t think there’s something magic about humans that means they have a monopoly on beauty, creativity or art.

    However, I think a lot of AI art is going to end up being bad. This is especially true of corporate art, and less so for individuals (especially those who already have an art background). Part of the problem is that AI art will always lack the intense level of intentionality that human-made art has, simply by the way it’s currently constructed. A probabilistic algorithm that’s correlating words to shapes will always lack the kind of intention in small detail that a human artist making the same piece has, because there’s no reason for the small details other than either probabilistic weight or random element. I can look at a painting someone made and ask them why they picked the colors they did. I can ask why they chose the lighting, the angle, the individual elements. I can ask them why they decided to use certain techniques and not others, I can ask them about movements that they were trying to draw inspiration from or emotions they were trying to communicate.

    The reasons are personal and build on the beauty of art as a tool for communication in a deep, emotional and intimate way. A piece of AI art using the current technology can’t have that, not because of some essential nature, but just because of how it works. The lighting exists as it does because it is the most common way to light things with that prompt. The colors are the most likely colors for the prompt. The facial expressions are the most common ones for that prompt. The prompt is the only thing that really derives from human intention, the only thing you can really ask about, because asking, “Hey, why did you make the shoes in this blue? Is it about the modern movement towards dull, uninteresting colors in interior decoration, because they contrast a lot with the way the rest of the scene is set up,” will only ever give you the fact that the algorithm chose that.

    Sure, you can make the prompts more and more detailed to pack more and more intention in there, but there are small, individual elements of visual art that you can’t dictate by writing even to a human artist. The intentionality lost means a loss of the emotional connection. It means that instead of someone speaking to you, the only thing you can reliably read from AI art is what you are like. It’s only what you think.

    I’m not a visual artist, but I am a writer, and I have similar problems with LLMs as writing tools because of it. When I do proper writing, I put so much effort and focus into individual word choices. The way I phrase things transforms the meaning and impact of sentences, the same information can be conveyed so many ways to completely different focus and intended mood.

    An LLM prompt can’t convey that level of intentionality, because if it did, you would just be writing it directly.

    I don’t think this makes AI art (or AI writing) inherently immoral, but I do think it means it’s often going to be worse as an effective tool of deep, emotional connection.

    I think AI art/writing is bad because of capitalism, which isn’t an inherent factor. If we lived in fully-automated gay luxury space communism, I would have already spent years training an LLM as a next-generation oracle for tabletop-roleplaying games I like. They’re great for things like that, but alas, giving them money is potentially funding the decline of art as a profession.


  • Those are only conflicting statements if you believe the market will not embrace worse products. It totally will, so long as you have a group of people who lack the critical analysis skills to compare the products and arrive at the conclusion that the new one is worse.

    It doesn’t help that the potential drivers of this change are massive conglomerates, so if a sweeping change comes from the top down and is paired with a lot of propaganda (Marketing), then people will have no choice but to accept it as the standard.

    I think that a lot of criticism about the actual quality of AI art is mixed, though. I feel like it has flaws, but I’ve seen arguments about flaws I don’t think are actually real problems with the technical quality.


  • Hey, I wanted to say I’m sorry for using ambiguous language there. My time studying history has profoundly affected me, so I tend to use “Opinion” to mean “What’s your understanding and reasoned analysis of X thing?”

    The alternate implications slipped my mind when I posted. My bad!

    I wanted to say thanks for sharing your thoughts and I think we’re actually on the same page with regards to current AI technology. I do recognize that a lot of people are interested in using it, and that they will continue to highly value the functionality it has. My objections to it are, of course, related to secondary concerns and problematic social issues- which I think you understand based on your post.



  • My bingo-board didn’t have “replacing actual research with a liar-box,” but here we are. I’ve noticed, to my increasing discomfort, a trend of people using ChatGPT (and similar tools) to replace actual research into topics, or to “summarize” articles. As someone with actual research training, I find it pretty alarming to realize how little people understand about what research is.

    I’ve seen how badly these things mangle my area of academic interest, because they can’t write reasonable citations.