• bionicjoey@lemmy.ca · 1 year ago

      No, it’s an incredibly dumb way to do it, because fucking with people’s prompts makes the tech unreliable.
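
      For context, the approach being criticized here is reportedly to silently append diversity descriptors to user prompts before generation. A minimal sketch of that kind of rewriting (the term list and trigger words below are made up; the real services’ logic isn’t public) shows why it breaks named characters:

      ```python
      import random

      # Hypothetical descriptor list; the actual injected terms are not documented.
      DIVERSITY_TERMS = ["ethnically ambiguous", "South Asian", "Black", "Latina"]

      def rewrite_prompt(prompt: str) -> str:
          # Naively bolt a descriptor onto anything person-like, with no notion
          # of whether the prompt names a specific, canonically-depicted character.
          if any(word in prompt.lower() for word in ("person", "man", "woman", "homer")):
              return f"{prompt}, {random.choice(DIVERSITY_TERMS)}"
          return prompt

      print(rewrite_prompt("a photo of a man reading"))      # generic subject: arguably fine
      print(rewrite_prompt("Homer Simpson eating a donut"))  # named character: silently altered
      ```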

      • Fizz@lemmy.nz · 1 year ago

        You can’t balance every single aspect of the training data. You will always run into some searches favoring one race over another.

    • SinAdjetivos@beehaw.org · 1 year ago

      It’s not; the underlying data is still just as biased. Taking a bunch of white people and labeling them “ethnically ambiguous” is just statistical blackface.

    • raptir@lemdro.id · 1 year ago

      If a request is for a generic person, sure. But when the request is for a specific character, not really.

      Like, make one of the undefined ones black.

      • MagicShel@programming.dev · 1 year ago

        I agree with you, but there is a lot of gray area. What about Spider-Man? 95% of the pictures it ingests are probably Peter Parker, so it would have a strong bias towards making him white when there are several ethnicities that might apply. What about Katniss Everdeen? Is she explicitly white in the books, or is she just white because she’s played by a white actress? I truly don’t know, so maybe that is a bad example. What about Santa? What about Jesus? Of all characters, Jesus absolutely shouldn’t be white, but I’ll bet the vast majority of AI depicts him that way.

        I’m not disagreeing with you so much as pointing out that the line isn’t really all that clear. I don’t like this ham-handed way of going about it, but I agree with and support the goal of making sure the output isn’t white-biased just because preserved history tends to be.

        • raptir@lemdro.id · 1 year ago

          It’s tricky because the data itself is going to be biased here. Think about it: even the video game is specifically called “Spider-Man: Miles Morales,” while the one with Peter Parker is just called “Spider-Man.”

          Katniss is actually a good example. I wasn’t aware of the details, but the books apparently describe her as having “olive skin.” The problem, though, is that if you image-search her, all you get is Jennifer Lawrence.

          That said, Homer is yellow.

          • MagicShel@programming.dev · 1 year ago

            Absolutely. There is only a single depiction of Homer, and I agree that unless you specifically ask for a race-bent Homer, it shouldn’t do this. I was just pointing out that you can’t draw the line at “identifiable character,” because clearly that’s also a problem. Maybe there is a better place to draw the line, or maybe it’s going to be problematic regardless of where it’s drawn, including not doing anything at all.

            I would say that if you can’t do it right, just do nothing at all, except that, as a white guy in a white-biased world, saying so is self-serving. I’m not the right person to say it’s fine to just let it be.

    • drkt@feddit.dk · 1 year ago

      Can you explain to me how racial bias in a general-purpose LLM is a problem to begin with?

      • flora_explora@beehaw.org · edited · 1 year ago

        If you were really curious about the answer, you practically gave yourself the right search term there: look up “racial bias in general purpose LLM” and you’ll find answers.

        However, the way your question is phrased, you just seem to be trolling (i.e., secretly disagreeing while pretending to want to know, just so you can object).