• Ferk@lemmy.ml
      3 hours ago

      Yes! I mean, blame those who post AI-generated translations as if they were their own, or blame the AI scrapers that train on those poorly generated pages, but it makes no sense to blame Wikipedia when all it has done is exist and offer a platform for knowledge sharing.

      In fact, this problem is hardly exclusive to Wikipedia; every platform with crowdsourced content is to some degree susceptible to AI poisoning that ultimately ends up feeding other AIs, so the loop exists on all platforms. Though I understand wanting to highlight the particular risk to endangered languages: they have less content available, so the AI models have a smaller dataset, which makes them worse and more sensitive to bad data.

    • chobeat@lemmy.mlOP
      3 hours ago

      If you build the infrastructure for a certain thing to happen, you’re responsible for that thing. For the same reason we hold Facebook accountable for the rise of the far right, we should hold Wikipedia accountable for this stuff. Infrastructure is never neutral.

      • glimse@lemmy.world
        22 minutes ago

        That is a completely unfair comparison. For starters, Facebook is a for-profit advertising company and Wikipedia is a community-driven encyclopedia; they should be judged by different standards.

        Second, both admins and users can edit Wikipedia when there’s a problem. Everyone is “responsible” for fixing it, or at the very least equally at fault.

        Next, the content in question. Facebook was (rightfully) given hell for hosting gore, CSAM, adult porn, etc. Things that are immoral, illegal, or outright dangerous. The offending content on Wikipedia is bad translations.

        Lastly, the bigger issue is always enforcement against said content. Facebook was made aware of the problem users/pages/uploads and slacked off on doing anything. These Wikipedia pages have very low traffic and weren’t getting reported. And even with reports, Wikipedia then has to consult people who speak the rare language.

        They’re similar problems at vastly different scales.

      • thejml@sh.itjust.works
        1 hour ago

        Not exactly the same. I don’t blame Facebook for the rise; it’s just a place to post and share. I blame the algorithm that Facebook created and keeps updating, which pushes users toward outrage and divides them into bubbles that empower and embrace conspiracies, the right/alt-right, and other extreme viewpoints. Same thing with X/Twitter.

        Wikipedia doesn’t have any such algorithm. They don’t have a team dedicated to pushing people to those extremes (or anything at all).

    • eleijeep@piefed.social
      18 minutes ago

      Vulnerable to going extinct.

      If you read the article, it briefly touches on how the “doom spiral” could affect the trajectory of a language that is not widely spoken. It’s not a great article, though; it repeats the same point for several pages, points the finger at Wikipedia instead of the content-generation farms, and then fails to properly conclude the argument of its presumed hypothesis.