One morning last year, Jacobus Louw set out on his daily neighborhood walk to feed the seagulls he finds along the way. Except this time, he recorded several videos of his feet and the view as he walked on the pavement. The video earned him $14, about 10 times the country’s minimum wage, or for Louw, a 27-year-old based in Cape Town, South Africa, half a week’s worth of groceries.

The video was for an “Urban Navigation” task Louw found on Kled AI, an app that pays contributors for uploading their data, such as videos and photos, to train artificial intelligence models. In a couple of weeks, Louw made $50 by uploading pictures and videos of his everyday life.

Thousands of miles away in Ranchi, India, Sahil Tigga, a 22-year-old student, regularly earns money by letting Silencio, which crowdsources audio data for AI training, access his phone’s microphone to capture ambient city noise, such as inside a restaurant or traffic at a busy junction. He also uploads recordings of his voice. Tigga travels to capture unique settings, like hotel lobbies not yet documented on Silencio’s map. He earns over $100 a month doing this, enough to cover all his food expenses.

And in Chicago, Ramelio Hill, an 18-year-old welding apprentice, made a couple hundred dollars by selling his private phone chats with friends and family to Neon Mobile, a conversational AI training platform that pays $0.50 per minute. For Hill, the calculation was simple: he figured tech companies already capture so much of his private data, so he might as well get a cut of the profit.

These gig AI trainers – who upload everything from scenes around them to photos, videos and audio of themselves – are at the frontlines of a new global data gold rush. As Silicon Valley’s hunger for high-quality, human-grade data outpaces what can be scraped from the open internet, a thriving industry of data marketplaces has emerged to bridge the gap. From Cape Town to Chicago, thousands of people are now micro-licensing their biometric identities and intimate data to train the next generation of AI.

This ends well.

  • youcantreadthis@quokk.au · 1 day ago

    Laws will not protect you. That is only living by the sword of those who would exploit you.

    Only your own violence will bring freedom and safety. Which sucks, because violence is really, really bad. Resent them for that too.

    • MentalEdge@sopuli.xyz · 1 day ago

      Let me paraphrase your comment: “world bad, good things only possible through bad”

      I’m gonna go ahead and reject that, and ask that you re-evaluate whether you had something to contribute.

      • youcantreadthis@quokk.au · 1 day ago

        The violence of your masters, invoked wholly at their discretion, though. That’s a good thing. I should give them more excuses to do that. They’ve done such wonderful things so far.

        Just because you hand it off, give the order, and look away doesn’t mean your ass doesn’t live by the sword. Sounds like you really like violence, you just don’t want to think about it.

            • MentalEdge@sopuli.xyz · 14 hours ago

              A better world.

              And I happen to believe that humans will co-operate more than defect. And game theory supports my view. Not yours.

              You walked in saying people only ever defect. You’re wrong.

              And before you twist my words again, no. I don’t think all-defectors can be turned into co-operators. They need to be removed. But their existence does not mean the rest of us have to be defectors, too.

              • youcantreadthis@quokk.au · 14 hours ago

                You think people will instinctively collaborate and not be shitty, therefore we should give proven-bad actors more cover to act badly?

                • MentalEdge@sopuli.xyz · 14 hours ago

                  Instinctively? No.

                  Due to learned experience and principles of game theory? Yes.

                  Don’t you try to find out which people will defect and which will co-operate, and act accordingly, instead of just screwing over everyone around you all the time?

                  Most people will co-operate as much as possible, and only retaliate if and when they are abused, and only against the individual or group that broke the chain of co-operation. This maximizes benefit in a way that far outweighs the cost.

                  Stop putting words in my mouth.

                  • youcantreadthis@quokk.au · 13 hours ago

                    No, I actually understand an amount of psychology and moral philosophy beyond some shit I read in a pop-sci or Cold War history book, so I’ve got a little more depth than a superficial understanding of ‘game theory’. But I do assess the people around me! The reliably bad ones are called cops, and they will always do the worst thing.

                    More laws help nothing. Laws are just excuses to not fix problems.