I was wondering if there’s any blockchain project that could be integrated with GenAI tools to fingerprint all generated content. Every blockchain token could be linked to a generated image or video through its metadata, and companies could verify whether the generated content appearing on their platforms is AI or not.

I think that, combined with government regulation, this could reduce the number of scams and the spread of misinfo.

Though I do understand its limitations, and that it would not be super difficult to get past such restrictions for a party willing enough.

And it wouldn’t work for text and code, I suppose.

But it would be very helpful for identifying AI content much more easily.

Privacy could be maintained, much as it is with Bitcoin and other cryptocurrencies.

But I wonder how you would get everyone to integrate such a system. Both the governments and the companies.


I remember reading about this discussion in some thread months ago, but I can’t find it.

I didn’t know where to post this, but I assume everyone on lemmy is some kind of wiz, so even in this subreddit someone might have the answer.

  • 6nk06@sh.itjust.works · 25 days ago

    fingerprint all GenAI content

    People can generate a billion images every second. Also, what about images generated locally and never put on that blockchain?

    Once again, the blockchain is a solution looking for a problem to solve.

    • pugnaciousfarter@literature.cafe (OP) · 24 days ago

      No no. I have nothing to do with Blockchain. I barely understand it.

      I was just thinking about methods of mitigating harm from misinfo and scams.

      And since blockchain provides privacy while still exposing some information to the public, I thought it might work.

  • xxce2AAb@feddit.dk · 25 days ago

    How would you enforce the entry of all generated AI content into said blockchain? Big providers might be persuaded or forced to participate, but it’s not like someone - specifically well-funded bad actors - couldn’t just set up their own servers at will.

    • pugnaciousfarter@literature.cafe (OP) · 25 days ago

      Ahh yes, my intention was for such tech to make abuse more difficult and mitigate harm, not stop it entirely.

      If something lacked the fingerprint, it would immediately be suspicious.

      But yeah I don’t completely understand the tech, so I want to know if there’s any such project or research on the same.

      Blockchain is an interesting form of fingerprinting that gives actors privacy, but not complete privacy.

      • Lumidaub@feddit.org · 25 days ago

        If something lacked the fingerprint, it would immediately be suspicious.

        Suspicious of what? Being non-AI?

          • Lumidaub@feddit.org · 24 days ago

            People are already witch-hunting real artists because they think their work is too good to be true or whatever. You’d be giving them even more ammunition, putting real art under general suspicion.

  • AbouBenAdhem@lemmy.world · 25 days ago

    I think you’d have better luck doing it the other way around: fingerprint known non-AI content, and treat everything else as potential AI.

    • CannonFodder@lemmy.world · 25 days ago

      Exactly. Use a blockchain to trace image/video origins - i.e. the media data plus a camera signature and camera-operator ID with timestamps - and track the sources through every edit, along with the edit-transform info and editor ID. That way any produced clip can be traced back through all its edits and verified against particular real sources: a provable chain of custody for every image. If a video clip came from a trusted, known camera operator and stays under the control of entities you trust, you can trust the video. If any link in the ownership or edit chain is sus, an author can produce the input/output of that edit to prove authenticity.

      • pugnaciousfarter@literature.cafe (OP) · 24 days ago

        Oh damn. It’s starting to sound like something that can be used to justify complete control over people’s lives.

        Governments and corporations might use “security reasons” to get even more info and control over us.

        Yeah that was not my intention.
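For what it’s worth, the chain-of-custody idea described above doesn’t even need a blockchain to demonstrate - hash-linked records are enough. Here’s a minimal sketch (all the names and fields are made up for illustration; a real system would use device-held signing keys rather than bare hashes):

```python
import hashlib
import json


def _digest(obj) -> str:
    # Deterministic SHA-256 over a canonical JSON encoding.
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()


def add_link(chain, media_bytes, actor_id, action):
    # Append one custody record committing to the media bytes, the actor,
    # the action taken, and the hash of the previous record.
    prev = chain[-1]["record_hash"] if chain else None
    body = {
        "media_sha256": hashlib.sha256(media_bytes).hexdigest(),
        "actor_id": actor_id,
        "action": action,
        "prev": prev,
    }
    chain.append({**body, "record_hash": _digest(body)})
    return chain


def verify(chain, final_media_bytes) -> bool:
    # Re-hash every record, check each back-link, and confirm the last
    # record matches the media we actually received.
    prev = None
    for rec in chain:
        body = {k: rec[k] for k in ("media_sha256", "actor_id", "action", "prev")}
        if rec["prev"] != prev or rec["record_hash"] != _digest(body):
            return False
        prev = rec["record_hash"]
    return bool(chain) and chain[-1]["media_sha256"] == hashlib.sha256(
        final_media_bytes
    ).hexdigest()
```

A capture followed by an edit then looks like `add_link(chain, raw_bytes, "camera:alice", "capture")` and `add_link(chain, edited_bytes, "editor:bob", "colour-grade")`; tampering with any record, or presenting media that doesn’t match the final record, makes `verify` fail. What a blockchain would add on top of this is only a public, append-only place to anchor the record hashes.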

    • pugnaciousfarter@literature.cafe (OP) · 24 days ago

      I did think about that. That was the original idea.

      But I thought it was less feasible to get everyone on board for it than just the AI gen companies.

      But again, I don’t know enough about it. I was wondering if there are any projects trying to do something like this.
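The inversion suggested above - fingerprint the known-real content and treat everything unfingerprinted as potential AI - can be toy-sketched in a few lines. An HMAC stands in here for a proper device signature (purely illustrative; a real scheme would use public-key signatures so verifiers never hold the secret):

```python
import hashlib
import hmac

# Hypothetical shared secret provisioned into a trusted camera (demo only).
CAMERA_KEY = b"device-secret-demo-key"


def sign_capture(image_bytes: bytes) -> str:
    # The camera attests: "I produced these exact bytes."
    return hmac.new(CAMERA_KEY, image_bytes, hashlib.sha256).hexdigest()


def is_attested(image_bytes: bytes, tag: str) -> bool:
    # A verifier holding the key checks the attestation; any content
    # without a valid tag would be treated as potential AI.
    return hmac.compare_digest(sign_capture(image_bytes), tag)
```

The weakness both commenters point at still applies: nothing stops content from simply never being signed, so the scheme can only ever flag the absence of a pedigree, not prove something is AI.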

  • stoy@lemmy.zip · 25 days ago

    Anything that relies on external factors for flagging AI images is doomed to fail.

    There will quickly be tools to strip the metadata from images, or to generate AI pictures while skipping the blockchain step entirely.