• 0 Posts
  • 24 Comments
Joined 2 years ago
Cake day: June 10th, 2023





  • Let me give you a hypothetical. Imagine that these murders happened just as they did. But now imagine that Palestine was not brought up. The guy just stayed silent. The 2 Zionists are dead all the same, but there is plausible deniability that this was not done in the name of Palestine. Like I’m saying, there is a more effective way to crash out, if you are going to crash out. Furthermore, you shouldn’t be crashing out on an individual level! When will we have it sink in that there is no individual solution to systemic issues? Organize!

    Learn from your enemy. Many Israeli officials to this day still deny the genocide. It’s not because they think you’ll believe it. It’s about plausible deniability.


  • Wrong. You still have the First Amendment, a privilege that the entire political spectrum of the US is a fan of. There is, however, an exception to the First Amendment: incitement. You are all making a case for the fascist government to portray pro-Palestinian sentiment as calls for violence.

    Let me make it perfectly clear. The 2 that were smoked had a direct hand in the progress of the Zionist project. I have no love for them. But if you’re going to crash out, please crash out responsibly.


  • The developers of LemmyNet are being asked for the ability to define a subroutine through which uploaded images are preprocessed and then either denied or passed. There is no such feature right now. Even if they wanted to use Cloudflare’s CSAM protection, they couldn’t. That’s the entire problem. This preprocessing routine could use Microsoft PhotoDNA and Google CSAI Match, it could use a self-hosted alternative as db0 desires, or it could even be your own custom solution that doesn’t destroy matched CSAM but stores it on a computer you own and stops it from being posted. A rough sketch of what such a hook could look like is below.
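    To make the ask concrete, here is a rough sketch in Rust (the language Lemmy is written in) of a pluggable preprocessing hook. None of these names are Lemmy’s actual API; the trait, the toy blocklist scanner, and the ordinary FNV hash are stand-ins for whatever scanner an admin would actually plug in (PhotoDNA, CSAI Match, a self-hosted model, and so on).

```rust
/// Result of scanning an upload.
enum ScanVerdict {
    Allow,
    Reject(String), // reason, e.g. "matched known-abuse hash"
}

/// Anything that can inspect raw image bytes before storage: PhotoDNA,
/// CSAI Match, a self-hosted classifier, or a custom hash lookup.
trait ImageScanner {
    fn scan(&self, image_bytes: &[u8]) -> ScanVerdict;
}

/// Toy scanner that rejects images whose hash is on a blocklist.
/// A real deployment would call a perceptual-hash service instead.
struct BlocklistScanner {
    blocked_hashes: Vec<u64>,
}

impl ImageScanner for BlocklistScanner {
    fn scan(&self, image_bytes: &[u8]) -> ScanVerdict {
        let hash = fnv1a(image_bytes);
        if self.blocked_hashes.contains(&hash) {
            ScanVerdict::Reject("hash matched blocklist".into())
        } else {
            ScanVerdict::Allow
        }
    }
}

/// Simple FNV-1a hash, a stand-in for a real image fingerprint.
fn fnv1a(bytes: &[u8]) -> u64 {
    let mut hash: u64 = 0xcbf2_9ce4_8422_2325;
    for &b in bytes {
        hash ^= b as u64;
        hash = hash.wrapping_mul(0x0000_0100_0000_01b3);
    }
    hash
}

/// The upload path would run the configured scanner before persisting the file.
fn handle_upload(scanner: &dyn ImageScanner, image_bytes: &[u8]) -> Result<(), String> {
    match scanner.scan(image_bytes) {
        ScanVerdict::Allow => Ok(()),               // continue to storage
        ScanVerdict::Reject(reason) => Err(reason), // drop the upload, surface the reason
    }
}

fn main() {
    let scanner = BlocklistScanner { blocked_hashes: vec![] };
    let fake_image = [0u8; 16];
    println!("upload accepted: {}", handle_upload(&scanner, &fake_image).is_ok());
}
```

    The point of the design is that the server only needs to know the scan/verdict interface; which service sits behind it is each admin’s choice.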


  • Imagine if you were the owner of a really large computer with CSAM in it. And there is in fact no good way to prevent creeps from putting more into it. And when police come to have a look at your CSAM, you are liable for legal bullshit. Now imagine you had dependents. You would also be well past the point of being respectful.

    On that note, the captain db0 has raised an issue on LemmyNet’s GitHub repository, essentially requesting the ability to add middleware that checks the nature of uploaded images (issue #3920 if anyone wants to check). Point being, the ball is squarely in their court now.


  • Traditional hashes like MD5 and SHA-256 are not locality-sensitive, so they can’t be used to detect a match to within some degree of similarity (a toy illustration is sketched after this comment). Otherwise, yes, you are correct. Perceptual hashes can create false positives. Very unlikely, but yes, it is possible. This is not a problem with a perfect solution. Extraordinary edge cases must be resolved on a case-by-case basis.

    And yes, the simplest solutions must always be implemented first: tracking post reputation, a captcha before posting, waiting for an account to mature before it can post, etc. The problem is that right now the only defense we have access to is mods. Mods are people, usually with eyeballs. Eyeballs which will be poisoned by CSAM so we can post memes and funnies without issue. This is not fair to them. We must do all we can, and if all we can includes perceptual hashing, we have a moral obligation to do so.
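    A minimal sketch of the locality-sensitivity point, again in Rust: the 8x8 “average hash” below is a toy stand-in for real perceptual hashes like PhotoDNA or PDQ. A small edit to an image barely moves the hash, so a Hamming-distance threshold still matches it, whereas the MD5 or SHA-256 of the edited file would be completely different and an exact-match lookup would miss it.

```rust
/// Compute a 64-bit "average hash" from an 8x8 grayscale thumbnail:
/// each bit is 1 if that pixel is brighter than the mean pixel value.
fn average_hash(thumb: &[u8; 64]) -> u64 {
    let mean: u32 = thumb.iter().map(|&p| p as u32).sum::<u32>() / 64;
    let mut hash = 0u64;
    for (i, &p) in thumb.iter().enumerate() {
        if p as u32 > mean {
            hash |= 1u64 << i;
        }
    }
    hash
}

/// Hamming distance between two hashes: small distance = visually similar.
fn distance(a: u64, b: u64) -> u32 {
    (a ^ b).count_ones()
}

fn main() {
    // A fake "image" (a brightness gradient) and a slightly brightened copy,
    // standing in for a re-encoded or lightly edited upload.
    let mut original = [0u8; 64];
    for (i, px) in original.iter_mut().enumerate() {
        *px = (i * 4) as u8;
    }
    let mut edited = original;
    for px in edited.iter_mut() {
        *px = px.saturating_add(3);
    }

    let d = distance(average_hash(&original), average_hash(&edited));
    // The perceptual hashes stay close (d is 0 here), so a threshold like
    // d <= 10 still flags the edited copy. MD5 or SHA-256 of the two byte
    // strings would differ completely, so an exact-match lookup misses it.
    println!("hamming distance: {}", d);
}
```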