
Little snitch endpoint security

I assume the bar to snoop on someone would most likely be lowered if laws like this are passed. But it's still technically possible (and likely happens). Well, it's a well-designed trap, carefully constructed to have the strongest possible effects on our lizard brains.


I'm not arguing the surveillance state isn't licking their chops. I'm simply trying to point out that CSAM is a real problem, and it's factually wrong to claim otherwise. It's also wrong and harmful to attribute motive based on assumptions. While you haven't done this, many others have: instead of taking part in a constructive debate by arguing specific points, they try to paint a false motive on the opposition. "It's just so they can spy on you!" is roughly equivalent to "you view CSAM and that's why you don't want to protect our kids". These approaches are not how you reach a good solution that balances both sides of the argument; they're just throwing the opposition's legitimate concerns under the bus. I'd also add that while e2ee is strong, unless you're verifying and compiling all the code you run on your device yourself, Apple/Google/etc. still maintain the technical ability to snoop on you regardless of this law.


It's kind of creepy in a way when you think about it. No other crime scene evidence is guarded in that way. Is it illegal to possess photos/videos of someone getting stabbed, shot, beheaded, etc.? Surely there are people who derive sexual pleasure from that material too, yet it proliferates and no one really cares. It's almost as if the powers that be REALLY don't want the general public to know who's behind some of this stuff. We already know very rich and powerful people are routinely involved in child sex rings. And the AI image generation stuff makes it even more dangerous and ridiculous: we're approaching a time when someone can just generate some images locally with Stable Diffusion or whatever that look like CSAM (but with no actual children involved), get them into your possession (either remotely, or just by dropping an SD card somewhere in your house where you don't see it), and now you are liable for decades in prison. Hopefully you don't have any enemies.

I understand your argument, and I appreciate it. The same could be said of the "other side" of this argument, though: "the government will spy on you!" is also emotional weight cynically being used to get people to go along with technical solutions that support CSAM. I'd argue society would generally agree CSAM is unacceptable, and we're allowing it right now. Many are quick to adopt, as fact, the assumption that this law is an intentional and cleverly designed attack on our rights to e2ee. I'm not arguing this law proposes the correct solution that should be adopted, but if one can't recognize that it's at least a poor attempt to balance the right to privacy with protecting kids, then I struggle to see how they're being honest with themselves.


Internet sleuths have helped solve many cold case murders that police never could.


I am sure that if internet sleuths were allowed to investigate such material, like law enforcement is, many, many more child abusers would ultimately get caught. Distributing it should obviously still be illegal, but banning simple possession doesn't stop traffickers (obviously), and it eliminates an enormous free workforce of amateur sleuths who I am sure would seriously help with the problem.


Framed as crime scene evidence, it is kind of strange that simply possessing it, for any reason, is a serious crime worth decades in prison. The pedophiles are voluntarily taking pictures and videos of themselves committing crimes and sharing them on the internet.











