Hello everyone,
We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day but there is nothing we can do because they will just post from another instance since we changed our registration policy.
We keep working on a solution, we have a few things in the works but that won’t help us now.
Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.
Edit: @[email protected] the moderator of the affected community made a post apologizing for what happened. But this could not be stopped even with 10 moderators. And if it wasn’t his community it would have been another one. And it is clear this could happen on any instance.
But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.
Edit 2: removed that bit about the moderator tools. That came out a bit harsher than how we meant it. It’s been a long day and having to deal with this kind of stuff got some of us a bit salty to say the least. Remember we also had to deal with people posting scat not too long ago so this isn’t the first time we felt helpless. Anyway, I hope we can announce something more positive soon.
Fuck these trolls
"Troll" is too mild of a term for these people
How about “pedophile”? I mean, they had to have the images to post them.
“Terrorist”. Having the images doesn’t mean they liked them, they used them to terrorize a whole community though.
"Pedophile-enabled terrorist" or "pedophilic terrorist", depending on the person.
It still means they can tolerate CSAM, or are normalized to it enough that they can feel anything other than disgust during "shipping and handling".
All of your comments have “banned” next to them for me (logged in via lemmy.world using Liftoff) - any idea why?
I assume you’re not actually banned…?
They were banned because they were defending pedophilia (advocating for them to be able to get off to what turns them on) and also trolling very aggressively. You can look at them in the Modlog on the Website, not sure if Apps implement the modlog yet though.
Ah thanks, I’ve seen it a few times but thought it was a bug. Why are people like this!
I have no clue; people can be quite toxic and horrible. Also noticed that they reduced his ban, not sure why; defending pedophilia is pretty bad and definitely carries legal risk, but it's not my call to make.
Yeah, got banned for forgetting that some axioms give people free pass to say whatever they want, no matter how they say it… and replying in kind is forbidden. My bad.
You were banned because you were arguing for why people shouldn’t be arrested for possession of CSAM material, trolling and straw-manning in the replies, and on top of that attempting to seriously and honestly advocate for pedophiles on another community, which is at best borderline illegal (anyone can check the modlog on that one if they don’t believe me, I wouldn’t make such claims if they weren’t true).
So to summarize, you were banned for:
- Trolling
- Promoting illegal activities (pedophilia and CSAM)
Yeah, this isn’t just joking or shitposting. This is the kind of shit that gets people locked up in federal pound-you-in-the-ass prison for decades. The feds don’t care if you sought out the CSAM, because it still exists on your device regardless of intent.
The laws about possessing CSAM are written in a way that any plausible deniability is removed, specifically to prevent pedophiles from being able to go “oh lol a buddy sent that to me as a joke” and getting acquitted. The courts don’t care why you have CSAM on your server. All they care about is the fact that you do. And since you own the server, you own the CSAM and they’ll prosecute you for it.
Sounds like a digital form of SWATing.
And it's not just the instance admins who would be at risk. Any time you view an image, your device makes a local copy of it, meaning every person who viewed the image, even accidentally, is at risk as well.
Yeah honestly report all of those accounts to law enforcement. It’s unlikely they’d be able to do much, I assume, but these people are literally distributing CSAM.
That’s not a troll, CSAM goes well beyond trolling, pedophile would be a more accurate term for them.
Yeah. A troll might post something like a ton of oversized images of pig buttholes. Who the fuck even has access to CSAM to post? That’s something you only have on hand if you’re a predator already. Nor is it something you can shrug off like “lol I was only trolling”. It’s a crime that will send you to jail for years. It’s a major crime that gets entire police units dedicated to it. It’s a huuuuge deal and I cannot even fathom what kind of person would risk years in prison to sabotage an internet forum.
My thoughts exactly. If they were just spamming goatse or something, that would be one thing…
But this raises several questions, and they can only have grimdark answers.
Don't forget they are doing this to harm others; they deserve the name "e-terrorist" or similar. They are still absolutely pedophiles. They're bombing out a space, not trying to set up shop.
I would definitely agree that this would very likely count as cyber terrorism, and if it doesn't, it should.
deleted by creator
Simply having it to post makes you culpable. It’s way beyond trolling.
deleted by creator
deleted by creator
What he’s saying is FBI CHECK THAT GUY’S HARD DRIVE!
deleted by creator
deleted by creator
deleted by creator
Do you know why we have possession laws against CSAM in the first place? It's because people buy and sell abuse material in underground markets; it's another way they profit off the abuse of children. This is nothing like drug possession laws (which are stupid), because the product is literally a direct product of the abuse of children, and many of the people in possession likely helped the criminals obtain it (either directly or by paying them for it).
So yes, in this case it does make sense to criminally charge people for possession of something like this, considering the direct connection CSAM has to child trafficking and child sexual abuse, and when you defend it by arguing against possession laws it makes it seem like you support these criminals.
Oh, by the way, the MAP stuff really doesn't look good on you (that's in your comment history). Maybe you think I'm a terrible person because I think drugs should be treated less harshly, but you have literally said in other comments that pedophiles should be allowed to get off on what turns them on (which, I remind you, is exploitation of minors). That is a very different stance from "people shouldn't be beaten and arrested for snorting coke"; you're literally advocating for people to be allowed to produce and consume abuse material, and claiming that it's acceptable for people to be pedophiles and pursue their attractions instead of getting help. I don't know how you don't see what is wrong with that. Seriously, this is either really bad trolling (way too far) or you're one of them.
Too bad, I'm "kid agnostic"; they might as well be cars or dragons (drawn or otherwise), I don't care whether they're "kid" or "grownup" cars or dragons.
I think I now know which one it is if that statement from the horse’s mouth is to be believed…
A lot of stuff the government bans doesn't align with morality. In this instance it fucking does.
Yes, getting someone to drop this on your hard drive, even if it's explicitly labeled "cache", is equivalent to evidence planting. It puts you in danger of our laws falsely finding you guilty (misunderstandings are a thing; I don't know the level of risk). The advice from our governments is "delete it immediately". Follow it as completely as you can. Most devices don't broadcast your hard drive contents without warning, giving you time to delete it. Since that isn't true for iPhones, it's a risk to one's personal freedom to go on Lemmy on an iPhone until we can get this CSAM issue resolved.
Yes, it's like a virus: the FBI will target anyone who is a host, anyone who has it on their drive (edit: intent may be relevant, I'm no expert). The only way to stay safe is to rid yourself of it. Delete it.
Lemmy mods, keep yourself safe
Please don't use an iPhone to moderate. If you're on Linux (I think Windows too), use BleachBit on your browser cache and run the "vacuum" operation.
On Android, to clear the cache (or follow better-written instructions here):
- Go to where you see all your apps
- Find your client
- Tap and hold on its icon
- Tap "App info"
- Go to "Storage"
- Tap "Clear cache"
- (If you're paranoid, "Clear data", and lose your sign-in, settings and other local data)
To manually vacuum (can't find better instructions):
- Download an app called "Termux"; it doesn't need any permissions for this task.
- When you see a black screen with text, type "clear" and hit enter.
- Then type or paste:
{ echo writing big file of random; cat /dev/urandom >file-gets-big; rm file-gets-big -v; }
- And hit enter.
Your phone and the program "cat" will complain about being out of storage; once "rm" gets run, it will be fixed again. If it still complains or Termux crashes, uninstall and reinstall Termux; the vacuum process is finished. Some people know at a glance whether these steps are safe or not, others do not. Never follow instructions you don't understand; verify that I haven't led you to do something dumb.
-
I’d say the proper word is ‘criminal.’
A person who is attracted to children is an evil and disgusting person; being a pedophile isn't just "liking something", they are a monster.
deleted by creator
This is a serious problem we are discussing, please don’t use this as an opportunity to inject bad-faith arguments.
Edit: Wow your post history is a lot of the same garbage, there is no point in attempting to reason with you, you seem to be defending the act of CSAM or just trolling (really awful and severe trolling I might add, CSAM isn’t something to joke or troll about).
deleted by creator
Ah, I see what's going on: you're salty that they closed the shitposting community, so you're trolling here, going so far as to compare gays and Jews to pedophilia (which is extremely bigoted and incorrect) or to downplay the horrific acts that led to the closing of that community and of registrations, to protect the rest of the instance's well-being.
Also I’d appreciate it if you didn’t edit what I said when quoting me, thanks.
Criminals.
Trolls? In most regions of the planet, I am fairly certain their actions would be considered criminal.
Removed by mod
The Internet is essentially a small microbiome of beautiful flora and fauna that grew on top of a lake of sewage.
The Internet is a reflection of humanity, minus some of the fear of getting punched in the face.
Yeah, back in the Limewire/Napster/etc days, it wasn’t unheard of for people to troll by relabeling CSAM as a popular movie or TV show. Oh, you wanted to download the latest Friends episode? Congrats, now you have CSAM because a troll uploaded it with the title “Friends S10E7.mov”
I would like to extend my sincerest apologies to all of the users here who liked lemmy shitposting. I feel like I let the situation grow too far out of control before getting help. Don't worry, I am not quitting. I fully intend on staying around. The other two deserted the community, but I won't. DM me if you wish to apply for mod.
Sincerest thanks to the admin team for dealing with this situation. I wish I had linked in with you all earlier.
@[email protected] this is not your fault. You stepped up when we asked you to and actively reached out for help getting the community moderated. But even with extra moderators this can not be stopped. Lemmy needs better moderation tools.
Hopefully the devs will take the lesson from this incident and put some better tools together.
There's a Matrix room for building mod tools here; maybe we should bring up this issue there, just in case they aren't already aware.
Or we’ll finally accept that the core Lemmy devs aren’t capable of producing a functioning piece of software and fork it.
It's not easy to build a social media app, and forking it won't make it any easier to solve this particular problem. Joining forces to tackle an inevitable problem is the only solution. The Lemmy devs are more than willing to accept pull requests for software improvements.
And who's gonna maintain the fork? Even fewer developers from a split community? You have absolutely no idea what you're talking about.
This isn’t your fault. Thank you for all you have done in regards to this situation thus far.
It’s not your fault, these people attacked and we don’t have the proper moderation tools to defend ourselves yet. Hopefully in the future this will change though. As it stands you did the best that you could.
Thanks for your work. The community was appreciated.
You didn’t do anything wrong, this isn’t your fault and we’re grateful for the effort. These monsters will be slain, and we will get our community back.
I love your community and I know it is hard for you to handle this but it isn’t your fault! I hope no one here blames you because it’s 100% the fault of these sick freaks posting CSAM.
You do a great job. I’ve reported quite a few shit heads there and it gets handled well and quickly. You have no way of knowing if some roach is gonna die after getting squashed or if they are going to keep coming back
Really feel for you having to deal with this.
You’ve already had to take all that on, don’t add self-blame on top of it. This wasn’t your fault and no reasonable person would blame you. I really feel for what you and the admins have had to endure.
Don't hesitate to reach out for support or to speak to a mental health professional if you've picked up trauma from the shit you've had to see. There's no shame in getting help.
As so many others have said, there’s no need for an apology. Thank you for all of the work that you have been doing!
The fact that you are staying on as mod speaks to your character and commitment to the community.
Please, please, please do not blame yourself for this. This is not your fault. You did what you were supposed to do as a mod and stepped up and asked for help when you needed to, lemmy just needs better tools. Please take care of yourself.
This is seriously sad and awful that people would go this far to derail a community. It makes me concerned for other communities as well. Since they have succeeded in having shitpost closed does this mean they will just move on to the next community? That being said here is some very useful information on the subject and what can be done to help curb CSAM.
- The National Center for Missing & Exploited Children (NCMEC) CyberTipline: You can report CSAM to the CyberTipline online or by calling 1-800-843-5678. Your report will be forwarded to a law enforcement agency for investigation.
- The National Sexual Assault Hotline: If you or someone you know has been sexually assaulted, you can call the National Sexual Assault Hotline at 800-656-HOPE (4673) or chat online. The hotline is available 24/7 and provides free, confidential support.
- The National Child Abuse Hotline: If you suspect child abuse, you can call the National Child Abuse Hotline at 800-4-A-CHILD (422-4453). The hotline is available 24/7 and provides free, confidential support.
- Thorn: Thorn is a non-profit organization that works to fight child sexual abuse. They provide resources on how to prevent CSAM and how to report it.
- Stop It Now!: Stop It Now! is an organization that works to prevent child sexual abuse. They provide resources on how to talk to children about sexual abuse and how to report it.
- Childhelp USA: Childhelp USA is a non-profit organization that provides crisis intervention and prevention services to children and families. They have a 24/7 hotline at 1-800-422-4453.
Here are some tips to prevent CSAM:
- Talk to your children about online safety and the dangers of CSAM.
- Teach your children about the importance of keeping their personal information private.
- Monitor your children's online activity.
- Be aware of the signs of CSAM, such as children being secretive or withdrawn, or having changes in their behavior.
- Report any suspected CSAM to the authorities immediately.
So far I have not seen such disgusting material, but I’m saving this comment in case I ever need the information.
Are there any other numbers or sites people can contact in countries other than the USA?
The amount of people in these comments asking the mods not to cave is bonkers.
This isn’t Reddit. These are hobbyists without legal teams to a) fend off false allegations or b) comply with laws that they don’t have any deep understanding of.
This is flat out disgusting. It's extremely questionable that someone has an arsenal of this crap to spread in the first place. I hope they catch charges.
See that’s the part of this that bothers me most… Why do they have so much of it? Why do they feel comfortable letting others know they have so much of it? Why are they posting it on an open forum?
The worst part is, there is not a single god damn answer to ANY of those that wouldn’t keep a sane person up at night… shudder
deleted by creator
You must realize the logical fallacy in that statement, right?
Not that I’m familiar with Rust at all, but… perhaps we need to talk about this.
The only thing that could have prevented this is better moderation tools. And while a lot of the instance admins have been asking for this, it doesn't seem to be on the developers' roadmap for the time being. There are just two full-time developers on this project and they seem to have other priorities. No offense to them but it doesn't inspire much faith for the future of Lemmy.
Let's be productive. What exactly are the moderation features needed, and what would be easiest to implement into the Lemmy source code? Are you talking about a mass ban of users from specific instances? A ban of new accounts from instances? Like, what moderation tool exactly is needed here?
Speculating:
- Restricting posting from accounts that don't meet some adjustable criteria, like account age, comment count, prior moderation action, or average comment length (an upvote quota maybe not, because not all instances use it).
- Automatic hash comparison of uploaded images against a database of registered illegal content (a rough sketch of what that check could look like is below).
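For illustration only, a minimal sketch of that second item, assuming the instance keeps a flat file of known-bad SHA-256 hashes (real services such as PhotoDNA use perceptual matching rather than exact file hashes, and the file name and function names here are invented):

```python
import hashlib

# Hypothetical local list of known-bad hashes; a real deployment would query
# a vetted service (PhotoDNA, CSAI Match, NCMEC lists) rather than a flat file.
with open("known_bad_sha256.txt") as f:
    KNOWN_BAD = {line.strip() for line in f if line.strip()}

def is_known_bad(upload: bytes) -> bool:
    """True if the uploaded bytes exactly match a known-bad hash."""
    return hashlib.sha256(upload).hexdigest() in KNOWN_BAD

def handle_upload(upload: bytes) -> bool:
    # Reject before the file is ever written to the instance's storage.
    if is_known_bad(upload):
        # ...ban the account and alert the admins / authorities here...
        return False
    return True
```

The obvious weakness, raised further down the thread, is that an exact hash changes completely if even one bit of the file changes.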
On various old-school forums, there's a simple (and automated) system of trust that progresses from new users (who might be spam), where every new user might need a manual "approve post" before it shows up (and this existed in some Reddit communities too), up to full powers granted to the user eventually (or, in the case of StackOverflow, automated access to the moderator queue).
Could they not just change one pixel to get another hash?
I guess it’d be a matter of incorporating something that hashes whatever it is that’s being uploaded. One takes that hash and checks it against a database of known CSAM. If match, stop upload, ban user and complain to closest officer of the law. Reddit uses PhotoDNA and CSAI-Match. This is not a simple task.
None of that really works anymore in the age of AI inpainting. Hash / perceptual matching worked well before, but the people doing this are specifically interested in causing destruction and chaos with this content; they don't need it to be authentic to do that.
It's a problem that requires AI on the defensive side, but even that is just going to be an eternal arms race. This problem cannot be solved with technology, only mitigated.
The ability to exchange hashes on moderation actions against content may offer a way out, but it will change the decentralized nature of everything, basically bringing us back to the early days of Usenet, the Usenet Death Penalty, etc.
Not true.
A simple CAPTCHA got rid of a huge set of idiotic script-kiddies. CSAM being what it is, could (and should) result in an immediate IP ban. So if you’re “dumb” enough to try to upload a well-known CSAM hash, then you absolutely deserve the harshest immediate ban automatically.
You’re pretty much like the story of the economist who refuses to believe that $20 exists on a sidewalk. “Oh, but if that $20 really existed on the sidewalk there, then it would have been arbitraged away already”. Well guess what? Human nature ain’t economic theory. Human nature ain’t cybersecurity.
Idiots will do dumb, easy attacks because they’re dumb and easy. We need to defend against the dumb-and-easy attacks, before spending more time working on the harder, rarer attacks.
You don’t get their ip when they post from other instances. I’m surprised this hasn’t resulted in defed.
Well, my home instance has defederated from lemmy.world due to this, that’s why I had to create a local account here.
I mean defedding the instances the CSAM is coming from but also yes.
Couldn’t one small change in the picture change the whole hash?
Good question. Yes. Also, artefacts from compression can fuck it up. However, hash comparison returns a percentage of match; if the match is good enough, it is CSAM, so go ahead and ban. There is a bigger issue, however, for the developers of Lemmy, I assume. It is a philosophical clusterfuck: if we elect to use PhotoDNA and CSAI Match, Lemmy is now at the whims of Microsoft and Google respectively.
Mod tools are not Lemmy. Give admins and mods an option, even a paid one. Hell, admins of Lemmy.world could have us donate extra to cover the costs of API services.
I agree. Perhaps what the Lemmy developers could do is put a slot for generic middleware in front of whatever POST request the Lemmy API uses for uploading content. That way, the owner of an instance can plug in whatever CSAM-scanning middleware they want, and we are not dependent on the Lemmy developers for a solution to the pedo problem.
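To make the idea concrete (this is not Lemmy's actual API; the class and hook shape here are invented for illustration), a pluggable scanning slot in front of the upload handler might look roughly like this:

```python
from typing import Callable, List

# A "scanner" is any callable that takes the raw upload and returns True
# if the upload should be blocked. Instance owners register whichever
# scanners (hash-list check, PhotoDNA client, AI classifier, ...) they trust.
Scanner = Callable[[bytes], bool]

class UploadPipeline:
    def __init__(self) -> None:
        self.scanners: List[Scanner] = []

    def register(self, scanner: Scanner) -> None:
        self.scanners.append(scanner)

    def handle(self, upload: bytes) -> str:
        # Every registered scanner runs before the file touches storage.
        if any(scanner(upload) for scanner in self.scanners):
            return "rejected"
        # ...otherwise store the file and create the post as normal...
        return "accepted"

# Example: an instance owner plugs in their own placeholder check.
pipeline = UploadPipeline()
pipeline.register(lambda data: len(data) == 0)
```

The point of the design is that Lemmy itself would only need to expose the hook; which scanner sits behind it stays the instance owner's choice.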
One bit, in fact. Luckily there are other ways of comparing images without actually showing them to human eyes that allow you to calculate a percentage of similarity.
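For the curious, here is a toy version of one such fingerprint, an "average hash" (this assumes Pillow is installed; production systems like PhotoDNA are far more sophisticated, but the principle of comparing compact fingerprints bit by bit is the same):

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to an 8x8 grayscale image, then set a bit for each pixel above the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def similarity_percent(a: int, b: int, bits: int = 64) -> float:
    """Percentage of fingerprint bits that match (100.0 = identical fingerprints)."""
    differing = bin(a ^ b).count("1")
    return 100.0 * (bits - differing) / bits
```

Because the fingerprint is built from the overall structure of the image, flipping one pixel barely moves it, which is exactly what makes this kind of comparison useful where a cryptographic hash fails.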
The best feature the current Lemmy devs could work on is making the process to onboard new devs smoother. We shouldn’t expect anything more than that for the near future.
I haven’t actually tried cloning and compiling, so if anyone has comments here they’re more than welcome.
I think it would be an AI autoscan that flags some posts for mod approval before they show up to the public, and perhaps more fine-grained controls for how media is posted, for instance only allowing certain image hosting sites and no directly uploaded images.
Reddit had automod which was highly configurable.
Reddit automod is also a source for all the porn communities. Have you ever checked automod comment history?
Yeah, I have. Like 2/3 of automod comments are in porn communities.
What? Reddit automod is not a source of porn. What's happening is that there's a large quantity of content it reacts to in those communities.
It literally reads the config in your wiki and performs actions based on that. The porn communities using it are using it to moderate their subs. You can look at the post history: https://www.reddit.com/user/AutoModerator It is commenting on posts IN those communities as a reaction to triggers, but it isn't posting porn (unless they put that in their config).
Not worth it if you don't moderate on Reddit, but read the how-to docs for Reddit automod; it is an excellent tool for spam management, and the source from before Reddit acquired it (and made it shit) is open. https://www.reddit.com/wiki/automoderator/full-documentation
No shit, ya don’t say?
Where the hell do you think I got that list from? I literally filtered every single subreddit that AutoModerator replied in for like three months.
Bruh you’re preaching to the person that accumulated the data. That’s the data it puked up. I can’t help it that most of them happen to be filth communities.
So you should understand that what you said is invalid. Automod doesn't post porn without a subreddit owner configuring it to, and just because 2/3 of its comments are in NSFW subs doesn't mean it is posting that content, just that it works more there.
We could 100% take advantage of a similar tool, maybe with some better controls on what mods can make it do. I'm working to bring BotDefence to Lemmy because it is needed.
You completely missed the point.
By the statistics of the data I found, most of the subreddits using AutoModerator are filth communities.
So you can reverse that, check AutoModerator comment history, and find a treasure trove of filth.
I can’t help that these are the facts I dug up, but yeah AutoModerator is most active in porn communities.
Too stupid to argue with. You don’t even understand your own “data”.
I was just discussing this under another post and turns out that the Germans have already developed a rule-based auto moderator that they use on their instance:
https://github.com/Dakkaron/SquareModBot
This could be adopted by lemmy.world by simply modifying the config file.
Probably hashing and scanning any uploaded media against some of the known DBs of CSAM hashes.
Iirc that’s how Reddit/FB/Insta/Etc. handle it
There are just two full-time developers on this project and they seem to have other priorities. No offense to them but it doesn’t inspire much faith for the future of Lemmy.
This doesn't seem like a respectful comment to make. People have responsibilities; they aren't paid for this. It doesn't seem fair to criticize something when we aren't doing anything to provide a solution. A better comment would be: "there are just 2 full-time developers on this project and they have other priorities. We are working on increasing the number of full-time developers."
Imagine if you were the owner of a really large computer with CSAM in it. And there is in fact no good way to prevent creeps from putting more into it. And when police come to have a look at your CSAM, you are liable for legal bullshit. Now imagine you had dependents. You would also be well past the point of being respectful.
On that note, the captain db0 has raised an issue on the github repository of LemmyNet, requesting essentially the ability to add middleware that checks the nature of uploaded images (issue #3920 if anyone wants to check). Point being, the ball is squarely in their court now.
I think the FBI or equivalent keeps a record of hashes for known CSAM, and middleware should be able to compare against that. Hopefully, if a match is found, kill the post and forward all info on to LE.
Interesting. But aren’t hashes unique to a specific photo? Just a single change to the photo would inevitably change its hash.
I think Apple was going to implement a similar system and deploy it to all iPhones/Macs in some iOS/macOS update. However, it was eventually 86'd due to privacy concerns from many people and the possibility of abuse and/or false positives.
A system like this might work on a small scale though as part of moderating tools. Not sure where you would get a constantly updated database of CSAM hashes though.
Interesting. But aren’t hashes unique to a specific photo? Just a single change to the photo would inevitably change its hash.
Most people are lazy and stupid, so maybe hash checking is enough to catch a huge portion (probably more than 50%, maybe even 80% or 90%?) of the CSAM, from posters who don't bother (or don't know how) to do that?
I'm almost positive they've been developing an image recognition AI that will make slightly altering CSAM photos obsolete.
Here’s hoping.
A hash would change if even one bit changed in that file. This could be from corruption, automated resizing by any photo processing tools (i.e., most sites will resize photos if you give them one that's too big), saving a lossy file time and again (adding more JPEG artifacts), etc. This is why there aren't many automated tools for this detection. Sites that have tried by using skin tones in a photo have failed spectacularly.
I’ve never heard of this FBI middleware. Does anyone have the link to this? I’d like to understand what tools are available to combat this as I’ve been considering starting my own instance for some time now.
In my utopian world, the FBI has a team updating the DB.
The utopian algorithm would fingerprint multiple subsets of the picture so cropping or watermarking wouldn't break the test (assuming the "crux" of the CSAM would most likely be unaltered?), and maybe handle simple image transformations (color, tint, gamma, etc.) with a formula. A rough sketch of the subset idea is below.
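A purely illustrative sketch of that "multiple subsets" idea, reusing the same average-hash trick shown earlier (the grid size and threshold are arbitrary assumptions): fingerprint each tile of the picture separately, so a crop or a corner watermark only invalidates some tiles instead of the whole match.

```python
from PIL import Image  # pip install Pillow

def tile_fingerprints(path: str, grid: int = 3):
    """Yield a small average-hash fingerprint for each tile of a grid x grid split."""
    img = Image.open(path).convert("L")
    w, h = img.size
    for row in range(grid):
        for col in range(grid):
            box = (col * w // grid, row * h // grid,
                   (col + 1) * w // grid, (row + 1) * h // grid)
            pixels = list(img.crop(box).resize((8, 8)).getdata())
            mean = sum(pixels) / len(pixels)
            yield tuple(1 if p > mean else 0 for p in pixels)

def looks_like_reference(path: str, reference_tiles: set, threshold: int = 4) -> bool:
    """Flag the image if enough of its tile fingerprints appear in the reference set."""
    hits = sum(1 for fp in tile_fingerprints(path) if fp in reference_tiles)
    return hits >= threshold
```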
What you’re talking about is digital (aka forensic) watermarking.
IMO scanning images before posting them to a forum is a distinct and utterly different world from having your photo collection scanned, especially in context and scale.
I agree with you, I’d just gently suggest that it’s borne of what is probably significant upset at having to deal with what they’re having to deal with.
I mean, the “other priorities” comment does seem to be in bad taste. But as for the comment on the future of Lemmy, I dunno. I feel like they’re just being realistic. I think the majority of us understand the devs have lives but if things don’t get sorted out soon enough it could impact the future of Lemmy.
Thing is, if this continues to be a problem and if the userbase/admins of instances are organised, we can shift those priorities. They may not have envisioned this being a problem with the work they decided to work on for the next several months. Truly, the solution is to get more developers involved so that more can happen at once.
Seriously. We need to cut them some slack because nobody expected Reddit to go full Elon in May.
Exactly, and Mastodon had been kinda gunning for Twitter for years before Elon went full Elon, so they were primed for the influx. Lemmy, I think, expected to have years to go before its userbase would similarly skyrocket.
Yeah, Reddit was famously open to third party developers for 15 years or so, and now they and their bootlickers are claiming they didn’t know that there were third party apps using the API to browse the whole site.
Even the Apollo dev said nothing but good things about Reddit because they were very transparent with him until they decided to paywall the API. Nobody saw this coming.
Wish I was a dev. I’d jump in to help so fast.
Maybe you could start by making pull requests to help, and maybe also by writing them an application on Matrix. I'm not being snarky, just pointing out that it's easier to help than you might think.
I think taco butt plug meant that they aren’t a developer, like at all, so can’t help with coding or PRs or anything.
Fair enough.
I have no idea what a pull request or matrix is but I’ll start reading about them.
Matrix is a secure chat protocol used by the devs to message each other.
A pull request is a way of proposing and contributing code on git-based platforms like github, gitlab, and codeberg.
Yea, thank you. I found the github list but yea… guess it’s a good time to learn!
You don’t become a developer by wishing. Here’s a tutorial if you want to learn
(edit: Rust, not Go)
Thank you!!!
I can't seem to find the AMA thread from the devs, but I remember they said they actually are being paid by some Dutch organisation.
Funded by https://nlnet.nl/
plus ~$4,400/month in donations from
https://opencollective.com/lemmy
https://www.patreon.com/dessalines
https://liberapay.com/dessalines/
https://liberapay.com/nutomic/
They also take Bitcoin, Ethereum, Monero and Cardano.
DEVELOPERS produce a software to help people post images and text online. Nothing bad about that.
ADMINS install the developers software on a server and run it as an instance.
MODS (if any exist besides the admin) moderate the instance to keep illegal content off the site.
USERS may choose to use the software to post CSAM.
None of these groups of people have paid for or are getting paid for their time. USERS generally don’t take much legal risk for what’s posted, as instance owners don’t ask for personally identifiable information from users.
Sites like Reddit, although we all hate it, do make a profit, and some of that profit is used to pay "trust and safety" teams who are paid (generally not very well, usually in underdeveloped or developing countries) to wade through thousands of pictures of CSAM, SA, DV/IPV and other violent material, taking it down as it gets posted to Facebook, Reddit, and other major online properties.
—-
Developers, admins and mods are generally doing this in their free time. Not sure how many people realize this but developers, admins and mods are also people who need to eat - developers have a skill of developing software, so many open source devs are also employed and contribute to open source in their off time. Admins may be existing sysadmins at companies but admin lemmy instances in their off time. Mods do it to protect the community and the instance itself.
USERS can be a bit self-important at times. We get it, you all generate the content on this site. Some content isn't just unwanted though, it's illegal, and if not responded to quickly it could mean not only a shut-down instance but also possible jail time for admins, who ultimately will be the ones deemed to be running either a "reddit-like site" or "a haven for child porn".
Good bot
People have responsibilities
Exactly - when you create a site, you have a responsibility to make sure it’s not used to distribute child porn.
That burden should not rest on 2 people.
Then the logical conclusion is that the 2 people should find some other people to share the burden.
I really don’t see how my statement is controversial. This is sadly how the internet works, regardless of how much or how little you can invest into your site - you need mechanisms to fight off against such spam and malice.
I hope the devs take this seriously as an existential threat to the fediverse. Lemmyshitpost was one of the largest communities on the network both in AUPH and subscribers. If taking the community down is the only option here, that’s extremely insufficient and bodes death for the platform at the hands of uncontrolled spam.
Fucking bastards. I don’t even know what beef they have with the community and why, but using THAT method to get them to shut down is nothing short of despicable. What absolute scum.
Please get some legal advice, this is so fucked up.
Genuine question: won’t they just move to spamming CSAM in other communities?
With how slow Lemmy moves anyways, it wouldn’t be hard to make everything “mod approved” if it’s a picture/video.
Or it could even just ask 50 random instance users to approve it. To escape this, >50% of accounts would have to be bots, which is unlikely.
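A toy sketch of that quorum idea (every name here is invented): sample reviewers at random from established accounts and only publish once a majority of the sample approves, so an attacker would need to control most of the eligible accounts to wave their own post through.

```python
import random

def pick_reviewers(eligible_accounts: list, sample_size: int = 50) -> list:
    """Randomly sample reviewers so an attacker can't predict who sees the post."""
    return random.sample(eligible_accounts, min(sample_size, len(eligible_accounts)))

def may_publish(approvals: int, sample_size: int = 50) -> bool:
    """Require a simple majority of the sampled reviewers to approve."""
    return approvals > sample_size // 2
```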
Sounds like the 4chan raids of old.
Batten down, report the offenders to the authorities, and then clean up the mess!
Good job so far.
We have been fighting the CSAM (Child Sexual Abuse Material) posts all day but there is nothing we can do because they will just post from another instance since we changed our registration policy.
It's likely that we'll be seeing a large number of instances switch to whitelist-based federation instead of the current blacklist-based one, especially for niche instances that do not want to deal with this at all (and I don't blame them).
I assume you’ve contacted the FBI, but if not PLEASE DO.
How does closing lemmyshitpost do anything to solve the issue? Isn’t it a foregone conclusion that the offenders would just start targeting other communities or was there something unique about lemmyshitpost that made it more susceptible?
It stops their instance from hosting CSAM and removes the legal liability of dealing with something they don't have the capacity to handle at this point in time.
How would you respond to having someone else forcibly load up your pc with child porn over the Internet? Would you take it offline?
How would you respond to having someone else forcibly load up your pc with child porn over the Internet? Would you take it offline?
But that’s not what happened. They didn’t take the server offline. They banned a community. If some remote person had access to my pc and they were loading it up with child porn, I would not expect that deleting the folder would fix the problem. So I don’t understand what your analogy is trying to accomplish because it’s faulty.
Also, I think you are confusing my question as some kind of disapproval. It isn’t. If closing a community solves the problem then I fully support the admin team actions.
I’m just questioning whether that really solves the problem or not. It was a community created on Lemmy.world, not some other instance. So if the perpetrators were capable of posting to it, they are capable of posting to any community on lemmy.world. You get that, yeah?
My question is just a request for clarification. How does shutting down 1 community stop the perpetrators from posting the same stuff to other communities?
Fact of the matter is that these mods are not lawyers, and even if they were not liable, they would not have the means to fight this in court if someone falsely, or legitimately, claimed they were liable. They're hobbyists with day jobs.
I also mod a few large communities here, and if I’m ever in that boat, I would also jump. I have other shit to do, and I don’t have the time or energy to fight trolls like that.
If this was Reddit, I’d let all the paid admins, legal, PR, SysOps, engineers and UX folks figure it out. But this isn’t Reddit. It’s all on the hobbyist mods to figure it out. Many are not going to have the energy to put up with it.
It’s not meant to solve the problem, it’s meant to limit liability.
How does it limit liability when they could continue posting that content to any/every other community on lemmy.world?
But it does remove the immediate issue of CSAM coming from shitpost so world isn’t hosting that content.
It doesn't solve the bigger moderation problem, but it solves the immediate issue for the mods who don't want to go to jail for modding a community hosting CSAM.
Doesn’t that send a clear message to the perpetrators that they can cause any community to be shut down and killed and all they have to do is post CSAM to it? What makes you or anyone else think that, upon seeing that lemmyshitpost is gone, that the perpetrators will all just quit. Was lemmyshitpost the only community they were able to post in?
Yup. The perpetrators win.
If you were in their shoes, would you want to risk going to jail for kiddy porn, risk having your name associated with CSAM online, or drain your personal savings account to fight these folks?
These mods are not protected by a well funded private legal team. This isn’t Reddit.
You don’t have to explain how liability works. I get it. What I don’t get is how removing that specific community is going to limit their liability when the perpetrators will just target a different community.
Sign-ups are now manual-approval applications, so no more automated sign-ups from them. If they have existing accounts and target another community, it'll be closed as well and those accounts banned. There isn't a stream of new accounts though, because all accounts going forward need to be manually approved.
One of the ways you avoid liability is you show that you’re actively taking measures to prevent illegal content.
The perps are taking a big risk as well. Finding and uploading CSAM means being in possession of it. So we can at least take solace in knowing it's not a tool that just anyone will use to take down a community.
Uploading to websites counts as distribution. The authorities will actually care about this. It's not just some small thing that is technically a crime; it's big-time crime being used for something petty.
So while the perp might win in the short term, they are risking their lives using this tactic. I'm not terribly worried about it becoming a common tactic.
If anything, if I were the one doing this, I'd be worried that I might be pissing off the wrong group of people. If they keep at it and become a bigger problem, everyone is going to be looking for them. And then that person is going to big boy prison.
That is a great point. I don’t know if the admin team are proactively reporting that activity to law enforcement, but I hope they are.
They also changed the account sign ups to be application only so people can’t create accounts without being approved.