FBXL Social

Just catching up with the WeDistribute.org articles about the harassment of fediverse developers. I noticed at the time the vitriol against Ryan for adding ATProto support to BridgyFed...

https://wedistribute.org/2024/02/tear-down-walls-not-bridges/

... and stuck up for him.

But I had no idea what was happening to that poor guy from ContentNation;

https://wedistribute.org/2024/03/contentnation-mastodons-toxicity/

Seriously WTF?!?

The Mastodon inquisition is rapidly becoming the Israel of the fediverse. Using historical mistreatment of them and theirs, and the need to defend themselves from further abuse, as an excuse for some horrific behaviour of their own. E.g. trying to frame the ContentNation dev as a CSAM host.

This is beyond the pale. Something must be done.


@Hyolobrika @strypey

Wow. I just can't imagine what goes through these people's heads.

"Ha, I'm so pious and justified in everything I do, and I dislike what this person is doing, so there's nothing wrong with me literally uploading CSAM to his service in order to get him shut down"

They tell you that there is no neutral ground, that you have to pick a side, and you're either with them or against them.

Do you want to be on the side that posts fucking child porn to get what they want?

@Hyolobrika @strypey

Let me put it to you this way... I have some serious connections and can get access to some really fucked up shit, physical and digital, pretty readily.

Even given all that, I don't think I could get my hands on CSAM if I wanted to pull a stunt like that. So that makes me think a few thoughts about the crowd that's so anti-public-meaning-public, and just so happens to have CSAM on hand (*cough* mastodon.art)

Maybe... just maybe... They're trying to hide child abuse?

@Hyolobrika @strypey Anyway, this is why as:Public will **NEVER** store images. It's a storage nightmare.

We *may* do CLIP interrogation on every image to allow you to search a *description* of the image, but storing the image itself is a fucking no-go.
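For illustration, a minimal sketch of how that could work, assuming the Hugging Face transformers CLIP implementation: embed each image once, store only the embedding and the post URL, and match text queries against the stored vectors. The model choice and helper names here are illustrative, not as:Public's actual code.

```python
# Minimal sketch (not as:Public's actual code): index CLIP embeddings
# of fetched images so text queries can be matched against them later,
# without ever persisting the image bytes themselves.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

MODEL = "openai/clip-vit-base-patch32"  # illustrative model choice
model = CLIPModel.from_pretrained(MODEL)
processor = CLIPProcessor.from_pretrained(MODEL)

def embed_image(img: Image.Image) -> torch.Tensor:
    """Return a normalized CLIP embedding; the image is then discarded."""
    inputs = processor(images=img, return_tensors="pt")
    with torch.no_grad():
        feats = model.get_image_features(**inputs)
    return feats / feats.norm(dim=-1, keepdim=True)

def embed_text(query: str) -> torch.Tensor:
    inputs = processor(text=[query], return_tensors="pt", padding=True)
    with torch.no_grad():
        feats = model.get_text_features(**inputs)
    return feats / feats.norm(dim=-1, keepdim=True)

# The index maps a post URL to its embedding -- the only thing stored.
index: dict[str, torch.Tensor] = {}

def search(query: str, top_k: int = 10) -> list[tuple[str, float]]:
    """Rank indexed posts by cosine similarity to a text description."""
    q = embed_text(query)
    scores = [(url, float(q @ emb.T)) for url, emb in index.items()]
    return sorted(scores, key=lambda s: s[1], reverse=True)[:top_k]
```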

I'll also point out that all of these losers stopped targeting as:Public literally as soon as they figured out I wasn't going away.

@r000t @Hyolobrika @strypey mastodon is for sickos!!!

@r000t @Hyolobrika @strypey >Maybe... just maybe... They're trying to hide child abuse?
A few old-timers like @p could tell you about the reason why archiveteam.org abandoned their efforts at archiving fedi in its early days and how a certain instance targeting "LGBTQ+ youth" is related to that.

@jasdemi
> This disgusting behavior is something I haven't seen on other social media platforms.
> They pick a victim and harass them until they get their way.

I assume this behaviour is imported from centralised platforms, where the only possible way to get anything changed is to dogpile the platform operators. Some of the people doing it here may not know there are more constructive options available to them. Others may just be stuck in the habit. Yet others are just trolls, who get off on it.

@lamp
> mastodon is for sickos!!!

All generalisations are false ; )

... posted from a Mastodon account.

@r000t @Hyolobrika

@r000t
> just so happens to have CSAM on hand

I believe lolicon counts as CSAM in Germany, where CN is hosted, so that's not a high bar. I'm pretty sure anyone with a net connection could find lolicon if sufficiently motivated.

Also, differing age of consent laws create grey areas. In NZ it's 16, so legitimate porn made here with a 17yo performer could technically be considered CSAM in a country with an age of consent of 18 or over.

@Hyolobrika

@r000t
> Maybe... just maybe... They're trying to hide child abuse?

Let's not add fuel to the fire by throwing wild accusations around. Isn't that the kind of thing this thread is objecting to?

@Hyolobrika

@r000t
> as:Public will **NEVER** store images

as:Public?

@Hyolobrika

@strypey @Hyolobrika I mean, these are the same people who sent the FBI to my house twice over a search engine. I have actually run into CSAM on mastodon.art, where the admin actively refuses to do anything about it, choosing instead to harass me anywhere I have an account.

@strypey @Hyolobrika https://aspublic.org/

A search engine that sits on instances' websocket endpoints, so it doesn't need to scrape.
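As a sketch of that approach, this is roughly what listening on Mastodon's documented /api/v1/streaming websocket looks like, assuming the Python websockets library. The instance name and token below are placeholders, and on current Mastodon versions an access token is required for the public stream.

```python
# Minimal sketch: ingest one instance's public firehose over Mastodon's
# documented /api/v1/streaming websocket, rather than scraping HTML or
# polling REST pages. INSTANCE and ACCESS_TOKEN are placeholders; since
# Mastodon locked down public timeline access, the token is mandatory.
import asyncio
import json

import websockets  # pip install websockets

INSTANCE = "mastodon.example"
ACCESS_TOKEN = "..."

async def ingest():
    url = (f"wss://{INSTANCE}/api/v1/streaming"
           f"?stream=public&access_token={ACCESS_TOKEN}")
    async with websockets.connect(url) as ws:
        async for frame in ws:
            msg = json.loads(frame)
            if msg.get("event") == "update":         # a new public status
                status = json.loads(msg["payload"])  # payload is JSON-in-a-string
                print(status["url"], status["account"]["acct"])

asyncio.run(ingest())
```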

The existence of this search engine led to multiple police visits and the forced-disabling of public timeline access on Mastodon at the software level.

Remember that part of the allure of Mastodon in the first place was that Twitter was locking down APIs so they could be sold. Mastodon is now doing the same thing, causing the same problems, for "safety".

@mint
> A few old-timers like @p could tell you about the reason why archiveteam.org abandoned their efforts at archiving fedi in its early days

I would like to read more about this. I'd love for some of the public fediverse to be archived, but it's pretty obvious that it needs to be selective and opt-in. Given the ability to opt in to Mastodon posts being keyword searchable now, this has become more tractable.

@p @r000t @Hyolobrika

@r000t
> sent the FBI to my house twice over a search engine

I utterly condemn this, on so many levels. But two wrongs don't make a right. Let's not descend to their level.

(1/2)

@Hyolobrika

Although if this is the case;

@r000t
> I have actually run into CSAM on mastodon.art where the admin actively refuses to do anything about it

... I hope you reported it to the appropriate authorities. There is no excuse for hosting CSAM (1). The only effective way to prevent its production is old-fashioned detective work to find and free the children.

(1) CSAM is images of real humans. Lolicon, while distasteful, doesn't count, because no human is abused in its production.

@Hyolobrika

@r000t
> part of the allure of Mastodon in the first place was that Twitter was locking down APIs so they could be sold. Mastodon is now doing the same thing, causing the same problems, for "safety"

I guarantee I've been in as many frustrating arguments as you, with people who want the online equivalent of suspending object permanence so they can teleport, without any of the other logical consequences. This is a downside of trying to make decentralised social software ; )

(1/?)

@Hyolobrika

The root problem (no offence) is the lack of a pan-fediverse set of standards, defining what kinds of expectations can be met by social software, and how to label them so people can pick the settings that match their expectations.

I don't think it's unreasonable to want a posting type visible to anyone logged into a fediverse app, but not on the web. But "Public" is not a sensible word to use in describing that scope.
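To make that concrete, here is roughly what the scope looks like on the wire in ActivityPub, sketched as a Python literal. Addressing the special #Public collection is what makes a post world-readable; the field values here are illustrative.

```python
# Sketch of what "Public" means on the wire in ActivityPub. Anything
# addressed to the special collection below is world-readable; there is
# no standardized audience value for "visible to logged-in fediverse
# users but not the open web". Field values are illustrative.
PUBLIC = "https://www.w3.org/ns/activitystreams#Public"

note = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Note",
    "attributedTo": "https://instance.example/users/alice",
    "content": "Hello, fediverse",
    "to": [PUBLIC],  # world-readable, indexable, archivable
    "cc": ["https://instance.example/users/alice/followers"],
}
```

A "logged-in fediverse users only" scope would need a different, standardized audience value, which is exactly the pan-fediverse standard described above as missing.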

(2/2)

@r000t @Hyolobrika

@strypey @mint @r000t @Hyolobrika

> I would like to read more about this.

I'm not the expert, but I was there; the rough timeline was that they announced a closure, someone started archiving them, the adults running the site that billed itself as a place for LGBT kids to express their sexuality had things like half-dressed 12-year-old girls posting selfies in bed and saying they were "bored", and that's the public timeline. The adults that ran the site were enthusiastic about this until the archiving started. The justification for the freakout was "What if their parents go reading the archive of a website about gay children hornyposting, and then find their own kid, and then kick the kid out of the house?" There was enough of a backlash that people are still freaking out about "archiving".

> I'd love for some of the public fediverse to be archived, but it's pretty obvious that it needs to be selective and opt-in.

Does it? Once a piece of data leaves your computer, you cannot control it. The whole thing reeks of an attempt at preemptive damage control, content-licensing, whatever. When I stood up this machine and pointed fsebugoutzone.org at it, I wrote in the instance description that it was a temporary domain for FSE, I put in the blog post that this was a temporary domain for FSE, I put "FSE" in the domain, and several Mastodongs freaked out regardless, and screenshat and archive.is'd (Are they worried I'm going to take down the blog post? Why would I announce something that I didn't want people to know?), and I didn't opt in, I didn't agree to anything, I never gave my consent to be quoted or to have anything I wrote reproduced. Quite the opposite: these people believe that it's important that they keep a copy I *cannot* delete. This is not just ignoring consent, but actively rejecting it. It does not bother me, though: they have a right to remember what they want and if they want to use a machine to help them remember, do I have some kind of right to not be quoted, copied, screenshitted, etc.? On the balance, I think the people that are most worried about this are doing something shady and they don't want anyone to connect the dots. berries.space is one such example.

Aside from that, everything you say on here is archived indefinitely by large swaths of fedi: there are 29,508 active servers as of this minute, and a large number of them have copies of this entire conversation.

Not just that, but a public archive is beneficial to the public. We had that run-in with the FBI a while back: I was trying to figure out how pedophiles kept landing on FSE, and poring over the logs I see boardreader.com. I start tugging at that thread, and boardreader.com, owned by SocialGist, had been archiving the entirety of fedi, but only doing it by hammering FSE's public global timeline (through residential proxies and impersonating a browser), and then attributing everything on every server to FSE. So I complain, and very shortly after that I get an email from the FBI; apparently SocialGist had some agreement with the feds, and they were feeding data to the FBI, and the FBI wanted me to hand over information on a user on an external server, information I did not have.

You can also see these other sites: there are "AI" companies using fedi to train their models, search engines building indexes, and people archiving all of it for undisclosed reasons. What do they have? At least if there is a public archive, you can find out.

> Given the ability to opt-in to Mastodon posts being keyword searchable now, this has become more tractable.

It's a cruel joke. Gargrolon Mastodon is willing to sell a false sense of security for the sake of branding.