Fragment of an article:

From Bad to Worse

Aside from simply blocking the domain and moving on, community members decided to have a little bit of extra fun, attempting to “make the crawler crash”, send angry emails to the service operator, and more. After some study of how the site worked, one person had the malicious idea to send a remote post containing child pornography to the site, before getting someone else to report Content Nation for Child Sexual Abuse Material.

(Screenshot of a post from "sash":

Hi #fediverse I wanted to let you know what happened to my site in the last hours. Some admins (not mentioning them here) thought: my service is a scraper. It's just an ActivityPub service like Mastodon with a search function and a web view. Some trolls took that as an excuse to start or abuse an existing mastodon instance and posted illegal stuff. Used my service to displayed that as web page and complained to my hoster. Now there is one federated service less. Admins, watch out for those users)

To be clear: someone searched a list of known illegal material, loaded that remote content onto Content Nation locally, and then put up a red flag for someone to file a report. Given the server’s jurisdiction being in Germany, this could have been catastrophic: Germany’s laws regarding CSAM stipulate a one-year prison term minimum for possession of this kind of material. (…)
Rob Hafner @tedivm@hachyderm.io:

I have had far more abuse on mastodon than I have had on any other network. I built a tool specifically because minority run instances were having trouble with ban evasion and requested this specific tool, only to get brigaded, lied about, and abused once I released it. It went well beyond "this tool is bad"- people made up lies about what it did, lies about who I am, and coordinated harassment campaigns.
No one in their right mind should write software for this platform.