The curated feed, now run by a team of six moderators, is the meeting ground for hundreds of thousands of Black users on Bluesky. Is it ready to meet the moment?
The problem in many cases isn’t that they don’t literally see it but that they aren’t aware of what constitutes racism a lot of the time. That’s the primary issue here. That, and they don’t listen to those who have to endure the harassment, or don’t believe them.
The problem in many cases isn’t that they don’t literally see it but that they aren’t aware of what constitutes racism a lot of the time.
I agree with this part: “in many cases,” sure,
That’s the primary issue here.
…but I think this is a strong claim to make unless you have data to back it up.
I believe you and I are likely speaking from our own anecdotal experience on the platform, and for all we know, most people are in instance bubbles and are also speaking from their own perspectives.
If the “primary issue” is “why do some people not report seeing racism?” and the two possible explanations are either “they see it but are not aware” and “they actually never see it”, then unless we have accurate data from all those bubbles, we can’t make any claims about which is the real explanation.
But if you have data on this, that would change everything.
How can Mastodon fix this? How is this a Mastodon issue vs. any kind of social media?
Mastodon is open source, as well, right? It feels like someone should be able to fork it if they’re really ignoring useful features that would help people.
Yeah, it’s not a Mastodon issue any more than racist speech is an issue with our ability to vocalize as humans.
Similarly, the solution to people saying racist things isn’t for all speech to be policed by a central authority, it’s for societies themselves to learn to identify and reject racism.
I am glad you asked. Whilst it is primarily a social issue, the lack of good moderation tools ties into it.
The main developer behind Mastodon is well known for not giving a shit about moderation, and has attacked those who actually do good moderation, both in words and by making the moderation tools confusing and, in some cases, useless for users and admins alike.
The flagship instance of Mastodon proves this: it is still not behind any kind of verification when you try to sign up, meaning spam bots and the worst people can sign up openly. Instances and people who care about moderation don’t allow that, because checking that users aren’t nefarious is a good first step.
It also has many, many users, which is likely intentional: a lot of social media creators, both commercial and non-commercial, think it is a numbers game and that is all that matters, rather than creating something sustainable and pleasant.
This had a knock-on effect in that he changed many of the moderation tools he created. For example, user reports filed under a category like “I don’t like this” effectively go into the bin, never seen by admins or moderators. He partly runs one of the biggest, mostly unchecked and poorly moderated instances, and instead of moderating well he wanted to do less moderation work, which is a big fucking red flag that your instance is too big and you aren’t really concerned with moderation.
Not caring about moderation means that other instance admins and users have to do your work for you whilst you gleefully ignore all the problems you cause. That is especially true if you also develop the code, don’t give users or admins the tools they need, and keep making changes that turn the software from something unique and to be cherished into something more and more corporate. That seems to be exactly what he wanted in the first place: for it to be Twitter, but with a bit more care, and even that lessened as time went on.
So whilst it is primarily a social issue, tools help, as do the way the technology is thought about, the way it is presented to users and admins alike, and listening to what is useful and what is not.
That is what Mastodon could do: give a shit, stop messing with the moderation tools whilst developing better ones, and listen to users and admins, especially the most vulnerable and those who care about what is needed.
P.S. He also removed one of the timelines from the mobile app he developed for Mastodon, because on his instance it was a mess and so he thought it was useless, all because he refused to do good moderation. I think that says a lot as well.