It's a tale as old as the internet.
Social platforms like Facebook and YouTube continue rolling out content moderation tools meant to keep harmful content like hate speech, misinformation, and incitements to violence off their platforms. Reports show that users keep successfully posting terrible stuff anyway.
The latest example concerns antisemitic content. Antisemitism is on the rise around the world, and that rise is both reflected in and fueled by posts on social media.
As reported by the New York Times, two new studies — one from the Center for Countering Digital Hate and one from the Anti-Defamation League — show that social platforms including Facebook (and Instagram), Twitter, YouTube, Reddit, Twitch, TikTok, and Roblox (yes, even Roblox) removed only a small percentage of the content the organizations reported as antisemitic.
Specifically, the Center for Countering Digital Hate reports that Facebook removed just over 10 percent of the "hundreds" of pieces of content the researchers flagged, while Twitter acted on around 11 percent, YouTube on 21 percent, and TikTok on 18 percent.
The ADL's report had similar findings, though Twitter fared better there. The organization assigned each platform a grade based on its responsiveness to reports of hate speech: Twitter earned a B- (congrats!), Facebook and TikTok received C- grades, and Roblox got a dismal D.
The director of the ADL summed up the findings aptly for the Times: the results are depressing, but unsurprising.
That's because these reports are just the latest examples of how users keep successfully publishing content that social platforms say they prohibit, with objectionable content running the gamut from hate speech to COVID-19 misinformation to incitements to violence.
In late July, The Washington Post reported on studies showing how easily anti-vaccine propaganda and COVID-19 misinformation flourished on Facebook and YouTube. Countering this sort of content has been a priority for social platforms, but the content President Biden described as "killing people" is still thriving.
In April, BuzzFeed News shared the results of an internal Facebook report that detailed how "Stop the Steal" organizers successfully incited people to violence in Facebook groups and posts the platform failed to act on.
Thanks in part to the abhorrent rhetoric of politicians like Donald Trump, Asian people have faced a flood of hate and harassment on social media throughout the pandemic. As CNET reported in April, the ADL found that "17% of Asian Americans said in January they experienced severe online harassment compared with 11% during the same period last year, the largest uptick compared with other groups."
The content in all of these reports falls into categories that Facebook and, for the most part, other platforms ban in their community standards. And in the cases of antisemitism, COVID-19 misinformation, Stop the Steal organizing, and anti-Asian hate, posts on social networks all appear to correlate with real-world violence or death.
Facebook publishes a quarterly Community Standards Enforcement Report, which includes data on reports of prohibited content and on Facebook's responsiveness to them. The most recent report, released in May and covering the first three months of 2021, says the prevalence of hate speech is declining and now accounts for just "0.05-0.06%, or 5 to 6 views per 10,000 views." Facebook also says its advancements in AI have enabled it to "proactively detect" — rather than rely on user reports to catch — 97 percent of the hate speech content it removes.
The recent reports on antisemitic posts show the gaps in those figures, however. That 97 percent refers only to the posts the company removes, but there were still plenty of posts for the nonprofit organizations to find (and report themselves). Across the combined platforms, 7.3 million people viewed the antisemitic posts, the Center for Countering Digital Hate found.
The fact that we've heard this story before doesn't make it any less upsetting. These companies continually pledge to improve and acknowledge that they have work to do. In the meantime, people die.
from Mashable https://ift.tt/2TJHAte