This article is from the source 'guardian'. It last changed over 40 days ago and won't be checked again for changes.
You can find the current article at its original source at https://www.theguardian.com/technology/2018/may/09/santa-clarita-principles-could-help-tech-firms-with-self-regulation
The article has changed 5 times. There is an RSS feed of changes available.
Santa Clarita Principles could help tech firms with self-regulation
Social networks should publish the number of posts they remove, give users whose content is deleted a detailed explanation of why, and offer the chance to appeal against enforcement decisions, according to a groundbreaking effort to provide a set of principles for large-scale moderation of online content.
The Santa Clarita Principles, agreed at a conference in the Californian town this week, were proposed by a group of academics and non-profit organisations including the Electronic Frontier Foundation, ACLU, and the Center for Democracy and Technology.
They are intended to provide a guiding light for tech companies keen on self-regulation, akin to similar sets of principles established by other industries – most famously the Asilomar principles, drawn up in 1975 to regulate genetic engineering.
The principles are made up of three key recommendations: Numbers, Notice, and Appeal. “Companies should publish the numbers of posts removed and accounts permanently or temporarily suspended due to violations of their content guidelines,” the first principle advises.
Of the major content sites, only YouTube currently provides such a report, and in less detail than the principle recommends: it calls for information including the number of posts and accounts flagged and suspended, broken down by category of rule violated, format of content, and location, among other things. YouTube’s content moderation transparency report revealed the company removed 8.3m videos in the first quarter of 2018.
The second principle, Notice, recommends that “companies should provide notice to each user whose content is taken down or account is suspended about the reason for the removal or suspension.
“In general, companies should provide detailed guidance to the community about what content is prohibited, including examples of permissible and impermissible content and the guidelines used by reviewers.” Many companies keep such detailed guidelines secret, arguing that explaining the law lets users find loopholes they can abuse.
In 2017, the Guardian published Facebook’s community moderation guidelines, revealing some examples of how the company draws the line on sex, violence and hate speech. Last month, almost a year later, Facebook finally decided to publish the documents itself. Mark Zuckerberg said the publication was a step towards his goal “to develop a more democratic and independent system for determining Facebook’s community standards”.
Finally, the principles call for a right to appeal. “Companies should provide a meaningful opportunity for timely appeal of any content removal or account suspension.” Most companies allow for some sort of appeal, in principle, although many users report little success in overturning incorrect decisions in practice.
Instead, observers have noted that the press has increasingly become an independent ombudsman for large content companies, with many of the most flagrant mistakes only being overturned when journalists highlight them. Twitter, for example, “is slow or unresponsive to harassment reports until they’re picked up by the media,” according to BuzzFeed writer Charlie Warzel.
Facebook’s Zuckerberg has said he wants a more explicit appeals process. “Over the long term, what I’d really like to get to is an independent appeal,” he said, in an interview with Vox. “So maybe folks at Facebook make the first decision based on the community standards that are outlined, and then people can get a second opinion.
“You can imagine some sort of structure, almost like a supreme court, that is made up of independent folks who don’t work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world.”
None of Facebook, Google or Twitter commented for this article.