Publishers or platforms? Media giants may be forced to choose
Twitter has spent the past few days doing a good impression of a company hoping that the public outcry over rape threats against Caroline Criado-Perez would quietly go away. It is easy to see why. To adopt many of the measures being demanded by those who want social media giants to take more responsibility for the content their users produce would mean a huge shift in their business models – and even the way they define themselves.
Companies such as Twitter and Facebook are keen to describe their sites as enabling communications, rather than publishing content – a crucial distinction which means that they are not liable for trolling or abuse. But for anyone on the receiving end of violent abuse and threats of rape or murder, the sheer size and profitability of their operations must mean that this distinction is becoming increasingly untenable.
Both companies insist, though, that legally they are communications companies – mere conduits for information, which cannot be held liable for the content they carry – in the same way BT cannot be sued over obscene phone calls. However, both also operate teams to investigate "flagged" content and remove it where they feel it is justified.
Being a communications company rather than a publisher means significantly less responsibility and expense, because the company can claim to be a platform for discussion rather than a publisher of opinion that could be held to be libellous or threatening.
But that position is becoming less and less credible, said Charlotte Harris, a partner at law firm Mishcon de Reya. "With popularity and power comes responsibility. They want to make sure they can avoid responsibility for individual tweets and tweeters, but it's not enough to say that if someone breaks the terms of use their account will be suspended," she said.
"If they break the law, Twitter should be more helpful, and that goes for racial abuse and violence as well as the terrible abuse we have seen recently against women. When these issues come up time and time again over a number of years we have to try to address them – and Twitter hasn't."If they break the law, Twitter should be more helpful, and that goes for racial abuse and violence as well as the terrible abuse we have seen recently against women. When these issues come up time and time again over a number of years we have to try to address them – and Twitter hasn't.
"Twitter needs to be more responsible, to act against people who use the site to abuse people and protect those who are abused.""Twitter needs to be more responsible, to act against people who use the site to abuse people and protect those who are abused."
Harris said that Twitter's reputation as a democratic environment was being spoilt by people who were able to act irresponsibly, and that the company's lack of action encouraged that atmosphere, rather than educating its users. Where public opinion had been behind Twitter during the Ryan Giggs superinjunction case, Harris said that mood was now changing.
Harris's comments echo those of Andy Trotter, media spokesman for the Association of Chief Police Officers, who has also called for social media firms to act more responsibly when users break the law. "Hiding behind a veil of free speech is not teaching users what freedom of speech is really about. There's an eerie newness to these types of very public, very humiliating and very confident attacks."
Niri Shan, head of media law at Taylor Wessing, which represents some social media firms, said that the McAlpine case – in which the former Conservative chairman sued a number of people, including the Commons Speaker's wife Sally Bercow, over tweets following a Newsnight programme – was a turning point in public understanding of social media and the law. "McAlpine was seminal from a defamation standpoint because ordinary members of the public realised that if you tweet or retweet something defamatory you could be subject to a libel action," he said.
"There's a strategy of making an example of high profile cases, and that gets in to the public consciousness. People have to understand there are repercussions – they can't just fire off tweets.""There's a strategy of making an example of high profile cases, and that gets in to the public consciousness. People have to understand there are repercussions – they can't just fire off tweets."
Shan pointed to a case between Google and former Tory local councillor Payam Tamiz, who unsuccessfully sued Google for libel over comments published by a third party on its Blogger platform. The judge ruled that Google could not be regarded as a publisher under the established principles of common law, and that even if it was it would be exempt under regulation 19 of the European Union's electronic commerce directive 2002.
At present, Facebook has a team of moderators who examine images which have been flagged by users, and then decide whether they are permitted on its networks. The staff have to work to a huge guidance document which used to specify, for example, that pictures of bodily fluids (excepting semen) were permitted, but not any showing breastfeeding mothers.
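To make the shape of that process concrete, here is a minimal sketch in Python of a flag-and-review queue of the kind described above. Everything in it – the class names, the idea of a policy dictionary standing in for the guidance document – is invented for illustration, not a description of Facebook's actual system.

    from collections import deque
    from dataclasses import dataclass, field

    @dataclass
    class FlaggedItem:
        item_id: str
        reporter_id: str
        reason: str  # e.g. "harassment", "hate_speech", "nudity"

    @dataclass
    class ReviewQueue:
        # Stand-in for the guidance document: report reason -> is it permitted?
        policy: dict
        pending: deque = field(default_factory=deque)

        def flag(self, item: FlaggedItem) -> None:
            # Users flag content; nothing is removed until a human reviews it.
            self.pending.append(item)

        def review_next(self) -> str:
            # A moderator applies the written guidelines to each item in turn.
            if not self.pending:
                return "idle"
            item = self.pending.popleft()
            return "keep" if self.policy.get(item.reason, False) else "remove"

The real guidance document runs to many pages of edge cases; a dictionary lookup stands in for it here only to show where the human judgment sits – after the flag, before the removal.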
Facebook spokesman Iain Mackenzie said the quality of engagement on the site was improved by operating a "real identity culture". "Compelling people to be themselves removes one of the key factors that enables many trolls – namely the ability to hide behind a pseudonym. Public opinion is enough to dissuade most people from engaging in aggressive behaviour."
Mackenzie said false accounts are deleted, and pointed to the site's second tier of protection: its community standards and statement of rights and responsibilities, which outline the site's rules on hate speech, bullying, threats and harassment.
"Every piece of content on Facebook has an associated 'report' option that escalates it to your user operations team for review. Additionally, individuals can block anyone who is harassing them, ensuring they will be unable to interact further. Facebook tackles malicious behaviour through a combination of social mechanisms and technological solutions appropriate for a mass scale online opportunity," he said."Every piece of content on Facebook has an associated 'report' option that escalates it to your user operations team for review. Additionally, individuals can block anyone who is harassing them, ensuring they will be unable to interact further. Facebook tackles malicious behaviour through a combination of social mechanisms and technological solutions appropriate for a mass scale online opportunity," he said.
The scale of both Facebook and Twitter, which have 1 billion and 200 million global users respectively, makes the task of proactively moderating content almost impossible. If both sites were classified as publishers, and consequently placed under a heavier legal obligation to remove offensive content, the cost and resources required would be extensive.
Facebook would not comment on the current number of moderators but said that every report was reviewed by a person. "There is always an active debate about what content is appropriate to be shared on Facebook and every policy has its fringe cases that challenge us and the guidelines we work to; different things will shock, challenge and occasionally offend different people. The protection of individuals is more fixed. Our yardstick will always be to guard people from threats, persecution and intimidation while using the service," Mackenzie said.
Twitter has declined to give any detail about how its moderators work, though its Trust & Safety team does have people worldwide so that it can cover different time zones. Each tweet or account that is reported is examined on a case-by-case basis.
The challenge the site constantly faces is how to make it easy to report abuse without making it too easy – which would lead either to abuse of the reporting system itself (mass reports used to force someone off Twitter) or to a volume of reports that becomes unmanageable.
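One way to see the tension is a small sketch of a report endpoint that tries to satisfy both constraints at once – a per-reporter rate limit to blunt weaponised mass-reporting, and an escalation rule so that genuinely abusive accounts still surface quickly. All thresholds here are invented for illustration; Twitter has published nothing about how its system actually works.

    import time
    from collections import defaultdict

    REPORT_WINDOW = 3600       # seconds; illustrative value
    MAX_PER_REPORTER = 10      # reports one account may file per window
    PRIORITY_THRESHOLD = 5     # distinct reporters before a human looks sooner

    recent_reports = defaultdict(list)  # reporter_id -> timestamps of their reports
    reporters_of = defaultdict(set)     # reported account -> distinct reporter ids

    def submit_report(reporter_id: str, target_id: str) -> str:
        now = time.time()
        # Keep only this reporter's submissions from the current window.
        recent = [t for t in recent_reports[reporter_id] if now - t < REPORT_WINDOW]
        if len(recent) >= MAX_PER_REPORTER:
            return "rate_limited"    # guards against gaming the report button
        recent.append(now)
        recent_reports[reporter_id] = recent
        reporters_of[target_id].add(reporter_id)
        if len(reporters_of[target_id]) >= PRIORITY_THRESHOLD:
            return "priority_review" # many independent reporters: review sooner
        return "queued"

Even this toy version exposes the dilemma: the rule that fast-tracks widely reported accounts can itself be gamed by a coordinated group, and the rate limit that prevents that also slows genuine victims.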
• This article was amended on 30 July 2013 to correct a quote by Iain Mackenzie