Social media giants 'shamefully far' from tackling illegal content
Social media firms are "shamefully far" from tackling illegal and dangerous content, says a parliamentary report.

Hate speech, terror recruitment videos and sexual images of children all took too long to be removed, said the Home Affairs Select Committee report.

The government should consider making the sites help pay to police content, it said.

But a former Facebook executive told the BBC the report "bashes companies" but offers few real solutions.

The cross-party committee took evidence from Facebook, Twitter and Google, the parent company of YouTube, for its report.

It said they had made efforts to tackle abuse and extremism on their platforms, but "nowhere near enough is being done".

The committee said it had found "repeated examples of social media companies failing to remove illegal content when asked to do so".

It said the largest firms were "big enough, rich enough and clever enough" to sort the problem out, and that it was "shameful" that they had failed to use the same ingenuity to protect public safety as they had to protect their own income.
"White Genocide" and "Ban Islam""White Genocide" and "Ban Islam"
Among the examples the committee found were:

The MPs said it was "unacceptable" that social media companies relied on users to report content, saying they were "outsourcing" the role "at zero expense".

Yet the companies expected the police - funded by the taxpayer - to bear the costs of keeping them clean of extremism.

The report's recommendations include:

"Social media companies' failure to deal with illegal and dangerous material online is a disgrace," said committee chairwoman Yvette Cooper.

Ms Cooper said the committee's inquiry into hate crime more broadly was curtailed when the general election was called and their recommendations had to be limited to dealing with social media companies and online hate.

Home Secretary Amber Rudd said she expected to see social media companies take "early and effective action" and promised to study the committee's recommendations.

In a statement Simon Milner, Facebook's policy director, said: "We agree with the Committee that there is more we can do to disrupt people wanting to spread hate and extremism online."

He said the social network was working with King's College, London and the Institute for Strategic Dialogue to make its efforts to curb hate speech more effective.

Mr Milner added that Facebook had developed "quick and easy ways" for people to report content so it could be reviewed and, if necessary, removed.
"We take this issue very seriously," a spokesman for Google said, adding that the firm would continue to address "these challenging and complex problems".
"We've recently tightened our advertising policies and enforcement; made algorithmic updates; and are expanding our partnerships with specialist organisations working in this field."
Twitter has not yet responded to a BBC request for comment.
The firms had previously told the committee that they worked hard to make sure freedom of expression was protected within the law.

Football parallels

A former European policy manager for Facebook, Luc Delany, told BBC Radio Four's Today programme the report had failed to look at more than a decade of work the industry had done with police and successive governments on the problem.

He said: "It bashes companies and gets a few big headlines for the committee but the solutions proposed don't really play out in reality."
However, committee member and Labour MP Naz Shah disagreed: "It's not headline-grabbing when people are making money off terrorist content, headline-grabbing when we've got child abuse online.
"Not working fast enough is not acceptable to the committee, to myself or anybody quite frankly.""Not working fast enough is not acceptable to the committee, to myself or anybody quite frankly."
Ms Shah suggested that internet companies giving money to the Met's counter-terrorism unit would be similar to football clubs contributing to the cost of policing matches.Ms Shah suggested that internet companies giving money to the Met's counter-terrorism unit would be similar to football clubs contributing to the cost of policing matches.
But Mr Delany rejected this, adding a football stadium was a "fixed place" with "one obvious and historic type of behaviour" whereas social media companies had hundreds of millions of users and hours of content.But Mr Delany rejected this, adding a football stadium was a "fixed place" with "one obvious and historic type of behaviour" whereas social media companies had hundreds of millions of users and hours of content.
The child protection charity NSPCC has also called for fines for social networks that fail to protect children.The child protection charity NSPCC has also called for fines for social networks that fail to protect children.
Internet companies' voluntary regulations on child protection are "not up to scratch", the charity's chief executive, Peter Wanless, said last week.Internet companies' voluntary regulations on child protection are "not up to scratch", the charity's chief executive, Peter Wanless, said last week.