
You can find the current article at its original source at http://www.bbc.co.uk/news/technology-39744016


Social media giants 'shamefully far' from tackling illegal content
Social media firms are "shamefully far" from tackling illegal and dangerous content, says a parliamentary report.

Hate speech, terror recruitment videos and sexual images of children all took too long to be removed, said the Home Affairs Select Committee report.
It called for a review of UK laws and stronger enforcement around illegal material. The government should consider making the sites pay to help police what people post, it said.

But a former Facebook executive told the BBC the report "bashes companies" but offers few real solutions.
The cross-party committee took evidence from Facebook, Twitter and Google, the parent company of YouTube, for its report.

It said they had made efforts to tackle abuse and extremism on their platforms, but "nowhere near enough is being done".
'Meaningful fines'

The committee said it had found "repeated examples of social media companies failing to remove illegal content when asked to do so", including terrorist recruitment material, promotion of sexual abuse of children and incitement to racial hatred.
It said the largest firms were "big enough, rich enough and clever enough" to sort the problem out, and that it was "shameful" that they had failed to use the same ingenuity to protect public safety as they had to protect their own income.
"White Genocide" and "Ban Islam"
Among the examples the committee found were:
The MPs said it was "unacceptable" that social media companies relied on users to report content, saying they were "outsourcing" the role "at zero expense".

Yet the companies expected the police - funded by the taxpayer - to bear the costs of keeping them clean of extremism.

The report's recommendations include:

"Social media companies' failure to deal with illegal and dangerous material online is a disgrace," said committee chairwoman Yvette Cooper.
"They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse.
"Yet repeatedly, they have failed to do so. It is shameful."
Ms Cooper said the committee's inquiry into hate crime more broadly was curtailed when the general election was called and their recommendations had to be limited to dealing with social media companies and online hate.

Home Secretary Amber Rudd said she expected to see social media companies take "early and effective action" and promised to study the committee's recommendations.

Facebook, Twitter and Google did not respond to a BBC request for comment on the committee's findings.

The firms had previously told the committee that they worked hard to make sure freedom of expression was protected within the law.
Football parallels
A former European policy manager for Facebook, Luc Delaney, told BBC Radio 4's Today programme the report had failed to look at more than a decade of work the industry had done with police and successive governments on the problem.
He said: "It bashes companies and gets a few big headlines for the committee but the solutions proposed don't really play out in reality."
However, committee member and Labour MP Naz Shah disagreed: "It's not headline-grabbing when people are making money off terrorist content, headline-grabbing when we've got child abuse online.
"Not working fast enough is not acceptable to the committee, to myself or anybody quite frankly."
"Online safety is one of the biggest risks facing children and young people today and one which the government of the day needs to tackle head on," he added. Ms Shah suggested that internet companies giving money to the Met's counter-terrorism unit would be similar to football clubs contributing to the cost of policing matches.
But Mr Delaney rejected this, saying a football stadium was a "fixed place" with "one obvious and historic type of behaviour", whereas social media companies had hundreds of millions of users and hours of content.
The child protection charity NSPCC has also called for fines for social networks that fail to protect children.
Internet companies' voluntary regulations on child protection are "not up to scratch", the charity's chief executive, Peter Wanless, said last week.