Facebook's stance on sexualised images of children is hypocritical
Have you heard the one about the journalist and the paedophile on Facebook? When the journalist tried to raise the alarm about worrying pictures of children on the social network, Facebook reported him to the police.
It’s a terrible joke, but an even worse indictment of the response and responsibility of one of the most powerful media companies in the world to a reporter trying to raise valid public interest concerns.
It began with a BBC investigation in 2016, which found that paedophiles were using secret groups on Facebook to post and swap sexually suggestive images of children. Facebook promised improvements to its moderation policy, and said it was employing “thousands” of moderators to check the content 24/7.
Having decided to check this claim a year later, the BBC used Facebook’s own system for reporting abuse to flag about 100 images, only to find that just 18 were removed as a result. When the BBC approached the network about the findings, things took a decidedly Kafkaesque turn.
According to the BBC, Facebook’s director of policy Simon Milner agreed to be interviewed, but only on condition that the journalist provided examples of the material that had not been removed by moderators. When the BBC forwarded screengrabs of the images, Facebook’s legal team went straight to the police.
In a response to the Guardian, the social network did not confess to an error of judgment but insisted that the law demands a referral to the police when such images are “shared”. A spokesman said: “It is against the law for anyone to distribute images of child exploitation.”
In its defence, Facebook referred to the Pete Townshend case, in which the musician was arrested after accessing a website containing images of child abuse and subsequently insisted he was carrying out research for a campaign against internet porn involving children. But the BBC reporter in question, Angus Crawford, had been dealing with Facebook for a year. Facebook had no evidence of his interest in child pornography, but plenty of his interest in public service journalism.
This underlines how difficult it is to complain about disturbing content and then be assured that Facebook will respond
If nothing else, this episode underlines how difficult it is to complain about disturbing content and then be assured that Facebook will respond. The cynical among us could even look at Facebook’s behaviour and see an attempt to frighten off journalists who want to hold the powerful to account. Crawford, now possibly under investigation by the police, is an employee of the BBC, one of the few news outlets not increasingly dependent on Facebook for revenue.
In this case, Facebook’s reaction has been so bad that it has even made press regulator Ipso look good, a sentence I don’t often write. In 2015, a woman saw to her horror that pixelated photographs of children used by the Lancashire Evening Post to illustrate a story about a “paedophile website” included two of her own children, taken without her consent from a Facebook profile and recognised by friends. The photographs were not pornographic, merely “sensitive”, and the paper ran them heavily pixelated, “smaller than a postage stamp”, in its print edition only.
In its response to the mother’s complaint, Ipso admitted that the newspaper “had performed a valuable public service” with its report on the issue. It nevertheless upheld the complaint and ordered the Post to print its verdict.
Facebook’s response to such a story would be, understandably, that no one should post sensitive photos online without checking their privacy settings. Such a response is, of course, in line with the “individual responsibility” that runs riot online. And yet a company with revenues of $4bn last year cannot shift all the responsibility for what is happening on its site on to the users. It surely cannot believe that a police investigation into a news reporter doing his job is the best result of an attempt to test its claim that it “wants everyone to feel safe when using Facebook”.
Who here is providing a “valuable public service”? The website failing to take down objectionable images, or the reporter highlighting its failure?
In 2012, women’s groups complained at the Leveson inquiry about the use by British tabloids of sexualised images of women and children. Asked by Lord Justice Leveson what the difference was between pornographic publications and the tabloids in question, Anna van Heeswijk of Object, a human rights organisation set up to challenge the sexual objectification of women, said: “The difference is how they are regulated.” Mention the word “regulation” to Facebook and other members of the digerati and they will recoil, or laugh at the stupidity of the out-of-touch dinosaurs who just don’t get technology or complicated algorithms.
Yet this case proves that, instead of laughing, they should take responsibility for their technology and harness it to make sure complaints are acted on. If Facebook fails to do this, the law that means anyone sharing an image must be reported immediately should be used more effectively on the social media giant itself.