How neo-Nazis may be making money from YouTube advertising
Extremists post videos to exploit Google AdSense and share revenues from advertisers such as Virgin Media, BT and O2
Neo-Nazis may be using YouTube's revenue-sharing system on adverts to obtain payments from companies such as Virgin Media, BT and O2 without their knowledge or consent.
By putting videos on YouTube, extremist groups including Blood & Honour and Combat 18 have been benefiting from the automatic addition of ads to their videos. Revenue-sharing agreements under Google's AdSense programme allow YouTube members posting non-copyrighted videos to benefit from ads that appear in a panel to the right of the videos.
Some of the ad revenue is paid to the video owner, and extremist groups have used this aspect of Google's business model to generate funding. When it was alerted to this, Google deleted the videos – but there is no indication it has put in place any protections to prevent a repetition.
Videos uploaded to neo-Nazi channels often appear to have the intention of rallying support by inciting hatred against minority and ethnic groups, despite YouTube's rules stating: "We do not permit hate speech (speech which attacks or demeans a group based on race or ethnic origin, religion, disability, gender, age, veteran status, and sexual orientation/gender identity)."
Google said the volume of uploads – with 60 hours of video uploaded every minute – meant nearly 10 years' worth of content was uploaded to the site every day. It does not screen content; instead it relies on users of the website to flag inappropriate videos as a form of self-regulatory crowd-sourced control.
"YouTube's community guidelines prohibit hate speech, and we encourage our users to flag material that they believe breaks the rules," a Google spokesperson said. "We review all flagged videos quickly, and we promptly remove material that violates our guidelines.
"Videos with ads showing because of content claims created by YouTube's automated content ID system are subject to the same removal policies after they've been flagged by users."
However, Combat 18 members or sympathisers who view such videos would have no incentive to flag the content, so repeated viewings would benefit the group through revenue sharing. Under the terms of AdSense, if non-copyrighted videos prove popular the user is invited to join Google's partner programme. YouTube channels are used to provide links to extremist materials and neo-Nazi websites, where discussion groups and literature can be accessed.
Included in these links are: the Turner Diaries, linked to a number of hate crimes such as the Oklahoma City bombing in 1995 by Timothy McVeigh; and the diaries of the white supremacist Kevin Harpham, sentenced to 32 years in prison in December for planting a backpack bomb on the route of the Martin Luther King Jr Day parade in Washington.
German police have been investigating the YouTube account of one of the National Socialist Underground members arrested in February in connection with the murder of 10 Turkish immigrants in a series of racist killings spanning almost 10 years. David Copeland, the London nail bomber, and Anders Breivik, who carried out the 2011 Norway attacks, are known to have sourced ideas and suspected to have received support from online communities.
When told its ads were being associated with neo-Nazi content, Virgin Media said: "Virgin Media has a strict policy on its ad placement, so we are concerned about ads appearing against unrelated and unsuitable content on YouTube. We are currently engaged with our advertising partners and Google to understand what measures can be put into place to prevent these occurrences going forwards."
In December, the actor, TV presenter and writer Stephen Fry – whose mother's aunt and cousins died in Auschwitz – tweeted: "Disturbing that such blue chip companies are, in a way, supporting neo-Nazi YouTube content."
Robert Levine, the former executive editor of Billboard magazine who writes about copyright and technology, says it is an ethical problem: "I've looked at these videos. It's very disturbing stuff.
"Like Google in general, YouTube hides its corporate irresponsibility behind freedom of speech. But there are times when it seems more interested in its own freedom to sell advertising."
The science fiction author and columnist Cory Doctorow argues against intervention. "I don't know if I agree with the underlying 'dangerous and irresponsible' premise. I'm a free speech advocate, and I believe that the answer to bad speech is more speech," he said.
"Despite having hailed from Canada, where we have hate speech laws, I've concluded that they're generally worse than their alternatives, and are generally used by powerful people against people with less power and that actual 'hate crimes' are generally crimes per se and don't need further 'supercrime' status in order to successfully prosecute them."
A YouTube spokesperson said: "Seeing ads next to videos on YouTube does not indicate that those content providers are making money as a member of our partner programme. Ads can also be shown next to videos if they contain material that rights holders like the music industry have registered through YouTube's automated Content ID system."
• This article was amended on 22 June 2012 to change the headline, which originally said "How neo-Nazis are making money from YouTube advertising" and the first paragraph, which began "Neo-Nazis are using YouTube's revenue-sharing system…", and to append a quote from a YouTube spokesperson received after the article was originally published.