You can find the current article at its original source at https://www.nytimes.com/2019/04/03/opinion/facebook-youtube-disinformation.html

Big Tech Was Designed to Be Toxic
When it comes to explaining why Facebook, YouTube and Twitter have become hotbeds for extremism, propaganda and bigotry, there’s a tendency to overcomplicate things.

That’s understandable. The algorithms that govern the platforms are unknowable trade secrets. There are, in some cases, billions of users to account for. There are meaty issues of free speech and copyright law playing out in real time across borders. Technology is confusing!

And, yes, it’s true that the tech companies are dealing with thorny problems that most likely have no universally satisfying outcome. Big Tech’s problems are indeed dizzying and manifold, but the last few years have taught us that there’s an Occam’s razor quality to any explanation of the toxicity of our online platforms. The original sin, it seems, isn’t all that complicated; it’s the prioritization of growth — above all else and at the expense of those of us who use the services.

The most recent example came on Tuesday morning when Bloomberg News published a story chronicling YouTube’s struggles to quash misinformation, conspiracies and incendiary content. According to the report, current and former YouTube employees said that the company had ignored warnings to change YouTube’s recommendation engine and that they were, in some cases, discouraged from seeking out videos that might violate YouTube’s rules to preserve a sense of plausible deniability.

Employee fears “were sacrificed for engagement.” Ambitious goals were set to hit one billion hours of viewing a day, and algorithms were tweaked accordingly. “We were so in the weeds trying to hit our goals and drive usage of the site,” one former senior manager confessed in the piece, “I don’t know if we really picked up our heads.”
The story shouldn’t be shocking. Reports from inside YouTube echo insider tales from other large tech platforms — a familiar pattern of ruthless optimization for users and of attention gone wrong.

The YouTube employee’s quote reminded me of a conversation I’d had a few years back with a former member of Twitter’s product team. “Leading up to I.P.O., it was all about revenue growth, from C.E.O. on down,” the employee told me, suggesting that, in a frantic race to look good for Wall Street, the company deprioritized fixing systemic issues, including its rampant harassment problem. Others inside Twitter back then agreed.

“I did see a lot of decisions being made in terms of growth when it came to how to handle abuse, which I get,” Leslie Miley, a former engineering manager at Twitter, told me at the time.

Growth at any cost. It’s a mantra familiar inside Facebook as well, as an internal memo surfaced last year by BuzzFeed News revealed.

“We connect people. Period. That’s why all the work we do in growth is justified,” a Facebook senior executive, Andrew Bosworth, wrote in the memo in June 2016. In a separate section, while advancing the argument that more human connections via Facebook are a “de facto good,” he appears to grapple with the downsides of connection. “Maybe it costs someone a life by exposing someone to bullies … maybe someone dies in a terrorist attack coordinated on our tools,” he wrote.
Mr. Bosworth has suggested the memo was a thought experiment. “I don’t agree with the post today and I didn’t agree with it when I wrote it,” he said.

But Facebook’s rapid rise to two billion-plus users, numerous privacy debacles and a steady stream of reported negative revelations suggest that, like its counterparts, the company’s quest for expansion trumped pressing concerns of privacy and transparency. A New York Times investigation last year reported that, “bent on growth,” Facebook executives “ignored warning signs” that Facebook could “disrupt elections, broadcast viral propaganda and inspire deadly campaigns of hate around the globe.”

Scale is also seductive at an engineering level, bottom line aside. Adding users and engagement, in one interpretation, might signal that you’re giving people what they want. In 2017, I asked a former senior Facebook employee if staff members had felt a sense of blame for Facebook’s inability to stop the spread of misinformation that plagued the platform during the 2016 election. Not exactly, the employee explained:

“They believe that to the extent that something flourishes or goes viral on Facebook — it’s not a reflection of the company’s role, but a reflection of what people want. And that deeply rational engineer’s view tends to absolve them of some of the responsibility, probably.”
We can see this sensibility today in the way the platforms tend to obfuscate and deflect responsibility. Just last week, a YouTube executive argued that its recommendation algorithms weren’t designed to nudge users toward more extreme videos. Similarly, Twitter has and will continue to argue it was not designed specifically to be disproportionately hostile to women and people of color. And Facebook will argue that it was certainly not designed to help foreign countries interfere in our elections.

But this defensive posture seems only concerned with intent. Even if we take the platforms at their word that they did not intend to profit from extremism or to become hubs for radicalization online, that doesn’t mean it doesn’t happen. Intent is far less important than the actual outcomes. And the outcomes appear to suggest that these platforms were intentionally designed to keep you glued to your screen for one more video, one more retweet, one more outraged share of a hate read.

How can the mess be unwound? Can it be fixed? Those are the complicated questions. But the answer as to how we got here in the first place is much less complicated than Big Tech wants you to think. Intentionally or not, it was designed this way.