
You can find the current article at its original source at https://www.nytimes.com/2018/12/31/technology/facebook-suicide-screening-algorithm.html


In Screening for Suicide Risk, Facebook Takes On Tricky Public Health Role
A police officer on the late shift in an Ohio town recently received an unusual call from Facebook.
Earlier that day, a local woman wrote a Facebook post saying she was walking home and intended to kill herself when she got there, according to a police report on the case. Facebook called to warn the Police Department about the suicide threat.
The officer who took the call quickly located the woman, but she denied having suicidal thoughts, the police report said. Even so, the officer believed she might harm herself and told the woman that she must go to a hospital — either voluntarily or in police custody. He ultimately drove her to a hospital for a mental health work-up, an evaluation prompted by Facebook’s intervention. (The New York Times withheld some details of the case for privacy reasons.)
Police stations from Massachusetts to Mumbai have received similar alerts from Facebook over the last 18 months as part of what is most likely the world’s largest suicide threat screening and alert program. The social network ramped up the effort after several people live-streamed their suicides on Facebook Live in early 2017. It now uses both algorithms and user reports to flag possible suicide threats.
Facebook’s rise as a global arbiter of mental distress puts the social network in a tricky position at a time when it is under investigation for privacy lapses by regulators in the United States, Canada and the European Union — as well as facing heightened scrutiny for failing to respond quickly to election interference and ethnic hatred campaigns on its site. Even as Facebook’s chief executive, Mark Zuckerberg, has apologized for improper harvesting of user data, the company grappled last month with fresh revelations about special data-sharing deals with tech companies.
The anti-suicide campaign gives Facebook an opportunity to frame its work as a good news story. Suicide is the second-leading cause of death among people ages 15 to 29 worldwide, according to the World Health Organization. Some mental health experts and police officials said Facebook had aided officers in locating and stopping people who were clearly about to harm themselves.
Facebook has computer algorithms that scan the posts, comments and videos of users in the United States and other countries for indications of immediate suicide risk. When a post is flagged, by the technology or a concerned user, it moves to human reviewers at the company, who are empowered to call local law enforcement.
“In the last year, we’ve helped first responders quickly reach around 3,500 people globally who needed help,” Mr. Zuckerberg wrote in a November post about the efforts.
But other mental health experts said Facebook’s calls to the police could also cause harm — such as unintentionally precipitating suicide, compelling nonsuicidal people to undergo psychiatric evaluations, or prompting arrests or shootings.
And, they said, it is unclear whether the company’s approach is accurate, effective or safe. Facebook said that, for privacy reasons, it did not track the outcomes of its calls to the police. And it has not disclosed exactly how its reviewers decide whether to call emergency responders. Facebook, critics said, has assumed the authority of a public health agency while protecting its process as if it were a corporate secret.
“It’s hard to know what Facebook is actually picking up on, what they are actually acting on, and are they giving the appropriate response to the appropriate risk,” said Dr. John Torous, director of the digital psychiatry division at Beth Israel Deaconess Medical Center in Boston. “It’s black box medicine.”
Facebook said it worked with suicide prevention experts to develop a comprehensive program to quickly connect users in distress with friends and send them contact information for help lines. It said experts also helped train dedicated Facebook teams, who have experience in law enforcement and crisis response, to review the most urgent cases. Those reviewers contact emergency services only in a minority of cases, when users appear at imminent risk of serious self-harm, the company said.
“While our efforts are not perfect, we have decided to err on the side of providing people who need help with resources as soon as possible,” Emily Cain, a Facebook spokeswoman, said in a statement.
[Share an experience about Facebook’s suicide prevention program with The New York Times.]
In a September post, Facebook described how it had developed a pattern recognition system to automatically score certain user posts and comments for likelihood of suicidal thoughts. The system automatically escalates high-scoring posts, as well as posts submitted by concerned users, to specially trained reviewers.
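Facebook has not disclosed the model, the features it scores or the thresholds it uses, so any concrete rendering is guesswork. As a minimal sketch of the general score-and-escalate pattern described above (a classifier assigns each post a risk score; high scores and user reports route the post to a human reviewer), the logic might look like the following Python, where the phrase list, scoring rule and cutoff are all hypothetical stand-ins:

    from dataclasses import dataclass

    @dataclass
    class Post:
        post_id: str
        text: str
        user_reported: bool = False  # flagged by a concerned user

    # Hypothetical stand-in for a trained classifier; the real system
    # presumably uses a learned model over posts, comments and videos,
    # not keyword matching.
    RISK_PHRASES = ("kill myself", "end my life", "suicide")

    def risk_score(text: str) -> float:
        """Crude 0-to-1 score: fraction of risk phrases present."""
        lowered = text.lower()
        return sum(p in lowered for p in RISK_PHRASES) / len(RISK_PHRASES)

    ESCALATION_THRESHOLD = 0.3  # hypothetical cutoff

    def triage(post: Post) -> str:
        """Escalate user reports and high-scoring posts to human review."""
        if post.user_reported or risk_score(post.text) >= ESCALATION_THRESHOLD:
            return "escalate_to_human_reviewer"
        return "no_action"

In the company's account, the decision to contact emergency services rests with the trained human reviewers, not with the scoring step itself.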
“Facebook has always been way ahead of the pack,” said John Draper, director of the National Suicide Prevention Lifeline, “not only in suicide prevention, but in taking an extra step toward innovation and engaging us with really intelligent and forward-thinking approaches.” (Vibrant Emotional Health, the nonprofit group administering the Lifeline, has advised and received funding from Facebook.)
Facebook said its suicide risk scoring system worked worldwide in English, Spanish, Portuguese and Arabic — except in the European Union, where data protection laws restrict the collection of personal details like health information. There is no way of opting out, short of not posting on, or deleting, your Facebook account.
A review of four police reports, obtained by The Times under Freedom of Information Act requests, suggests that Facebook’s approach has had mixed results. Except for the Ohio case, police departments redacted the names of the people flagged by Facebook.
In one case in May, a Facebook representative helped police officers in Rock Hill, S.C., locate a man who was streaming a suicide attempt on Facebook Live. On a recording of the call to the police station, the Facebook representative described the background in the video — trees, a street sign — to a police operator and provided the latitude and longitude of the man’s phone.
The Police Department credited Facebook with helping officers track down the man, who tried to flee and was taken to a hospital.
“Two people called the police that night, but they couldn’t tell us where he was,” said Courtney Davis, a Rock Hill police telecommunications operator, who fielded the call from Facebook. “Facebook could.”
The Police Department in Mashpee, Mass., had a different experience. Just before 5:16 a.m. on Aug. 23, 2017, a Mashpee police dispatcher received a call from a neighboring Police Department about a man who was streaming his suicide on Facebook Live. Officers arrived at the man’s home a few minutes later, but by the time they got to him, he no longer had a pulse, according to police records.
At 6:09 a.m., the report said, a Facebook representative called to alert the police to the suicide threat.
Scott W. Carline, chief of the Mashpee Police Department, declined to comment on the case. But he said of Facebook, “I’d like to see them improve upon the suicide prevention tools they have in place to identify warning signs that could potentially become fatal.”
Facebook’s Ms. Cain said that, in some cases, help unfortunately did not arrive in time. “We really feel for those people and their loved ones when that occurs,” she said.
The fourth case, in May 2017, involved a teenager in Macon, Ga., who was streaming a suicide attempt. Facebook called the police after officers had already found the teenager at her home, a spokeswoman for the Bibb County sheriff’s office said. The teen survived the attempt.
Some health researchers are also trying to predict suicide risk, but they are using more transparent methodology and collecting evidence on the results.
The Department of Veterans Affairs has developed a suicide risk prediction program that uses A.I. to scan veterans’ medical records for certain medicines and illnesses. If the system identifies a veteran as high risk, the V.A. offers mental health appointments and other services. Preliminary findings from a V.A. study reported fewer deaths over all among veterans in the program compared with nonparticipating veterans.
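The V.A. has not published the program's internals in this account either, but the basic pattern it describes (scan a structured medical record for risk-associated medications and diagnoses, then flag high scores for outreach) can be sketched the same way; every factor name, weight and cutoff below is hypothetical:

    # All factor names, weights and the cutoff are hypothetical.
    RISK_MEDICATION_WEIGHTS = {"medication_a": 0.3, "medication_b": 0.2}
    RISK_DIAGNOSIS_WEIGHTS = {"depression": 0.4, "ptsd": 0.3}
    HIGH_RISK_CUTOFF = 0.6

    def record_risk(medications: list[str], diagnoses: list[str]) -> float:
        """Sum weights of known risk factors found in one record."""
        score = sum(RISK_MEDICATION_WEIGHTS.get(m, 0.0) for m in medications)
        score += sum(RISK_DIAGNOSIS_WEIGHTS.get(d, 0.0) for d in diagnoses)
        return score

    def flag_for_outreach(medications: list[str], diagnoses: list[str]) -> bool:
        """Flagged veterans are offered appointments and other services."""
        return record_risk(medications, diagnoses) >= HIGH_RISK_CUTOFF

The contrast the researchers draw is less about the scoring arithmetic than about process: the V.A. documents its method and studies outcomes, while Facebook discloses neither.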
In a forthcoming article in a Yale law journal, Mason Marks, a health law scholar, argues that Facebook’s suicide risk scoring software, along with its calls to the police that may lead to mandatory psychiatric evaluations, constitutes the practice of medicine. He says government agencies should regulate the program, requiring Facebook to produce safety and effectiveness evidence.
“In this climate in which trust in Facebook is really eroding, it concerns me that Facebook is just saying, ‘Trust us here,’” said Mr. Marks, a fellow at Yale Law School and New York University School of Law.
Facebook’s Ms. Cain disagreed that the program amounted to health screening. “These are complex issues,” she said, “which is why we have been working closely with experts.”
[If you have thoughts of suicide, call the National Suicide Prevention Lifeline at 1-800-273-8255 (TALK) or go to SpeakingOfSuicide.com/resources for additional resources.]