Why You Shouldn’t Believe Tech Companies
This article is part of a limited-run newsletter. You can sign up here.
A few years ago, during an interview with Twitter’s C.E.O., Jack Dorsey, something broke in my brain. I was in San Francisco to grill him on a number of thorny issues about online harassment and content moderation. His answers, while aspirational, clarified very little and seemed to require blind good faith to accept.
As he spoke to me about the inevitable changes and evolution of the company that would, someday, lead to a platform full of healthy, productive conversations, I found my mind wandering. What if instead of relying on the inevitable march of progress, Dorsey and Twitter cut their losses and closed up shop (Ashley Feinberg, far braver than me, actually asked Dorsey this point-blank in a subsequent interview)? Would we be better off?
I’d been covering technology for a decade, and during the course of the interview I began to see Silicon Valley’s problems as far more existential than I’d thought. My model had shifted from “How can we coexist with these platforms in a healthy way?” to “Should we exist with them at all?”
This same question has been bouncing around my head throughout our Privacy Project. Since technology companies insidiously frame digital privacy as a trade-off, there’s an implication that we must sacrifice something precious to gain something precious. We take this as a given — the cost of doing business. The price of technological progress. But there’s no reason it has to be this way. Which brings me to two wonderful pieces I read this week.
The first is by Rose Eveleth in Vox on “the biggest lie tech people tell themselves.” In it, she argues that technology executives view and market invasive new technology — from smart diapers to facial recognition — “as inevitable evolutions” but in reality, they’re anything but. “Evolution doesn’t have meetings about the market, the environment, the customer base,” she writes. “Evolution doesn’t patent things or do focus groups. Evolution doesn’t spend millions of dollars lobbying Congress to ensure that its plans go unfettered.”
Eveleth points out that tech companies promote a narrative of evolution to coax us into desiring these new products. In this narrative, Amazon’s Echo is not just a plastic speaker with a robot voice and a few always-on microphones; it’s the inevitable march of civilization toward maximum utility and productivity. To oppose the product isn’t consumer choice but instead a darker form of Luddite-ism. It’s to be against human progress.
I felt this acutely during Amazon’s recent product launch, where the company proposed wiring every aspect of our lives with sensors and microphones in the name of progress. But none of this technology is inevitable. Yes, Facebook, Google, Amazon and others have captured powerful segments of the market and frequently use their muscle to dictate what the future ought to look like (Eveleth points out that “often consumers don’t have much power of selection at all”). But there’s something dangerous about buying into the idea that technology must evolve at the expense of our right to privacy.
There are ways to stop this so-called evolution, as M.I.T. Technology Review’s Angela Chen detailed last week in a piece called “This is how you kick facial recognition out of your town.” Chen offers a multipronged approach utilizing corporate pressure and civil rights law to ban facial recognition while the technology is still in its infancy.
As Chen’s piece suggests, stopping facial recognition isn’t easy. Even cities that have banned government use of the technology are only as strong as their weakest link, meaning that landlords and private companies can still surveil and potentially sell that information back to those who’ve been banned from using the tech. But technology doesn’t evolve on its own to mine more of our data; it’s engineered by companies to do so. Those companies, as Chen argues, are more responsive to vast corporate pressure than we might assume. The same goes for legislation and regulation.
Eveleth ends her piece with a call to action: “It’s time to question what ‘progress’ actually means.” Though my brain has been broken by a decade of studying the consequences of technology, her words still hit me hard. They are a powerful reminder for all of us. It doesn’t have to be this way. The surveillance state is not inevitable.
Speaking of the evolution of technology, today’s archive pick is a heady editorial from January 1999. It muses about human nature at the dawn of a new millennium and argues that it is perhaps more stable than we think.
This veers into Intro to Philosophy territory pretty quickly, but I appreciated the author’s musings on biological evolution and cultural evolution (the latter shaped by technology), and on how the former “works on a nearly geological time scale” (perhaps helpful for Silicon Valley to remember):
We are caught, in a sense, between two kinds of evolution. Biological evolution works on a nearly geological time scale, which suggests that human nature, as a partial product of our genes, is basically constant. Cultural evolution works with shocking swiftness, and so we assume that it is mainly a propulsive, liberating, even revolutionary force. But human culture has always usefully constrained human behavior as well as expressed it. No human society has ever tolerated the entire range of instinctive, “natural” human behavior. That selective intolerance is among the things we mean by civilization.
[If you’re online — and, well, you are — chances are someone is using your information. We’ll tell you what you can do about it. Sign up for our limited-run newsletter.]
Of all the tips included in this newsletter, this one might feel the most cathartic. It could also be one of the more important in terms of limiting your digital footprint. Recently, Google revamped its privacy tools, which include new options for deleting your information, including YouTube history, Maps location data and Google search history. The tools are easy to use and you can customize the controls to delete your information after a specific amount of time.
Here’s how. You can go here to see the activity (meaning searches and YouTube views) that Google is collecting. I found that this link is the quickest way to delete your data by category. I deleted my entire YouTube viewing history, and I set my new permissions to auto-delete my history after three months. I did the same for my Google search history. If you’re having trouble deleting, my newsroom colleague Brian X. Chen also has a handy guide.
What I like about these tools is the ability to keep my history around for a set period. That means when I’m searching Maps or working on a project, Google will remember some of my recent locations and terms and deliver me more relevant results. But there’s no reason it needs to have that information in perpetuity, so the ability to delete it gives me a little peace of mind.
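If it helps to picture what that auto-delete setting is doing, it amounts to a rolling retention window: anything older than the period you pick gets purged, and everything newer sticks around. Here is a minimal sketch of that idea in Python — an illustration of the policy, not Google’s actual code or any real API; the `ActivityRecord` type and the `purge_expired` helper are invented for the example.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical record standing in for one search or YouTube activity entry.
@dataclass
class ActivityRecord:
    description: str
    timestamp: datetime

def purge_expired(records: list[ActivityRecord],
                  retention_days: int = 90) -> list[ActivityRecord]:
    """Keep only records newer than the retention window (90 days here,
    roughly the 'three months' setting), modeling a rolling auto-delete."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=retention_days)
    return [r for r in records if r.timestamp >= cutoff]

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    history = [
        ActivityRecord("searched: privacy newsletter", now - timedelta(days=5)),
        ActivityRecord("watched: conference keynote", now - timedelta(days=200)),
    ]
    # Only the five-day-old entry survives the 90-day window.
    print([r.description for r in purge_expired(history)])
```

The appeal of this design is exactly what the paragraph above describes: recent activity stays useful for relevant results, while everything past the window ages out automatically instead of accumulating forever.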
I can’t recommend this tip enough. Yes, your history has been used for years to build a detailed advertising profile, and you can’t really change the fact that the information you coughed up to Google was sliced and diced. But you can take back a small bit of control and see that it’s not used in the future.
Privacy vs. Security: It’s a False Dilemma.
Most 2020 campaign websites lack key privacy, security safeguards.
The Messy Consequences of the Golden State Killer Case.
Like other media companies, The Times collects data on its visitors when they read stories like this one. For more detail please see our privacy policy and our publisher's description of The Times's practices and continued steps to increase transparency and protections.
Follow @privacyproject on Twitter and The New York Times Opinion Section on Facebook and Instagram.