Tech Design Ethicist Works To Raise Awareness Of Internet Addiction

Jul 10, 2017
Originally published on July 11, 2017 11:49 pm
Copyright 2017 NPR. To see more, visit http://www.npr.org/.

KELLY MCEVERS, HOST:

So here's the thing. When you think about Internet addiction, tech companies actually want people, not just teenagers, to spend time on their apps and their devices. And while that doesn't always lead to addiction, it can become a habit, sometimes an unhealthy one, for many people.

Tristan Harris knows all about this. He was a design ethicist at Google. He now runs a nonprofit that tries to raise awareness of this issue, and he's with us now. Welcome.

TRISTAN HARRIS: Thanks for having me.

MCEVERS: So first of all, what is a design ethicist?

HARRIS: Well, it's really studying how do you ethically persuade someone's mind because whether we want to or not, in the tech industry, a handful of technology companies and a handful of people are steering the thoughts, feelings and emotions that are going to show up in 2 billion people's minds today.

MCEVERS: Wow. So just give us an example of how tech companies design products that become hard to ignore, you know, that can become addictive.

HARRIS: Yeah, well, I mean in this story with this young woman, you know, you're talking about YouTube. So video sites are all competing for attention. So if you're YouTube and you say, well, let's get some more attention, let's autoplay the next video. And then when you autoplay the video, if that works well - let's say getting 5 percent more attention from people - then if Netflix or Facebook look at that, that's shrinking their attention market share. So Netflix needs to autoplay the next video on a countdown. And then that shrinks Facebook's market share, so Facebook looks at that and says, we have to autoplay all of the videos and keep the feed scrolling forever and never stop. And so it's not because companies are deliberately evil or have bad intentions. It's just that this race for attention...

MCEVERS: Right.

HARRIS: ...Creates this perverse economy that keeps everybody sucked in.

MCEVERS: Right. But what should companies do now to design products in a different way so that we don't become addicted to them?

HARRIS: Well, I think of this like a - sort of an environmental metaphor. You know, so essentially 2 billion people's minds are jacked into an open environment run by essentially three private companies - Apple, Google and Facebook. And there's no protections. There's no zoning laws in that city for, say, like, the residential zone. So the residential zone could be something like sleep.

There's no zoning laws for the sleep zone. And so all of these attention companies want to build casinos and extend their footprint into your life and bulldoze all these other boundaries you might want to put up. And so one thing that these companies can do is help create these zoning laws for, let's say, waking up in the morning or going to bed or whatever we want our breaks to look like and have the applications that compete for our attention compete around those zones instead of bulldozing our boundaries.

MCEVERS: I mean Google did create this position, you know, of design ethicist, so companies are thinking about this, right? Are they thinking about it enough?

HARRIS: I would say that we're not thinking about it nearly enough. And I think there's no clear role for this in the tech industry yet. And I think we need to make a whole - not just one person, but we need to have, you know, entire groups of people that are just dedicated to asking, what's best for people, not what's best for engagement. That's going to take resourcing and people and philosophers and anthropologists and sociologists asking, how is this stuff affecting people's minds? And what does it mean to affect people's minds positively or ethically?

MCEVERS: Tristan Harris of Time Well Spent. That's a nonprofit focused on how we use technology. Thank you very much.

HARRIS: Thanks for having me.

(SOUNDBITE OF CIRCLES AROUND THE SUN'S "GILBERT'S GROOVE")

Transcript provided by NPR, Copyright NPR.