The ethics of internet culture: a conversation with Taylor Lorenz

Papal virality, Instagram, Twitter, identities, youth, and Christchurch

Taylor Lorenz was in high demand this week. As a prolific journalist at The Atlantic and about-to-be member of Harvard’s prestigious Nieman Fellowship for journalism, that’s perhaps not surprising. Nor was this the first time she’s had a bit of a moment: Lorenz has already served as an in-house expert on social media and the internet for several major companies, while having written and edited for publications as diverse as The Daily Beast, The Hill, People, The Daily Mail, and Business Insider, all while remaining hip and in touch enough to currently serve as a kind of youth zeitgeist translator, on her beat as a technology writer for The Atlantic.

Lorenz is in fact publicly busy enough that she’s one of only two people I personally know to have openly ‘quit email,’ the other being my friend Russ, an 82-year-old retired engineer and MIT alum who literally spends all day, most days, working on a plan to reinvent the bicycle.

I wonder if any of Lorenz’s previous professional experiences, however, could have matched the weight of the events she encountered these past several days, when the nightmarish massacre in Christchurch, New Zealand brought together two of her greatest areas of expertise: political extremism (which she covered for The Hill), and internet culture. As her first Atlantic piece after the shootings said, the Christchurch killer’s manifesto was “designed to troll.” Indeed, his entire heinous act was a calculated effort to manipulate our current norms of Internet communication and connection, for fanatical ends.

Taylor Lorenz

Lorenz responded with characteristic insight, focusing on the ways in which the stylized insider subcultures the Internet supports can be used to confuse, distract, and mobilize millions of people for good and for truly evil ends:

“Before people can even begin to grasp the nuances of today’s internet, they can be radicalized by it. Platforms such as YouTube and Facebook can send users barreling into fringe communities where extremist views are normalized and advanced. Because these communities have so successfully adopted irony as a cloaking device for promoting extremism, outsiders are left confused as to what is a real threat and what’s just trolling. The darker corners of the internet are so fragmented that even when they spawn a mass shooting, as in New Zealand, the shooter’s words can be nearly impossible to parse, even for those who are Extremely Online.”

Such insights are among the many reasons I was so grateful to be able to speak with Taylor Lorenz for this week’s installment of my TechCrunch series interrogating the ethics of technology.

As I’ve written in my previous interviews with author and inequality critic Anand Giridharadas, and with award-winning Google exec turned award-winning tech critic James Williams, I come to tech ethics from 25 years of studying religion. My personal approach to religion, however, has essentially always been that it plays a central role in human civilization not only or even primarily because of its theistic beliefs and “faith,” but because of its culture — its traditions, literature, rituals, history, and the content of its communities.

And because I don’t mind comparing technology to religion (not saying they are one and the same, but that there is something to be learned from the comparison), I’d argue that if we really want to understand the ethics of the technologies we are creating, particularly the Internet, we need to explore, as Taylor and I did in our conversation below, “the ethics of internet culture.”

What resulted was, like Lorenz’s work in general, at times whimsical, at times cool enough to fly right over my head, but at all times fascinating and important.

Editor’s Note: we ungated the first of 11 sections of this interview. Reading time: 22 minutes / 5,500 words.

Joking with the Pope

Greg Epstein: Taylor, thanks so much for speaking with me. As you know, I’m writing for TechCrunch about religion, ethics, and technology, and I recently discovered your work when you brought all those together in an unusual way. You subtweeted the Pope, and it went viral.

Taylor Lorenz: I know. [People] were freaking out.

Greg: What was that experience like?

Taylor: The Pope tweeted some insane tweet about how Mary, Jesus’ mother, was the first influencer. He tweeted it out, and everyone was spamming that tweet to me because I write so much about influencers, and I was just laughing. There’s a meme on Instagram about Jesus being the first influencer and how he killed himself or faked his death for more followers.

I just tweeted it out. I think a lot of people didn’t know the joke, the meme, and I think they just thought that it was new and funny. Also [some people] were saying, “how can you joke about Jesus wanting more followers?” I’m like, the Pope literally compared Mary to a social media influencer, so calm down. My whole family is Irish Catholic.

A bunch of people were sharing my tweet. I was like, oh, god. I’m not trying to lead into some religious controversy, but I did think about whether my Irish Catholic mother would laugh. She has a really good sense of humor. I thought, I think she would laugh at this joke. I think it’s fine.

Greg: I loved it because it was a real Rorschach test for me. Sitting there looking at that tweet, I was one of the people who didn’t know that particular meme. I’d like to think I love my memes but …

Taylor: I can’t claim credit.

Greg: No, no, but anyway most of the memes I know are the ones my students happen to tell me about. The point is I’ve spent 15-plus years being a professional atheist. I’ve had my share of religious debates, but I’ve also had all these debates with others I’ll call Professional Strident Atheists, who are more aggressive in their anti-religion than I am. And I’m thinking, “Okay, this is clearly a tweet that Richard Dawkins would love. Do I love it? I don’t know. Wait, I think I do!”

Taylor: I treated it with the greatest respect for all faiths. I thought it was funny to drag the Pope on Twitter.

The influence of Instagram

Alexander Spatari via Getty Images

Greg: I want to talk about Instagram. As you’ve written in the Atlantic, “No app is more integral to teens’ social lives than Instagram. While Millennials relied on Facebook to navigate high school and college, connect with friends, and express themselves online, Gen Z’s networks exist almost entirely on Instagram. According to a recent study by the Pew Research Center, 72 percent of teens use the platform, which now has more than 1 billion monthly users. Instagram allows teens to chat with people they know, meet new people, stay in touch with friends from camp or sports, and bond by sharing photos or having discussions.”

Instagram encapsulates a lot of what I’d like to understand about how young people look at the Internet now. Clearly there’s lots that’s positive, conversations that people are really engaged with. Then again, reading through your recent pieces about Instagram’s culture, I’m seeing: teens who are bored; rampant anti-Semitism; a hangout for the alt-right; bullying and jealousy; teenagers being paid to represent the interests of all sorts of companies and corporations; and general misinformation. Looking at the site from that perspective — from the perspective of your writing — it can seem like an alarming place. So, what do you see when you look at Instagram? What kind of culture do you see there and how compelling is it for you?

Taylor: It is so compelling. I’m obsessed with the platform. It’s my job to write about online communities, and Instagram is probably the biggest, most relevant social network right now in America at least in terms of youth culture.

I would say Instagram is just like a microcosm of the broader Internet in a lot of ways. Yes, there is toxic, problematic and awful stuff. There’s also a lot of positive, amazing stuff. I actually tend to focus my writing mostly on the more positive things. I write about how people use any social platform to create and to connect with people, but it ends up being a lot about Instagram, because most [young] people are on Instagram.

Like all social platforms, there is a fair amount of misinformation, Nazis, things like that. Those people are always going to be on every platform and they’re going to try to exploit it. I think it’s the job of the platform to police this type of stuff. Instagram has done a better job of mitigating that type of bad content compared to YouTube and Facebook. It’s still there; you can never completely eliminate it.

Greg: Who do you credit for the success that they’ve had? Is that Facebook, the owners? Is it the people who founded it? Is it just some good employees there?

Taylor: No. First, I wouldn’t credit [Facebook] with success. They have a ways to go. Their moderation policies are total BS and they constantly and arbitrarily delete and ban people and content that shouldn’t be banned while letting other content that’s racist, homophobic and awful slide by. Facebook has huge problems with moderation. I don’t think that they take a lot of this stuff seriously. I don’t really want to give them too much credit.

I will say [Instagram]’s founders (Kevin Systrom and Mike Krieger) definitely wanted it to be a nice place. I think the current head of Instagram, Adam Mosseri, also seems to want to keep it a positive place. Obviously that’s in their business interests, so no surprise. It’s a battle. They still have a long way to go. There’s a ton of toxic stuff on there. It’s just that I think that Facebook, the main Facebook app, and YouTube have been so bad that it makes even a mediocre job look good.

Instagram and positive identities

Greg: Could you say more about what kind of positive things are happening on Instagram? And, to what degree do you think these are related to the platform itself?

Taylor: So one huge benefit of Instagram is that it’s a really good platform to express your identity, discover interests, and express those interests. For instance if you’re a young person trying to vent, talk to people, discover, meet new people with interests related to yours or your identity online, Instagram is the perfect place to do that.

Facebook is old web in a lot of ways: you have one profile. They separate people from brands in a weird way. There’s the whole pages versus profile thing. Instagram is a lot more fluid. It lets you set up multiple accounts, which a lot of young people do. Those things are positive because they really give more freedom to users.

It’s also a very visual platform with a lot of commentary that happens around it. Because it’s fluid, it’s a lifeline for so many kids. It’s where their social network lives. It’s where identity expression occurs. It’s how they define themselves and relate to the world. Obviously that can be really bad sometimes, but for the most part it’s good they have an outlet like that.

I’ll give you a couple of specific examples. Say you’re a teen and you’re getting into some kind of artwork. You can revamp your profile or you can start a separate profile that’s just dedicated to this artwork that you love or this band you love or these memes you find funny. Through creating and curating that content around those interests, you can attract lots of other users to follow you and engage with you. It’s a good way to meet people.

A lot of kids also meet in the comments. The norms on Instagram are different than Facebook, which, by the way, I had in college and still do, though I never go on it because the discovery process there is so broken. There’s no real good way to discover and meet people through shared interests on Facebook, except what I’ll call psycho groups, which exist in little bubbles and are often toxic. Whereas on Instagram, which as I said is more fluid, it is easy to discover and meet people that are interested in the same stuff you are or that you want to learn more about.

Greg: So we’re all exploring our identities, right? I get that, of course. But what else is it that makes Instagram a positive place to do so?

Taylor: Another benefit is that Instagram is semiprivate. People who perform their identity on Twitter are doing it on a very public platform. On Facebook you’re performing an identity, usually for a set group of friends, people you know. On Instagram, the communities are closed. They’re open in the sense that you can discover things through hashtags or location, or by seeing which accounts follow whom. But you can also set your profile to private and make your own closed account if you want to. It feels more closed because interests are all sectioned off, almost, yet easy to discover at the same time, so it feels intimate on a big open platform. That’s hard to replicate. Twitter and Facebook can’t, because of how their platforms are structured. I think that’s part of the appeal of Instagram: having a seemingly smaller network within a bigger platform.

The challenges of Twitter

Jack Dorsey

Photo by Anthony Ha / TechCrunch

Greg: When you say that’s the way that the other platforms are structured, you’ve written about how if Twitter wanted to really do better they could eliminate the retweet?

Taylor: That was just a troll. I was joking because Jack [Dorsey, the CEO of Twitter] always makes these stupid statements in the press. Twitter ultimately is broken because of its failed ability to moderate harassment. That’s why the platform’s not scaling. They’ll never be able to recreate what Instagram has.

Greg: So, for you the biggest problem with Twitter is the harassment?

Taylor: It’s a bad user experience. The point is you need to think of the experience of the user. If most of your users find it to be a toxic place where they can’t meet the people that they want to meet or connect with people with shared interests and they get targeted harassment … that’s a bad experience. I mean, there’s tons of harassment on Instagram too but it’s very different than Twitter. Twitter is extremely toxic. No one wants that.

Greg: How do you think Twitter became so toxic?

Taylor: I think that Jack as a CEO is delusional about his own product. I think he doesn’t actually care about the experiences or even really consider valid the experience marginalized people have on his platform. Even just women on the platform. That interview that he gave to Ashley Feinberg was so good because he hung himself with his own words. He basically acknowledged that he talks a lot but he doesn’t do anything.

It’s unfortunate because I think there are many amazing product people at Twitter, but they aren’t focused on the right thing. Harassment is really bad. It shouldn’t all be on the user. [Dorsey] constantly talks about how it’s the user’s job to report it. We don’t see it until you report it. What? That’s insane. You need to develop algorithms to root some of this bad behavior out.

Greg: A lot of it wouldn’t be that hard to find, would it?

Taylor: No, it’s so easy. Literally, it’s funny. A couple years ago I did this experiment where I remade my Twitter profile as a man. It was amazing the difference in my platform experience; it was insane. I just changed my avatar and header and my name. Even just tweeting with a male avatar was a significant reduction in the amount of bad comments that I got. It was really eye-opening.

Products and targeted harassment

Greg: Your experience was that just as a woman on Twitter, people seek you out to target you, to harass you?

Taylor: Constantly. Yeah, I get messages every day telling me about my body, talking to me about how I’m fat or ugly, why are you not married? Just sexist stuff. By the way, I have it pretty good. Imagine the people of color, women of color, other people in marginalized groups. Twitter is a cesspool. The people who run it, specifically Jack Dorsey, do not understand that. He’s so scared of pissing off certain factions that he won’t take action. Take Alex Jones and a lot of other problematic people: until there’s immense media pressure, they’re very slow to do anything.

Greg: It is such a huge problem. I waded into this a little bit myself a few years ago when the organization I was executive director of at the time chose — I’m proud to say — to give an award to an Internet activist named Anita Sarkeesian who had been in the middle of Gamergate. We immediately received a ton of criticism for that. It was probably the only major time in my life where I felt a need, not just an ethical obligation which I arguably have all the time, but I felt an immediate visceral need to get involved in engaging with those kinds of trolls. It really was stunning.

At the same time, I think I live with the privilege of being a white cisgendered male on Twitter somewhat removed from what you’re describing where I don’t have to experience it daily. I can’t even imagine what it’s like for you and for some of the other people I’m interested in talking to for this series. It’s a daunting problem in the sense that people like me, if I’m not feeling a need to even know what’s going on there, how are we going to fix this? What would you want to see people do about this problem? What advice do you have for me or others?

Taylor: My advice to you is just have awareness around it. But Jack Dorsey, the CEO of Twitter, it’s his problem to fix, and he’s failing. It’s ironic because so many who receive harassment are activists and journalists and people with a big Twitter platform. Often Twitter argues that oh, for normal users it’s better. That’s not true, but okay, shouldn’t you also worry about the needs of your power users? These are theoretically the people that are driving the most engagement, using your platform the most.

Greg: Right. What exactly would Twitter be without people like you?

Taylor: Without all of these people generating enormous amounts of engagement. And celebrities, too. I think it’s telling that a lot of celebrities will run their own Instagram account but won’t run their own Twitter account. I don’t really write a lot about Twitter harassment issues anymore frankly because I am sick of harassment. It’s part of the reason I quit covering politics, because of the sheer amount of harassment.

Greg: You were covering extremism in politics, to be clear. You were constantly traveling to cover the alt-right. Is that correct?

Taylor: Exactly, yeah. It was exhausting. You know, there are a lot of amazing reporters that do it now, like Ben Collins, my old editor, and others who do amazing work on the downsides of these platforms and all the negative ways people are exploiting them. Thankfully I don’t really write about this anymore, intentionally.

Internet culture and organizing

Photo by Romy Arroyo Fernandez/NurPhoto via Getty Images

Greg: We agree that Black Lives Matter activists and the women who really launched this version of the MeToo movement, as well as the original founders of the MeToo movement, are doing really necessary and important work. Has the Internet and its culture actually exacerbated the need for that work?

Taylor: I don’t think it’s just the Internet. I think the broader culture has changed. [People are] waking up to a lot of the bigger injustices in society. A lot of that actually has to do more with the political climate than the Internet. I think it’s that obviously our political climate is expressed through the Internet. I don’t know, sexism didn’t just become a problem in the past few years, right? Or racism and all these injustices.

Greg: Of course.

Taylor: I think that a confluence of factors is responsible for these big social movements. It is good to see people, especially activists online trying to push for more equality, trying to represent marginalized people, and give voice to them by utilizing the power of these big open social platforms. That is a positive for sure.

Greg: I’m very sympathetic to what you said; one can’t imagine going back to a pre-Internet era and saying communication was better at that point … that we were having a better open dialogue about racism and sexism back then. No way.

Youth health and tech regulation

Greg: On a separate but related topic, you’re constantly observing young people really closely. How do you feel they’re doing, health-wise?

Taylor: I don’t know. I don’t think it’s inherently bad or good. I think a lot [of discussion about young people] is alarmist and everybody’s like, oh my god, these people are glued to their phones. So are their parents, so is everyone. I mean, most of the people I talk to are pretty healthy. The general anxiety that a lot of teenagers have is eternal. Though obviously certain parts, for example, Instagram definitely magnifies certain anxieties teenagers have.

Greg: You’re talking about visual culture, the body image stuff?

Taylor: Sure. But more importantly, I think this is where a lot of people’s social networks live, so it can be vicious. You know, there are all the subtleties that teenagers experience; they’re attuned to every little thing, such as whether you don’t like somebody’s photo, or you like something that can be read in a certain way. And that’s true of adults too. As these platforms become really important in our social lives, a lot of meaning gets attributed to every type of interaction on the platform. I don’t know; I’m hesitant to say kids are all being ruined by their phones, though I think it’s important to take a look at how these tech giants take advantage of users as a whole and to think critically about it.

Greg: Are there specific regulations that you’d particularly like to see?

Taylor: I’m not a big policy person. I just know that some of the power that these big companies have should be questioned. You know, another huge problem is that none of these people in Congress understand how the Internet works or understand anything about social media. I mean, it’s scary. I don’t know of any lawmaker who really, really understands technology.

Greg: AOC included?

Taylor: Oh, no, she understands how to use technology but I don’t have any idea if she really understands how tech regulation works. I mean, I’m a tech reporter and I can barely [understand it]. I would say Tony Romm at the Washington Post is the best person to answer this question. He understands regulation more than any other tech journalist.

[But] before any kind of regulation can solve this — regulation is slow moving and it’s often applied unfairly or unequally — I think there has to be a broader public awareness about the problems of these platforms. Also people within these companies themselves need to think more critically, and specifically the executives. Hopefully [the executives] will think ethically about some of this; I know that there are people within companies that do.

Capitalism and ethics

Image: gmutlu/iStock

Greg: Part of the issue is that we are not just talking about Internet culture, but also about capitalism.

Taylor: Exactly, that’s a bigger kind of policy question. I would say though: users will quit the platform if it ends up being a bad experience. Your platform won’t grow. I think providing a positive experience, and one that makes people feel more productive, healthy, happy, is actually in a lot of these companies’ best interests. It’s just hard when they’re held up to these standards of growth or making more money every quarter. That’s the inherent tension.

Greg: Without naming names as you’ve been brave enough to do, how much greed overall have you observed in tech circles? How are people thinking about how much money they want to make versus other concerns?

Taylor: People that I talk to within these companies are not out for money. They’re idealistic: they want, or think they want, to change the world, to have impact. I think the problem is that they’re a little bit delusional, recognizing only the benefits and not really recognizing the problems.

We are seeing that with Facebook, where there’s backlash; some people are finally calling out the ways groups have become echo chambers. Facebook as a company hasn’t really done much to tamp down on bad groups: anti-vaccine people, for example, or conspiracy theorists.

I think more people within these companies need to recognize the ways that people are using the platforms for negative things. That’s hard: to feel that something you’re putting so much time and effort into is being used badly. And a lot of people within the companies don’t always know how these platforms are being used. They might just work on some small feature and not really know.

Looking at social with multiple lenses

Greg: Is there anything else about the ethics of internet culture we haven’t discussed today that you want to mention?

Taylor: I’m not an ethics expert. I just think there needs to be greater awareness of how people actually use these platforms. I think often people think of them in a very stereotypical way. They’re not interviewing enough people. I think there needs to be a little bit more nuance and understanding of why and how people are using them. It’s not that everyone’s on Instagram just to post pretty photos. That’s a very flat way to view a very complex platform.

Greg: It’s very interesting. You just made me think back to my days in junior high and high school, when I would have six- or seven-hour phone conversations. You could have been critical of the phone company: oh, they’re profiting off of my sitting around on my phone with my friends. But I was making important connections at the time.

Taylor: Yeah. It’s always really important to recognize that, to look at things from multiple angles. I think you should recognize what people are getting out of it and why they’re behaving that way. It’s so easy to make some alarmist headline, and people will click, but it takes a little more effort to dig deeper. Because if you learn more about what’s actually driving people, that can benefit leaders or whoever’s researching these platforms.

Humanity’s report card

Greg: Thank you. The last thing, just for fun … on a scale of 1 to 100, how do you feel we’re doing as a human race? What are the prospects for our human future, from your perspective?

Taylor: Not good. Maybe like 22% or 22 out of 100.

Greg: Wow.

Taylor: It’s really bad.

Greg: But you sounded really optimistic in our interview.

Taylor: Oh. I mean, I’m optimistic about a lot, but the broader human race, [we are] destroying ourselves. It’s not good.

Greg: Aah.

Taylor: Yeah, even though I am really optimistic about certain things, in general I think unfortunately we’re plagued with all of these problems and people are very short-term thinkers. Now that we have all of this power and the ability to effect change on a global scale, it’s really largely negative. Also, I’m thinking of climate change. I don’t have a huge amount of faith in people.

Greg: Indeed. Thank you very much for having faith in this conversation. I learned a lot!

The effects of Christchurch

Carl Court via Getty Images

Editor’s Note: The majority of the conversation above was conducted in February, but Epstein followed up this week with some additional questions for Lorenz after the events in Christchurch and her writing about them.

Greg: How worried should we be about the potential of online subcultures to radicalize people? I mean, even one incident like what just happened in Christchurch is far too much, but I think people are also worried this kind of mobilization of white supremacists and their allies, aided by the internet, could become a much more powerful global phenomenon in the years to come. What are your thoughts about that?

Taylor: These platforms have huge potential to radicalize people. I think they’re currently helping to facilitate radicalization, and clearly spread conspiracies and so on. They don’t take the issue seriously enough (and I’m talking about Facebook, Twitter, YouTube, Instagram, etc.), and they all view themselves more as platforms than as some kind of curated community whose content they have to edit, or as a news site. So they really hesitate to moderate, [in part because moderating] gets really hard.

I’m sympathetic to them in the fact that it is hard. You know, you do have to make some tough decisions, decide where you draw the line. But at the same time, white supremacy and things like that are running rampant across all these platforms. So you need to do something.

Greg: You’ve obviously read about this big story, explored by Casey Newton of The Verge, in the movie The Cleaners, and elsewhere, about the thousands of people who are moderating the content of sites like Facebook for $28,000 a year without benefits, or whatever it is, and how the work is severely impacting their mental health. What do you think should be done about that?

Taylor: I interviewed a bunch of moderators [for a story], and talked to people who are moderating [controversial or offensive content] day to day. These people deserve health care and access to support. Because moderating requires such cultural nuance, I think we’re always going to have to rely on humans [to do the work of content moderation]. These people should be treated fairly and compensated fairly.

Greg: Right, and I’m inclined to believe that if it costs the company a few billion dollars to do that, then they should just have to have a few billion less.

Taylor: Yeah, and I mean that would ultimately be to the benefit of their platform. I think that they may be focused on more short term results.

Greg: What about the role of confusion? You’ve written about the way extremists are taking advantage of their ability to cultivate such an insider culture that they are confusing, or hiding in plain sight from, people who might seek to shut them down.

Taylor: A [major way] these communities tend to avoid scrutiny is that a lot of them promote their ideas under the guise of humor or irony. And when they’re criticized, they can brush it off like, “oh, it’s a joke, you don’t get it.” But the point is, they’re using the internet to promote white supremacy. So I think that it’s up to the platforms to recognize that, to recognize who these actors are, and moderate them.

I mean, there’s also the issue of the fact that this isn’t only happening on Facebook, Twitter, YouTube and Instagram. There’s a whole big internet out there. And we’ve seen a lot of other kinds of clone sites pop up recently. Gab is a good example: it’s essentially a Twitter clone, but specifically catered to extremists and Nazis. And obviously, there are tons of Nazis and extremists on 8chan and 4chan. The important thing is that they don’t have a presence on these mainstream, massive social platforms where they can be normalized and spread their message.

Greg: That’s a sobering and timely message. I hope decision makers at those massive platforms are listening carefully. Thank you so much, once again, for taking the time to speak with me on such a busy week.