Dissecting value systems and exclusion in ‘big tech’, with Jessica Powell (Part Two)

In this second of my two-part conversation on the ethics of technology with Jessica Powell, the former head of PR at Google turned author of the wonderful satirical novel, The Big Disruption: A Totally Fictional But Essentially True Silicon Valley Story, we discuss the meaning of “genius” in the tech world; why Silicon Valley multi-millionaires vote for socialists; and whether it is possible to use the master’s code to destroy his app.

But first, an excerpt that might initially seem absurd. Considering how many tech companies are currently working on or talking about space travel, though, it’s hard to be sure whom Powell is even satirizing here:

“Slow down,” Niels said. “Are you joking with me?”

“We have been working on the project for a year,” Gregor said. “Fifty engineers working in secret in Building 1. We’re building a colony on the moon.”

“You mean you have a spaceship and everything? How are you dealing with gravity? Wait, never mind, don’t answer that. What I mean is, since when did Anahata get into the business of humankind?”

“Anahata has always only ever been about humankind. Everything we do is done for — “

“Yeah, yeah, I know, everything we do is to improve humankind. But I mean, a society, Gregor. There are no synergies with our current business. How do you know how to construct a society?”

“Actually, a society is a lot like software. You build it on solid principles, then you iterate. Then you solutionize, and you iterate again.”

“What makes you think you can solve what centuries of wise men have failed to do?”

“Because we have something they don’t have,” Gregor said. He pushed his chair closer, and Niels couldn’t help but lean forward. The broken wooden spindle leaned with him, pushing into his back. But he did not move to swat it away; his eyes were locked on Gregor, their faces almost touching.

“Algorithms,” Gregor whispered.

“You have got to be kidding me,” Niels snorted. “These are humans we’re talking about, not robots. You can’t predict and control human behavior with algorithms.”

“That is an emotional reaction to what is a very logical project. And, yes, an algorithm could have predicted that you would respond that way. Even irrational behavior is rational when seen as a larger grouping of patterns. And as you can imagine, this project is built on patterns of success. Project Y, we call it. It will save Anahata — and, as a result, humankind.”

Greg E.: Let’s talk about the ideas in the book. Your Google-like company, Anahata, is symbolized by a squid that becomes the size of a bus, which is a great symbol of course for what companies like Google have become.

You explore what drives that kind of expansion, and in large part it’s a personal drive. One could say it’s a sex drive, but what I took from it was more a drive to be noticed, needed, recognized, that gets out of proportion.

I was wondering to what extent you felt like the obsession of these male engineer characters with women and women’s approval was less about an actual sex drive and more about what the approval of those women would mean to them. They’re finally attractive enough, they’re finally likable enough. How do you feel about that?

Jessica P.: Oh, yeah. I think there’s definitely a status thing. So much of what drives this book is ego. I think that’s absolutely the case.

Greg E.: At one point, the dichotomy between having a guiding philosophy and just being into your own ego took the form of a character you call The Fixer, who is very Zen, and the CEO and big executives go to him essentially saying, “please fix our problem.”

They’re visiting him in some sort of Zen garden. There’s a sand garden, which is a feature of some Buddhist rituals, Zen rituals, and The Fixer is raking the sand, but also raking in millions of dollars just to consult with these people. Then at one point, as if trying to speak in Zen koan form, one of the executives says, “Oh, if only we could all be rakes.” That was a double entendre, right?


Jessica P.: Yeah. I had forgotten about that until you read it back to me.

Greg E.: I guess what I’m asking about is, I want to know from your perspective how out of control is Silicon Valley’s interest in, or obsession with, ideas of being a leader, and being a genius, and being a brilliant man of science as opposed to say, I don’t know, being a human being?

Jessica P.: Wow. I think it’s a problem. If only status could be conferred through extraordinary… I mean, you do see big philanthropy and that’s a whole other topic for another day I guess.

Greg E.: For this series, I also talked to the writer Anand Giridharadas about the problems across the whole philanthropic industry and community.

Jessica P.: If only they saw philanthropy as the same path to greatness as product innovation. Billions are just sitting there, not going back to the world.

What’s unique about the Valley, and quite admirable, is the entrepreneurialism; this attitude of asking big questions and “why not?” But when you feel you can ask anything and do anything, you don’t always contemplate the consequences of the things you’re building. I’m sure when Jack Dorsey built Twitter, he wasn’t thinking it would look like the cesspool that it does today.

Many of us, if we had ended up in the same positions as some of these founders, might well have had the same problems. I do not think that these people are horrible people, or that if someone else had been in power, we wouldn’t be having the same conversation.

Part of [the problem] is the total lack of diversity in the industry. In other words, I don’t think Jack Dorsey or Mark Zuckerberg or whoever has to have all the answers.

[The question] is, can they surround themselves with other people, or can capital go to future founders who don’t all look the same, and don’t all have the same beliefs, educational background, roughly the same age, and socio-economic background? The more diversity you have in these companies, the more likely you are to arrive at a better outcome.

Greg E.: One point you emphasize through satire is that we’re not just talking about having more women or more people of color around for the sake of having them around. In my last interview, Moira Weigel pointed out that if you look at all people working at a big tech company, it may, in fact, be majority women and people of color, but people get objectified based on their roles.

Your novel highlights the objectification of women in departments like human resources and PR. There is a massive dichotomy between the engineers, who are seen and want to see themselves as tech geniuses, and everybody else.


Jessica P.: Right. That is very, very much reflective of the tech companies. And so you end up with an insular approach. I have yet to work in a tech company where it was, by numbers, diverse.

But even in the case where that might happen, when the ones making the decisions largely look the same and have the same background, and the people who interface with the outside world are a rung down, or a couple rungs down, in terms of their status or their influence, then how can you really, truly build for the outside world and for everyone? You are largely building for yourself.

Greg E.: As you were saying, the problem is not so much about one evil individual. None of your characters, you’ve said, are exactly likable. Well, I thought there was one very likable minor character, but no spoilers. Anyway, what you really seem to be illustrating is a culture where everyone seems very insecure.

Jessica P.: Yes.

Greg E.: People are motivated by their insecurities to do some really absurd things. Even if the zany situations you create weren’t quite so extreme, I’m still picturing insecure people at big tech companies, doing things that are worth laughing about, and maybe crying too.

Jessica P.: Right.

Greg E.: So how can we address the underlying culture? What would make people feel differently about themselves such that their creations would be more constructive for the world?

Jessica P.: Gosh, that’s a great question. One I don’t feel entirely equipped to answer having not studied it. I guess I would say people need to feel safe. That can mean so many different things. [For members of] underrepresented groups, you [often] feel like you’re under a microscope.

Every time you speak up, or decide whether to speak up, it’s with that extra lens which absolutely colors how you present yourself, how you present the information, how you feel you’re being observed and judged, and I think has quite a detrimental impact on people’s self-esteem and their ability to execute.

There’s also safety in another sense. A lot of the tech culture that I came up through…So when I was in high school, with Prodigy, and AOL…I guess we’re late Gen X, [though] we weren’t even called Gen X at the time. Whatever we are, we were the first ones to come online [as a generation].

There was a flaming culture, attacking people. Being more comfortable [communicating] behind a computer than face-to-face. A lot of it’s just about being right and winning the fight.

I definitely have, over the years, seen battles play out in companies where it is entirely ego. It’s about being right versus what’s the right decision for the company. How can we create a culture where people feel more bought into the group and less about individual ego? I think that’s super hard because we’re all human.

In any industry, ego is always going to be a big, big factor. I would think one way you would do it is, you wouldn’t tolerate bad behavior. You wouldn’t tolerate jerks, and I think on the whole in capitalism, in any industry, there’s a lot of tolerance for brilliant jerks.

If you could deal with the jerks early on and make it clear you’re not okay with that kind of behavior, I think you have somewhat of a chance to curb the natural impulses people have to want to be right, and to win, and to have their project be the winning project. But if you allow people to be abusive towards others, and you allow their egos to go unchecked, what kind of message are you sending to the rest of the organization?

Greg E.: So that’s one place in which social critique and satire can actually do some heavy lifting. They reduce the level at which society tolerates brilliant jerks, or people who are maybe intelligent on one level but lack social and emotional intelligence.


Jessica P.: Look at Uber, right?

Greg E.: That was the first example that came to my mind when you said tolerating brilliant jerks.

Jessica P.: Uber’s probably the easiest example of-

Greg E.: The uber-example?

Jessica P.: There you go.

Greg E.: Groan. I also wonder about the idea of safety, inclusion, and a more supportive culture in terms of the “social safety net.” As you mentioned earlier, a lot of people are pretty liberal in Silicon Valley. At one point, as the narrator, you mention that your character, Gregor, the top engineer at Anahata, will only vote for socialists. He does not, however, seem particularly interested in the need for most people to have genuinely better lives.

Jessica P.: He would vote for a socialist policy, but would not necessarily clock the guy sitting outside the Walgreens who’s homeless.

Greg E.: Why do you think a character like Gregor votes for socialists, then?

Jessica P.: It’s part of a militant [worldview] … There are these principles you have. In his case, they might be almost communitarian. Think about his view of what would happen on the moon.

Greg E.: He thinks we’re going to create an ideal society on the moon, right?

Jessica P.: Right. He cites all these different 1840s-ish social movements that were essentially utopian visions.

Greg E.: This might have been my absolute favorite section. [Note: see an excerpt above.]

Jessica P.: Yeah, yeah. It was that section, yeah. So I think you can say, “I don’t agree with this particular policy of Trump’s”, and be outraged, but that’s somewhat abstracted from the human impact.

Not necessarily that you’re not also concerned about the real human impact of [political policies], but that’s a little different from being kind to someone right in front of you in a meeting. Or saying, “Yes, women should have maternity leave,” but not reflecting on the extraordinary disparity in $50 million plus CEO salary, versus contracted workers.

That stuff can all coexist in a very funny way. It probably wouldn’t in the finance industry. I have a friend who works at Goldman, [who told me,] “It’s so funny to watch everything happening with the tech industry because in finance no one ever thought they were doing anything more than making money.”

There was never any kind of reckoning vis-a-vis morals. It was more just like, “We’ve always just wanted to make money. Now we did something that was unethical, or that you guys say was unethical, and there’s this reckoning on a regulatory level, but not the publicly challenging us on what we say versus what we do, in the way that happens with tech.”

To me, it’s quite extraordinary just when you look at CEO pay for example: the disparity between exec pay and all the talk around making the world better for everyone. That’s not to say that the CEOs have to make exactly the same as the janitors, but-

Greg E.: But maybe not thousands of times as much, or even more when you take stock into account.

Jessica P.: You could probably come up with a slightly more humane difference, would be my extraordinarily controversial take on it.


Greg E.: I want to ask you about the value system inherent in what we call the tech world. I am still relatively new to tech, but I have spent many years talking confidentially about ethics with students and professors at elite institutions that send a disproportionate share of leaders to Silicon Valley and the like.

What I see is a culture struggling to figure out what values we should aim for. I’m not talking about traditional religion; there are a lot of atheists in this world, a lot of atheists and mention of atheism in your book even. I’m an atheist, so I don’t think we’re going in the direction of Christianity, or traditional Hinduism, or whatever it is.

But what you’re driving at, I think, is in part that if we’re just sort of aiming for innovation, or if we’re just aiming for some kind of vague “leadership,” or genius, we seem to end up really missing something and hurting a lot of people. I don’t think that we know exactly what to aim for.

Jessica P.: There’s probably some truth to that. First of all, I think introspection is very rare. I wouldn’t say it’s necessarily my own strength either, so [I’m not] trying to set myself apart at all.

It’s an uninspiring shift to go from “Don’t Be Evil” – not to make it specifically about Google, but that was such an emblematic rallying cry – to where companies are now, which is something like “do less harm.” That’s a really hard thing to inspire people with, yet that seems to be the phase we’re in. And maybe that’s good compared to two years ago, but [it doesn’t give] people a way to move forward.

Maybe you’d have to go through [this phase] before you can even get to the other part. It’s probably some sort of marriage of broader principles: making people’s lives easier, but doing so in a more responsible and inclusive way, while still making a profitable business out of it.

Greg E.: Instead of the idea of making lives “easier,” I would love to see more focus on how much suffering human beings actually have to go through; on addressing the suffering if not eliminating it.

Jessica P.: You can’t do that if you’re all the same people in the room.

Greg E.: No, you really can’t. I wonder if the way to stop it being all the same people in the room is to be critical of this idea that there’s something called genius, and that what you need to do is gather the geniuses.

I think that whole notion may just be false, and your book will help people reexamine it. You take these characters we think of as geniuses and show us how deeply human they are, and that anyone who really wants to be smart shouldn’t be like these characters.

Jessica P.: Right. It always bothered me, in tech origin stories, the personalities we prop up and the way we mythologize them. Yes, there are some incredibly smart people, for sure [in this industry]. So much of it, so, so, so much of it is just dumb luck.

By luck, I also mean where you were born, how much money you were born with, where you went to school. And then, yes, being in the right place at the right time, when the world is ready for the particular idea you have.

Greg E.: Which is why I asked you about your background at the beginning of this interview.

Jessica P.: Yeah.

Greg E.: I was very lucky too. I thought of myself as heading in that exact same trajectory in many ways, although I never would have seen it in the tech industry. Much later in my life, I’ve realized that I’m completely a product of luck, and I’d like to live the rest of my life the best way I can given all of that.


Greg E.: So anyway, you do a fair amount in the book with the very powerful idea, from Audre Lorde…

Jessica P.: First time in an interview anyone’s ever quoted Audre Lorde to me.

Greg E.: Lorde was a Black feminist civil rights activist…

Jessica P.: Oh, I know all about [her].

Greg E.: Of course! I’m just saying that for the record, not for you. Audre Lorde is a key theme in your book. She’s a heavy-hitting feminist thinker, and the idea of hers that you really play around with is her famous quote about what you can or can’t do with the master’s tools.

Jessica P.: Right, right.

Greg E.: I wonder what your final reflections are at this point, having had the book out for a while now. You spent a lot of time in the master’s house. You learned to use his tools very effectively. How do you reconcile that with the Audre Lorde quote that you invoke?

Jessica P.: I think you need both. You need people outside the house that were never a part of the house, that are upset with it, that put a lot of pressure on it, that force a lot of public scrutiny of it. Tech companies get upset with The New York Times and say that they’re going after them and it’s unfair. I’m sure some of the coverage sometimes isn’t 100% accurate, but it’s directionally accurate.

The fact that there’s that much coverage propels the electorate to put more pressure on policymakers, [and then] policymakers care more. You need people outside calling attention to these issues, not just “I don’t like what you’re doing with my data,” but also, “We’re not a part of this conversation. We don’t feel like we have control over our experiences on your platforms, yet we are part of your platforms.” All of those voices create an energy and an urgency, even if the conversation isn’t always entirely accurate. It’s an important conversation, and it creates a lot of pressure.

I do think, particularly when you’re talking about innovation and tech, that the employees within these companies are incredibly important. Because if you’re a tech company, the thing you worry about more than almost anything is losing your top tech talent to one of your competitors.

Because if you suddenly stop being the hottest company and the place people want to work, if your AI researchers are no longer excited about working at your company, that’s a problem. No one wants to be HP. That may be a horrible example. HP may be doing fine, but no one ever sits there and is like, “HP is the company I want to go work for.”

I’m sure people at HP are smart, et cetera, et cetera, but when we talk about the Big Five, they’re in very hot competition for talent. To extend my analogy, the tools of the house, the master’s tools, are the employees.

If those tools go elsewhere, or if there’s a threat of them going elsewhere because they’re unhappy, if they’re walking out of your workplace, and they’re protesting, and they’re telling you they don’t want to be part of certain kinds of work that you’re doing, that is incredible pressure.

If I think of myself – do I personally wish I had left earlier? I ask myself those questions sometimes. It’s more, almost, from a psychological health point of view, feeling like I wasn’t entirely doing what I had set out to do when I went off to college.

Larger life purpose type things. To your specific question, I think you absolutely need the people within the building engaged on these issues, which then comes back to the book, and what I was hoping to [achieve]. I really think the employees in these companies are a massive force.


Greg E.: Have people written to you off the record, to say what it’s doing for them to have your perspective?

Jessica P.: Yeah. It’s obviously very rewarding, but a surreal and sometimes kind of overwhelming experience to have people say they read something you wrote and that it somehow influenced them. It’s very flattering.

It also feels like a lot of responsibility, but it gives you a lot of hope, too… It reminds you that a lot of what you liked about the industry was the people you worked with, who are, I think, on the whole fundamentally good people who want to do the right thing and want to be engaged and open-minded. Being able to connect with them matters a lot.

Greg E.: Well, the labor issues within the industry, and within Google in particular, are certainly something that I plan to cover later this year. I hope employees at major tech companies who have thoughts about the ethics of their work and their industry will contact me.

When I’m not writing or editing interviews at TechCrunch, I’m a chaplain – which means my job is to listen, and confidentiality is an absolutely crucial part of what I do. If people want to talk off the record, I’m a vault.

Last question: how optimistic are you about our shared human future?

Jessica P.: Probably depends on the day. If I have to pick a side, I’m going to say optimistic, but I think there’s a lot of work to be done.

Greg E.: Well said. Thank you so much, Jessica.