‘People fix things. Tech doesn’t fix things.’

The AI Now Institute is promoting accountability through ethics

Veena Dubal is an unlikely star in the tech world.

A scholar of labor practices in the taxi and ride-hailing industries and an Associate Professor at San Francisco’s U.C. Hastings College of the Law, Dubal has seen her work on the ethics of the gig economy covered by the New York Times, NBC News, New York Magazine, and other publications. She’s been in public dialogue with Naomi Klein and other famous authors, and penned a prominent op-ed on facial recognition tech in San Francisco — all while winning awards for her contributions to legal scholarship in her area of specialization, labor and employment law.

Dubal was a featured speaker at the annual symposium of the AI Now Institute, an interdisciplinary research center at New York University that examines AI’s social implications. Held at NYU’s largest theater in the heart of Greenwich Village, the symposium, the institute’s largest public gathering of the year, drew a packed crowd of 800, with hundreds more on the waiting list and several viewing parties offsite. It brought together a relatively young and diverse crowd that, as my seatmate pointed out, contained basically zero of the VC vests ubiquitous at other tech gatherings.

AI Now’s symposium represented the emergence of a no-nonsense, women- and people of color-led, charismatic, compassionate, and crazy knowledgeable stream of tech ethics. (As I discussed with New Yorker writer Andrew Marantz recently, not all approaches to tech ethics are created equal.) AI Now co-founders Kate Crawford and Meredith Whittaker have built an institution capable of mobilizing significant resources alongside a large, passionate audience. Which may be bad news for companies that design and hawk AI as the all-purpose, all-glamorous solution to seemingly every problem, despite the fact that it’s often not even AI doing the work they tout.


Legal scholar Veena Dubal.

As the institute’s work demonstrates, harmful AI can be found across many segments of society, such as policing, housing, the justice system, labor practices, and the environmental impacts of some of our largest corporations. AI Now’s diverse and inspiring speaker lineup, however, was a testament to a growing constituency that’s starting to hold reckless tech businesses accountable. As much as the banking class may panic at the thought of a Warren or Sanders presidency, Big Tech’s irresponsible actors and utopian philosopher bros should be keeping a watchful eye on the ascendance — a rise truly based on merit and competence, rather than cheap charisma — of this next generation of critics like Crawford, Whittaker, and Dubal.


I won’t attempt a more detailed review of AI Now’s conference here. The organization will put out an annual report summarizing and expanding on it later this year, and if you’re intrigued by this piece, get on their mailing list and go next year.

Below is my conversation with Dubal, in which we discuss why the AI Now Institute is different from so many other tech ethics initiatives and how a scholar of taxis became a must-read name in tech. Our conversation ends with the story of one well-off white male software engineer who experienced a surprising failure, only to realize that his own disillusionment helped him connect to a much greater purpose than he’d ever envisioned.

Epstein: Let’s start by talking about the AI Now Symposium. What does it mean for you to be here as one of the featured speakers?

Dubal: It’s so awesome for a center like this to say that what Uber drivers are doing to organize to better their conditions is actually related to tech. For the last half decade at least, I’ve been doing what is considered tech work, but very much at the periphery. Because we weren’t explicitly doing computer science-related work, I think people didn’t think of the research people like me do as being at all [related to tech]… it was “just” labor. It wasn’t tech, even though it is on [workers’] backs that the whole tech industry exists. So it’s powerful to be included in this conversation.

And for this particular event, they’ve done such a good job of [inviting speakers] whose research is thought of as on the periphery, but should be at the center in terms of what is really important from an ethics perspective. Ruha Benjamin [a Professor of African American Studies at Princeton and founder of Princeton’s JustData Lab]’s work is amazing, and then there are the two people I’m on the panel with. Abdi Muse [Executive Director of the Awood Center in Minneapolis, a community organization focused on advocating for and educating Minnesota’s growing East African communities about their labor rights] organizes warehouse workers in Minnesota, who are the reason Amazon can facilitate the transcontinental flow of goods in the way that it does.


AI Now co-founders Meredith Whittaker and Kate Crawford. (Photo: Katherine Tyler)

And Bhairavi Desai [Executive Director of the New York Taxi Workers Alliance] — I’ve known her for 10 years and she has, from the very beginning, been fighting this gig nonsense. To have them in the room and centered, to have their voices centered instead of on the periphery, is just so awesome for me.

Epstein: It’s very clear that AI Now is dedicated to doing that, maybe even more so than any other peer organization I can identify. How do you see AI Now, as an organization, positioned among its various peers?

Dubal: It’s a great question. I’ve looked at a couple of other more nonprofity things that do tech and equality, and you are absolutely right; more so than any other organization, [AI Now] centers the people who are often at the periphery. Everything that they do is very deliberative.
They aren’t moving through things really quickly, onto the next project really quickly. Every decision they make is thoughtful, in terms of the people that they hire, for example, or how they do an event, or who they include in an event. It’s just very, very thoughtful, which is not how most things in tech, period, run.

Epstein: They’re not moving fast. They’re not breaking things.

Dubal: Exactly. They’re not breaking things. They’re fixing things. And the other thing is, even at The TechEquity Collaborative, a nonprofit in San Francisco, there’s a tech utopian imaginary that guides their work. They really have a belief that the technology is going to fix things. With AI Now, based on all the interactions I’ve had with them, my sense is that their ethos is very much about how people fix things. Tech doesn’t fix things.

So they’re centering the people who can fix things. They’re in a powerful place, and I think because they’re so sophisticated in the work that they do, they have a powerful voice, which is unusual for people who are interested in the subaltern and in the issues that hurt the most marginalized.

Epstein: Yes. What made me want to come all the way here from Cambridge, MA, where we are not exactly suffering from a shortage of tech ethics initiatives, and what made me decide to miss a lot of the Disrupt conference even though I work for TechCrunch, is that it’s rare to have an organization that is able to combine two things: genuinely fighting for the marginalized, or helping the subaltern speak, and actually achieving a very significant public voice. Usually it’s one or the other but not both.


Dubal: It’s amazing. I want to talk to Meredith [Whittaker, AI Now Co-founder] about their funding, because I’ve often blamed the lack of a more radical ethical position at other nonprofit institutions in this arena on funders who didn’t want them to take one. But clearly, whoever is funding [AI Now] is not guiding their hands in a more conservative direction.

Epstein: They’ve done an extraordinarily savvy job. Now, let’s move on to you and your work. Just a few years ago, you were doing an ethnography of the taxi industry in San Francisco. You created a taxi worker project at the Asian Law Caucus, a kind of clinic for doing legal work for taxi workers. All great things, but they don’t exactly scream “tech!” when one looks at them on the surface. Did you think, at that time, that you’d ever be a keynote speaker at a giant AI conference?

Dubal: No, not at all. It’s so funny. Actually, when I was applying for jobs in academia, my husband was, like, “You have to find a way to make people interested in your work. No one is interested in taxi drivers.” And it [was true at that time].

Epstein: When was this?

Dubal: Uber already existed, but it wasn’t big yet. It was 2013, 2014. I was positioning myself as a workers’ rights person on the job market and I said, “It’s too bad people don’t care about taxi drivers.” There’s long been a lot of interest in domestic workers and day laborers, but taxi drivers were sort of viewed as these hairy immigrant men who carry your body from Point A to Point B. People had a tenuous relationship with them even though they were also precarious workers.

Epstein: They were the original ghost workers.

Dubal: Absolutely. But there wasn’t very much interest, and anytime I gave a talk, people were like, “Oh, but it’s so hard to get a taxi.” Okay. But I’m not talking about consumer issues. I’m talking about workers’ rights issues. It was very hard for people to have empathy for this group of workers. I think two powerful things have happened since then.

One, for better or for worse, Uber has made people interested in this group of workers that no one was interested in for a really long time. But I think the most powerful thing is that they are now considered tech workers, because they work through a platform, even though, I have to say, taxi workers were also using a similar app before Uber started.

Because they are atomized and dispersed, and they have this multibillion dollar organization that’s hemorrhaging money and still not giving them any money, and it’s just such a messed up situation, people are interested. I’m writing a paper about this right now: there’s this fascinating alliance happening between the Uber drivers I’m organizing with and the tech workers I’m organizing with. They all see themselves as tech workers.

So there’s almost this flattening… Because you have these massive organizations using similar business models, like contracting temp workers at Google and contract workers with Uber, they’re creating such a mass of exploited workers that these blue-collar workers and these white-collar workers, who otherwise wouldn’t see themselves as having any kind of common interests, are forming alliances. In my 20 years of doing advocacy, I have never seen that.

I’ve never been in a room with someone [who holds] an engineering degree and works at a large company, together with a taxi driver, and had them really see eye to eye, recognizing that they have shared interests. And the reason they have shared interests all of a sudden is not just because they have a similar object of hate and exploitation, but also because everyone is concerned about the way we’re all producing data, what’s happening to that data, and how our worlds are being re-imagined through that data. We [all] have zero privacy. So the social and political issues they share are the same.

Epstein: I think that’s worth asking more about — how did people who didn’t used to relate to one another come to relate to one another, around these problems?

One thing I’m wondering about that I’d like to get your reaction to is that people like to talk in terms of the 99% and the 1%, but a year or two ago the writer Matthew Stewart published an Atlantic cover story, “The Birth of a New American Aristocracy,” which argued that the top 10% or so of the American economy is actually doing very well. And it occurs to me that those of us who are in that percentile increasingly want and need and feel like we’re entitled to people driving us around. And our ready, affordable access to that sort of service has quickly become a bedrock fact of our lives — a reality that maybe in the past only the 1% or the .1% could really experience.

Dubal: Absolutely.

Epstein: And if you’ve got 30 million people in this country who are doing great financially…

Dubal: Yeah.

Epstein: Then you have the development of essentially a servant class. That’s my rather crude term for it, but many economists have noted what is more formally called the rise of “wealth work”: a huge sector of the modern economy that is basically low-wage work meeting the day-to-day needs of high earners who are too busy to meet their own basic needs.
To me, that seems to be a big part of what would be driving increased connections between people from different sectors of, say, the 90%. To what extent do you agree, based on your research?

Dubal: Right. I’ll tell you a story of someone I’m writing about right now, who is involved in doing some of this organizing work. He makes a significant amount of money. He works at one of the Google campuses, and he’s an engineer. After Trump was elected, he looked around himself, like, “I think I’m doing good because I’m working at this biotech company, basically, a bio-science company, and they have a wonderful mission. But I need to do more.”

And he looked around, like, “Huh. I get all of these benefits from working here, but the cafeteria worker doesn’t, the security guard doesn’t, and, basically, their work makes my work possible.” Which I thought was amazing, that he saw what had never been clear to him before. But the step he took next is, I think, more critical. Because we observe inequality around us all the time.

He got involved with some other tech workers at his company, and he raised some issues with the company. And he thought of himself as a powerful person. He’s a product manager, program manager, whatever. He said, “There are some labor issues I think we should address,” and he had always thought of this as a place where he could raise anything, and [upper management] would address [it].

He really believed this. They have these town halls where the head honcho comes and they all talk about issues, and he thought that that was a productive space to bring [his concerns] up. He was silenced immediately.

Suddenly, he felt his job was in jeopardy because he was bringing up these labor issues for other workers. He realized that he was in a precarious position, too. It wasn’t just about the cafeteria worker. What he saw, all of a sudden, was the way power worked and how he did not have power, despite the fact that he had money.

All of a sudden, he was, like, “Oh, I’m an at-will worker just like the cafeteria workers. I can be fired. I’m not in a union. What are the labor laws?”

We were sitting at his workplace talking about these things, and he was so aware of what was going on around him. He was really worried; we got up and went to another place. And it was such a different interview from the one I had done with him three months before, when he was like, “I’m going to solve all the problems in this company. They listen. They’re responsive.”

His own vulnerability became apparent to him, and so now the relationships he’s building with other tech workers at the company, whether they’re the security guards, the cafeteria workers, or people on his team, are so much more powerful. Because they’re not based in a sense of philanthropy, like, “I’m going to do good for the underdog.” He really relates to them.

I guess that doesn’t answer your question, but it’s one of the more powerful things I’ve seen happen in terms of how people at the top … and he’s probably in that 10%, you know, are viewing the people below them.


A panel discussion at the AI Now symposium, October 2, 2019. (Photo: Katherine Tyler)

Epstein: It’s a terrific anecdote for my purposes. I’m fascinated to read more when you’re ready to publish. It also sounds as though he, like me, was one of those people who were raised or socialized to think of themselves as the savior, the superhero.
And eventually, for every emotionally mature human being, there’s a realization moment that you’re not a superhero. And that it actually disempowers you, and it plays into the hands of the truly powerful, for you to operate under the illusion that you are really so powerful all by yourself.

Dubal: Yes.

Epstein: I think that that’s a myth that’s just starting to crack in the tech world.

Dubal: It happened. The first time I interviewed him, I was like, “Oh, great. This is a tech bro who’s looking out for people.” But then we continued meeting, and that last time we met was so powerful for me. It gave me goosebumps. I realized [his disillusionment] was actually necessary [in order for him to relate to people]. That was what was going to actually impact change.

Meredith [Whittaker of AI Now] has been amazing at facilitating those kinds of connections. That is really important to her. The more I talked to her, the more I realized that’s part of the way she thinks about politics: much less through the lens of philanthropy and doing good for those who are less fortunate, and much more through bilateral power building, as opposed to top-down power building.

Epstein: Fascinating. That’s precisely why I’m keen to observe Whittaker’s and AI Now’s progress and to help my readers stay informed about it.
To wrap up: you wrote a piece, in 2017, called “Winning the Battle, Losing the War.” Where are we now?

Dubal: We are winning the battle and losing the war.

Epstein: No change since 2017?

Dubal: No change, right. I mean, there have been significant wins, and we’re definitely going in that direction. We just won AB5 in California. It’s something that’s likely to spread all over the country and maybe even the world, like the re-regulation of industries that were deregulated years ago, and recognition that all workers, whether or not they are working through an app on their phone, deserve basic rights. People keep telling me, like, “Oh, if Uber drivers got a minimum wage, would that be it?” No. No one just wants a minimum wage. And that’s certainly not sharing power.

It certainly doesn’t address the myriad issues raised by tech platforms, including the data and privacy issues that arise. But if you have enough money to go home and put food on the table, then you have the capability of going to organizing meetings, then you have the ability to form solidarities with other workers, then you have the time to do those things because you’re not working three jobs.

[AB5] was a huge victory. I organized with Uber drivers in San Francisco, and they’re so elated. They feel they’ve been so exploited over the last six years. They’re so angry that even if they don’t gain anything out of it substantively, materially, the fact that they are making the company feel the burn (no reference to Bernie Sanders) has been hugely empowering. They’ve been able to direct their anger at something. Because Uber has done a really good job, and no one has really covered this (I’ve written about it a little bit), of making it impossible for the workers who want to better their working conditions to have a place to direct their energies.

They can’t go to an office. They can’t go to City Hall, because they’re all regulated at the state level. They are regulated by commissions within the state, and in California that’s the CPUC. It’s just a huge building, an office building. Who do you go to? They’ve done such a good job of diffusing the potential to go to a place to feel anger, to have that affective energy. And if this goes through, if AB5 is enforced in some way, it’ll be much easier to do something like that, I think.

Epstein: The Uber situation you described is another way of understanding the concept of structural oppression. If there is no physical structure in which people can voice their grievances, that underscores the psychological and political structures that are controlling people’s lives.

Dubal: Totally.

Epstein: As I ask at the end of all my TechCrunch interviews: how optimistic are you about our shared human future?

Dubal: Oh, I don’t know. I just did an event with Naomi Klein, and I feel very, very scared about climate apocalypse. I feel like we are definitely making a turn in this country with regard to privacy issues and workers’ rights, and I hope that that’s a stepping stone towards the Green New Deal.

I hope that once people don’t have to worry about getting food on the table on Tuesday, we will have the time to think about 10 years from now. So, I guess that’s kind of hopeful, but I’m really sort of feeling yucky about our existential crises.