CTRL+T podcast: Artificial intelligence may become a human rights issue

Welcome back to another glorious episode of CTRL+T. This week, Henry Pickavet and I explore Amazon’s new cashier-less stores that promise no waiting in line — except to get in — and Uber’s newest C-level executive hire.

Later in the episode, I rage with Safiya Umoja Noble, a professor at the University of Southern California and author of “Algorithms of Oppression: How Search Engines Reinforce Racism.” Full disclosure: I went to USC, but Noble was not a professor there at the time. Additional disclosure: I wish I could have had her as a teacher because she’s smart as hell. Final disclosure: Henry applied to USC but was rejected.

In her book, Noble discusses the ways in which algorithms are biased and perpetuate racism. She calls this data discrimination.

“I think that the ways in which people get coded or encoded particularly in search engines can have an incredible amount of harm,” Noble told me on this week’s episode of CTRL+T. “And this is part of what I mean when I say data discrimination.”

Noble’s book came out just this month, but she’s already working on her next research-driven project: exploring artificial intelligence and its potential negative effects.

Specifically, Noble is “trying to think about centering people who are already living in the margins — who are already living under racist and sexist stress — and how these technologies might be exacerbating the harm that comes their way.”

In this century, Noble is betting, artificial intelligence will become a human rights issue. Check out the rest of the interview on CTRL+T. Be sure to subscribe, rate this sheezy five stars, and help us keep the lights on and the content rolling.