Will the future of work be ethical? Future leader perspectives

In June, TechCrunch Ethicist in Residence Greg M. Epstein attended EmTech Next, a conference organized by the MIT Technology Review. The conference, which took place at MIT’s famous Media Lab, examined how AI and robotics are changing the future of work.

Greg’s essay, Will the Future of Work Be Ethical? reflects on his experiences at the conference, which produced what he calls “a religious crisis, despite the fact that I am not just a confirmed atheist but a professional one as well.” In it, Greg explores themes of inequality, inclusion and what it means to work in technology ethically, within a capitalist system and market economy.

Accompanying the story for Extra Crunch are a series of in-depth interviews Greg conducted around the conference, with scholars, journalists, founders and attendees.

Below he speaks to two conference attendees who had crucial insights to share. Meili Gupta is a high school senior at Phillips Exeter Academy, an elite boarding school in New Hampshire; Gupta attended the EmTech Next conference with her mother and has attended with family in previous years as well; her voice and thoughts on privilege and inequality in education and technology are featured prominently in Greg’s essay.

Walter Erike is an experienced, SAP-certified independent management consultant who works as an SAP S4 Implementation Senior Manager. He is also pursuing his MBA at Cornell’s Johnson Graduate School of Management. Between conference sessions, Erike and Epstein talked about diversity and inclusion at tech conferences and beyond.

Meili Gupta is a senior at Phillips Exeter Academy. Image via Meili Gupta

Greg Epstein: How did you come to be at EmTech Next?

Meili Gupta: I am a rising high school senior at Phillips Exeter Academy; I’m one of the managing editors for my school’s science magazine called Matter Magazine.

I [also] attended the conference last year. My parents have come to these conferences before, and that gave me an opportunity to come. I am particularly interested in the MIT Technology Review because I’ve grown up reading it.

You are the Managing Editor of Matter, a magazine about STEM at your high school. What subjects that Matter covers are most interesting to you?

This year we published two issues. The first featured a lot of interviews with top AI professors, like Professor Fei-Fei Li at Stanford. We did a review of her work and an interview with Professor Olga Russakovsky at Princeton. That was an AI special issue, and being at this conference, you hear about how AI will transform industries.

The second issue coincided with Phillips Exeter Global Climate Action Day. We focused both on environmentalism clubs at Exeter and environmentalism efforts worldwide. I think Matter, as the only STEM magazine on campus, has a responsibility to do that.

AI and climate: in a sense, you’ve already dealt with this new field people are calling the ethics of technology. When you hear that term, what comes to mind?

As a consumer of a lot of technology and as someone of the generation who has grown up with a phone in my hand, I’m aware my data is all over the internet. I’ve had conversations [with friends] about personal privacy and if I look around the classroom, most people have covers for the cameras on their computers. This generation is already aware [of] ethics whenever you’re talking about computing and the use of computers.

About AI specifically, as someone who’s interested in the field and has been privileged to be able to take courses and do research projects about that, I’m hearing a lot about ethics with algorithms, whether that’s fake news or bias or about applying algorithms for social good.

What are your biggest concerns about AI? What do you think needs to be addressed in order for us to feel more comfortable as a society with increased use of AI?

That’s not an easy answer; it’s something our society is going to be grappling with for years. From what I’ve learned at this conference, from what I’ve read and tried to understand, it’s a multidimensional solution. You’re going to need computer programmers to learn the technical skills to make their algorithms less biased. You’re going to need companies to hire those people and say, “This is our goal; we want to create an algorithm that’s fair and can do good.” You’re going to need the general society to ask for that standard. That’s my generation’s job, too. WikiLeaks, a couple of years ago, sparked the conversation about personal privacy and I think there’s going to be more sparks.

Seems like your high school is doing some interesting work in terms of incorporating both STEM and a deeper, more creative than usual focus on ethics and exploring the meaning of life. How would you say that Exeter in particular is trying to combine these issues?

I’ll give a couple of examples of my experience with that in my time at Exeter, and I’m very privileged to go to a school that has these opportunities and offerings for its students.

Don’t worry, that’s in my next question.

Absolutely. With the computer science curriculum, starting in my ninth grade they offered a computer science 590 about [introduction to] artificial intelligence. In the fall, another 590 course was about self-driving cars, and you saw the intersection between us working in our robotics lab and learning about computer vision algorithms. This past semester, a couple of students, myself included, helped to set up a 999: an independent course which really dove deep into machine learning algorithms. In the fall, there’s another 590 I’ll be taking called social innovation through software engineering, which is specifically designed for each student to pick a local project and to apply software, coding or AI to a social good project.

I’ve spent 15 years working at Harvard and MIT. I’ve worked around a lot of smart and privileged people and I’ve supported them. I’m going to ask you a question about Exeter and about your experience as a privileged high school student who is getting a great education, but I don’t mean it from a perspective of it’s now me versus you.

Of course you’re not.

I’m trying to figure this out for myself as well. We live in a world where we’re becoming more prepared to talk about issues of fairness and justice. Yet by even just providing these extraordinary educational experiences to people like you and me and my students or whomever, we’re preparing some people for that world better than others. How do you feel about being so well prepared for this sort of world to come that it can actually be… I guess my question is, how do you relate to the idea that even the kinds of educational experiences that we’re talking about are themselves deepening the divide between haves and have nots?

I completely agree that the issue between haves and have nots needs to be talked about more, because inequality between the upper and the lower classes is growing every year. This morning, the talk by Mr. Isbell from Georgia Tech was really inspiring. For example, at Phillips Exeter, we have a social service organization called ESA, which houses more than 70 different social service clubs. One I’m involved with, junior computer programming, teaches programming to local middle school students. That’s the type of thing, at an individual level and smaller scale, that people can do to try to help out those who have not been privileged with opportunities to learn and get ahead with those skills.

What Mr. Isbell was talking about this morning was at a university level, and also about tying in corporations to bridge that divide. I don’t think the issue itself should necessarily scare us away from pushing forward to the frontier — say, the possibility that everybody who does not have a computer science education won’t have a job in five years.

Today we had that debate about robots, people’s jobs and robot taxes. That’s a very good debate to have, but it sometimes feeds a little bit into the AI hype, and I think it may be a disgrace to society to try to pull back technology, which has been shown to have the power to save lives. It can be two transformations happening at the same time: one that’s trying to bridge an inequality, which is going to come in a lot of different and complicated solutions at multiple levels, and a second that allows for a transformation in technology and AI.

What are you hoping to get out of this conference for yourself, as a student, as a journalist, or as somebody who’s going into the industry?

The theme for this conference is the future of the workforce. I’m a student. That means I’m going to be the future of the workforce. I was hoping to gain some insight about what I may want to study in college. After that, what type of jobs do I want to pursue that are going to exist and be in demand and really interesting, that have an impact on other people? Also, as a student who’s particularly interested in majoring in computer science and artificial intelligence, I was hoping to learn about possible research projects that I could pursue in the fall with this 590 course.

Right now, I’m working on a research project with a professor at the University of Maryland about eliminating bias in machine learning algorithms. What type of dataset do I want to apply that project to? Where is the need or the attention for correcting bias in AI algorithms?

As a journalist, I would like to write a review summarizing what I’ve learned so other [Exeter students] can learn a little too.

What would be your biggest critique of the conference? What could be improved?

I kind of think there should be more AI involved, because a lot of the time there’s a generational divide between the younger generation, which is very versed in this digital literacy, I could call it, and an older generation. There is a gap, yet the older generation is often the one making decisions, and we saw that this past year when I was watching the video online of Congress interviewing Mark Zuckerberg. A lot of the questions they asked weren’t the right ones to be asking. There are a lot of fantastic and really intelligent business professionals in this space, and as much as it’s good to be talking about jobs and the impacts on the workforce, maybe conferences like this should also focus on being an avenue for educating an older generation that has not had a computer science education.

You and your peers are taking multiple classes in these areas and you’re seeing not everybody knows as much as you do, actually. I don’t.

You don’t?

Not about the technical aspects of AI! A lot of really good work is being done by students globally on climate right now. How worried about it are you? Some say we need to panic immediately, others are taking a much more optimistic view.

I’d like to say I’m in the middle and I think we need people at both ends. Hopefully more people saying we need to panic now. Personally, I am a little bit more relaxed about that. If I use a plastic straw some of my friends will yell at me and I’ll say, “Oh yeah, sorry. I forgot.” But overall, I think there’s a new wave of climate action awareness with the younger generation and I’m really glad to be sort of one of those people who’s on that wave.

Do you have a lot of friends who are really tense about things like plastic straws these days?

I do. I have friends who will yell at their friends about that. I have a lot of friends who are doing environmental internships and who are involved with the environmental clubs on campus. I brought some of them in to write for Matter magazine when they heard we were doing an environmental issue. I think it’s up to this young generation and you hear that a lot, to be the ones pushing for the world that we want to live in.

What about taxes? Like many students I work with at Harvard and MIT, you’re likely to be surrounded by people who, at least until the climate crisis sends us all off the cliff, will be wealthier than average. What about the argument that these tech conferences and companies or whatever talk about changing the world and making it better and giving back and all of that, maybe one thing they just need to do is just pay their taxes because public schools need to be better and we need to address inequality and maybe the globally wealthy are not the best people to do that. What are your feelings about that?

I’ve debated the topic ‘this house believes that we should implement a carbon tax and a wealth tax,’ multiple times. I believe we should implement these types of taxes. To my 17 years of understanding, inequality is growing. More wealth and money is accumulating at the top. Already as I see my brother go into computer science, he’s telling me there’s a lot of money being poured by corporations into the field.

This morning we looked at the graph where there are more people working at the low wage and low skill end than at the high wage and high skill end, yet those people aren’t getting the money they did several years ago. My friends and peers believe taxes can be justified. As someone who just filled out their first tax form, for this internship that I’m getting paid for over the summer, I don’t think I have to pay any taxes because I’m getting paid so little, but I have to write a little zero on the tax form. That’s something that we’re willing to do.

The last question I ask all my interviewees: how optimistic are you about our shared human future?

I’m optimistic. I mean, people like to say I have a positive attitude. I will agree with that wholeheartedly. I’ve been thankfully learning about computers and AI for a little while, and there’s a long way to go. It has promise. It has dangers, but already we’re having these types of conversations, like we are right here, about those types of dangers, and that shows awareness and the initiative in our society in general to combat any dangers. Let’s go to the future, yeah.

Thank you, Meili.

Walter Erike is an independent consultant and SAP Implementation Senior Manager. Image via Walter Erike

How could a conference like this one be most relevant, engaging, or appealing to other Black people in tech and in related fields? What do you feel this conference is doing right and not so right?

Walter Erike: In addition to thinking about ethnic diversity, I’ve also thought about possible geographic diversity. I don’t have evidence of this, but if I listen to many of the companies in the room, and if I consider where their headquarters are, we’re very West Coast, Silicon Valley focused, and New York, finance focused. And there are a lot of really intelligent, motivated Americans living in the center, the Midwest, the breadbasket of America. From what I’ve heard and seen, they’re not represented.

As was mentioned, research shows manufacturing jobs in America have left and they’re not coming back. Our output, as American manufacturing, is still quite high, because we have some resources, some labor, [and] we’ve combined them with automation. But people, cities, and other institutions in the Midwest have to devise strategies on how [to deal with the loss of manufacturing jobs].

I don’t want to pile on MIT, because I don’t think it would be fair. But perhaps it would help if they were to reach out to the Urban League, reach out to the National Black MBA, reach out to the Consortium, reach out to NAAAP, which is a national organization of Asian American professionals, to get more ethnic diversity in the room.

Ethnic diversity will help improve the solutions, just by having a diverse set of minds in a room. And it will help mitigate the risk of having a biased dataset. Because different people see different things in a dataset. So the more diversity you have, more women, more diversity, and so forth, the more you can mitigate the risk of a biased dataset.

Let’s say we could magically make the outreach strategy you recommended happen, and the organizers of a conference like this were to get a much more diverse crowd in the future. If they wanted a good number of those people to come back for a second conference, how might they design the content of the conference to make it most relevant, most engaging, most helpful?

That’s a very good question. And I like to recall my early experience in IT. Starting off as a developer analyst, I realized there was a huge disconnect between developers and business process experts, like, say, your finance leader or your marketing leader.

That’s quite similar to the disconnect now between people who have advanced knowledge of artificial intelligence and machine learning, and people who actually use the tools. So the content, in my opinion, is quite academic and theoretical. It needs to be taken down to the functional level.

You told me earlier that at a certain point in the conference, you were glad that it at least came down from the clouds into the trees. What do you see on the ground that you wish you could learn more about at a conference like this one, that would be better for the social impact of the conference as a whole?

Let’s consider logistics and supply chain: currently one of the largest sources of expense for many companies. Artificial intelligence and machine learning can help plan a more efficient supply chain by knowing when to deliver, how much to deliver, considering economies of scale for full truckloads versus less than a full truckload, considering promotions for customers that order full truckloads, thus encouraging the ordering of full truckloads, so on and so forth.

Then there’s a whole pricing component. Well, if we have your purchase history, we know your birthday, we know what you like to buy, where you live, this can influence our ability to offer you promotions, and when [to do so]. Or maybe you buy so frequently we should stop offering you promotions.

To be more granular, more practical, artificial intelligence and machine learning can form, say, decision trees, and then present leadership with options: say, do you accept a 10% increase in the price of this product based on the output of the algorithm? Yes or no? And if you increase 10%, here’s the projected outcome for your organization. That’s the level it needs to go to, [allowing] decision-makers to make quick decisions, based on all the numbers that have been crunched.

That makes me want to ask a bit about your background, how you came to these insights, and how you ended up at this conference in the first place.

Image via Walter Erike

I grew up in New York. I spent a lot of time in Harlem and also on Long Island. Many of my uncles are engineers. One day, when I was a youngster in the ’90s, one of my uncles brought home a green screen computer that had a floppy disk about the size of a VHS cassette. I was immediately intrigued by the computer and what it could do. I learned some coding, [then] continued to learn more about the usability of the computer. When it was time for me to pursue a major [at Clarkson University,] I chose computer science and information systems, [and] I was able to merge my affinity for computers with my passion for commerce and business, [gaining] a foundation in finance, accounting, supply chain, human resources and marketing. This foundation [propelled] me into a consulting career, where I implement and enhance ERP software.

How did you decide to go to EmTech Next?

Well, I don’t want to become obsolete and presently, my solutions are based on static data. They’re based on the sales, invoicing, [and] logistics data that’s available. But none of my solutions have included artificial intelligence or machine learning, and I see that as the next wave of solution architecture, which is my specialty. I am here to see what I can glean from all the research that has been done by various professors and students and so forth.

George W. Bush once spoke of the “soft bigotry of low expectations.” Maybe institutions like the ones represented at this conference don’t do enough recruiting in a place like Harlem, where you grew up, because they don’t associate it with great engineering talent or great technical talent. But maybe they just don’t know how to look for or develop that talent.

What’s your perspective on that, as a passionate tech person from a non-traditional tech background?

This is a very good question. I’ve been to various conferences, and often there’s a lack of diversity. I don’t think it’s purposeful. The efforts are well-intentioned. They just need help.

Perhaps there needs to be a change in mindset. Oftentimes in technology, we put the technology ahead of the problem or the solution. We’re so keen to push a new technology that can do X, Y and Z, but we don’t focus on the problem.

So we can have people in a room who may be under the impression they know what is best for everyone, without having the women in the room, without underrepresented groups, without people in the breadbasket of America in the room. I think that can only lead to disarray, to inefficiencies. Because it’s not an inclusive process — and not purposely. It’s just the way things have been done. I would encourage organizations in this position to reach out to different [organizations] and try and find out, what are we missing?

When I was at Clarkson, there was an organization called East Meets West, [whose] purpose was a transfer of culture and ideas from American students to students from Asian countries. We had American students, Chinese students, Indian students, Taiwanese students, [etc.] This was one of the most amazing experiences of my life. Soon after graduation, I even took a trip to China and spent time with one of the friends that I made in the program. So I think this sort of cultural exchange is very important. Whereas here, it’s quite homogeneous. And that should change for the sake of society, and for the sake of the data.

If an executive or decision-maker reading this interview, or someone else in your extended network were thinking to themselves, ‘I have to admit I’m one of these leaders who is not channeled into meaningful diversity,’ how would they reach out so you would feel good about hearing from them? They might worry about saying the wrong thing or making a wrong impression.

Well, I would like to say that the individual would have nothing to be ashamed of. We all have limitations, we all have subconscious bias. I have just as many.

I don’t see myself on a pulpit preaching diversity. I’m just preaching awareness and unity. As much as I want diversity, I also want to collaborate. If we work together, there should be progress. So I would encourage anyone or any organization to feel free to reach out to me.

My readers couldn’t do better than reach out to you. My last question, at the end of all my TechCrunch interviews: how optimistic are you about our shared human future?

I am optimistic because I don’t know if I would have imagined, say 10 or 15 years ago, that I would attend a conference [where I could talk to] a C level employee at a [tech] company and hear things like diversity and inclusion, or the evolution of work and so forth. So I’m impressed and hopeful.

It’s as if our economy is taking on a more human form, and that is good. Profit can be had with respect for human dignity; that’s the lesson [for] many organizations on the cutting edge today. It’s not a zero-sum game. You can have both profits and people.

So as long as we continue along that track and we do not forget about the people in Toledo, Ohio, in St. Louis, Missouri, in Detroit, Michigan, I think we’re going in the right direction and it will make us a more productive country.

Beautifully said.