Compliance, ethics, and live calls

Reminder: Live conference call today at 2pm EDT / 11am PDT

We have TechCrunch writers Kirsten Korosec and Kate Clark talking all things tech today — be sure to check your inboxes for dial-in details a bit later today, and come armed with questions!

A roundtable conversation on tech ethics

Our resident humanist and ethicist Greg Epstein published a roundtable conversation with three notable scholars of ethics, debating what exactly tech ethics is and what the debates around it mean. Hilary Cohen, Kathy Pham, and Jessica Baron had much to offer on the subject, and on why it is a bit more complicated than the term suggests.

Cohen described tech ethics this way:

I think at various times, depending on the conversation, tech ethics refers to at least three distinct things. The first, probably the most narrow, is an actual branch of analytic philosophy that looks at the impact of technology on society. That’s philosophers working to ask new questions on these topics, then systemizing and giving us vocabulary to understand them. That’s how I see the work of Nick Bostrom, or I know you recently were in conversation, Greg, with James Williams. That’s how I see what [Williams] did, during his work at Oxford.

The second category of tech ethics is a set of dedicated attempts to get technologists to engage with the social and ethical dimensions of their work, as they’re building products, or writing code, or studying computer science in school. You see efforts on this front both at universities and within companies. I think at least Kathy and I have been involved with this in some respect. Kathy’s helping to build, promote, and scale up nascent efforts that are underway.

The third [category] is probably the least clear, but also the reason for the question. [Tech ethics has] now become a placeholder for all kinds of questions about a society that’s in flux economically, culturally, and politically, in which many of the most profound shifts have either been driven by or exacerbated by the emergence of new technologies.

What is “dark data” and how do you deal with it?

Lisa Hawke talks about the legal implications of the rise of dark data and how companies are trying to do more to solve the problem:

So what is dark data and why is it a topic of interest for legal and compliance teams? Dark data is data that is difficult to search or access, or both. It is relevant to data risk management because searchability and accessibility of data are relevant to being able to find and secure it. Dark data can make it difficult or impossible for legal and compliance teams to find information when searching their organization’s files and documents. A common dark data example is chat files coming out of modern messaging apps, like Slack or WhatsApp. Unlike email messages, there is no chat message “standard,” resulting in each company having their own file format (e.g. Slack uses a textual JSON schema, iMessage uses a binary SQLite database).
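To make Hawke's point concrete, here is a minimal sketch (not from her column) of what searching across two such chat formats can look like in practice. The Slack export layout, the iMessage table and column names, and the file paths below are assumptions based on publicly documented export formats and may differ in a given environment; the point is simply that every app needs its own adapter before its messages become searchable at all.

```python
# Hypothetical sketch: pulling message text out of two very different "dark data"
# containers so it can be keyword-searched. Paths and schemas are assumptions.
import json
import sqlite3
from pathlib import Path

def slack_messages(export_dir):
    """Yield message text from a Slack workspace export (one JSON file per channel per day)."""
    for json_file in Path(export_dir).glob("*/*.json"):
        for entry in json.loads(json_file.read_text()):
            text = entry.get("text")
            if text:
                yield text

def imessage_messages(chat_db):
    """Yield message text from an iMessage chat.db SQLite database."""
    conn = sqlite3.connect(chat_db)
    try:
        for (text,) in conn.execute("SELECT text FROM message WHERE text IS NOT NULL"):
            yield text
    finally:
        conn.close()

def search(keyword, export_dir="slack_export", chat_db="chat.db"):
    """Naive keyword search across both sources -- each new app means another adapter."""
    hits = []
    for source, messages in (("slack", slack_messages(export_dir)),
                             ("imessage", imessage_messages(chat_db))):
        hits.extend((source, m) for m in messages if keyword.lower() in m.lower())
    return hits

if __name__ == "__main__":
    for source, message in search("contract"):
        print(f"[{source}] {message}")
```

Even this toy version shows why dark data is a compliance headache: there is no single query that covers every messaging tool, so legal teams either build and maintain per-app connectors or leave that data effectively invisible.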

Reminder: guest columns

We have published a smattering of interesting guest posts on TechCrunch and Extra Crunch from experts in their fields who want to spread their smart ideas to more people. Know someone brimming with ideas who deserves a larger platform? Definitely drop us a line and read what we are looking for.

Thanks

To every member of Extra Crunch: thank you. You allow us to get off the ad-laden media churn conveyor belt and spend quality time on amazing ideas, people, and companies. If I can ever be of assistance, hit reply, or send an email to danny@techcrunch.com.