Crowdsourced labor startup CrowdFlower is launching version 2.0 of its Real Time Foto Moderator today, with updates that should make RTFM more useful for apps with adult and edgy content.
CrowdFlower first announced RTFM last month, pitching it as a self-serve tool that app developers can use to tap into the crowd and make sure the images shared by their users are appropriate. This was the company’s first big launch after co-founder Lukas Biewald (a former roommate of mine from college) rejoined as CEO, saying he wanted to create self-serve products that could be used beyond CrowdFlower’s enterprise customer base.
For version 2.0, RTFM Product Manager Vaughn Hester says the team has made two significant additions. First, customers can now choose between two different “rule sets,” which determine which photos are deemed appropriate and which are not. There’s the stricter rule set, which is what RTFM launched with, and a second, looser one that allows more “borderline scenarios” — the kind that might come up in, say, an adult-oriented dating app. Things like photos of torsos without a face (hmm, it sounds kind of weird to put it that way …), indoor shirtless photos, and visible underwear would be allowed under the new rule set, but not the stricter one, Hester says.
CrowdFlower has also added the option for moderators to explain why a photo was rejected. So when users get notified about the rejection, they know what they did wrong and can try again.
Hester adds that with RTFM, she’s also trying to encourage app developers to be proactive about content moderation. As one illustration of why this is important, she points to the alleged rapes that were reportedly facilitated by dating app Skout. To be clear, she isn’t saying that CrowdFlower’s photo moderation could have prevented Skout’s problems — after all, Skout is already an RTFM customer. However, Hester argues that the Skout case illustrates the enormity of the risks involved. She says she wants to do more with crowdsourced moderation to make apps safe — for example, adding features that review entire social network profiles for inappropriate content, or for signs that a profile was written by someone who’s the wrong age for the app.
“For developers, this often just falls by the wayside,” she says. “We’d like to see a change of the norms, coming from the app store or from developers themselves.”