To paraphrase an oft-seen Internet trollism, ‘Internet comments are the worst thing since Hitler’. By which I mean: they suck. They suck bad. No, they really suck.
The problem we have right now with Internet comments is clear: negativity and nastiness are burying or discouraging everything else.
When, as an online writer, you’re relieved to find the comments clocked up by your post are only spammers touting ways to earn $$$ when you work from home and/or posting a link to a video about BOOSTING your sales leads by 250% in just 2 weeks! — rather than, say, random vitriol, abject stupidity or violent outpourings — you know something is wrong, very, very wrong with the standard reader feedback structure of the Internet.
For pity’s sake tech world, there has to be a better way!
And I don’t mean turning off comments entirely — although more websites are doing that, including some media companies (apparently) lacking the resources and/or willpower to effectively police the effluent that inexorably flows from the bottom half of the Internet.
But switching comments off entirely is to surrender to the trolls’ bile-colored flag.
And while it might be more pleasing to hear silence than even the most half-hearted troll dirge, it also means you’re silencing genuinely engaged readers — who probably had something interesting to say, if only they felt it worth their while to say it. So you’re giving up the chance to grow a community of your own.
And sure, any media company can offload comments to social networks — outsourcing commenting to Facebook and Twitter, where your content is most probably being recirculated anyway. But with more of web users’ attention being gated onto social tech platforms — and especially as these dominant platforms formulate ever more controlling media/content distribution strategies — that’s arguably very short-term thinking.
On the shifting sands of online publishing there should really be fresh impetus for media firms facing commoditization on the social web, and attacks on their ad-powered business models, to retain and sustain a strong brand identity of their own. And doing that absolutely requires having an engaged community. And how do you get more people engaged with your stuff? You let their voices be heard as a part of the proposition you offer.
A lot of the tech companies… don’t really consider the content their responsibility.
And so into this embattled comment arena, steps Civil: a startup co-founded by former online comment moderator, Aja Bogdanoff, who got tired of trying to firefight the endless stream of comments spewing out of an unregulated pipe. Her answer? A comment plug-in that requires users to engage with a framework of civility and quality as the specific entry criteria for being able to air their views.
She argues that the current model of digital services using human moderators to assess comment content for acceptability — typically after the fact — is “not a very efficient way to do it”. And the result? Well, the result is clear… Welcome to Trolltown: Population, the idiot who shouts loudest.
So how do you determine civility and quality? You ask readers to be the judge of that. Ergo you crowdsource comment moderation. In Civil’s case this means: review three other comments, and you get to post your own.
The system also asks comment posters to review the quality and civility of their own comment before it gets submitted for review by others (and potential posting) — so effectively reminds them to sanity check their own opinions.
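The two-step flow described above — self-review first, then three peer reviews before your own comment enters the queue — can be sketched roughly as follows. This is a minimal illustration of the mechanism as reported, not Civil’s actual implementation; all names (`Comment`, `Session`, `REVIEWS_REQUIRED`) are hypothetical.

```python
from dataclasses import dataclass, field

REVIEWS_REQUIRED = 3  # per the article: review three comments to post one


@dataclass
class Comment:
    author: str
    text: str
    ratings: list = field(default_factory=list)  # (civility, quality) pairs


@dataclass
class Session:
    """Tracks one commenter's progress through the posting flow."""
    self_reviewed: bool = False
    peer_reviews_done: int = 0

    def self_review(self, civil: bool, quality: bool) -> None:
        # The poster first sanity-checks their own comment for
        # civility and quality before it can go anywhere.
        self.self_reviewed = civil and quality

    def review_peer(self, comment: Comment, civil: bool, quality: bool) -> None:
        # Each peer review is recorded against the reviewed comment.
        comment.ratings.append((civil, quality))
        self.peer_reviews_done += 1

    def may_submit(self) -> bool:
        # Posting unlocks only after the self-check and three peer reviews.
        return self.self_reviewed and self.peer_reviews_done >= REVIEWS_REQUIRED
```

In practice the platform, not the user, would pick which comments get reviewed (see the algorithmic selection Bogdanoff describes), but the gate itself is this simple.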
Do these extra steps result in fewer overall comments being posted? The team claims it’s only seeing a marginal impact on that front so far — claiming a six per cent abandonment rate during testing.
Users also accrue a civility rating, based on their behavior within the commenting platform, which is used to control their commenting permissions in certain flash-point scenarios (see below for Civil’s ‘siege mode’ feature, for instance).
It’s really a very simple idea, technically speaking — so simple you have to wonder why it hasn’t been done before. But that question just leads to a much more interesting thought: that bad Internet comments are actually the byproduct of neglect. A direct consequence of thoughtless design and zero/minimal investment. Much like weeds growing on an untended plot — if you don’t create the conditions where good stuff can grow, really what else would you expect?
“We get that question a lot,” says Bogdanoff, when I ask the obvious question — i.e. why this hasn’t been done before. “A lot of the tech companies right now don’t really consider the content their responsibility. They provide the tools to let site owners post comments but then what people actually publish they say well that’s up to you to moderate. Then it’s the site owner’s responsibility to put expensive moderation teams on it. And obviously they can’t keep up.”
“I think there’s a lot that tech can do to shape how people interact with each other online,” she adds. “I have a lot of theories about why other companies haven’t done it — but I don’t know for sure.”
We all deserve to have a decent culture online.
In Civil’s case, there is also some algorithmic special sauce going on in the background — to determine which readers are asked to rate which comments. So it’s not quite as simple as it appears on the surface. “We select the comments; so it’s not a popularity contest,” is how Bogdanoff explains it.
She won’t be drawn on how exactly these algorithms work — given that any comment platform needs to guard against the risk of trolls trying to game its mechanisms en masse (e.g. like the co-ordinated comment attacks in and around #Gamergate). But she will say it involves looking at “the behavior of the person”.
“We’ve put a lot of thought into making it so that it’s not a popularity contest. So that it’s not something that’s easy to bring a bunch of buddies in and change the vote,” she adds. “Over time we see the pattern of a person’s behavior and the system is able to determine who’s acting in a trustworthy manner.”
In a nutshell, Civil’s thesis is that consequences and social structures in real world public spaces help keep (most) people doing and saying civil things in public most of the time. But online? Well, that’s a whole other lawless story.
“Online we don’t have those social structures. We don’t have those systems in place that constantly remind people that indulging in their baser instincts will lead to negative consequences… So it’s not so much why are people doing these things — it’s ‘what’s missing that keeps people from doing these things?’,” argues Bogdanoff.
“Instead of saying ‘gosh why are people so horrible online?’ it’s been more helpful for us to look at it from the perspective of ‘why aren’t people this horrible this often in face to face situations?'”
Why ask readers to judge comments on both civility and quality? That’s to try to encourage people to set aside more emotive personal responses — that might mean they like or dislike a particular comment — and rather push them to make a more impartial assessment, based on whether a comment is honestly composed vs intended to troll, or is acceptable vs abusive.
“What we really need to do is we need to get in there and we need to change how people are submitting their content, submitting their comments. And we need to make sure that we’re giving them good reasons to behave well. And we also need to make sure that we’re equipped to know what the content of a comment is,” she adds.
Civil Comments requires users register to comment but does not enforce a real names policy — unlike, for example, Facebook Comments. The team argues that allowing users to comment using pseudonyms is an important protection for people participating in a public debate.
“Real name policies… often do more harm to the folks who are acting in good faith than they do in preventing bad behavior/bad actors,” says Civil’s other co-founder and designer, Christa Mrgan. “A real name can really be used against you. I think it’s a terrible thing to require people to put themselves out there in that way just in order to participate… I think that’s a very high price to ask.”
So at the business end of this crowdsourced comment moderation pipe, what does the site user see? All comments but with quality/civility floating on top? Or only the comments with a minimum bar of quality/civility?
That’s something it’s still tweaking. “Right now, the default is three ratings per comment, with a 2/3 approval system, but we’re actually experimenting with different variations on that, for lower-volume sites. We can also make this configurable according to what the publisher wants,” says Mrgan.
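The default rule Mrgan describes — three ratings per comment, published on 2-of-3 approval, with the numbers configurable per publisher — amounts to a short decision function. The sketch below is an assumption-laden illustration of that rule, not Civil’s code; the parameter names are invented.

```python
def is_approved(ratings, required=3, approve_fraction=2 / 3):
    """Decide whether a comment is published, given its peer ratings.

    ratings: list of booleans (True = reviewer approved the comment).
    Defaults mirror the reported setup: wait for three ratings,
    publish if at least two of the three approve. Both knobs are
    assumed configurable per publisher, as Mrgan says.
    """
    if len(ratings) < required:
        return False  # still waiting on reviewers
    approvals = sum(1 for r in ratings if r)
    return approvals / len(ratings) >= approve_fraction
```

A lower-volume site might, say, drop `required` to 2 — which is presumably the kind of variation the team is experimenting with.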
The team, which started working on the business in January this year, has raised some $340,000 in angel investment thus far. Their first product (Civil Comments) is still in public testing, on mock sites they’ve created themselves. It’s not yet installed on any third-party sites — “hopefully soon… we’re in talks”, they say on the customer front.
The initial business model is SaaS — they’re selling B2B, but plan to open up a freemium product for individual bloggers in future. And they have a pipeline of ideas for products that extend beyond Civil Comments, albeit they’re not discussing those plans publicly at this stage.
“When you start looking at designing platforms this way we’ve a lot of other things that we’d like to be able to tackle once this is established,” says Mrgan.
Is there perhaps also a licensing business here? It strikes me there are ideas here that could be applied to social platforms, such as Twitter, which has had a sustained problem with co-ordinated abuse attacks (albeit it’s doing more behind the scenes to try to prevent the spread of abusive tweets).
“Maybe,” is all they’ll say, with a laugh, when I suggest this.
They’re upbeat about their timing, despite moves by websites to turn off comments and a general sense of fatigue among web users about the entire commenting process owing to the effort vs harassment ratio.
“I think a lot of these sites understand, on some level, that these are their most engaged users — this is one of their most valuable assets are their communities, right, the communities formed around their content,” says Bogdanoff.
“With the platforms that currently exist for a lot of these companies the numbers just don’t work out — because it costs so much money to moderate comments at high volume. With something like Civil Comments that really helps reset that balance and puts more control into the hands of their best community members, I think we’re going to see more sites willing to jump back into this. Because it really does benefit them.”
“It’s like having a mini social network right there on your site,” adds Mrgan.
I posit that it’s perhaps no accident that a startup with a mission to fix online commenting is led by two female founders — given how much hatred women frequently have to endure in online comments.
The pair are diplomatic on this point. “It’s definitely a problem. It’s a problem for everybody really. We all deserve to have a decent culture online,” is what they say.
How confident are they that what they are building can survive a co-ordinated abusive comment onslaught such as the one triggered by Gamergate?
“I don’t know if we want to go on record that we’re confident we can’t be gamed… I think that we have something that goes a long way towards addressing co-ordinated attacks. I don’t think anything is ever going to be perfect — again we use the real world as our benchmark, and in the real world nothing is 100 per cent safe from co-ordinated attack, right? But we have built in systems that we think are going to make a difference,” says Bogdanoff.
One of these systems is a feature called ‘siege mode’ where an individual article can be put into a partial comment lock down. “It’s still able to accept comments — you can still go and comment on this article if you have a civility rating that is high enough, if your trust level is high enough,” explains Mrgan.
“If you don’t have that civility rating — if you’re a brand new commenter. And it will fall into this mode automatically, based on suspicious activity. Or the publisher themselves can put an article into siege mode. And so if you go there as a new commenter, or your civility rating just isn’t high enough, you’ll get a little message saying: ‘I’m sorry this article is receiving a high amount of suspicious activity. The comment section is restricted to people with a civility rating of 80 per cent or higher. But you can improve your civility rating by going and leaving civil comments elsewhere on the site.’”
“It takes work and effort to raise your civility rating,” she adds.
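As described, siege mode boils down to a per-article gate on the commenter’s civility rating. A minimal sketch, assuming a 0–1 rating and the 80 per cent threshold Mrgan quotes (the function and parameter names are hypothetical):

```python
SIEGE_THRESHOLD = 0.80  # the "80 per cent" bar from Mrgan's example


def can_comment(article_under_siege: bool, civility_rating: float) -> bool:
    """Gate for a single article's comment section.

    Articles not under siege accept comments as normal. Under siege
    (triggered automatically on suspicious activity, or manually by
    the publisher), only commenters whose accrued civility rating
    clears the threshold may post; new or low-rated commenters are
    shown the restriction message instead.
    """
    if not article_under_siege:
        return True
    return civility_rating >= SIEGE_THRESHOLD
```

The design point is that the lockdown is partial: trusted community members keep talking while drive-by attackers are turned away — and the only way past the gate is to earn a rating by commenting civilly elsewhere on the site.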
A comment platform designed to make trolls sweat? Now you’re talking…