About a year and a half ago, just in time for National Eating Disorders Awareness Week, Tumblr took a hard stance against blogs on its network that encouraged “self harm.” These included blogs that glorify or promote anorexia, bulimia, and other eating disorders, the company said, as well as those focused on self-mutilation and suicide. The company also said it would revise its Content Policy and start showing Public Service Announcements (PSAs) when users search the site for certain keywords, like “thinspo” or “proana,” for example. Other services like Pinterest and Instagram soon followed suit.
Here’s how well that’s working today.
Tumblr, which thrives on the emotional, sometimes diary-like output of a younger demographic that’s shying away from Facebook and the prying eyes of moms, dads, co-workers and bosses, serves as a pseudo-anonymous enclave where people can post, share, opine, vent and dwell on their interests — even when those interests are unhealthy ones.
The company is already well-known for having a “porn problem” — that is, it walks a fine line, permitting adult content while not wanting to host it directly. That’s a whole ‘nother ballgame, as they say, but while viewing pornography can be addictive, it’s not potentially lethal to the viewer.
The same cannot really be said of the self-harm content some Tumblr users seek out, however.
In the U.S., 20 million women and 10 million men suffer from a clinically significant eating disorder at some point in their lives. Anorexia has the highest mortality rate of any mental condition, as the sufferer is literally starving him- or herself to death.
In addition to eating disorders, there is suicide. According to Your Voice, the nonprofit offshoot of Whisper, the rising mobile network for shared secrets, suicide is the second-leading cause of death for college students, yet 75 percent of that demographic doesn’t seek help for mental health problems.
In other words, the chances for outreach and connection with sufferers are few and far between.
The problem with eating disorders, as well as other mental health troubles involving “self-harm,” is that continual exposure to content glorifying the practice in question — whether a rail-thin model or a photo of an arm marked with cuts — is extremely toxic to those who are susceptible to the disease or condition.
“We do know that exposure to the kind of content that glorifies dangerous behaviors that are characteristic of those that struggle with eating disorders which can be life-threatening is a real problem — particularly for people who have a genetic predisposition to being vulnerable to eating disorders,” explains Susie Roman, National Eating Disorders Association‘s (NEDA) Director of Programs. She says that social media sites can further entrench the disorder in those who view the images and messages, and can also delay or prevent them from seeking help or entering recovery programs.
Tumblr, unfortunately, is the worst offender.
“We hear that Tumblr is where people are constantly seeing content that is very triggering and very harmful, in terms of pro-ana and thinspo images and content. We don’t actually get a lot of complaints about Facebook…we just hear a lot more about Tumblr,” Roman says.
Ashley Womble, Director of Communications for the Mental Health Association of New York City (MHA-NYC) subsidiary that handles the National Suicide Prevention Lifeline, agrees that by its very nature, Tumblr is home to more self-harm content than other sites.
“If you type in ‘kill myself’ or ‘suicide’ on Tumblr, you’re going to find some really dark stuff,” she says. “You’re going to find that a lot of people are writing about the suicide ideation online; they’re posting pictures of self harm to what I consider to be a disturbing level. If you type in that same search term on Pinterest, you would not find that. In fact, you might not find anything.”
Her organization, which also works with Facebook, Google and Pinterest, sees more referrals from Tumblr than any other social media site.
Depression, stress and thoughts of suicide are not uncommon among Tumblr’s top demographic, either. “Half of all college students have said that in the last year, they’re either so stressed or so anxious, they’re unable to function,” explains Whisper co-founder Michael Heyward. Most at risk are those who are likely turning to a site like Tumblr in the first place — the bullied, or those who feel outcast or different. LGBT kids are four times more likely to commit suicide, for example. “The numbers are staggering,” Heyward adds.
According to the research team at SimilarWeb, which studied a sample of 1.6 million Tumblr blogs, only 0.17 percent contained one of the more obvious self-harm tags (e.g. Cutting, Suicide, Self harm, Suicide note, Suicide notes, Suicidal, Suicidal thoughts, Committing suicide, Thinspo, Thins, Anorexia, Anorexic, Thinspiration, Bulimic, Bulimics, Eating disorders, Bulimia, Purge, etc.). If that figure were extrapolated to Tumblr’s overall user base, there would be nearly 200,000 blogs about these subjects. If it included the “alternate” words — the misspellings (“thynspo”) and less obvious terms — that number may be even higher. (Note that SimilarWeb’s study can’t distinguish the positive self-help blogs from the negative ones.)
Social Media PSAs And Policies
While one could argue that a social media service has no business or responsibility to police the images or posts that appear on its platform, it’s worth noting that when it comes to “self harm” content, all the major sites have taken action.
Last year, around the same time Tumblr took its big stand, Pinterest also came under attack for its growing number of thinspo images. And in the month following Tumblr’s announcement of its revised content policy, its plan to suspend non-compliant blogs, and its PSAs, Pinterest did the same.
According to Roman, NEDA technically reached out to Pinterest first, but as it turned out, the company was already in the process of reaching out to the organization. She describes Pinterest as having an interest in being proactive, and as a “receptive” and “eager” partner.
Like Tumblr, Pinterest posted a revised Acceptable Use Policy that explicitly spells out what sort of content is prohibited (that which “creates a risk of harm, loss, physical or mental injury, emotional distress, death, disability, disfigurement, or physical or mental illness to yourself, to any other person, or to any animal.”) However, it stopped short of banning topics altogether, because someone searching for “warning signs,” “help,” “support groups,” or “recovery stories” may include those banned terms in searches, a company representative explains.
That being said, Pinterest partners with NEDA and SuicidePreventionLifeline.com to help provide the site with PSAs that run against searches for terms like “proana” or “suicide,” for example (the latter is more dominated by the “Suicide Girls” pinups, we should note).
Meanwhile, Instagram, though it never weathered quite as broad a media attack on the matter as Pinterest once did, quickly followed the others’ lead. In April 2012, it also updated its content policy to ban accounts, images or hashtags that glorify, promote or encourage self-harm. And it went a step further, making hashtags like “thinspiration,” “probulimia” and “proanorexia” no longer searchable. This remains the case today. Plus, a year after the policy was enacted, the site also banned the new hashtags its community had turned to in order to avoid censorship (e.g. misspellings like “thynspo”).
Instagram also partners with NEDA to run PSAs related to eating disorders; for searches related to things like “cutting” or “suicide,” Instagram points users to BeFrienders.org instead. Unlike Pinterest, which more unobtrusively displays its PSAs at the top of its website or in the app’s search-results pages, Instagram actually requires users to click “Show Photos” or “Cancel” after reading a pop-up PSA message.
Facebook, Instagram’s parent company, is a bit different. Though the site is not running PSAs against self-harm searches on its newly launched “Graph Search” service, nor on individual communities, it’s highly involved in monitoring content. Simply put, the company attempts to make the most serious self-harm content unfindable by the general public. (A search for “suicide,” for example, sends you to page after page of organizations involved in prevention.)
Its content policy prohibits self-harm content like the others’, and Facebook offers tools that allow users to report suicidal content, provides suicide hotline info worldwide, and works with partner organizations to help inform its policies. Though a number of potentially “triggering” groups remain, Facebook has huge teams of moderators to police content related to self-harm, hate speech and more, in addition to its automated systems. So while there are pages of “thinspiration,” for instance, there aren’t massive sub-sites (Facebook Pages or communities) with millions of members supporting each other’s decision to starve or kill themselves.
Tumblr Runs Its Own PSAs
When asked for an update on Tumblr’s earlier plans for PSAs, a company representative provided only a brief comment via email: “We have been and continue to suspend blogs based on reports we receive from our users and partner organizations.” The rep added that “there is no plan to run PSAs or any other content on individual user accounts, nor are there planned changes to Tumblr’s content policies.” The company never responded to subsequent requests to discuss the matter further by phone, or to follow-up questions.
Roman corroborated Tumblr’s claim that the site has worked with NEDA in the past, and even provided the organization with a dedicated email address that would allow the group to contact Tumblr about reports coming from its Media Watchdog program. She also says that some of those blogs did get pulled down. However, when pressed to ballpark how many requests were handled in this manner, Roman said there were “dozens.”
That’s not a lot.
Tumblr has more than 116 million blogs, so clearly a “one-off” method like this was not intended to be a long-term solution. The solution is, of course, running those PSAs — the alternative being Facebook’s heavy involvement in content oversight, something that a startup like Tumblr could probably not afford…at least, pre-Yahoo. But it has also struggled to communicate with its non-profit partners about its plans.
NEDA helped Tumblr craft the language for the PSA that Tumblr posted on its staff blog, and provided the company with a list of search terms to run PSAs against, similar to those it has given Facebook in the past. But a year later, NEDA’s own PSAs still don’t run, despite the company’s assurance to the organization earlier this year that a solution was in the works.
“Because of so many technological challenges, given the magnitude of the content of proana and thinspo content, they were experiencing a lot of problems with being able to address it [with PSAs] on a one-by-one flagged basis,” Roman says of Tumblr’s explanation. “We’re disappointed to see that, a year later, the PSA is not popping up.”
Oddly, though, Tumblr is running PSAs; NEDA just wasn’t aware of it.
Either the startup had not let NEDA know that PSAs have been running against search terms (they aren’t using NEDA’s text, however), or Tumblr rolled these out very quietly or very recently — perhaps to avoid drawing attention to its troubled users ahead of the big sale to Yahoo. Today, messages on select Tumblr searches for general terms, like “ana,” “thinspo” or “suicide,” for example, read:
“If you or someone you know is dealing with an eating disorder, self harm issues, or suicidal thoughts, please visit our Counseling & Prevention Resources page for a list of services that may be able to help.”
(Some searches also show no posts when Safe Search is switched on.)
Roman says that as recently as last week, NEDA’s earlier contact at Tumblr didn’t answer the organization’s questions about the missing PSAs, and instead directed her to another Tumblr staffer who also never responded. The nonprofit did not advise on the “Counseling and Prevention Resources” web page or the current PSA, and — since it doesn’t actually do counseling — would have provided different language about how it would like to be referenced.
However, Womble’s experience with Tumblr has been different. Her suicide prevention organization advised the service to put outreach information on its website and provided Tumblr with a list of terms used by those searching for information on suicide. She doesn’t know when Tumblr began showing the messaging next to searches, but is satisfied that it is doing so now.
Is There Another Way?
Though Tumblr, Pinterest and Instagram’s content policies say all the right things, those vulnerable to eating disorders, depression and other mental illnesses have found thriving communities on the sites nonetheless. It’s difficult for sites to keep up as users change their preferred tags. Instagram may have revisited its policies this year to correct for the now-rampant misspellings used to bypass search filters, but on it and other sites alike, searches for misspelled words (“proanna” or “thyn”) or other non-obvious tags (“thigh gap”) will still take you directly to large communities that have seemingly continued on undisturbed.
Whisper, for what it’s worth, is the only service that’s really trying something different. Instead of policing someone’s (unhealthy) thoughts, which may glorify or promote self-harm and then trigger others, the small but growing startup allows the post to appear to go through — but it doesn’t show up for other Whisper users to view.
“We’re not going to sweep things under the rug. But if you ever say something even remotely suggestive, we remove the posting and watermark it,” Heyward explains. Only the poster can see the watermarked version. The copy reads, “your Whisper has been heard” and directs users to Your Voice for help and offers a hotline number.
For those posts that are borderline — and many are — the service has created a supportive community where negative and trolling content disappears within half a minute. “We don’t mess around with banning,” says Heyward.
Today, he believes that other social media sites need to do more. “A lot of people are unwilling to make short-term sacrifices for long-term viability of the business,” he says, pointing out that Myspace also once had a lot of issues around cyber bullying and teen suicide. “[These sites are] addicted to traffic…they’re not willing to do anything that even remotely alienates a small amount of the audience or that’s going to affect their daily numbers.”
Facebook, however, removes harmful posts all the time. So will Whisper. “It’s the right thing to do morally and ethically,” Heyward says, “and by happenstance, it’s the right thing to do for the business, as well.”
Need help? In the U.S., call 1-800-273-8255 for the National Suicide Prevention Lifeline or 800-931-2237 for the referral helpline offered by NationalEatingDisorders.org. Not in the U.S.? Try Befrienders Worldwide or the International Association for Suicide Prevention.
Image credits: top: Jenni Holma, Getty Images; sad boy: Shutterstock / sokolovsky