Anthony McCosker

We need to talk more about suicide, but not on Facebook

Updated: Aug 7, 2021

June 11, 2018

This piece is drawn from a program of research around social media, social support and health participation. For links to the relevant research papers, jump to the end or get in touch on Twitter @ACMcCosker or at my Swinburne University address.

We need to talk more openly about suicide. Death by suicide remains shrouded in stigma, shame, or even resentment. And we know that stigma silences and marginalizes those most in need of help, disconnecting them from family, friends and health services.

With the recent death of Anthony Bourdain by suicide, and with each public figure to die in this way, the circumstances and sadness of suicide become part of our public conversation. But this conversation works mainly through proxies, not with those around us who may be constantly struggling with suicidal thoughts.

For a while now I’ve been researching the way we talk publicly about mental health issues and suicide, and where. I’ve looked at Facebook, Instagram, Reddit, and other spaces like Tumblr and dedicated online forums. There is a lot to be said about how each of these platforms supports or constrains supportive, intimate conversations about mental health problems. But I want to draw a distinction between what can be understood as the individuating spotlight effect of Facebook and the diffused emotion and empathy exchange of online forums as community platforms.

On average, eight people take their own lives in Australia each day, and many more make an attempt. In the US, the Centers for Disease Control and Prevention reports rising suicide rates over the last 30 years, with 45,000 suicide-related deaths in 2016. Risks are heightened for those already isolated, dealing with high stress or trauma, working the land, or socially marginalised.

What’s certain is the need for vigilance and action. In Australia as elsewhere, we seem to have review after review saying that more needs to be done to address mental health issues in the community before they reach crisis point. The National Review of Mental Health Programmes and Services is not the only national review to highlight the role that online forums, among other digital media tools, can play in fostering social support.

Better use can be made of safe, supportive digital forums for dialogue about the sense of hopelessness and crisis of feeling that is the emotional and social scaffolding of suicidality. Too often these conversations are shut down, censored, moderated online, or put off until it’s too late. As social media platforms become ever more focused on profit, it feels more difficult than ever to find enclaves of social support and open social exchange.

Social media and suicidality

While social media can help facilitate powerful public mental health campaigns like R U OK? Day, or aid organic movements like Project Semicolon and To Write Love on Her Arms (#TWLOHA), the dominant platforms — Facebook, Twitter, Snapchat, YouTube or Instagram — are not well suited to addressing the complex issues associated with suicide.

In fact, these platforms have had a bad run in the press and from some researchers with new claims about their negative impact on mental health. Last year the Royal Society for Public Health surveyed young people in the UK about how they rate different social media platforms in relation to their mental health. They found that Instagram was considered the worst social platform for mental health and that there was a strong sense overall that social media was taking its toll.

While the RSPH researchers weighed the positive against the negative impacts of social media on mental health, the report sparked a new run of popular panic. Public opinion seems to have tipped toward the idea that social media are simply bad for our psychological wellbeing. At the least, for a range of reasons, social media are tanking in trust and goodwill, even among young people, their native constituents.

None of this is helped by Facebook’s deepening woes regarding its porous data security, and its shareholder-driven model that prioritizes collecting and selling its users’ personal information. From fake news to the misuse of personal data, it is clear that Facebook has sparked widespread social anxiety about sharing our personal life online.

In December last year, two Facebook researchers went to some lengths to explore the growing claims that spending too much time on their platform is bad for our mental health. Their conclusion was that it can be, but it depends on how we use it. We should use Facebook more actively, engage more directly with our network of friends, share more, and spend less time just looking, particularly at fake news, they insisted. This is a good start, but the central claim is disingenuous, as technology writer and researcher Jenny Davis has argued. Platforms shape and restrict our use, usually on the basis of a business model derived from those uses; in the case of Facebook, that involves an emphasis on the individual, not the collective or community, or even supportive networks.

Let’s think of this situation as Facebook’s individuating spotlight effect. Structurally and socially, with Facebook we are individually on show, measured, compared and commoditized. There is little room to let your guard down, to converse meaningfully and safely about already stigmatized feelings of mental ill-health.

Facebook and Instagram are premised on real-name profiles, and this rarely suits those dealing with mental health issues. And these platforms are individual-centric: they’re less interested in community or groups, despite Mark Zuckerberg’s year-long tour of the US touting his company’s eagerness to reengage communities and reignite personal connections and sharing among friends.

In other words, because of the way these platforms are built, the one thing that Facebook and Instagram are just not good at is generating community-oriented solutions to expressions of mental health, ill-health and suicidality.

Despite the problems flagged by the Facebook-Cambridge Analytica revelations, digital and social media technologies do offer avenues for 24/7 access and support for those experiencing a mental health crisis. They help peers connect with each other around issues and interests, improve the distribution of health information, and develop health literacy. These are forms of social support that reach beyond an individual’s proximate networks, and according to some research, social support can help prevent suicide.

But we need to better understand which social media platforms actually deliver social support, and how different platforms inhibit or encourage dialogue about serious mental ill health or suicidality. Where is the space for supportive communities?

What about online forums?

Online forums are a simple, old internet technology often overlooked in assessments of digital health tech. Of course, forums have evolved and morphed into many guises. Reddit takes the functionality of forums, with up and down voting to manage mass scale and bring a kind of (often toxic) competitive interest and play. The problems of Reddit are a story for another day. I’m referring here to one successful, supportive community platform for talking about suicidality and self-harm.

A program of research I have conducted with the mental health organisation beyondblue in Australia showed that forums still have a lot to offer, especially when they’re not all about collecting and brokering data (as are Health Unlocked and Patients Like Me). There are many other great mental health support forums like beyondblue’s: check out the forums of SANE Australia, Mind’s Elefriends and the Mental Health Forum in the UK, and Inspire’s Mental Health America.

I investigated activities and interactions in beyondblue’s online forums and Facebook Page over several years. Part of the research focused on one of the forums dedicated to Suicidal Thoughts and Self Harm. This is a sanctioned and moderated space dedicated to discussing and supporting people experiencing the most difficult of mental health issues. It challenges the way suicidality is often censored, filtered, or moderated elsewhere online. While the forum’s moderators still hide direct discussions of methods and other triggering content, responding out of the public eye, the discussions of suicide there offer a model of diffused empathy generation and collective support. There, desperate and often time-critical need is met with skilful, empathic peer support.

Although the forum is anonymous (all users who register to post do so under a pseudonym; no real names are associated with posts at any point), and, my informants insist, precisely because of that anonymity, the experiences and thoughts shared there are intimate, laden with personal crisis, and reflect complex social and mental health situations.

Trust is key to the social support that is sought and provided through the forums. And it is trust that Facebook and Instagram have squandered. Perhaps it survives on Tumblr, where built-in restrictions on accessing other users’ following and follower lists, and a flexibility around pseudonymity, allow for play, emotive freedom and exploration. But forums like beyondblue’s offer moderation by a handful of professionals and a decent number of badged peer mentors: community champions who give their time and experience to help others.

If you know where to look, these forms of supportive discussion about suicidality are possible online. But health support and digital health participation are unevenly distributed — they depend on a level of digital and health literacy that is not universal.

Above all, online mental health support communities are desperately in need of funding to avoid the need to monetize personal health data to provide an essential service.

Support services in Australia:

Lifeline 13 11 14

Kids Helpline 1800 55 1800

MensLine Australia 1300 78 99 78


McCosker A and Hartup M (2018) Turning to Online Peer Forums for Suicide and Self-Harm Support, Melbourne: beyondblue and Swinburne Social Innovation Research Institute. Available at:

McCosker A (2017) Networks of Advocacy and Influence: Peer Mentors in beyondblue’s Mental Health Forums, Melbourne: Swinburne Social Innovation Research Institute. Available at:

McCosker A (2018, forthcoming) ‘Engaging Mental Health Online: Insights from beyondblue’s Forum Influencers’, New Media & Society.

Contact details and bio here, more of my research here
