

After a suicide, more loved ones are seeking support online. Does it help or harm?

Dylan Thomas Doyle was a college junior traveling abroad when he got word that his friend Jack had taken his own life back home. Shaken, but reluctant to talk about it to his friends in person, he turned to online grief support spaces like Facebook and Reddit.

A decade later, while serving as a hospital chaplain and Unitarian Universalist minister, he lost two more loved ones to suicide. He found solace online again, this time in a subreddit specifically created for suicide bereavement.

“All grief is hard, but suicide is often sudden, traumatic and has a lot of social stigma around it. No one knows what to say, so you can feel really isolated,” said Doyle, now a doctoral candidate in the Department of Information Science at CU Boulder. “It’s comforting to go to these spaces and have people say, ‘I’ve been through that. I know what you’re feeling.’”

But as Doyle reports in two new studies, such spaces also have the potential to do harm, exposing emotionally vulnerable people, including children, to graphic stories, unhelpful comments and other potentially re-traumatizing content.

The two studies are among the first to explore what goes on in suicide bereavement groups.

“It’s great that these communities exist,” said Doyle, who is now working to make them safer. “But right now, it’s sort of a free-for-all.”

Dylan Thomas Doyle

The power of sharing stories

On average, 132 people in the U.S. die by suicide each day. More than half of the population will, at some point, grieve a loved one who has died this way. Professional help can be hard to find because suicide bereavement is a specialized field. One recent study found that 62% of people grieving a loved one who died by suicide turn to social media for support.

For the first study, Doyle and his co-authors examined nearly 2,600 posts and 16,502 comments in the r/SuicideBereavement subreddit.

The team used natural language processing (NLP) to gain insight into the emotional state of users and to identify different kinds of posts, from lengthy stories to short questions or requests for resources.
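The studies do not spell out the team’s pipeline here, but the general idea of this kind of post categorization can be sketched with off-the-shelf tools. The example below is a minimal illustration only, not the team’s method: it assumes the Hugging Face transformers library, an illustrative public emotion model, and a hypothetical list of post texts, and it uses a simple word-count threshold to separate long narratives from short questions or requests.

```python
# Illustrative sketch of emotion + post-type tagging; not the study's actual pipeline.
from transformers import pipeline

# Pretrained emotion classifier; the model name is an illustrative choice.
emotion_classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",
)

def categorize_post(text: str) -> dict:
    """Tag a post with its dominant emotion and a rough type based on length."""
    emotion = emotion_classifier(text)[0]  # e.g. {"label": "sadness", "score": 0.93}
    word_count = len(text.split())
    # Hypothetical heuristic: long posts are treated as narrative storytelling.
    post_type = "narrative" if word_count > 150 else "question_or_request"
    return {"emotion": emotion["label"], "type": post_type, "words": word_count}

sample_posts = [
    "Does anyone know of grief counselors who specialize in suicide loss?",
    "I need to tell you about the last trip we took to the mountains together...",
]
for post in sample_posts:
    print(categorize_post(post))
```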

They found that nearly half of the content posted was narrative storytelling, and many of those stories were extremely graphic.

When the team noticed that a large subset of users were writing letters to the deceased, they launched a second study, in which they read through 189 such posts and 652 comments.

The posts were anonymized, and the research team made sure to take care of their own mental health along the way.

“Even as researchers, we struggled to read some of these,” said Doyle.

Some letter-writers shared how they had found out and how it affected them. Others asked for explanations or sought forgiveness for not doing enough. One shared a story about a final trip they and the deceased had taken to the mountains, and how much they laughed afterward. Many commenters responded with comfort, reassurance, gratitude and offers of direct support outside the platform.

But some shared detailed descriptions of the way they had found their loved ones or the way the death had been carried out. Some expressed rage and hatred at being left behind.

The team was heartened to find almost no deliberately abusive comments, but they did find some they deemed “unsupportive,” in which commenters replied with their own graphic stories.

“Some people come there just seeking resources or asking factual questions, and don’t expect to find people sharing narratives of really tough images,” Doyle said.

Because of the way social media algorithms operate, the most graphic comments tended to rise to the top and to draw still more comments.

If you or someone you know is struggling or in crisis, call or text 988 or chat online. Read about suicide prevention resources at CU Boulder.

“If you’re, say, 13 years old and you come upon this and start taking it all in, that could really be harmful,” he said. “And for people who are already in a vulnerable emotional state, it can be damaging to their grieving process.”

Building a more supportive platform

Doyle stressed that he is not specifically critiquing Reddit, but rather raising questions about how social media platforms could more effectively support people who use them for suicide bereavement. He believes more research is needed and does not think banning narrative storytelling on platforms is the answer. (Previous research shows that in offline support groups, such storytelling can be extremely therapeutic.)

He does believe platforms could serve users better.

At present, r/SuicideBereavement subreddit moderators are not required to be certified or trained in mental health. 

On its homepage, the site clearly prohibits “actively suicidal content” and advises that it is reserved only for those bereaved by suicide. But the subreddit, just like an NFL or travel subreddit, operates with few guardrails.

Doyle imagines a day when, using the AI tool his team developed, narrative posts could be categorized and users, when logging on, could opt in or out of seeing them.
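A feature like this could be as simple as filtering the feed on the categories a classifier assigns, with narrative posts hidden unless a user opts in. The sketch below is a hypothetical illustration of that opt-in idea, not an existing Reddit feature or the team’s tool; the Post, UserPrefs and build_feed names are invented for the example.

```python
# Hypothetical opt-in filter for pre-categorized bereavement posts.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    text: str
    category: str                   # e.g. "narrative" or "question_or_request"

@dataclass
class UserPrefs:
    show_narratives: bool = False   # narrative posts hidden unless the user opts in

def build_feed(posts: list[Post], prefs: UserPrefs) -> list[Post]:
    """Return only the posts this user has chosen to see."""
    if prefs.show_narratives:
        return posts
    return [p for p in posts if p.category != "narrative"]

feed = build_feed(
    [
        Post("1", "Looking for local support groups.", "question_or_request"),
        Post("2", "I need to tell you how I found out...", "narrative"),
    ],
    UserPrefs(show_narratives=False),
)
print([p.post_id for p in feed])    # -> ['1']
```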

He also suggests that moderators get training around grief support and users have an opportunity to customize what they want to see at the top of their feed.

“Social media platforms in general don’t really know what to do with death or the bereaved,” he said. “We believe that more needs to be done to make these spaces customized to the unique needs of the grieving.”