Society

AI Suicide Watch

Big data and AI are helping companies and experts flag the first signs of suicide among users on social media to reach out before it’s too late

By Qiu Guangyu Updated Nov. 1

Wu Gang, who works in security at Alibaba Group, received an alarming message from a Taobao shop operator on August 1.  

“Could you tell me a way to die without pain?” read the chat record between a 16-year-old customer and the vendor.  

The teen wrote that she had purchased a lot of medications and was preparing to commit suicide. 

Wu and his colleagues responded quickly with an emergency plan. On Taobao, China’s largest e-commerce platform, searches can reveal many aspects of a consumer’s emotional state. 

Nowadays, users leave a trail of telling information on apps and online communities about their moods, from happiness and anger to sadness and depression, including suicidal thoughts.  

Wu’s team initiated Safeguarding Lifeline, a project that uses artificial intelligence (AI) to link businesses, public security and third-party organizations in a suicide intervention mechanism. The project, launched in July last year, has engaged in over 2,500 cases of suicide outreach. 

Wu said his work is simple: His team reaches out to people in need of emotional support. “But what’s more important is that the results [for online suicide intervention] are positive,” Wu said. 

Other internet companies have set up departments to tackle potential psychological issues and suicidal inclinations among China’s youth. 

Alibaba employee Wu Gang started the Safeguarding Lifeline project with the tech firm to help prevent suicide

Online Interventions
Soon after chatting with the teen, the online shop operator contacted the Safeguarding Lifeline team. Wu had a hunch she could be a high school student who had failed the national college entrance examinations. The team called the teen, who lives in Shandong Province, but did not reach her. The same night, the team contacted local police who located her and her family.  

She later admitted to having thoughts of suicide. Her family was unaware.  

This is just one successful case for Safeguarding Lifeline. The team has eight full-time employees and several part-timers working around the clock. Wu, a former pharmaceutical professional, is trained to watch for telltale signs from customers purchasing medications that could be used for suicide.  

Based on his experience, Wu started the Safeguarding Lifeline project with Alibaba in March 2019. The project was launched in July that year.  

When users search for products or keywords on Taobao relevant to suicide, like sleeping pills, the Safeguarding Lifeline page pops up automatically, along with information about the national youth crisis hotline (12355) and regional free counseling hotlines in over 10 major cities and provinces across the country. The page includes a mental health section to help people with anxiety, insomnia and other disorders through self-help materials or by connecting them with mental health professionals.  

While the project uses AI to flag potential cases, Alibaba trains vendors to recognize when customers are showing signs of suicide. Vendors inform a suicide intervention practitioner from the Alibaba security department. In an emergency, the platform works with local police. 

Alibaba developed an algorithm that identifies whether a user is at risk of suicide; to strengthen it, an engineer at Alibaba programmed in more than 10,000 keywords. Most cases the project deals with involve young people: around 60 percent were aged 30 or under.  
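The article does not disclose how Alibaba’s system actually scores queries. Purely as an illustrative sketch of the general technique it describes, keyword matching against search text, the snippet below flags queries for human review. Every keyword, weight, threshold, and function name here is an invented assumption, not Alibaba’s implementation.

```python
# Illustrative sketch only: the article says the algorithm matches search
# queries against 10,000+ keywords. The tiny lexicon, weights and threshold
# below are invented for demonstration and are NOT Alibaba's actual system.

# Hypothetical mapping of risk keywords to severity weights.
RISK_KEYWORDS = {
    "sleeping pills": 3,
    "painless death": 5,
    "overdose": 4,
    "insomnia": 1,
}

ALERT_THRESHOLD = 3  # hypothetical cutoff for escalating to a human reviewer

def risk_score(query: str) -> int:
    """Sum the weights of all risk keywords found in a search query."""
    q = query.lower()
    return sum(weight for kw, weight in RISK_KEYWORDS.items() if kw in q)

def should_escalate(query: str) -> bool:
    """Flag the query for review if its score meets the threshold."""
    return risk_score(query) >= ALERT_THRESHOLD
```

In a setup like this, a high-scoring query would trigger the intervention page and a human reviewer, while a low-scoring one (e.g. mentioning only insomnia) might instead surface the self-help mental health resources the article describes.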

Ripple Effect
In the US, to mark World Suicide Prevention Day on September 10, 2019, Facebook began sharing public data from users discussing suicide to address concerns about suicide and self-harm. The company also added mental health experts to its team. 

Chinese video sharing platform Bilibili and internet giant Tencent have launched similar services. 

According to Bilibili, in the first half of 2020, its program reported 26 potential cases of suicide to police, and identified 714 users who were showing signs of severe depression and suicidal tendencies in May alone.  

If a user on Bilibili posts comments with suicide-related content, a reviewer can intervene to evaluate the psychological state of the user. If signs of severe depression or suicidal tendencies are detected, the reviewer hands the case over to a mental health professional. The professional contacts the user to provide emotional guidance. If more serious psychological issues are involved, they transfer the case to a professional psychological counseling organization. If the user is in a critical state, the service calls police.  

Data provided by Taobao shows that since the launch of the Safeguarding Lifeline page in June this year, its embedded video on mental health has seen around 2,500 views a day, and the number of detected at-risk cases has dropped nearly 30 percent. 

Tencent’s Dim Light project covers its chat platform QQ, aiming to detect and stop the spread of suicide hoaxes such as the Blue Whale Challenge of 2016. As of January 2019, the project closed more than 2,000 QQ chat groups and assisted police in helping more than 35 users with inclinations of self-harm. The project advocates nationwide strategies for suicide intervention, including improving access to health services, promoting mental health, reducing the harmful use of alcohol, limiting accessibility to instruments of suicide and promoting media accountability. 

Online suicide intervention is not a new concept. In 2018, Huang Zhisheng, deputy director of the Big Data Research Institute of Wuhan University of Science and Technology, started the Tree Hole Rescue Mission, which uses AI that scans information to detect and intervene in suicidal behaviors. By the end of 2019, the project had dealt with more than 1,600 potential suicide cases.  

In July 2017, Zhu Tingshao, a researcher at the Institute of Psychology of the Chinese Academy of Sciences, and his team began providing psychological crisis intervention to Sina Weibo users who post suicidal comments. 

Zhu told NewsChina that research into suicidal motivation shows that candidates for impulsive or “on a whim” suicide also have a strong desire to survive, talk and express their emotions, making them more receptive to intervention.  

But, Wu said, time is of the essence. “It takes time [to buy medicine online]. If they’re determined, they could get it immediately. But if they’re still planning to commit suicide, there is some time for us to stop them,” Wu told NewsChina.
