Chinese professors have applied artificial intelligence (AI) technology to monitor suicidal signs among netizens on social media, preventing 320 would-be suicides within one year, in response to the country's long-standing suicide problem.
The "tree hole" project, launched in July 2018, has 285 members, including about 40 psychiatrists and more than 40 psychological counselors, Huang Zhisheng, a senior researcher at the Department of Computer Science of Vrije University Amsterdam and a scientist at the brain protection center of Beijing-based Capital Medical University, told the Global Times on Thursday.
As of March, the team had prevented 320 people who showed suicidal signs from killing themselves.
The project uses a robot called 004, developed by Huang, which combs through thousands of Sina Weibo posts that show signs of being written by depressed people.
The robot, integrated with data collection and suicidal risk analysis technology, selects messages showing high suicidal risk every day.
Huang said the team set a suicide risk scale from zero to 10; the higher the rank, the higher the risk. The robot issues a warning when a post is ranked six or above.
Huang's team first tries to contact and rescue the would-be suicide; if that fails, it calls the police.
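The triage workflow described above can be sketched in a few lines of Python. This is only an illustration of the threshold logic the article reports (rank posts from 0 to 10, warn at 6 or above); the keyword-based scorer and all names below are hypothetical stand-ins, since the article does not describe the real 004 model.

```python
# Hypothetical sketch of the triage logic described in the article.
# The article says posts get a risk rank from 0 to 10 and a warning is
# issued at rank 6 or above; the keyword scorer here is an illustrative
# stand-in, NOT the actual model used by the tree hole project.

# Toy weighted phrases (assumed for illustration only).
HIGH_RISK_TERMS = {"want to die": 4, "goodbye": 2, "pills": 3, "jump": 3}

WARNING_THRESHOLD = 6  # rank at which the robot issues a warning


def risk_rank(post: str) -> int:
    """Assign a 0-10 risk rank using a toy keyword heuristic."""
    text = post.lower()
    score = sum(weight for term, weight in HIGH_RISK_TERMS.items() if term in text)
    return min(score, 10)  # clamp to the article's 0-10 scale


def triage(posts: list[str]) -> list[tuple[str, int]]:
    """Return (post, rank) pairs that meet the warning threshold."""
    return [(p, r) for p in posts if (r := risk_rank(p)) >= WARNING_THRESHOLD]


posts = [
    "Had a great day at the park.",
    "Goodbye everyone, I want to die, I have the pills ready.",
]
flagged = triage(posts)  # only the second post crosses the threshold
```

In the reported workflow, each flagged post would then go to a human volunteer, who tries to contact the person and escalates to the police only if that fails.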
The tree hole project is not the only such effort. A similar AI rescue team, reportedly the first of its kind in the world, was launched in 2017 by academic volunteers from the Institute of Psychology under the Chinese Academy of Sciences.
These projects are a response to the growing number of people suffering from mental health problems and the shortage of qualified mental health professionals in China.
According to media reports in 2013, about 250,000 people die by suicide in China every year, while another 2 million attempt it.
Suicide has reportedly become the primary cause of death among Chinese people aged between 15 and 35.
Although gaining public support, including from the police and netizens, these projects have also sparked concerns over the privacy of the people being rescued.
"We only check their publicly available online information, such as location, school, age and net friends. If we can't find their detailed information online, we will call the police," Huang said, noting that the team strictly protects its chat history with those it tries to rescue.
"Anyone who shares suicidal intentions on an open media platform is calling for help," a psychological counselor surnamed Zhang, also a tree hole team member, told the Global Times. GLOBAL TIMES