Artificial Intelligence Is Learning to Predict and Prevent Suicide
For years, Facebook has been investing in artificial intelligence fields like machine learning and deep neural nets to build its core business—selling you things better than anyone else in the world. But earlier this month, the company began turning some of those AI tools to a more noble goal: stopping people from taking their own lives. Admittedly, this isn’t entirely altruistic. Having people broadcast their suicides from Facebook Live isn’t good for the brand.
But it’s not just tech giants like Facebook, Instagram, and China’s up-and-coming video platform Live.me who are devoting R&D to flagging self-harm. Doctors at research hospitals and even the US Department of Veterans Affairs are piloting new, AI-driven suicide-prevention platforms that capture more data than ever before. The goal: build predictive models to tailor interventions earlier. Because preventative medicine is the best medicine, especially when it comes to mental health.
If you’re hearing more about suicide lately, it’s not just because of social media. Suicide rates surged to a 30-year high in 2014, the last year for which the Centers for Disease Control and Prevention has data.
...Think about it. Between all the sensors in your phone, its camera, microphone, and messages, that device's data could reveal a lot about you: potentially more than you can see about yourself. To you, maybe it was just a few missed trips to the gym, a few times you didn't call your mom back, and a few times you just stayed in bed. But to a machine finely tuned to your habits and warning signs, one that gets smarter the more time it spends with your data, that might be a red flag.
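The idea the article is gesturing at can be sketched in a few lines. This is a toy illustration only, not any real platform's method: the feature names ("gym visits," "calls returned"), thresholds, and the two-signal rule are all invented for the example. It flags a day when several habits fall well below a person's own baseline, measured as a z-score against their recent history.

```python
# Toy "warning sign" detector over daily behavioral counts.
# All feature names and thresholds here are hypothetical, chosen
# purely for illustration; no real system works exactly this way.

from statistics import mean, stdev


def z_score(history, today):
    """How far today's value sits from this person's own baseline."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return 0.0
    return (today - mu) / sigma


def red_flag(baseline, today, threshold=-1.5):
    """Flag when multiple habits drop well below the user's norm.

    Requiring several simultaneous drops (rather than one) is a
    crude way to cut down on false alarms from a single off day.
    """
    drops = [
        feature
        for feature, history in baseline.items()
        if z_score(history, today[feature]) < threshold
    ]
    return len(drops) >= 2, drops


# A week of per-person history (hypothetical numbers)...
baseline = {
    "gym_visits": [3, 4, 3, 5, 4, 3, 4],
    "calls_returned": [6, 5, 7, 6, 5, 6, 5],
    "hours_active": [14, 15, 13, 14, 15, 14, 13],
}
# ...and one sharply different day.
today = {"gym_visits": 0, "calls_returned": 1, "hours_active": 6}

flagged, signals = red_flag(baseline, today)
```

In this example all three habits collapse at once, so the detector fires. The same logic is what makes the privacy question in the next paragraph so pointed: the model only works because it holds a fine-grained behavioral profile of you.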
That’s a still-distant future for tomorrow’s personal-privacy lawyers to figure out. But as far as today’s news feeds go, pay attention while you scroll, and notice what the algorithms are trying to tell you.
How long before we're all labeled as suicidal terrorists?
originally posted by: Aliensun
a reply to: soficrow
I would say that you critically need to review this sentence in your OP:
"On the surface, helping people about to commit suicide seems to be a good idea.
What say you ATS?"