How Facebook Is Helping People Struggling With Mental Health Issues
Facebook rolled out AI technology to find suicidal posts and videos from users. The software could save lives and send help to people who need it before it's too late.
Facebook is making big moves toward suicide prevention. Last week Mark Zuckerberg announced that Facebook is now using that spooky thing called artificial intelligence to spot and report suicidal behavior on the platform. Instagram rolled out similar suicide-prevention features a few months ago but without AI (robots??).
In the past, an alarming amount of suicidal content on Facebook went unreported, so the platform is upping its game to catch the posts and videos that users don't report themselves. Presumably people either didn't realize how serious their friend's problem was or didn't see the post in time. Either way, this is your reminder to keep your eyes peeled for anything concerning in your feed; it never hurts to be extra cautious.
Just how does AI catch this stuff? It uses pattern recognition on posts and on keywords in the comments (things like "You OK?" or "Do you need help?") to spot and flag potentially harmful content.
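To make that concrete, here's a minimal sketch of what keyword-based flagging can look like. The phrase list, function name, and scoring are illustrative assumptions, not Facebook's actual system, which is far more sophisticated:

```python
# Hypothetical sketch of keyword-based flagging, loosely modeled on the
# approach described above. The phrases and logic are illustrative
# assumptions, not Facebook's real classifier.

# Comment phrases that can signal a friend is worried about the poster.
CONCERN_PHRASES = ["you ok", "are you ok", "do you need help"]

def flag_post(post_text: str, comments: list[str]) -> bool:
    """Return True if any comment on the post contains a concern phrase."""
    for comment in comments:
        lowered = comment.lower()
        if any(phrase in lowered for phrase in CONCERN_PHRASES):
            return True
    return False

print(flag_post("feeling really down lately", ["You OK? Message me anytime."]))  # True
print(flag_post("feeling really down lately", ["nice photo!"]))  # False
```

A real system would pair signals like this with a trained model rather than a fixed phrase list, but the basic idea of scanning comments for concern is the same.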
Facebook can then immediately send resources (or help—it now has a team of moderators trained in suicide prevention on call 24/7) to the person posting the suicidal content. The tech also prioritizes posts or videos that seem particularly urgent or dangerous, so users at greatest risk get help first.
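Prioritizing the most urgent cases is essentially a triage queue. Here's a small sketch of that idea; the risk scores and function name are hypothetical, just to show how higher-risk flags could be surfaced to reviewers first:

```python
import heapq

# Illustrative triage sketch: flagged posts with higher (assumed) risk
# scores are reviewed first. Scores here are made up for the example.

def triage(flagged_posts):
    """Return (post_id, score) pairs ordered from highest to lowest risk."""
    # heapq is a min-heap, so negate scores to pop the highest score first.
    heap = [(-score, post_id) for post_id, score in flagged_posts]
    heapq.heapify(heap)
    ordered = []
    while heap:
        neg_score, post_id = heapq.heappop(heap)
        ordered.append((post_id, -neg_score))
    return ordered

print(triage([("a", 0.4), ("b", 0.9), ("c", 0.7)]))
# [('b', 0.9), ('c', 0.7), ('a', 0.4)]
```

The point of ordering by score rather than by arrival time is exactly what the article describes: the users at greatest risk get human attention first.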
To be totally honest, sometimes AI freaks us out. It's a powerful, scary product of the tech age. But when it's used for good, it can actually save lives. And that's always worth it.