The social-media giant is working on hardware that can analyze and filter live video.
Background: After Facebook rolled out its live video feature in 2016, the company drew criticism when a rash of suicides was streamed to audiences on the platform. In response, it built AI tools to spot dangerous behavior and hired more reviewers, cutting the time to take down flagged footage to under 10 minutes.
Improvement: Chips running AI software that can recognize self-harm, sexual acts, or other activity Facebook wants to ban would reduce how much suspect video human moderators have to watch.
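To make the idea concrete, here is a minimal sketch of how on-device classification could cut the volume of video routed to human moderators. Everything in it is an illustrative assumption, not Facebook's actual system: the category names, the threshold, and the stand-in `score_frame` model are hypothetical.

```python
# Hypothetical sketch: flag only high-risk frames for human review.
import random
from typing import Dict, List

POLICY_CATEGORIES = ["self_harm", "sexual_content"]  # illustrative categories
REVIEW_THRESHOLD = 0.8                               # assumed confidence cutoff


def score_frame(frame: bytes) -> Dict[str, float]:
    """Stand-in for a classifier running on dedicated AI hardware.

    A real system would run a trained video model; random scores are
    returned here only so the sketch is runnable end to end.
    """
    return {cat: random.random() for cat in POLICY_CATEGORIES}


def frames_needing_review(frames: List[bytes]) -> List[int]:
    """Return indices of frames that should still go to a human moderator."""
    flagged = []
    for i, frame in enumerate(frames):
        scores = score_frame(frame)
        if any(scores[cat] >= REVIEW_THRESHOLD for cat in POLICY_CATEGORIES):
            flagged.append(i)
    return flagged


if __name__ == "__main__":
    fake_stream = [b"frame"] * 10
    print("Frames for human review:", frames_needing_review(fake_stream))
```

The design point is triage: the model handles the bulk of the stream automatically, and only segments it scores as likely violations reach a person.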
Why it matters: Mark Zuckerberg has big plans for these kinds of custom-built systems. By designing and making its own hardware, Facebook could not only improve its platform but also cut costs by reducing its reliance on chipmakers like Nvidia and Intel.