For years, Facebook has built its algorithms to maximise engagement and clicks, a strategy that has helped the company garner 2.7 billion users across its family of apps, including Instagram and Messenger. But increasingly, the company is willing to work against its own software's design in order to combat the spread of harmful content.
The question is whether these changes are tweaks on the margins or more fundamental fixes to a service that, while massively profitable, has experienced a precipitous loss of public trust. The News Feed algorithm alone takes in hundreds of thousands of behavioral signals when evaluating which posts to promote, and it's tough to assess the impact any single fix might have on such a complex system.
On Instagram, the company says it will remove more sexually suggestive and other borderline content from its Explore and Discovery tabs, two features where people find new accounts and posts that they didn't explicitly seek out. Users who search for terms like #porn, #cocaine, and #opioids already find a blank page on the Explore tab, and the company will be increasing the number of blocked terms.