Meta has published an in-depth analysis of the company’s social media algorithms in an effort to demystify how content is recommended to Instagram and Facebook users. In a blog post released on Thursday, Meta’s president of global affairs Nick Clegg said that releasing information about the AI systems behind its algorithms was part of the company’s “broader ethic of openness, transparency and accountability,” and outlined what Facebook and Instagram users can do to better control the content they see on the platforms.
“With the rapid advancements being made with powerful technologies like generative AI, it’s understandable that people are both excited about the possibilities and concerned about the risks,” Clegg said in the blog post. “We believe the best way to address these concerns is to be open.”
Twenty-two “system cards” are now available that describe how content is ranked and recommended to Facebook and Instagram users.
Most of the information is contained in 22 “system cards” covering the feed, Stories, Reels, and other ways people discover and consume content on Meta’s social media platforms. Each of these cards provides detailed yet accessible information about how the AI systems behind these features rank and recommend content. For example, the system card for Instagram’s Explore (a feature that shows users photos and reels from accounts they don’t follow) explains the three-step process behind Instagram’s automated AI recommendation engine.
The card notes that Instagram users can influence this process by saving content (signaling that the system should show similar items) or marking it as “not interested” (encouraging the system to filter out similar content in the future). Users can also see reels and photos that weren’t specifically selected for them by the algorithm by selecting “Not personalized” in the Explore filter. More information about Meta’s predictive AI models, the input signals used to direct them, and how frequently they are used to rank content is available through its Transparency Center.
Instagram is testing a feature that will allow users to mark reels as “interested” to see similar content in the future
Besides system cards, the blog post mentions a few other Instagram and Facebook features that can show users why they see certain content and how they can personalize their recommendations. Meta is expanding its “Why am I seeing this?” feature to Facebook Reels, Instagram Reels, and Instagram’s Explore tab in “the coming weeks.” This will allow users to click on an individual reel to find out how their previous activity may have influenced the system to show it to them. Instagram is also testing a new Reels feature that will allow users to mark recommended reels as “Interested” to see similar content in the future. The ability to mark content as “Not interested” has been available since 2021.
Meta also announced that it will begin rolling out its Content Library and API, a new suite of tools for researchers, in the coming weeks, containing a wealth of public data from Instagram and Facebook. Data in this library can be searched, explored, and filtered, and researchers will be able to apply for access to these tools through approved partners, beginning with the University of Michigan’s Inter-university Consortium for Political and Social Research. Meta says these tools will provide “the most comprehensive access to publicly available content on Facebook and Instagram of any research tool we’ve built to date,” while helping the company meet its data-sharing and transparency obligations.
These transparency obligations are potentially the biggest factor driving Meta’s decision to better explain how it uses AI to shape the content we see and interact with. The explosive development of AI technology and its surge in popularity in recent months have attracted the attention of regulators around the world, who have expressed concern about how these systems collect, manage, and use our personal data. Meta’s algorithms aren’t new, but its mishandling of user data during the Cambridge Analytica scandal and the backlash against TikTok’s lukewarm transparency efforts are probably strong motivation to over-communicate this time.