In February 2023, Meta consulted the Oversight Board on whether to continue removing content that uses the Arabic term "shaheed" (شهيد) to refer to an individual designated under the Dangerous Organizations and Individuals Community Standard. "Shaheed" is an honorific with a range of meanings, including referring to those who die honorably or unexpectedly. In its policy advisory opinion, the Board found Meta's approach overly broad and unduly restrictive of free expression. The Board recommended that Meta stop presuming that any use of "shaheed" to reference a designated individual is violating, and instead remove such content only when it is accompanied by signals of violence or other policy violations.
What was the impact of Meta’s implementation of this recommendation?
In response to the Board's recommendation, Meta updated its policy to allow use of the word "shaheed" when the content contains no signals of violence and does not praise dangerous organizations or individuals, such as terrorists. After the policy change, the Board's data team used Meta Content Library data to conduct an independent evaluation of the change and its impact on free expression. They identified a 19.5% increase in daily posts containing the word "shaheed" that received more than 50,000 views across Facebook and Instagram. Read more on the Board's assessment in its 2024 report.
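As a rough illustration of the kind of before/after comparison behind a figure like the 19.5% increase, the sketch below computes the percent change in mean daily post counts between two periods. The daily counts are entirely invented for illustration; the Board's actual analysis used real Meta Content Library data and its methodology is not reproduced here.

```python
# Hypothetical sketch: percent change in average daily counts of
# qualifying posts before and after a policy change. All numbers
# below are invented for illustration only.

def percent_change(before: list[int], after: list[int]) -> float:
    """Percent change in the mean daily count from `before` to `after`."""
    mean_before = sum(before) / len(before)
    mean_after = sum(after) / len(after)
    return (mean_after - mean_before) / mean_before * 100

# Invented daily counts of posts with >50,000 views containing the term.
pre_change = [200, 210, 190, 205, 195]
post_change = [240, 250, 238, 242, 225]

print(f"{percent_change(pre_change, post_change):+.1f}%")  # prints +19.5%
```

The illustrative counts were chosen so the output matches the 19.5% figure reported by the Board; any real evaluation would also need to control for seasonality and overall platform growth before attributing the change to the policy update.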