Meta’s Community Safety Efforts 

Daily News Egypt

Following the recent attacks, our teams introduced a series of measures to address the spike in harmful and potentially harmful content spreading on our platforms. Expert teams from across our company have been working around the clock to monitor our platforms, while protecting people’s ability to use our apps to shed light on important developments happening on the ground.

The following are some of the specific steps we have taken:

  1. Mobilising a special operations center: we quickly established a special operations center staffed with experts, including fluent Arabic and Hebrew speakers, to closely monitor and respond to this rapidly evolving situation in real time. This allows us to remove content that violates our Community Standards or Community Guidelines faster, and serves as another line of defense against misinformation.
  2. Fixing bugs: We identified and fixed several bugs this past week. One affected all Stories that re-shared Reels and Feed posts on Instagram, meaning they weren’t showing up properly in people’s Stories, leading to significantly reduced reach. This bug affected accounts equally around the globe – not only people trying to post about what’s happening in Israel and Gaza – irrespective of the subject matter of the content. We fixed this bug as quickly as possible. Another bug briefly prevented people from going Live on Facebook. This was also a global issue that was fixed within a few hours. We understand people rely on these tools every day to stay connected, and we’re sorry to anyone who felt the impact of these issues.
  3. Enforcing our policies and community guidelines: We continue to enforce our policies on Dangerous Organizations and Individuals, Violent and Graphic Content, Hate Speech, Violence and Incitement, Bullying and Harassment, and Coordinating Harm. In the three days following October 7, we removed or marked as disturbing more than 795,000 pieces of content for violating these policies. The content was in both Hebrew and Arabic, and came from all geographies.
  4. Removing content around dangerous organizations: Hamas is designated by the US government as both a Foreign Terrorist Organization and a Specially Designated Global Terrorist. It is also designated under Meta’s Dangerous Organizations and Individuals policy. This means Hamas is banned from our platforms, and we remove praise and substantive support of it when we become aware of it, while continuing to allow social and political discourse, such as news reporting, discussion of human rights issues, or academic, neutral and condemning discussion.
  5. Enhancing comment and profile settings: As a temporary measure to protect people in the region from unwelcome or unwanted comments, we have:
  • Changed the default setting for who can comment on newly created public Facebook posts of people in the region to Friends and/or established followers only. Users globally can opt in or out of this setting at any time, and we are notifying people in the region with specific instructions on how to change it.
  • Made it easier for people to bulk-delete comments on their posts.
  • Disabled the feature that normally displays the first one or two comments under posts in Feed.
  • Rolled out the Lock Your Profile tool in the region, which lets people lock their Facebook profile in one step. When a profile is locked, people who aren’t that person’s friends can’t download, enlarge or share the profile photo, nor can they see posts or other photos on the profile, regardless of when they were posted.
  6. Empowering voices while ensuring safety: We want to reiterate that our policies are designed to give everyone a voice while keeping people safe on our apps. We apply these policies regardless of who is posting or their personal beliefs, and it is never our intention to suppress a particular community or point of view. Given the higher volumes of content being reported to us, we know content that doesn’t violate our policies may be removed in error. To mitigate this, for some violations we are temporarily removing content without strikes, meaning these content removals won’t cause accounts to be disabled. We also continue to provide tools for users to appeal our decisions if they think we made a mistake.
  7. Fundraising on Facebook and Instagram: Since October 7, people have raised more than $11.5 million for nonprofits on Facebook and Instagram to help with relief efforts in Palestine and Israel. This includes over 340,000 donations to 262 charities – providing disaster relief, ambulance and blood services, medical care and more.

We understand that this is a dynamic situation, and these actions may be updated in due course. 

You can read more about our Community Standards here: https://transparency.fb.com/policies/community-standards
