
Facebook tests alerting users to extremist posts

Facebook's "Redirect Initiative" features are intended to route people using hate- or violence-related search terms toward resources, education or outreach groups aimed at more harmonious outcomes, according to the social media giant. (AFP)

Jul 03, 2021 - 09:20 AM

SAN FRANCISCO — A Facebook test of pop-up boxes asking people whether they think friends are becoming extremists raised concerns Friday among US conservatives who felt their voices might be stifled.

Facebook spokesman Andy Stone said in a Twitter exchange that the alerts sprang from an initiative at the social network to combat violent extremism and dangerous organizations.

“Redirect Initiative” features are intended to route people using hate- or violence-related search terms toward resources, education or outreach groups aimed at more harmonious outcomes, according to Facebook.

For example, Facebook said that searches related to white supremacy in the United States get directed to a Life After Hate group that provides crisis intervention.

Images of the alerts shared on Twitter showed messages asking whether users were worried someone they knew was becoming an extremist or if they had been exposed to extremist content.

People could opt to click on a link to “get support” or simply close the pop-up box.

Virginia state politician Nicholas Freitas, a Republican, was among those who shared an image of the Facebook alert on Twitter.

“I have a real concern that some leftist technocrats are creating an Orwellian environment where people are being arbitrarily silenced or banned for saying something the ‘thought police’ doesn’t like,” Freitas said in the post.

Facebook and other online platforms have been under pressure to stop the spread of misinformation and posts leading to real-world violence.

The social media giant recently beefed up automated tools to assist group moderators striving to keep exchanges civil in a time of clashing viewpoints.

Automated systems at Facebook check for posts in groups and news feeds that violate the platform’s rules about what content is acceptable.

Facebook in June banned former US president Donald Trump for two years, saying he deserved the maximum punishment for violating platform rules over a deadly attack by his supporters on the US Capitol.

Trump was suspended from Facebook and Instagram after posting a video during the attack by his fired-up supporters challenging his election loss, in which he told the rioters: “We love you, you’re very special.”

The punishment was effective from January 7, when Trump was booted off the social media giant, and came after Facebook’s independent oversight board said the indefinite ban imposed initially should be reviewed.

“Given the gravity of the circumstances that led to Mr. Trump’s suspension, we believe his actions constituted a severe violation of our rules which merit the highest penalty available under the new enforcement protocols,” Facebook vice president of global affairs Nick Clegg said in a post.

Facebook also said it would no longer give politicians blanket immunity for deceptive or abusive content on the grounds that their comments are newsworthy.
