Facebook Provides Transparency? Platform Gives Users More Control Over News Feed


Facebook has, once again, decided to change its algorithm, but shockingly, not in a way that insults the platform’s users. The platform made the unusual move to provide more transparency and give its users more control over their news feed.

The company introduced new features that allow users to control “who can comment on [their] public posts.” Users also will have the ability to “control and prioritize posts from the friends and Pages [they] care about most in News Feed,” Facebook said in a blog post. 

First, the platform announced that users will have more control over who can comment on their posts. “Now, you can control your commenting audience for a given public post by choosing from a menu of options ranging from anyone who can see the post to only the people and Pages you tag,” Facebook said.

Facebook also created an option for users to rank what they see in their News Feed. “By selecting up to 30 friends and Pages to include in Favorites, their posts will appear higher in ranked News Feed and can also be viewed as a separate filter,” the company added.

Facebook also provided more information about suggested posts as a final adjustment to its algorithm: “Today, we’re also providing more context around the content we suggest in News Feed by expanding ‘Why am I seeing this?’” Facebook explained that users can “check out [their] News Feed preferences and privacy settings in the app and adjust them to [their] liking.”

Facebook VP of Global Affairs Nick Clegg explained Facebook’s decision in a Medium column. He wrote: “In the long run, people are only going to feel comfortable with these algorithmic systems if they have more visibility into how they work and then have the ability to exercise more informed control over them. Companies like Facebook need to be frank about how the relationship between you and their major algorithms really works. And we need to give you more control over how, and even whether, they work for you.”   

“You should be able to talk back to the algorithm and consciously adjust or ignore the predictions it makes,” Clegg added.  

Despite giving more choice to its users, the platform still has a slew of issues. Former Facebook employee Cassandra Spencer wrote an exposé that alleged anti-conservative bias within the company. Additionally, a report from the Foundation for Government Accountability (FGA) claimed that donations from Facebook CEO Mark Zuckerberg may have swayed the vote in Arizona for President Joe Biden in the 2020 election. 

Conservatives are under attack. Contact Facebook headquarters at 1-650-308-7300 and demand that Big Tech be held to account and provide clarity on its “hate speech” rules, which seem to be applied inconsistently. If you have been censored, contact us via the Media Research Center contact form, and help us hold Big Tech accountable.




