Facebook Inc. FB was aware its platform gave rise to extremist and polarizing content but chose to ignore it.
What Happened
Facebook knew as early as 2018 that its algorithms were driving a wedge between users. “Our algorithms exploit the human brain’s attraction to divisiveness,” a slide from an internal presentation read. “If left unchecked,” the platform would serve users “more and more divisive content in an effort to gain user attention & increase time on the platform,” according to the Wall Street Journal.
The presentation was the result of an internal effort by Facebook to understand how it affected the behavior of its users and how the harm could be limited. It noted a high number of extremist groups on the platform and found that the company’s own algorithms were driving their growth.
Why It Matters
The presentation led to a debate at Facebook that reached all the way to CEO Mark Zuckerberg, who indicated he was losing interest in recalibrating the platform in the name of social good, the WSJ reported.
When Facebook’s News Feed team was reorganized in 2018, the company told employees it was shifting its priorities “away from societal good to individual value.”
Facebook formed a task force named “Common Ground” in 2017 to investigate how polarizing content was spreading on its platform.
The findings of the task force were either ignored or diluted to give the appearance that Facebook was not trying to shape users’ opinions or take a moral stance.
Zuckerberg recently admitted that the social network wasn’t prepared for the 2016 U.S. presidential election and said it would work to stop the flow of misleading information.
Price Action
Facebook shares traded 0.25% lower at $231.62 in the after-hours session on Tuesday. The shares had closed the regular session 1.15% lower at $232.20.