Facebook vice president of global public policy Joel Kaplan and Facebook CEO Mark Zuckerberg leave the Elysee Presidential Palace after a meeting with French President Emmanuel Macron on May 23, 2018 in Paris, France.
Facebook reportedly killed a feature meant to expose its users to differing viewpoints over fears of being seen as having a liberal bias, according to a Wall Street Journal report published Sunday.
Facebook’s most prominent conservative executive and global policy head, Joel Kaplan, reportedly pushed for the feature to be squashed, according to the Journal.
Facebook has long faced criticism of having a liberal slant, which has ramped up as the company’s top executives have had to face lawmakers to testify over its use of consumer data. Kaplan, in particular, has emerged as a controversial figure as the company has entered the limelight, most notably after attending Justice Brett Kavanaugh’s hearing over allegations of sexual misconduct. The Journal report found that Kaplan, a former aide to President George W. Bush, had key input on the types of information that would be exposed to Facebook’s users.
Kaplan reportedly raised concerns over an internal analysis by Facebook of how its users were exposed to a range of information. The analysis found that Facebook’s right-leaning users were less likely to be exposed to differing views and therefore more polarized, according to the Journal. Facebook’s so-called Common Ground initiative, which proposed in part to boost articles in the News Feed that were liked and commented on by a range of users across the political spectrum, was meant to bring a greater diversity of perspectives to users. Kaplan argued that the effort would quiet conservative voices to a disproportionate degree, the Journal reported.
In a statement, a Facebook spokesperson said, “Understanding a wide variety of viewpoints is an important part of our product development process and is essential for building products and services that serve everyone. The public policy team, led by Joel Kaplan, is tasked with understanding the viewpoints of groups, regulators, governments, NGOs and other stakeholders from around the world and using that knowledge to inform product conversations and decisions. The team plays an essential role in making sure we follow objective standards and that our policies are applied fairly and consistently.”
Facebook has tried other ideas to stop the spread of misinformation on its site and help users distinguish quality information from that which is less well-sourced. It has added a number of third-party fact-checkers to help it rank information in the News Feed and give context about the origin of various articles. That program has faced criticism of its own, including that it can only review a limited amount of information given the capacity of the fact-checkers.