The long-awaited independent oversight board for Facebook was finally announced on 6th May. The board members reportedly come from a range of professions, cultures, and religions, and hold a variety of political views. The board aims to govern content on social media, which is becoming more complicated and unmanageable each day. Arguably there is a need to remove bogus content from the internet, but the question is how.
Named the Facebook Oversight Board, it is committed to supporting free expression within the constraints of human rights law. Its work will directly affect Facebook users and society at large. Reshaping how Facebook affects the world is a mountainous task, and anybody, including Facebook itself, will face huge challenges in attempting it.
Independent Facebook Oversight Board to Decide The Fate Of Content
Since its founding, Facebook has been controlled by CEO Mark Zuckerberg, who is also chairman of the board and controls most of the company’s voting shares. The independent board holds limited powers to decide which content is acceptable and which needs to be taken down. Initially, it will have 20 members, but the number will later rise to 40.
The board will start by reviewing appeals against content takedowns on Facebook. Its decisions will overrule those of all other content moderators working at Facebook. The company can also refer any number of takedown cases to the Facebook Oversight Board. Proper content management is essential at Facebook, because the company cannot afford to keep making so many mistakes.
Is The New Board Merely A Controversy Shield?
Facebook has faced many controversies over the mismanagement of different types of content, such as news, history, and art. The company has also faced data privacy issues, such as the Cambridge Analytica scandal that made headlines. Many people believed that the company made the wrong decision in removing certain posts while a plethora of misinformation remained online. It is also argued that Facebook may not be able to judge the legitimacy of all posted content, which is why the Facebook Oversight Board will act in its stead.
Misinformation contaminates the minds of the general public, intensifying differences of opinion into intolerance and hate-mongering. The board reportedly aims to identify content from various actors, including hackers, extremists, and conspiracy theorists, who are proficient at manipulating the rules of social media platforms.
Challenges In Social Media Content Moderation
Highly skilled bad actors may not be the toughest challenge the board faces. The board plans to implement a system that is easily accessible to the general public, but it is deploying that system in phases. In the initial stage, users will only be able to appeal to the board when Facebook takes down their content. The ability to appeal to the board for the removal of particular content will be deployed in the coming months.
Moreover, the Facebook team and board members admit that they will not be able to review all of the thousands of potential cases each year. So what difference will it make? Viral content needs only a few hours to do its intended damage. Therefore, the social media giant will need to come up with a more comprehensive strategy.
It is highly likely that while the board is working on one problem, the contamination will be spreading elsewhere. This concern may even be secondary to the power vested in Facebook to amplify certain content over other content, since the company has to keep the importance of its revenue base in mind.
On paper, Facebook agrees to expand the decision-making and policy-making powers of the Facebook Oversight Board, but it is not certain that this step will retune Facebook’s algorithms.
The company can also summon the help of technology such as artificial intelligence (AI). Incorporating AI can make a notable difference in content investigation; Facebook’s AI system for understanding memes is one such example. The skilled members of the board could use AI to expand the scope of content moderation on social media.
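To make the idea of automated content triage concrete, here is a minimal sketch of how a system might flag posts for human review. This is purely illustrative: real moderation systems such as Facebook's rely on large machine-learning models, whereas the keyword list, function name, and threshold below are invented for demonstration only.

```python
# Illustrative sketch: a trivial rule-based flagger that routes posts
# to human moderators. The phrase list and threshold are hypothetical;
# production systems use trained ML classifiers, not keyword matching.

SUSPECT_TERMS = {"miracle cure", "deep state", "crisis actor"}

def flag_for_review(post: str, threshold: int = 1) -> bool:
    """Return True if the post contains enough suspect phrases
    to warrant review by a human moderator."""
    text = post.lower()
    hits = sum(term in text for term in SUSPECT_TERMS)
    return hits >= threshold

posts = [
    "Check out this MIRACLE CURE doctors won't tell you about!",
    "Lovely weather today.",
]
flagged = [p for p in posts if flag_for_review(p)]
print(len(flagged))  # 1
```

Even a toy like this shows the trade-off the board will face: automation scales to millions of posts, but the final judgment on borderline cases still lands with human reviewers.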