Under pressure to deal with complaints about the nature of extreme content that finds its way onto Facebook, the social media giant this week announced an answer: an innovation in governance.
Facebook will establish an external independent board of wise men and women to adjudicate on complaints about content. Funded by a trust, the board will come into being in the coming months and will seek to apply the networking site’s ethical standards at arm’s length.
The move comes after a wave of scandals involving the site and its use for spreading “fake news”, as well as its management of user information.
Board members will constitute “panels” to sit in judgement on cases, with the full board making “binding decisions” on specific items of content.
Mark Zuckerberg, the site’s founder, chief executive and chair, said in a statement: “The board will be an advocate for our community—supporting people’s right to free expression, and making sure we fulfill our responsibility to keep people safe.
“As an independent organisation, we hope it gives people confidence that their views will be heard, and that Facebook doesn’t have the ultimate power over their expression.
“Just as our board of directors keeps Facebook accountable to our shareholders, we believe the oversight board can do the same for our community.”
Extreme content
The move, foreshadowed by Zuckerberg earlier this year, comes as all social media platforms grapple with maintaining a stance on freedom of speech while at the same time managing extreme content published on their sites.
Dealing with the governance of social media content is not an easy process. In April Google announced it was shutting down its own ethics board after controversy emerged over the appointment of key members. Google said it would “go back to the drawing board” in developing its ethical approach to issues such as artificial intelligence, machine learning and facial recognition.
This week’s move from Facebook received a positive response in some quarters. Responding to Facebook’s pledge to publicly explain why it will or won’t implement an oversight board decision, Kate Klonick, a professor at St John’s University Law School, told Bloomberg News it was a “huge deal” and constituted “probably the most accountable we’ve ever seen Facebook”.
Others pointed to obstacles that could soon be confronting the new board. Philippa Foster Back, director of the Institute of Business Ethics, said the Facebook oversight board raised two areas of concern.
“The first is the timeliness of the panel reviewing cases, when the platform operates real time, and the second is relying on the judgement of the panel members which might not preclude bias,” she said.
Timeliness could be a critical issue during election campaigns, which typically run over short time frames and might require quick decisions on material considered either misleading or extreme.
But Foster Back also pointed out that public trust in social media remains at comparatively low levels, which could undermine the board’s work. Edelman’s 2019 Trust Barometer revealed this year that trust in social media in both Europe and North America stood at just 34%. Meanwhile, trust in traditional media stood at 60% in Europe and 65% in North America. Overall, 73% of those polled said they worried about false information or fake news being used as a weapon.
“These statistics make for a potentially high level of scepticism towards the Facebook initiative, particularly on the back of Google’s failure with their introduction of an Ethics Council,” said Foster Back.
‘User democracy’
Elsewhere, others have already argued that a “Supreme Court” style approach, while a step in the right direction, might not restore public faith in Facebook.
According to Henning Meyer, a research associate at the London School of Economics and director of consultancy New Global Strategy, Facebook not only needs to take judgements outside its existing structures but also implement “user democracy” to help develop its own rules.
In a blog post in June, Meyer said the “Supreme Court” approach takes decisions away from the company, creates accountability and transparency, and ensures decisions are not determined by commercial interests.
But it would “fall short” because of doubts over the legitimacy of Facebook’s rules. “What counts is what the users consider to be legitimate,” Meyer said.
He concluded: “In a period in which private sector companies exercise public functions, the transfer of more elements of democratic statehood into the corporate governance of companies could be a suitable instrument for tackling their problems of legitimacy.”
That would be a radical step for any social media outfit, where founders have become notorious for attempting to retain as much control over their organisations as possible.
But Meyer may be right, and organisations like Facebook may be engaged in a process of governance evolution. Other platforms have yet to show their hands over the governance of content, despite political and public pressure. Social media companies are still grappling with the ethical questions raised by their services. They may follow Facebook’s example, or they may innovate further.
Read Facebook’s full statement here.