Boards are notorious for groupthink and spinelessness. In Facebook’s case, a board could make a difference.
Documents reported by the Wall Street Journal earlier this week revealed a secret system at Facebook to coddle the posts of politicians and celebrities, as well as internal stats on the full extent of psychological harm that Instagram causes teen girls. Both issues were long suspected. Both are now backed by startling evidence.
What they boil down to is this: Facebook, like the oil and tobacco companies of years ago, has been far too secretive about the impact of its products on human life.
When U.S. senators asked Facebook last month about Instagram’s impact on teen mental health, the company did not answer, likely because its internal stats looked so bad: A third of teenage girls who already felt bad about their bodies felt worse when they went on Instagram, according to the Journal’s reporting, which cited documents from a person seeking federal whistleblower protection. Facebook has also deflected public questioning of XCheck, the internal system that allowed more than 5 million elite users, such as celebrities and politicians, to skirt Facebook’s content rules.
On the one hand, Facebook could see this as just another PR nightmare that will eventually blow over, barely denting its growth in users and revenue. After all, with regulators focused on Facebook’s corporate behavior and governments on social media more broadly, who will actually do anything about Facebook’s chronic secrecy? One party could be Facebook’s Oversight Board.
‘Supreme Court of Facebook’
This is the 20-member panel of academics, former politicians, and activists that Facebook founded in 2019 to help make it more accountable for the way it moderates content on the site, in a unique experiment in corporate governance. Sometimes described as a Supreme Court of Facebook, the board reviews the site’s higher-profile rulings on content, such as the decision to indefinitely ban former President Donald Trump, which the board overturned and openly criticized. Noah Feldman, a contributing columnist for Bloomberg Opinion, is an advisor to Facebook and helped set up the board.
If you have already heard of the Oversight Board, you’ve likely read the unflattering assessments, too. Skeptics say it is largely toothless, since Facebook funds its six-year, $130 million budget and the salaries of its members, reportedly in the six figures. And while it has overturned more of Facebook’s decisions than not, it only looks at individual cases, when it should arguably be scrutinizing Facebook on bigger issues like its potentially harmful recommendation algorithms.
But it is also no shrinking violet. Within hours of the publication of the Wall Street Journal’s story about elite users, the board posted this:
The Oversight Board has expressed on multiple occasions its concern about the lack of transparency in Facebook’s content moderation processes, especially relating to the company’s inconsistent management of high-profile accounts.
— Oversight Board (@OversightBoard) September 13, 2021
It went on to add: “The Board has repeatedly made recommendations that Facebook be far more transparent in general…”
You can almost hear the frustration.
What lies ahead for the Oversight Board?
I’m eager to see what the Oversight Board does next. In the coming weeks, it is scheduled to publish its first-ever “transparency report” into Facebook, according to an Oversight Board spokesman. It will be the first in a series of quarterly reviews aimed at shining a light on how diligently Facebook has been following the board’s recommendations for each case. Many of the 70 recommendations the board has made to Facebook so far have focused on transparency.
Armed with the latest revelations about Facebook’s XCheck system and its studies on teens, the board’s upcoming report could be a golden opportunity to put its foot down. Its members could, if they truly want to make a mark, threaten to quit if Facebook doesn’t divulge the information they are demanding.
The board’s big problem until now has been that its recommendations are not legally binding (only its rulings are). But a senior board administrator recently told me that those same recommendations could, in theory, change the way Facebook’s algorithms are designed to handle content. The board now has a chance to push Facebook to make those changes. The revelations about the company’s internal policies, and renewed pressure from politicians, should give it extra leverage.
It’s worth noting the structural changes happening in the coming months: The board is working on adding 20 new members to swell its ranks to 40. Here’s hoping they bring on a few rabble-rousers.—Bloomberg