Facebook is refining some of its policies in response to recommendations from the Oversight Board. The board published its first content moderation decisions last month, a series of rulings that overturned some of Facebook’s original actions. In addition to those decisions on a handful of specific posts, the board also made recommendations on how the social network could change its policies.
Now Facebook has responded to those suggestions. The company says it is “committed to taking action” on 11 of the board’s recommendations, including updates to Instagram’s nudity policy. But in other areas, such as the suggestion that Facebook alert users when moderation decisions are the result of automation, the company has yet to commit to permanent changes.
Many of the changes Facebook says it is “committed” to are not so much major policy adjustments as promises to increase “transparency” around its existing rules. On that front, Facebook says it will clarify its rules around health misinformation, as with its recent vaccine policy updates that spell out the types of claims the company will remove. Facebook also plans to launch a new transparency center to better explain its community standards. The company further said it would “share more information about our policy on dangerous people and organizations,” but would only “assess the feasibility” of a recommendation that it publish a list of the groups and individuals covered by those rules.
One area where Facebook has accepted a bigger change is Instagram’s nudity policy, which now allows “health-related nudity.” The change follows Facebook’s restoration of a post from a user who shared photos to raise awareness of breast cancer.
Facebook’s use of automated tools to make content moderation decisions was also the subject of several of the board’s recommendations. The board had said Facebook should let users know when a moderation decision is the result of automation rather than human review. The social network says it will “test the board’s recommendation to tell people when their content is being removed by automation,” but stopped short of a permanent commitment.
The only area where Facebook has refused to implement changes is its coronavirus misinformation policy. The Oversight Board had ruled that Facebook should reinstate a post from a French user that wrongly claimed hydroxychloroquine could cure COVID-19. The board further recommended that Facebook use “less intrusive measures” to deal with misinformation about the pandemic when “a risk of physical harm is identified but is not imminent.”
But in its latest response, Facebook said that while it would clarify its coronavirus misinformation rules for users, it would not change the way it enforces them. “We will not take any further action on this recommendation as we believe we are already using the least intrusive enforcement measures given the likelihood of imminent harm,” Facebook said. “We restored the content based on the binding power of the board’s decision. We will continue to rely on extensive consultations with key public health authorities to tell us what is likely to be contributing to imminent physical harm. During a global pandemic, this approach will not change.”
While not necessarily surprising, Facebook’s response offers some insight into how the social network views the Oversight Board. Facebook has likened the independent body to its “Supreme Court,” and, like a court, its rulings on individual cases are binding. But Facebook has considerable leeway in whether or not to adopt the broader policy changes the board recommends. The fact that Facebook has embraced some recommendations while only agreeing to consider others suggests it remains at least somewhat reluctant to let the board have too much influence over its broader policies.
The company’s response comes as it prepares for what could be the Oversight Board’s most high-profile decision yet: whether or not to reinstate a suspended account. The board has not said exactly when it will rule on the matter, but a decision is expected in the coming weeks.