Facebook has agreed to amend some of its policies in response to recommendations from its oversight board, which issued its first set of content moderation decisions last month, several of which overturned Facebook's original rulings.
In addition to those decisions on a few specific posts, the oversight board also made recommendations on how the social network could change its policies.
Facebook said it was committed to taking action on 11 of the board's recommendations, including updates to Instagram's nudity policy.
But in other areas, such as the suggestion that Facebook alert users when enforcement decisions are the result of automation, the company has yet to commit to permanent changes.
Moreover, the areas in which Facebook says it is committed to change involve no major policy shifts; they amount instead to promises of greater transparency around its existing rules.
For instance, the platform said it intends to make its rules on health misinformation clearer.
Facebook also plans to launch a new transparency center to better explain its community standards to users.
The company said it is sharing more information about its dangerous individuals and organizations policy, but is still assessing the feasibility of the recommendation that it publish a list of the groups and individuals covered by those rules.
Health-related nudity is now allowed, after Facebook restored a post from a user who had shared photos to raise awareness of breast cancer.
Facebook's use of automated tools in content moderation decisions also figured in several of the oversight board's recommendations.
The oversight board said Facebook should alert users when enforcement is the result of automation rather than human review of the content.
The social network says it is testing the board's recommendation to notify people when content is removed by automation, but it has not committed to doing so permanently.
But the one area where Facebook has refused to make any changes is its coronavirus misinformation policy.
The oversight board had ruled that Facebook should restore a post by a French user falsely claiming that hydroxychloroquine can treat the coronavirus.
The board also recommended that Facebook use less intrusive measures when dealing with pandemic misinformation where the potential for physical harm has been identified but is not imminent.
Facebook said it will make its coronavirus misinformation rules more visible to users, but will not change how they are enforced.
Facebook wrote: We will not take any further action on this recommendation, because we believe we are already using the least intrusive measures given the potential for imminent harm.
The company added: We continue to rely on extensive consultation with leading public health authorities to inform us of what is likely to contribute to imminent physical harm, and this approach will not change during a global pandemic.
Facebook's response offers some insight into how the social network views the independent oversight board, which the company has likened to a Supreme Court.
Like a court, its decisions are supposed to be binding. But Facebook has plenty of room to maneuver in adopting the broader policy changes the board recommends.
Facebook's adoption of some of the proposals, while agreeing only to consider others, suggests it remains reluctant to let the oversight board have a significant impact on its broader policy structure.