The Oversight Board wants Meta to revise its ‘incoherent’ rules on faked videos.

The Oversight Board, the external body that advises Meta on Facebook and Instagram content moderation, issued its decision on Monday in a case concerning a doctored seven-second video of President Biden that circulated last year.

The original footage shows the president accompanying his granddaughter, Natalie Biden, as she voted early in the 2022 midterms. In it, Biden places an “I Voted” sticker on her and kisses her on the cheek.

The clip was edited to remove the sticker and loop the moment of contact so that Biden appears to be groping the young woman, set to a song with sexualized lyrics. The seven-second video was posted to Facebook in May 2023 with a caption calling Biden a “sick pedophile.”


Monday’s ruling upholds Meta’s decision to leave the video online but calls the underlying policy “incoherent.” The Oversight Board took up the case last October, after a Facebook user reported the video and escalated the complaint when Facebook declined to remove it.

“As it stands, the policy makes little sense,” Oversight Board co-chair Michael McConnell said. The policy bars edited videos that show people saying things they never said, he noted, but not videos that show them doing things they never did, and it applies only to AI-generated video while leaving other kinds of fake content untouched.

The policy also fails to address manipulated audio, “one of the most potent forms of electoral disinformation,” McConnell said.

While the board’s decision to leave the video online is binding on Meta, its broader recommendations are not. Those recommendations urge Meta to revise the policy “urgently” in light of elections around the world and to focus it on the harms it is meant to prevent rather than on how a piece of content was created.

Meta established the Oversight Board in 2020, after Facebook had spent years under fire for spreading misinformation, extremism, and other harmful content. The board can issue final decisions in individual content moderation cases, which Meta has agreed to implement, but the company has only pledged to “consider” the board’s broader proposals to change Facebook and Instagram policy.

Beyond broadening its manipulated media policy, the Oversight Board advised Meta to label altered videos rather than relying on fact-checker-initiated takedowns, a process it calls “asymmetric depending on language and market.” By labeling more content instead of removing it, the board argues, Meta can better protect freedom of expression, mitigate harm, and give users more context and information.

Meta told Eltrys that it is “reviewing the Oversight Board’s guidance” and will respond publicly within 60 days.

As the Oversight Board noted when it accepted the Biden “cheap fake” case, Meta left the altered video online because its manipulated media policy, which covers misleadingly altered photos and videos, applies only when AI is used or when a video makes its subject appear to say something they did not say.

Designed with deepfakes in mind, the manipulated media policy applies only to video that is edited or synthesized in ways that are not obvious to an average person and could mislead them.

The altered video continues to circulate on X, formerly Twitter. Last month, a verified X account with 267,000 followers shared the clip with the caption, “The media just pretend this isn’t happening.” That post has more than 611,000 views.

The Biden video is not the first case in which the Oversight Board has pushed Meta to rewrite its rules. When Facebook suspended former President Trump’s account, the board upheld the decision but criticized the “vague, standardless” indefinite penalty. Across its cases, the board has consistently urged Meta to bring more detail, consistency, and transparency to its platform policies.

The Oversight Board, which launched in late 2020, missed the opportunity to be relevant during that year’s U.S. election and has been criticized for its unavoidable ties to the company it was designed to scrutinize. Ultimately, Meta decides whether to act on the board’s recommendations, even as the board gives the company cover from political criticism of its content moderation decisions.

Critics of Meta’s experiment in self-designed policy review say the board acts too slowly and too late to make a difference.

Meta now has a standardized mechanism for reviewing content moderation decisions, yet disinformation and other harmful content spread faster than anyone could have envisioned two U.S. general election cycles ago.

As the 2024 presidential race heats up, researchers and watchdog groups are bracing for a flood of misleading claims and AI-generated fakes. Yet even as new technology makes dangerous falsehoods easier to produce at scale, social media companies have quietly cut trust and safety spending and backed away from their once-public campaigns against disinformation.

“The volume of misleading content is rising, and the quality of tools to create it is rapidly increasing,” McConnell added.
