The Facebook Oversight Board is reviewing a significant case involving a manipulated video of President Joe Biden, one that could reshape Meta’s policies on “manipulated media” in the lead-up to the 2024 election. The case raises broader questions about the role of altered media in shaping elections worldwide.
The video in question features an altered clip of President Biden placing an “I Voted” sticker on his granddaughter’s chest and kissing her cheek during the midterm elections last year. The original footage was edited to repeat Biden’s motion of touching the girl’s chest, accompanied by Pharoahe Monch’s song “Simon Says,” specifically the line “Girls, rub on your titties.” The video was shared on Facebook with a caption labeling Biden as a “sick pedophile.”
A Facebook user reported the video to Meta, but the company declined to remove the post, arguing that its “manipulated media” policy applies only to content generated by artificial intelligence or to clips in which a subject is shown saying words they never actually said. This case raises the question of whether Meta’s responsibility extends to video content that has been edited to mislead viewers about public figures.
Thomas Hughes, the director of the Oversight Board administration, emphasized the broader significance of this case, stating that it touches on the impact of manipulated media on elections globally. While acknowledging the importance of free speech as a cornerstone of democratic governance, he highlighted the complex issues surrounding Meta’s human rights obligations when it comes to misleading audiovisual content.
Meta introduced its manipulated media policy in 2020. The policy aimed to identify and remove AI-generated content that had been manipulated in ways that might deceive the average viewer. However, because it covered only AI-generated manipulation, it failed to address some of the most heavily criticized altered videos circulating on the platform at the time. For instance, viral videos showing then-House Speaker Nancy Pelosi appearing to slur her words, or candidate Biden appearing to make a racist remark, were likely created with readily accessible editing software such as iMovie.
The video in question falls into the category of manipulated videos that Meta has chosen not to remove in the past. While the Oversight Board can review the case and propose policy changes, Meta is not obligated to implement any suggestions made by the board.
Currently, the public comment window is open until October 24th. Once the board reaches a decision, Meta will have 60 days to respond. This process allows for public input and ensures a thorough evaluation of the case before any potential policy changes are made.
The outcome of this case has implications beyond the specific video involving President Biden. It could shape Meta’s policies on manipulated media and how the company handles misleading content in future elections, not just in the United States but worldwide. As manipulated media becomes increasingly prevalent, platforms must strike a balance between safeguarding free speech and protecting against the manipulation of information that could influence elections.
In conclusion, the Oversight Board’s review of the manipulated Biden video presents a significant opportunity to reevaluate Meta’s policies on manipulated media. Whatever the board decides will carry far-reaching consequences for how the company addresses misleading content in the context of elections, and for whether Meta upholds its responsibility to maintain a safe, well-informed online environment for users worldwide.