In a notable policy shift, YouTube announced Friday that it has reversed its stance on election misinformation. Under the updated policy, the platform will now allow content that casts doubt on the accuracy of the 2020 presidential election results, a significant departure from its previous position, according to a statement on YouTube’s website.
YouTube’s parent company, Google, has a policy prohibiting content “advancing false claims that widespread fraud, errors, or glitches occurred in certain past elections to determine heads of government,” and YouTube wrote in December 2020 that it would remove content that “misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 U.S. Presidential election.” The platform will now “stop removing” such content about the 2020 election, per its website.
“In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm,” YouTube wrote. It added that “the ability to openly debate political ideas, even those that are controversial or based on disproven assumptions, is core to a functioning democratic society–especially in the midst of election season.”
Under the Google policy, users receive a strike for each violation after their first offense, and three strikes lead to their accounts being banned. A spokesperson for YouTube told the Daily Caller News Foundation that past content removed under the policy, and accounts removed for more than four violations, would not be reinstated.
The spokesperson also told the DCNF that the company would have “more details to share about our approach towards the 2024 election in the months to come.”