Hot Potato: YouTube has updated its medical misinformation policies to include new guidelines covering vaccines. Google's video-sharing platform said it was seeing a steady stream of false claims about Covid-19 vaccines spill over into misinformation about vaccines in general, and that it had reached a point where it needed to expand its rules beyond Covid-19 to other vaccines.
Under the new policies, YouTube will remove content claiming that approved vaccines are dangerous or cause chronic health problems. It is also removing videos that claim vaccines do not reduce transmission or contraction of disease, as well as content containing misinformation about vaccine ingredients, such as claims that vaccines cause infertility or contain substances that can somehow track the people who receive them.
Note that YouTube continues to allow content about vaccine policies, historical vaccine successes and failures, and new vaccine trials. Personal testimony about vaccines is also permitted "as long as the video doesn't violate other Community Guidelines, or the channel doesn't show a pattern of promoting vaccine hesitancy."
YouTube said it worked with local and international health organizations and experts to develop the new policies. According to the company, the site has removed more than 130,000 videos since last year for violating its Covid-19 vaccine policies.
According to CNBC, a YouTube spokesperson confirmed that it has also removed the channels of high-profile misinformation publishers, including Joseph Mercola, Sherri Tenpenny, and Robert F. Kennedy Jr.'s Children's Health Defense.
Facebook, Instagram, and Reddit have also taken action to combat vaccine misinformation, with varying degrees of success.
The changes take effect today, although YouTube admits that, as with any major update, it will take time for its systems to fully ramp up enforcement.
Image credit: Artem Powers, Paulina Tankilovich
YouTube bans anti-vaccine misinformation on its platform