False claims about Covid vaccines – including that the Australian prime minister faked getting his jab – were among the hundreds of Australian videos TikTok removed from its platform as part of a misinformation crackdown.
In February, Facebook, Twitter, Google, Microsoft, Redbubble, Apple, Adobe and TikTok signed on to a new voluntary industry code aimed at combating misinformation and disinformation online.
On Saturday, the organisation representing the companies, Digi, released the first annual reports on how the tech giants are implementing the code’s obligations.
In its report, the social video platform TikTok revealed it had removed 651 videos mentioning Covid-19 or coronavirus between October 2020 and March 2021 for violating the company’s misinformation policy. A further 222 videos were removed for containing medical misinformation.
The false claims included that the prime minister, Scott Morrison, “faked” the Pfizer vaccination despite clear vision of him receiving the shot; that the health minister Greg Hunt’s cellulitis diagnosis was due to receiving the AstraZeneca vaccine; and that the AstraZeneca vaccine had caused “severe side effects” in “80%” of members of the Australian navy.
TikTok works with Agence France-Presse (AFP) to fact-check claims made about Covid-19, and passes on those fact-checked claims to the company’s moderators.
Close to 20,000 Australian videos in the same period also had a Covid-19 information label added to them, directing users to health sources. In instances where a video’s claims have been reviewed but not yet substantiated, users get a pop-up warning suggesting they not share the video.
Twitter reported that between July and December last year, 3.5m accounts globally had actions taken against them for violating the rules, including 1m that were suspended. A total of 4.5m tweets, banners or avatars were removed for violating the rules. On Covid misinformation specifically, 3,400 accounts had action taken against them, including 600 suspensions, and 3,900 tweets, banners or avatars were removed.
For accounts violating the company’s election integrity policy, 6,500 had actions taken against them, with 50 suspended. A total of 8,100 tweets, banners or avatars were removed.
In Australia specifically, 37,000 Twitter accounts had actions taken for violating the rules, including 7,200 suspensions. A total of 47,000 tweets, banners or avatars from Australian accounts were removed.
Just over 50 Australian accounts had actions taken against them for violating the Covid-19 misinformation policy, and fewer than 10 were suspended. Just 50 tweets were removed. More than 40 Australian accounts had actions taken against them for violating the election integrity policy, and 70 tweets, banners or avatars were removed under this policy.
Redbubble, the Australian marketplace where users can make their own merchandise, reported a spike in sales of merchandise with anti-vax tags, peaking at more than $15,000 in mid-2020.
The company said it removed 81 Australian-created products contravening its harmful misinformation policy, including items tagged with “pandemic”, “David Icke”, and various anti-vaccination tags.
On Friday, Facebook stated it had removed more than 110,000 pieces of Covid-related misinformation generated by Australian accounts in the first year of the pandemic.