Educational videos will be shown in the Czech Republic, Slovakia and Poland to prevent misinformation about Ukrainian refugees
Fake news is, unfortunately, a feature of our times, with the vast ocean of the internet often making it difficult to combat misinformation.
YouTube, however, is set to introduce educational ads to help users identify potentially misleading content.
Following a successful experiment by the University of Cambridge, YouTube will show these ads in the Czech Republic, Slovakia and Poland to counter misinformation about Ukrainian refugees.
In the experiment, the ads were shown to 5.4 million people, 22,000 of whom then participated in a poll.
The researchers observed an improvement in participants' ability to detect fake news, to distinguish authoritative from non-authoritative sources, and to decide whether or not to share content on social media.
"Obviously you can't predict every single case of misinformation that will go viral, but what you can do is find common patterns and patterns of discourse," notes John Roosenbeek, lead author of the study. He continues: "The idea behind this research was: if we find some of these patterns, is it possible to make people more resistant to them, even to content they've never seen before?"
In the experiment, the researchers used a YouTube tool called Brand Lift that advertisers use to determine whether an ad has made their product more recognizable.
In this particular study, participants were given a news story and asked to identify which technique it used to manipulate the reader.
A second, control group was shown no video; its participants received only the headline and the answer options. The results showed that those who watched the videos answered correctly 5% more often than the control group.
"It is clearly important that children learn to read laterally and check the validity of sources. But we also need solutions that can scale on social media and work directly with their algorithms," notes Professor Sander Van Der Linden, who was also part of the research team.
He concludes: "But at the end of the day, we have to face the reality that social media companies control much of the flow of information online. To protect people, therefore, we need to find independent, evidence-based solutions that social media companies can implement on their platforms."