Examining algorithmic biases in YouTube’s recommendations of vaccine videos

Deena Abul-Fottouh, Melodie Song, Anatoliy Gruzd

Contact: Deena.abulfottouh@ryerson.ca

Objective: This research examines how YouTube recommends vaccination-related videos. Materials and Methods: We collected 2122 vaccine-related videos using YouTube's API and then applied social network analysis (a LOLOG model) to evaluate how YouTube recommends videos to its users. Results: Pro-vaccine videos (64.75%) outnumber anti-vaccine videos (19.98%) on YouTube, with the remaining 15.27% neutral in sentiment. YouTube was more likely to recommend neutral and pro-vaccine videos than anti-vaccine videos. There is a homophily effect: pro-vaccine videos were more likely to recommend other pro-vaccine videos than anti-vaccine ones, and vice versa. Discussion: Compared to our prior study, the number of recommendations for pro-vaccine videos has increased significantly, suggesting that YouTube's demonetization policy for harmful content and other changes to its recommender algorithm have lowered the visibility of anti-vaccine videos. However, the homophily observed in the recommendation network raises the concern that anti-vaccine videos are less likely to lead users to pro-vaccine videos. Conclusion: The study demonstrates the influence of YouTube's recommender system on the types of vaccine information users discover on the platform. We conclude with a general discussion of the importance of algorithmic transparency in how social media platforms like YouTube decide what content to feature and recommend to their users.
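The homophily effect reported in the Results can be illustrated with a small sketch: in a recommendation network where each video carries a stance label, homophily means same-stance recommendation edges occur more often than chance would predict. Note that the study itself fits a LOLOG network model; the simpler observed-versus-expected comparison below is only illustrative, and the video IDs, stance labels, and edges are made-up toy data.

```python
# Toy illustration of stance homophily in a recommendation network.
# All video IDs, stances, and edges below are hypothetical examples,
# not data from the study.
from collections import Counter

stance = {
    "p1": "pro", "p2": "pro", "p3": "pro",
    "a1": "anti", "a2": "anti",
    "n1": "neutral",
}
# Directed edges: (video, recommended video)
edges = [
    ("p1", "p2"), ("p2", "p3"), ("p3", "p1"),  # pro -> pro
    ("a1", "a2"), ("a2", "a1"),                # anti -> anti
    ("p1", "a1"), ("n1", "p2"),                # cross-stance
]

# Observed share of recommendation edges linking same-stance videos
observed = sum(stance[u] == stance[v] for u, v in edges) / len(edges)

# Chance level if recommendations ignored stance: the sum of squared
# stance proportions (probability two random videos share a stance)
freq = Counter(stance.values())
n = len(stance)
expected = sum((c / n) ** 2 for c in freq.values())

print(f"observed same-stance share: {observed:.2f}")   # 0.71
print(f"expected under no homophily: {expected:.2f}")  # 0.39
print("homophily present:", observed > expected)       # True
```

A positive gap between the observed and expected shares is the pattern the abstract describes: pro-vaccine videos tend to recommend pro-vaccine videos, and anti-vaccine videos tend to recommend anti-vaccine ones, so users who start in one cluster are less likely to be routed to the other.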
