{"id":1251,"date":"2024-03-06T09:44:20","date_gmt":"2024-03-06T09:44:20","guid":{"rendered":"https:\/\/blogs.kent.ac.uk\/ris\/?p=1251"},"modified":"2025-05-19T12:11:52","modified_gmt":"2025-05-19T11:11:52","slug":"social-media-algorithms-amplify-misogynistic-content-to-teens","status":"publish","type":"post","link":"https:\/\/blogs.kent.ac.uk\/ris\/2024\/03\/06\/social-media-algorithms-amplify-misogynistic-content-to-teens\/","title":{"rendered":"Social media algorithms amplify misogynistic content to teens"},"content":{"rendered":"<p>Social media algorithms amplify extreme content, such as misogynistic posts, which normalises harmful ideologies for young people, finds a new report co-authored by\u00a0<a href=\"https:\/\/www.kent.ac.uk\/arts\/people\/538\/shaughnessy-nicola\">Professor Nicola Shaughnessy<\/a>\u00a0from the University\u2019s\u00a0<a href=\"https:\/\/www.kent.ac.uk\/arts\">School of Arts<\/a>.<\/p>\n<p>The research, conducted in partnership between Kent, University College London (UCL) and the Association of School and College Leaders (ASCL), found a fourfold increase in the level of misogynistic content in the \u201cFor You\u201d page of TikTok accounts over just five days on the platform, in an algorithmic modelling study.<\/p>\n<p>Through interviews with young people and school leaders, the researchers also found that hateful ideologies and misogynistic tropes have moved off screens and into schools, becoming embedded in mainstream youth cultures.<\/p>\n<p>The report authors stress the need for a \u201chealthy digital diet\u201d approach to education to support young people, schools, parents and the community at large. 
They also say it is essential to champion the voices of young people themselves, particularly to include boys as part of discussions regarding online misogyny, and they suggest a \u201cpeer-to-peer\u201d mentoring approach.<\/p>\n<p>Professor Shaughnessy said: \u2018Our research offers insights into how and why social media impacts on young people, particularly vulnerable groups and the importance of digital literacy as well as the potential of peer education to build resilience. Our work also offers a novel creative research method using archetypes generated from field work to discover more about the processing of algorithms in real world contexts.\u2019<\/p>\n<p>Principal investigator Dr Kaitlyn Regehr (UCL Information Studies) said: \u2018Algorithmic processes on TikTok and other social media sites target people\u2019s vulnerabilities \u2013 such as loneliness or feelings of loss of control \u2013 and gamify harmful content. As young people microdose on topics like self-harm, or extremism, to them, it feels like entertainment.<\/p>\n<p>\u2018Harmful views and tropes are now becoming normalised among young people. Online consumption is impacting young people\u2019s offline behaviours, as we see these ideologies moving off screens and into school yards.<\/p>\n<p>\u2018Further, adults are often unaware of how harmful algorithmic processes function, or indeed how they could feed into their own social media addictions, making parenting around these issues difficult.\u2019<\/p>\n<p>The researchers began the study by interviewing young people engaging with and producing radical online content. This then informed the algorithmic study in the creation of archetypes, to represent typologies of teenage boys who may be vulnerable to becoming radicalised by online content. 
The researchers set up accounts on TikTok for each archetype, with distinct content interests typical of these archetypes (for example, seeking out content on masculinity or addressing loneliness), and used these accounts to watch videos that TikTok suggested on its \u201cFor You\u201d page, over a period of seven days.<\/p>\n<p>Initial suggested content was in line with the stated interests of each archetype, such as material exploring themes of loneliness or self-improvement, but then increasingly focused on anger and blame directed at women. After five days, the TikTok algorithm was presenting four times as many videos with misogynistic content such as objectification, sexual harassment or discrediting women (increasing from 13% of recommended videos to 56%).<\/p>\n<p>The research team led roundtables and interviews with school leaders, who attested that misogynistic tropes are becoming normalised in how young people interact in person as well.<\/p>\n<p>The researchers set out the following recommendations:<\/p>\n<ul>\n<li>Holding social media companies accountable, and applying pressure on them to address the harm caused by their algorithms and prioritise the wellbeing of young people over profit.<\/li>\n<li>Implementing \u201chealthy digital diet\u201d education, which considers different types of screen time and digital content young people are engaging with, akin to different food groups, considering how much of it is consumed, how it can become \u201cultra-processed\u201d due to algorithms, and potential impacts on mental and physical health.<\/li>\n<li>Peer-to-peer mentoring, empowering older pupils to work with their younger peers, and helping to involve boys in discussions around misogyny.<\/li>\n<li>Promoting wider awareness of algorithmic processes among parents and the community at large.<\/li>\n<\/ul>\n<p>The research was funded by the Arts and Humanities Research Council (AHRC).<\/p>\n<p>Their research paper\u00a0<a 
href=\"https:\/\/www.ascl.org.uk\/ASCL\/media\/ASCL\/Help%20and%20advice\/Inclusion\/Safer-scrolling.pdf\">\u2018Safer scrolling: How algorithms popularise and gamify online hate and misogyny for young people\u2019<\/a>\u00a0is published by the\u00a0<em>Association of School and College Leaders.<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Social media algorithms amplify extreme content, such as misogynistic posts, which normalises harmful ideologies for young people, finds a new report co-authored by\u00a0Professor Nicola Shaughnessy\u00a0from &hellip; <a href=\"https:\/\/blogs.kent.ac.uk\/ris\/2024\/03\/06\/social-media-algorithms-amplify-misogynistic-content-to-teens\/\">Read&nbsp;more<\/a><\/p>\n","protected":false},"author":74795,"featured_media":1252,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[256065,9112,282605,256079],"tags":[79492],"_links":{"self":[{"href":"https:\/\/blogs.kent.ac.uk\/ris\/wp-json\/wp\/v2\/posts\/1251"}],"collection":[{"href":"https:\/\/blogs.kent.ac.uk\/ris\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.kent.ac.uk\/ris\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.kent.ac.uk\/ris\/wp-json\/wp\/v2\/users\/74795"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.kent.ac.uk\/ris\/wp-json\/wp\/v2\/comments?post=1251"}],"version-history":[{"count":1,"href":"https:\/\/blogs.kent.ac.uk\/ris\/wp-json\/wp\/v2\/posts\/1251\/revisions"}],"predecessor-version":[{"id":1253,"href":"https:\/\/blogs.kent.ac.uk\/ris\/wp-json\/wp\/v2\/posts\/1251\/revisions\/1253"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/blogs.kent.ac.uk\/ris\/wp-json\/wp\/v2\/media\/1252"}],"wp:attachment":[{"href":"https:\/\/blogs.kent.ac.uk\/ris\/wp-json\/wp\/v2\/media?parent=1251"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs.kent.ac.uk\/ris\/wp-json\/wp\/v2\/categories?post=1251"},{"tax
onomy":"post_tag","embeddable":true,"href":"https:\/\/blogs.kent.ac.uk\/ris\/wp-json\/wp\/v2\/tags?post=1251"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}