17 May 2023

Falling down YouTube’s rabbit hole

The Back Page

More could be done to mitigate social media’s downside.


Your Back Page scribe will be the first to concede that social media technologies can do many wonderful things when used in a positive and productive manner. 

On the other hand, allowing this online landscape to flourish in a wildly unregulated fashion has an undeniable mental health downside that needs to be taken more seriously. 

Which is why some new research from the Australian Institute for Suicide Research and Prevention and Griffith University has caught our eye. 

These boffins sought to investigate the favourable and unfavourable effects on mental health of one of the most widely used streaming platforms, namely YouTube.  

In a nutshell, what they found was that frequent YouTube users often had elevated levels of loneliness, anxiety and depression. 

What’s more, the groups most vulnerable to such adverse effects were those under 29 and those who frequently viewed content related to others’ lives. 

For the purposes of the study, watching YouTube videos for more than two hours per day was categorised as high-frequency use, while five hours of watching a day was considered saturated use. 

The blame for poor mental health outcomes after high YouTube consumption was sheeted home to the formation of “parasocial”, or one-sided, relationships between content creators and those watching their videos. 

“For some individuals, these virtual ‘relationships’ can compensate for a lack of in-person social interactions, particularly for those who struggle with social anxiety. Nevertheless, it can amplify their problems when they fail to engage in face-to-face communication, which is especially crucial during developmental years,” the study’s lead author Dr Luke Balcombe told media. 

The researchers also recommended doing more to stop algorithm-generated suggested viewing from steering users towards suicide-related content, which could lead them down a distressing “rabbit hole”. 

Dr Balcombe suggested AI could be used to monitor, and intervene with, particularly vulnerable high-frequency YouTube users, such as children and adolescents. 

“We have examined the concerns related to human-computer interaction and suggested a concept for a recommendation system that operates independently of YouTube. The system will guide users toward verified positive mental health content or campaigns,” Dr Balcombe said. 

Given the backlash encountered when another social media giant was negligently tardy in preventing its customers from livestreaming their murderous shooting rampages, this doesn’t seem too big an ask, does it? 

Sharing story tips with penny@medicalrepublic.com.au will earn you likes IRL. 
