The Social Media Ban for Children Under 16 in Australia Is Just a Shortcut!

It’s undoubtedly easier to forbid our children from using social media, but is this simply a shortcut around a much larger problem? Rather than taking the easy route, we should be demanding that the digital space be regulated to ensure safety—not just for our children, but for everyone. There needs to be accountability online, just as there is in the real world. Surely, that’s worth the investment?

The reality is that many of the social problems we face today aren't just confined to the physical world—they’re amplified online, often to extraordinary proportions. Without a solid safety framework in place, these issues can spread uncontrollably. The internet has made it easier than ever to falsify facts, manipulate information, and pursue ulterior motives. This means we’re often absorbing information that may or may not be true—and most of us simply don’t have the time or willpower to fact-check everything. We need an automated process to flag and counter false or radical content, monitored by both humans and AI, across the entire online space.

The anonymity that social media affords also allows individuals to hide behind false identities and distance themselves from the real-world consequences of their actions. Without human interaction, the harm they cause can go unchecked. Digital identity verification could help hold people accountable for online crimes and disinformation. If we could strip away predation and manipulation, social media could become a healthy medium for enhancing social interactions rather than replacing them entirely.

While the internet has contributed to societal destabilization, it has also provided immense opportunities. The digital space needs to be transformed so users can regain control. We need regulations that empower users—not powerful entities—to ensure that propaganda and inaccurate content do not dominate the online world. We should be able to filter out content we don’t want to see, without infringing on free speech. Isn’t there a way to create a more balanced digital environment?

If we don't act now, AI could eventually take control of our digital lives, making it even harder to distinguish truth from fiction. As AI becomes more sophisticated, the lines between reality and fabrication will continue to blur, and trust will be harder to restore. We need to ensure that facts matter, and that online spaces are managed in ways that prevent them from becoming breeding grounds for misinformation and harm.


The Dangers of High-Engagement Content

Content that generates high engagement can often be misleading. Just because something is popular doesn’t mean it’s credible. Shock value, rather than genuine agreement, often drives interactions. Social media platforms like Instagram could consider implementing a "disbelief" or "disagreement" button, in addition to the usual "like" or "heart" options, to reduce the emphasis on engagement as a measure of truth.

The Case for Social Media Regulation for Children, Not a Ban

Focusing on safeguarding children on social media is complex. At first glance, banning children from using social media seems like the simplest solution. After all, we simply want to keep them safe. However, we need to ask whether such a ban is just a temporary fix to a much bigger issue. Are we deceiving ourselves by assuming that everything will be fine when our children turn 16 and are suddenly allowed access to these platforms?

There are different responses around the world to this issue. The UK government recently decided against implementing a ban on social media for children, citing insufficient evidence. Meanwhile, Australia has already passed such legislation, and the EU is pursuing a strategy that focuses on empowering young people rather than limiting their freedoms.

But Could a Social Media Ban Have Unintended Repercussions?

  • Have we considered what communication channels children will turn to if social media is no longer available? As long as influencers, celebrities, and brands continue to use social media, the attraction for young people will remain. For children, the digital world is an extension of their reality, just as it is for adults.

  • Social media offers numerous benefits. It can improve children’s social skills[1], boost self-confidence[2], and help them stay connected with friends and family members who live far away. It also allows them to share their ideas, learn about the world, and find opportunities for charity work[3] or local activism—experiences that foster a sense of civic responsibility[4] and community.

  • Many young people also turn to social media to develop creativity[5], express themselves[6], explore their identities, and connect with others who share common interests from all over the world[7].

  • For marginalized groups, such as LGBTQ+ youth, social media can provide crucial support networks. Moreover, social media is a valuable tool for accessing educational content and self-directed learning[8]. Children can also find information on topics they may feel uncomfortable discussing with their parents, such as health, sex education, stress reduction, mental health support[9] and mindfulness.

  • Are we sheltering children too much when they need to develop social skills and adaptability? A 2024 study found that adolescents with moderate social media use could maintain relationships in the real world, and the skills they learned online transferred to their offline lives in a positive way.[10]

  •  Banning social media could also prevent children from learning essential digital literacy skills, which they will need as adults[11], both personally and professionally. As Professor Przybylski pointed out in his study How Much Is Too Much?[12], blanket bans and age restrictions on technology are neither evidence-based nor ethical, especially when moderate screen use can have a positive impact on children’s psychosocial functioning.

  •  Banning social media outright means disempowering our teenagers. A ban could deny them access to the information and social support they might require outside the home.

  •  Whether we like it or not, young people currently get most of their news from social media channels. If these channels are removed, where will they turn for information?

  •  Age is a significant factor in this debate. A blanket ban on social media for children under 16 may be too strict, especially for teenagers, who face unique challenges during this phase of life.

  •  The Northern Ireland Commissioner for Children and Young People (NICCY) recently reported that children view the digital world as essential to their lives. They see the online world as a critical space for development and paramount to their right to play, their right to health services, their right to education, and to freedom of thought, belief and religion (UNCRC Articles 31, 24, 28, and 14).[13]


A social media ban is silencing rather than empowering.

How to Avoid a Social Media Ban

A report by the Ofcom Parents Focus Group in the UK, "Protecting Children from Harms Online," highlights several factors that need to be considered to prevent an all-out ban on social media for children:

  1. Stringent Age Verification
    There must be robust age verification standards, ideally using biometric identification, rather than relying on self-declaration methods.

  2. Technological and Human Oversight
    The age-verification system should include both technological solutions and human oversight to ensure accuracy and accountability.

  3. Regular Audits of Social Media Platforms & AI Systems
    Social media platforms and AI applications should undergo regular audits, with strict penalties for non-compliance, to ensure they adhere to safety standards.

  4. Audits of Age Verification Process
    The age verification process itself must be regularly audited, and platforms should face penalties for failing to meet established standards.

  5. Transparency of Age Verification Processes
    Platforms should publicly disclose their age verification procedures and provide evidence of their effectiveness.

  6. Immediate Deletion of Underage Users
    Platforms must implement systems that immediately identify and remove users who are underage or whose age cannot be verified.

  7. Advertising and Marketing Scrutiny
    There should be thorough assessments of social media platforms' design and the tactics used to attract children, similar to regulations governing TV advertising aimed at children. Strict rules should govern how advertisers and platforms engage children for marketing purposes.

  8. Demographics and Advertising Transparency
    Platforms should regularly share information about their user demographics and the methods used for advertising, ensuring transparency in how children are targeted.

  9. Adaptive Safeguards
    Safeguards put in place to protect children must be flexible and responsive to ongoing changes in technology. Regular assessments should be conducted to adapt these measures as necessary.

  10. Process for Addressing Harmful Content
    A clear and efficient process must be in place for identifying and addressing harmful content on platforms, with prompt actions taken to protect children. This process should be robust enough to identify harmful content disguised as child-friendly and to block inappropriate AI-generated material.

  11. Privacy Safeguarding
    All AI applications must comply with data privacy protection regulations, such as GDPR, to safeguard children’s personal data.

  12. Enhanced Parental Control & Reporting Processes
    Reporting dangerous content should be easier for parents and users, with a quick response time to address issues promptly. Systems should be simplified so that parents and children can easily understand the environment and interactions.

  13. Regular Monitoring by Independent Bodies
    Regular reviews of online content should be conducted by independent bodies to ensure compliance with safety standards and identify new methods of delivering harmful content to minors.

  14. Constant Education of Parents & Children
    Educational initiatives are necessary to teach parents and children appropriate strategies for staying safe online. This education should be age-appropriate and include support for all age groups.

  15. Cooperation with Platforms & Other Experts
    A collaborative environment should be fostered so that platforms, governments, and child protection experts can work together to improve the online environment and prevent exploitation.

  16. Further Research into the Effects of Social Media and AI on Children
    There should be a responsible body dedicated to managing extensive research into the effects of online content on children, particularly as AI becomes more prominent.

  17. AI Filtering of Text and Images for Body Image & Mental Health Content
    AI should be used to block inappropriate content related to body image and mental health, including material promoting self-harm, suicide, and eating disorders.

  18. Safety Measures Independent of Parents’ Involvement
    Safeguarding processes must be in place that protect children online with minimal reliance on parents’ involvement. Many parents may lack the knowledge or resources to effectively navigate the digital environment. Human interaction is necessary to maintain empathy and integrity, reducing the reliance on impersonal and frustrating chatbots.

  19. Clear Communication by Social Media Platforms
    Platforms need to simplify and clarify their communication about support and safety to ensure that users, particularly parents, can easily understand how to navigate the system.

  20. Support & Monitoring of Smaller Platforms
    Smaller platforms should be supported to foster a competitive industry, but they must also be guided on how to adhere to safety regulations to ensure they do not pose a risk to children when under less scrutiny.

  21. Prioritization of Child Safety in Searches & Feeds
    Search results and feeds must be adapted to avoid generating dangerous or harmful content. This should be done without infringing on the freedom of information.

If governments take responsibility for driving the necessary pressure and momentum to transform the online space into a safer, more accountable environment through appropriate regulations, they can prevent the need for a social media ban for children. This proactive approach would empower individuals of all ages, whether young or old.

 

References:

[1] Wegmann, E., & Stieger, S. (2023). Social Media as a Platform for Social Skill Development: A Longitudinal Analysis of Social Interactions and Social Skills in Adolescents. Frontiers in Psychology, 14, 793365. doi: 10.3389/fpsyg.2023.793365.

[2] Chou, H. T., & Lee, L. S. (2022). Social Media and Social Integration: The Role of Online Communities in Adolescent Well-being. Journal of Youth and Adolescence, 51(2), 305-317. doi: 10.1007/s10964-022-01557-9

[3] Tufekci, Z., & Anderson, C. A. (2022). Youth and Social Media Activism: Exploring Opportunities for Political and Social Change. Journal of Youth and Adolescence, 51(5), 870-886. doi: 10.1007/s10964-022-01414-w

[4] Kang, L., & Lee, S. J. (2022). Civic Engagement and Digital Literacy: The Role of Social Media in Adolescent Development. Computers in Human Behavior, 125, 106960. doi: 10.1016/j.chb.2021.106960.

[5] Fletcher, E. S., & Rose, J. A. (2023). Creativity and Self-Expression in Adolescents: The Role of Social Media Platforms. Cyberpsychology, Behavior, and Social Networking, 26(3), 183-190. doi: 10.1089/cyber.2022.0139.

[6] Binns, A. J., & Patterson, E. A. (2023). Social Media and the Construction of Adolescent Identity: Opportunities and Risks for Marginalized Youth. Journal of Adolescent Research, 38(1), 62-84. doi: 10.1177/07435584221124909.

[7] Valkenburg, P. M., & Peter, J. (2021). Social Media and Adolescent Well-being: A Review of the Evidence. Cyberpsychology, Behavior, and Social Networking, 24(7), 442-449. doi: 10.1089/cyber.2021.0061.

[8] Wang, Y., & Wu, X. (2022). Educational Use of Social Media and Its Impact on Academic Engagement and Achievement in Adolescents: A Cross-Sectional Study. Computers & Education, 182, 104495. doi: 10.1016/j.compedu.2022.104495.

[9] Gonzalez, E., & Hall, S. (2023). Social Media and Adolescent Mental Health: Supportive Roles in Coping and Resilience. Journal of Social and Clinical Psychology, 42(1), 12-27. doi: 10.1521/jscp.2023.42.1.12.

[10] Thakur, N., & Sulaiman, Z. (2024). A Study on the Influence of Social Media on Forming the Social Skills among Adolescents. Preprint. doi: 10.20944/preprints202409.1997.v1.

[11] Ribble, M. S. (2015). Digital Citizenship in Schools: Nine Elements All Students Should Know. International Society for Technology in Education.

[12] Przybylski, A. K., et al. (2020). How much is too much? Examining the relationship between digital screen engagement and psychosocial functioning in a confirmatory cohort study. Journal of the American Academy of Child & Adolescent Psychiatry, 59(9), 1080–1088. https://doi.org/10.1016/j.jaac.2020.01.015

[13] Purdy, N., Ballentine, M., Lyle, H., Orr, K., Symington, E., Webster, D., York, L., (2023) Growing Up Online: Children’s online activities, harm and safety in Northern Ireland – an Evidence Report. Belfast: Centre for Research in Educational Underachievement (CREU), Stranmillis University College / Safeguarding Board for Northern Ireland (SBNI)


The intrinsic link between the real world and the digital world.
