Communications Minister Michelle Rowland presented an amendment to the Online Safety Act in Parliament on Thursday, described by the government as a pioneering initiative in social media regulation.
"The Albanese Government is enacting groundbreaking legislation to establish 16 as the minimum age for social media access," Rowland stated in an official release.
"This legislation is focused on safeguarding young users and providing reassurance to parents that we are prioritizing their concerns," she added.
The proposed amendment imposes financial sanctions of up to AUD 50 million (approximately USD 32.5 million) on companies failing to implement "reasonable measures" to prevent underage users from creating accounts.
Although a definitive list of affected platforms has not been disclosed, the restriction is anticipated to apply to major social media services such as TikTok, X, Instagram, and Snapchat.
The proposal has garnered widespread support among parents and advocates of stricter online regulation, who view it as a necessary step toward holding technology companies accountable for the online environments Australian children navigate.
Numerous reports identify sextortion as a major risk globally, and this social rot is real. In particular, the migration of peer bullying onto digital platforms, where it spreads like a virus, brings a host of social harms with it.
Conversely, critics argue that the legislation represents an overly simplistic approach that may inadvertently limit teenagers’ access to critical social support systems and heighten risks for those circumventing the rules.
Both proponents and opponents acknowledge the detrimental impact of excessive online engagement by minors and the pressing need for technology firms to enhance the safety of their platforms.
"Social media can have serious negative consequences for young Australians," Rowland noted. "Government data indicates that nearly two-thirds of Australians aged 14 to 17 have encountered highly harmful content online, including material related to drug abuse, self-harm, suicide, and violence. Additionally, one in four adolescents has been exposed to content encouraging dangerous eating behaviors."
There are many details to unpack here.
These harms are the product of years of unchecked digital addiction, manipulative practices by advertising companies, and a profit-driven ecosystem that increasingly targets young minds.
Whether you accept it or not, legislators and non-profit bodies will have to build legal frameworks capable of taming these tech companies.
The protective features of this regulation should be designed sensibly, in consultation with the relevant industry and civil-society stakeholders.
The requirement for platforms to implement age-assurance mechanisms potentially raises privacy concerns, especially regarding the type and extent of personal data collected for verification purposes. While the mandate to destroy such data post-verification is a positive safeguard, the reliance on user consent for retention introduces the possibility of misuse, particularly if consent mechanisms are poorly designed or lack transparency.
Enforcing harsh penalties for non-compliance aims to ensure accountability, but focusing on punitive measures without addressing the potential exploitation of data collected by third parties can create a gap in comprehensive data governance. Moreover, restricting social media access for minors under the age of 16 can lead to unintended privacy risks, such as increased reliance on fake information or unauthorized access to evade detection, which platforms may struggle to effectively monitor.
As a result, platforms may be confronted with hundreds of fake accounts. Countering this requires a verification and detection mechanism, and how it is designed must be determined carefully. It should also be noted that such a verification-and-proof mechanism may oblige many people to share biometric identity information with third parties, which creates a privacy problem of its own; a minimal sketch of a more data-minimizing alternative follows below.
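To make the data-handling point concrete, here is a minimal sketch of what a destroy-after-verification flow could look like. It is written in Python purely for illustration: the `AgeVerifier` and `VerificationRecord` names, the consent flag, and the date-of-birth check are all assumptions of this sketch, not anything prescribed by the bill or implemented by any platform.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch only: neither the bill nor any platform prescribes
# these names or this flow. The point is data minimization: verify once,
# persist the boolean outcome, and let the raw identity data disappear.

@dataclass
class VerificationRecord:
    user_id: str
    is_over_16: bool             # the only outcome that is persisted
    retained_with_consent: bool  # explicit opt-in for keeping anything more

class AgeVerifier:
    MIN_AGE = 16

    def verify(self, user_id: str, date_of_birth: date,
               consent_to_retain: bool = False) -> VerificationRecord:
        today = date.today()
        # Standard birthday arithmetic: subtract a year if the birthday
        # has not yet occurred this calendar year.
        age = today.year - date_of_birth.year - (
            (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
        )
        # Only the minimal outcome leaves this function; the raw date of
        # birth goes out of scope here and is never written to storage,
        # mirroring the destroy-after-verification requirement.
        return VerificationRecord(
            user_id=user_id,
            is_over_16=age >= self.MIN_AGE,
            retained_with_consent=consent_to_retain,
        )

if __name__ == "__main__":
    # A 2015 birth date stays under 16 until 2031, so this prints False.
    record = AgeVerifier().verify("user-123", date(2015, 1, 1))
    print(record.is_over_16)
```

The design choice worth noticing is that nothing sensitive survives the call: if platforms only ever persist the boolean, there is far less for a data breach, or a third party, to exploit.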
Where rights are limited to protect children from online harm, any limitation must be lawful, necessary and proportionate. If less restrictive options are available to achieve the aim of protecting children from harm, these are to be preferred to a blanket ban. The UN Committee on the Rights of the Child has stated that content control and content moderation should not be used to restrict children's access to information in the digital environment; they should only be used to prevent the flow of harmful material to children.
In addition, an unbalanced social media ban risks violating a number of human rights:
- Freedom of expression and access to information (Article 19 ICCPR; Article 13 CRC);
- Freedom of association and peaceful assembly (Article 22 ICCPR; Article 15 CRC);
- The right to education and development (Articles 28 & 29 CRC);
- The right to culture, leisure and play (Article 15 ICESCR; Article 31 CRC);
- The right to the highest attainable standard of health, including through access to relevant information (Article 12 ICESCR; Article 24 CRC); and
- The right to privacy (Article 17 ICCPR; Article 16 CRC).
It is worth remembering the broader context. Since roughly the early 2000s, an internet subject to unchecked manipulation by all kinds of groups, and repeatedly abused by technology and Big Tech companies, has taken hold of every society. At a certain point, the harsher face of this social decay became plainly visible, provoking an abrupt backlash from segments of society.
That backlash is the starting point for regulation of this kind.
Studies suggest that excessive social media use can interfere with healthy brain development, sleep, and academic performance. Social media platforms also collect vast amounts of personal data, and children may not fully understand how that data is collected, used, or monetized. Restricting access for younger users could therefore reduce privacy violations and the abuse of personal data by big tech companies, and it helps parents and guardians guide their children's online activities and ensure they interact with technology under supervision.
Additionally, the Australian Human Rights Commission has developed a Children's Rights Impact Assessment (CRIA) tool to assess the impact of proposed laws on children's human rights. The tool is available here:
https://www.unicef.org.uk/child-friendly-cities/wp-content/uploads/sites/3/2022/06/CRIA_June-2022.pdf
As an expert, I can clearly say that in situations like this that affect entire societies, there are no easy answers.
The issue involves weighty subjects such as child psychology, technology literacy, and human-centered technology design.
Before reaching for an outright ban, comprehensive steps should be planned carefully; rather than a ban, responsibilities that expose social media companies to severe sanctions should be defined more clearly.
Looking back, we have seen many times how governments' so-called protective laws have turned into instruments of global surveillance.
This regulation, like its counterparts, therefore cuts both ways.