Social media and technology companies would have six months to implement a suite of new measures restricting Australian children's access to adult content online, or face fines of up to $50m, under proposed new codes developed by the industry.
The draft codes, submitted to the eSafety commissioner last week for approval, would require social media platforms that allow pornography to prevent minors from accessing it and to implement age assurance measures for users.
Social media services that restrict pornography would be required to detect and remove adult content, including material depicting self-harm or high-impact violence.
The proposed codes apply to various layers at which a user interacts with the internet: there are separate codes for social media platforms, gaming services, websites, search engines, internet service providers and equipment manufacturers.
Companies that make and sell equipment and operating systems that enable access to online services – including phones, tablets and computers – would be required to enable users to set up child accounts and profiles, and to apply default safety restrictions to those accounts.
Search engine services would be required to apply safety tools, such as “safe search” functions, at the highest safety setting by default to any account holder detected by age assurance systems as “likely to be an Australian child”.
The codes would require internet hosting services to take “appropriate and proportionate enforcement action” against customers that breached content laws and regulations.
Draft codes were released in October, but the final proposals were delayed by two months to allow the sector to address overlap with the federal government’s announcement last year of a social media ban for under-16s.
The codes were developed by industry groups but will need to be assessed and registered by the eSafety commissioner before coming into effect. They address pornography and content related to suicide, self-harm, eating disorders and violence.
Jennifer Duxbury, the director of policy and regulatory affairs at Digi, an industry association for the digital sector, said the proposed safeguards would allow children to “navigate online spaces in a secure and supportive way”.
“Online spaces and communication tools provide valuable opportunities for children to learn, connect, and explore the world,” Duxbury said.
“However, children should be protected from exposure to pornography and material that encourages harmful behaviours such as instruction for eating disorders, suicide and self-harm.
“Protecting children from harmful material remains a key priority for the industry, and we have worked collaboratively with eSafety and stakeholders to design practical, scalable solutions that include age assurance for certain content to prevent underage access, and a range of measures across the technology stack to further enhance online protections for children.”
If the codes are accepted by the commissioner, Julie Inman Grant, companies would then have six months to implement the required measures before enforcement action could be taken under the federal Online Safety Act. Penalties could be up to $50m.
By Ben Smee
