There are growing calls for a ban on children accessing social media, with support on both sides of politics.
Opposition Leader Peter Dutton has pledged to ban under-16-year-olds from accessing social media by implementing age verification in the first 100 days of a Coalition government.
He says social media usage is to blame for “a high prevalence of many health conditions, issues around body image [and] bullying online” and has had “tragic consequences.”
And while Prime Minister Anthony Albanese has not specifically committed to a ban, he has said he would support one for those under 16 if it could be effective, agreeing social media was having “a negative impact on young people.” His government has provided $6.5 million to assess age assurance technologies.
The idea might be popular with parents concerned about their kids’ welfare online, and it has the support of at least some experts.
But others have warned it would be tricky to enforce and would deny young people access to platforms they enjoy and learn on.
How much time do kids spend online?
While parents might want their kids to spend less time on their phones and tablets, the reality is that much of children's learning, playing, socialising and entertainment now happens online.
A 2020 eSafety Commissioner survey of more than 600 12-17-year-olds found that, on average, they spent 14.4 hours online a week, much of it on social media.
Of the survey participants, about four in 10 reported having at least one negative online experience in the six months prior. This included being contacted by a stranger or being sent inappropriate content.
But nine in 10 reported having at least one positive experience online too, such as sharing positive comments about others.
What are the arguments for kids staying on social media?
A media and communications lecturer at the University of Sydney, Dr Catherine Page Jeffery, doesn’t think enforcing an age limit is a good move because it would be tricky to impose and would deny young people access to platforms they value, learn on and derive entertainment from.
“We seem to forget in these discussions that there are all of these benefits,” she says.
“A lot of young people find solidarity with various communities of interest, particularly young people that are marginalised or from culturally diverse backgrounds.
“If we exclude young people we are denying them the opportunity to exercise so many of their digital rights online.”
She acknowledges children face potential harms online, including bullying and grooming, but she questions whether banning kids from those sites is the answer.
“If we keep young people off social media and the internet altogether, where there are these potential harms, then they don’t develop the skills and the digital competencies that are required for them to navigate those spaces and to actually develop resilience,” she says.
She adds that discussion of those harms is often exaggerated, and that many of the harms identified exist elsewhere on the internet too.
“No-one’s talking about banning the internet because we’ve all come to rely on it too much. Social media has become the lightning rod for all of these concerns … [but] the potential harms that are being talked about in the media and that Peter Dutton has been talking about are not necessarily supported by a lot of the research, so we need to be careful about making grand reductive statements.
“It’s like the low-hanging fruit. Actually, there’s a lot going on in the world at the moment, there are wars going on, we’ve just come out of a pandemic, the period of adolescence is generally quite fraught anyway, so just blaming everything on social media I think actually serves to amplify the risks.”
Instead, she suggests a “multi-faceted” approach, which would include social media platforms doing more to protect young people, as well as parents and teachers helping to encourage safer use.
University of New South Wales professor Michael Salter, whose research specialises in online abuse and child exploitation, agrees there’s a lot of fun and learning to be had online, but the harm for children has reached a point where action is necessary.
“It is a pity that it’s come to this, but the data’s pretty clear that we have too many children coming to harm on social media,” he says.
“Age verification is a measure that governments can implement and I think they have a licence to do it at this point.”
How would it work?
There are also questions and concerns about how it would be enforced.
At the moment, users are supposed to be at least 13 to sign up to a social media site, but cyber security lecturer at the University of Melbourne, Dr Shaanan Cohney, says that is not a very effective check because it can be easily bypassed.
He says another way to check someone's age is through technology run by a third party, or the government, with access to personal information that verifies a person's age.
But he says that would come with privacy and surveillance risks and could lead to "unintended side effects" such as a data hack.
“If every non-minor user of the service has to provide government identification to every online platform that they want to use, this means that the government inherently is going to start gaining access to a lot of information about what websites everyone in the country uses,” he says.
“So there’s a question of trade-offs here: do we think that our concerns about children on social media are sufficient to justify surveillance of the entire Australian adult population?”
While he thinks it is likely the Australian government is heading this way, Dr Cohney does not consider it a wise move.
Professor Salter says face scanning is another option, but it’s not precise.
He says the best option would be to provide documentation to independent third-party providers, who would then issue a "token" that could be embedded in a user's browser and tell sites the user's age.
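As an illustration only, here is a minimal sketch of how such a token scheme might work. The names, key handling and claim format are hypothetical stand-ins for whatever an eventual provider would actually use; the point is that the verifier checks a user's documents once and issues a signed assertion, which a social media site can validate without ever seeing the documents themselves.

```python
import base64
import hmac
import hashlib
import json
import time

# Hypothetical signing key held by the independent verifier. A real scheme would
# likely use asymmetric signatures so sites can verify tokens without this secret.
VERIFIER_KEY = b"verifier-signing-key"

def issue_age_token(is_over_16: bool, valid_for_seconds: int = 3600) -> str:
    """The third-party verifier checks a user's documents off-platform, then
    issues a signed token asserting only whether the user meets the age bar."""
    claim = {"over_16": is_over_16, "expires": int(time.time()) + valid_for_seconds}
    payload = base64.urlsafe_b64encode(json.dumps(claim).encode()).decode()
    signature = hmac.new(VERIFIER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}.{signature}"

def site_checks_token(token: str) -> bool:
    """A social media site verifies the signature and expiry; it never sees the
    underlying identity documents, only the over/under-age assertion."""
    payload, signature = token.split(".")
    expected = hmac.new(VERIFIER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):
        return False
    claim = json.loads(base64.urlsafe_b64decode(payload))
    return claim["over_16"] and claim["expires"] > time.time()

# Example: the browser stores the token and presents it when the user signs up.
token = issue_age_token(is_over_16=True)
print(site_checks_token(token))  # True while the token is valid
```

In this sketch the site learns only a yes/no answer and an expiry time, which is the privacy argument for routing verification through an independent provider rather than handing identity documents to every platform.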
He says the government could also look to Google and Apple’s app stores to stop people who have not met the necessary age requirements from downloading the social media apps.
“But certainly we need to push responsibility onto the platforms as well and make sure that they’re fulfilling their child protection obligations,” Professor Salter says.
Is there work already underway?
Yes.
In the federal budget, the government put $6.5 million towards trialling “age assurance” technologies to see how effective they are at checking the age of users accessing content like social media and pornography.
eSafety Commissioner Julie Inman Grant agrees it's a challenge, but is pleased the government is trialling the technology after years of advocacy from her team.
“Until these comprehensive age assurance systems are in place, implementation of social media bans will be very challenging to measure, implement and enforce,” she says.
“If we do not get the fundamental building blocks of these services right by demanding that tech companies enforce their own policies, use available advanced technologies to tackle these harms at-scale and in real-time and ensure that they are complying with Australian laws, we will be fighting a losing battle.
“Implementation and enforcement of age restrictions will be difficult, but this is not an insurmountable challenge if we are able to effectively galvanise the leadership and broad interest in online safety across the states and territories to come to the best national outcomes.”