Google is taking steps to stop kids from seeing, or searching for, pornography as part of a raft of new safety and age-appropriate measures it is introducing across its various services.
The tech giant will also remove images of anyone under 18 from its image search results if they, or a parent or guardian, request it.
And it will block ads targeted at kids, and cut down on its own tracking of them, by permanently switching off ‘location history’ for those under 18.
Google is also further tightening its anti-predator rules on YouTube by making all videos uploaded by younger teens private by default.
And it is turning off ‘autoplay’ on YouTube by default for users under the age of 18 to combat screen addiction.
The raft of safety measures comes after Apple announced new plans to scan iPhone photo uploads for child abuse material and tell parents if their kids are sending or receiving explicit material in Messages.
However, most of the new Google measures are dependent on kids and their families being honest with Google as to how old they are. Google says that an effective method of age verification is still some way off, adding that it needs to work with regulators, lawmakers and industry bodies for more “accurate” age verification.
“We’re always working to prevent mature content from surfacing during a child’s experience with Google,” said Mindy Brooks, Google’s general manager for kids and families.
As such, she said, ‘safe search’ will now be on by default for those under 18, removing search links to adult content. This will extend to Google Assistant on devices such as smart displays too, she said.
However, a spokesperson for Google separately said that the safe search feature may be turned off by teenagers.
Ms Brooks said that the right to have any or all imagery of a child removed from Google’s search results will now be recognised as a basic policy.
“While we already provide a range of removal options for people using Google Search, children are at particular risk when it comes to controlling their imagery on the internet,” she said.
“In the coming weeks, we’ll introduce a new policy that enables anyone under the age of 18, or their parent or guardian, to request the removal of their images from Google Image results. Of course, removing an image from Search doesn’t remove it from the web, but we believe this change will help give young people more control of their images online.”
She said that Google’s move to ensure that under-18s cannot switch on ‘location history’ is a worthwhile safety tradeoff, even if it makes it harder to find a lost phone or to get more personalised maps.
“Location History is a Google account setting that helps make our products more useful,” she said. “It's already off by default for all accounts, and children with supervised accounts don’t have the option of turning Location History on. Taking this a step further, we’ll soon extend this to users under the age of 18 globally, meaning that Location History will remain off without the option to turn it on.”
The tech giant is also tightening its policies when it comes to targeting ads at kids and teenagers.
“We’ll be expanding safeguards to prevent age-sensitive ad categories from being shown to teens, and we will block ad targeting based on the age, gender, or interests of people under 18,” said Ms Brooks.
“We’ll start rolling out these updates across our products globally over the coming months.”
Other new measures include a raft of ‘wellbeing controls’ such as screen time limits on kids’ devices.
“In Family Link, parents can set screen time limits and reminders for their kids’ supervised devices,” said Ms Brooks. “And on Assistant-enabled smart devices, we give parents control through Digital Wellbeing tools available in the Google Home app. In the coming months, we’ll roll out new Digital Wellbeing filters that allow people to block news, podcasts, and access to webpages on Assistant-enabled smart devices.”