Google announced a slew of policy changes on Tuesday aimed at safeguarding children under the age of 18 from abuse on the search giant’s platforms.
Minors or their parents will be able to request that their images be removed from Google Image Search results, a significant shift given Google's traditionally hands-off approach to regulating its search engine. Google also said it would prohibit targeted advertising based on the age, gender, or interests of users under 18.
During the pandemic, technology has helped children and teenagers stay in school through lockdowns and keep in touch with family and friends. But as young people spend more time online, parents, educators, child safety and privacy experts, and policymakers have grown worried about how to keep them safe there.
Google-owned YouTube said it would change the default upload setting for minors' videos to the most private option available. The platform will also turn off autoplay for children by default and enable digital well-being features, such as reminders to take a break after watching videos for an extended period.
The changes come at a time when Silicon Valley firms have been criticized for failing to protect children. Apple sparked an outcry last week when it revealed that it would scan iPhone photos for child exploitation material when they are uploaded to the company's iCloud storage service, a change some privacy advocates fear could open the door to surveillance and abuse. Google did not respond to an inquiry about whether it has similar plans for its Android mobile operating system.
Google does not allow regular accounts for children under the age of 13, though it offers certain products, such as YouTube Kids, that are designed for children to use with parental supervision. On Tuesday, the company also announced that SafeSearch, a feature that filters out explicit search results, would be turned on automatically for users under 18 who are signed in to their Google accounts. Minors will also be barred from using Google's Location History feature, which records where people have been for Maps and other products.
This isn't the first time Google has changed its search rules to combat misuse. The company said in June that it would change its search algorithms to clamp down on websites that publish unsubstantiated and defamatory claims about individuals. Google's rules also changed after the European Union's top court ruled in 2014 that Google had to alter search results under the "right to be forgotten," which lets citizens request that Google remove personal information from search results if it is outdated, irrelevant, or not in the public interest.
YouTube has already faced backlash over its handling of children's content. In 2017, YouTube Kids sparked outrage when its filters failed to catch videos that were aimed at children but contained disturbing imagery, such as Mickey Mouse lying in a pool of blood or PAW Patrol characters bursting into flames after a car accident.
Critics have also accused Google of violating the Children's Online Privacy Protection Act, or COPPA, a federal law that governs the collection of data from users under the age of 13. In 2019, the US Federal Trade Commission hit YouTube with a record $170 million fine, along with additional restrictions, for violating COPPA. In response, the video site made significant changes to how it handles children's videos, including limiting the data it collects from views of those videos.