By – Prakarsh Kastwar
Google and Meta allegedly ran a hidden ad campaign targeting teenagers, in violation of Google’s own policies.
According to the report, the campaign targeted 13- to 17-year-olds on YouTube with advertisements for Instagram. Both companies now face scrutiny for allegedly skirting rules meant to shield younger audiences from targeted advertising.

Google reportedly investigated and shut down the initiative after the outlet that broke the story approached the firm. According to Quartz, Google described the effort as “small in nature,” said it had “thoroughly reviewed the allegations regarding the circumvention of our policies,” and is taking “appropriate steps.” The company also said it plans to update its training so that its sales agents better understand the requirements.
Google’s policies prohibit personalized advertising to users under the age of 18. The ads in question were aimed at a group labeled “unknown” in Google’s advertising system, meaning users whose age, gender, or other demographics Google could not determine. However, data from app downloads and online activity could reportedly be used to infer that many of these “unknown” users were teenagers.

How was the covert operation planned?
Spark Foundry, a US-based advertising agency, helped Meta and Google carry out the hidden campaign. The program was tested in the United States in May and ran in Canada this year; it would eventually have been rolled out globally and used to promote other services such as Facebook.

Following the program’s cancellation, Google stated in a statement to the Financial Times, “We prohibit ads being personalised to people under the age of 18, period.”
Notably, the US Senate recently passed legislation that would hold tech companies liable for endangering minors. One bill, the Children and Teens’ Online Privacy Protection Act, or COPPA 2.0, would restrict targeted advertising to minors and bar data collection without their consent. Another bill, the Kids Online Safety Act, would require tech companies to design online platforms in ways that protect children from cyberbullying, sexual exploitation, and drug use.
