Legislation on Advertising to Children

This article examines legislation on advertising to children in both the U.S. and the E.U. Historically focused on television, newer laws such as COPPA and the DSA address online advertising practices, restricting profiling-based targeting while permitting contextual advertising. It provides an overview of these regulations and their implications for protecting minors.



:::info Authors:

(1) Tinhinane Medjkoune, Univ. Grenoble Alpes, CNRS, Grenoble INP, LIG France;

(2) Oana Goga, LIX, CNRS, Inria, Ecole Polytechnique, Institut Polytechnique de Paris France;

(3) Juliette Senechal, Université de Lille, CRDP, DReDIS-IRJS France.

:::

Abstract and Introduction

Background

Legislation on Advertising to Children

Mechanisms for Targeting Children

Usage of Placement-Based Targeting

Limitations

Related Works

Conclusion, Acknowledgements and References

Appendix

3 LEGISLATION ON ADVERTISING TO CHILDREN

Historically, legislation regarding advertising and children was developed for ads on television rather than for ads online. More recently, however, to adapt to the new online advertising ecosystem, privacy legislation such as COPPA and online platform legislation such as the DSA have started to include provisions regarding online advertising and children. In this section, we aggregate and review legislation related to children and advertising in the E.U. and the U.S. to understand which practices are regulated and how. Note that there is no single legal text that covers advertising to children; the rules are generally spread across various broader legal texts. Hence, we present below the excerpts that apply to advertising and children. While we only interpret our findings in the context of the DSA and COPPA, this section contains a more comprehensive set of texts related to advertising to children that we hope can serve as a reference for researchers wanting to learn more about the rules in this space. When possible, we quote the actual text of the rules (provided it is not too long) so as not to bias the description with our interpretation. To improve understandability, however, we provide an overall summary of the topics treated by the rules at the beginning of each subsection.

3.1 Legislation on advertising on television

Legislation that applies to advertising to children on television imposes rules on the content that can be advertised, on the duration of ads, and on the separation between ads and the television program.

E.U. legislation. In the E.U., some legal texts are adopted at the European Union level and must be implemented by all Member States, while others are set at the national level and apply only to a specific country. At the E.U. level, legislation regarding advertising to children on television is contained in the broader Directive 2010/13/EU of 10 March 2010 (Audiovisual Media Services Directive), as amended by Directive (EU) 2018/1808 [47, 48]. Regarding content, Article 9 of this directive (which applies to advertising to children) states that: “(1e) audiovisual commercial communications for alcoholic beverages shall not be aimed specifically at minors and shall not encourage immoderate consumption of such beverages.”, and “(1g) audiovisual commercial communications shall not cause physical or moral detriment to minors. Therefore they shall not directly exhort minors to buy or hire a product or service by exploiting their inexperience or credulity, directly encourage them to persuade their parents or others to purchase the goods or services being advertised, exploit the special trust minors place in parents, teachers or other persons, or unreasonably show minors in dangerous situations.”

In addition, Article 20 (2) limits advertising breaks during children’s programmes: they are prohibited in programmes of 30 minutes or less and otherwise restricted to one interruption per scheduled 30-minute period: “The transmission of children’s programmes may be interrupted by television advertising and/or teleshopping once for each scheduled period of at least 30 minutes, provided that the scheduled duration of the programme is greater than 30 minutes.” Finally, Article 11 (3d) imposes a clear separation between the advertised message and the rest of the broadcast program: “Programmes containing product placement shall be appropriately identified at the start and the end of the programme, and when a programme resumes after an advertising break, in order to avoid any confusion on the part of the viewer.” [41].
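To make the arithmetic of Article 20 (2) concrete, the sketch below encodes our own reading of the rule (one interruption per full 30-minute scheduled period, and none for programmes of 30 minutes or less); it is an illustration, not an official compliance tool.

```python
def max_ad_breaks_children_programme(scheduled_minutes: int) -> int:
    """Our illustrative reading of AVMSD Article 20 (2): a children's programme
    may be interrupted once for each scheduled period of at least 30 minutes,
    and only if the scheduled duration is greater than 30 minutes."""
    if scheduled_minutes <= 30:
        return 0  # programmes of 30 minutes or less may not be interrupted
    return scheduled_minutes // 30  # one interruption per full 30-minute period

# A 25-minute cartoon allows no breaks; a 65-minute film allows at most two.
assert max_ad_breaks_children_programme(25) == 0
assert max_ad_breaks_children_programme(65) == 2
```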

This directive was transposed in every Member State of the European Union, with some national adaptations. For example, in France, the directive was transposed by a decree of 27 March 1992, amended by a decree of 2 July 2010 [41].

U.S. legislation. The Federal Communications Commission’s (FCC) rules and regulations on advertising and children are located in Title 47 of the Code of Federal Regulations (CFR). These regulations are colloquially referred to as the Children’s Television Act (CTA). They were initially proposed in 1990 and have been updated several times since, most recently in 2019. The regulation imposes rules on advertising in broadcast and cable television programming targeting children 12 and younger, including limits on ad time and a prohibition on airing advertising for products related to the program currently airing. Sections 73.670 and 76.225 state that “No cable operator shall air more than 10.5 minutes of commercial matter per hour during children’s programming on weekends, or more than 12 minutes of commercial matter per hour on weekdays.” [21, 22]. In addition, rules related to content in general, present in 73.3999, also apply to content advertised to children: federal law prohibits obscene, indecent and profane content from being broadcast on the radio or T.V. [20]. Finally, the FCC has several policies designed to protect children from the confusion that may result from the intermixture of programs and commercial material in children’s television programming, set out in the Sponsorship Identification Rules and rules on Embedded Advertising [19].
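As a purely illustrative reading of the limits quoted above (not the FCC’s own methodology), the following sketch checks whether an hour of children’s programming stays within the CTA commercial-time caps.

```python
def within_cta_commercial_limits(commercial_minutes: float, is_weekend: bool) -> bool:
    """Illustrative check against the caps in 47 CFR 73.670 / 76.225:
    at most 10.5 minutes of commercial matter per hour of children's
    programming on weekends, and at most 12 minutes on weekdays."""
    limit = 10.5 if is_weekend else 12.0
    return commercial_minutes <= limit

# Eleven minutes of ads in one hour exceeds the weekend cap but not the weekday cap.
assert not within_cta_commercial_limits(11.0, is_weekend=True)
assert within_cta_commercial_limits(11.0, is_weekend=False)
```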

3.2 Legislation on advertising online

Regulations on online advertising and children restrict some mechanisms for targeting children (namely, profiling) while allowing others (namely, contextual advertising). Per our research, regulations on online advertising appear to lack the precise rules on ad content and ad duration that exist for television advertising. In addition, E.U. legislation obliges platforms to assess the systemic risks arising from their systems that could affect minors and to mitigate such risks.

E.U. legislation. At the E.U. level, a new Regulation of 19 October 2022, the Digital Services Act (DSA), contains provisions regarding advertising to children online [46]. The DSA follows the European Commission Communication “European Strategy for a Better Internet for Children” regarding minors [17, 18]. At a high level, the DSA forbids advertising to children based on profiling and puts an obligation on very large online platforms to assess systemic risks that might harm children’s rights, mental well-being, and health. Interpretation hint: the binding rules are contained in “Articles”, while “Recitals” provide additional explanations and the intentions behind the rules.

Article 28 (2) of the DSA states that “Providers of online platform shall not present advertisements on their interface based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 using personal data of the recipient of the service when they are aware with reasonable certainty that the recipient of the service is a minor”. In Article 4 (4) of the GDPR, “profiling” means “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”.

The notion of “reasonable certainty” is a complex concept that can be interpreted in different ways. Article 28 (3) states that “compliance” with Article 28 (2) “shall not oblige providers of online platforms to process additional personal data in order to assess whether the recipient of the service is a minor”. This paragraph is, itself, open to several interpretations, which Recital 71 clarifies as follows: “In accordance with Regulation (EU) 2016/679, notably the principle of data minimisation as provided for in Article 5(1), point (c), thereof, this prohibition should not lead the provider of the online platform to maintain, acquire or process more personal data than it already has in order to assess if the recipient of the service is a minor. Thus, this obligation should not incentivize providers of online platforms to collect the age of the recipient of the service prior to their use”. To comply with such rules without asking for additional data on the age of users, YouTube asks content creators to label whether or not their content is intended for children [63, 64].
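A minimal sketch of how a platform might operationalise Article 28 (2) without collecting additional age data is shown below. The inputs (a declared account age the platform already holds and a creator-supplied “made for kids” label, in the spirit of YouTube’s labelling) and the under-18 threshold are our own assumptions for illustration, not the DSA’s or YouTube’s actual implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AdRequest:
    declared_age: Optional[int]  # age the platform already holds, if any (hypothetical field)
    made_for_kids: bool          # creator-supplied content label, in the spirit of YouTube's labelling

def select_ad_mode(request: AdRequest) -> str:
    """Illustrative Article 28 (2) logic: when the platform is aware with
    reasonable certainty that the recipient is a minor (here assumed to mean
    under 18), profiling-based ads are disabled and only contextual ads are
    served. No additional personal data is collected to decide this, in line
    with Article 28 (3) and Recital 71."""
    likely_minor = (
        (request.declared_age is not None and request.declared_age < 18)
        or request.made_for_kids
    )
    return "contextual_only" if likely_minor else "profiling_allowed"

# A request on content labelled as made for kids falls back to contextual ads.
assert select_ad_mode(AdRequest(declared_age=None, made_for_kids=True)) == "contextual_only"
```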

Furthermore, given the importance of very large online platforms and search engines (VLOPS) in facilitating public debate, economic transactions, and the dissemination of information, opinions, and ideas to the public, and in influencing how recipients obtain and communicate information online, Section 5 of Chapter III of the DSA (due diligence obligations for very large online platforms and search engines) imposes specific obligations on the providers of very large online platforms in addition to the obligations applicable to all online platforms. In particular, Articles 34 and 35 of the DSA place an obligation on VLOPS to assess four broad categories of systemic risk, as well as an obligation to mitigate such risks. VLOPS must therefore be particularly vigilant about these risks, which concern the impact of their recommendation, moderation, and advertising systems on the health, safety, and fundamental rights of consumers, whether minors or adults. The two systemic risks that relate to advertising and children are described in Recital 81 and Recital 83:

Recital 81 “concerns the actual or foreseeable impact of the service on the exercise of fundamental rights, as protected by the Charter, including but not limited to human dignity, freedom of expression and of information, including media freedom and pluralism, the right to private life, data protection, the right to non-discrimination, the rights of the child and consumer protection. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or by the very large online search engine or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. When assessing risks to the rights of the child, providers of very large online platforms and of very large online search engines should consider for example how easy it is for minors to understand the design and functioning of the service, as well as how minors can be exposed through their service to content that may impair minors’ health, physical, mental and moral development. Such risks may arise, for example, in relation to the design of online interfaces which intentionally or unintentionally exploit the weaknesses and inexperience of minors or which may cause addictive behavior”.

Recital 83 identifies another important risk: “category of risks stems from similar concerns relating to the design, functioning or use, including through manipulation, of very large online platforms and of very large online search engines with an actual or foreseeable negative effect on the protection of public health, minors and serious negative consequences to a person’s physical and mental well-being, or on gender-based violence. Such risks may also stem from coordinated disinformation campaigns related to public health, or from online interface design that may stimulate behavioural addictions of recipients of the service”.

Finally, regarding the content of online ads, we found a broad provision in Directive 2010/13/EU (Audiovisual Media Services Directive), as amended by Directive (EU) 2018/1808 of 14 November 2018. The amended directive states in Article 28b (a) that “Member States shall ensure that video-sharing platform providers under their jurisdiction take appropriate measures to protect minors from programmes, user-generated videos and audiovisual commercial communications which may impair their physical, mental or moral development”. Appropriate measures include strict access control, age verification, and parental control (see Appendix A.2 for a description of appropriate measures). Note that this article applies to all content that might be seen by minors on video-sharing platforms and is not specific to advertising.

U.S. legislation. The Children’s Online Privacy Protection Rule (COPPA) [23] contains restrictions on advertising to children. The rules are intended to stop behavioral advertising, retargeting, and profiling of children under 13, but not contextual advertising. More precisely, COPPA states that advertisers and content owners may not collect any personal information (which includes cookies and other persistent identifiers) from children under 13 years of age without verifiable parental consent (see Appendix A.2 for a definition of verifiable parental consent and personal information). However, COPPA allows operators to collect personal information without consent only for “support for the internal operations of the Web site or online service”, which “means that those are activities necessary to (…) authenticate users of, or personalize the content on, the Web site or online service (…) serve contextual advertising on the Web site or online service or cap the frequency of advertising (…) so long as the information collected for these activities is not used or disclosed to contact a specific individual, including through behavioral advertising, to amass a profile on a specific individual, or for any other purpose”.
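To summarise the decision logic these COPPA provisions imply for an ad-serving operator, here is a hedged sketch; the function name, inputs, and binary outcomes are our own hypothetical illustration, not the FTC’s wording or any operator’s real implementation.

```python
def coppa_allowed_practices(user_age: int, verifiable_parental_consent: bool) -> dict:
    """Hypothetical summary of the COPPA provisions above: for children under 13
    without verifiable parental consent, persistent identifiers may only support
    internal operations (e.g. contextual advertising, frequency capping), not
    behavioral advertising, retargeting, or profiling."""
    if user_age >= 13 or verifiable_parental_consent:
        return {"contextual_ads": True, "behavioral_ads": True, "profiling": True}
    return {"contextual_ads": True, "behavioral_ads": False, "profiling": False}

# A ten-year-old without parental consent may only be served contextual ads.
assert coppa_allowed_practices(10, verifiable_parental_consent=False) == {
    "contextual_ads": True, "behavioral_ads": False, "profiling": False,
}
```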

In January 2022, senators introduced before the U.S. Congress a proposal for a “Banning Surveillance Advertising Act” [58]. This text, which is still at the proposal stage, intends to prohibit targeted advertising under certain conditions, regardless of the age of the recipient, whether a minor or an adult.


:::info This paper is available on arXiv under CC 4.0 license.

:::
