4A’s APB Media Responsibility Principles

To promote a more diverse and equitable media ecosystem, the 4A’s Media, Tech and Data practice, in collaboration with the 4A’s Advertiser Protection Bureau, has updated its Media Responsibility Principles to reflect marketplace changes over the past few years.

The Principles are designed to, among other things, encourage the dissemination of factual, accurate information and to combat the spread of misinformation, disinformation and hate speech. Advertisers and agencies are encouraged to support publishers that act responsibly in terms of diversity, transparency, and accountability, and to withdraw support from publishers that behave irresponsibly or in opposition to applicable laws and industry standards. The Principles consist of eleven key elements.

It is our belief that adherence to these Principles will result in a fairer and more transparent environment for advertisers, agencies, publishers/platforms, and the public at large.

Please reach out to the Media, Tech and Data team at [email protected] with any questions.

1. PROMOTE RESPECT

Seek out media partners that foster balanced, constructive discourse and respectful civil commentary. Avoid, and stop working with, media partners or platforms that create or support hostile environments. This includes holding partners accountable if individuals, content, or programming consistently confront an individual or group of individuals based on their creed, religion, race/ethnicity, sexual orientation, gender identity, or disability. Respectful environments present the least risk to brands and the goodwill associated with them.


2. PROTECT PEOPLE

Prioritize partners that protect people from harm. This includes requiring partners to take active steps to prevent predatory behavior against an individual or group of individuals, and to flag, limit, prevent, or remove content that would mislead people as to their rights, how to access public services, or public health concerns. Media partners should also maintain safe and civil workplaces where employment is freely chosen and where sexual harassment, discrimination, and offensive behavior of any kind are not tolerated. Protecting people is always good business.


3. DIVERSE AND REPRESENTATIVE

The best partners demonstrate that they celebrate all forms of diversity, including all genders, multicultural backgrounds, ages, sexual orientations, people with disabilities, all socio-economic groups, and faiths. When advertising is delivered, a conscious effort is made to ensure that ads reach an audience representative of the diversity in the population and are delivered in a non-discriminatory way. Multicultural/BIPOC ownership must also be represented in good faith. Partners should select and promote their people on the basis of qualifications and merit, without discrimination based on race/ethnicity, religion, national origin, sexual orientation, gender identity or expression, age, or disability. Diverse and representative workplaces are most effective in reaching diverse and representative audiences.


4. CHILDREN’S WELLBEING

Media partners and advertisers have a shared responsibility to ensure that both regulatory and industry code guidelines for protecting the welfare of children are consistently applied. These partners should be able to demonstrate that they have appropriate controls in place to protect children and, where necessary, age-gate the delivery of advertising. Child labor is illegal and must not be used.


5. NO HATE SPEECH

Brands do not want to fund hate speech or extremist content. Protecting brands requires avoiding advertising with media outlets that fuel hatred, whether on the grounds of race/ethnicity, religion, nationality, migration status, sexuality, gender or gender identity, disability, or any other group characteristic. This includes not advertising on content (including user-generated content), services, or platforms where there is not a good-faith effort to remove and prevent speech that attempts to dehumanize a person or group of people, or that promotes or features content that may be reasonably expected to incite violence or discrimination. Many reputable platforms will make community enforcement reports available, and these reports should help guide advertisers and agencies to brand-safe outlets.


6. ACCOUNTABILITY

Accountability requires that each party in the advertising supply chain (advertisers, agencies, and publishers/platforms) hold itself and the others accountable for adhering to these Principles. Accountability is enhanced where transparency measures are available to establish and maintain clarity and trust. Revenue from advertising is a privilege, not a right; there must be an open and honest dialogue with partners who fail to deliver on these Principles.


7. NO MISINFORMATION / DISINFORMATION

Brands want their media investment to be directed to partners that ensure people receive quality, factual information that enables them to make well-informed decisions. Brands do not want to fund partners or content that spread misinformation, that support its spread by community members, or that, through inaction, allow it to spread. Opinion pieces should be clearly marked as such to avoid misrepresentation of factual information. Advertising should not fund the distribution of misinformation or disinformation (i.e., false or misleading claims that are likely to cause societal harm), directly or indirectly.

Reputable platforms where the spread of misinformation/disinformation by community members is a recurring concern will have mechanisms in place to fact-check information published by high-profile and/or high-reach accounts in a timely manner (hours or days, with a focus on constant improvement), including through reputable partners and/or frequently audited methodologies. These platforms often ensure that factual information from reputable sources is published alongside false claims from such accounts, and they often put systems in place to rapidly stop amplification of false information as its reach and momentum grow, even if the initial source is not high-reach. Large “private” communities are explicitly not exempted. Priority areas for improvement include Voting/Elections/Census, Environment, Conspiracies, Racism, and Health information.

AI and other tools have the potential to spread false information via synthetic or otherwise manipulated text, images, audio, and video, and this must be vigilantly monitored. AI models can also inadvertently perpetuate biases present in the data they were trained on. AI technology should be approached with attention to ethical considerations, transparency, and accountability.


8. ENFORCE POLICY

Media partners must rapidly and consistently apply their own terms-of-use policies if those policies are to have any meaning and effect. Partners in a common category or vertical may help the ecosystem by aligning on common policy standards that outline the expectations of users of the platforms, whether they be end-users, creators, or hosts. These policies should be transparently enforced regardless of role, title, position, or office.

Some examples of regulations by industry to consider are detailed below. 

Websites directed to children adhere to Children’s Online Privacy Protection Act (COPPA) regulations to protect the privacy of children, including obtaining parental consent, providing opt-out mechanisms for parents, and posting a clear privacy policy that explains their information practices concerning children: the types of information collected, how it is used, and whether it is shared with third parties.

Pharmaceutical advertising adheres to Food and Drug Administration (FDA) requirements that promotional materials accurately represent the benefits and risks of a drug and are balanced and not misleading. These advertisements include appropriate contraindications and warnings about potential side effects and risks associated with the drug.

Alcohol advertising adheres to restrictions and limitations on where and when it can be displayed (for example, near schools, religious institutions, or on public transport) and must not appeal directly to minors. These advertisements include required messaging about the legal drinking age in the respective jurisdiction.

Financial advertising adheres to regulations requiring accuracy, transparency, and compliance: it is truthful, accurate, and not misleading. Any potential risks associated with financial products or services are clearly disclosed, including investment risks, fees, penalties, and potential losses. Advertisers comply with data protection laws to ensure the security of customer information.


9. ADVERTISING AND SUPPLY CHAIN TRANSPARENCY

Advertisers and the public are best served by supply chain transparency that measures and reports the proximity, in placement and time, of advertising to brand-unsuitable content. This enables advertisers to make informed decisions that protect their brands. Advertisers and agencies share a role in scrutinizing publisher quality, including awareness and avoidance of Made for Arbitrage websites (also referred to as Made for Advertising websites), which deliver minimal to no value and create an adverse consumer experience due to low-quality content.


10. ETHICAL DATA COLLECTION AND USE

Media partners and advertisers should collect, use, and store data in ways that are ethical, accountable, lawful, and transparent. Data must be collected and used in compliance with all applicable regulations and industry guidelines. Rules and safeguards should be in place to ensure that data is not used in advertising in a way that would violate the law, including by intentionally or inadvertently discriminating against an individual or group of individuals or interfering with their ability to access employment, housing, credit, or other products and services.


11. FRAUD

Billing must reflect investment that reaches actual humans. Advertisers must be able to verify that ads are reaching real people and that inventory matches the content described in the auction. Reputable platforms will foster a culture of transparency and accountability within the advertising ecosystem to combat fraud, typically through the use of verification tools and ad fraud detection technology. Reputable platforms will participate in regular audits and compliance checks to verify alignment with evolving industry standards and best practices.