What are the minor content risks that the US market needs to pay attention to?
Aug 2, 2023
Children's Online Privacy Protection Act (COPPA)
Date of adoption: 1998
Relevant articles (excerpts):
——Require the operator of any website or online service directed to children
that collects personal information from children, or the operator of a website or online
service that has actual knowledge that it is collecting personal information from a child:
——To provide notice on the website of what information is collected from children
by the operator, how the operator uses such information, and the operator's disclosure practices
for such information.
——To obtain verifiable parental consent for the collection, use, or disclosure of personal
information from children.
——Require the operator of such a website or online service to establish and maintain reasonable
procedures to protect the confidentiality, security, and integrity of personal information collected
from children.
At its core, COPPA targets the collection of personal information from children under the age of 13 in the course of providing online services. It requires website operators to comply with privacy rules, to explain how they obtain consent from children's parents in a verifiable way, and to protect children's online privacy and safety.
Communications Decency Act
Date of adoption: 1996
Relevant articles (excerpts):
——Whoever knowingly uses an interactive computer service to send to a specific person or persons
under 18 years of age, or display in a manner available to a person under 18 years of age.
——Any comment, request, suggestion, proposal, image, or other communication that, in context,
depicts or describes, in terms patently offensive as measured by contemporary community standards,
sexual or excretory activities or organs.
——Shall be fined under title 18, United States Code, or imprisoned not more than two
years, or both.
The Communications Decency Act, enacted by the U.S. Congress in 1996, strictly prohibits publishing pornographic content to minors over the Internet. Although the U.S. Supreme Court later ruled part of the act unconstitutional, it shows that the United States was already attentive to minors' exposure to pornographic content in the early days of the Internet age, and that it moved quickly to regulate new media.
Children’s Internet Protection Act (CIPA)
Date of adoption: 2000
Relevant articles (excerpts):
——The Internet safety policy must address the following:
(a) Measures to restrict a minor's access to inappropriate or harmful materials on the Internet;
(b) Security and safety of minors using chat rooms, email, instant messaging, or any other
types of online communications;
(c) Unauthorized disclosure of a minor’s personal information;
——Thus, under this legislation, all Internet access must be filtered for minors and adults, though the
filtering requirements can be more restrictive for minors than for adults. The following content must be filtered or blocked:
obscene material, child pornography, and material harmful to minors.
At its core, CIPA aims to prevent violence, pornography, and other harmful online culture from
hurting young people. It treats children and adults differently, protecting children from
encountering content online that only adults may access.
Beyond the legislative level, the judiciary in North America also attaches great
importance to issues involving minors.
On September 4, 2019, Google and its subsidiary YouTube agreed to pay US$170 million
to settle US Federal Trade Commission (FTC) charges that the YouTube video-sharing service
illegally collected children's personal information without parental consent. Of the
$170 million, Google paid $136 million to the FTC and $34 million to New York State for
violating COPPA rules; the $136 million is the largest fine since COPPA was passed in 1998.
The settlement also requires Google and YouTube to develop, deploy, and maintain over the
long term a system that lets channel owners identify content directed at children, so that
YouTube can stay compliant with COPPA.
It follows that when audio and video companies expand into North America, the protection
of minors must be a key concern, even the primary one: they must safeguard minors' privacy
and keep pornographic and violent content away from them. This challenges companies on two
fronts. The first is determining which users are minors, so that content can be selectively
provided to them and their privacy protected. The second is accurate review of platform
content, covering not only general risks likely to cause psychological harm to minors, such
as pornography, violence, and prohibited content, but also risks specifically tied to
minors, such as child pornography, cult videos, and child nudity.
Therefore, audio and video companies should establish an intelligent review mechanism for
the platform on two fronts to avoid these risks:
1. Build a minor-account identification capability. Construct a non-intrusive system that
identifies minors' accounts from underlying signals such as device, account, behavior, and
UGC content, distinguishing them from adult profiles. Such a system serves multiple needs at
once: preventing privacy disclosure in content that minors publish, recommending
age-appropriate content to minors, and shielding minors from harmful content.
2. Build fast, effective platform content review. Establish combined human-machine review
capable of quickly screening illegal content on the platform, and apply flexible measures
such as interception, re-review, delayed display, and targeted display to mitigate platform
content risk.
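The routing from a machine-review result to one of those flexible measures can be sketched as follows. The risk labels, confidence thresholds, and action choices are illustrative assumptions, not a real moderation policy:

```python
# Hypothetical sketch: routing machine-review results to handling actions.
# Labels, thresholds, and action choices are illustrative assumptions.
from enum import Enum

class Action(Enum):
    INTERCEPT = "intercept"           # block publication outright
    RE_REVIEW = "re-review"           # escalate to a human moderator
    DELAYED_DISPLAY = "delayed"       # hold until human review completes
    TARGETED_DISPLAY = "targeted"     # show only to adult audiences
    APPROVE = "approve"

# Risks strongly tied to minors are always intercepted, regardless of score.
MINOR_SPECIFIC_RISKS = {"child_pornography", "child_nudity", "cult_video"}

def route(label: str, machine_score: float) -> Action:
    """Map a machine-review label and its confidence score to a handling action."""
    if label in MINOR_SPECIFIC_RISKS:
        return Action.INTERCEPT
    if label in {"pornography", "violence"}:
        if machine_score >= 0.9:
            return Action.INTERCEPT       # high confidence: block immediately
        if machine_score >= 0.6:
            return Action.RE_REVIEW       # medium: hand off to a human
        return Action.TARGETED_DISPLAY    # low: restrict to adult audiences
    if machine_score >= 0.6:
        return Action.DELAYED_DISPLAY     # uncertain general risk: hold for review
    return Action.APPROVE
```

The design choice worth noting is the unconditional interception of minor-specific risks: for content such as child pornography, no confidence threshold justifies display, so the human-machine split applies only to the general risk categories.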