Protect Kids Online
American Principles Project

What Comes Next

What Can We Improve In Future Legislation or Do to Enhance What’s Already Passed?


Stronger Enforcement Mechanisms

Pornography websites have long operated in a legal gray area, given the dozens of obscenity statutes on the books at both the federal and state levels. Those statutes have not stopped them from operating. Enforcement is key. To effectively protect kids from online pornography, it will be critical for states to include strong enforcement mechanisms in their bills, with severe penalties for failure to comply.

Louisiana was the first state to pass an age verification law, and its legislature quickly realized the original statute lacked adequate enforcement. To fix the problem, the legislature passed a second law that empowered the state attorney general and included specific civil penalties for pornography websites that fail to comply. We encourage other states to adopt similar language in their bills:

A.(1) Any commercial entity that knowingly and intentionally publishes or distributes material harmful to minors on the internet from a website that contains a substantial portion of such material shall be subject to civil penalties as provided in this Section if the entity fails to perform reasonable age verification methods to verify the age of individuals attempting to access the material.

(2) The attorney general may conduct an investigation of the alleged violation and initiate a civil action in the Nineteenth Judicial District Court for the parish of East Baton Rouge on behalf of the state to assess civil penalties. Prior to asserting a cause of action, the attorney general shall provide the commercial entity with a period of time of not less than thirty days to comply with this Section.

B.(1) Any commercial entity that violates this Section may be liable for a civil penalty, to be assessed by the court, of not more than five thousand dollars for each day of violation to be paid to the Department of Justice, in order to fund the investigation of cyber crimes involving the exploitation of children. In addition to the remedies provided in this Section, the attorney general may request and the court may impose an additional civil penalty not to exceed ten thousand dollars for each violation of this Section against any commercial entity found by the court to have knowingly failed to perform reasonable age verification methods to verify the age of individuals attempting to access the material. The civil penalty shall be paid to the Department of Justice in order to fund the investigation of cyber crimes involving the exploitation of children.

(2) Each violation may be treated as a separate violation or may be combined into one violation at the option of the attorney general.

(3) Any commercial entity that violates this Section may be liable to the attorney general for all costs, expenses, and fees related to investigations and proceedings associated with the violation, including attorney fees.

(4) If the court assesses a civil penalty pursuant to this Section, the Department of Justice shall be entitled to legal interest as provided in R.S. 9:3500 from the date of imposition of the penalty until paid in full.

All of the age verification laws include civil liability, and some include a cause of action for the state attorney general, but North Carolina’s law adds a private cause of action with the following language:

(d) Cause of Action. – A civil action may be brought against any commercial entity, or third party that performs the required age verification on behalf of the commercial entity, that violates this section by any of the following:

(1) A parent or guardian whose minor was allowed access to the material.

(2) Any person whose identifying information is retained in violation of this section.

(e) Relief and damages. – Any person authorized to institute a civil action by subsection (d) of this section may seek and a court may award any or all of the following types of relief:

(1) An injunction to enjoin continued violation of this section.

(2) Compensatory and punitive damages.

(3) All costs, expenses, and fees related to the civil suit investigation and proceedings associated with the violation, including attorney’s fees.

Any judgment awarded under this section shall be subject to legal interest as provided in G.S. 24-5.

We believe the attorney general enforcement mechanism, extensive civil liability with specific civil penalties, and a private cause of action will all be necessary to force U.S.-based online pornography websites to fully comply with these laws. We include the above provisions (and other line edits) in our model legislation at the end of this policy brief.

Limiting the “Substantial Portion” Carve-out

So far, each of the laws passed has included a “substantial portion” carve-out that limits its effectiveness. The provision in question means a website faces liability only if more than 33-1/3% of its total material is “material harmful to minors.” Below is Utah’s carve-out:

(10) “Substantial portion” means more than 33-1/3% of total material on a website, which meets the definition of “material harmful to minors” as defined in this section.

(1) A commercial entity that knowingly and intentionally publishes or distributes material harmful to minors on the Internet from a website that contains a substantial portion of such material shall be held liable if the entity fails to perform reasonable age verification methods to verify the age of an individual attempting to access the material.

This provision was politically helpful in avoiding pushback from certain special interest groups and companies. But unfortunately, it is likely to leave unaffected numerous websites that are actively distributing pornography to children, and those websites are therefore unlikely to change their behavior. It also invites the possibility of a pornography website flouting the law by claiming that more than 66-2/3% of the material on its website fails to meet the definition of “material harmful to minors.” The vagueness of this carve-out could become a defense attorney’s dream. How is this amount to be measured? Total pornographic videos? Words on a webpage? Lines of code?

We would prefer to eliminate this carve-out altogether. We believe it makes the law underinclusive and therefore constitutionally vulnerable (see the Texas ruling). Additionally, this carve-out gives some websites a lifeline when they clearly do not deserve one. Should PornHub be allowed to distribute pornography to minors? Everyone seems to agree that the answer is no. But should X (Twitter), Instagram, and TikTok be allowed to do the same thing with impunity?

That being said, we understand the carve-out is likely to remain in the legislation, so we propose a simple fix in our model legislation, alongside language that strengthens the enforcement mechanism:

(1) A commercial entity that knowingly and intentionally publishes or distributes material harmful to minors on the Internet from a website or individual webpage that, in whole or in part, contains a substantial portion of such material shall be held liable in a civil action for damages to any person harmed by that conduct if the entity fails to perform reasonable age verification methods to verify the age of an individual attempting to access the material.

We would also propose changing the definition of “substantial portion” to reflect this amendment:

(10) “Substantial portion” means more than 33-1/3% of total material on a website or individual webpage, which meets the definition of “material harmful to minors” as defined in this section.

In the model legislation included at the end of this document, we chose to refine the 33-1/3% benchmark. We have expressly detailed how that metric should apply depending on whether the distributor of the content is a social media site, a search engine, or another website. See Section 10(a)–(c) of the proposed statute.

Limiting the “News-gathering Organization” Carve-out

A pornography website could seek immunity by publicly identifying itself as a “news-gathering organization.” Each of the laws contains a similar news carve-out that reads as follows:

(5) This section shall not apply to any bona fide news or public interest broadcast, website video, report, or event and shall not be construed to affect the rights of a news-gathering organization.

If PornHub simply added a news component to its website, it would appear to be immune from any liability. To prevent this, we propose adding the following provision:

(5a) An Interactive Computer Service is not deemed to be a news-gathering organization unless its primary business is as an Information Content Provider, news publisher, or broadcaster of current news and public interest.

Eliminating the Search Engine Carve-out

As these bills are currently written, search engines enjoy immunity from any liability under the law. That means a child could simply go to Google, turn “safe search” off with the click of a button, and access pornographic images without ever leaving Google’s website. In addition to limiting the law’s ability to achieve its aims, this carve-out makes the law underinclusive and therefore possibly unconstitutional.

As Judge David Alan Ezra stated in his order granting a preliminary injunction against Texas’ age verification law:

“H.B. 1181 will regulate adult video companies that post sexual material to their website. But it will do little else to prevent children from accessing pornography. Search engines, for example, do not need to implement age verification, even when they are aware that someone is using their services to view pornography… Defendant argues that the Act still protects children because they will be directed to links that require age verification… This argument ignores visual search, much of which is sexually explicit or pornographic, and can be extracted from Plaintiffs’ websites regardless of age verification… Defendant’s own expert suggests that exposure to online pornography often begins with “misspelled searches[.]” Search engines have already demonstrated the technological capability to block pornographic images from the vast majority of user queries. Google has implemented a “safe search” system with a default of “on” when it has not verified a user to be 18 years or older. DuckDuckGo, a Google competitor, has implemented similar technology. But users of all ages ultimately have the option to turn this setting off – and if they do, both search engines readily display pornographic material on their own websites.”

Eliminating the search engine carve-out would require these search engines to add an age verification system only in those instances where the search engine is displaying pornographic images on its own website. In other words, Google and DuckDuckGo would not need to verify a user’s age to allow the user to conduct a search, but they would be required to verify a user’s age before distributing pornographic images to that user.

Comporting Language with Section 230

American Principles Project has long advocated for Congress to make structural changes to Section 230, a law passed in 1996 as part of the Communications Decency Act. We detailed some of our proposed reforms in a blueprint we published in 2020. There are also a number of legal cases making their way through the courts that may eventually challenge the constitutionality of the statute.

But our approach to state legislation has been to play under the current rules – assuming that existing Section 230 precedent will hold – and that certainly dictates what we can and can’t do at the state level.

We have addressed Section 230 in our proposal in two different ways. First, and most directly, we have sought to embed in our proposal the legally defensible ways in which the statute’s grant of civil immunity to interactive computer services can be avoided while remaining consistent with Section 230’s own language.

Obviously, a long line of legal cases has unnecessarily broadened the civil immunity granted to online platforms under Section 230. But that does not mean there are no exceptions to that immunity. Our legislative proposal seeks to fit the liability of tech platforms for harmful content into categories that courts have already recognized. For instance, tech platforms can be liable for the content they disseminate over the internet if, at least as to the content in question, they are not just a distribution service but a content creator.

In the ordinary course, social media platforms profess to make content moderation decisions guided by their terms of service, and they deny joining their users in co-creating content. As a result, Section 230, under the current judicial approach, gives the platforms carte blanche to decide which content will be allowed and which will be blocked, demonetized, restricted, or otherwise treated adversely as compared to other user content, all without incurring legal liability.

This changes, though, when a platform has become a content creator of the online user information at issue, and not just a distributor or gatekeeper for third-party user material. Section 230 defines an information content provider as an entity, platform, or service “that is responsible, in whole or in part, for the creation or development of information provided through the internet.” Case law has developed in several federal circuits defining what it means for a platform to be “responsible” for the “creation or development” of online information, thus rendering the platform civilly liable for specific problematic content.

Our proposal adopts some of the language from decisions of those U.S. Courts of Appeals that have found a particular website civilly liable as an “information content provider.”

The second way we have addressed Section 230, this time somewhat indirectly, is by expressly preventing almost all tech platforms from escaping liability under the Utah exception for a “news-gathering organization.” Only an online service whose primary business is news gathering will qualify for the exception. Otherwise, the typical online platform will bear civil liability for its conduct regarding content that is harmful to minors, assuming, of course, that it has also been a “content creator” partnering in the “creation or development” of the offensive material that caused the harm.