California’s “Online Eraser” Law for Teens is Damaging to Online Communities
Last month, California enacted Senate Bill 568 (California Business & Professions Code Section 22581), which is aimed at protecting teenagers online.
According to BBC News, it works by forcing websites to remove content posted by minors who live in the state, when so requested. However, if that content is reposted by others, those posts do not have to be removed. Furthermore, the data does not have to be removed from the website’s servers – only from public view. This law will take effect on January 1, 2015.
It was authored by Senate President pro Tem Darrell Steinberg and signed into law by Governor Edmund G. Brown, Jr.
This caught my eye because of the damaging side effects that it could have for online communities, which can be irreparably harmed by the mass deletion of content. Online communities don’t function like Facebook and Twitter, where you have a profile of your own that people must follow, friend or otherwise opt in to. They are shared spaces, usually public, where everyone participates.
How Online Community Discussions Work
To illustrate this, here is a common example of how a discussion within a community might progress:
- Bob starts a discussion.
- Susan replies to Bob.
- Larry replies to Bob.
- Bob replies to Susan and Larry.
- Karen replies to Susan.
- Susan replies to Karen and Bob.
- Derek replies to Bob.
- Tammy replies to Bob, Derek and Susan.
And on and on it goes. But let’s say that Bob is a California teenager and he asks for all of his posts to be removed. The posts made by Susan, Larry, Karen, Derek and Tammy would be devoid of context and couldn’t stand (or make sense) on their own. They speak to a topic that no longer exists, responding to comments that have disappeared. This harms their contributions. Now it looks more like this (a short code sketch after this list makes the damage concrete):
- Susan replies to ?
- Larry replies to ?
- Karen replies to Susan.
- Susan replies to Karen and ?
- Derek replies to ?
- Tammy replies to ?, Derek and Susan.
- James replies asking what the heck is going on because the conversation makes no sense and people appear to be talking to themselves.
- Susan replies asking what happened to the thread and why all of the posts were removed.
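To make the damage concrete, here is a minimal sketch of a flat thread model in Python. The Post structure and remove_author function are my own invention for illustration – nothing here is quoted from the bill or from any forum software – but the mechanics are the same anywhere replies point at earlier posts.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: int
    author: str
    replies_to: tuple[int, ...]  # IDs of the posts this one responds to

def remove_author(thread: list[Post], author: str) -> list[Post]:
    """Hard-delete every post by `author`, then report which surviving
    posts now reply to posts that no longer exist."""
    kept = [p for p in thread if p.author != author]
    surviving_ids = {p.post_id for p in kept}
    for post in kept:
        orphaned = [r for r in post.replies_to if r not in surviving_ids]
        if orphaned:
            # This is the "Susan replies to ?" problem from the list above.
            print(f"Post {post.post_id} by {post.author} lost context: {orphaned}")
    return kept

# The discussion from above: Bob starts, the others reply.
thread = [
    Post(1, "Bob", ()),
    Post(2, "Susan", (1,)),
    Post(3, "Larry", (1,)),
    Post(4, "Bob", (2, 3)),
    Post(5, "Karen", (2,)),
    Post(6, "Susan", (5, 1)),
    Post(7, "Derek", (1,)),
    Post(8, "Tammy", (1, 7, 2)),
]
remove_author(thread, "Bob")  # Susan, Larry, Derek and Tammy all lose context
```

Note that only Karen’s reply survives intact, because she replied to Susan rather than to Bob. Everyone else is left talking to posts that no longer exist.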
Other Problems
Eric Goldman, a Professor of Law at Santa Clara University School of Law who focuses on internet law, intellectual property, and advertising and marketing law, wrote an article about SB 568 for Forbes. He touched on the issue I discussed above and also highlighted others.
He describes the bill as being “riddled with ambiguities.” For example, the law does not define when a person can request that content be removed. Do they have to make the request while they are a minor? Or can they do it at any point? At 70, can they request that 50-year-old content be removed? Goldman feels that the natural reading of the bill is that the request must be made while the person is still a minor, which means that a kid would have to decide, before ever becoming an adult, what content they are willing to leave online forever, into adulthood.
Like Goldman, I feel that this type of issue has no business being legislated at the state level, only the federal level. He suggests that the law would likely run afoul of the Dormant Commerce Clause, a constitutional doctrine that reserves the regulation of interstate commerce to Congress. For example, I’m in North Carolina, not California. Do I have to require that all of my members tell me what state they are in, simply so that I can comply with a California law that affects only users from California? That is a single state reaching across state lines – in other words, interstate commerce.
When I read the bill, I couldn’t help but feel it was an uncomfortable extension of COPPA, the Children’s Online Privacy Protection Act of 1998, which requires websites to obtain permission from a parent or guardian before collecting personal information from children aged 12 and younger. If each state tries to enact its own COPPA-like policy, dictating how websites may interact with minors, things could become very messy.
It also opens up the possibility that someone might lie, saying they are from California (or that they are a minor), in order to have their content mass removed. COPPA operates largely on the honor system, which works because if a person lies, they are only hurting themselves. How can we verify that someone is actually from California, and actually a minor, before allowing them to damage our community?
Anonymization and Ambiguity
It is common for online communities to require an irrevocable, non-exclusive license before a person is permitted to post. This is to prevent a member from harming the community and the contributions made by other members. One member could make 10,000 posts that, when removed, might affect 100,000 posts and 10,000 members in a negative way.
At the same time, even though that is my policy, it does not mean that I turn a deaf ear to requests from members who want to leave our community or who want to remove something of a personal nature. I try to be flexible with people. I’m not going to mass remove posts, but if there is a post or two or three that bothers them for a specific reason, I try to work with them. If it is personal information – a photo of them or their family, their full name, contact information – I’ll always remove it. In fact, it is against my guidelines to share personal, offline contact information on our community. We simply don’t allow it.
Of course, the bill isn’t limited to personal information. It simply covers every piece of content posted and that’s the issue.
When a member asks me to delete their account, the option that I offer is account closure. This means that the username is changed to something nondescript, like username12345. The profile fields (email address, signature, website URLs, avatar, etc.) are cleared. The account is blocked from further usage (banned), and the member can no longer use it, as it is no longer tied to them or identifiable with them. They cannot sign up for a new account in the future and, once completed, the action is irreversible. This helps separate people who actually want to leave from those who just want attention and to waste my time (and harm the community repeatedly) by joining and then making me delete their existence, over and over again.
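For those curious what that looks like mechanically, here is a rough sketch, assuming a simple dictionary-backed user record. The field names (email, banned, closed, and so on) are hypothetical and not taken from any particular platform:

```python
import secrets

def close_account(user: dict) -> dict:
    """Close an account along the lines described above: rename it to
    something nondescript, clear identifying fields and lock it."""
    user["username"] = f"username{secrets.randbelow(100000):05d}"
    # Clear the profile fields that tie the account to a person.
    for field in ("email", "signature", "website_url", "avatar"):
        user[field] = None
    user["banned"] = True   # the account can no longer be used
    user["closed"] = True   # and the closure is irreversible
    return user
```

The posts themselves are untouched; only the identity attached to them goes away.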
It has worked well as a middle ground. I would definitely be more comfortable with the bill requiring anonymization, rather than full and complete deletion. There are times when the bill reads as if this may be acceptable. For example, read this passage:
The bill would, on and after January 1, 2015, require the operator of an Internet Web site, online service, online application, or mobile application to permit a minor, who is a registered user of the operator’s Internet Web site, online service, online application, or mobile application, to remove, or to request and obtain removal of, content or information posted on the operator’s Internet Web site, service, or application by the minor, unless the content or information was posted by a 3rd party, any other provision of state or federal law requires the operator or 3rd party to maintain the content or information, or the operator anonymizes the content or information. The bill would require the operator to provide notice to a minor that the minor may remove the content or information, as specified.
The words “unless” and “or” imply to me that anonymization is an alternative to removal. Later in the bill, there are other passages that lead one to believe this may be true.
(b) An operator or a third party is not required to erase or otherwise eliminate, or to enable erasure or elimination of, content or information in any of the following circumstances:
…
(3) The operator anonymizes the content or information posted by the minor who is a registered user, so that the minor who is a registered user cannot be individually identified.
But then, further down, we have this:
(d) An operator shall be deemed compliant with this section if:
(1) It renders the content or information posted by the minor user no longer visible to other users of the service and the public even if the content or information remains on the operator’s servers in some form.
On one hand, a website is “not required to erase” content if the operator “anonymizes the content … so that the minor … cannot be individually identified.” On the other hand, an operator is “deemed compliant” if it renders the content “no longer visible.” One reading is that “erasure” means erasure from your database: you must remove the content from public view, but you may keep it in your database if it is anonymized. I don’t know if that’s actually the case, because the language is ambiguous.
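If I had to translate those two passages into operations, they would read as two different functions. This is only my interpretation of ambiguous statutory text, sketched with hypothetical field names:

```python
def hide_from_public(post: dict) -> None:
    """Subdivision (d)(1) reading: the content stays on the server in
    some form, but is no longer visible to other users or the public."""
    post["visible"] = False

def anonymize(post: dict) -> None:
    """Subdivision (b)(3) reading: the content stays visible, but the
    minor who posted it can no longer be individually identified.
    As Goldman notes below, stripping the username alone may not
    actually achieve that."""
    post["author_id"] = None
    post["author_name"] = "anonymous"
```

Whether either function alone would satisfy the statute is exactly the question the text leaves open.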
I reached out to Eric Goldman to ask for his perspective and how he reads the portion of the bill targeted at anonymization.
“I don’t understand what the statute means by ‘anonymize’ content,” he said. “It could be as simple as just removing the user’s name, but that would not really make the content ‘anonymous’ in most cases. There are so many ways to determine the identity of an author even without a username attached to it. Because the term ‘anonymize’ is so vague, I don’t expect many UGC websites will rely upon that exception – at least, not until they get further clarity from the legislature or the courts.”
Looking Forward
Since the law is not set to take effect until January 1, 2015, we have some time before any change is necessary. It is worthwhile to wait and see what happens between now and next summer. The law is ambiguous in key areas and hopefully we’ll at least get some clarity. If you are in California and you feel that this is a poorly conceived law, you could write to the state legislature, including Senate President pro Tem Steinberg and Governor Brown.
If the law proceeds as is, it will be worthwhile to discuss the matter with qualified counsel to nail down a plan for compliance. For smaller communities, or those without the financial resources to hire an attorney, it is best to err on the side of caution. Primarily, I see this taking two forms:
- You do not allow members who are aged 17 and younger, period. This is unfortunate. I started managing online communities when I was 15, and I was moderating and posting even younger than that. If I hadn’t been allowed to do so, I wouldn’t be in this field.
- You allow (or possibly require) each member to specify what state they are in, if they are from the U.S., and then, if they say they are from California, you display a prompt in their profile allowing them to request content removal if they are a minor. The bill describes what the notice should include, such as an explanation that removal “does not ensure complete or comprehensive removal of the content or information” on the community. A rough sketch of this check follows this list.
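Here is a loose sketch of that second option’s gating logic, assuming members self-report a state and a birthdate. The field names and the notice text are hypothetical, and any real notice would need to match what the bill actually specifies:

```python
from datetime import date

# Hypothetical notice text; the bill specifies what it must include.
REMOVAL_NOTICE = (
    "As a California resident under 18, you may request removal of "
    "content you posted. Removal does not ensure complete or "
    "comprehensive removal of the content or information."
)

def removal_notice_for(member: dict, today: date | None = None) -> str | None:
    """Return the removal notice for self-reported California minors,
    and nothing for everyone else."""
    today = today or date.today()
    if member.get("state") != "CA" or member.get("birthdate") is None:
        return None
    birth = member["birthdate"]
    age = today.year - birth.year - ((today.month, today.day) < (birth.month, birth.day))
    return REMOVAL_NOTICE if age < 18 else None

# Example: a member who self-reports California and a 1998 birthdate
# would see the notice as of late 2013.
teen = {"state": "CA", "birthdate": date(1998, 6, 1)}
print(removal_notice_for(teen, today=date(2013, 10, 15)))
```

Of course, as discussed above, all of this runs on the honor system: it verifies nothing about where the member actually lives or how old they actually are.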
That said, we’re more than a year from this being in play. In the meantime, let’s talk about it, spread the word and discuss how the objectives of this bill can be accomplished in a more productive manner.
Please note: this is a general discussion of legal topics, not legal advice. You should consult an attorney on matters of this nature and not simply rely on this general exploration of complicated topics.