California Gov. Jerry Brown signed a bill Monday that restricts advertising to minors and also allows them to remove content they post online. The legislation, SB 568, quickly attracted attention for that "eraser" provision, which requires all Web sites where a California minor is registered to provide a way for minors to delete content they themselves have posted. Some have criticized the legislation as out of sync with how the Internet operates, or as sounding similar to the "right to be forgotten" proposals that have emerged elsewhere in the world. We talked to the author of the bill, California Senate President pro Tem Darrell Steinberg, on Wednesday. This interview has been lightly edited for clarity and length.
Andrea Peterson: What is the legislation and what does it contain?
Sen. Darrell Steinberg: The legislation contains two key elements. One, it prohibits Internet companies from marketing products to minors that are otherwise prohibited to be offered and sold to minors outside the Internet. In other words, if you can't sell it to a minor in a retail outlet or face to face, you can't sell it, advertise it, or solicit it on the Internet. Some of those categories are obvious, like tobacco or alcohol. But there are other products as well that are illegal to sell to young people, and this provides that blanket prohibition.
The second part of the bill is the so-called eraser button bill, which would require Internet companies to provide an easy-to-use method for a minor to delete a posting or a picture from a Web site before it's transmitted to a third party. The purpose, of course, is to allow minors — and we've all been teenagers who sometimes act in ways that we regret a few moments or an hour later, or that make our parents looking over our shoulder say "why did you post that?" — to remove it before it can be embarrassing to themselves or harmful to somebody else. That's the piece that's getting a lot of attention, and it's a very important piece.
How exactly do you expect online marketers to tell whether or not they are speaking specifically to a minor audience?
It will be up to the Internet company to have the appropriate filters so that an ad cannot get through to them on their Facebook page or whatever other site may be exclusive to them. We worked very closely with the industry on this, and the industry largely removed its opposition. A number of them do it already, but we want to make sure that it's the law across the board that you can't purposely allow advertisements for products that can't be sold to minors to be advertised to them.
How did the legislation come about — what was the timeline?
Common Sense Media, Jim Steyer, approached me sometime in the early part of 2013. And what grabbed my attention was the story, as they had done their background research about why this kind of legislation was important, of young teenage girls who go on a Web site to look at fashion — which is completely appropriate — only to then have that decision be used by marketers selling diet pills to barrage them with ads for diet pills. That was where my radar went off. I have a teenage daughter. It to me was the clearest example of how an appropriate activity by a teenager, using great technology, can easily be turned in a negative way, and in a way that can harm them. Then we drafted the bill.
Going to the "eraser" part of the legislation. My understanding is that it will only apply to information that they themselves upload ...
How is that different from the delete buttons that most social media sites already have right now?
I think a lot of young people don't know — it's not always easily accessible to delete, and it can still be accessed even if it is deleted, I think, in many instances with the right kind of technology. This will allow it to be removed and not be accessed by anybody subsequently. And we've seen, whether it's cyber-bullying, whether it's the posting of an inappropriate picture or a derogatory comment about a third party — sometimes young people make impetuous decisions, and this allows them, yes, if recognized in a very timely manner, to be able to take it back.
The thing that really shocked me on this was the fact that a number of colleges and universities around the country have the technology to properly access the Web sites, the Facebook pages, of college applicants. So a comment a young person posts that may seem innocuous, if it's derogatory, or it's embarrassing, or shows them in a negative light, it could actually affect their future in a very obvious way. And so this is a fail-safe. And, again, we worked it out with the companies themselves.
Can you give me a specific example in which this might make it easier than the sort of delete button options already available?
How to use the delete button, number one, is not always clear, and then secondly, even if something is deleted, it is potentially retrievable, as I understand it. Snapchat is a very good example of something that is deleted being retrievable. Even if something is deleted by the current technology and the current delete button, it doesn't necessarily go away. This would ensure that the Internet company provides the technology — easy-to-access technology — so that when something is deleted, it is deleted. There are many examples, right? You post something — an inappropriate picture of yourself or you and your friends — and you can take it down or remove it without it coming up again.
Would this stop other people from being able to copy things and repost them?
No. We're not overselling the provision for that very reason. Once something has been transmitted to a third party, the Internet company has no obligation to try to take it down from the third party or the transferee's Web site.
It seems like in the specific circumstance of college applications, more training or teaching about privacy settings might be helpful.
In fact, on Monday night I took part in a forum co-sponsored by Facebook at Rosewood High School in Sacramento where we did just that. I think that's correct; there's no one solution here. Obviously, education in how to use these sites properly and parental supervision are all key parts of assuring safety for young people and that these great tools are used appropriately. Our bill is, I think, one important set of tools, but not the only one.