In the depths of my grief, I pledged to honor Alison’s memory in the best way I could. My wife, Barbara, and I started a nonprofit group to fund arts programs for underserved children in Southwest Virginia, and I began advocating sensible gun-safety reforms.
My goal was — and still is — to keep guns out of the hands of dangerous, unstable people so others won’t suffer the same fate as Alison. In response to my fervent advocacy, however, countless people have targeted me and my family online, claiming that Alison’s death was faked as part of some diabolical conspiracy to seize their guns. They have taken the gruesome footage of my daughter’s murder, edited it into videos selling these lies and flooded YouTube with hate-filled diatribes maligning my family.
The vitriol directed at me and my family has been unbearable. So I was outraged to discover that recommendation algorithms for YouTube and its parent company, Google, have bolstered these conspiracy theories.
It started when I searched “For Alison,” the name of our nonprofit, on Google. The search returned a YouTube video posted by an anonymous conspiracy theorist alleging that our foundation was a scam. This prompted me to search our daughter’s name, which led me down a rabbit hole of painful and despicable content, including claims that Alison had plastic surgery and was living a secret life in Israel.
As much as I want to blame the sick creators for the pain I feel, I blame Google even more. By surfacing this content and profiting from the data Google collects from those who view it, Google is monetizing Alison’s death and our family’s pain.
Our society must not tolerate conspiracy theorists threatening and harassing private citizens. And Google, one of the wealthiest companies in the United States, should not be directing users to online content that spreads hate and menace on YouTube and other sites that it owns, such as Blogger. And it certainly should not be delivering any ad value to that content.
Although YouTube recently announced that it will try to reduce its promotion of “borderline content” that misinforms viewers, such as flat-Earth videos and conspiracies around historic events such as 9/11, my experience with the company tells me that this effort will not be effective.
Since Alison’s murder, I’ve implored YouTube and Google to remove the graphic videos of my daughter’s death and related conspiracy content from their platforms. What I’ve since learned is that Google’s policies are neither clear nor consistent. Removal of videos that violate its terms of service seemingly happens on an ad hoc basis.
Worst of all, my efforts have been retraumatizing. Google has placed the burden on people such as me to flag harmful videos and explain why they should be removed. This is a burden I simply cannot bear.
As the company with a virtual monopoly on Internet search, Google has a duty to ensure that private citizens are protected from targeted harassment and conspiratorial content that can be found through, or hosted on, its platforms. In this regard, it has failed miserably.
The Georgetown Law Civil Rights Clinic is now assisting me in urging Google, once again, to adequately monitor its own platforms. If Google fails again, Congress should step in and provide much-needed oversight.
This is not about stifling free speech. Odious free expression is lawful, but targeted harassment, defamation and slander are not. In fact, the relentless attacks against victims of mass shootings and their surviving loved ones are meant to intimidate us and silence our speech.
What I have experienced is far from unique. The parents of children murdered at Sandy Hook Elementary School have faced relentless abuse and threats against their lives, as have the teen survivors of the Parkland, Fla., shooting who became outspoken advocates for gun control.
The Internet is an ever-evolving sphere of speech, and it is important that we work to make it a safe place for all people to share their ideas and be heard. For all the amazing technical achievements Google has pioneered, the company has unleashed a dark web of misinformation hiding in plain sight.
You might think that losing a child in a televised, coldblooded murder is just about the worst thing a parent can experience. But Google allowing footage of that horrific moment to be weaponized against my family has made our living nightmare so much worse. Tragically, I know we won’t be the last to suffer this pain.
The people running Google have a choice. They can take this call to action seriously and provide meaningful protections against harassment. Or they can continue letting conspiracy theorists mine survivors’ loss for attention, entertainment and perhaps even money.