After the recent shootings in El Paso and Dayton, Ohio, Ivanka Trump asked those advocating for the new agency whether it could produce new approaches to stopping mass shootings, said one person familiar with the conversations who spoke on the condition of anonymity because they were not authorized to discuss them.
Advisers to Wright quickly pulled together a three-page proposal — called SAFEHOME for Stopping Aberrant Fatal Events by Helping Overcome Mental Extremes — which calls for exploring whether technology including phones and smartwatches can be used to detect when mentally ill people are about to turn violent.
Using his personal connections to Trump and others, Wright has pushed his HARPA proposal to the White House, Health and Human Services Secretary Alex Azar, and several senators and House members, according to two people involved in the effort. Last month, on the presidential campaign trail, former vice president Joe Biden also advocated for creating such an agency.
The violence detection plan has alarmed experts studying violence prevention, technology, psychology and mental health.
“I would love if some new technology suddenly came along that would help us identify violent risk, but there’s so many things about this idea of predicting violence that doesn’t make sense,” said Marisa Randazzo, former chief research psychologist for the U.S. Secret Service.
Beyond the civil liberty concerns about monitoring people through their gadgets, Randazzo said, there’s the problem of false positives.
Even if the technology could be developed, such a program would probably flag tens of thousands, or hundreds of thousands, more possible suspects than actual shooters. How, she asked, would you sort through them? And how would you know you were right, given the difficulty of proving something that hasn’t happened?
Most concerning, she said, is that the proposal is based on the flawed premise that mental illness is directly linked to mass shootings. “Everything we know from research tells us it’s a weak link at best,” said Randazzo, who spent a decade conducting such research for the Secret Service and is now CEO of a threat-assessment company called Sigma.
In recent weeks, Trump has repeatedly pointed to mental illness as the cause of the United States’ mass shootings. “Mental illness and hatred pull the trigger. Not the gun,” Trump said immediately after last month’s shootings in El Paso and Dayton. Federal health officials have taken steps to make sure government experts don’t publicly contradict Trump.
But studies of mass shooters have found that only a quarter or less have diagnosed mental illness. Researchers have noted a host of other factors that are more significant commonalities in mass shooters: a strong sense of grievance, desire for infamy, copycat study of other shooters, past domestic violence, narcissism and access to firearms. Experts note that those with severe mental illnesses are much more likely to be victims of violence than perpetrators.
“To those who say this is a half-baked idea, I would say, ‘What’s your idea? What are you doing about this?’ ” said Geoffrey Ling, the lead scientific adviser on the HARPA proposal.
A Johns Hopkins University neurologist, Ling was a founding director of DARPA’s Biological Technologies Office. Ling said having the gumption to tackle really big problems and think creatively is what led to DARPA’s successes.
“The worst you can do is fail, and failing is where we are already,” Ling said. “You need to find where the edge is so you can push on that edge.”
Ling said he began working with Wright on the idea of creating HARPA shortly after Wright’s wife died of pancreatic cancer in 2016. According to Ling and others who have worked on the project, Wright was frustrated with the lack of major progress in halting illnesses such as pancreatic cancer, which has an overall five-year survival rate of just 9 percent, according to the American Cancer Society.
Wright was not available for comment while recovering from surgery, said Liz Feld, president of the Suzanne Wright Foundation, the organization Wright has used to lobby for the HARPA proposal. Feld said the foundation has worked methodically to gather support for HARPA in the past two years, meeting with Trump officials and congressional leaders.
The idea has backers in both parties. During an Aug. 8 speech at the Iowa State Fair, Biden said creating a HARPA agency could help solve health problems including Alzheimer’s and obesity. “Those who have been in the military know there’s an outfit called DARPA,” he said. “It’s the thing that allows the military to do advanced research on everything from stealth technology and the Internet and all those other things. . . . We should be doing the same thing with health care.”
There is a huge gap between government research bodies such as the National Institutes of Health, which fund research in its early stages, and the private sector, which often applies those findings to problems and brings solutions to market, said Michael Stebbins, former assistant director for biotechnology during the Obama administration, who has been hired as a consultant for the Wright Foundation.
“That’s the massive hole that HARPA would fill,” Stebbins said. “It’s about creating new capability, driving innovation.”
According to a copy of the SAFEHOME proposal, all subjects involved would be volunteers, great care would be taken to “protect each individual’s privacy,” and “profiling of any kind must be avoided.”
Ling said that even if SAFEHOME fails to predict mass shooters, it could lead to other advances, such as new ways of predicting and preventing suicides or child abuse.
Matthew Nock, a leading suicide researcher at Harvard University, agreed that a new health research arm like HARPA might be helpful. Nock said he has tried for decades to find ways to predict and prevent suicides. In an email, he said he’d welcome an agency that would apply advances in machine learning and artificial intelligence to such efforts.
But he added that using such a proposed agency to “find” links that science has shown don’t exist is dangerous. While research shows mental illness is strongly linked to suicide, Nock noted, the link between it and violence toward others is much weaker.
Other researchers pointed to worrisome results from other recent attempts to use artificial intelligence to predict risk of violence. In court decisions on parole and sentencing, for instance, artificial intelligence programs have at times deepened problems of racial bias, overestimating the likelihood of black offenders committing further crimes and underestimating the likelihood of white offenders doing so, said Stephen Hart, a clinical forensic psychologist and researcher on violence risk assessment.
“The irony is that there are low-tech solutions that already exist for some of these problems that we simply aren’t funding or deploying enough,” said Hart, including research and policies that address the prevalence of guns in the United States.
Another already existing low-tech solution, Hart said, is threat assessment, which emphasizes preventing violence by identifying and addressing problems flagged by fellow students or co-workers.
That was also the conclusion of a 2012 study commissioned by the Pentagon after the mass shooting at the Fort Hood military base. The study’s task force surveyed every technology available that might help predict violence — including DNA swabs, retinal scans and merging big data from military personnel records. Like the HARPA proposal, the task force experts also looked at physical, neurological and genetic biomarkers, but ultimately concluded that predicting violence was a fool’s errand. The study’s panel devoted an entire appendix, entitled “Prediction: Why It Won’t Work,” to dispelling the notion. Instead, it recommended approaches such as threat assessment.
“PREVENTION should be the goal rather than PREDICTION,” the task force concluded in its final report.
Jacqueline Alemany and Alice Crites contributed to this report.