“MULTI-MODALITY solution.” “Real-time data analytics.” “Breakthrough technologies with high specificity and sensitivity.” These empty buzzwords take on an eerie edge when you hear their aim: “early diagnosis of neuropsychiatric violence.” The White House is considering a plan to study whether monitoring the mentally ill could prevent mass shootings. The proposal is at once a distraction and too dangerous to ignore.

The Post reports that President Trump has been briefed by a longtime associate on a campaign to create an arm of government called the Health Advanced Research Projects Agency, or HARPA. This counterpart to the military’s Defense Advanced Research Projects Agency, or DARPA, would marry scientific discoveries coming from the National Institutes of Health to the day’s most innovative technologies. Supporters claim it could also help solve the gun epidemic: A program named SAFEHOME, or Stopping Aberrant Fatal Events by Helping Overcome Mental Extremes, would try to identify signs of oncoming violent behavior in the mentally ill using monitors from Amazon Echoes to Apple Watches to, somewhat inexplicably, Fitbits. (Amazon founder and chief executive Jeff Bezos owns The Post.)

HARPA itself isn’t necessarily a poor idea. The projects its backers initially presented as priorities, from a “next-generation MRI machine” to an eye prosthesis that communicates with the brain, show how pushing biomedical boundaries could save and improve lives. But the proposal the White House is weighing on gun violence would do far more harm than good.

The president has been insisting since the El Paso shooting that mental illness is the main cause of mass shootings in the United States. Research contradicts the contention. Studies suggest psychological struggles are rarely a predictor of gun violence; the mentally ill are far more likely to be victims of violent crimes than to commit them, and other factors, such as childhood physical or sexual abuse, are shared far more often by mass shooters.

Even if the premise of a mental-health monitoring program weren’t fatally flawed, predicting violence would be a messy task. It might even be impossible. A comprehensive report from the Defense Department concluded that anticipating attacks before they happen with any sort of marker is likely to fail. Authorities would find themselves swimming in false positives, and the struggle is greater still with untested technologies. In one recent example, classroom microphones equipped with algorithms designed to detect stress and anger before aggression erupts were triggered by children cheering for pizza and playing Pictionary.

Now imagine those same mistakes, with far higher stakes than school principals summoning parents. SAFEHOME scientists would collect information only from volunteers, yet the prospect of eventually stopping shooters before they pull the trigger envisions a world in which the government has the ability to observe and act on the most personal details of vulnerable civilians’ everyday lives.

And all this because the administration refuses to envision what we most desperately need, and what would obviously help: a world in which fewer Americans had access to weapons of war.