In 2018, John Rael, a volunteer track coach in Taos, N.M., was on trial on charges he raped a 14-year-old girl when his lawyer made an unusual request. He wanted the judge to admit evidence from EyeDetect, a lie-detection test based on eye movements that Rael had passed.
EyeDetect is the product of the Utah company Converus. “Imagine if you could exonerate the innocent and identify the liars . . . just by looking into their eyes,” the company’s YouTube channel promises. “Well, now you can!” Its chief executive, Todd Mickelsen, says they’ve built a better truth-detection mousetrap. He believes eye movements reveal a person’s truthfulness far better than the much older and mostly discredited polygraph can. Its popularity may be growing: The company says EyeDetect has gone from 500 customers in 2019 to 600 now.
Its critics, however, say the EyeDetect is just the polygraph in more algorithmic clothing. The machine is fundamentally unable to deliver on its claims, they argue, because human truth-telling is too subtle for any data set.
And they worry that relying on it can lead to tragic outcomes, like punishing the innocent or providing a cloak for the guilty.
EyeDetect raises a question that dates all the way back to the Garden of Eden: Are humans so wired to tell the truth that we’ll give ourselves away when we don’t?
And to a more 21st-century query: Can modern technology come up with the tools to detect those tells?
An EyeDetect test places a subject in front of a monitor with a digital camera and, as with the polygraph, lobs generic true-false questions like “Have you ever hurt anybody?” to establish a baseline.
Then come specific questions. If the subject’s physical responses are more demonstrative there, they are presumed to be lying. Less demonstrative, they’re telling the truth. (The exact number of flubbed questions that determines a failure is governed by algorithm. The computer spits out a yes or no based on an adjustable formula.)
Where the polygraph measures blood pressure, breathing and sweat to determine the flubbing, EyeDetect looks at factors like pupil dilation and the rapidity of eye movement. “A polygraph is emotional,” Mickelsen said. “EyeDetect is cognitively based.” He explains why eye movements could be affected: “You have to think harder to lie than to tell the truth.”
EyeDetect plays into a form of techno-aspirational thinking. Our Web browser already pitches us a vacation we swear has only lived in our minds while dating apps serve up a romantic partner dreamed up in our hearts. Surely an algorithm can also peer into our soul?
But experts say such logic may not have much basis in science.
“People have been trying to make these predictions for a long time,” said Leonard Saxe, a psychologist at Brandeis University who has conducted some of the leading research in the field of truth detection. “But the science has not progressed much in 100 years.”
Like most outside experts, he has not reviewed EyeDetect’s research specifically. But, he says, “I don’t know of any evidence that eye movements are linked to deception.”
When it comes to the polygraph, experts have a long history of declaring failure.
The machine, which celebrates its centennial this year, continues to be used in areas like police interrogations, government security-clearance investigations and sex-offender monitoring. The market is valued at some $2 billion, powered by hiring at many federal and local offices.
Yet the American Psychological Association takes an unequivocal position. “Most psychologists agree that there is little evidence that polygraph tests can accurately detect lies,” it declares on its website. Good liars, after all, can cover up tics, while nervous truth-tellers might send the machine berserk.
A 1988 federal law bans private employers from administering polygraphs, though there are some loopholes. Most states don’t accept them as evidence in court either (New Mexico is famously looser), while a 1998 Supreme Court ruling held that federal criminal defendants have no constitutional right to introduce them.
If it turns out to be more accurate than a polygraph, EyeDetect could conjure a number of useful consequences and a few dystopic ones. What grisliness would await if anyone could know whether you were telling the truth just by looking at you? That lie to spare your Aunt Lily’s feelings at Christmas would be out the window. So would being a teenager.
If it proves hollow, though, an entirely different danger lurks: With its veneer of authority, many legal experts worry, it could lead law enforcement, private employers, government agencies and even some courts even further down the wrong path than the polygraph.
“It’s the imprimatur that’s the issue. We tend to believe that where there’s science involved it’s reliable,” said Laurie Levenson, a law professor at Loyola Marymount University, who has studied the issue.
She said she was concerned that people wouldn’t get clearances for jobs or would otherwise be held accountable for things they did not do because of false positives. She noted it also could help the guilty get away.
There are historical reasons for skepticism about any new truth-telling tech. Like diet sweeteners to the soft-drink industry, such innovations come along in the legal world at regular intervals. But they often fall short of their promise. About 15 years ago, the functional MRI, which posited that blood flow to the brain could hold the key to truth detection, enjoyed a period of buzz. But the device largely did not meet scientific standards, and cost and intensiveness further inhibited broad adoption.
The P300 guilty-knowledge test championed by the longtime Northwestern University professor Peter Rosenfeld, who asserted that the future of truth detection lay in brain waves, gained some enthusiasm from the scientific community. It was the subject of more than a dozen outside journal articles, many encouraging, and has won the tentative support of Henry Greely, a Stanford Law School professor who is one of the leading experts on technology and the law. Among other advantages, it involves an EEG, which is fairly cheap and easy to use.
Still, Rosenfeld died last winter without the method catching on. His lab, which has done much of the research on the P300, continues the work and counts respected legal minds like John Meixner, an assistant U.S. attorney in Michigan, among Rosenfeld’s past researchers and proteges.
On a recent afternoon, Mickelsen sat in his office in the technology corridor of Lehi, Utah, and, over Zoom, coolly screen-shared a series of graphs and charts to make the case why EyeDetect is different from failed past technologies.
EyeDetect’s accuracy rate is determined by a simulated-crime interrogation. A group of subjects is assigned to either commit or abstain from a simulated petty-cash theft in a manufactured environment, then administered the ocular test. The rate at which the machine correctly predicts a person’s truth-telling status — Converus researchers already know the right answer for each subject — is between 83 percent and 87 percent, the company says.
That’s about the same as a polygraph tends to achieve in its tests, though the polygraph can discard up to 10 percent of borderline results as “inconclusive,” while the EyeDetect gives a result on every test, leaving its accuracy percentage higher. Mickelsen also says the system is preferable to a polygraph because, by entirely automating the test, it avoids the possibility of human bias.
The man at EyeDetect’s scientific core is John Kircher, a now-retired University of Utah professor who has consulted for the CIA and had his lab funded by the Defense Department. Kircher had been researching and writing software for lie-detection technologies for decades when, in the early 2000s, he came across a University of New Hampshire professor who was researching how our eye patterns change during reading.
“Suddenly it hit me: All the software I had developed for 30 years could be applied to this problem,” Kircher said. He wedded the two and, for much of the past two decades, has been perfecting EyeDetect, now as Converus’s chief scientist.
Kircher says that the software for his eye tracker captures some 350,000 eye-movement measurements — including “fixations,” the milliseconds-long pauses between words — over a 25-minute test. Four metrics are taken every second. (Converus also has the EyeDetect Plus, which adds a computer-administered variation on a traditional polygraph.)
EyeDetect has won its supporters in the field. Law-enforcement customers laud the system as smoother than a traditional polygraph.
“People will come in nervous because they’re expecting what they see on TV, where you’re hooked up to this machine and sweating and it just seems really invasive,” said Joshua Hardee of the Wyoming Highway Patrol. “This is just clean and quick.”
Hardee’s department has used EyeDetect to screen more than 150 prospective job candidates in the last two years. His department and others pay around $5,000 for the EyeDetect system — which consists of a laptop, eye tracker, mouse, headphones and chin rest along with software that generates questions and calculates the responses — and then $80 to Converus to score each individual test if a customer trains and uses its own personnel; the fee is higher if they don’t. The machine reduces false alarms — truth-tellers flagged as liars — Hardee believes, because anxiety plays less of a role.
Other public officials have also been persuaded. The Tucson Fire Department uses EyeDetect to screen employees. The machine is also put to use, Converus says, by law enforcement or corrections departments in states including Idaho, New Hampshire, Washington, Utah, Ohio and Connecticut. Defense lawyers in the ongoing case of Jerrod Baum, accused of killing two teens in Utah, have petitioned the judge to allow EyeDetect. The jury in the Rael case appeared amenable, too. (The defendant later pleaded guilty and avoided jail time. He was given four years of probation.)
But many experts are not swayed. Saxe makes the point many scientists and academics do: Even if eye movements are fundamentally different under different sets of circumstances, there’s no way to link them to lying. In fact, the difference could just have to do with the fact that the subject is nervously taking a test.
“Fear of detection is not a measure of deception,” he said. At heart, the issue may come down to the 21st-century desire to automate and digitize a process — in this case, human emotion and motivation — that fundamentally resists the enterprise.
Stanford’s Greely says EyeDetect doesn’t dislodge his broader skepticism about truth-telling tech, either.
“I see no reason to believe that this works well or, really, at all,” he said in an email. “Show me large, well-designed impartial studies and I’ll be interested.”
In a phone interview, he noted that while it’s not hypothetically impossible that the body undergoes particular physiological changes in response to lies, the burden of proof lies heavily on the new technologies. The simulated-crime tests that EyeDetect uses, he said, contain a fundamental weakness: people instructed to lie in a test situation might react more demonstratively than a criminal in the real world.
He also noted a lack of published research by people not affiliated with Kircher or his lab.
Kircher says the Defense Department is currently conducting a study of ocular technologies that he hopes will conclude by the summer. A Pentagon spokesman did not reply to a request for comment.
Not all outside experts are unmoved, however. “All truth-detection methods are imperfect. But here’s the reason it’s worth relying — not over-relying, but relying — on them,” said Clark Freshman, a law professor at the University of California at Hastings who specializes in lie detection, expressing optimism about the EyeDetect.
“People can do even worse than a coin flip at telling whether someone is lying. So if you get better results — even if it’s just 70 percent or 80 percent accurate — than if you didn’t use it, and it’s generally free from bias, I don’t understand why you wouldn’t make it part of the picture,” Freshman said.
He said studies showed juries do not overly rely on these technologies, but instead include them as one factor among many.
There is also a particular abuse concern with EyeDetect. Without the human element, there may be less bias. But the device could also be intentionally set at algorithmic levels that would make it difficult to pass; at least with the polygraph, there’s a transcript of the human conversation. (Converus says that it trains customers on how to use the machine and, while it acknowledges that it allows them to set their own “base rate of guilt,” a spokesman also says that “if we observe a BRG set outside of a reasonable range, we make an inquiry.”)
Even if ocular technology can’t actually root out fibbers, there may be some value in how it could discourage them in the first place. In other words, EyeDetect may not need whiz-bang technology. It just needs to look high-tech. As Brad Bradley, the fire chief in Tucson, says in materials from Converus: “A guilty applicant, such as one with a drug history, looks at it and thinks, ‘I’m not going to pass. So, I’m not going to even apply.’ ”
Still, the odds of this moving from realms like hiring to mainstream courtrooms are slim. Plato said the eye is the window to the soul. He was silent on whether it was a ticket out of jail.
“This is going to be an uphill climb in almost any court, especially after the debacle that was the polygraph,” said Loyola Marymount’s Levenson. In her opinion, they do not deserve to scale that hill. “Truth-telling should be determined by people, not machines,” she said.