Apple settled its federal lawsuit Tuesday against Corellium, the maker of tools that allow security researchers to find software flaws in iPhones, according to court records.

The case, which became a lightning rod in the security industry, had been scheduled to go to trial in Fort Lauderdale, Fla., in federal court on Aug. 16. Apple filed suit against the Florida company in 2019 to shut down its “virtualized” iPhone business, which allows researchers to test iPhone software on computers, instead of on actual iPhone devices.

The terms of the settlement were confidential. An email from the Corellium sales team confirmed the company was still selling its virtual iOS devices.

Corellium co-founder Christopher Wade declined to comment for this story. Apple didn’t immediately respond to a request for comment.

Corellium had been facing the prospect of years of expensive, drawn-out legal action, and many in the security research community saw the lawsuit as having a chilling effect on independent research.

Apple alleged in its lawsuit that Corellium violated its copyrights and that its products were a violation of the Digital Millennium Copyright Act, which is meant to protect entertainment companies from online piracy.

“If the decision in the upcoming trial had gone badly, it could have cast a shadow on the security industry,” said Kurt Opsahl, deputy executive director of the Electronic Frontier Foundation, an internet advocacy organization. “Security research is vital to protecting computers on which we all depend.”

In June, the EFF published a letter calling on technology companies like Apple to stop using the DMCA to hinder security research. In addition to the EFF, 22 companies signed on in support of the letter.

Security researchers sometimes must break digital “locks” on software in order to carry out their work. The DMCA prohibits breaking or circumventing those locks. While the law makes exceptions for security research, it’s unclear exactly how far those protections go, particularly with respect to tools such as the virtual iPhones Corellium sells.

Apple’s lawyers accused the company of selling its products to government agencies that could have used the software to find flaws in Apple software, according to court records.

One of Corellium’s co-founders, David Wang, helped the FBI unlock an iPhone belonging to one of the terrorists responsible for the 2015 San Bernardino attack. Wang did that work while employed by an Australian firm called Azimuth Security.

Apple also alleged Corellium circumvented Apple’s security measures to create the software, thereby violating the Digital Millennium Copyright Act. Corellium denied that accusation, which would have been a key point of debate at trial.

In December, U.S. District Judge Rodney Smith dismissed Apple’s copyright claims, calling some of Apple’s legal arguments “puzzling, if not disingenuous.” But Smith allowed Apple’s Digital Millennium Copyright Act claims to move forward.

Apple could still appeal Judge Smith’s ruling on the copyright claims.

Corellium was co-founded in 2017 by Wade and his wife, Amanda Gorton, among others. Its technology was considered a breakthrough in security research because it eliminated the need for physical iPhones containing specialized software to poke and prod iOS, Apple’s mobile operating system.

Apple initially attempted to acquire Corellium in 2018, according to court records. Corellium turned Apple down.

Apple has long marketed its phones as secure. But the Pegasus Project, an investigative effort involving The Washington Post and 16 newsrooms around the world, revealed new details about how foreign governments use hacking tools to crack into iPhones to spy on journalists, dissidents and other political enemies.

Apple also restricts the access outside researchers have to iOS in a way that makes investigation of the code more difficult and limits the ability of consumers to discover when they’ve been hacked, researchers say.

Corellium also offers its virtual iPhones free to journalists, who could use the virtual iPhones to avoid surveillance from authoritarian governments.

Apple again found itself at odds with many in the security research industry last week, when it announced it would introduce new software to scan iPhone photo libraries for child pornography. Instead of scanning for the photos on Apple’s own servers, as most other technology companies do, Apple opted for the scanning to occur on Apple devices. The decision was an effort by Apple to protect the privacy of users, but critics accused Apple of overstepping its boundaries and creating software that could be abused in the future by authoritarian governments. Apple defended the decision and dismissed the notion that the software could be abused.

“It’s good to see that Apple chose to step back from this case,” said Blake Reid, a clinical professor at the University of Colorado’s law school who has researched copyright law and security research. “Unfortunately, there remain a wide range of issues around the DMCA and its application to security research that remain unresolved,” he said.