A group of security researchers were prepping for a major reveal in 2013: They planned to disclose at a D.C. cybersecurity conference how a security flaw in luxury vehicles could let bad guys break in without keys and start the cars.

But Volkswagen stopped them, winning an injunction in a British court after arguing that publishing a paper detailing the problem would "allow someone, especially a sophisticated criminal gang with the right tools, to break the security and steal a car," according to the Guardian.

It took two years for the parties to agree to a compromise, Bloomberg reported this month: The researchers could give their presentation and publish the paper if they removed a sentence that provided specific details about the problem.

But holding back the paper didn't seem to stop criminals, anyway. London's Metropolitan Police says that 42 percent of car and van thefts in the city were carried out without the owner's keys last year — and the majority appeared to be "the result of organised criminals using key-programming devices to create duplicate keys for vehicles."

It's not clear whether the increase was directly tied to the issues discovered by the researchers. But the story seems to represent a cautionary tale about how efforts to suppress security research can backfire, according to some experts: Companies attempting to save face or avoid costly repairs by keeping quiet about problems may end up leaving consumers at risk and without the information they need to make educated decisions about whom to trust.

In fact, it's exactly what the researchers who discovered the luxury car issue, Flavio Garcia of the University of Birmingham and Baris Ege and Roel Verdult of Radboud University Nijmegen, were worried about: The public, they told the Guardian at the time, had "a right to see weaknesses in security on which they rely exposed." Otherwise, the "industry and criminals" would know that security is weak, but consumers would not.

Volkswagen did not immediately respond to a request for comment on the injunction or what effect it may have had on consumers.

It's unlikely the company would have been able to keep the research quiet for so long had the case been litigated in the U.S. court system, due to First Amendment protections, according to Kurt Opsahl, deputy executive director and general counsel at the Electronic Frontier Foundation.

But security researchers have long faced legal threats when attempting to share research on this side of the Atlantic. In 2008, the Massachusetts Bay Transportation Authority sued three Massachusetts Institute of Technology students and the school in an attempt to block a presentation about vulnerabilities in Boston's transit fare-payment system.

The agency was initially granted a gag order, but it was later lifted, and officials eventually met with the researchers to discuss how to fix the issue. Opsahl, who helped represent the students, argued that the case brought more attention to the flaws than the planned talk would have.

But experts warn that these sorts of issues are likely to become even more common as the so-called "Internet of things" trend continues: Everything from connected coffee pots to connected cars will now need to involve considerations of digital safety, thrusting companies without a traditional tech background abruptly into the cybersecurity ecosystem.

Volkswagen's response seems to hark back to the way some major software-makers responded to independent researchers who discovered problems back in the 1990s, said Joshua Corman, a founder of a group called I am the Cavalry that focuses on working with companies to address the new physical safety risks of this next wave of connectivity.

Corman says he can remember researchers receiving cease and desist letters from Microsoft and other tech companies back then, but he notes that most firms have since found ways to work with researchers. Many have even started up so-called "bug bounty" programs that pay people for reporting vulnerabilities.

But that shift didn't happen overnight, he said, and "this is brand new for car companies and medical device companies."

The result is "a mismatch," where longtime researchers are eager to get things fixed and disclosed quickly, but companies who are cybersecurity novices may lack policies or procedures for how to respond when problems are reported, according to Corman.

Whether a company wants to get litigious often depends on how accustomed it is to security research, Opsahl agreed. "It's a first contact problem — these companies are not really comfortable with the idea that someone is going to expose a flaw in their products and are not really thinking through the freedom of expression implications of what they're trying to do," he said.

But both researchers and companies are going to need to meet somewhere in the middle, Corman argues, if the goal is to make products safer for consumers as soon as possible. Perhaps the most important step a company can take is coming up with a coordinated disclosure policy, a set of public guidelines for how it will respond when researchers come forward with problems, he said.