This is not what encryption actually looks like, strong or otherwise. (Reuters/Kacper Pempel/Files)

A group of privacy advocates and tech companies recently asked the Obama administration to publicly support "strong encryption."

But Obama administration officials have already said they support strong encryption -- repeatedly. In fact, President Obama called himself a "strong believer in strong encryption" in a February interview with re/code.

The problem is that no one can agree on what "strong encryption" actually means.

The petition is just the latest salvo in a battle over how much access law enforcement should have to communications and data secured by encryption -- and whether tech companies should be forced to enable the government to unlock that data through a mechanism commonly known as a "back door."

Let's take a step back and talk about the basics: Encryption is the technology used to secure digital information. It works by scrambling data so that only authorized users can unscramble it. In practice, it can be used to lock data stored on a device so securely that unlocking it without the user's key becomes nearly impossible. Encryption can also be used to secure communications that are "in transit" across the Internet so that only the sender and the recipient can read them -- a practice widely known as "end-to-end" encryption.
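To make that concrete, here is a minimal sketch of locking and unlocking data with a secret key, written in Python using the third-party cryptography package's Fernet recipe -- an illustrative choice for this article, not what any particular product actually uses:

    from cryptography.fernet import Fernet, InvalidToken

    # Generate a random secret key. Only holders of this key can
    # unscramble anything locked with it.
    key = Fernet.generate_key()

    ciphertext = Fernet(key).encrypt(b"meet me at noon")

    # With the right key, decryption recovers the original message.
    print(Fernet(key).decrypt(ciphertext))  # b'meet me at noon'

    # With any other key, the data stays scrambled.
    try:
        Fernet(Fernet.generate_key()).decrypt(ciphertext)
    except InvalidToken:
        print("wrong key -- data stays locked")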

Apple's iMessage and FaceTime systems use end-to-end encryption to secure messages, as do other stand-alone apps, such as Signal and TextSecure. About a year ago, Apple announced that it would expand encryption on devices using its mobile operating system, or iOS -- automatically locking content stored on smartphones and other devices with a key that only the user can access. If a user doesn't automatically back up the data to Apple's servers, the company cannot turn over information on the device to law enforcement, even when served with a legitimate warrant.
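As a rough sketch of how end-to-end encryption works -- hedged heavily, since the actual iMessage and Signal protocols are far more elaborate -- each side generates a key pair, shares only the public half, and derives a secret that the carrier's servers never see. The example below assumes Python's third-party cryptography package:

    import base64
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Each endpoint generates its own key pair; private keys never
    # leave the device.
    alice_priv = X25519PrivateKey.generate()
    bob_priv = X25519PrivateKey.generate()

    def derive_key(my_priv, their_pub):
        # Diffie-Hellman exchange: both sides compute the same shared
        # secret from their own private key and the peer's public key.
        shared = my_priv.exchange(their_pub)
        okm = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"demo e2e").derive(shared)
        return base64.urlsafe_b64encode(okm)

    # A relay server only ever sees public keys and ciphertext.
    msg = Fernet(derive_key(alice_priv, bob_priv.public_key())).encrypt(b"hi Bob")
    print(Fernet(derive_key(bob_priv, alice_priv.public_key())).decrypt(msg))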

To many cryptographers, Apple's steps sound like the company is embracing "strong encryption."

But some government officials have a different name for it. During a recent panel at Georgetown's law school, Department of Justice lawyer Kiran Raj dubbed the protection Apple offers its users "warrant-proof" encryption.

"Warrant-proof encryption, as distinct from strong encryption, is where you design a system in a manner where only the end user has access to the information," he said. Other companies offer "strong encryption," he said, but still maintained the ability to unlock the data for their own reasons.

That's a thinly veiled reference to how companies such as Google handle e-mail: Instead of encrypting a message directly between a sender and a recipient, they encrypt the connection between users and their servers. That gives Google the ability to scan messages and use the information in them to help target advertisements and provide other features, such as the ability to search e-mail archives.
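The distinction comes down to who holds the keys. In a provider-held-key design -- sketched below with made-up names, not Google's actual system -- the message is protected on the wire, but the server decrypts it on arrival, which is what enables scanning for search and ads, and what makes the data producible under a warrant:

    from cryptography.fernet import Fernet

    # Hypothetical provider-side key: the company, not the user, holds it.
    provider_key = Fernet.generate_key()
    vault = Fernet(provider_key)

    # Mail arrives over an encrypted connection, but once decrypted
    # server-side, the provider can read, index, and store it.
    stored = vault.encrypt(b"lunch friday?")

    plaintext = vault.decrypt(stored)     # provider can read it...
    has_lunch = b"lunch" in plaintext     # ...scan it for search and ads...
    for_warrant = vault.decrypt(stored)   # ...and hand it over if required.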

But experts generally acknowledge that such systems, which rely on storing data in the cloud, are less secure than those in which only the user can unlock the data.

Then there is the issue of so-called back doors -- built-in ways for law enforcement to gain access to encrypted data. Most technologists say having a "back door" defeats the purpose of using "strong encryption," said Joseph Lorenzo Hall, the chief technologist at the Center for Democracy and Technology.
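One classic design for that kind of access is key escrow: the key that actually locks the data is stored twice, once wrapped for the user and once wrapped for an escrow holder, so a second path to the plaintext always exists. A minimal sketch with hypothetical names, again assuming Python's cryptography package, shows why technologists see it as a built-in weakness:

    from cryptography.fernet import Fernet

    data_key = Fernet.generate_key()     # key that actually locks the data
    user_key = Fernet.generate_key()     # held only by the user
    escrow_key = Fernet.generate_key()   # held by a third party or government

    ciphertext = Fernet(data_key).encrypt(b"private notes")

    # The data key is wrapped under the user's key and, as the
    # "back door," under the escrow key as well.
    wrapped_for_user = Fernet(user_key).encrypt(data_key)
    wrapped_for_escrow = Fernet(escrow_key).encrypt(data_key)

    # Whoever obtains the escrow key can recover every data key it
    # protects -- a single point of failure for all users at once.
    recovered_key = Fernet(escrow_key).decrypt(wrapped_for_escrow)
    print(Fernet(recovered_key).decrypt(ciphertext))  # b'private notes'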

Even the term "back door" itself is in dispute: Law enforcement officials have dismissed the term, instead saying they want "front door" access.

The definition debates are just part of the difficulty of coming up with a policy for encryption. A more deep-seated problem is that technologists have long said it is impossible to provide the type of access law enforcement officials want without fundamentally undermining the security of communications products. Law enforcement officials have, in turn, suggested that tech companies just aren't trying hard enough to come up with a solution.