But Steinbach's testimony also suggests he meant that companies shouldn't put their customers' access to encryption ahead of national security concerns -- not that the government's top priority should be preventing the use of the technology that secures basically everything people do online.
"Privacy, above all other things, including safety and freedom from terrorism, is not where we want to go," Steinbach said. He also disputed the "back door" term used by experts to describe such built-in access points. "We're not looking at going through a back door or being nefarious," he argued, saying that the agency wants to be able to access content after going through a judicial process.
But many technical experts believe that building intentional vulnerabilities into the systems that people around the world rely on reduces the overall security of the entire digital system, even if done to comply with legal requirements.
The policy fight over encryption has been going on since the 1990s -- when it resulted in policies that are still causing security problems for Internet users around the world, even though the policies have since been changed. But the debate regained steam after revelations about National Security Agency spying from former government contractor Edward Snowden.
In the wake of those reports, tech companies -- most notably Apple with its iPhone -- have expanded how they protect users with encryption, in some cases automatically rolling out a more robust form of encryption called end-to-end. End-to-end protections mean that only the sender and the recipient can unlock communications -- so tech companies can't provide access to law enforcement even if served with a legitimate court order.
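The core property of end-to-end encryption is that the provider's servers relay only ciphertext; the key lives solely with the sender and recipient. A minimal sketch in Python illustrates the idea, using a toy XOR one-time pad purely for demonstration -- real end-to-end systems use vetted protocols (such as the Signal protocol), not this:

```python
import os

# Toy one-time pad -- illustrative only, NOT a real E2E protocol.
def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR each plaintext byte with the corresponding key byte.
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

# Sender and recipient share the key out of band; the provider never sees it.
message = b"meet at noon"
key = os.urandom(len(message))  # one random key byte per message byte

ciphertext = encrypt(key, message)  # all the provider's server ever relays

# A wiretap at the provider yields only ciphertext...
assert ciphertext != message
# ...while the recipient, holding the key, recovers the plaintext.
assert decrypt(key, ciphertext) == message
```

This is why a court order served on the provider accomplishes nothing: the provider can hand over the ciphertext it relayed, but it never possessed the key needed to read it.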
This prompted a backlash from some law enforcement officials, who warn that encryption can allow criminals and terrorists to "go dark" -- making it harder for the government to track them. Leaders such as Comey have argued that Congress should make tech companies build in ways for law enforcement to access secured content from their products.
It's this argument Steinbach made in the hearing:
So, when a company, a communications company or a ISP or social media company elects to build in its software encryption, end-to-end encryption, and leaves no ability for even the company to access that, we don't have the means by which to see the content. When we intercept it, we intercept encrypted communications. So that's the challenge: working with those companies to build technological solutions to prevent encryption above all else.
Many encryption experts say building in such technical solutions would fundamentally undermine the security of the technology because there's no guarantee hackers couldn't use the same "back door."
But government officials are still examining their options. Earlier this year, National Security Agency chief Adm. Michael Rogers floated an idea that would involve splitting up the keys to decode encryption to provide more oversight and make an access point harder for hackers to exploit.
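Rogers's split-key idea resembles classic secret sharing: the decryption key is divided so that no single custodian holds it, and reconstructing it requires every share. The details of the proposal were never specified; as an assumption-laden sketch, a 2-of-2 XOR secret-sharing scheme in Python captures the basic mechanic:

```python
import os

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares; either share alone reveals nothing."""
    share_a = os.urandom(len(key))  # uniformly random, independent of the key
    share_b = bytes(k ^ a for k, a in zip(key, share_a))
    return share_a, share_b

def combine(share_a: bytes, share_b: bytes) -> bytes:
    """XOR the two shares together to reconstruct the original key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

key = os.urandom(16)
a, b = split_key(key)        # e.g. one share to a court, one to the agency
assert combine(a, b) == key  # only both custodians together recover the key
```

The oversight argument is that a hacker would have to compromise every custodian, and the government would need each custodian's cooperation, before any key could be used. Critics counter that the reassembled key -- and the infrastructure for reassembling it -- remains a single point of failure.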
A recent report from the United Nations Office of the High Commissioner for Human Rights recommended governments avoid mandating back doors along with other policies that could weaken encryption because it would weaken the security many around the world depend on to exercise freedom of expression.
Last month, a group of tech companies, civil society groups and academics sent a letter to President Obama urging him to oppose efforts to force companies to build in ways for law enforcement to access products and services protected by encryption.
And despite testimony like Steinbach's, opponents of back doors seem to be making inroads on the Hill: Earlier this week, the House approved two amendments about the issue to an appropriations bill. The first, from Reps. Zoe Lofgren (D-Calif.) and Ted Poe (R-Tex.), would bar the government from forcing a company to alter its security measures to spy on users -- unless it is already required to comply with an existing wiretap law.
Another, from Rep. Thomas Massie (R-Ky.), would stop funds provided to the agency that sets cryptographic standards from being used to consult with the NSA unless it strengthens, rather than weakens, information security.