WASHINGTON — Apple and Google have created dangerous “hiding places” by strengthening smartphone encryption to block government access to phones’ information, a Massachusetts prosecutor told a congressional committee Wednesday. But a Democrat on the committee called law enforcement’s demand for such access “technologically stupid.”
The House Oversight and Government Reform Committee’s technology subcommittee heard a barrage of testimony Wednesday both for and against encryption standards that would allow law enforcement and intelligence agencies to get into private smartphone data as part of lawful investigations.
The hearing was the latest chapter of a national debate that has been raging since Edward Snowden, a former National Security Agency contractor, leaked classified information about several wide-ranging government surveillance programs beginning in June 2013.
In response to the leaks, Apple and Google updated their operating systems last September by beefing up their encryption, making it effectively impossible for government agencies to obtain information managed by those operating systems, including private photographs, emails and location data.
“This was a private sector response to a demand from the public,” said Rep. Ted W. Lieu, D-Calif., who argued that the tech giants were acting in the public interest by implementing the new encryption standards.
But national security and law enforcement officials countered that access to private cell phone data is a necessary tool in the fights against crime and terrorism.
“Apple and Google have created hiding places beyond the reach of law enforcement,” said witness Dan Conley, a district attorney from Suffolk County, Massachusetts. “What they’re doing is dangerous and should not be allowed to continue.”
The heart of the hearing, which included witnesses from the FBI, the app development industry and academia, was not the ethical and legal back-and-forth that usually takes center stage in discussions of surveillance but the technological realities that underpin it.
“The fact is that you cannot create a back door just for the good guys,” said Kevin Bankston, policy director at the Open Technology Institute of the New America Foundation. “If you create a way for the FBI to access these systems, you make them that much more vulnerable to attack from far more nefarious actors.”
Redesigning encryption systems to allow for these back doors would have big implications for business as well as for security.
“For one thing,” said Jon Potter, president of the Application Developers Alliance, “every market would want its own back door. If the U.S. has one, then Europe will want one, South Korea will want one. All of this will add tremendously to the cost of developing products.”
Lieu dismissed back doors as “technologically stupid.”
Though the prevailing sentiment among committee members lined up with Lieu’s dismissal by the end of the hearing, subcommittee chairman Rep. Will Hurd, R-Texas, steered the conversation toward compromise by stressing the balance between civil liberties and national security.
“These are not mutually exclusive goals,” he said. “By strengthening our civil liberties, we undoubtedly make ourselves safer.”