Apple CEO Tim Cook accused the government of asking it to build a "backdoor" into an iPhone
WASHINGTON:
It was a compelling set of facts for the government: A terrorist attack in California that killed 14 people, an iPhone possibly containing crucial evidence that could not be unlocked, and a warrant to search the phone.
But the phone's contents are encrypted, and Apple, according to the Justice Department, has refused to help the FBI find a way to unlock the device. So this week, the government got a court to order Apple to help.
In a memo that verged on the apocalyptic, Apple CEO Tim Cook accused the government of asking it to build a "backdoor" into an iPhone and to design software that amounts to "hack[ing] our own users."
Apple argues that from the hours after the Dec. 2 attack until just days ago, it worked with the FBI to give the agency what data it had, such as material backed up from the phone to iCloud. But Apple did not want to do anything that it said would weaken the device's security, such as creating software that would effectively let officials try to crack the phone's password.
In the escalating fight over encryption, the U.S. government has moved to force a showdown that's been years in the making. By Wednesday morning, the Justice Department and the Silicon Valley giant had torqued up the encryption debate, raising the stakes for those who support widespread strong encryption to protect privacy and security, and those who believe that courts should be able to compel tech firms to accommodate law enforcement's need to thwart criminals and terrorist attacks.
"The government wants to lay down a marker here that companies do have to provide assistance when they can," said Timothy Edgar, senior fellow at Brown University's Watson Institute and a former privacy officer with the Office of the Director of National Intelligence.
"And Apple is saying, 'We don't want to have to hack our own customers.' The outcome of the case is going to be hugely important for the balance between privacy and security."
Anyone watching the debate over the last year and a half knew that this day would come. "This is the ideal case for the government to challenge industry in the encryption debate," said Michael Sussmann, a former Justice Department official and a partner at Perkins Coie law firm.
"The facts are sympathetic to the government and present the starkest example of their need to gain access to encrypted data to protect the American public."
The phone at the center of the debate, an iPhone 5C, was used by Syed Rizwan Farook, who with his wife, Tashfeen Malik, opened fire at a holiday party at the Inland Regional Center, a county facility. The couple, who pledged loyalty to the Islamic State terrorist group, died a few hours later in a shootout with police.
What the government wants Apple to do is design new software to install on the phone that would block it from automatically wiping its data after 10 incorrect password attempts. That would let the FBI "brute force" the phone, trying tens of millions of password combinations without risking the deletion of the data.
The government also wants the software to let the FBI send passwords to the phone electronically, rather than having someone type them in by hand, and to prevent the phone from imposing delays between password attempts.
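To make the arithmetic behind that request concrete, here is a minimal, purely illustrative Python sketch of a brute-force loop. The `try_passcode` function is a hypothetical stand-in for whatever electronic interface would submit guesses to the phone; it does not reflect Apple's actual software or iOS internals. The point is only that once the 10-attempt wipe and the delays between attempts are gone, recovering the passcode reduces to guessing speed: a four-digit PIN is just 10,000 possibilities, and longer or alphanumeric codes are where counts in the millions and beyond come from.

```python
from itertools import product
from string import digits

# Hypothetical stand-in for electronically submitting one guess to the phone.
# No such interface exists today; making something like it possible is what
# the court order asks of Apple. Here it simply checks against a made-up PIN.
def try_passcode(guess):
    return guess == "7391"

def brute_force(length=4):
    # Enumerate every numeric passcode of the given length (10 ** length guesses).
    # This only works if nothing wipes the phone after 10 failures and nothing
    # forces a delay between attempts, which is exactly what the requested
    # software would have to disable.
    for combo in product(digits, repeat=length):
        guess = "".join(combo)
        if try_passcode(guess):
            return guess
    return None

if __name__ == "__main__":
    print(brute_force())  # prints "7391" in this simulation
```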
The request, the Justice Department said, does not require Apple to redesign its products, to disable the phone's encryption or open its contents. The software, it said, would operate only on that one phone.
Technical experts said that all of that is possible. The question is, is it desirable?
Said Cook in his memo: "Make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a back door." Though government officials say the software would be designed just for one phone, he said, "Once created, the technique could be used over and over again, on any number of devices."
He said it would set a dangerous precedent for future orders. "The implications of the government's demands are chilling." If the government has its way, he said, it could "demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge."
But New York City Police Commissioner William J. Bratton said the government's demands are reasonable and justified, especially in a case that has ties to the Islamic State, also known as ISIL. "No device, no car, and no apartment should be beyond the reach of a court-ordered search warrant," he said. "As the threats from ISIL become more divergent and complex, we cannot give those seeking to harm us additional tools to keep their activity secret."
The polarization of the debate can be seen in the use of the term "backdoor," which Cook accused the government of seeking. The term itself is imprecise: it can be understood to mean anything intended to create a way around encryption, or, more broadly, anything that would weaken security. The government rejects the term as a description of what it is asking for.
The White House on Wednesday pushed back against Apple and its framing of the argument. "This case doesn't require Apple ... to create a new backdoor," Press Secretary Josh Earnest said. "It's a very specific request that the Department of Justice has made, and a judge agreed with them."
Reaction from Capitol Hill was swift and divided.
"Court orders are not optional, and Apple should comply," said Sen. Richard Burr, R-North Carolina chairman of the Senate Intelligence Committee.
His colleague on the committee, Sen. Ron Wyden, D-Oregon, said that "companies should comply with warrants to the extent they are able to do so, but no company should be forced to deliberately weaken its products. In the long run, the real losers will be Americans' online safety and security."
Some legal analysts said the order issued by a federal magistrate judge in Riverside, California, opens a Pandora's box of unknowns. "If a court has the power to order a third party like Apple to devise software that it does not already possess [to aid in surveillance], what can't a court order a company to do?" said Stephen Vladeck, a law professor at American University. "There's a real search for a limiting principle here that we haven't identified."
Apple has five business days to respond and has vowed to challenge the order.
© 2016 The Washington Post