Nuala O’Connor is one of the nation’s leading authorities on technology, security, and privacy. The president and CEO of the Center for Democracy & Technology (CDT) in Washington, D.C., O’Connor traces her career to the early days of DoubleClick, where she served as deputy general counsel. She later served as the global privacy leader at General Electric, and she became the first statutorily appointed chief privacy officer (CPO) in the Federal government when she was named CPO at the Department of Homeland Security (DHS).

MeriTalk caught up with O’Connor as she prepares to give a Tech Talk at the upcoming Symantec Government Symposium on Aug. 30 in Washington, D.C. We talked to her about some of the hot-button issues that are likely to be discussed at the symposium.

MeriTalk: Technology is clearly advancing faster than our ability to think about and plan for its unintended consequences, particularly in the areas of privacy and civil liberties. Where do you see the balance today between security and privacy?

O’Connor: Let me start from the premise that I don’t think privacy and security are antithetical at all. I see them as two sides of the same coin. You cannot protect people’s privacy without good security in your systems, and good security systems are weakened if privacy is not one of the values you are building toward.

But it’s not just the data, it’s the decisions: what decisions are being made about you in the online world based on the data that organizations are collecting and we are providing (hopefully knowingly)? Our Digital Decisions program at CDT is looking at whether that decision-making is fair and whether there is unintended bias. We don’t ascribe ill will or malicious intent to the companies or institutions, but we think it’s a worthy endeavor to stress-test the algorithm, the device, or the product to make sure there aren’t unintended biases.

We’re also looking at the concept of inference: not only the knowing decisions, but the assumptions being made about people, sometimes in an entirely anonymous fashion, based on their behavior and conduct in the online world. So, for example, how do your Web surfing habits, your friends, affiliations, and social networks create inferences about you that might limit the scope of what you see online? It’s not a long leap from the world we are building with these algorithms to the fact that we have Republicans and Democrats in this country who can’t talk to each other because they are seeing completely different sets of facts, with no connective tissue between the two groups at all. We’re seeing the death of speech on campus because kids on either side of whatever debate are simply demonizing the other side, because they’re seeing a completely different set of facts. And I think the Internet community bears some responsibility in all of this.

MeriTalk: The Department of Homeland Security recently announced a proposal to ask certain categories of visitors to the United States about their social media use. You were the first official chief privacy officer at DHS; where do you come down on this issue?

O’Connor: I respect the needs of our country to keep its citizens and institutions safe. But I have very grave concerns with the use of social media profiles or patterns of behavior, especially when those individuals don’t know that those decisions are being made about them by government entities. I am probably harder on the government than I am on the private sector, having worked in both places, because the consequences of a deprivation of rights or a decision about you by a government entity are far greater. The government has the right to put you in jail. The government has the right to deny you benefits. At the end of the day, a private sector entity cannot deprive you of your rights or your citizenship. I would want to look very hard at the assumptions being built into those algorithmic analyses: are they biased on the basis of race, gender, or something else?

MeriTalk: FBI Director James Comey is scheduled to provide a keynote at this year’s Symantec Government Symposium. And he’s been under fire for the FBI’s position on encryption in the Apple iPhone case and the bureau’s face-off with Silicon Valley. When it comes to cybersecurity and protecting the nation, is there a way out of this deadlock that would satisfy the needs of both privacy and security?

O’Connor: I worked with Jim Comey when I started in the Bush administration. We’re all trying to get to the same place. We’re all trying to get to a country that is safe and secure. We filed an amicus brief on behalf of Apple in that case, so we are very clear on where we stand. End-to-end unbreakable encryption is actually essential to national security. I believe encryption is an asset to national security; so much of our critical infrastructure runs on encrypted software. Building weaknesses or defects into products so that our government has a direct pipeline into the content or the data is, first, an overreach by the Federal government, and second, a profound weakening of our national security.

I think end-to-end encryption is essential to the growth, stability, and continued innovation and productivity of our country. I think backdoors or weaknesses will be exploited by bad actors far more than they will help our government. But I’m also mindful that our government needs to be able to act quickly and in an agile manner to catch bad people. Most people are OK with breaking into one phone if it’s a terrorist in San Bernardino, or Brussels, or Paris. But the problem is that creating a permanent defect in the encryption code is not just breaking into one phone; it’s breaking into all of our phones. And when people start to think about it that way, there’s actually a lot less public support for handing the keys to encryption to any Federal actor, no matter how noble and laudable their motives might be.
