Ben Wizner: Not a lot of applause for us. Chris Soghoian: I know.
Ben Wizner: Are they ready? Chris Soghoian: I think so.
Ben Wizner: Ok, I think we'll get started. Thank you all so much for being here. There
wasn't a lot of applause when we came on stage so I guess you're here to see somebody else. (( Applause ))
My name is Ben Wizner; I'm joined by my colleague Chris Soghoian from the ACLU. Maybe we can
bring up on screen the main attraction.
Edward Snowden: Hello. Ben Wizner: With his very clever green screen. Please bear with us today; the technology
may have some kinks. The video may be a little bit choppy. Our friend is appearing through
7 proxies, so if the video is a little slow, that's why.
You're joining us for the event that one member of congress from the great state of Kansas hoped would not occur. He wrote to the organizers
of SXSW urging them to rescind the invitation to Mr. Snowden. The letter included this very
curious line, "The ACLU would surely concede that freedom of expression for Mr. Snowden has
declined since he departed American soil". No one disputes that freedom of expression
is stronger here than there but if there's one person for whom that's not true, it's
Ed Snowden. If he were here in the United States he would be in a solitary cell, subject
probably to special administrative measures that would prevent him from being able to
communicate to the public and participate in the historic debate that he helped launch.
We're really delighted to be here. One more bit of housekeeping. As I'm sure
most of you know, you can ask questions for Mr. Snowden on Twitter using the hashtag
#AskSnowden. Some group of people backstage will decide which of those questions we see
here. We'll try to leave at least 20 minutes or so for those questions. As I said, Ed Snowden's
revelations and the courageous journalism of people like Bart Gellman who you just heard,
Glenn Greenwald, Laura Poitras, and others have really launched an extraordinary global
debate. You might think of that debate as occurring on 2 tracks. There is a debate
in Washington, in the halls of power, about law and policy, about what democratic controls
we need to rein in NSA spying. That debate takes place in courts that are considering the legality
and constitutionality of these programs, and in the legislature, which is considering legislation.
There's a very different conversation that you hear in conference rooms at technology
companies, particularly among people working on security issues. Those people are talking
less about the warrant requirement for metadata and more about why the hell the NSA is
systematically undermining common encryption standards that we all use. Why is the NSA
targeting telecommunication companies, internet companies, hacking them to try to steal their
customer data, basically manufacturing vulnerabilities to poke holes in the communication systems
that we all rely on? We're hoping to mostly focus on that latter conversation here.
With that in mind, Ed if you're with us, maybe you could say a few words about why you chose
for some of your first public remarks to speak to the technology community rather than say
the policy community in Washington. Edward Snowden: Well thank you for the introduction.
I will say SXSW and the technology community, the people who are in the room at Austin right
now, they're the folks who can really fix things, who can enforce our rights through
technical standards even when congress hasn't yet gotten to the point of creating legislation
to protect our rights in the same manner. When we think about what's happened with the
NSA in the last decade, in the post 9/11 era... the result has been an adversarial internet,
a sort of global free-fire zone for governments, which is nothing that we ever asked for. It's
not what we wanted. It's something we need to protect against.
When we think about the policies that have been advanced... sort of erosion of fourth
amendment protections, the proactive seizure of communications, there's a policy response
that needs to occur. There's also a technical response that needs to occur. It's the makers,
it's the thinkers, it's the development community that can really craft those solutions and
make sure we are safe. The NSA, with the sort of global mass surveillance
that's prying into all of these countries, not just the US, and it's important to remember
that this is a global issue, they're setting fire to the future of the internet. The people
who are in this room now, you guys are all the firefighters. We need you to help us fix
this. Ben Wizner: So Chris, you heard Ed say that
the NSA's offensive mass surveillance programs, this sort of manufacturing of vulnerabilities,
are setting fire to the future of the internet. Do you want to comment on that?
Chris Soghoian: Sure. Many of the communications tools that we all rely on are not as secure
as they could be. Particularly for the apps and services that are made by small companies
and small groups of developers, security is often an afterthought if it's a thought at
all. What that's done is enable global passive surveillance by the US, but by other governments
too. What I think has been the most lasting impression
for me from the last 8 months is the fact that the real technical problems that the
NSA seems to have are not, "How do we get people's communications" but, "How do we deal
with the massive amounts of communication data that we're collecting?" The actual collection
problem doesn't seem to be a bottleneck for the NSA. That's because so many of the services
that we're all relying on are not secure by default.
I really think for this audience, one of the things that we should be thinking about and
hopefully taking home is the fact that we need to lock things down. We need to make
services secure out of the box. That's going to require a rethink by developers. It's going
to require that developers start to think about security early on rather than later on down
the road. Ben Wizner: Let me pick up on that. Ed, you
submitted written testimony last week to the European parliament. I want to quote a very
short part of that and have you elaborate on it. You said, "In connection with mass
surveillance, the good news is that there are solutions. The weakness of mass surveillance
is that it can very easily be made much more expensive through changes in technical standards".
What kind of changes were you talking about and how can we ensure that we make mass surveillance
more expensive and less practical? Edward Snowden: The primary challenge that
mass surveillance faces from any agency, any government of the world, is not just how do
you collect the communications as they cross the wires, as they sort of find their way
through the global network, but how do you interpret them? How do you understand them?
How do you direct them back out and analyze them? [inaudible 00:08:35] at least on the
easiest, the simplest, most cost effective basis by encryption.
There are 2 methods of encryption that are generally used, one of which is deeply problematic.
One of those is what's called key escrow. It's sort of what we're using with Google-type
services, Skype-type services, right now, where I encrypt a video chat and I send
it to Google. Google decrypts it and then re-encrypts it to you guys and we have it.
End-to-end encryption, where it's from my computer directly to your computer, makes
mass surveillance impossible at the network level without a crypto break. They are incredibly
rare and they normally don't work. They're very expensive. By doing end-to-end encryption,
you force what are called global passive adversaries in the threat model to go after the
endpoints, that is the individual computers. The result of that is a more constitutional,
more carefully overseen sort of intelligence gathering model, law enforcement model, where
if they want to gather somebody's communications, they'd have to target them specifically. They
can't just target everybody all the time and then when they want to read your stuff, they
go back in a time machine and they say, "What did they say in 2006?"
They can't pitch exploits in every computer in the world without getting caught. That's
the value of end-to-end encryption and that's what we need to be thinking about. We need
to go, "How can we enforce these protections in a simple, cheap, and effective way that's
invisible to [users 00:10:17]. I think that's the way to do it.
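As a rough illustration of the end-to-end model Ed is describing, here is a minimal sketch in Python, assuming the PyNaCl library (pip install pynacl) as an illustrative choice rather than anything the speakers named: each endpoint keeps its own private key, and a relay server only ever handles ciphertext.

    # Toy end-to-end encryption sketch with PyNaCl: keys live only on the
    # endpoints, so an intermediary relaying the message cannot read it.
    from nacl.public import PrivateKey, Box

    alice_key = PrivateKey.generate()   # stays on Alice's computer
    bob_key = PrivateKey.generate()     # stays on Bob's computer

    # Alice encrypts directly to Bob's public key.
    sending_box = Box(alice_key, bob_key.public_key)
    ciphertext = sending_box.encrypt(b"meet at the usual place")

    # Any server in the middle sees only ciphertext bytes.
    # Bob decrypts with his private key and Alice's public key.
    receiving_box = Box(bob_key, alice_key.public_key)
    assert receiving_box.decrypt(ciphertext) == b"meet at the usual place"

The contrast with key escrow is that no intermediary ever holds a key capable of decrypting the traffic, which is what forces an adversary to go after the endpoints themselves.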
Ben Wizner: So Chris, one of the obstacles to widespread end-to-end encryption is that
many of us get our e-mail service from advertising companies that need to be able to read the
e-mails in order to serve us targeted ads. What are steps that even a company like Google
that's an advertising company or companies like that can do to make mass surveillance
more difficult? Are there things or do we really need new business models to accomplish
what Ed is talking about? Chris Soghoian: In the last 8 months, the
big Silicon Valley technology companies have really improved their security in a way that
was surprising to many of us who have been urging them for years to do so. Yahoo was
kicking and screaming the whole way but they finally turned on SSL encryption in January
of this year after Bart Gellman and Ashkan Soltani shamed them on the front page of the
Washington Post. The companies have locked things down, but
only in a certain way. They've secured the connection between your computer and Google's
server or Yahoo's server or Facebook's server which means that governments have to now go
through Google or Facebook or Microsoft to get your data instead of getting it with AT&T's
help or Verizon's help or Comcast or any party that watches the data as it goes over the
network. I think it's going to be difficult for these
companies to offer truly end-to-end encrypted service simply because it conflicts with their
business model. Google wants to sit between you and everyone you interact with and provide
some kind of added value whether that added value is advertising or some kind of information
mining, improved experience, telling you when there are restaurants nearby, where you can
meet your friends. They want to be in that connection with you. That makes it difficult
to secure those connections. Ben Wizner: Is this the right time for a shout
out to Google that is in this conversation with us right now?
Chris Soghoian: The irony that we're using Google Hangouts to talk to Ed Snowden has
not been lost on me or our team here. I should be clear; we're not getting any advertising
support from Google here. The fact is that the tools that exist to enable secure end-to-end
encrypted video conferencing are not very polished. Particularly when you're having
a conversation with someone who's in Russia and who's bouncing his connection through
several proxies, the secure communications tools tend to break.
This I think reflects the state of play with many services. You have to choose between
a service that's easy to use and reliable and polished or a tool that is highly secure
and impossible for the average person to use. I think that reflects the fact that the services
that are developed by large companies, with the resources to put 100 developers on the
user interface, are the ones that are optimized for ease of use. The tools that are
designed with security as the first goal are typically made by independent developers,
activists, and hobbyists. They're typically tools made by geeks for geeks.
What that means is the world... the regular users have to pick. They have to pick between
a service they cannot figure out how to use or a service that is bundled with their phone
or bundled with their laptop and works out of the box. Of course rational people choose
the insecure tools because they're the ones that come with the devices they buy and work
and are easy for people to figure out. Ben Wizner: Let's bring Ed back into this.
In a way, this whole affair began with Glenn Greenwald not being able to use PGP which
is somewhat of a joke in the tech community but really not outside the tech community.
PGP is not easy to install and it's not easy to use. Using Tor, using Tails... I feel like
I need new IT support in my office just to be able to do this work. You're addressing
an audience that includes a lot of young technologists. Is there a call to arms for people to make
this stuff more usable so that not only technologists can use it?
Edward Snowden: There is. I think we're actually seeing a lot of progress being made here.
Whisper Systems, the sort of Moxie Marlinspikes of the world, are focusing on new user experiences,
new UIs. Basically ways for us to interact with cryptographic tools, which is the way
it should be, where it happens invisibly to the user, where it happens by default. We
want secure services that aren't [opt in 00:14:46]. It's got to pass the Glenn Greenwald test.
If any journalist in the world gets an e-mail from somebody saying, "Hey, I have something
that the public might want to know about" they need to be able to open it. They need
to be able to access that information. They need to be able to have those communications
whether they're a journalist, an activist, or it could be your grandma. This is something
that people have to be able to access. The way we interact with it right now is not
good. If you have to go to the command line, people aren't going to use it. If you have to go
3 menus deep, people aren't going to use it. It has to be out there. It has to happen
automatically. It has to happen seamlessly. That's [inaudible 15:27].
Ben Wizner: So who are we talking to, Chris? Are we talking now to technology companies?
Are we talking to foundations to support the development of more usable security? Are we
talking just to developers? Who's the audience for this call to arms?
Chris Soghoian: I think the audience is everyone. We should understand that most regular people
are not going to go out and download an obscure encryption app. Most regular people are going
to use the tools that they already have. That means they're going to be using Facebook or
Google or Skype. A lot of our work goes into pressuring those companies to protect their
users. In January of 2010, Google turned on SSL,
the lock icon on your web browser. They turned it on by default for Gmail. It had previously
been available but it had been available through an obscure setting, the 13th of 13 configuration
options. Of course, no one turned it on. When Google turned that option on, suddenly they
made passive bulk surveillance of their users' communications far more difficult for intelligence
agencies. They did so without requiring that their users
take any steps. One day their users logged into their mail and it was secure. That's
what we need. We need services to be building security in by default, enabled without
any advanced configuration. That doesn't mean that small developers cannot play a role.
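As a hedged sketch of what "secure by default, without any advanced configuration" can look like on the server side, here is a small stand-alone Python example (an illustration, not something the speakers presented) that serves HTTPS only and sends an HSTS header so browsers will not silently fall back to plaintext; the certificate file paths are hypothetical.

    # Minimal HTTPS-only server with HSTS, as an example of a default
    # posture: users get encryption without turning anything on.
    import http.server
    import ssl

    class SecureHandler(http.server.SimpleHTTPRequestHandler):
        def end_headers(self):
            # Tell browsers to insist on HTTPS for the next year.
            self.send_header("Strict-Transport-Security",
                             "max-age=31536000; includeSubDomains")
            super().end_headers()

    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    context.load_cert_chain(certfile="cert.pem", keyfile="key.pem")  # hypothetical paths

    httpd = http.server.HTTPServer(("0.0.0.0", 8443), SecureHandler)
    httpd.socket = context.wrap_socket(httpd.socket, server_side=True)
    httpd.serve_forever()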
There are going to be hot new communications tools.
WhatsApp basically came out of nowhere a few years ago. What I want is for the next WhatsApp
or the next Twitter to be using encrypted end-to-end communication. This can be made
easy to use. This can be made useable. You need to put a team of user experience developers
on this. You need to optimize. You need to make it easy for the average person.
If you're a startup and you're working on something, bear in mind that it's going to
be more difficult for the incumbents to deliver secure communications to their users because
their business models are built around advertising supported services. You can more effectively
and more easily deploy these services than they can. If you're looking for an angle here,
we're slowly getting to the point where telling your customers, "Hey, 5 dollars a month for
encrypted communications, no one can watch you," is something that I think many consumers
might be willing to pay for. Edward Snowden: If I could actually take you
back on that real quick, one of the things that I want to say is, for the larger companies,
it's not that you can't collect any data. It's that you should only collect the data
and hold it for as long as necessary for the operation of the business. Recently EC-Council,
one of the security certification providers in tech, actually spilled my passport, a
copy of my passport and my registration form, which were posted to the internet when someone
defaced their site. I submitted those forms back in 2010. Why
was that still [inaudible 00:18:20]? Was it still necessary for the business? That's a
good example of why these things need to age [off 00:18:26]. Whether you're Google or Facebook,
you can do these things in a responsible way. You can still get the value out of these that
you need to run your business [inaudible 00:18:38] without [inaudible 00:18:40]
Ben Wizner: We didn't have great audio here for that response, but what Ed was saying is that
even companies whose business model relies on them to collect and aggregate data don't
need to store it indefinitely once its primary use has been accomplished. His example was
that some company was hacked and they found some of his data from 4 years ago that clearly
there was no business reason for them still to be holding on to.
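As a loose sketch of the retention discipline Ed is describing, assuming a hypothetical SQLite table named customer_submissions with a created_at timestamp, a periodic age-off job might look like this:

    # Delete records once they are older than the business actually needs.
    import sqlite3
    from datetime import datetime, timedelta, timezone

    RETENTION_DAYS = 90  # the shortest window the business can live with

    def purge_old_records(db_path="service.db"):
        cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
        conn = sqlite3.connect(db_path)
        with conn:  # commits the delete
            conn.execute(
                "DELETE FROM customer_submissions WHERE created_at < ?",
                (cutoff.isoformat(),),
            )
        conn.close()

Run something like this from a daily scheduler; the exact window is a business decision, but data past its useful life gets deleted rather than sitting around waiting to be spilled in a breach.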
Let's switch gears a little bit. Last week General Keith Alexander who heads the NSA
testified that the disclosures of the last 8 months have weakened the country's cyber
defenses. Some people might think there's a pot calling the kettle black problem coming from him
but what was your response to that testimony? Edward Snowden: It's very interesting to see
officials like Keith Alexander talking about damage that's been done to the defense of
our communications. More than anything, there have been 2 officials in America who have
harmed our internet security and actually our national security because so much of our
country's economic success is based on our intellectual property. It's based on our ability
to create, share, communicate, and compete. Those two officials are Michael Hayden and
Keith Alexander, two directors of the National Security Agency in the post 9/11 era who made
a very specific change. That is they elevated offensive operations, that is attacking, over
the defense of our communications. They began eroding the protections of our communications
in order to get an attacking advantage. This is a problem for 1 primary reason. America
has more to lose than anyone else when every attack succeeds. When you are the one country
in the world that has a vault that's more full than anyone else's, it doesn't make sense
for you to be attacking all day and never defending your full vault. It makes even less
sense when you set standards for vaults worldwide to have a big back door that anybody can walk
into. That's what we're running into today. When
he says these things have weakened national security... these are improving our national
security. These are improving the communications not just of Americans but everyone in the
world. When we all rely on the same standards, we rely on the ability to trust our communications.
Without that, we don't have anything. Our economy cannot succeed.
Ben Wizner: So Chris, Richard Clarke testified a few weeks back that it's more important
for us to be able to defend ourselves against attacks from China than to be able to attack
China using our cyber tools. I don't think everybody understands that there is any tension
whatsoever between those 2 goals. Why are they in opposition to each other?
Chris Soghoian: As a country, we have public officials testifying in Washington saying
that cyber security is the greatest threat this country now faces, greater than terrorism.
We've had both the director of the FBI and the director of National Intelligence say
this in testimony to congress. I think it's probably true that we do in fact face some
kind of cyber security threat. Our systems are not as safe as they could be and we are
all vulnerable to compromise in one way or another.
What should be clear is that this government isn't really doing anything to keep us secure
and safe. This is a government that has prioritized for offense rather than defense. If there
were a 100% increase in murders in Baltimore next year, the Chief of Police of Baltimore
would be fired. If there were a 100% increase in phishing attacks, successful phishing attacks
where people's credit card numbers get stolen, no one gets fired.
As a country, we have basically been left to ourselves. Every individual person is left
to defend themselves online. The government has been hoarding information about information
security vulnerabilities. In some cases, there was a disclosure in the New York Times last
fall revealing the NSA has been partnering with US technology companies to intentionally
weaken the security of the software that we all use and rely on.
The government has really been prioritizing its efforts on information collection. There
is this fundamental conflict. There's a tension which is that a system that is secure is difficult
to surveil and a system that is designed to be surveilled is a target waiting to be attacked.
Our networks have been designed with surveillance in mind. We need to prioritize cyber security.
That's going to mean making surveillance more difficult. Of course the NSA and their partners
in the intelligence world are not crazy about us going down that path.
Ben Wizner: So Ed, if the NSA is willing to take these steps that actually weaken security,
that spread vulnerabilities that make it sometimes easier not just for us to do surveillance
but for others to attack, they must think there's an awfully good reason for doing that.
They must think that the bulk collection programs these activities facilitate, the collect-it-all
mentality, really work, that this is a very, very effective surveillance method for keeping us
safe. You sat on the inside of these systems for
longer than people realize. Do these mass surveillance programs do what our intelligence
officials promise to congress that they do? Are they effective?
Edward Snowden: They're not. That's actually something that I'm a little bit sympathetic
to because we got to turn back the clock a little bit and remember that they thought
it was a great idea but no one had ever done it before, at least publicly. They went,
"Hey, we can spy on everybody in the world all at once. It'll be great. We'll know everything".
The reality is when they did it, they found out it didn't work. It was such [inaudible
00:24:58]. It was so successful in securing funding and so great at getting [inaudible
00:25:02]; it was so great at winning new contracts that nobody wanted to say no.
The reality is now, we have reached a point where the majority of Americans' telephone
communications are being recorded. We got all this metadata that's being stored for
years and years and years. Two White House investigations have found it has no
value at all. It's never helped us. Beyond that, we've got to think about what are we
doing with those resources? What are we getting out of it?
As I said in my European parliament testimony, we actually had tremendous intelligence failures
because we're monitoring the internet. We're monitoring everybody's communications instead
of suspects' communications. That lack of focus has caused us to miss leads that we
should have had. Tamerlan Tsarnaev, one of the Boston bombers: the Russians had warned us
about him, but we did a very poor job investigating [inaudible 00:26:04]. We had people looking
at other things. If we hadn't spent so much on mass surveillance, if we followed the traditional
models, we might have caught that. Umar Farouk Abdulmutallab, the underwear bomber,
same thing. His father walked into a US embassy, he went to a CIA officer, he said, "My son
is dangerous. Don't let him go to your country. Get him help". We didn't follow up. We didn't
actually investigate this guy. We didn't dedicate a team to figure out what was going on because
we spent all this money, we spent all of this time, hacking into Google and Facebook's back ends
to look at their data center communications. What did we get out of it? We got nothing
and 2 White House investigations that confirmed that.
Ben Wizner: Chris, if as Ed says these bulk collection programs are not all that effective,
that the resources that go into this would be better directed at targeted surveillance,
why are they dangerous? Chris Soghoian: Because the government has
created this massive database of everyone's private information. In an NSA building somewhere
probably in Maryland, there is a record of everyone who's called an abortion clinic,
everyone who's called an alcoholics anonymous hotline, everyone who's called a gay bookstore.
They tell us, "Don't worry, we're not looking at it" or "We're not looking at it in that
way. We're not doing those kinds of searches" but I think many Americans would have good
reason to not want that information to exist. Regardless of which side of the political
spectrum you are on, you probably don't want the government to know that you're calling
an abortion clinic, a church, or a gun store. You may think quite reasonably that that is
none of the government's business. I think when you understand that the government can
collect this information at this scale, they can hang onto it and figure out uses for it
down the road, I think many Americans are quite fearful of this slippery slope, this
surveillance that happens behind closed doors. Even if you trust this administration we have
right now, the person who sits in the oval office changes every few years. You may not
like the person who's going to sit there in a few years with that data that was collected
today. Ben Wizner: Ed we lost you for a moment but
can you still hear us? Edward Snowden: I can hear you.
Ben Wizner: Ok. Just before this began, I got an e-mail from Sir Tim Berners-Lee, the
creator of the World Wide Web who asked for the privilege of the first question to you.
I think I'm willing to extend that to him. He wanted to thank you. He believes that your
actions have been profoundly in the public interest. That was applause if you couldn't
hear it. He asks if you could design from scratch an
accountability system for governments over national security agencies, what would you
do? It's clear that intelligence agencies are going to be using the internet to collect
information from all of us. Is there any way that we can make oversight more accountable
and improved? Edward Snowden: That's a really interesting
question. It's also a very difficult question. Oversight models, [inaudible 00:29:41] models,
these are things that are very complex. They've got a lot of moving parts. When you add in
[inaudible 00:29:47], when you add in public oversight, it gets complex. We've got a good
starting point and that's what we have to remember. We have an oversight model that
could work. The problem is when the overseers aren't interested in oversight, when we've
got Senate intelligence committees and House intelligence committees that are cheerleading for the NSA
instead of holding them to account. When we have James Clapper, the Director of
National Intelligence, in front of them and he tells a lie that they all know is a lie,
because they're briefed on the program, because they got the question a day in advance,
and no one says anything, allowing all the American people to believe that this is a true
answer, that's an incredibly dangerous thing. That's the biggest [inaudible 00:30:36]. When I would
say, "How do we fix our oversight model? How do we structure an oversight model that works?"
the key factor is accountability. We can't have officials like James Clapper
who can lie to everyone in the country, who can lie to the congress, and face not even
a criticism. Not even a strongly worded letter. The same thing with courts. In the United
States we've got open courts that are supposed to decide [inaudible 00:31:06] constitutional
issues to interpret and apply the law. We also have the FISA court, which is a secret
rubber-stamp court, but they're only supposed to approve warrant applications. These happen
in secret because you don't want people to know who the government wants to surveil. At the
same time, a secret court shouldn't be interpreting the constitution when only NSA's lawyers are
making the case about how it should be [inaudible 00:31:40]. Those are the 2 primary factors
that I think need to change. The other thing is we need public advocates.
We need public representatives. We need public oversight. Some way for trusted public figures,
sort of civil rights champions, to advocate for us and to protect the structure and make
sure it's been fairly applied. We need a watchdog that watches congress, something that can
tell us when they didn't tell the truth, when you were just lied to. Otherwise, how do we vote?
If we're not informed, we can't consent to these policies. I think that's damaging.
Ben Wizner: For what it's worth, my answer to Sir Tim is Ed Snowden. Before these disclosures,
all 3 branches of our government had gone to sleep on oversight. The courts had thrown
cases out, as he said. Congress allowed itself to be lied to. The executive branch did no
reviews. Since Ed Snowden and since all of us have been read in to these programs, we're
actually seeing reinvigorated oversight. It's the oversight that the constitution had in
mind but sometimes it needs a dusting off. Ed has been the broom.
Chris Soghoian: I just wanted to also note that without Ed's disclosures, many of the
tech companies would not have improved their security either at all or at the rate that
they did. The PRISM story, although there was a lack of clarity initially about what
it really said, put the names of billion dollar American companies on the front page of the
newspaper and associated them with bulk surveillance. You saw the companies doing everything in
their power publicly to distance themselves and also show that they were taking security
seriously. You saw companies like Google, Microsoft, and Facebook rushing to encrypt
the connections between their data centers. You saw companies like Yahoo! finally turning
on SSL encryption. Apple fixed a bug in its address book app that allowed Google users'
address books to be transmitted over networks in an unencrypted form. Without Ed's disclosures,
there wouldn't have been as much pressure for these tech companies to encrypt their
information. There are going to be people in this audience
and people who are listening at home who think that what Ed did is wrong. Let me be clear
about one really important thing. His disclosures have improved internet security. The security
improvements we've gotten haven't just protected us from bulk government surveillance. They've
protected us from hackers at Starbucks who are monitoring our Wi-Fi connections; they've
protected us from stalkers, identity thieves, and common criminals.
These companies should have been encrypting their communications before and they weren't.
It really took, unfortunately, the largest and most profound whistleblower in history
to get us to this point where these companies are finally prioritizing the security of their
users' communications between them and the companies. We all have Ed to thank for this.
I cannot emphasize enough. Without him, we would not have Yahoo! users getting SSL, we
would not have this data going over the network in encrypted form.
It shouldn't have taken that. The companies should have done it by themselves. There should
be regulation or privacy regulators who are forcing these companies to do this. That isn't
taking place so it took Ed to get us to a secure place.
Ben Wizner: Great. Remember the hashtag is #AskSnowden. We'll take our first question,
please forgive pronunciations, from Max [Zerk-an-tin 00:35:36]. The question, for Ed and Chris too:
why is it less bad if big corporations get access to our information instead of the government?
Ed, did you hear it? Edward Snowden: Yes, I did. This is something
that's actually been debated. We see people's opinions, people's responses to this evolving
which is good. This is why we need to have these conversations. We don't know. Right
now my thinking, and I believe the majority's thinking, is that the government has the ability
to deprive you of rights. Governments around the world, whether it's the United States
Government, whether it's the Yemeni government, whether it's Zaire, any country, they have
police powers, they have military powers, they have intelligence powers. They can literally
kill you. They can jail you. They can surveil you.
Companies can surveil you to sell you products, to sell your information to other companies,
and that can be bad but you have legal [inaudible 00:36:41]. First off, it's typically a voluntary
contract. Secondly, you've got court challenges you can use. If you challenge the government about
these things, and the ACLU itself has actually challenged some of these cases, the government
throws [inaudible 00:36:58] and says, "You can't even ask about this". The courts aren't
allowed to tell us whether this is legal or not because we're just going to do it anyway.
That's the difference and it's something we need to watch out for.
Ben Wizner: Chris do you want to address it or should we take the next question?
Chris Soghoian: Sure, just quickly. I'm not crazy about the amount of data that Google
and Facebook collect. Of course everything they get, the government can come and ask
for too. There's the collection the government is doing by itself, and then there's the data
that they can go to Google and Facebook and force them to hand over. We should remember
that the web browser that you're likely using, the most popular browser right now is Chrome.
The most popular mobile operating system is now Android. Many of the tools that we're
using, whether web browsers or operating systems or apps, are made by advertising companies.
It's not a coincidence that Chrome is probably a less privacy-preserving browser. It's tweaked
to allow data collection by third parties. The Android operating system is designed to
facilitate disclosure of data to third parties. Even if you're ok with the data that companies
are collecting, we should also note that the tools that we use to browse the web and the
tools that ultimately will either permit our data to be shared or prevent it from being
shared are made by advertising companies. This makes the NSA's job a lot easier. If
the web browsers we were using were locked down by default, the NSA would have a much
tougher time but advertising companies are not going to give us tools that are privacy
preserving by default. Ben Wizner: Let's take another question from
[Jodi Se-ra-no 00:38:31] to Snowden from Spain. Do you think the US surveillance systems might
encourage other countries to do the same? Edward Snowden: Yes. This is actually one
of the primary dangers not just of the NSA's activities but in not addressing and resolving
these issues. It's important to remember that Americans benefit profoundly on this because
again, as we discussed, we've got the most to lose from being hacked. At the same time,
every citizen in every country has something to lose. We all are at risk of unfair, unjustified,
unwarranted interference in our private lives. Throughout history, we've seen governments
sort of repeat this trend, where it increases and gets to a point where they cross
the line. If we don't resolve these issues, if we allow the NSA to continue unrestrained,
every other government, the international community, will accept that sort of as the
green light to do the same. That's not what we want.
Chris Soghoian: I think there's a difference between surveillance performed by the NSA
and surveillance performed by most other governments. The difference is not really a legal one,
it's more of a technical one, and that is that the whole world sends their data to the United States. Americans
are not sending their e-mail to Spain; Americans are not sending their photographs to France.
This means that the US, because of Silicon Valley, because of the density of tech companies
in this country, the US enjoys an unparalleled intelligence advantage that every other government
just doesn't have. If we want the rest of the world to keep using
US tech companies, if we want the rest of the world to keep trusting their data with
the United States, then we need to respect them. We need to respect their privacy in
the way that we protect the privacy of Americans right now. I think the revelations of the last
8 months have given many people in other countries a very reasonable reason to question whether
they should be trusting their data to United States companies.
I think we can get that trust back through legal changes but I think tech companies can
also do a lot to earn that trust back by employing encryption and other privacy protecting technologies.
The best way to get your users' trust is to be able to say, when the government comes to
you, "Sorry, we don't have the data" or "Sorry, we don't have the data in a form that will be
of any use to you." That's how you win back the trust of people
in Brazil, in Germany, and people around the world.
Ben Wizner: Let me just cut in with a question here because I do think that a certain degree
of perhaps hopelessness may have crept into the global public with this constant barrage
of stories about the NSA's capabilities, the GCHQ's capabilities and their activities,
all the ways that they're able to get around defenses. I hear, Chris, you and Ed both coming
back to encryption again and again as something that still works. Maybe we could just take
a moment, Ed, after the discussions that we've had about how the NSA has worked to weaken encryption.
Should people still be confident that the basic encryption that we use protects us
from surveillance or at least mass surveillance? Edward Snowden: The bottom line, and I've
repeated this again and again, is that encryption does work. We need to think about encryption
not as this sort of arcane black art but as a sort of basic protection. It's the defense against
the dark arts of the digital world. This is something we all need to be [inaudible 00:42:21].
Not only implementing it, but actively researching and improving it on the academic level. The grad
students of today and tomorrow need to keep today's [inaudible 00:43:33] online to inform
tomorrow's. We need all those brilliant [inaudible 00:43:38]
cryptographers to go, "All right, we know that these encryption algorithms we're using
today work. Typically it's the random number generators that are attacked rather than
the encryption algorithms themselves. How can we make them [fool proof 00:42:52]? How
can we test them?" This is [inaudible 00:42:54]. It's not going to go away tomorrow
but it's the steps that we take today, it's the moral commitment, the philosophical commitment,
the commercial commitment to protect and enforce our liberties through technical standards
that's going to take us through [tomorrow 00:43:11] and allow us to reclaim the open
and trusted [inaudible 00:43:15]. Ben Wizner: Chris, very briefly, you hang out
with cryptographers. They're not happy campers these days.
Chris Soghoian: No. Of all the stories that have come out, the one that has really had
the biggest impact in the security community is the news that the NSA has subverted the
design of cryptographic and random number generator algorithms. I think it's fair to
say that there is a group within the cryptographic community now who have become radicalized
as a result of these disclosures. Cryptographers actually can be radicals. They're not just
mild mannered people. We should remember that regular consumers
do not pick their own encryption algorithms. Regular consumers just use the services that
are provided to them. The people who pick the crypto, who pick the particular algorithms,
who pick the key sizes; they are the security engineers at Google and Facebook and Microsoft.
The cryptographers who are working with open source projects, those people are all really
pissed. I think that's good. Those people should be mad. Those people can make a difference.
The fact that these disclosures have so angered the security community is a really good sign
because ultimately the tools that come out in 6 months or a year or 2 years are going
to be far more secure than they were before. That's because that part of the tech community
feel like they were lied to. Ben Wizner: Let's take a couple more questions
from Twitter. Melissa [nick sick 00:44:40] I hope. What steps do you suggest the average
person take now to ensure a more secure digital experience? Is there anything we can do at
the individual level to confront the issues of mass surveillance that we're talking about
today? Ed, it's ok if the answer is no. Edward Snowden: There are basic steps. It's
a really complicated subject matter today and that's the difficulty. Again, it's the
Glenn Greenwald test. How do you answer this? [inaudible 00:45:11] For me, there are a couple
of key technologies. There's full disk encryption to protect your actual physical computer and
devices in case they're seized. There's network encryption, which is things like SSL, but that
happens sort of transparently; you can't help that.
You can install a couple browser plugins, NoScript to block active exploitation attempts
in the browser, Ghostery to block ads and tracking cookies. But there's also Tor. Tor,
t-o-r, is a mixed routing network which is very important because it's encrypted from
the user through the ISP to the end of a cloud, a network of routers that you go through.
Because of this, your ISP, your telecommunications provider, can no longer spy on you by default
the way they do now, today, when you go to any website. By using Tor, you shift their
focus to either attacking the Tor cloud itself which is incredibly difficult, or to try to
monitor the exits from Tor and the entrances to Tor, and then try to figure out what fits.
That's very difficult. [inaudible 00:46:31] those basic steps. You encrypt your hardware
and you encrypt your network communication. You're far, far more hardened than the average
user. It becomes very difficult for any sort of a mass surveillance to be applied to you.
You'll still be vulnerable to some targeted surveillance. If there's a warrant against
you, if the NSA is after you, they're still going to get you. But against mass surveillance, this
untargeted, collect-it-all approach, you'll be much safer.
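As a small example of the Tor piece of that advice, assuming a Tor client is already running locally with its default SOCKS listener on port 9050 and the requests library is installed with SOCKS support (pip install requests[socks]), web requests can be routed through it like this:

    # Route an HTTP request through a locally running Tor client.
    import requests

    TOR_PROXY = {
        "http": "socks5h://127.0.0.1:9050",   # socks5h resolves DNS inside Tor
        "https": "socks5h://127.0.0.1:9050",
    }

    resp = requests.get("https://check.torproject.org/",
                        proxies=TOR_PROXY, timeout=60)
    print(resp.status_code)

The socks5h scheme matters because it keeps DNS lookups inside the Tor circuit instead of leaking them to the local resolver; check.torproject.org simply reports whether the request arrived over Tor.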
Ben Wizner: When there's a question about average users and the answer is Tor, we failed,
right? Chris Saghoian: Yeah, I mean ill just add
to what Ed said in saying that a privacy preserving experience may not be a secure experience
and vice versa. I'm constantly torn. I personally feel like Firefox is the more privacy preserving
browser but I know that Chrome is the more secure browser. I'm stuck with this choice...
am I more worried about passive surveillance of my communications and my web browsing information
or am I more worried about being attacked? I go back and forth on those.
I think until we have a browser or a piece of software that optimizes for both privacy
and security, I think users are going to be stuck with 2 bad choices. I'll just note that
in addition to what Ed said, I really think that consumers need to rethink their relationship
with many of the companies to whom they entrust their private data. I really think what this
comes down to is if you're getting the service for free, the company isn't going to be optimizing
the experience with your best interests in mind. I'm not going to say if you're not paying
for the product, you are the product because we pay for our wireless service and those
companies still treat us like crap. If you want a secure online backup service,
you're going to have to pay for it. If you want a secure voice or video communications
product, you're going to have to pay for it. That doesn't mean you have to pay thousands
of dollars a year but you need to pay something so that that company has a sustainable business
model that doesn't revolve around collecting and monetizing your data.
Ben Wizner: We have another question about encryption from Sean. Isn't it just a matter
of time before NSA can decrypt even the best encryption? Ed, I'm particularly interested
in your answer to this in light of your confidence that data that you were able to take is secure
and has remained secure. Edward Snowden: Well, let's put it this way.
The United States Government has assembled a massive investigation team into me personally,
into my work with the journalists, and they still have no idea what documents were provided
to the journalists, what they have, what they don't have because encryption worked. The
only way to get around that, even over Tor, is either to have a computer that's so massive
and so powerful you convert the entire universe into the energy powering this crypto-breaking
machine, and it still might not have enough to do it, or you can break into a computer
and try to steal the keys and bypass that encryption.
That happens today. That happens every day. That's the way around it. There are still
ways to protect encrypted data that no one can break. That's by making sure the keys
are never exposed. The key itself cannot be observed. The key can't be stolen. It can't
be captured. The encryption can't be [inaudible 00:50:13]. Any cryptographer, any mathematician
in the world will tell you that the math is sound.
The only way to get through encryption is on a targeted basis. Particularly when you start
layering encryption, when you're not using one algorithm but every algorithm, when you're
using key splitting, when you're using all kinds of sort of sophisticated techniques to make
sure that no one person, no single point of failure exists, there's no way in. There's
no way around it. That's going to continue to be the case until our understanding of
mathematics and physics changes on a fundamental level.
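As a toy sketch of the key-splitting idea, not the specific scheme Ed has in mind, a key can be split into shares that are all required to reconstruct it, so no single person or machine is a point of failure; the shares come from a cryptographically secure random source.

    # XOR secret sharing: every share is needed to recover the key (n >= 2).
    import secrets

    def xor_all(chunks):
        out = bytes(len(chunks[0]))
        for c in chunks:
            out = bytes(a ^ b for a, b in zip(out, c))
        return out

    def split_key(key, n):
        shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
        last = bytes(k ^ r for k, r in zip(key, xor_all(shares)))
        return shares + [last]

    def recombine(shares):
        return xor_all(shares)

    key = secrets.token_bytes(32)
    parts = split_key(key, 3)          # hand each share to a different holder
    assert recombine(parts) == key

Real deployments tend to use threshold schemes such as Shamir's secret sharing so that any k of n shares suffice, but the all-or-nothing XOR version shows the shape of the idea.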
Actually if I could follow up on that, I would say the US government's investigation actually
supports that. We've had both public and private acknowledgements that they know at this point
neither the Russian government, nor the Chinese government, nor any other government has possession
of any of this information. That would be easy for them to find out. Remember, these
are the guys who are spying on everybody in the world.
They've got human intelligence assets embedded in these governments. They've got electronic
signals assets in these governments. Suddenly, if the Chinese government knew everything
the NSA was doing, we would notice the changes. We would notice the chatter. We would see
officials communicating and our assets would tell us, "Hey, suddenly they got a warehouse.
They put a thousand of their most skilled researchers in there". That's never happened
and it's never going to happen. Chris Soghoian: I'll just add that I think Ed's right. If the government really
wants to get into your computer, if they want to figure out what you're saying and who you're
saying it to, they will find a way. That won't involve breaking the encryption. That will
involve hacking into your device. Whether your phone or your laptop, they'll take advantage
of either vulnerabilities that haven't been patched or vulnerabilities that no one knows
about. Hacking technologies don't scale. If you are
a target of the NSA, it's going to be game over no matter what unless you are taking
really, really sophisticated steps to protect yourself but for most people that will be
beyond their reach. Encryption makes bulk surveillance too expensive. The goal here
isn't to blind the NSA. The goal isn't to stop the government from going after legitimate
surveillance targets. The goal here is to make it so they cannot spy on innocent people
just because they can. Right now so many of our communications, our
telephone calls, our e-mails, our text messages, our instant messages, are just there for the
taking. If we start using encrypted communication services, suddenly it becomes too expensive
for the NSA to spy on everyone. Suddenly they'll need to actually have a good reason to dedicate
those resources to either try and break the encryption or to try and hack into your device.
Encryption technology, even if imperfect, has the potential to raise the cost of surveillance
to the point that it no longer becomes economically feasible for the government to spy on everyone.
Ben Wizner: Can we get another question on the screen from Twitter? Please? Thanks. Good
question from David Meyer. Is it possible to reap the benefits of big data on a societal
level while not opening ourselves to constant mass surveillance? How do we enjoy the scientific
benefits, even some of the commercial benefits of this without turning ourselves into a dystopian
surveillance state? In 2 minutes or less. Ed?
Edward Snowden: This is a really difficult question. There are a lot of advancements
and things like encrypted search that make it possible for the data to be in an unreadable
format until [inaudible 00:54:10] or something. In general, it's a difficult problem. The
bottom line is data should not be collected without people's knowledge and consent. If
data is being clandestinely acquired and the public doesn't have any way to review it and
it's not legislatively authorized, it's not reviewed by courts, it's not [constant 00:54:33]
with our constitution, that's a problem. If we want to use that, it needs to be the
result of a public debate in which people's [inaudible 00:54:43].
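For a sense of what "encrypted search" can mean in practice, here is a very rough sketch, with hypothetical keywords and document ids, of a keyword index built from keyed hashes so the server can answer queries without seeing the plaintext terms; real searchable-encryption schemes are considerably more sophisticated and also work to hide access patterns.

    # Keyed-hash keyword index: the server stores and matches HMAC tokens,
    # never the plaintext search terms themselves.
    import hashlib
    import hmac
    import secrets

    index_key = secrets.token_bytes(32)  # held by the data owner, not the server

    def token(word):
        return hmac.new(index_key, word.lower().encode(), hashlib.sha256).hexdigest()

    # The client builds an index of token -> document ids and uploads it.
    server_index = {
        token("appointment"): ["doc-17"],
        token("invoice"): ["doc-03", "doc-17"],
    }

    # To search, the client sends only the token for the query word.
    print(server_index.get(token("invoice"), []))  # matches without learning "invoice"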
Ben Wizner: Chris, do you want to take on that question?
Chris Soghoian: No. Ben Wizner: We have another question that
is about every day users. Maybe you can give us another one because I think we've answered
this one. Friends backstage? Ok, from Tim [Sho-ruck 00:55:04]. Wasn't NSA mass surveillance
the solution... Chris, can you read that? Chris Soghoian: Wasn't NSA's mass surveillance
solution to the internet driven by privatization and the handing over of our signals intelligence analysis
to SAIC, Booz Allen and so... Ben Wizner: I don't understand.
Chris Soghoian: Tim is basically saying, "Isn't this a result of letting the contractors in
to run the show?" Edward Snowden: The problem is when the NSA
gets a pot of money, they don't typically develop the solutions themselves. They bring
in a bunch of contractors. The Booz Allens, the SAICs, the [khakis 00:55:42], and they
go, "Hey, what can you guys do for us? What solutions are you working on?" These guys
get a gigantic [inaudible 00:55:50] song and dance. I actually used to do it professionally,
I know how it works. The problem is you got contractors and private companies at that
point influencing policy. It was not uncommon for me at the NSA as a
private employee to write the same point papers and sort of policy suggestions that I did
as an official employee of the government at the CIA. The problem with that is you got
people who aren't accountable, they've got no sort of government recourse against them
who are saying, "Let's do this. Let's do that. Let's put all this money in mass surveillance
because it'll be great. We'll all get rich" but it doesn't serve the public interest.
One thing you've seen recently is the government's gone and changed its talking points. They
moved their verbiage away from public interest into national interest. We should be concerned
about that because when the national interest, talking about the [sake 00:56:51], becomes
distinct from the public interest, what benefits the people, we are at a point where we have
to marry those back up, or it gets harder and harder to control and we risk losing control
of a representative democracy. Ben Wizner: So Ed, maybe let me ask you what
will turn out to be a final question. In your early interviews with Glenn Greenwald and
Laura Poitras, you said that your biggest fear was that there would be little or no
reaction to these disclosures. From where you sit now, how satisfied are you with the global
debate that you helped to launch and do you feel that it was worth the price that you
paid in order to bring us to this moment? Edward Snowden: One of the things that I told
Bart Gellman was when I came public with this, it wasn't so I could single handedly change
the government, tell them what to do, and sort of override what the public thinks is
[inaudible 00:57:59]. What I wanted to do was inform the public so they could make a
decision, they could provide the consent for what we should be doing. The results of these
revelations, the results of all the incredibly responsible, careful reporting, which by the
way has been coordinated with the government, and the government has never said any single one
of these stories has risked a human life... the result is that the public has benefitted.
The government has benefitted. Every society in the world has benefitted. We live in a
more secure place, we have more secure communications, and we're going to have a better civic interaction
as a result of understanding what's being done in our name and what's being done [inaudible
00:58:52]. When it comes to "Would I do this again?" the answer is "absolutely yes." Regardless
of what happens to me, this is something we had a right to. I took an oath to support
and defend the constitution and I saw that the constitution was violated on a massive
scale. The interpretation of the fourth amendment
had been changed [inaudible 00:59:14]. Thank you. The interpretation of the constitution
had been changed in secret from no unreasonable search and seizure to "Any seizure is fine,
just don't search it." That's something the public ought to know about.
Ben Wizner: You can see behind Ed is a green screen of... is that Article 1 of the Constitution?
Edward Snowden: That's correct. Ben Wizner: "We the people"... There's also
another organization here that is also interested in the Constitution. I'd be remiss
if I didn't say to all of you that the ACLU has a table, 1144. I promise that it will not
be all about surveillance. Please come and say hi to us. If you're not members of the
ACLU, it's cheap to sign up. We have ACLU whistles, we have t-shirts that you can get
with membership, you can talk to me and Chris a little bit more about the other work that
we are doing and our ACLU colleagues. With that, I'd like all of us to thank Ed Snowden
for choosing this venue for this kind of conversation. Edward Snowden: Thank you all very much. Thank
you, Austin.