MALE SPEAKER: I want to really quickly introduce
Joe Hall, which is a total pleasure.
Although I got these duties last minute, so apologies
for not being over-rehearsed.
I've known Joe for a long time.
We were in grad school together at Berkeley.
While I was in law school, Joe was there
getting his Ph.D. He was actually first prominent,
in my life and around the country,
for working on voting systems, electronic and online voting
systems, and also voting machines
and the security around them.
And I think that was the focus of your dissertation.
Right?
JOE HALL: Oh, yeah.
MALE SPEAKER: Yeah, so he worked on that,
and an election season couldn't go by without Joe appearing
on all the various media outlets talking about it.
And then he's, of course, moved on from that
into a broader range of work.
He was a post-doc fellow at Princeton
with Ed Felten, who is no stranger to policy
and technology issues, and also worked
with Helen Nissenbaum at NYU.
And in the last year, has come down to Washington DC,
has joined the beast, and now works
at the Center for Democracy and Technology.
He's their first ever-- is that right? --chief technologist.
JOE HALL: Yeah.
Well, we had a chief computer scientist before me,
but I didn't feel comfortable.
MALE SPEAKER: Right, and computer scientist is too small
of a title for Joe Hall, who's the Chief
Technologist at the Center for Democracy and Technology,
a fantastic online civil liberties organization
and advocacy group.
And also, just because at Google we love our well-roundedness,
in addition to all the other things that he's done,
Joe has also been a pastry chef.
He has been a ranch hand, and he's also
instructed yoga, been a lifeguard,
and a riflery instructor for a youth camp.
By the way, if the apocalypse occurs,
I want this guy on my side.
And with that, I want to pass it over to Joe Hall for a talk
and, hopefully, some time for questions.
If you have questions, please save them until the end.
And also, if you are on GVC, please leave it
muted to avoid further technology disruption.
Thank you so much.
Joe.
JOE HALL: Thank you, sir.
Yes.
[APPLAUSE]
JOE HALL: Oh.
So I have quite a slide deck here for you.
I think I'm going to try and go for 40 minutes,
and I may skip over some things.
And if I skip over something that's
near and dear to your heart for some reason,
let me know, and we'll go back during Q&A.
But I want to make sure there's plenty of time
for Q&A. Q&A will not be recorded,
so feel free to do whatever you want to do during Q&A.
[LAUGHTER]
JOE HALL: So what I want to talk about here is tech policy,
and specifically, I've been in DC for a year
now after leaving academia.
And it's been a crazy year, man, and I just
want to give you a flavor for how crazy it's
been for what's going on.
And what I want to do is talk a little bit
about the FBI and backdoors, the NSA and what
we call the cryptapocalypse, and then
novel technologies like drones.
And I am the chief technologist at the CDT,
and we are hiring, by the way, not that any of you
should leave wonderful jobs at Google.
But please, let your friends know
that might be interested in this kind of stuff.
Quick outline, so I've got to tell you a little bit about who I am.
We'll do some of that.
What is CDT?
Some of you may have never heard of us.
We're a pretty low-profile kind of an organization. And what I do.
And then talking about how does tech policy work inside DC?
How does anything about technology get done?
Is it as frustrating as you may think
it is here in Silicon Valley?
And then walk through some of these specific case studies.
So who am I?
Well, I have a background in hard sciences.
I did my undergrad at Northern Arizona University
in Flagstaff, Arizona, where I modeled planetary atmospheres,
which is a whole lot of fun.
And we actually-- oops.
Oh, I see what's going on.
We actually found clouds on a moon of Saturn,
called Titan, that has sort of a methane cycle,
just like the Earth has a water cycle with oceans and rain.
And on Titan, they actually have that, but with something else.
It's very, very cold.
Anyway, that's just to give you an idea
of some of the stuff I've done.
I tend to tell people I hacked voting machines for my Ph.D.
It was much more complicated than that.
We were on the policy side of this stuff.
We were funded by an enormous grant from NSF, a $10 million
grant for five years, to basically think about
how do we make these black boxes a little more trustworthy.
And we were invited, as you can see,
we were invited to hack voting machines
for the state of California.
This is California's current Secretary of State, Debra Bowen.
A lot of interesting people here,
Matt Blaze, Alex Halderman, Deirdre Mulligan,
Ping Ye, who is a Googler.
I don't know if anyone else here is a Googler.
Many of them have been interns.
And you can see right here, when we did it
for Ohio on the right, this is resetting
all the administrative passwords on this voting machine using
a magnet and a Palm Pilot we just
had sort of sitting around.
Yeah, it's that bad, and it's actually
kind of a little bit worse than that these days.
The stuff that I work on regularly
involves consumer privacy, health technology,
national security, cyber security,
and technical standard setting.
I still do some e-voting and some space policy stuff.
You can kind of think of me-- this
is why I call myself a technologist.
I'm sort of half a lawyer and a half a computer scientist,
but my degree is from a school of information,
so it's like a Ph.D. in information of all things.
But I say that because this is my Ph.D. committee.
Pam Samuelson was my Ph.D. adviser on the left,
and Deirdre Mulligan was my grant supervisor next to her,
and those two were law professors.
And then David Wagner at Berkeley
was sort of a computer scientist,
very, very well respected computer security guy,
the guy I often say what would David Wagner do
when I do things.
I may not do that thing, but I at least ask that question.
And then [INAUDIBLE] Cheshire on the far right,
who does social psychology.
So what is CDT?
The Center for Democracy and Technology
is a nonprofit think tank.
We're sort of half think tank, half advocacy outfit.
We're a bunch of nerds, and we do a whole lot
of actual advocacy work in DC.
We're about 20 years old.
We used to actually be EFF's DC office, the Electronic Frontier
Foundation's DC office.
There is a little bit of a falling out of sorts,
and we broke off about 20 years ago.
I can talk about that.
Where does our support come from?
Largely foundations and corporate
support, so not half and half.
It's much more complicated than that.
You can go to cdt.org financial and add it up yourself
if you want to figure these things out.
But this sort of underlines our principles, which
involve empowering people with digital technology
and the internet, and this means that we
want to have things that are forward looking,
very progressive types of collaborative solutions,
and tangible pragmatic policy outcomes.
And this often involves making sure we bring businesses
into the conversation.
We get a lot of flack for taking corporate money,
but at the same time, we think that it's really a good idea.
The internet doesn't just exist without the economic case
for doing things on the internet,
and we think it's really important to make sure
that's done responsibly and that things don't chill that.
And a lot of what we do is never public,
which people-- it's very hard to fund things with grant
money, foundation money, when a lot of what you do
is just not public.
When it comes down to face to face meetings and stuff
like that, you may not be able to say here's
a big press release, because it may
deep six entire relationships you have.
It's important to think a little bit
about what technology policy is.
The basic way to say it is it's the rules
that we set about technology, and it's not just laws.
If you have two small kids arguing over a very large piece
of cake, you can set up a really easy experiment
because they both have the interest of getting the most
cake, where one cuts and the other gets to choose it.
And that is essentially aligning incentives
to get the best outcome, the outcome being both kids
are somewhat convinced that they're happy,
and they participated in the process,
so they know what's going on.
But it's not just sort of social outcomes.
It's also technical, so to what extent do the things
we do result in better things happening,
better technology being able to exist and thriving.
And economic interests, to what extent
does it help the country, and other countries,
sort of build innovative products
and employ people and things like that?
This can mean that certain things are prohibited
when you make rules about technology,
and if you're a technologist-- I don't
know how many of you in the room are, but maybe a lot of you.
Technologists can often be frustrated when someone tells
them, oh, you can't do that, and so for example,
you cannot currently take satellite imagery that is finer
than about 50-centimeter resolution, at least in the US.
If you're a US company, you cannot put a satellite
in the air and then sell the products that come out
the other end.
That means you guys at Google can't actually
use products that have things that
are finer than that resolution.
There's a company, DigitalGlobe,
currently petitioning the US government
to be able to launch a satellite that would do that,
and they're trying to make the case,
look, there's not a lot more that's
going to be able to-- you know, 25-centimeter
resolution doesn't mean you're going
to be seeing faces and things that
might be problematic from a privacy perspective.
Or particular installations, that's
not going to make a big difference.
Now, we should also mention we're
going to be able to start doing things on the consumer end
really soon that can be kind of scary, print gun parts,
print molecules like Sarin gas.
How do you control that when it's a very cheap consumer
technology?
Can you control that?
Do you have to resort to more informal methods of control,
like norms-- telling your kids, "Don't make sarin gas"?
I don't know.
You also can't do certain things.
Like for example, you can go buy a pager right now,
hack it, and eavesdrop on all the pager traffic around you,
very similar to plugging into an ethernet port.
If you have the right technical capabilities,
you can read a lot of traffic that's flying around,
and your cellphone broadcasts stuff, too.
You can't just do those things just because they're
technically possible-- and maybe I'm
preaching to the choir here.
There's reasons we have rules that
make those things hard to do, or at least to disincentivize
people doing that by punishing them later.
So in DC specifically-- and if you've been in DC for a while,
you sort of know how this works, so
feel free to go to sleep for about five or 10 minutes.
There's various sort of constituencies and stakeholders
that we work with to get things done.
Congress is obviously a big one, lately
not passing laws so much, but preventing bad laws
from being passed.
In Congress, there's all these support staff
called staffers, legislative counsels,
various types of people.
They're so key to talk with when you're in DC in the sense
that they're often the ones that you not only have
to explain very complicated topics to,
but you have to explain them in a way
that they can then explain them to their bosses.
So it's not just, oh, I get it.
It's about, no, look, this is exactly what
you need to be able to say to your boss in order
for them to have a hope of understanding
what we're talking about.
You do that sometimes through briefings,
so you'll go in and talk to a single staffer,
or a whole bunch of them, or maybe a whole room
like this full of staffers, and it's like pedagogy in a sense.
You're teaching them something.
You're trying to show them your point of view.
But it's people who have maybe 30 minutes between one
high-powered negotiating meeting where they're doing something
on nuclear waste, and something else
where they may be trying to get to different kinds of visas
for immigrant workers to come in.
We do end up-- people will hand us bills
and say, what do you think about this law
that we haven't released yet?
Or should we throw it in the trash?
And we have to say it doesn't make sense for us to look at that.
Because the second you start working with people
and saying this is how the law should work,
it becomes lobbying.
When you start to work with the executive agencies or Congress
about how laws should look and how they should pass,
and how this one's bad, or you really
should add a feature like that to it, you're doing lobbying.
And we are not registered lobbyists,
so we have to, by law, keep it to about 20%
of our total resource allocation,
to be technical about it.
The very, very high profile and valuable way
to work with Congress is through testimony.
So we are often, at CDT, asked to come in and testify
because a number of us have expertise in certain areas.
But more often, we say, look, you don't want us.
You want someone else.
You want Susan Landau, for example.
You want Ed Felten.
You want Avi Rubin.
You want whoever it is, Kevin Fu.
You want some of these people to come in and talk
to you about these things.
A big goal of ours, in that process,
is to make it really plug and play,
to sort of have experts come in and, like I
was telling Will over lunch, and Troy,
that you don't want to have to worry about printing
50 copies of some written testimony, which
is what they ask you to do.
Bring 50 copies of the things you
just spent two days of your time writing with you,
and you're like, wait, where am I going to do that?
How am I going to do that?
That sucks.
It's heavy.
What?
So we do that, and we also will do things like look
at the testimony and say that's going
to trigger someone's other policy agenda on this side.
They may have a dog in the fight on this other issue.
So we try and sort of make it to where experts want to come back
and do this again.
And that's hard to do because, sometimes, people
are like I'm never setting foot back in Washington DC
after the grilling I got by that Congress member.
And if that's the case, that's too bad,
but we hope to make sure that we have people come in regularly.
We do work with the executive branch quite a bit,
so this includes the White House and a slew of agencies.
This is sort of like the alphabet soup slide,
and there's some-- unfortunately,
this is not a technical alphabet soup,
and so it's not nearly as fun as some of the ones
later, where I'll talk about technical stuff.
But the White House has various sort of arms that we work with.
NSS for the National Security Staff,
which has been a huge challenge lately.
There's just a whole lot going on,
and we'll talk a lot about that later.
OSTP, the Office of Science and Technology Policy.
We actually have, at least at the National Security Staff,
we have a former CDT employee who's
the head of the civil liberties and privacy part of that,
and so he's extremely busy these days.
And USTR, the US Trade Representative,
how do we work with other countries,
and how do we negotiate certain kinds of instruments that
make sure that the US is on equal footing
with other countries where the things we care about
are embodied in what we do?
And that can be a challenge because, on the copyright side,
the US tends to be very, very aggro--
very, very aggressive-- about things.
Whereas what we want to see is-- like, it's
very hard to use IETF standards in Europe
because they favor government-developed
standards, not open consensus kinds of standards.
In terms of agencies, there's just a ton of them.
Department of Commerce is pretty big,
and we work a lot with them in order to say here
are the incentives involved in what we do.
The NTIA, the National Telecommunications
and Information Administration, is the White House's adviser
in terms of internet policy, and so
if you can convince some of them to care about your issue,
then you're doing a good job.
And again, one of the CDT's former staffers, John Morris,
is at NTIA.
And NIST is what I've been doing a lot lately.
National Institute of Standards and Technology
makes all sorts of standards about cryptography,
computer security, cyber security, stuff like that.
And this is all to say there's a bunch of them: the Federal Trade
Commission, the Federal Communications
Commission, Health and Human Services,
the FDA, which is a part of Health and Human Services,
in terms of medical device rules,
and the Department of Transportation when
it comes to drones-- the Federal Aviation Administration.
The thing that I want to get across here
is, with regulatory proceedings--
so those are things like RFIs, requests for information;
ANPRMs, advance notices of proposed rulemaking;
and NPRMs, notices of proposed rulemaking.
These are extremely important for people to be involved with.
In your private capacity if you can, but it's very important
to hear from industry and trade associations,
blah, blah, blah, whatever you want.
RFIs are a way, for an agency, of saying, look,
we have no clue what's going on here.
We need some input.
Please, come to us and write a short little paper
and tell us what's going on.
ANPRMs, advance notice of proposed rulemaking,
says there's a law that says we have
to write a rule about some subject.
We're thinking about writing it like this.
We don't know if that's the right way to do it.
Please, come in and let us know if that's right.
And then NPRM is like, OK, here's what we're going to do.
Because, I should have said this earlier,
but when laws get passed in Congress,
they're typically very, very high level.
They typically say go do x, y, or z,
and then you have to actually operationalize that.
You have to go and write the specific rules down and get
them passed, such that that works.
So like for example, healthcare.gov, there
are a ton of really crazy thousands
of pages of rules that talk about how the exchanges work,
how things are supposed to work.
There's not a lot of information about the technical stuff
in this, which is kind of a pain in the butt for those of us
who need to talk to the press or other people
about why it's sort of failing so amazingly.
And I'm not talking about that here.
We can talk about it in Q&A. There's just
some really horrible things going on.
We also participate in workshops and proceedings,
so I'm going to be speaking at a Federal Trade Commission
workshop on the Internet of Things, which
is a-- I guess it's been a-- I was going to say it's
a marketing term, but it's been around for a long time.
So you can't just sort of write it off like that.
But I'm talking about telehealth stuff on the 19th,
and we'll see what happens.
It should be pretty interesting.
We don't do a lot of court stuff, like litigation.
We very rarely do that.
I think the last time we did it was something called
Reno versus ACLU, which makes sure
that-- it's about intermediary liability.
This is why YouTube can have people come and put things
up and not get in trouble themselves for things
that people put up.
We do file as amici, so amicus briefs.
That basically means a friend of the court brief,
where we come in and we're not a party to the matter,
but we want to say, look, these guys are duking it out.
But we think you really need to care about something else
that neither of them are going to tell you about,
or they're not going to value as much as we do,
like the public interest for example.
But we do, in litigation, support people or partners
that do litigate and are all over the place,
like the Electronic Frontier Foundation, the ACLU
chapters in every state, Public Knowledge.
There's a lot of copyright and intellectual property stuff.
Press, so this is the last way that we
work a lot with people in DC.
Press is amazingly important and really effective
when you do it well, and man, I have a lot of challenges
with doing press well.
I was on Glenn Beck's-- Glenn Beck has a cable news network,
or cable network, called the Blaze apparently.
I didn't know this.
But I was on it, and they really are
on Obama's *** about the whole NSA thing,
and they wanted to come in and someone to criticize it
and say why are they doing all these things?
And we're like, oh, happy to come and criticize it.
We think it's wrong.
It's not a partisan thing at all.
It's that you shouldn't collect that much stuff.
But the disconcerting part about this
is that when I want to be doing a technical analysis
or writing a paper or interfacing with someone to get
something done, it can be like half of my day some days.
Especially if there's a Snowden revelation lately,
I have to clear things for about a day and a half,
and it's just a pain in the butt.
But you've got to do it, or they'll never
come again, which is not what you want.
But it's really important because you
have this core part of what people like me do,
which is translation.
I often have to talk to technical people about the law,
like yeah, you did that.
You can't do that.
Here's why you can't do that.
Or talk to lawyers about technology.
Here's how that thing, IPv6, works.
Here's why you should care about it.
Here's how you want to change what you're talking about
and how you're writing about it.
It can be a big challenge, and what
I mean by that-- so for example, here
is an article from the other day, where
I was quoted talking about healthcare.gov.
And the quote comes out, the beginning-- and there's
like eight other people quoted in this article.
But the first quote and the second one,
if you can't read it, it says "Some of these things
are really amateur hour.
This might just be an error, but you could not
pass an undergraduate computer science
class by making these mistakes."
And it's very aggressive, and CDT
doesn't tend to want to come off that aggressive.
But at the same time, we were seeing things
like Referer headers leaked to Pingdom and doubleclick.net,
and these Referer headers had usernames and password reset
codes in them.
And it's this long URL with all these ampersand encoded things,
and we're just like I don't know what any of that crap is,
but that is bad, bad, bad.
That would be a fail on a web security class.
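To make the mechanics concrete: when a page URL carries sensitive query parameters, every third-party script or image embedded on that page can receive the whole URL in the browser's Referer header. A minimal Python sketch, with invented values:

    from urllib.parse import urlparse, parse_qs

    # Hypothetical page URL of the kind described: the username and reset
    # code ride along in the query string. Any embedded third-party
    # resource gets this entire URL as the browser's Referer header.
    referer = "https://example.gov/reset?username=jdoe&code=8f3a9c2e"

    leaked = parse_qs(urlparse(referer).query)
    print(leaked)  # {'username': ['jdoe'], 'code': ['8f3a9c2e']}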
And then at the very end, the problem
is even when you get quoted, sometimes, they
don't get the quote right, and you sort of
have to live with it.
So at the very end, it says, "When
it comes to functionality, the last thing you think about
is security."
Being experienced in this kind of thing,
what I tried to get across was people
are trying to make things work.
They don't think about the adversaries.
They don't think about ways that people
could mess with your stuff, and that's
what I tried to get across.
But it makes it look like I'm saying, oh,
security's the last thing you do.
But I did follow up there by saying, look,
you need to think about it.
It needs to be so deeply ingrained that you can't just add it on later.
This is something that we've learned about over and over
again from failures.
You can't add it on later.
The last part, and the most important part-- and this
is why it's so important that people
like Heather West and others-- and there are people, Pablo--
all these people are in DC.
If you've ever interacted with them,
it's because you have to meet with people face to face.
You have to talk directly to other companies about things
that you may find to be of common interest,
and then trade associations, people
who can sort of, like a union does,
aggregate all of various companies
in a sector to say things of real import.
There's a whole bunch of other people we talked to.
To give you an example, this is the way
that we-- two things we really value at CDT are convenings
and expertise, so we'll often bring people
together in a very closed-door meeting, Chatham House rules,
or something we even call stricter than Chatham House rules.
The Chatham House rule means
you can talk about what someone said there,
but you can't attribute anything said to any specific person.
The stricter-than-Chatham-House rules
are that you can talk about what the meeting was about,
but you can't talk about what anyone, or any industry,
specifically said.
And so it gets kind of weird.
But we talked about-- recently, we had a convening on hack
back.
This is the idea that companies are getting
attacked constantly, and they may
want to take some measures to respond.
And how far can they go?
Right now, under the law, not very far.
You can do pretty much anything you
want inside your own network, but the second it leaves
your network, there's not a lot you can do.
And there's a really good book that I
can refer you to about this stuff.
It's brand new, and it gives sort of a really neat framework
to think about it.
But the whole point of that convening
was what are people doing, what do
they want to do, how would this be received by DC, what
are some of the risks that can be involved.
If you take down a hospital in attacking back,
and it's in the middle of nowhere China
or something like that, that's going to be really bad,
and you don't want to have that kind of collateral damage.
And that can happen because it's very hard to attribute things
on the internet.
It's very hard to know who you're actually
interacting with.
They could be coming through a hospital to attack,
and you never even know.
OK, so let's get to some of the cases here.
First thing I'm going to talk about
is the FBI and their quest to have
more backdoors into things.
So in 1994, there was a law passed
called CALEA, the Communications Assistance for Law Enforcement
Act, and this was essentially the law enforcement community
realizing that people are moving from regular landline phones,
that you could take two little alligator clips to and listen
to what they're talking about if you
get to the right little wire loop somewhere.
There was increasingly digital switching.
There was a whole bunch of new technologies
that made it very hard to do wiretapping,
and so they passed a law that said anyone
who does telecommunications has to build into these systems
a way for us to actually wiretap people.
And curiously enough, that did not
apply to sort of the burgeoning internet, at that time,
and the web, which was about to be born,
and that's a very good thing, in our opinion.
Now, fast forward to about 2010: the FBI
started to argue via their general counsel,
a woman named Valerie Caproni, who
I think now works for Boeing, that they were,
quote, unquote, going dark.
And what they argued is, geeze, people
aren't using-- it's not just about the copper
wire and the alligator clips anymore.
People are not even using phones that much anymore.
They're talking in ways, to each other, that don't even
include telecommunications protocols at all.
For example, if you think about web mail, like email,
you may start your day off at your home answering some email.
You may go to a coffee shop and answer some more email.
If they want to wiretap you using
the signals on the very end there,
they have to actually, first of all,
have a relationship with your internet service provider
to do that.
And if they're using Gmail, they will get entirely encrypted
stuff because you guys use SSL.
But if they're using someone else who doesn't do SSL,
then they have to figure out,
OK, well, that's the coffee shop you went to.
Who's their ISP?
I need to wiretap you there, too.
And so they're really worried about people sort
of hopping around so much that they
can't do it in a turnkey manner.
In fact, a number of people have argued
sort of the opposite, that now, in this day and age,
it's actually a golden age of surveillance.
The more that we do every day-- more and more,
what we do every day is mediated by things that can record what
we're doing, and there's so many signals
being passed around that it's not
so hard to imagine recording that stuff.
And this is actually Peter Swire,
and [INAUDIBLE] Ahmad wrote this for us.
It's a really neat, wonderful piece just
to point to and say, look, in fact,
it's not exactly this going dark problem.
What we, in fact, have is more and more people
collecting more and more stuff about what we do.
Then, around 2010, we started hearing grumblings
about something that we're calling CALEA II,
and this is plugging this loophole
in the original wiretapping law that
would extend wiretapping obligations to software itself.
And we think that is a horrifically bad idea.
And some of these proposals would basically
require you to write software that
was born with a wiretap capability in it, so born
wiretap capable.
And if you didn't do that, they would come to you,
and you'd get these escalating fine regimes.
And it's actually totally-- some of these things
were leaked in the press, and they got pretty strange.
Like the Washington Post claimed they would serve you
with this thing saying you need to change your software
to build in this wiretap capability, this backdoor
into it.
If you didn't do it in 90 days, you'd get a $10,000 fine,
and that $10,000 would be doubled every day indefinitely.
[LAUGHTER]
JOE HALL: That's the kind of reaction I like to hear.
You know, it doesn't take very long
to get to the entire market cap of most organizations,
or most businesses in some small countries.
A couple of weeks is all you need of that kind of thing
to totally wipe someone out.
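To put rough numbers on that doubling, here's a quick back-of-the-envelope sketch in Python, using the figures reported in the press:

    # Cumulative fine under the reported regime: $10,000 the first day
    # the fines kick in, doubling every day thereafter, added up.
    fine, total = 10_000, 0
    for day in range(1, 31):
        total += fine
        fine *= 2
        if day in (7, 14, 21, 30):
            print(f"day {day}: cumulative ${total:,}")
    # day 7: cumulative $1,270,000
    # day 14: cumulative $163,830,000
    # day 21: cumulative $20,971,510,000
    # day 30: cumulative $10,737,418,230,000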
So then the next week, about a week or two later,
it was, oh, this is going to be $25,000 a day,
and it's going to be additive.
And I'm like, OK, at least then, I
can start seeing you making a case
that this is just a cost of business
if you're going to do secure communications.
But in the end, we really don't think
that there should be any law that
outlaws purely encrypted end-to-end communication
systems, which it sounds like this would do,
sort of require you to serve clear text to people.
So we've been hearing about this, and we're like, OK,
what can we do here?
And this is one of the key ways that we operate.
We organized an expert report, so we
got a bunch of extremely smart people together.
We wrote a sort of draft, and said
what do you guys think about this?
How can we make this better?
And it involved sort of a curated list of 20
of the top cryptography, computer security, and network
security experts in the world, one of whom
is sitting in this room, Susan.
But this is a really interesting list of people,
and this is curated in a way that it's not
sort of simply people that everybody would recognize.
So there's people on here that definitely
have cachet, so the Peter Neumanns, Edward Feltens,
Phil Zimmermann, who created PGP.
He was involved with this.
But there are also sort of younger people and people
that keep a low profile.
So Eric [INAUDIBLE], who now works at Mozilla.
Fred Schneider, very, very smart professor
at Cornell, who was on, I believe, the Air Force defense
or one of the defense advisory boards,
so very top secret cleared individual
who was willing to make this argument.
Anyway, so a whole bunch of really-- Matt Green,
who's probably one of the most accomplished
and promising young applied cryptographers.
Feel free to take issue with any of this in the Q&A.
But our argument was relatively simple.
We talked about a few things.
We said these architected vulnerabilities--
what you're asking for, these backdoors--
just fundamentally undermine security.
When you engage in communications,
these are things that are designed
to undermine your ability to speak privately and securely
with someone, and it's designed to do so
in a way that is undetectable, which is fundamentally
a problem.
These things are backdoors.
These kinds of backdoors in endpoint software and products
are really difficult and messy.
So for example, if you do say, OK, FBI tells me, via this law,
we have to build a backdoor into what we're doing,
how do you manage that?
How do you manage access to that?
If you get a valid law enforcement request
from Iran or from China, what do you do?
So you say, OK, well, we have a business case that we're not
going to do that or whatever.
Or you get one from a police officer
in a tiny town somewhere in America,
and you may not be-- especially if you're a mom and pop shop,
you may not have the capability to sort of understand
what's going on there or the desire
to think about those kinds of things.
And this sort of slam dunk argument,
it's always good to have one of those,
is that this is just not going to work.
If you put backdoors in these products, a lot of them
are open source-- Chrome, Firefox, OpenSSL.
You can see these things and rip them out.
The things that are not open source,
you can have binary patches.
That's what a jailbreak is: a binary patch to something
to make it do something else.
And finally, you could actually procure this stuff
from some country that doesn't have a stupid law like that,
so it doesn't make any sense to do any of this.
We took a small group of these people, so Ed Felten, Bruce
Schneier, Eric [INAUDIBLE], and Mike [INAUDIBLE],
a former student of Matt Blaze who's
now a professor at Georgetown, to the White House
and the Department of Commerce.
The Department of Commerce was very much sort of-- it
felt like they wanted ammo to try and shoot
these kinds of things down, whereas in the White House,
it was very clear that they were sort of listening
but not playing any of their cards or showing us anything.
They didn't really make any commitments.
They didn't seem to be very shocked.
I think this is sort of something
they expected and heard before.
Now, the key part of this, and something
that I have to emphasize, is that this is only
the first part of the battle against the FBI in terms
of backdoors.
It didn't deal with centralized services.
Our argument was all about endpoints,
and unfortunately, and sadly, if you, as someone in the middle,
have clear text, if you have access to the stuff,
you can be compelled to give it over.
And so there's examples: Hushmail, about 10 years ago.
Lavabit, most recently, shut itself down rather than
give up 400,000 of its users to, I believe, the FBI.
We're going to need help making these arguments
about centralized services.
For example, how does lawful intercept undermine security?
I need to be able to make that argument sometime
in the future, not anytime really soon.
But I think this underlines that, especially at CDT,
and I think hopefully, people in this room
agree, that you can't tap everything.
Technically, there will be something that you cannot
listen to.
I think that there's a lot of interest these days
in the venture capital community about end-to-end secure
communication tools of various types.
There's just a ton of action there,
and that was before Snowden.
That was a thing before Snowden.
It's definitely bigger, now.
And this may sound kind of sci-fi, but it's true.
We will have things you put in your head
to computationally support cognition
at some point in the future.
Maybe it's farther away than some people think,
but who knows.
It's going to happen.
That stuff should not-- you should
have an enclave inside your head where you
can keep your secrets to yourself.
I mean, we need some sort of-- I don't know what we need.
We need something that basically makes that happen.
So that was all in play, and then Snowden happened.
So that was May 16 when we wrote that, and then June 4 is
when Snowden happened.
And so to transition into another case study,
the cryptapocalypse, and I have to go a little bit faster here.
And what I mean by the cryptapocalypse-- so
cryptography involves math that is hard.
So math that you can do easily in one direction,
but it's very hard to do in the other direction,
like really hard, like billions of years of computational power
hard.
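The classic example of that asymmetry is multiplying versus factoring. A toy Python sketch-- real keys use primes hundreds of digits long, not these tiny ones:

    # Easy direction: multiply two primes. One operation.
    p, q = 1_000_003, 1_000_033
    n = p * q

    # Hard direction: recover p from n by trial division. Even for these
    # tiny primes it takes ~500,000 attempts; at real key sizes this same
    # search is the "billions of years" part.
    d = 3
    while n % d:
        d += 2
    print(d)  # 1000003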
This year has been, since June, a real pain
in the butt for some of us just because every week you
get this new revelation, and we have to make sense of it.
We have to make sure that we have a policy response,
we've got a press response, blah, blah, blah.
But for now, I'm going to focus on two things.
I need to move a little quicker than I thought,
but Bullrun and something called MUSCULAR.
Bullrun is essentially the NSA's program,
and the GCHQ, the UK version of NSA,
the Government Communications Headquarters.
It's not a very descriptive title.
But they're using tricks, as Bruce Schneier says,
to undermine security.
They are either stealing or compelling people
to write certificates for them, SSL certificates.
You can think of it as stealing encryption
keys or compelling people to give you the keys.
It seems like they're planting moles in internet
and technology companies, so someone in this room
may be paid by the NSA in addition
to being paid by Google.
Who knows?
And we've seen some companies actually
institute military security protocols,
to where very sensitive things are handled
by two people, and you're hopeful that both of those people
aren't moles.
And they've been planting backdoors in standards,
and when someone comes with a set of source code to the NSA
and says please help us evaluate this implementation
of a cryptography product, they will say,
oh, here's a couple of things we found that were problems.
Here's a couple of vulnerabilities,
and they're also keeping some for themselves to use later.
And MUSCULAR, you may all know about.
They're up on your data center traffic, apparently,
and they're using the sort of lack
of constitutional oversight in foreign countries
to tap that traffic.
Obama, at some point, realized that this wasn't going to stop
and decided he'd put together a panel to study the thing
and tell him what he should do.
We were all a bit disappointed to see
that it was, essentially, all law
professors and all insiders, so everyone
had some sort of involvement with the Obama administration.
Although, Peter Swire is one of them, and Peter Swire,
if any of you know him, is awesome.
He wrote that thing about the golden age of surveillance.
Definitely a skeptic about surveillance.
CDT, Dan Auerbach-- if you knew him,
I think he used to be a Googler--
and EFF basically responded in a public comment
kind of form, so this is not an expert report that we're
going to put up here, but this is sort of an argument
that we want to make that a lot of people believe in.
And 47 people signed onto it.
You probably can't read this, but there's
a lot of really neat people here.
I'm going to bring a few to your attention.
We're trying to continue to sort of bring
in a bunch of different kinds of people,
so there are some-- Ross Anderson
and Stephen Farrell are both UK based
security experts of various types.
Joan Feigenbaum, we have some sort of theoretical crypto kind
of people, people who do sort of bigger thinking,
and a whole bunch of scrappy activists.
Ash [INAUDIBLE], Runa Sandvik from the Tor Project,
although they didn't list that affiliation here.
Morgan Marquis-Boire, whose affiliation here
is with Citizen Lab, but he is a Googler, very, very smart guy.
We made the following argument, which
is that if you're going have a secret court that's
going to look at technical topics,
you need that secret court to have someone
with technical expertise on it to evaluate the arguments that
are being made.
Not only that, but we pointed out
how not having a technical person
in the room for this stuff resulted
in a massive overcollection of surveillance data in the past,
specifically with something called
MCTs, multi-communication transactions.
That's like they grabbed a Gmail screenshot as it flies by,
and so it has a bunch of to and from email traffic on it.
One of those may be a foreign person,
which means they can keep it, but all the rest aren't.
And they claim that it's not technically possible
for them to sort these things out,
which is *** because, if you look at the JSON
structure for that, you could very easily grab
just what you wanted.
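The actual capture format isn't public, so this is a hypothetical structure with invented field names, but it shows the point: if each message in a grabbed transaction carries its own sender and recipient, keeping only the targeted traffic is a one-line filter.

    import json

    # Hypothetical multi-communication transaction: several messages
    # scooped up together, only one of which involves the actual target.
    transaction = json.loads("""
    [
      {"from": "target@example.org", "to": "a@example.com", "body": "..."},
      {"from": "b@example.com",      "to": "c@example.com", "body": "..."}
    ]
    """)

    targeted = [m for m in transaction
                if "target@example.org" in (m["from"], m["to"])]
    print(len(targeted))  # 1 -- everything else could be discarded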
And there's other things, like call detail records,
which they're grabbing all of our call detail
records every day in the US.
Apparently, at one point, they included cell towers.
They don't now, they tell us.
I don't know how we're supposed to know that for sure,
but at some point they did.
Because when the cops come and ask a telecommunications
carrier for call detail records, they typically
get the cell towers you connected to when you made
calls.
That's very easy, and increasingly easier,
to use to pinpoint your location throughout the day.
We talked about wholesale attacks.
These are non targeted.
You're weakening everyone's security.
But I think the most important argument that we made,
that got Ross Anderson and Stephen Farrell on-- this
is something that we're going to be making
a lot more noise about very soon.
--is that the US's commitments to civil liberties and privacy
really demand that we treat non US persons' data with respect.
We can't just sort of say, oh, they're foreigners.
We can do anything we want to them.
That is just totally offensive.
It's especially offensive to anyone
who runs a global internet business.
Part of these revelations revealed
that the NSA is doing some things that-- it's
kind of hard to tell exactly what they're doing,
but it seems to be clear that they planted a backdoor,
that there may be a backdoor-- we can't know for sure.
--in this particular thing called DUAL_EC_DRBG.
It's a random number generator.
In cryptography we have to generate
a lot of random numbers very quickly.
You can't just use dice.
It would take you forever, and it would not be fun.
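For a sense of what's at stake, here's what typical application-level requests to a random number generator look like-- this sketch uses Python's OS-backed CSPRNG, not DUAL_EC_DRBG. If the generator underneath is backdoored, every one of these outputs is predictable to whoever holds the trapdoor.

    import secrets

    # Typical requests an application makes to a CSPRNG; a busy server
    # issues calls like these constantly.
    key = secrets.token_bytes(32)    # e.g., a 256-bit symmetric key
    nonce = secrets.token_bytes(12)  # e.g., a 96-bit AES-GCM nonce
    token = secrets.token_urlsafe()  # e.g., a session or reset token
    print(key.hex(), nonce.hex(), token)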
But NIST actually, when they heard about this,
thought long and hard about it for a few days, and then said
please don't use this thing.
We think there's a problem with it.
If you think there's a problem with it, let us know.
Please don't use this thing.
And then you saw RSA Security, which
makes a lot of libraries for cryptography products,
issue a product recall.
The first product recall I know of,
and I'd love to be proven wrong, where
it was essentially someone recalling
a product because their own government had sort of hacked
their stuff.
That was not a good thing to see.
The fact that they had been accused of-- or had;
it was unclear whether they'd inserted a backdoor in this one
random number generator-- caused people
to start looking around at other things,
and this curious case of SHA-3 came up.
SHA-3 is a hash standard, something
used to make cryptographic hashes.
There was a five-year competition to replace SHA-1.
SHA-1 is based on a way of doing hashes that
had been attacked, and basically
everyone thought that, at some point, it would fall entirely,
and you wouldn't be able to use any of that stuff anymore.
The winner of the competition, in October of 2012,
was one called Keccak.
It's a really fascinating thing.
I can talk about it more if you want to hear about it.
It turns out it was modified, after the competition,
by NIST before they wrote it up into a long, boring document
that says if you're going to build something that does this,
here's how you do it.
And a lot of people are like why did you modify that?
Why did you reduce the security of it?
And there were a number of different arguments.
So when I heard about this, first of all,
I tried to make sense of it, and so I wrote up
a little blog post that basically
said what the heck is going on with SHA-3.
And I figured the cryptographic community isn't that big.
Nerds who care about this, there's not that many of them.
I'm going to put this up.
It totally took our server down.
I don't know what it was.
Part of the problem is, because of who we are,
we don't drop cookies, and we don't
drop locally stored objects, which
means we can't use things like Cloudflare to sort of-- anyway,
I could talk more about that.
But anyway, so it brought our thing to its knees,
and we're like, oh, geeze, people care about this.
I met with the White House, two people at the White House, Tim
Polk and Ari Schwartz, and [INAUDIBLE]
really concerned about this.
Why don't you just standardize the thing
that won the competition?
It looks bad if you don't.
Turns out, that resonated, and in fact, SHA-3
is now going to be standardized, in its
as-won-in-the-competition form.
It's going to have all the strong secure-- I mean,
this is like pre-image and collision resistance of two to the 500-- I mean,
it's just so enormously huge.
It'll probably be a very long time before anyone actually
cracks it, but the whole point was to have a really strong one
for people who do really crazy things like elections
or certain kinds of mixed [INAUDIBLE].
You need a whole lot of collision resistance.
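Just to make that concrete, here's what using the SHA-3 hashes looks like in Python-- hashlib has had sha3_256 and sha3_512 since Python 3.6:

    import hashlib

    msg = b"ballot #1234: candidate A"
    print(hashlib.sha3_256(msg).hexdigest())
    print(hashlib.sha3_512(msg).hexdigest())

    # Change one character and the digest is completely unrelated --
    # that unpredictability is the collision resistance that election
    # and mix-net designs lean on.
    print(hashlib.sha3_512(b"ballot #1234: candidate B").hexdigest())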
Anyway, but NIST is also doing some review
of their entire cryptographic standards suite.
Let's talk about drones.
[LAUGHTER]
JOE HALL: So drones are really interesting because they're
sort of a novel technology.
There's nothing quite like them before.
It's really new in DC, and I'll try and be really quick here.
They have a whole lot of promise.
In Japan, 100% of crop dusting is done
with drones, unmanned aircraft, and the whole point
is crop dusting is extremely dangerous.
You often have windbreaks at the end of the field.
You're really low, and if you don't pull up quick enough,
you hit a bunch of trees, and you die.
And it's horrible.
It's useful for a whole bunch of things.
There is a very serious potential
for abuse with unmanned systems, especially
these kinds of systems.
They're really tiny.
You often hear them before you see them.
They're very, very cheap, so right now, you
can get one for about $300, a quadcopter from Parrot.
There are no humans on board, and that's a good thing,
but it means that they can go places that planes can't.
So it can go between buildings, and it
can do things that planes can't do.
It can handle crazy G-Forces.
Some of these things can stay aloft
for many, many days, weeks in some cases.
These are gas powered, obviously, not battery powered.
And they're going to, very soon, have all types of surveillance
packages that manned aircraft have.
Gigapixel cameras, so being able to get centimeter resolution
in a place like Manhattan can be very useful
for a variety of things.
LiDAR, laser kinds of things, can
see through foliage, and image what's
behind sort of diffuse kinds of barriers.
Synthetic aperture radar could see through your drapes
at home, and it can tell if something in your front lawn
has moved a couple centimeters, and things like that.
And these aren't things that people really understand,
but we should understand them, or should begin to understand them.
The FAA has to integrate drones into the national airspace
by 2015, and they're going through a regulatory process
to do that.
We're really thinking a lot about privacy and safety
for drones.
You know, how do other aircraft know what's around them?
If you've ever flown aircraft, birds are a real problem.
Well, think of these as metal and plastic birds
that are flying around that you might smack into.
And how do people in their backyard
get an idea of what's in the sky above them?
We imagined a pretty simple analogy.
For vehicles, some of this stuff works with just a license
plate.
License plates aren't there just for the cops.
It's for you, too.
If someone does something, or someone left their lights on--
there's a whole bunch of uses for sort
of a meatspace cookie for vehicles.
And planes have something similar
in terms of an N-number on their fuselage or their tail.
So I thought what about a license plate for drones?
Well, an N-number is way too damn tiny.
You'd need a telescope to be able to visualize that,
and that's not going to work.
So what about a radio frequency license plate?
What if you broadcast an identifier?
And so I wrote this thing that's kind of-- I
hadn't thought about this much.
I don't know anything about flight regulations.
We got a lot of feedback, and they clued us
into this thing called ADS-B Out, automatic
dependent surveillance-broadcast, Out.
This is, essentially, a radio frequency standard required
by the FAA for all aircraft by 2020.
It broadcasts an identifier, position, velocity,
altitude every couple seconds, and it's
used on manned aircraft to sort of replace-- when you do radar,
you get an echo back from every sweep,
and you have to extrapolate where the plane's going,
how fast it's going, where it currently is,
and you have a very big error radius on that value.
If it knows where it is, and it tells people where it is,
and it's a very precise type of thing,
you can actually reduce that error radius and pack planes
in for landing a whole lot quicker.
This is sort of how it works.
The planes have very fine GPS, or GLONASS,
the Russian version of GPS.
And then they broadcast this stuff,
and other planes can see them, and ground based receivers
can see them, too.
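To make the broadcast concrete, here's a simplified, hypothetical view of what one position report carries. The field names are invented; real ADS-B messages are packed 112-bit Mode S extended squitter transmissions, not records like this:

    from dataclasses import dataclass

    @dataclass
    class AdsbReport:
        icao24: str        # 24-bit Mode S hex code, e.g. "A1B2C3"
        lat: float         # GPS/GLONASS-derived position
        lon: float
        altitude_ft: int
        velocity_kt: float

    # Broadcast every second or two; any aircraft or ground receiver in
    # range can pick it up -- no radar sweep or extrapolation needed.
    print(AdsbReport("A1B2C3", 37.422, -122.084, 1200, 45.0))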
We basically said, OK, FAA, you need
to have a registry that has who owns--
UAS is another word for drone.
Ask me why later.
You need to talk about what kind of thing this is.
What kind of airframe type?
What weight does it have?
What's its wingspan?
How big is it?
And then it needs to have a data collection statement
that shows here's all the stuff that we want to do.
Here's what we're going to collect.
Here's what we're going to share it for.
Here's how long we're going to keep it.
Here's someone to talk to if you get upset.
And you should issue something called a Mode S hex code, which
is the thing you broadcast on your airplane.
It's a 24-bit identifier.
And then a U-number, sort of like an N-number,
something that's human friendly.
You can think of those as an IP address and a domain name,
if you want to.
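Here's a sketch of how that registry pairing might look, with the 24-bit code as the machine identifier and the U-number as the human-friendly handle. All values are invented:

    # Hypothetical drone registry entry keyed by Mode S hex code.
    registry = {
        0xABC123: {
            "u_number": "U-12345",  # human-friendly, like a domain name
            "airframe": "quadcopter, 2.1 kg, 0.5 m span",
            "data_collection": "visible-light video, retained 30 days",
        },
    }

    hex_code = 0xABC123  # what the airframe broadcasts, like an IP address
    print(f"{hex_code:06X} ->", registry[hex_code]["u_number"])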
We said everyone should have to broadcast this thing,
but it's unclear if it should have to be from the airframe
or from the ground.
And the whole idea here is that both people in aircraft
and people on the ground can, in real time,
see what drones are doing.
And the FAA-- or you could have a Google Drones map
that sort of showed where all the drones were in a given
area.
In fact, you can buy stuff that does this now
because some companies already have their planes outfitted
to the 2020 standard, so you can go, for $900,
buy this little antenna that hooks up to your iPad.
For some reason, they'll only work with iPads.
Not sure why.
Maybe they just want to support one platform.
Here's another one from Garmin, a little bit cheaper.
You can use these in planes.
Maybe that's why, because a lot of the flight plan software
is iPad based.
Oops-- so we said this, and essentially,
the FAA responded and said, oh, we don't care,
which is too bad.
But there's been a lot of other interest in this,
and I think the FAA may actually mandate this.
There are a lot of complications.
If you want to have these things on aircraft,
you need a flight certified GPS, which is really expensive.
And you have to have a transponder.
So this is what these things look like.
They have a human interface, which
you don't need for a drone.
And that hardware box is about the size of a pack of cards,
so it's not that big.
And you probably can't read this,
but I actually wrote these guys at Trig Avionics that
make these things, and asked, hey, how much does this cost?
And they said, you know, we're actually
outfitting these things on drones
that have to operate in airspace that requires this right now.
It's about $6,000, so if you have a $300 drone, in order
to play, maybe you have to spend six grand to get it up.
Maybe that's OK.
Maybe it raises the barrier high enough
to where not everyone will have one.
I don't know.
Anyway, that's it.
Sorry, I went way over.
[APPLAUSE]