
October 17, 2014

New Evidence of the NSA Deliberately Weakening Encryption

Posted by Tom Leinster

One of the most high-profile ways in which mathematicians are implicated in mass surveillance is in the intelligence agencies’ deliberate weakening of commercially available encryption systems — the same systems that we rely on to protect ourselves from fraud, and, if we wish, to ensure our basic human privacy.

We already knew quite a lot about what they’ve been doing. The NSA’s 2013 budget request asked for funding to “insert vulnerabilities into commercial encryption systems”. Many people now know the story of the Dual Elliptic Curve pseudorandom number generator, used for online encryption, which the NSA aggressively and successfully pushed to become the industry standard, and which has weaknesses that are widely agreed by experts to be a back door. Reuters reported last year that the NSA arranged a secret $10 million contract with the influential American security company RSA (yes, that RSA), who became the most important distributor of that compromised algorithm.
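The nature of the alleged Dual EC back door can be illustrated with a toy model. What follows is not the real construction — Dual EC works with points on an elliptic curve, and its outputs are truncated — but a simplified multiplicative-group analogue, with all parameters invented for illustration. The generator uses two public constants that are secretly related by an exponent known only to the designer; anyone holding that secret can recover the generator's internal state from a single output and predict the entire future stream.

```python
# Toy analogue of a Dual EC-style back door, using modular exponentiation
# in place of elliptic-curve point multiplication. All parameters are
# invented for illustration only.

p = 2**61 - 1          # a prime modulus (a Mersenne prime)
h = 3                  # public constant (plays the role of the point Q)
d = 123456789          # the designer's secret trapdoor exponent
g = pow(h, d, p)       # public constant (plays the role of P); g = h^d mod p

def prng_step(state):
    """One step of the toy generator: emit an output, update the state."""
    output = pow(h, state, p)      # what the user sees: h^s mod p
    new_state = pow(g, state, p)   # kept internal: g^s mod p
    return output, new_state

# An honest user runs the generator from a secret seed.
state = 987654321
out1, state = prng_step(state)
out2, state = prng_step(state)

# The designer, knowing d, recovers the internal state from out1 alone:
# out1^d = h^(s*d) = g^s = the next internal state.
recovered_state = pow(out1, d, p)
predicted_out2, _ = prng_step(recovered_state)

assert predicted_out2 == out2      # the "random" stream is fully predictable
```

In the real construction the outputs are truncated before release, so an attacker must brute-force the missing bits, but that is cheap and does not remove the back door; the essential point, as in this sketch, is that the back door is invisible to users, who see only the two public constants and have no way to tell whether anyone knows the secret relation between them.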

In the August Notices of the AMS, longtime NSA employee Richard George tried to suggest that this was baseless innuendo. But new evidence published in The Intercept makes that even harder to believe than it already was. For instance, we now know about the top secret programme Sentry Raven, which

works with specific US commercial entities … to modify US manufactured encryption systems to make them exploitable for SIGINT [signals intelligence].

(page 9 of this 2004 NSA document).

The Intercept article begins with a dramatic NSA-drawn diagram of the hierarchy of secrecy levels. Each level is colour-coded. Top secret is red, and above top secret (these guys really give it 110%) are the “core secrets” — which, as you’d probably guess, are in black. From the article:

the NSA’s “core secrets” include the fact that the agency works with US and foreign companies to weaken their encryption systems.

(The source documents themselves are linked at the bottom of the article.)

It’s noted that there is “a long history of overt NSA involvement with American companies, especially telecommunications and technology firms”. Few of us, I imagine, would regard that as a bad thing in itself. It’s the nature of the involvement that’s worrying. The aim is not just to crack the encrypted messages of particular criminal suspects, but the wholesale compromise of all widely used encryption methods:

The description of Sentry Raven, which focuses on encryption, provides additional confirmation that American companies have helped the NSA by secretly weakening encryption products to make them vulnerable to the agency.

The documents also appear to suggest that NSA staff are planted inside American security, technology or telecoms companies without the employer’s knowledge. Chris Soghoian, principal technologist at the ACLU, notes that “As more and more communications become encrypted, the attraction for intelligence agencies of stealing an encryption key becomes irresistible … It’s such a juicy target.”

Unsurprisingly, the newly-revealed documents don’t say anything specific about the role played by mathematicians in weakening digital encryption. But they do make it that bit harder for defenders of the intelligence agencies to maintain that their cryptographic efforts are solely directed against the “bad guys” (a facile distinction, but one that gets made).

In other words, there is now extremely strong documentary evidence that the NSA and its partners make strenuous efforts to compromise, undermine, degrade and weaken all commonly-used encryption software. As the Reuters article puts it:

The RSA deal shows one way the NSA carried out what Snowden’s documents describe as a key strategy for enhancing surveillance: the systematic erosion of security tools.

The more or less explicit aim is that no human being is able to send a message to any other human being that the NSA cannot read.

Let that sink in for a while. There is less hyperbole than there might seem when people say that the NSA’s goal is the wholesale elimination of privacy.

This evening, I’m going to see Laura Poitras’s film Citizenfour (trailer), a documentary about Edward Snowden by one of the two journalists to whom he gave the full set of documents. But before that, I’m going to a mathematical colloquium by Trevor Wooley, Strategic Director of the Heilbronn Institute — which is the University of Bristol’s joint venture with GCHQ. I wonder how mathematicians like him, or young mathematicians now considering working for the NSA or GCHQ, feel about the prospect of a world where it is impossible for human beings to communicate in private.

Posted at October 17, 2014 3:18 PM UTC



Re: New Evidence of the NSA Deliberately Weakening Encryption

I am a pure mathematician connected to one of the institutes named in this post; one who is deeply concerned by the NSA revelations and who regards the primary players in Citizenfour (E.S., G.G., L.P.) as inspirational heroes.

The last part of your post is something I have often wondered about as I go about my professional life. In a sense it is connected to a much larger question:

Is it morally admissible for me to have a relationship with an entity that engages in other activities I consider morally wrong?

This is a question with no easy answers. My personal view is that this is something individuals must figure out for themselves.

Consider the following (decreasing) levels of involvement a mathematician might have with the NSA/GCHQ. Admittedly there are many possible scenarios; I mention only a few. For example, being Strategic Director of the Heilbronn Institute probably falls somewhere around level 2.

  1. Work for them, whether as a contractor (like Snowden) or directly.

  2. Receive direct grants from the NSA/GCHQ to perform (non-classified, publicly disseminated) mathematical research.

  3. Attend a conference that received part of its funding from the NSA/GCHQ. (Equivalently, attend or organize a seminar in number theory whose budget indirectly includes a component coming from the NSA/GCHQ.)

Many mathematicians who oppose a large part of the NSA’s activities (e.g., in sabotaging encryption) would never do 1, but I think would be ok with 2 and 3. Personally, I will not do 2 either, but I will partake in 3 (fwiw, with some degree of discomfort).

Thus, while I will not apply for a research grant from the NSA or GCHQ or otherwise engage with them directly (and I wouldn’t dream of working for them except for the express purpose of leaking their misdeeds), I would not stop being involved with seminars, conferences etc. in my field which happen to get some part of their funding from the NSA/GCHQ.

Is this somewhat arbitrary line I have drawn a moral error on my part? I do not know. What about an artist agreeing to an award or a grant from a democratic government (e.g. the USA) that sometimes engages in horrific wars killing thousands of people? What about a person receiving social security payments while simultaneously believing that the economy would run much better if the government spent much less money?

So, coming back to your last sentence. If you were to ask someone like me how I feel about the prospect of a world where it is impossible for human beings to communicate in private, I would tell you the truth: that it deeply disturbs me. I would tell you that for over six years I have regularly donated to organizations fighting against such a dystopia (e.g. the EFF, the Freedom of the Press Foundation, the ACLU, open-software projects, etc.). I would tell you that I sometimes fantasize about actively fighting the good fight to protect privacy and freedom, and that I only go back to doing my research because that is what I do best. I would tell you that Edward Snowden gives me hope in mankind, and that if I were ever in his position, I hope I would have the courage to do what he did.

And then I would go back and attend my conference/seminar, knowing that it is indirectly funded in part by an organization I detest, and I would continue to be conflicted about my moral lines and wonder if my choice is the right one.

Posted by: Conflicted on October 17, 2014 7:45 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Thanks for the very interesting comment.

Personally, I don’t think in terms of “morality”. I much prefer to think in terms of “the predictable consequences of one’s actions”, in other words, the likely effect of what I do, and whether it changes a situation in the direction that I want it to be changed. Maybe what I just said is what most people mean by morality anyway; I’m not trying to start a semantic debate. My point is that I view these things in purely practical terms — what will be caused by your or my actions — and not as something to be judged via an abstract set of rules.

So, what is the likely effect of attending a conference or seminar funded by the NSA or GCHQ or their counterparts in other countries? I suppose it’s the case that every participant does a little bit to normalize the presence of the NSA/GCHQ in the academic community, a little bit to further the idea that a collaboration between academia and secret intelligence agencies is a normal state of affairs. Every potential participant who decides not to go does a little bit in the opposite direction. But these are very small effects, of course. Arguably, a more effective action would be to speak up about it in person, or to find some way to try to effect a positive change.

People sometimes make the argument that accepting money from an organization you disapprove of is unproblematic — you’re taking money from them, after all. I don’t entirely accept this. In order to believe this argument, you have to be very confident that the organization has miscalculated, that they’re mistaken in their belief that they’re getting value for money. And in order to believe that, you have to be confident that you’ve correctly guessed what they want out of the transaction.

Sometimes, controversial organizations (e.g. the NSA or branches of the military) offer academic funding for fields of study that appear to have no direct benefit to them — and may indeed be of no direct benefit to them. At first glance, it’s pure altruism. But of course, that’s a very naive belief. What they actually get is social standing.

For instance, what would the British army do if it wanted to start hiring lots of mathematicians? What would you advise? The obvious strategy would be for the army to start involving itself in the UK mathematical community (which it doesn’t at present, at least to any great degree). Cheap and easy ideas would be to sponsor some conferences, give out a few grants, maybe set up some prizes or fellowships. Knowing that not everyone’s comfortable with the military, they’d be well-advised to run mathematical grants etc. with no conceivable military application (maybe even making that an explicit condition). Soon enough, it becomes normal for acknowledgements of army funding to appear on papers and talk slides and lists of sponsors, the army gets stands at conferences, and its efforts to recruit mathematicians begin to take off.

So, maybe you’re a noncommutative algebraist, and the army wants to give you money to do some noncommutative algebra. What harm could there be? What fools the army are to think they’ll get benefit out of theorems on noncommutative rings! Well, actually, I can immediately think of some noncommutative algebraists who’d see right through it and refuse to play along.

As many readers know, the hypothetical strategy that I was suggesting for the army is pretty much exactly what both the NSA and GCHQ do. It’s completely natural from their point of view. They want their organizations to be an integral part of the mathematical community. They want no one to bat an eyelid at their involvement. They want young mathematicians to meet mathematicians working for the agencies and think “Hey! They’re not the scheming sociopaths the news stories made me expect! They’re people just like us!”

Anyway, I suspect you’ve thought all this through rather carefully, and that nothing I’ve written is anything new to you. But those are my thoughts.

Posted by: Tom Leinster on October 19, 2014 9:51 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Dear Tom,

Thanks for the great comment. I completely agree with all your points about how taking money from an organization can normalize or enhance its standing and can thus be a benefit for it. I would like to add a few personal observations. None of them are particularly deep, but this seems a good place to express these thoughts.

Below, EO stands for “evil organization”, which you can take as shorthand for the NSA/GCHQ or, more generally, any entity whose actions you oppose.

  1. Even if one were to agree that taking money from the EO to do harmless pure research, or attending a seminar or conference that the EO partly sponsored, does result in a small benefit to them, this does not automatically imply that one should never do so. One has to balance this against the other (predictable) benefits of such a transaction, and also against the (predictable) harms of abstaining from it.

  2. For one, there is the benefit to mathematics. If I believe that basic mathematical research has some real value, and if there is a situation where I have to choose between significantly harming my mathematical career/harming mathematical progress in my field vs slightly enhancing the stature of EO, it may still be reasonable to do the latter. As an example, suppose I am at a university where the regular seminar in my area of basic research is partly funded by an institute that is indirectly connected to the EO. By attending the seminar, I am enhancing the stature of the said institute and hence slightly enhancing the stature of EO. But this effect is very slight; few people who come to the seminar actively make this very indirect connection in their head, and no mention of the EO occurs anywhere in the seminar mailing lists etc. On the other hand, if I were to boycott this seminar every week, the effect on my career would be quite large, till such a time as I found myself a new job.

  3. Another benefit is quite simple; any act that gets me extra money is an act that potentially frees up more money for me to donate to the EFF, the ACLU or to political candidates sympathetic to freedom and privacy, as well as potentially allows me to spend more time advocating for these causes.

  4. The point I am trying to make is that it is important to try to estimate the above effects. Acknowledging the EO explicitly in your papers arguably does more to boost their stature than going to a conference where they are a sponsor but where their name never appears (I have been to a few conferences like this). It is similarly important to measure the harm you would suffer by not engaging with the EO. There is little harm for a pure mathematician in not applying for an NSA grant (there being other funding agencies) or in not going to said conference (there being other conferences). Another example is when I was boycotting Elsevier (my main reason was their support of SOPA/PIPA). Yes, that may have been a worthy cause, but it is also true that I did not suffer in any substantial way from this boycott (there are lots of non-Elsevier journals in my field). But there are many cases where the harm (both to myself and to mathematics, and maybe even to my cause, as measured in my reduced donations to charities) is much larger if I do not engage with the EO. In such a case, e.g. if I were in the situation described in point 2, perhaps it is reasonable to compromise.

  5. None of this means that there is no third way. For example, one could be clever and avoid being in a situation like 2 above. To give another (true) example, when I was applying for jobs in the USA some years back, I omitted Illinois completely because of my deep opposition to their wiretapping law. (As it turned out, the law was struck down by the U.S. Court of Appeals for the Seventh Circuit in 2012, but by then I had gotten a job.) So, you can try to avoid putting yourself in a position where you are forced to have a slight relationship with the EO (in order not to have a huge detrimental effect on yourself), and if you are already in such a position, it is worthwhile not to be too complacent about it, and perhaps to look for a way out.

  6. There are a lot of values I hold dear, and privacy is just one of them. In being too zealous about dissociating myself from EO, I should not land myself in a situation where some other value is jeopardized.

  7. It is often easier for senior mathematicians to take a stand than for more junior ones. This is both because the harm is potentially less (senior mathematicians have more secure jobs, and are less likely to be affected by negative fallout from their advocacy) AND the gain is potentially more (more people are likely to listen to you if you are famous).

Anyway, none of the above points are particularly deep, but they indicate some of the thoughts I had at the back of my mind when I wrote that last comment.

Finally thanks for all your work, and your posts, on this topic. I really appreciate it.

Posted by: Conflicted on October 20, 2014 11:07 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Yes, I agree that the effects are small. There are all sorts of things I do in my daily life that I’m uncomfortable with (e.g. buying clothes that were quite possibly made in a sweatshop), and although I think it’s important not to be defeatist and to try to effect positive change where one can, there will always be things like this — it’s really just part of being human. And I agree with all of your numbered points!

Posted by: Tom Leinster on October 21, 2014 1:34 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

In order to believe this argument, you have to be very confident that the organization has miscalculated

I don't find that much of a stretch. Their calculation is based on the assumption that a mathematician with average principles will get the grant, but instead it went to one with radical opinions. So, their calculation was wrong. They didn't scrutinize your politics before giving you the grant, they took a risk, they lost the bet this time.

I endorse the practice of considering the consequences (which is indeed what morality means to some people: moral consequentialists). It should give you pause that the Evil Organization has also considered the consequences and came to a different conclusion. But since you have more information about your particular circumstances than they do, it shouldn't be strange if, upon reflection, it turns out that you're the one who's right.

Posted by: Toby Bartels on October 26, 2014 3:43 AM | Permalink | Reply to this

Asymmetric information

They didn’t scrutinize your politics before giving you the grant, they took a risk, they lost the bet this time.

But since you have more information about your particular circumstances than they do, it shouldn’t be strange if, upon reflection, it turns out that you’re the one who’s right.

We are talking about the NSA, here, right?

I would have thought that, if your politics were at all germane to their calculation of the benefits of giving you a grant, they would have no trouble whatsoever finding them out.

Posted by: Jacques Distler on October 26, 2014 7:44 AM | Permalink | PGP Sig | Reply to this

Re: Asymmetric information

I would have thought that, if your politics were at all germane to their calculation of the benefits of giving you a grant, they would have no trouble whatsoever finding them out.

Yes, and a longer version of my comment mentioned this before I edited it out.

The point is that they usually won't bother to find this out. (If they ever offer grants to people who would consider not accepting them, then this is already a good sign that they didn't bother.) If they don't check up on you as an individual, then you have more information on your own case than they do, not because they couldn't get that information but because they didn't get it.

Realistically, I don't expect the grant-accepting branches of the NSA or GCHQ to ask the data-collecting branches for personal information about potential grantees. Of course, I could be wrong!

Posted by: Toby Bartels on October 26, 2014 8:29 AM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Toby wrote:

[The NSA’s etc.] calculation is based on the assumption that a mathematician with average principles will get the grant, but instead it went to one with radical opinions. So, their calculation was wrong. They didn’t scrutinize your politics before giving you the grant, they took a risk, they lost the bet this time.

I’m not sure quite what you mean. Suppose you get an NSA grant. Then the NSA will be able to say that it funds you, and you will probably be contractually bound to acknowledge its support in papers that you write and talks that you give. Furthermore, if you work at a university, then the NSA will be able to say that it funds mathematics at the University of California (or wherever), and the University of California will also say that they receive funding from the NSA.

All of this contributes to the normalization of the NSA’s presence within the mathematical community.

Now, knowing some of your views, Toby, I imagine that’s not something you’d want to do. What actions would you propose to avoid doing it?

Posted by: Tom Leinster on October 26, 2014 2:29 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

What actions would you propose to avoid doing it?

People have proposed such actions elsewhere in this series of posts. The main thing that I have seen is to publish specific disclaimers. I think that there was one which suggested, tacked on to the common line about one's funders not necessarily agreeing with one's conclusions, a line about not necessarily agreeing with one's funders' activities. (These lines are not common in math papers, but the publishers should be familiar with them.) In some cases, that may not get the point across; but if it's the NSA, and in the current news climate, I think that people will understand it!

But it depends on what is required. I have never applied for a research grant, and, given my career path, probably never will; so I don't have first-hand experience with this. If there is a specific acknowledgement text required, I might not be able to print or say it in good conscience. And as you said, they can also make claims about their funding that I can't respond to; if this is likely to be more than just padding some statistics (such as if I am famous), then I'd probably better not take the money.

In any case, I don't think that I could take such money casually. It has to inspire some activity by me that I would not have otherwise done, so as to make the total benefit to them of giving me the grant negative. If the benefit that they expect for the marginal grantee is small, just a little bit higher of a statistic to contribute to their social standing, then this can be countered. But I do have to actually do something to counter it, not just think that they're suckers for giving me money.

Posted by: Toby Bartels on November 3, 2014 8:28 AM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

In the spirit of your comments, Toby, here is a relevant link or two. The U.S. Senate is set to vote on NSA reform, and the Electronic Frontier Foundation is organizing support, by petition and call-in.

There is also a letter from several large internet companies supporting passage of the bill.

Despite my bias towards wanting this bill to pass, I plan to do a little more study of its provisions. Here is the location of the actual text, with summary.

Posted by: stefan on November 17, 2014 9:31 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

I have always agreed with the Nuremberg Trials in that ‘I was just following orders’ is a completely illegitimate moral dodge, and that people remain personally responsible for the morality of their actions. Even if someone literally has a gun to their head, that person still has a decision to make. If they decide their life is worth more than avoiding an immoral act, that is defensible; but their actions are still immoral and should be punished, though likely more leniently, because the likelihood of re-offending needs to be considered in sentencing (most judges do include this kind of reasoning in reaching their sentencing decisions, I believe).

The other moral dodge that people often employ, “I was just doing my job” has always seemed to be totally indistinguishable from another excuse that people more generally would see as invalid - “I just did it for money.” For some reason, doing something “for money” is unseemly to most people but “just doing my job” is not seen the same way. I don’t quite understand why.

I’m a software developer and was faced with a similar sort of choice to the one you are discussing. I couldn’t take a job and tell myself that I am still a good person because I’m “just doing my job” when actively engaging in immoral activity like spying (even through automated analysis) on people. But I got a job with a defense contractor, doing completely civilian work, and nothing which could even be re-used without my knowledge to support aggressive ends. In fact, the work I was doing wasn’t just neutral; it legitimately helped protect people without violating anyone’s rights at all.

So my choice was, for me at least, quite complicated. In accepting the job, I was performing a socially responsible task… but I was also generating profit for a company which does very immoral things (not that I consider all military things immoral or anything, their position on drones alone was tremendously disturbing (shortly, ‘the Geneva Conventions don’t mention robots, therefore nothing a robot does can violate them’)). BUT, I was making the civilian and moral side of their business more profitable which might encourage them to more moral action. Not an easy choice at all. I’m still not sure I made the right one… but working for a different company now doing the same work has alleviated a lot of those worries.

In my opinion, issues like these are not discussed enough in the field of Computer Science. This might be a consequence of my having studied Philosophy alongside CS, but even when the issue of social responsibility is discussed, it is always in terms of the worker’s responsibility to be cautious and careful with what they produce, and to learn good practices so that if you have to develop a system that controls radiation doses for patients, it doesn’t fail and hurt them.

Posted by: codetaku on October 24, 2014 5:25 AM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

The other moral dodge that people often employ, “I was just doing my job” has always seemed to be totally indistinguishable from another excuse that people more generally would see as invalid - “I just did it for money.” For some reason, doing something “for money” is unseemly to most people but “just doing my job” is not seen the same way. I don’t quite understand why.

This is an excellent point, and made me laugh! How did I never realize this before?

What makes it more interesting is that you can read it both ways. Instead of starting from the position that “I just did it for money” is not respectable, you could start from the position that “I’m just doing my job” is respectable (food on the table, family to support, etc.). So then, “I did it for the money” begins to sound better. Food for thought.

Posted by: Tom Leinster on October 24, 2014 10:58 AM | Permalink | Reply to this

Just doing my job

There’s a difference that seems important here.

“My job” consists of a bundle of tasks, some less savoury than others. When I say, “I’m just doing my job,” with respect to one of the less savoury bits, what I’m saying is, “I don’t really like this particular part of my job, but the other parts (at least partially) redeem it.”

This wouldn’t work if all the parts of my job were unsavoury. If I were employed as a hit-man, “just doing my job” wouldn’t carry much exculpatory weight.

Similarly, when I am “just doing it for the money,” the implication is that I agreed to do this particular task à la carte; there are no counterbalancing tasks to redeem it.

Posted by: Jacques Distler on October 25, 2014 5:58 AM | Permalink | PGP Sig | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

So my choice was, for me at least, quite complicated. In accepting the job, I was performing a socially responsible task… but I was also generating profit for a company which does very immoral things (not that I consider all military things immoral or anything, their position on drones alone was tremendously disturbing (shortly, ‘the Geneva Conventions don’t mention robots, therefore nothing a robot does can violate them’)). BUT, I was making the civilian and moral side of their business more profitable which might encourage them to more moral action. Not an easy choice at all. I’m still not sure I made the right one….

My grandfather was, if I understand a historical document rightly, the head of the bookkeeping department of the Berlin treasurer before and during World War II. As I was told, he was imprisoned after the war by the British for his deeds during the Nazi era; he spent about a year in jail. In addition, I was told that this sentence was mostly for having this position within the Nazi regime and doing this particular “math” job. (I never saw documents about this, so I don’t know how much of it is true; apparently he was also an early member of the Nazi Party.) If this was his job, then his job was of course important for the Nazis: sabotaging the bookkeeping of the Berlin treasurer during the war could probably have had quite some consequences.

His two children were born in 1942 and 1944.

Posted by: nad on October 26, 2014 1:39 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

I was wondering for a while why US citizens did not react more strongly to the NSA revelations, but I think I am beginning to understand why.

Civil asset forfeiture, no-knock police raids, policemen shooting unarmed citizens for no good reason, identity theft by the DEA, and so on: these are real issues many people face in the US, and compared to them an NSA encryption backdoor is just an ‘elegant’ example of this general lawlessness.

As somebody said, “they will (try to) break your cipher, and if they fail they will break (into) your computer, and if that fails they will break your bones”. Actually, I prefer that they break ciphers…

Posted by: wolfgang on October 18, 2014 3:54 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

I can think of three reasons why people aren’t more angry than they are:

  • Concepts such as privacy, personal liberty and freedom of association are rather abstract.

  • The technological nature of the revelations. In all probability, no human being is listening to your phone calls. But metadata analysis is being done, and that could reveal as much or more. How many people are technically-minded enough to have a visceral sense of that?

  • Self-centredness. As I wrote here:

    A lot of people know now that the intelligence agencies are keeping records of almost all their communications, but they can’t bring themselves to get worked up about it. And in a way, they might be right. If you, personally, keep your head down, if you never do anything that upsets anyone in power, it’s unlikely that your records will end up being used against you.

    But that’s a really self-centred attitude. What about people who don’t keep their heads down? What about protesters, campaigners, activists, people who challenge the establishment — people who exercise their full democratic rights? Freedom from harassment shouldn’t depend on you being a quiet little citizen.

    And rights violations typically begin with vulnerable groups, which makes it easy to think of the violations as somehow exceptional. If I say that three years ago, the CIA assassinated an American child, you might wonder why there wasn’t the most enormous scandal. If I add that his name was Abdulrahman al-Awlaki, it suddenly sounds depressingly plausible.

Posted by: Tom Leinster on October 19, 2014 10:11 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

I think an often ignored and very significant factor is simply a failure of understanding. Most people believe that the furor (what little there is) over the NSA’s activities boils down to not wanting some human analyst sitting in a room somewhere, listening in on their personal communications. The NSA itself focuses exclusively on this scenario in explaining its actions, assuring people that no human being sits there and reads their emails or listens to their phone calls.

The danger, however, is many orders of magnitude worse, but seeing it requires some technical knowledge and maybe even some knowledge of history (enough to know that ‘security services’ are absolutely able to find employees willing to engage in any degree of violation of people’s rights). If I had the information and capabilities of the NSA, I believe that I could guarantee the persistence of the status quo indefinitely. I could almost guarantee that no new political party, or significant movement for political change, would ever arise.

Being able to map how communication spreads through a network gives you the ability to identify nodes (people) which bridge mostly-disconnected groups. Those people are absolutely key to the ability of any idea to spread widely enough to gain the national traction necessary to motivate change. Interrupting their communications even partially can drastically affect the ability of ideas to spread.

Take the Six Degrees of Kevin Bacon game as a good example. How many items need to be removed from the network of movies and actors before, instead of an average of 6 or fewer links to reach Mr. Bacon, you need an average of 15 or 20? A few dozen. Remove the right links and the topology of the graph changes radically. The more links an idea has to spread through to reach separated groups, the less likely that idea will ever spread beyond the group it starts in. And the “disruption” of communication doesn’t have to be extreme, like black-bagging people to Diego Garcia. You can just drop a few deliveries of emails sent to a list of people. Hide a Facebook posting here or there. The people who bridge mostly-disconnected groups are almost never highly connected people themselves, and they run a high chance of not having strong ties to either group. It could be done very quietly, the chance of being detected would be low, and the chance of a social movement developing in opposition would be stymied by those very same actions.
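The structural point here, that a sparsely connected bridge node can be the only thing holding a network together, is easy to sketch. Below is a toy illustration in Python (the graph, the node labels and the numbers are invented for the example, not taken from any real data): two tightly-knit cliques joined only through a single low-degree node. The bridge has the fewest connections of anyone, yet removing its two edges disconnects the groups entirely.

```python
from collections import deque
from itertools import combinations

def build_graph(edges):
    """Build an undirected adjacency-set graph from a list of edges."""
    g = {}
    for u, v in edges:
        g.setdefault(u, set()).add(v)
        g.setdefault(v, set()).add(u)
    return g

def reachable(g, start):
    """Breadth-first search: the set of nodes reachable from `start`."""
    seen = {start}
    q = deque([start])
    while q:
        u = q.popleft()
        for v in g.get(u, ()):
            if v not in seen:
                seen.add(v)
                q.append(v)
    return seen

# Two tight-knit groups of five, joined only through node 5.
group_a = list(combinations(range(5), 2))      # clique on nodes 0..4
group_b = list(combinations(range(6, 11), 2))  # clique on nodes 6..10
bridge = [(0, 5), (5, 6)]                      # the low-degree bridge
g = build_graph(group_a + group_b + bridge)

degrees = {u: len(nbrs) for u, nbrs in g.items()}
print(degrees[5])            # the bridge has the lowest degree: 2
print(len(reachable(g, 0)))  # with the bridge, all 11 nodes connect

# Remove just the two bridge edges and the groups fall apart.
g2 = build_graph(group_a + group_b)
print(len(reachable(g2, 0))) # only the 5 nodes of group A remain reachable
```

The clique members each have degree 4 or more; the bridge has degree 2. Targeting it is cheap and quiet, which is exactly the worry expressed above.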

Or how about running some statistical analysis on the behavior of people online and just automatically flagging outliers for investigation? If you’re weird, you automatically become a target. And if they want to remove you from the equation, they can discredit you or have you arrested for some crime (since there are enough laws regulating harmless activities that everyone breaks many of them every day without knowing it).
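The crudest version of such automatic flagging is a one-line z-score test. A minimal sketch (the data, the threshold, and the function name are all invented for illustration; a real system would presumably be far more sophisticated):

```python
import statistics

def flag_outliers(values, z_threshold=3.0):
    """Return the indices of values lying more than z_threshold
    sample standard deviations from the mean."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mean) / sd > z_threshold]

# Hypothetical daily message counts for ten users; user 7 is "weird".
counts = [12, 15, 11, 14, 13, 12, 16, 240, 14, 13]
print(flag_outliers(counts, z_threshold=2.0))  # flags index 7
```

The point is not the statistics, which are trivial, but that "being unusual" becomes a machine-decidable condition applied to an entire population at once.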

I don’t really care about what most people worry about with the NSA: some analyst stroking himself to my love life, or passing around recordings of my sister’s phone sex (her husband is military, and the NSA admits to analysts trading recordings of phone sex between wives and husbands deployed overseas). I’m very worried by what their automated software systems can do, though.

I watched a documentary, long before Snowden, where they talked about the NSA’s secret legal defense, which they have now trotted out: ‘Communications are not intercepted, legally speaking, until a human being has read them.’ That is disturbing in the highest degree. They believe that if a machine reads a message, summarizes it, classifies it, derives any amount of information from it, they haven’t ‘intercepted’ any communications at all until a human being reads the full text. And that is literally the LEAST dangerous thing I could imagine the NSA doing with the world’s communications.

Posted by: codetaku on October 24, 2014 5:59 AM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Thanks for a very interesting comment. The potential of automated interventions based on large-scale data analysis is scary to contemplate.

the NSA admits to analysts trading recordings of phone-sex between wives and husbands deployed overseas

Can you give a reference or link for this? I’ve certainly heard similar things; Snowden said in some interview that analysts would show each other nude photos that they’d picked up from surveilling the population. But I hadn’t heard of quite what you’re describing.

I watched a documentary, long before Snowden, where they talked about the NSA’s secret legal defense, which they have now trotted out: ‘Communications are not intercepted, legally speaking, until a human being has read them.’ That is disturbing in the highest degree. They believe that if a machine reads a message, summarizes it, classifies it, derives any amount of information from it, they haven’t ‘intercepted’ any communications at all until a human being reads the full text.

The language games that the intelligence chiefs have played in trying to defend themselves recently are just contemptible. How can they not be ashamed of themselves? It’s exactly as dignified as saying “2 + 2 = 5” then, when called on it, replying “oh, but I was using ‘5’ in a special technical sense.” E.g. here’s Bruce Schneier on the word “collects”, and here’s James Clapper on his famous lie to Congress (or “least untruthful” answer, as he put it).

(I can’t get enough of watching Clapper tell that lie. It’s also shown in Citizenfour, and a friend I watched that with pointed out that Clapper really couldn’t have looked more shifty if he’d tried. It’s virtually a cartoon of “man lying”.)

Posted by: Tom Leinster on October 24, 2014 2:51 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Info about analysts trading recordings of phone sex between military people overseas and their spouses: http://boingboing.net/2008/10/09/nsa-enjoys-eavesdrop.html

That kind of thing is what the NSA is referring to when they often say “there have been failures of our internal policies in the past, but we’ve worked to address them”.

Posted by: c on October 25, 2014 12:41 AM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Thanks for the link.

That kind of thing is what the NSA is referring to when they often say “there have been failures of our internal policies in the past, but we’ve worked to address them”.

Here’s another example of that which I find quite thought-provoking. In this 2009 New York Times article, the following revelation is made in an almost incidental way:

The former [NSA] analyst added that his instructors had warned against committing any abuses, telling his class that another analyst had been investigated because he had improperly accessed the personal e-mail of former President Bill Clinton.

This is close to the bottom of the article — you’ll have to scroll down or search for “Clinton”.

Now, you might think that this was a major revelation, the stuff of headlines. The journalists who wrote the story (both highly respected and Pulitzer-prize-winning) could have gone with “NSA read Bill Clinton’s email” as a headline and centred the story around that fact.

Instead, the revelation is buried deep in an article with the mild headline “E-mail surveillance renews concerns in Congress” and only mentioned in passing, and in the context you mentioned — of the NSA claiming to have implemented or tightened up anti-abuse procedures.

After the NYT published this story, other publications picked up on this detail. But it’s a puzzle to me why the NYT downplayed it so much.

Posted by: Tom Leinster on October 26, 2014 2:57 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Tom, by coincidence Snowden addressed this issue in the Lessig interview. The NYT is very close to the government. This is the reason they were not involved in the leaks, as opposed to the Guardian (US and UK) and the Washington Post. Obviously, I am not an expert in this matter, just quoting.

Posted by: Bas Spitters on October 26, 2014 5:17 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

I wrote:

(I can’t get enough of watching Clapper tell that lie. It’s also shown in Citizenfour, and a friend I watched that with pointed out that Clapper really couldn’t have looked more shifty if he’d tried. It’s virtually a cartoon of “man lying”.)

And now here’s a cartoon of “man doing embarrassingly woeful job of pretending he knows what he’s talking about”:

Tweet by Ryan Gallagher

Context: this is the UK Foreign Secretary being asked about the scope of government surveillance powers, and more specifically the Regulation of Investigatory Powers Act, which according to Wikipedia is an act of parliament

regulating the powers of public bodies to carry out surveillance and investigation, and covering the interception of communications. It was ostensibly introduced to take account of technological change such as the growth of the Internet and strong encryption.

He’s asked a precise question, but not only is he unable to answer it, he gets a much broader and more basic matter completely wrong. It’s actually really cringeworthy to watch — he’s caught in public so obviously having no clue what he’s talking about, and does such a miserable job of covering it up. And, of course, it’s shocking that he’s the man supposedly in charge.

Posted by: Tom Leinster on November 2, 2014 9:22 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Your comment also reminds me of an underappreciated fact about the relationship between mathematicians and the intelligence agencies.

There’s been a lot on this blog and elsewhere about encryption and (implicitly) the applications of number theory and arithmetic algebraic geometry to state surveillance. I think everyone’s well aware that the agencies are interested in employing number theorists. However, I’m told that they’re just as interested in mathematicians with expertise in network analysis.

For instance, I can think of one UK maths department that has a very large number of staff involved with GCHQ, and they split into two camps: on the one hand, the number theorists and algebraic geometers and discrete/problem-solving pure people, and on the other, the statisticians and probabilists.

I’d be interested to know more about what mathematicians actually do for the intelligence agencies in the realm of network analysis.

Posted by: Tom Leinster on October 24, 2014 3:05 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

The grant prefix commonly used when the NSA funds research is MDA904. You can find much research that the NSA has funded by searching for that prefix on Google Scholar or similar sources. Lots of papers don’t mention the NSA by name, but do include the grant number.

And if you do such a search, you will certainly notice a great many papers done with that funding that focus on network analysis, the spread of ideas through social networks, and so on.

Posted by: codetaku on October 25, 2014 12:58 AM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Thanks again. Here’s a direct link to a Google Scholar search.

Posted by: Tom Leinster on October 26, 2014 3:07 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

“The people who bridge mostly-disconnected groups are almost never highly connected people themselves.”

I don’t think that’s how Pareto distributions / scale-free networks typically work. See, e.g., Theory of Rumour Spreading in Complex Social Networks.

Posted by: Marshall Eubanks on October 25, 2014 1:45 AM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Tom, I hope you had a pleasant evening. Any new insights?

In Lessig’s interview with Snowden (worth watching), Snowden gives the example of the Athens affair, a case where lawful-intercept software and backdoors in switches were abused by unknown agents. When encryption is weakened, anyone can exploit the weakness, not just the government.

Posted by: Bas Spitters on October 26, 2014 11:41 AM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

I did, thanks!

What I thought of Citizenfour was that it was very unusual in its tonal quality. It’s somehow a quiet film, very directly about human beings and human lives. I mean, it’s very compelling and watchable, but what I found so striking was this direct, almost intravenous, human connection. At a couple of points the film shows clips from CNN news, and in contrast they seem absurdly flashy and superficial.

The news stories based on Snowden’s documents are sometimes quite complicated individually, and all the more so in aggregate. It can be overwhelming. Even in terms of this blog, someone at CT14 told me that he basically felt beaten around the head (he put it more politely than that) by just the sheer quantity of links in some of the posts I’ve written here. Citizenfour conveys lots of information too, but tonally I think it’s a good antidote to that feeling of being bludgeoned with news.

Posted by: Tom Leinster on October 26, 2014 5:36 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

All of this seems like an important discussion to be had and I’m very happy that this is taking place. I’ve been a silent bystander until now, partly because I don’t seem to have anything further to say that might be of value or interest, and partly because I currently have other higher priorities (such as writing job applications).

What has been discussed so far is the issue of mathematicians working for dubious organizations or receiving funding from them. While this is a very important concern, there’s another point that’s bugging me: can the mathematics that we develop be of potential use to dubious organizations for extending their capabilities?

As “pure” mathematicians, and in particular category theorists, we are motivated and led mostly by our own fascination with abstract mathematics. Thus we might find it hard to entertain the possibility that our research may be of some use for purposes like mass surveillance. But there certainly are mathematicians working in applied areas like image recognition to whom the question obviously applies, and not all of them are funded by dubious organizations themselves. Personally, in the light of category theory starting to be used in formalizing certain aspects of applied mathematics, I find this possibility plausible enough to be of relevance to my own moral considerations as a (wannabe) applied category theorist, even if on a very small scale.

So, if we assume that this possibility exists, what does it mean for us? Should we be careful to avoid research with undesirable potential applications? Or should we point to our moral dilemma in talks and try to encourage a public discussion about the use of mathematics?

This is an instance of an old philosophical debate about whether technology is “neutral” with respect to its use or whether technology itself can be “evil” by determining particular use cases. I’m not sure what the arguments for either side are.

As a concrete case in point, consider the announcement of Fusion 2015:

The International Conference on Information Fusion is a premier forum for interchange of the latest research in information fusion and discussion of its impacts on our society [..] brings together researchers and practitioners from industry, government, and academia [..]

Topics of interest: [most relevant ones selected]

  • Probability theory; category theory[!].

  • combined detection and tracking; automatic target recognition; target tracking and localization; behavior modeling; predictive and impact assessment.

  • defense and intelligence; homeland security; public security; autonomy; big data.

Apologies if this has already been discussed elsewhere on the café; I haven’t followed all of the NSA/GCHQ threads here in detail.

Posted by: Tobias Fritz on October 26, 2014 4:34 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Good questions. It seems that at the very least we have to all become applied mathematicians, in thought if not in deed. That is, we have to put some effort into thinking about the applications of our purely abstract mathematical inventions. Most of the time this effort will yield a spectrum of distant future techniques and technologies from good to bad, and our work is safely neutral. However, we want to avoid this situation (presented in hilarious hyperbole, but with a scary kernel of truth).

Posted by: stefan on October 28, 2014 5:27 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Hi, it has been great reading through the comments. Though I don’t have time for them all, I see this community has morals, regardless of how you define them. The point I wish to bring up is Ed Snowden seeking refuge in Russia. Does not Russia have an identical system of surveillance, to the point where it is a wash for him to choose the other side of the same coin? Can’t wait to review thoughts on this. Thanks.

Posted by: underedumacated on October 26, 2014 5:45 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Hi. I want to keep this discussion in the general orbit of what the NSA and its partners do and what effect it has when mathematicians work with them. Edward Snowden revealed some facts, and his motivation is not particularly relevant. I could view him as the worst traitor who ever walked the earth, and that wouldn’t change the facts he revealed, the documentary evidence he provided, or what that evidence implies for society in general and mathematicians in particular.

So I’m not going to let this subthread go much further. Instead, I’ll just let Snowden answer your question for himself.

Posted by: Tom Leinster on October 26, 2014 5:55 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Thank you for the link, and this article. I would also like to mention how this thread appears free of the usual bickering that takes place with issues like these. It’s nice to have a friendly conversation now and again despite controversy, which is why I asked. Then again, I think this forum may be too smart for those unreasonable trolling types. Thanks again!

Posted by: unedumacated on October 26, 2014 7:57 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

There’s an interesting article in Stanford’s alumni magazine about the NSA’s efforts in the 1970s to suppress academic research in cryptology. It’s enlightening for all sorts of reasons, but one episode in particular has real resonance today:

The first controversy over the NSA’s hand in these standards erupted in the 1970s when it persuaded the [National Bureau of Standards, predecessor to NIST] to weaken the Data Encryption Standard (DES) algorithm, an NBS-designed cryptosystem widely used by banks, privacy-sensitive businesses and the public. Hellman and his then-student Diffie mounted a vigorous — and ultimately unsuccessful — public relations campaign to try to improve the strength of the DES algorithm.

At the time, NSA leadership emphatically denied that it had influenced the DES design. In a public speech in 1979 aimed to quell some of the controversy, Inman [then NSA Director] asserted: “NSA has been accused of intervening in the development of the DES and of tampering with the standard so as to weaken it cryptographically. This allegation is totally false.”

Recently declassified documents reveal that Inman’s statements were misleading, if not incorrect. The NSA tried to convince IBM (which had originally designed the DES algorithm) to reduce the DES key size from 64 to 48 bits. Reducing the key size would decrease the cost of certain attacks against the cryptosystem. The NSA and IBM eventually compromised, the history says, on using a weakened 56-bit key.

The “Inman” referred to is Bobby Ray Inman, director of the NSA from 1977 to 1981. Interestingly, he later came out against the G.W. Bush-era domestic surveillance programme — a minnow by today’s standards.
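The arithmetic behind those key sizes is worth spelling out: exhaustive key search over an n-bit key takes up to 2^n trials, so each bit shaved off the key halves the attacker’s work. A quick sketch, simply restating the figures in the quoted passage:

```python
# Exhaustive search over an n-bit key takes up to 2**n trials,
# so shrinking the key multiplies the attacker's advantage by
# 2**(bits removed).
def brute_force_speedup(original_bits, reduced_bits):
    return 2 ** (original_bits - reduced_bits)

# The NSA's proposals measured against IBM's original 64-bit design:
print(brute_force_speedup(64, 56))  # the 56-bit compromise: 256x easier
print(brute_force_speedup(64, 48))  # the 48-bit proposal: 65536x easier
```

So the "compromise" handed attackers a 256-fold discount on brute force, while the NSA's original 48-bit ask would have been a 65,536-fold one.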

Posted by: Tom Leinster on November 18, 2014 3:25 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

There is a partial solution to this problem:

free software

Posted by: hans wurscht on November 30, 2014 11:03 AM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Unfortunately, software itself seems to be only a partial solution. That is, stronger encryption usually also needs more machine power (see also Bitcoin). So even without back doors, this is quite an energy-intensive race between machines. And what would happen if, for example, there were a sudden unexpected leap in the physics of quantum computers? As far as I understand, if one suddenly had a big quantum computer then in principle most of the normal keys would be way too short. And the example of the millennium bug showed what a giant infrastructural task it would be to enlarge all the keys as fast as possible (even if there were indeed some …. who claimed that the Y2K fixing was easy).

Wikipedia writes:

As of 2015, the development of actual quantum computers is still in its infancy, but experiments have been carried out in which quantum computational operations were executed on a very small number of qubits. Both practical and theoretical research continues, and many national governments and military agencies are funding quantum computing research in an effort to develop quantum computers for civilian, business, trade, gaming and national security purposes, such as cryptanalysis.

As I understand it, there are now a couple of private companies working on the development of such quantum computers. If one of them were to make a technological leap, what would happen then? I guess they won’t rob banks with their knowledge, but what about their development costs, if they can’t use their “competitive advantage”?
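For what it’s worth, the standard rules of thumb (not stated in the comment itself, but well established in the cryptographic literature) are: Shor’s algorithm would break RSA and elliptic-curve keys outright, regardless of key size, while Grover’s algorithm gives ‘only’ a quadratic speedup against symmetric ciphers, roughly halving the effective key length. A minimal sketch of the second rule:

```python
# Grover's algorithm searches an n-bit keyspace in about 2**(n/2)
# steps instead of 2**n, so the effective security level of a
# symmetric key is roughly halved against a quantum attacker.
def grover_effective_bits(key_bits):
    return key_bits // 2

for k in (128, 256):
    print(f"AES-{k}: ~{grover_effective_bits(k)}-bit effective "
          "security against a quantum attacker")
```

This is why the usual advice for symmetric encryption is simply to double the key length (AES-256 rather than AES-128), whereas public-key systems would need to be replaced wholesale, which is the "giant infrastructural task" the comment worries about.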

Posted by: nad on March 18, 2015 7:50 PM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Here a top security expert points out how the NSA has been lying:

This article is based on new revelations that are spinoffs of the story I discussed here:

Posted by: John Baez on August 28, 2016 4:14 AM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

Importantly, Andrew Appel provides a careful analysis of the vulnerabilities of the (American) voting process.

His conclusion: cyberoffense is not the best cyberdefense. “But really we should ask: Should the FBI and the NSA be hacking us or defending us? To defend us, they must stop hoarding secret vulnerabilities, and instead get those bugs fixed by the vendors.”

Posted by: Bas Spitters on August 28, 2016 9:26 AM | Permalink | Reply to this

Re: New Evidence of the NSA Deliberately Weakening Encryption

I was interested in Schneier’s discussion of the “Vulnerability Equity Process”. In the article, he more or less explicitly divides software vulnerabilities into three types:

  • Those that can’t be exploited. If you’re the NSA, there’s no harm in making these public, since you can’t use them anyway. (It’s not even clear that they should be called “vulnerabilities”.)

  • Those vulnerabilities that you’ve found and you think other people can find too.

  • Those vulnerabilities that you’ve found, and you think no one else will find.

The distinction between the last two is very important. If you judge that the vulnerability will soon be found and exploited by others, e.g. foreign intelligence agencies or crooks (but I repeat myself), then you should take immediate steps to get the vulnerability fixed rather than keeping it to yourself. Otherwise, you’re putting everyone’s systems at risk.

On the other hand, if you’re confident — somehow — that no one else will find the vulnerability, then you can keep it to yourself. And if you’re the NSA, you can use that vulnerability however you like, confident that no one’s system security will be damaged (aside from any damage you might deliberately inflict yourself).

Schneier begins by saying that the NSA have been lying: telling us that they don’t hoard zero day exploits when they do. But he also accuses them of being hubristic, evaluating the vulnerabilities that they’ve found as being in the third category (“nobody but us”) when actually, many of them were discovered by other people not long afterwards.

Posted by: Tom Leinster on August 29, 2016 4:52 AM | Permalink | Reply to this

Post a New Comment