The Infinite Monkey Cage - GCHQ

Episode Date: August 13, 2018

The Code Breakers. Brian Cox and Robin Ince are joined by comedian Katy Brand as they transport the cage of infinite proportions to the home of modern-day cryptography and codebreaking, GCHQ. They'll be discovering how far we've come from the days of the humble code book and the birth of machines like Enigma, and how the new digital era has turned us all into modern-day code breakers and cryptographers, without us even realising it. Producer: Alexandra Feachem.

Transcript
Starting point is 00:00:00 In our new podcast, Nature Answers, rural stories from a changing planet, we are traveling with you to Uganda and Ghana to meet the people on the front lines of climate change. We will share stories of how they are thriving using lessons learned from nature. And good news, it is working. Learn more by listening to Nature Answers wherever you get your podcasts. This is the BBC. I'm Robin Ince. And I'm Brian Cox. And this is the Infinite Monkey Cage. And what are you doing?
Starting point is 00:00:45 I'm speaking in a code that I've made. Did you understand any of it? No. It's a brilliant code then, isn't it? How does that code work? Well, as opposed to saying the words that are in my head, I just kind of make a random selection of noises. That's what I do.
Starting point is 00:00:59 Yeah, but how might I decipher those sounds? You can't. That's why it's a brilliant code. And it's brilliant because... No-one knows what it means. No, even I don't. See, it's impenetrable. I'm like a young Alan Turing.
Starting point is 00:01:10 Anyway. Well, hopefully Robin will have understood what he's not understood or even understood that he's not understood something after the show because we are at GCHQ in Cheltenham to talk about codes and code-breaking. What makes a good code? What makes a good code-breaker? And how has the art and science of code-breaking changed since GCHQ and its predecessor, the Government Code and
Starting point is 00:01:29 Cipher School, began in 1919? And if you press your red button now, you can hear the entirety of this show in Morse. Now, joining us on our panel today, because we're at GCHQ, are Ms Smith, Ms Smith, Ms Smith, and X. We may not hear from X. And they are... Oh, hi, I'm Ian. I'm the Deputy Director for Counter-Terrorism here in GCHQ. And my favourite code is the genetic code, because I think it's amazing that you can just have a big string of letters
Starting point is 00:01:55 and that's enough to understand how to make a person. And, yeah. I'm Tony Comer. I'm GCHQ's Departmental Historian. And my favourite code is the German High Sea Fleet code book of 1914. Da da da da. Everybody
says that. Everybody. I'm Katy Brand. I shouldn't really be here. I won't go into why. No, no, don't worry. Don't panic. This now sounds like you're a double agent. You're not going to be let out. I'm really regretting it already. I'm talking myself into a seriously small room. I am a writer and a comedian, and my favourite code is the alphabetical code that me and my friends made up,
Starting point is 00:02:37 where we created a whole new alphabet set of symbols when we were at school, so we could write rude messages to each other about the teachers. And this is our panel. Tony, before we begin, I wanted to ask you, the German High Sea Fleet Codebook, what was that? This was the codebook that the German Navy was using in 1914. And the Russians, who for a very short period of time were our allies, captured a copy of the codebook and sent it to the UK. And in November 1914, for the first time, all the different bits of work that go up to make what GCHQ does today were done in the same place at the same time. Is that one of the first examples of military codes being important,
or the breaking of military codes being important? The military have been using codes of one sort or another since ancient times, but it's the first time in the modern era, after the invention by Marconi of wireless, that all of the military powers were beginning to use encryption techniques to send their messages to their forces and command and control them in a way that other countries wouldn't be able to know what they were saying. So basically, once Marconi invented the radio, that's when an organisation like this becomes a necessity?
Yes, but the UK didn't realise it for 16 years. It's a grand tradition, and we continue to do that. But it was 1919, Ian, when the predecessor to GCHQ was set up. So how was it functioning at the time? So back then it was a very different world to today, a much simpler technology environment, obviously. If you think about people like Marconi and folks like that, thinking about how to get the transatlantic wireless and telegraphy systems to work, information was starting to flow around the world. Our job back then was very much more about supporting the military, and also thinking about the ways in which information, and radio in particular, was carrying information that we needed to know about. And over the years, as you look at the whole history of GCHQ (we've got our centenary next year, so we've got 100 years of this to look at), it's unbelievable really to think how we as an organisation have had to transform what we do to keep track of the way that technology has moved. If you come to a building like this and you look around and see the kind of people we've got working here, and the ways we go about doing our work, it's frankly completely unrecognisable from the days of 1919, and not just because of the technology; because of the problem. Is a genius always a genius, do you think? Even if Valentine were here now, would he be able to apply his level of innate genius to whatever new technology is around? Could you apply it in whatever age, anywhere? Yeah, I don't know. I mean, certainly for us, we've had a number of people over the course of our history you could genuinely call a genius, and I think the thing has been the way that they think, not necessarily the context that they're thinking in. So Turing was brilliant because he could think around corners. He was looking at what appeared to be a completely intractable problem, Enigma being the classic example, and yes, he was obviously deeply steeped in, very, very expert in, mathematics in particular, but one of the things that really characterised him was that he just thought laterally; he thought about a different way into a problem that most of the world would probably have imagined at the time was simply unbreakable, and in the course of doing so invented a whole load of technology that had never existed before. So that ability to think around corners, that ability to imagine a completely different way of solving a problem, I think we still do that here today; it's really fundamental, and it characterises the work that we do. For us, genius is not just about having one or two geniuses; it's about having a whole bunch of really, really talented and diverse people who come together and create that kind of collective genius that allows us to do what we do. Tony, I want to, again, just talking about that idea of the loner: is this a problem with mathematics as a whole, that the ones who are written about most are very often the quirkiest, and therefore people...
I was reading a book about Gödel the other day, most famous for his incompleteness theorem, and some of the things about him were that his diet, for instance, was baby food, butter and laxatives; that he wouldn't go out on days when another mathematician was in town, because he feared they'd assassinate him; and that he thought he was going to die from refrigerator gases.
Starting point is 00:07:32 So quite often people think, I'm going to go into maths, and then they read about one of the most famous ones and go, I'm not really sure I've got what it takes, or whether I even want to have what it takes. So is that... It's the same with touring comedians, to some extent. It's an interesting thing, isn't it, in history, where sometimes those who are elevated most
Starting point is 00:07:47 are not necessarily the most ultimately representative of perhaps what might be required. I think already by the middle of the 1940s, GCHQ had this reputation of attracting idiosyncratic, quirky people. And I think you need to turn this on its head and say that this was an organisation that, in its time, was almost revolutionary in thinking about saying,
we don't want stereotypes, we don't just want a load of people who are all the same. You know, great minds don't think alike. How do you bring in enough people who think differently so you can actually get to the heart of your problems? And then add something about not telling them how to live their lives, not demanding a particular set of social skills, or dress sense, or anything like that; it's not why you're employing people. Can I just say, for the radio listeners, the entire audience is made up of GCHQ employees at the moment, so I'm not sure whether this is comfortable or uncomfortable in this context. Carry on, Tony. I've been here for a long time. And you look magnificently tailored. Thank you. You don't look so bad yourself. What is that equation that's tattooed on your naked chest that you're showing us? If I told you, you wouldn't believe this. There's an Enigma machine, actually. We mentioned Alan Turing there. Could you
Starting point is 00:09:14 first of all go back to the origin of codes and perhaps describe the simplest of codes and then how those codes have evolved over the years? I think the simplest sort of codes are ways of taking, for example, a book, taking a dictionary and just assigning a value to every word or maybe make it a bit more complex and throw in some phrases
and assign just a simple numerical value to them. So you have a code book there which is just phrases, and on the other side it's actually sequences of letters in that case, isn't it, next to them? They can be letters or numbers, and they can be more or less random depending on what you want to do with them. That would be a Katy-type code, wouldn't it?
That would be the kind of code that you develop to write your code. Just little squiggles next to each letter. Yeah, that's what we did. Well, funnily enough, we became quite fluent in it, because we wrote in it several times a day, every day, for a few weeks, and then, because we were 14, we got bored of it, and, I don't know, some pop star caught our eye. God, I sound about 100 years old: someone at the top of the hit parade took our fancy. I suddenly saw Tommy Steele and I thought, my life's changed. Do you remember the squiggle? No, not really. Duran Duran or something, or whatever it was. No, it was literally each letter. I mean, it was quite extraordinary that we had the patience to do that in the end. We must have really wanted to be rude about the teachers. But yeah, we learned this whole new alphabet and we became very fluent in it, but the interesting thing was that as soon as we stopped, even for a few days, we forgot all of it very, very fast. So, you know, if I didn't write anything for a few days I wouldn't forget the alphabet, but I guess I'd be surrounded by the words all the time. But is that a common problem, or is that just because we were particularly stupid? I mean, if you stop using your code regularly, do people tend to just forget it very fast? Does it go too deep, or too shallow? I can answer that, because what you do is you have a code book; that's the point. So is that called a substitution code, is that what it's called, where you just... This sort of code, yeah, where you're substituting each word or each phrase by a small group of letters. But in itself, that's not fantastic.
It's a very good way of shortening your messages, for example. But what you really want to do is add something to those numbers that makes them unrecognisable as the numbers that were in the codebook, and that's a key. You're going to add something to them; it could be, for example, from a pad. This is a one-time pad, as it's called: you'd only use it once. You and the person you're sending a message to share these books of random numbers, and you add the value of these numbers to the value that you've got out of the codebook and come up with a third number, and that's what you send to the recipient. So if A was, very simply, let's say 11, and you looked in there and you had 12, then you'd add them together and you'd write 23 down. Exactly, and the person at the distant end would see 23; he'd have the same book, he'd subtract 12, and he'd get back to 11 and know that you're talking about A. Yeah. So how long would they... I mean, when would that have stopped, when was that numerical approach just no longer of any use, using these one-time pads? It's not stopped. These are the safest way of sending a message: you can be pretty certain nobody will ever be able to break it if you use them properly. And so the main problem, I suppose, is sending the pads around, isn't it? The distribution of the pads, and the sheer tedium of having to add large numbers of numbers to large numbers of other numbers.
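(A minimal sketch of the one-time pad arithmetic described above, for readers who want to see it concretely. It assumes the usual convention of mapping A to Z onto 0 to 25 and adding the key modulo 26; the on-air example simply adds codebook numbers, but the principle is the same. The message and pad here are invented for illustration.)

```python
import secrets

# Sketch of the one-time pad idea: add a shared random number to each
# letter's value, and the recipient subtracts the same numbers.
# Assumes A-Z mapped to 0-25 and arithmetic modulo 26.

def make_pad(length):
    # One truly random value per letter; the pad must be used only once.
    return [secrets.randbelow(26) for _ in range(length)]

def encrypt(plaintext, pad):
    # Sender: add the pad value to each letter value.
    return ''.join(chr((ord(c) - 65 + k) % 26 + 65) for c, k in zip(plaintext, pad))

def decrypt(ciphertext, pad):
    # Recipient, holding the same pad: subtract the same values.
    return ''.join(chr((ord(c) - 65 - k) % 26 + 65) for c, k in zip(ciphertext, pad))

message = "ATTACKATDAWN"          # illustrative message
pad = make_pad(len(message))
cipher = encrypt(message, pad)
print(cipher)                      # gibberish without the pad
print(decrypt(cipher, pad))        # ATTACKATDAWN
```

As Tony's caveats suggest, the guarantee only holds if the numbers are genuinely unpredictable, the pad is as long as the traffic, and it is never reused, which is exactly why distributing fresh pads becomes the practical bottleneck.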
I should say we've got a very beautiful thing here, which is one of those pads printed on silk. That's right. That was one that was produced for use by MI6's agents during the war, the people who were being parachuted into France, into the occupied countries. They obviously couldn't take code books or one-time pads with them, because if they were searched they'd be discovered. But a silk sheet can be sewn into your clothes. If you need to get rid of it, it can just be burned; it'll burn much better and more quickly than paper. You can scrumple it up and just put it in your shoe, or something like that. You could wear it as a nice pair of tights. You could, but it would perhaps be slightly more visible than you'd want it to be. Which one's the spy? I think... the woman in the numerical tights. I'm starting to get a sense of why I never got a tap on the shoulder. Yeah.
Starting point is 00:13:56 So why are these one-time pads, which is a very simple idea, why is that still effectively... Well, I should ask you, actually, is that unbreakable? You said it's very, very secure indeed. As long as you've generated the numbers in a completely unpredictable manner, so they're effectively random, as long as you only use them once, and as long as you use them properly, for practical purposes, yeah, these are completely unbreakable, but they're difficult to use quickly. The temptation to reuse numbers is always going to be there because of the difficulty of sending them,
sending fresh pads around, and it just takes too much time. Nowadays, there are far too many messages that people want to send to be able simply to write them all out by hand and then start adding these numbers. So all the more complex technologies we have now, like the Enigma machine, for example, that we had in the 30s and 40s, are really attempts at making more convenient codes rather than safer codes. Yeah. Essentially, an Enigma machine looks like a typewriter, but as you press keys, it lights up letters. There's a complex electrical pathway through which the current goes when you press a key to light up a lamp on the lamp board. Fantastic machine, 10 to the 138 permutations possible, but there's a fundamental flaw in its design. Because it was designed for people to be able to use quickly and simply, it's reciprocal, which means that for any given setting, if you press a P and it gives you a Q, then at the same setting, if you press a Q, it'll give you a P. That reciprocity brings another problem along, and that's that a letter can't encrypt as itself, and that, fundamentally, is the way in for a cryptanalyst to breaking a message encrypted on that machine.
Is that the story of how it was broken? Is it right that it's about one particular phrase that, by repetitive use... is that true, that that's the story that ultimately led to the breaking of it, or is it much more complex? It's much more complex, because there are a whole set of things that you need to be able to understand. A cliché is that German messages ended with Heil Hitler, so you'd look at the last ten letters of a message, and if none of the letters matches the letters of Heil Hitler in the corresponding positions, that could be the message. In practice it's looking for things that say weather forecast, because that's a bit more common, or just the way that military bureaucracies send messages: they have a very stereotyped format, exactly the same as emails do today, because emails derive essentially from the way the military send messages. They'll begin to somebody, from somebody; they'll have a serial number, a date, a subject; you'll have who's on copy; you'll have a classification. There's a whole load of stuff at the beginning of a message that you expect to be there, and that's the way in, rather than the content of the message.
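(A minimal sketch of the crib idea Tony describes, relying only on the property mentioned above: an Enigma-enciphered letter can never come out as itself. Sliding a guessed phrase, a "crib", along the ciphertext rules out every position where any letter coincides. The ciphertext and crib below are invented for illustration.)

```python
# Crib placement against an Enigma-style cipher: because no letter ever
# encrypts to itself, any alignment where a crib letter equals the
# ciphertext letter in the same position can be ruled out.

def candidate_positions(ciphertext, crib):
    # Return the offsets at which the crib is NOT ruled out.
    candidates = []
    for start in range(len(ciphertext) - len(crib) + 1):
        window = ciphertext[start:start + len(crib)]
        if all(c != p for c, p in zip(window, crib)):
            candidates.append(start)
    return candidates

ciphertext = "QFZWRWIVTYRESXBFOGKUHQBAISE"   # invented ciphertext
crib = "WETTERVORHERSAGE"                     # "weather forecast" in German
print(candidate_positions(ciphertext, crib))
```

Each surviving offset hands the cryptanalyst a set of guessed plaintext-to-ciphertext pairings, which is the kind of foothold the Bletchley Park machinery was then built to exploit.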
So these are clues. Because I was going to ask you, and I suppose a naive person would think, well, with the computing power that we have today, what you do with a code is you throw a computer at it and just try everything. But that clearly is not the case; there's a lot of human, intelligent input into breaking a code.
Yeah, absolutely. And certainly modern encryption algorithms, the sorts of things that underpin the way the internet works and the way that you're able to do internet banking safely: there's no doubt that to implement those and to use them you need enormous amounts of computing power, and the security is all about how hard it is to break the keys and things like that. But it's a really important point: all of these systems, at some point, rely on people. So people implementing them, people designing them, and actually people using them. And as you said earlier, the thing about the one-time pad, which just makes it a nightmare, is that it's massively annoying to use to actually do anything useful. And if you were trying to send the sort of huge amounts of data around the globe that we now all know and love, obviously you're not going to do that with a one-time pad. So you have to invent schemes that work at scale, that can deal with volume and are very secure. In some ways, in terms of observing what is going on in the world, in one way has it become more difficult, but equally are you also often going, I cannot believe we are able to do this form of observation, in terms of just the amount of information there is? Yeah, yeah. In terms of what people do, you can go, hang on a minute, if that Google search is coming up there, there, there and there, that may well suggest... and then you can build up a pattern. Can that happen? Is that possible? And not necessarily Google, by the way; other search engines are available. By the way, if I'm ever being covert, I just go and Ask Jeeves. But, you know, I'm fascinated sometimes when I read books about big data by what is given away about human behaviour, and how useful that can be to someone like you.
Yep, absolutely. So if you work in an organisation like GCHQ, it's a slightly different question, because what we're about is trying to say, OK, there is all this data out there, and 99.999 (a very large number of nines) per cent of it is of no interest whatsoever to us as an intelligence and security organisation. But a very, very small percentage of it absolutely is, and that very, very small percentage is something that we need to be able to find, and we need to be able to use, to save lives or protect children and all the things we do. For us, the thing that makes our work very technically challenging is that our job is to figure out how to only go and get the bit that's going to make the difference. So not only is that a computer science problem, it's a question of policy, it's a question of ethics, it's a question of many other things. It is absolutely not the case that the availability of data that we all use is the thing that helps us do our job, because so much of that is actually of no interest to us whatsoever.
Starting point is 00:20:07 And just to continue this, and it's really serious what you're talking about, you know, saving and keeping the country safe and the privacy versus surveillance issue. So a really, really serious question that I know troubles me and a lot of my friends, which is, can our phones hear us talking about mattresses and then put a targeted ad for mattresses creepily and randomly in the middle of our Facebook feeds?
So, certainly if they do, it's nothing to do with us. I would say that. Let's be really clear. It turns out you're actually run by Beds R Us of Cheltenham and Gloucester. The whole of GCHQ is actually nothing more than a front for an enormous mattress company. And also easy chairs. It doesn't have to be mattresses. But you know what I mean? Everyone remarks on this: you know, it was really weird, I was talking to my friend the other day about maybe going on holiday to Cyprus, and we were both sat there and we didn't put anything about it online, and then I got home and an advert just popped up offering a holiday to Cyprus. Now, I know this isn't necessarily in your remit or your day-to-day work.
But just from a pure technological point of view, could you just help us to understand? The Director of Counter-Tourism and Targeted Marketing. Yes. This is my only chance to ask an expert, because it bothers a lot of us. Well, I'll give it a go. So, absolutely: if you're online, if you're using the internet and you're, I don't know, searching for or buying something on a particular website, you will notice that when you're then looking in your email, there seem to be adverts for the thing you were looking at, right?
There's a whole very interesting kind of machine, essentially, that runs the way that advertising works on the internet. It's really interesting; I've found some fascinating technology behind it, actually. It's really, really cool. The next stage in that, which we're starting to see now, is: what about if it isn't just about what you're clicking on when you're sat at your laptop, but you've got your smart speaker sitting in the corner, the one you can use the keyword to get going, and it makes the curtains open or whatever it might do? Can that, and the things that you are saying and doing in that environment, also be used to feed that kind of advertising ecosystem? A really, really important part of this, again, makes that point about the link between security and humans and people. A really, really daft way to implement something like that would be to have a system whereby the keyword, the thing you say to switch it on, was a really common word, right? So every time you said hello, you were talking to your friend; actually, your smart speaker thought you were talking to it, and then you start talking, and it goes, oh, I know what you're on about. So that would be a really, really poor implementation of the privacy settings on a device like that. And obviously, it's in nobody's interest, I don't think, for any of the people who are building this technology to start putting those sorts of things in people's homes; nobody's going to want to do that. But this is where these questions of cyber security come in, because what if somebody with a nefarious purpose has figured out how these things work and is attempting to get that speaker to do precisely that? Well, actually, a lot of the work that we do here is to work with industry and the people that make these things, to make sure we actually know that really isn't a thing, and that it isn't just that they've told us it can't be done. We try and make sure that those sorts of things, where cyber security starts to really touch on the fundamental ability for us all to live and work and be safe online, that's what we're trying to do. But it's not out of the box, right? This is why we have people here working on these things. One of the pieces of security technology we all use every day is public key encryption. It's how our bank accounts are secure; well, HTTPS, anything on the web. So could you talk a bit about, just outline, how that works? So there's a couple of basic things going on here.
So as Tony was saying, with the one-time pad system, one of the real challenges is how do you get the pads to the people that need to see them? And PKI is almost like the modern version of the infrastructure that means you can pass the modern versions of pads around in a way that's secure and understood. And a lot of this rests on some actually not really difficult to understand concepts that come from maths and number theory. So, prime numbers. There's a thing about prime numbers that basically says: if you've got two really, really big prime numbers and you multiply them together, that's an easy thing to do. Figuring out which two prime numbers you multiplied together to get that massive number is a really, really difficult thing to do. And you're talking about the sort of thing that computers would take to the end of the universe to do, that factoring-the-primes problem.
Starting point is 00:24:37 So how does that work in terms of encryption? And so basically, we want to exchange a message between two people. There's a couple of ways you can do it. One obvious way would be to say, OK, we've got a box and it's got a padlock on it. You have a key and I have the same key. Let's not worry about how you got it, but we've both got those keys. I can write my message. I can put it in a box, lock it, send you the box. No one else can open the box, so you get it. You've got the key. You open it up. That's called symmetric.
Starting point is 00:25:01 When people talk about public and private keys, which is the thing that underpins how the internet works, it's slightly different. So in that model, I've got a key. What I do is I give you the box, but the padlock's open. So you can write me a message, and you can put it in the box and shut it. Once you've shut it, you can't open the box again, neither can anyone else.
But you can send me the box, I've got the key, I open it up. So really, the whole thing about public key cryptography is a mathematical implementation of that process. The whole way that this works on the internet is about putting in place infrastructure, what we call the trust infrastructure of the internet, that means you can trust the fact that the box that you've got is actually the box that you're expecting. So you've probably seen this: in your web browser you see a whole load of things about certificates and things having been signed, and so on. Basically that is all about the infrastructure that exists to make sure that with those keys and those padlocks and those boxes, you can trust where they came from. And then the maths behind all that is to do with, essentially, factoring the primes and those kinds of things.
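(A toy sketch of the public-key idea Ian outlines, using textbook RSA purely as an illustration. The primes here are absurdly small so the arithmetic is visible; real keys use primes hundreds of digits long, along with padding schemes and the certificate infrastructure mentioned above.)

```python
# Textbook RSA with toy numbers: multiplying the primes is easy,
# recovering them from the product is what is meant to be hard at scale.
# (Requires Python 3.8+ for the modular inverse via pow.)

p, q = 61, 53                # two small primes (real ones are enormous)
n = p * q                    # 3233: the public modulus
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public exponent, coprime to phi
d = pow(e, -1, phi)          # 2753: the private exponent, the "key to the padlock"

message = 65                 # a message encoded as a number smaller than n
ciphertext = pow(message, e, n)          # anyone can lock with (n, e): 2790
recovered = pow(ciphertext, d, n)        # only the key holder unlocks: 65
print(n, ciphertext, recovered)
```

In the padlock language used above, publishing (n, e) is handing out open padlocks while d stays with the owner, and the certificate infrastructure is what assures you the padlock really belongs to the site you think it does.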
So, Tony, it sounds as if codes, encryption, these things are pretty much uncrackable. But as Ian said, it can be the humans that are the weak link in the chain. Could you give us some examples, or an example historically, of those mistakes that led to the breaking of a code? Right at the very beginning, in November 1914, the way in which the Germans added a key to the values from their codebook wasn't understood. You know, they just saw these messages where, even when you compared the values to the captured codebook, nothing was coming out.
And it was one of the people working in the Admiralty, a man whose father was German, his name was Rotter, and he just thought: the Germans are a methodical people, and they send messages, and every one of them is going to be numbered sequentially. So if we've seen message number 151 and 152 in plain text, we can guess that the next one is going to be number 153, even though it's enciphered. And it was using that as a starting point that he started to look: well, how could this set of letters or numbers represent the number that I think the message is going to have? And then working back from that. So it's understanding human behaviour as much as the mathematics. Yeah. One of the earliest working aids that was developed at Bletchley Park was a book of German swear words, and that's because, on variants of the Enigma machine that have the rotors with letters on, when you set up the machine to send a message you have three rotors that you have to set at some random setting. But it's humans, and they don't like random settings, and it's 18, 19, 20-year-old young men who've been conscripted into the German army, and they really hate what they're doing. So what are they going to do? They're going to use girls' names, or they're going to use swear words, or they're going to use the setting the rotors got up to at the end of the last message, because it's easier than changing anything, just to go that way. And so there's a whole set of ways of guessing where things might be starting that are based just on your observation of what people do. People do daft things when you let them. Again, I want to ask, because I know The Imitation Game may well not have been filled with veracity, but this is in terms of the fact that once a code has been broken, there's the decision of when you risk changing your actions in a way that may well give away that the code has been broken. I mean, that morality, that ethics, have we seen that in history?
I mean, in that film, they talk about not changing, I believe, the direction of a boat, because that would give it away, and they're doing a whole balance of how many lives saved. They're looking at a pragmatic... So do we see that in history, that ethical dilemma? The Imitation Game gets two things absolutely right: there was a Second World War, and Turing's first name was Alan. To use it as a guide for the way we work would be slightly odd.
That sort of decision on the balance of the use and the protection of intelligence has always been in our business. Again, if you look at December 1914, when the first indications are that the Germans might be going to shell towns on the North Sea coast: how do you get that information to those towns without letting the Germans know that you're reading their ciphers? This is a constant, and there's a whole process of saying, how can you get a plausible alternative source for the information? So you could make it look as though an agent had got hold of it, or, if you know a convoy is going to be in a particular place, you send a plane up to spot the convoy. Even though the plane is in danger then of being shot at itself,
Starting point is 00:30:18 at least you've got something that would give another reason for finding that information. The biggest myth of all is that you'd sacrifice lives, you'd sacrifice British lives just to protect your secret. We've never, ever done that. The most potent myth is that Churchill sacrificed Coventry to protect the fact that they'd started reading Enigma, and it's not true. It wasn't true then, and it's never, ever been true.
Because, yes, there is the dilemma of how do you protect your sources, but there's an absolute bottom line: it is not at the cost of British or Allied lives. Just before we move on from the film that perhaps I feel awkward even naming now (I won't name it, I won't say it again, I won't say it again), one of the things that came out of that, and Bletchley Park, and I hope this isn't a myth, because I'm looking around the room now at all of these people in here, and actually the gender split is pretty good from what I can see; it could even be approaching 50-50, or 30 per cent women perhaps. And there was this association with Bletchley, that women had a big role there and were recruited, and for the first time women mathematicians were given a huge role. Do you think that has continued, to what I can see around the room now, which is a massive female presence here? At its height, at the end of 1944, the workforce at Bletchley Park was 76.3% women. The only two jobs that weren't done by women were the armed guarding of the site and the senior management. And that's really the clue as to, if you like, the wasted opportunity: that nobody conceived of the idea that women could be right at the top of the organisation, even though they were able to demonstrate that they could do any of the work that was there.
That raises an interesting question, actually, Ian, which is: what makes a good codebreaker? Yeah. So I think we touched on this a bit earlier, but to be a good codebreaker, or to be the sort of person that GCHQ needs to do what it does, because we do a lot of things, right, and codebreaking is obviously a big part of it, but it's not all of it.
I think there are a couple of things that we've really learned matter. Probably the thing I would call out more than anything else is a problem-solving mindset. There's something about people who are interested in solving problems; they don't have to be technical problems particularly, it's just that sort of tenacity, to want to know, to want to solve a problem, to understand it. Again, we've mentioned this before, but teamwork: team working feels like a really, really important thing here now, and again, the different kinds of ways that people think, and being able to work together to solve problems rather than just being on their own, I think is really, really important. It's all very normal here. I went past the cafe and it just said, today's special: baby food, laxative and butter stew. Can I ask one quick question? And I'm sorry if it's a silly question, but I'm dying to ask. A friend of mine had high-up security clearance in Number 10, and had to have lots of clearance chats and instructions about what she could and couldn't do in the office. And I always wondered, I wasn't ever sure if she was pulling my leg or not about this, but she said that it was possible, the technology existed, that if she was talking about something classified
Starting point is 00:33:52 in the building, that someone could shine a laser onto the windowpane and it could interpret the vibrations of the windowpane as she was talking and then that could be turned into words that someone could understand what she'd said. Is that true? He's not going to answer that. But you don't listen to what he just said. I can neither confirm nor deny
that. What? Oh no. But it is in a lot of movies, right? So it probably is true. A bit like The Imitation Game, right? Damn it! What, are you going to put me through another 20 years of not knowing? Your friend who's high up in government,
is it actually Judi Dench? That's right. We asked our audience, our GCHQ audience, a question, and we asked them, who would you like to bamboozle and why? And, you know, redacted. Redacted. Well, this is a very good answer. Everyone except the recipient.
Starting point is 00:34:49 That is the point of a secret code. My eight-year-old daughter, she thinks she knows everything. Codes? None of that nonsense. I'm an engineer. The people who wrote the GCHQ puzzle book, because it's too darn difficult. If I was going to ask you one, Brian, you know everything, don't you? Which element is missing?
Arsenic, astatine, bismuth, carbon, copper, iron, krypton, neon, oganesson, phosphorus, silicon, tennessine, tin, xenon. You have seven seconds. I bring him here to you to see his doubtful face. It doesn't happen often. I was looking through that book earlier on, and I can't do any of it, but then I was heartened to find
Starting point is 00:35:31 that there's one page which says, can you identify this person? And it's a picture of Victoria Wood. So I thought, yeah, maybe I am the ideal candidate for GCHQ after all. Does anyone know the answer to that in this book, by the way? They can't tell you, but you can turn to the answers section. Now, I'm not saying that you might not be the best spy. How on earth can I break this code?
Index, answers, answers, answers! You said it on the radio. People want to know what the answer is now. I'm just saying helium, because it wasn't in the list. I've got a reasonable chance. Who would you like to utterly bamboozle with your secret code, and why? Brian. Because mathematicians beating physicists is a fun fact
Starting point is 00:36:08 of life. Thank you very much to our panel and that brings us to the end of this series. We're going to leave you with two problems and you have until the next series to come up with a solution which will be probably January 2019. Whoever sends us the answers first will receive one of Brian's out of date
Starting point is 00:36:23 Mensa membership cards. He knows the sequence of any shapes. Anyway, what is the next letter in this sequence? M-V-E-M-J-S-U? N. You don't have to wait for the next series. It's N. And I love that.
Here are a group of people who have to hold in secrets. I can't hold that in. You knew... I definitely know what that is. So it was a lovely moment when someone just lost their job. Ian, you need to keep an eye on that. And if that's too easy, how about... Robin and I went to the cinema.
Starting point is 00:37:00 In cryptic crossword clue terms, the film we saw was £6,000. First word, six letters. Second word, seven letters. What was the film? Four monkeys. A word. We'll give you the answers at the end of the next series.
Starting point is 00:37:24 So that is the last Infinite Monkey Cage of this series. I will be joining you again in January. Robin won't be with us because he wanders off at lunchtime and now he knows too much. Goodbye. I did look in the secret room. APPLAUSE In the infinite monkey cage Without your trousers In the infinite monkey cage Turned out nice again.
Starting point is 00:37:52 When was the first time that you saw your face tattooed on somebody's body part? This summer, don't forget to pack a castaway or two. There's a constant voice that's saying, Oh, stop it. Oh, just give up. Rubbish. You're a constant voice that's saying, oh, stop it. Oh, just give up. Rubbish. You're a ball of energy. Oh, no worries,
Starting point is 00:38:08 because I'm excited to be here. Travel with Matt Smith, Charlie Brooker, Nicola Adams, or over 2,000 other companions. I stepped into the centre of the ring and when I won, they said it was equivalent
to the same amount of noise as a jumbo jet taking off. Search for Desert Island Discs wherever you get your podcasts.
