Interview: Dr Brooke Rogers, Counterterrorism Innovator
Dr Brooke Rogers is an American social psychologist, who has directed her work and research at King’s College London towards analysing how terror and risk can be managed in the public realm. With her humanities background, she’s uniquely placed to assess the impact of the constant innovations made in counter-terrorism; every technological breakthrough brings with it questions about feasibility, privacy, and trust. Bad Idea spoke to her about the kinds of technologies being rolled out in the name of fighting terror, and about their often complex effect on those they are designed to protect.
Bad Idea: How has protection from terrorism changed in recent years?
Brooke Rogers: They used to build buildings and then start building the security around them – nowadays it's much more extreme: people are talking to designers and architects about how to make a building that people can actually use, but which has more security.
If you go behind the scenes in new-build shopping centres, they are absolutely hi-tech – the security behind the scenes is fascinating, and it has been built in ahead of time. We've got to a point where a lot of this is now common, to the point where a lot of the places that would be really attractive targets are very well protected. That shifts the threat to softer targets, so we need to look at flexible ways of protecting, say, events or concerts that don't happen in the same place every time. The Olympics is a great challenge, in a good way, in terms of flexibility of methods. We have a lot of bells and whistles, but some equipment exists in other fields that we haven't even thought to apply yet. Quite often a technology that's being used in environmental monitoring for greenhouse gases all of a sudden has an application to security. We're really encouraging this innovation of design, and of application.
BI: Can you give an example of this kind of cross-pollination of technology, from one field into counter-terrorism?
BR: Some of the applications I've seen come from the health field, where there are heart monitors that can read someone's heart rate from quite far away. It isn't implemented right now, but there are talks about whether we could use this in airports, to see if someone's anxious. There are big issues around that, because if someone's really nervous about flying, then their heart rate is going to be up.
BI: What other new technologies are coming online?
BR: It varies between countries, but in Washington DC, say, we can install sensors in the environment that can pick up a chemical release – be it from a truck that has turned over and spilled chlorine. They can detect levels of radiation. These sensors can become both a help and a hindrance, and it is not unheard of for there to be so many false alarms that the sensors are actually turned off.
Some airports are doing it very obviously, others more subtly, but since swine flu came out they've been taking heat readings off us as we come off the plane, to see if we have an elevated temperature. The heart-rate idea could work in a similar way. There are also other programs that can do things like gait analysis – understanding whether somebody's moving in a nervous fashion. And if someone leaves a bag in a busy train concourse, these systems can scan the environment and recognise that something that was not stationary is now stationary.
There’s no one solution – all of these sensors and ways of assessing the environment have to be built in with a lot of other ways of doing it, to make sure we’re not getting false readings. So if someone’s stressed out and moving erratically, maybe they’re running for a train.
BI: And that’s the difficulty with these technologies – when they get something wrong, it could be extremely upsetting and offensive for a wrongly accused person, let alone inefficient and potentially dangerous. How can you draw the line between what’s acceptable and useful, and what isn’t?
BR: Most of counter-terrorist strategy is about enabling people to carry on as normal. We can't have airport-style check-in at train stations because it's going to slow everything down. It's finding the happy medium. Some of my colleagues and I look at public acceptability of counter-terrorism technologies, from CCTV to something more obvious, and at what point we oversecure an environment so that people don't want to be there. My line is whether or not it discriminates against any one group, which brings all kinds of questions about profiling – at what level would profiling be used to filter this data, and so on.
Most people aren't bothered by CCTV – there's a lot of it. But a lot of people are losing faith in CCTV: it's something that works after the event; it's great after they've killed you and they go to court. What we're looking at in terms of technology is something that can deter, so it can stop something from happening, or if something is happening then it can help us respond very quickly. The British public put up with CCTV, and they assume there are environmental sensors out there – they'll go with that, because it's not impinging on any one social group. It's when you start violating rights, or slowing people down, or targeting specific groups, that they really revolt against it.
BI: How can you improve “public acceptability”, while still maintaining security?
BR: All this technology is grand, but there’s nothing like a human being walking around, or a human with a sniffer dog. Just having a physical presence makes people feel a lot safer.
Technology can be fallible, as we can be. It's like when you talk about the energy solution for the UK – you always hear about this "basket of technologies". I think we need to think about that as well, to combine technology with real people. It's like when you ask a police officer "why did you pull that car over?", and they say "I don't know, there was just something funny about it, I can't put my finger on it", and the car turns out to have a body in the trunk. There's nothing like experience and instinct. But it can get you into trouble; there are all the debates around stop and search.
BI: So it’s a question of getting people to feel safe, and getting them “on side”?
BR: The government is keen on engaging members of the public, encouraging them to claim ownership of their environment. It's trying to bring back this community spirit, like you all had in the Blitz, when people got together. They're really worried about frightening people. They've got a lot in place in order to respond, in order to make sure places are secure, but the human element and human senses are so valuable, and so they're trying to find creative ways to talk to people about their communities. I'm sure you've seen the posters on the Tube – "If you suspect it, report it", and so on – but after a while that kind of thing fades into the background. Like "mind the gap" – how many times have you heard that? You stop listening. So they're trying to find some really engaging and creative ways to let members of the public take ownership of the environment.
Designers are trying to design environments that are very welcoming, but very secure as well, and make it somewhere that people want to go. These posters that show ladies having lunch and say “a terrorist attack was prevented because Mrs X saw something suspicious and reported it”, they’re trying to make a direct link with taking action in your environment. They don’t want to frighten people, but they don’t want to say “we want you to take responsibility for the environment because we can’t handle it”. They can handle it, but it’s almost like creating a public army – if everyone’s aware, then we’re so much better off.