The big change was what happened in May 1997, when Tony Blair was elected Prime Minister. Under Margaret Thatcher (Prime Minister from 1979 to 1990) there was a whole series of terrorist attacks in Britain, including the assassination of a number of senior political figures (and personal friends of the Prime Minister) and an attempted assassination of the PM herself. On one occasion while John Major was PM (1990-97), the IRA managed to launch a mortar attack on 10 Downing St while Cabinet was in session (nobody was hurt). What didn't happen between 1979 and 1997 was the introduction of any significant new counter-terrorist measures. When Thatcher became PM, the main counter-terrorist Act was the Prevention of Terrorism Act, passed (under Labour) in 1974. Thatcher's and Major's governments made some minor additions to the PTA, but they didn't feel the need to pass a counter-terrorist Act of their own, or to respond to each attack by bringing in a new law.
In 1997 Blair became Prime Minister. In 1998 the IRA signed a peace deal - the Good Friday Agreement - which effectively ended the main threat of terrorism at that time. Over the next ten years, Blair's government passed
- the Criminal Justice, Terrorism and Conspiracy Act 1998 (in response to the Omagh bombing carried out by a dissident Irish republican group)
- the Terrorism Act 2000
- the Anti-Terrorism, Crime and Security Act 2001 (ATCSA) (in response to 11/9/2001)
- the Prevention of Terrorism Act 2005 (in response to ATCSA's detention powers being ruled incompatible with human rights law)
- the Terrorism Act 2006 (in response to 7/7/2005)
- the Counter-Terrorism Act 2008 (in response to the Glasgow Airport attack of 30/6/2007)
Why is there such a big difference between the records of the Conservative governments of 1979-97 and the Labour governments of 1997-2010? There are a number of different ways of looking at it, but I think a key difference is that the Blair government used the precautionary principle when thinking about terrorism. The usual way to think about security risks is to quantify the severity of the danger and the probability of it happening, and multiply the two together. Something that only has a 1% chance of happening, but which will cause £10,000 worth of damage if it does, has a projected cost of 1% x £10,000 = £100; as such, it has exactly the same importance as something which will only cause £200 worth of damage but has a 50% chance of happening. The same calculations can be done using projected deaths, if you're feeling macabre.
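The standard risk calculation described above can be sketched in a few lines of Python (the function name is mine, not a standard one), using the two examples from the paragraph:

```python
def expected_cost(probability, damage):
    """Projected cost of a risk: probability of it happening
    multiplied by the damage it would cause."""
    return probability * damage

# A 1% chance of £10,000 worth of damage...
low_prob_risk = expected_cost(0.01, 10_000)   # £100

# ...has the same projected cost as a 50% chance of £200 worth of damage.
high_prob_risk = expected_cost(0.50, 200)     # £100

print(low_prob_risk, high_prob_risk)
```

The point of the calculation is that it puts very different kinds of risk on a single scale, so they can be ranked against each other when prioritising.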
Very often, of course, we don't know for certain how high (or low) a probability is, or how much damage something will do if it happens. (How to quantify the damage which would be done if Al-Qaida crashed an aeroplane in Central London? If they'd done it during the Olympic opening ceremony? Killing the Queen?) What the precautionary principle says is that, where uncertainties like this exist, we need to assume the worst. Where there is uncertainty, and where there is a possibility of a risk being at a certain level, the burden of proof lies with anyone saying that the risk is lower.
To see how this way of thinking works, suppose that intelligence suggests that a terrorist attack could cause between £1,000,000 and £10,000,000 worth of damage, and that it has a probability of about 1%. Normally we would quantify the danger at about £5,000,000 and then multiply by 1%, to give a projected cost of 1% x £5,000,000 = £50,000. This is quite low; if we are prioritising, this risk will probably be outranked by other risks with less harmful consequences but higher probabilities (e.g. a 20% chance of a risk causing £300,000 worth of damage). If we apply the precautionary principle, however, all that matters is the top end of the scale of damage: unless somebody can prove that a terrorist attack will definitely cause less than £10,000,000 worth of damage, that is the figure that should be used. What's more, under the precautionary principle the 1% probability should itself be seen as a range, running from 0 to 1: 99 times out of a hundred it won't happen (probability = 0), but one time in a hundred it will (probability = 1). Instead of a projected cost of £50,000, we end up with a worst-case projected cost of £10,000,000 x 100%, i.e. £10,000,000.
Using the precautionary principle in this context means taking terrorism very seriously indeed - to the point of committing the government to policies which are very expensive in their own right. I think it's worth thinking about the wars in Iraq and Afghanistan in this light.
I came across Prince Harry's interview last night regarding his return from active duty in Afghanistan. He mentions in the interview that he killed 'enemy Taliban' targets. I think this was ill-advised and could potentially increase the risk of either a personal or western-world revenge attack. What do you think?
Hi Keith,
Sorry I missed this at the time. Yes, I think Prince Harry's comments were ill-advised at best. What he said tended to trivialise the war - literally comparing it to a game - which is a bit of an insult to all the Afghan and Pakistani civilians who have been killed by US and British weaponry. Even in a conventional war, a bit more respect would be appropriate - enemy soldiers are some mother's sons, after all. All the more so in a guerrilla war with massive scope for "collateral damage". Hopefully that one interview won't be enough to trigger off revenge attacks, but it certainly won't have helped win hearts and minds.