Friday, November 2, 2012

On prediction in terrorism research ...

Now that the previous post on predicting future tech trends has been around for a bit, here is one more take on prediction.

Before that, some notes based on my studies...

Why does a terrorist group's activity go down?
1) It is planning one or more big attacks and is corralling resources toward that goal.
2) It is trying to lull the State into a false sense of security and to exploit the vulnerabilities that accrue over time.
3) The group's intentions have changed and it is exploring political options.
4) It is sharpening the fighting unit with training/changes in tactics.
5) It is really down and out.

Why does a terrorist group attack?
1) It wants to prove to its detractors (including the State) that it is (still) a force to be reckoned with. In other words, it wants to establish its legitimacy as a threat.
2) It wants to take revenge on/commemorate certain past incidents.
3) It wants to pursue certain bold/new strategies to bring the State to its knees.
4) It wants to convince the State that it remains a potent fighting machine, especially as its numbers shrink. Unlike in politics, inaction is not an option, as internal dissension could break the viability of a terrorist group.
5) It wants to capture certain resources to further the business of terrorism. 

Options for fighting terrorists/insurgents:
With the caveat from Peter S. Probst that:
"Fighting terrorism is not a game for an impatient people. One must carefully think through second and third order consequences of any move. It’s a bit like three-dimensional chess. Good intelligence, political will, imagination, an educated and engaged public, uncommon courage, and disciplined restraint are necessary. The way we choose to prosecute this endeavor will largely determine the winners, the losers, and the price paid by each.",  
the following measures are ordered from the crude to the not-so-crude, and from those that take the shortest time to enforce to those that take the longest.

1) Neutralize the rank-and-file or (exclusively) the top echelon of the outfit: incarcerate, kill, maim/injure. Set an example in a court of law as and when possible, without fueling counter-propaganda or conferring martyr status. The above does not answer the following questions: which State outfit should be used to fight the insurgents/terrorists (armed forces, paramilitary forces, police forces, etc.)? What defines an act that requires intervention and oversight from elsewhere? How does one balance strategies ranging from kill-all to kick-all?
2) Cut off sources of funding (intra-State, neighboring-State, transnational, or international).
3) Use diplomacy to have them recognized as worldwide terrorist outfits, thereby denying them safe havens and sources of finance abroad. Work with international banking/police systems to prevent illicit money transfers and to capture emigrant fighters.
4) Dispel the ideology that feeds the outfit through active propaganda efforts.
5) Negate the sources of grievance (the "whine profile") that feed the outfit, within the bounds of financial affordability. In other words, engage in a battle for hearts and minds: beat them over time and wean away the silent supporting cast (that is, remove the oxygen to extinguish the fire).

Another track within these strategies is to sue for peace, gather intelligence in the process, and use that intelligence to cull out and neutralize the outfit or its top echelon. The risk of backfiring is quite high.

Problems with current terrorism analysis and counterterrorism research:

1) All attacks contribute equally to statistical analysis. In reality, attacks are unequal. Each attack has a purported motive/intent and provokes a certain counter-response, and unless this tit-for-tat is accounted for, there are no credible lessons to be learned from the numbers or from statistical analysis based on them.

2) Not all counter-terrorism policy measures have the same impact. Not all measures have an impact over a defined period of time. Not every impact a measure has on the targeted population can be measured. In other words, the psychological/sentimental impacts of a given measure are shades of gray, subject to the same distortions as any qualitative treatment.

3) Most things can be predicted. Unfortunately, this seems to be a dictum in policy circles: if you cannot predict when a future attack will happen (or at what rate future attacks will happen), that is treated as a failure of terrorism research; hence the perceived need to model the past and extrapolate it into the future. This myth deserves a detailed riposte.

Far from such idealizations, terrorists are neither rational nor irrational: they are just human. As much as terrorism is theater (to borrow Brian Jenkins' famous quote), it is also a business and a life-or-death proposition for the people who have invested in it. Policies adapt and move on, people get bumped off and new thinking dominates, and there is only so much that the past can mean for the future.

The predictive power of a counter-measure has fundamental limitations in practice. This is obvious to anyone with a fair sense of estimation theory, where the Cramér-Rao lower bound serves as a fundamental limit below which the variance of an unbiased estimator (a sub-standard metric in reality, but nevertheless) cannot be driven. The situation is more complicated in general statistical models that are neither linear in "noise" nor i.i.d. Thus, terminologies such as "predictive policing" or "predictive counter-terrorism" are buzzwords more focused on accounting for the money invested than on real impact on the ground.
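As a minimal sketch of this limit (under the admittedly simplistic, hypothetical assumption that attack counts per period are i.i.d. Poisson with rate lambda), the Cramér-Rao bound says no unbiased estimator of lambda from n periods can have variance below lambda/n. The sample mean sits exactly at this floor, so no amount of modeling sophistication can shrink the uncertainty further under these assumptions; the rate and sample sizes below are illustrative, not real data:

```python
import math
import random

def poisson_sample(lam, rng):
    """Draw one Poisson(lam) variate via Knuth's multiplication method."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= threshold:
            return k - 1

def crlb(lam, n):
    """Cramér-Rao lower bound for an unbiased estimator of a Poisson rate
    lam from n i.i.d. observations: per-sample Fisher information is 1/lam,
    so the bound on the variance is lam / n."""
    return lam / n

rng = random.Random(0)
lam, n, trials = 2.0, 50, 2000  # hypothetical attack rate, periods, repeats

# Monte Carlo: variance of the MLE (the sample mean) across many
# repeated "studies", each observing n periods of attack counts.
estimates = [sum(poisson_sample(lam, rng) for _ in range(n)) / n
             for _ in range(trials)]
mean_est = sum(estimates) / trials
emp_var = sum((e - mean_est) ** 2 for e in estimates) / (trials - 1)

bound = crlb(lam, n)
print(f"CRLB = {bound:.4f}, empirical variance of MLE = {emp_var:.4f}")
```

The empirical variance hovers at the bound (lambda/n = 0.04 here) and never materially below it; the point is not the particular numbers but that the floor exists no matter how clever the estimator, and real attack data, being neither independent nor stationary, only makes matters worse.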

Yahoo! sociologist Duncan Watts further avers that common sense is over-rated:
"For almost as long as it has existed, that is, sociology has had to confront the criticism that it has “discovered” little that an intelligent person couldn’t have figured out on his or her own.   
The problem with common sense is not that it isn’t sensible, but that what is sensible turns out to depend on lots of other features of the situation. And in general, it’s impossible to know which of these many potential features are relevant until after the fact (a fundamental problem that philosophers and cognitive scientists call the “frame problem”).
Nevertheless, once we do know the answer, it is almost always possible to pick and choose from our wide selection of common-sense statements about the world to produce something that sounds likely to be true. And because we only ever have to account for one outcome at a time (because we can ignore the “counterfactuals,” things that could’ve happened, but didn’t), it is always possible to construct an account of what did happen that not only makes sense, but also sounds like a causal story."

Former intel czar, Peter S. Probst, writes the following in his famous essay on measuring success in counter-terrorism efforts:
"... short term success paved the way for long term failure and a significant setback in the war of hearts and minds. The point is defining success can sometimes be tricky depending on one's ultimate objectives and timeframe.
Countless attempts to implement quantitative systems to evaluate success too often have backfired with unintended and serious consequences.
The way we use statistics has a way of distorting reality and, frankly when it comes to Intelligence, one needs to regard statistics and those who tout them with no small measure of suspicion."
While I admittedly belong to the latter club that Mr. Probst criticizes, I would like to point out that the problem is not with statistics as such. The problem lies with how statistics are used: at the end of the day, if garbage goes into a statistical decision-making framework, nothing more than garbage shall come out. If useful/cleaned/processed data enters the framework, inferences that are often deep, and patterns that are not clear to the naked eye, can show up. Of course, the price to pay for such an output is rigorous training, a "fad" for which the enthusiasm is low (as has always been the case).


