Intellectual Self-Defense; or, Honing the Bullshit Detector
1. "I trust you, but I want more time to check": Just because someone wins a particular argument does not mean they are right. The fact that you do not have facts on hand to support your position or attack theirs does not mean that such facts are not out there to be found. If all else fails, simply say, respectfully, "Well, you've pointed out some legitimate ideas. I'll have to think about it and check whether it's true."
After all, if someone is asking you to change your political ideology or adopt a particular proposal, wouldn't it be prudent, if you have the time, to double- and triple-check the reasoning?
2. Separate fact and value arguments: Frequently, someone who is trying to pull a fast one will fail to distinguish between their factual analysis and their conclusions. Conclusions cannot be drawn from factual evidence alone, no matter how overwhelming (except in rare cases where the conclusion is simply a summary of the evidence): there must be some broader explanatory theory, which typically incorporates descriptive, causal and prescriptive aspects.
For example: If someone says, "Iraq has WMDs. We must kill them before they kill us," one need not know an ounce about Iraq's WMD capacities to say, "Hold on. Some skepticism is in order, given that this claim could be made by people who are not well-meaning, or who are simply lying. In any case, bombing the country could leave it open to looting, so even if your facts are right, your prescription may be wrong. And it is still not ethical to bomb innocent people."
Watch carefully for statements of empirical fact that are mixed in with personal ethical principles. This comes up, say, in discussions with Christians, who will often speak of "ethics" but mean by it their own dogma.
3. Check for predictive value: Any number of theories can be developed in hindsight to account for something. One can even have situations where someone proposes the exact opposite of the intuitive conclusion: say, a war is being lost with tactic X, but the general proposing tactic X insists that the failure is proof the tactic is being misapplied. The real value of a theory is its ability to describe things in a consistent way that brings to light new realities and, when speaking of the physical world, its ability to use those new realities to predict what will happen in the future.
One also has to take into account that even predicting things isn't that hard given a confluence of chance and appropriately vague statements. This is the fallacy of horoscopes and fortune cookies: they can be applied to anything, and then that application is used to establish their validity.
4. Check the nature of the causality: Even if someone establishes a very good correlation between X and Y, they still may not have done their job.
For one thing, correlation does not establish causation: it is at best a necessary condition, never a sufficient one. There can be a confounding causal influence lurking behind both variables. Take the number of homeless people and the number of doctors in a city. There is probably a very good correlation between them: the more doctors, the more homeless. But that is probably an effect of the size of the city: a larger city means more doctors and more homeless. Control for the size of the city and you will likely find that the remaining correlation falls within the statistical margin of error.
This isn't a quibble. If someone said that regulating the number of doctors in a city would solve the homeless problem, you'd be very skeptical. If they then cited the correlation as evidence, you would now be able to rebut it.
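The doctors-and-homeless confound can be sketched numerically. The numbers below are entirely made up for illustration: city population drives both counts, so the raw correlation between doctors and homeless is very strong, yet it largely vanishes once you look at per-capita rates (one simple way of controlling for city size).

```python
import random
from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(0)
# Hypothetical cities: population (in thousands) drives BOTH variables.
populations = [random.uniform(50, 5000) for _ in range(200)]
doctors  = [0.002 * p + random.gauss(0, 0.5) for p in populations]
homeless = [0.004 * p + random.gauss(0, 1.0) for p in populations]

# Strong raw correlation: both simply track city size.
raw = pearson(doctors, homeless)
# Control for size by comparing per-capita rates: the link collapses.
controlled = pearson(
    [d / p for d, p in zip(doctors, populations)],
    [h / p for h, p in zip(homeless, populations)],
)
print(f"raw correlation: {raw:.2f}, per-capita correlation: {controlled:.2f}")
```

Per-capita division is the crudest possible control; real studies would use partial correlation or regression, but the moral is the same.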
This happens more often than one thinks. You'll hear it in discussions of pornography and violent media: there's a correlation between using pornography and rape, say. But (a) if most men look at pornography, isn't it unsurprising that most rapists also look at porn? In fact, looking at the proportion of porn users among sexual offenders, one often finds FEWER than would be expected; (b) are those people porn users because they're rapists, or rapists because they're porn users?
Point (b) above segues into the next thing one needs to check: reverse causality. Does the causal influence flow the way they say it does? This also isn't a quibble.
Also remember that even very good studies and polls often have margins of error of around ±3%. Add in any number of other chaos factors and instrumental errors, and one can be deluded very quickly by statistics.
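That ±3% figure isn't arbitrary. For a simple random sample, the standard 95% margin of error for a proportion is roughly 1.96 × sqrt(p(1−p)/n), which is worst when p = 0.5. For a typical 1,000-person poll, that works out to about 3 points. A minimal sketch:

```python
from math import sqrt

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion p with n respondents."""
    return z * sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) for a typical 1,000-person poll:
print(f"{margin_of_error(0.5, 1000):.3f}")  # about 0.031, i.e. roughly +/- 3%
```

Note this formula only covers pure sampling error; question wording, non-response, and the other "chaos factors" above come on top of it.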
5. Some math tricks: These may sound obscure, but they're more common than you might think, particularly in economics.
One common mistake is to confuse the "mean" with the "median". Both may seem to be attempts at measuring the average, but they can generate rather different results.
Take the set 1, 7, 9, 10, 10. The median is the middle number of the sorted set: 9. The mean, in contrast, is the sum divided by the count: 7.4. (And as an aside: the "mode" is 10, as it is the number that appears most often.)
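For what it's worth, Python's standard library computes all three; running it on the set above confirms the numbers:

```python
import statistics

data = [1, 7, 9, 10, 10]
print(statistics.median(data))  # 9: the middle value of the sorted set
print(statistics.mean(data))    # 7.4: the sum (37) divided by the count (5)
print(statistics.mode(data))    # 10: the most frequently occurring value
```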
Both are good measurements, but they mean and imply different things. What the above example indicates is that values much smaller or larger than the rest will throw off the mean by a substantial amount while barely moving the median. In economics, this means that a society with a lot of inequity will have a mean income that looks nothing like its median income.
The median income is a good measure of an economy because it indicates the income above which half the population lies and below which the other half lies.
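To make the income point concrete, here is a made-up five-person economy with one very high earner; the median stays put while the mean balloons:

```python
import statistics

# Hypothetical incomes: four modest earners and one outlier.
incomes = [20_000, 25_000, 30_000, 35_000, 1_000_000]
print(statistics.median(incomes))  # 30000: what a typical person earns
print(statistics.mean(incomes))    # 222000: dragged far upward by the outlier
```

Quoting the mean here would suggest a prosperity that four out of five people never see; the median tells the more honest story.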