Please indulge me, fellow TFLers. It seems to come up again and again that strong claims are made on the basis of 'scientific proof' of one thing or another. I feel compelled to point out that most people just don't understand how research works. I'm no scientist (and I don't play one on TV either), but every day on the job I'm required to use the results of other people's science, and I have to decide whether what they claim is believable.
Turns out that a lot of what we 'know' just ain't so. Case in point: the notorious Kellerman study showing how we're a billion times more likely to be massacred with our own gun than to be defended by it. By now most people here understand the methodological flaw in that study: it counted only killings or shootings, disregarding the much more common scenario of assailants being frightened off by the sight of the intended victim's gun.
Apply that lesson to most of the science you hear reported. If possible, try to find out the method of the study. If you see bad methods, ask yourself why. Why would a university prof not think of the existence of non-lethal deterrence with a gun? Because of the bias inherent in the prof's approach to the issue.
BIAS is the bugbear of research. It can creep in at the very beginning of a study, in the nature of the question it seeks to answer, as in studies designed to assess the effects of global warming on frogs. Such a study assumes that GW exists, which is a big assumption indeed.
Bias can arise from selection of the test subjects. That's what I meant when I said EMS Guy suffered from selection bias: the population from which he derived his conclusion is made up of the people calling 911. That's not a representative sample of much of anything, and it can lead you to overestimate the prevalence of disease.
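You can see how badly a self-selected sample distorts things with a two-minute simulation. The numbers below are entirely made up for illustration (a true prevalence of 5%, with sick people assumed far more likely to call 911 than healthy people); the point is the mechanism, not the specific figures:

```python
import random

random.seed(42)

# Hypothetical numbers, purely for illustration: 5% of the general
# population has some condition. True = has the condition.
population = [random.random() < 0.05 for _ in range(100_000)]

def calls_911(sick: bool) -> bool:
    # Assumed call rates: 40% of sick people call 911, 2% of healthy people do.
    return random.random() < (0.40 if sick else 0.02)

# The "EMS view" of the world: only the people who called.
callers = [sick for sick in population if calls_911(sick)]

true_prevalence = sum(population) / len(population)
observed_prevalence = sum(callers) / len(callers)

print(f"True prevalence in town:    {true_prevalence:.1%}")
print(f"Prevalence among callers:   {observed_prevalence:.1%}")
```

With these assumed rates, roughly half the callers are sick even though only about one person in twenty actually is. Nothing was miscounted; the sample itself did the lying.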
Bias can creep in after all the data is gathered. The researcher can make math errors. Or he can use an inappropriate statistical test. Or, all too commonly, he can draw a conclusion that just isn't warranted by the data.
So you've got to be very, very careful, in designing your study, interpreting the results, and reporting your conclusions, that you're not deceiving yourself and making the facts fit the picture you have in your head. It's very easy to make a mistake even with honorable intent.
Then it gets worse, because the study is reported by people who have no idea how to judge research, and who have biases of their own. That's how Arming America got a free pass in the press.
I once heard of a study of the correlation between human birth rate and stork populations in a Dutch town. It showed 100% correlation: more storks, more babies. Ergo, the researcher said, storks bring babies. He was trying to illustrate bad research: of course, storks nest in chimneys in Holland, and when birth rates are up, more houses are built, and more nests for storks.
I'm not saying we should abandon science. I'm just saying that we humans are very good at b**llsh**ting ourselves, and we really have to watch it. Especially when we propose to use science to justify abridging anyone's rights.
Thanks for listening. This has been a public service of Khornet.