Richard Feynman once wrote a piece called Cargo Cult Science. It focuses on people who get lost in the trappings of science and miss the point of the scientific enterprise. In explaining what he means by "cargo cult science", he writes:
In the South Seas there is a cargo cult of people. During the war they saw airplanes land with lots of good materials, and they want the same thing to happen now. So they've arranged to imitate things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in, with two wooden pieces on his head like headphones and bars of bamboo sticking out like antennas—he's the controller—and they wait for the airplanes to land. They're doing everything right. The form is perfect. It looks exactly the way it looked before. But it doesn't work. No airplanes land. So I call these things cargo cult science, because they follow all the apparent precepts and forms of scientific investigation, but they're missing something essential, because the planes don't land.
Doug Hoffman connects cargo cult science with climate science in his piece Cargo Cult Climate Science at The Resilient Earth:
In this age where climate scientists are embroiled in public scandal, it is instructive to compare the actions of the CRU crew and others so recently in the news with Feynman's ethical standards. For those not following the Climategate scandal, a number of prominent climate change alarmists were caught out withholding and ultimately destroying climate data rather than letting critics review the data themselves. Here is what Feynman said about such shenanigans:

[T]here is one feature I notice that is generally missing in cargo cult science. That is the idea that we all hope you have learned in studying science in school—we never explicitly say what this is, but just hope that you catch on by all the examples of scientific investigation. It is interesting, therefore, to bring it out now and speak of it explicitly. It's a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty—a kind of leaning over backwards. For example, if you're doing an experiment, you should report everything that you think might make it invalid—not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you've eliminated by some other experiment, and how they worked—to make sure the other fellow can tell they have been eliminated.

Feynman summarized it this way: “the idea is to try to give all of the information to help others to judge the value of your contribution; not just the information that leads to judgment in one particular direction or another.” Yet, the University of East Anglia's Climate Research Unit breached Britain's Freedom of Information Act by refusing to comply with requests for data concerning claims by its scientists that man-made emissions were causing global warming.
....
Details that could throw doubt on your interpretation must be given, if you know them. You must do the best you can—if you know anything at all wrong, or possibly wrong—to explain it. If you make a theory, for example, and advertise it, or put it out, then you must also put down all the facts that disagree with it, as well as those that agree with it. There is also a more subtle problem. When you have put a lot of ideas together to make an elaborate theory, you want to make sure, when explaining what it fits, that those things it fits are not just the things that gave you the idea for the theory; but that the finished theory makes something else come out right, in addition.
Again, when I was learning science, I learned that your write-up had to include your data, your procedures, and whatever else an interested reader would need to know in order to replicate your experiment. If a reader had to be psychic in order to reconstruct the experiment, I got marked down on my write-up.
The CRU and IPCC researchers must not have gone to Cal State Univ. Los Angeles.