Research is taught, conducted, and described professionally as a suffocatingly serious endeavor. This is appropriate, because people and societal systems act on scientific knowledge. It therefore makes sense for science to be a no-nonsense business, committed to getting the facts right and nothing else. When you think about science, levity is not the go-to gear.
This does not mean, however, that science can’t be funny, as in amusing or weird or unexpected. Or even sometimes shockingly obvious and mundane. Skinner (1956) alluded to the last point in his classic treatise, “A case history in scientific method,” in which he shared pearls of wisdom like, “apparatuses sometimes break down” and “the major result of this experiment is that some of my rats had babies.” Presumably Skinner had a small chuckle when penning those truths.
Sex Rays, Impertinent Children, and the Happy Hour Curse
While there is little in the public face of science that’s funny, behind the scenes, well, stuff happens. One of my vivid graduate school memories is of entering the lab to find a classmate’s leg dangling from the ceiling. She’d clambered up while chasing a pigeon that had escaped through an open drop-ceiling panel, and gotten stuck. Another instance — I grant that not everyone will find this one funny — is the possibly apocryphal story told to me about a graduate student who, on the way from the vivarium to the lab, got into a heated theoretical argument while grasping his pigeon in one hand. As the story goes, he got so riled up during the argument that he unknowingly crushed the poor bird to death.
I had a moment of my own while recruiting participants for a dissertation study. Screening was elaborate and involved about two hours’ worth of questionnaires and performance tests, such as various subscales from intelligence assessments. The purpose was to verify that each participant was “normal,” that is, within the typical range of cognitive functioning. One summer afternoon a prospective participant passed all of the screening tests with flying colors, and I was just about to sign him up when he said, “Tell me, do you think this experiment will help me to better understand myself?” I began to explain that the point of the research was to explore learning processes that all people have in common, rather than to “diagnose” what’s different about individuals, when he proceeded to demonstrate that he in fact required a diagnosis. He told me that he was the world’s smartest person, and that by age 17 he was “on the forefront of theoretical physics.” That is, until “they” became aware of him. Who? The aliens, of course. They found his intellectual abilities threatening (to their plans for world conquest maybe?) and so now, whenever he began to have a really smart thought, they would bombard him with “sex rays” and force him to masturbate, after which the brilliant insight would be forgotten. He hoped that being in the experiment would help him learn how to defeat the sex rays (and, I assume, save the planet). Needless to say, no informed consent agreement was signed that day.
Some of the fun of research occasionally slips into published articles. In the 1980s there was a lot of interest in whether people need to be aware of reinforcement contingencies to produce orderly schedule-controlled behavior. This was often explored by asking participants, after an experiment concluded, to describe the contingencies, with the prediction being that performance would conform to descriptions. In one such study (sorry, I forget which one), a child was asked what was required to make reinforcers happen. As reported in the article, he showed his precocious understanding of the contingencies by replying, “Pop goes the weasel!”
Among the potentially funny things about research is how dense we researchers can be. In the early days of human operant research, Auburn University’s Bill Buskist conducted a study in which the reinforcers were small cups of nuts and raisins. Bill was convinced that edible reinforcers would yield especially sensitive performances, and after considerable data collection that’s exactly what the data showed: immaculately schedule-controlled behavior. Imagine Bill’s surprise, therefore, when outside the lab window he noticed a sizable pile of nuts and raisins that uninterested participants had tossed out rather than consume. Whatever was controlling behavior in the experimental sessions, it was not food reinforcers.
Dan Fienup and I got to wear the scientist’s dunce cap when conducting a stimulus equivalence experiment with college student participants in borrowed lab space that was available to us only on Monday mornings and Friday afternoons. We were quite encouraged when Monday participants began producing results consistent with our expectations: they nailed the predicted emergent (untaught) stimulus relations. Then we examined the Friday data — disaster! Our Friday subjects performed at little better than chance on the same emergent relations test. Perplexed by this failure of experimental control, we shared our data with a colleague who might have launched into a fascinating intellectual discussion about fluctuating motivating operations. Instead he responded, “I can see why you’re confused. Who could have ever imagined that college students would be useless on a Friday afternoon?”
These last examples carry a take-home message. “Pop goes the weasel!” is a reminder that, although verbal reports can sometimes be informative, they aren’t universally veridical (e.g., Perone, 1988). Buskist’s study and the one Fienup and I conducted remind us that consequences work, but not necessarily the way you think they should. Now, it might seem that capable scientists wouldn’t need such reminders, but research is complicated, and the harder you concentrate on certain things, the more likely you are to wear blinders for others. No investigator is omniscient, and when our attention drifts from what’s important, research studies have a way of jolting us back into focus.
Share Your Fun(ny) Example!
“Research is Funny” is an occasional series of short posts devoted to sharing moments when research went off the rails in unexpected but informative ways. These cases might be drawn from published studies that are little known. Or they can be unearthed from your file drawer, where justifiably unpublishable studies go to die. All that matters is that whatever went not-entirely-according-to-expectations was in some way instructive, perhaps revealing a previously unconsidered tidbit of wisdom, or perhaps just illuminating one that should have been obvious all along. These cases can be thought of as the less-dignified cousins of more familiar counterintuitive findings, such as electric shock functioning as a positive reinforcer or the supposed “failures” of operant conditioning in the famous Breland and Breland (1961) report, “The misbehavior of organisms.” Dignified or not, they are worth sharing.
If you have an idea for a “Research is Funny” post, contact me (tscritc@ilstu.edu) and we’ll discuss how you can put that together.