
Caveat lector: This blog is where I try out new ideas. I will often be wrong, but that's the point.



1.8.12

The deception ratchet

The recent admission by Jonah Lehrer that he fabricated quotes in his latest book has caused a lot of schadenfreude, bloviating about the "state of journalism", and so on. People are writing a lot about what this "means".


I've been critical of Jonah Lehrer in the past because of his seemingly blind exaltation of neuroscientific findings, but I noted then that this is a symptom of the state of cognitive neuroscientific research in general. After all these years, I believe I've finally identified the root of my scientific frustration.

Lies of omission: leaving out critical information with the intent to deceive.

Again, this post isn't about Lehrer; it's about how Lehrer reflects the state of neuroscience. Specifically, it's about the pervasiveness of lies of omission and the problems they cause. While I will discuss this in the context of neuroscience and science reporting, I think the issue generalizes well beyond both.

In the post linked above, my friend and fellow neuroscientist Dan left a very insightful comment, mirroring a discussion I had on Twitter with Noah.

Dan said:
1. [Lehrer] wrote many blog posts, long-form articles, and books. In all cases, he had the luxury of space to add nuance to his articles (my point in an earlier comment).
2. He clearly isn't a scientist, but he's shown that he can read and understand a large swath of scientific literature. He read enough that he must have come across articles that contradicted his ideas.
3/4 is the big question. We know significant research exists that contradicts many of the stories he wanted to tell. When he came across those contradictory findings, did he convince himself that they weren't good science, or did he actively ignore them to cherry-pick the studies that fit his stories?

Dan's points 3/4 are basically about lies of omission. While Lehrer admitted to fabricating some quotes and then lying to cover up that fabrication, my deeper concern is how easy it is to commit a lie of omission and how hard such a lie is to uncover.

How can you possibly prove someone didn't know something when the accused can so easily feign ignorance?

Every time I hear about the insula being the "disgust" part of the brain, or oxytocin being the "love chemical", or dopamine being the "addiction chemical", I cringe. Normally I would chalk this up to my curmudgeonly annoyance at oversimplification, but, as Dan noted, anyone who spends even a minute searching the literature on PubMed will find plenty of evidence contradicting the tidy narratives often reported in media accounts.

For example, the insula is also associated with a whole host of other behavioral and cognitive functions (see Chang et al., 2012, cited below).

There are two issues wrapped up here: one journalistic and one scientific.

Journalists are, in theory, supposed to fact-check, but how can they check facts about scientific minutiae in which they are not trained? Who is culpable? And is a scientifically savvy journalist who follows the press release about a new finding at fault for not addressing the complexities, when the allure of a neat behavioral narrative is so strong?

Now the second problem comes down to evidence fishing. I'm sure most neuroscientists have encountered an anomaly while analyzing their data: maybe an unexpected brain area showed significant activation during the task, or maybe the treatment group had an unusual response. How do you handle these discrepancies?

The wrong way is to go to PubMed, search for the name of the brain area along with the name of your task or drug or whatever, find a few abstracts that mention the two together, cite those, pretend it's common knowledge that the insula "processes disgust", and leave it at that.
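To see how trivially such "supporting" citations can be manufactured, here is a minimal sketch of that search written in Python against NCBI's public E-utilities via Biopython. The library choice, the helper name count_hits, and the example terms are my illustration, not anything from the original post; any HTTP client hitting the E-utilities API would show the same thing.

```python
# Sketch of the "wrong way" described above: co-occurrence in abstracts
# treated as if it were established knowledge. Nearly ANY plausible
# region/function pairing returns hits.
from Bio import Entrez  # Biopython; an assumption, not the post's tooling

Entrez.email = "you@example.com"  # NCBI asks for a contact email

def count_hits(region, function):
    """Return the number of PubMed records mentioning both terms."""
    query = f'"{region}" AND "{function}"'
    handle = Entrez.esearch(db="pubmed", term=query, retmax=0)
    record = Entrez.read(handle)
    handle.close()
    return int(record["Count"])

# The same region "supports" many different tidy narratives:
for function in ["disgust", "empathy", "pain", "decision making", "music"]:
    print(f"insula AND {function}: {count_hits('insula', function)} hits")
```

Each of those queries will return abstracts you could cite. That is precisely the problem: the search proves co-occurrence in the literature, not the tidy functional story the citation is drafted to support.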

But I promise you this happens. A lot. I've seen it happen.

In both cases, journalist and scientist alike have committed sins of omission. The savvy journalist knows the story is too simple, but man, the simple version makes for a much better story. The scientist just needs that one little citation to show that their annoying finding is already known and therefore not worth discussing (because it doesn't fit their paper's narrative).

In a fantastic review of self-deception in unethical behavior, Tenbrunsel and Messick "...identify four enablers of self-deception, including language euphemisms, the slippery slope of decision-making, errors in perceptual causation, and constraints induced by representations of the self."

Honestly I can't recommend this paper highly enough; it's easy to read and fascinating.

The part that is most relevant to the discussion at hand, however, is on the ethical "slippery slope" that I'm calling "deception ratcheting":
The second component of the slippery slope problem is what we call the “induction” mechanism. Induction in mathematics is as follows. If a statement is true for N = 1, and if the statement for N + 1 is true assuming the truth of N, then the statement is true for all N. The way this works in organizations is similar. If what we were doing in the past is OK and our current practice is almost identical, then it too must be OK. This mechanism uses the past practices of an organization as a benchmark for evaluating new practices. If the past practices were ethical and acceptable, then practices that are similar and not too different are also acceptable. If each step away from ethical and acceptable practices is sufficiently small, small enough not to appear qualitatively different, then a series of these small steps can lead to a journey of unethical and illegal activities.
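For readers who want the analogy spelled out, the induction schema being borrowed is, formally,

$$
\big[\, P(1) \;\wedge\; \forall n\,\big(P(n) \rightarrow P(n+1)\big) \,\big] \;\Longrightarrow\; \forall n\; P(n)
$$

where, in the organizational reading the authors give, $P(n)$ is "the practice $n$ small steps removed from our baseline is acceptable." The base case is a genuinely ethical practice; the inductive step is the judgment that anything almost identical to an acceptable practice is itself acceptable. Run the induction and every practice, however far from the baseline, gets certified, which is exactly the ratchet.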
This happens in journalism just as it happens in science. If, as a scientist (or journalist), you witness some lazy brushing-away of a surprising finding, then you, too, will be more likely to do the same. And when your future students (or assistants) see you doing it, they, in turn, become more comfortable with deceptions and lies of omission.

Tenbrunsel and Messick argue that such systematic, generational drift is driven by large-scale errors in how we perceive the causes of deception:
Acts of omission are a third reason as to why perceptions of causes are in error. Whereas lies of commission are direct misstatements of the truth, lies of omission are acts of deception that occur because someone withholds information that deceives the target. A failure to take an action, i.e., disclose the truth, makes it more difficult to ascertain responsibility, not only for others but also for oneself. Consider the situation in which you are selling a car. Is it your responsibility to inform the buyer that the car has had several unexplained malfunctions or is it the buyer’s responsibility to ask? Phrases or euphemisms such as “buyer beware” reveal the answer: moral responsibility shifts from the agent to the target under situations characterized by acts of omissions. Ritov and Baron’s (1990) work provides empirical support for this intuitive notion, demonstrating that acts of omission are viewed as more acceptable than acts of commission. Acts of omission, then, because they blur the assignment of responsibility, can create self-biased perceptions of causes, shifting blame from self to others. In such circumstances, it is highly likely that individuals’ propensity to engage in unethical behavior increases, because shifting responsibility to others allows one to divorce oneself from the moral implications of their actions.
Errors in perceptual causation allow us to distance ourselves from the ethical issues at hand. We erroneously believe that we cannot fix the problem, because it is a people and not a system issue. We also falsely believe that it is someone else’s problem, either because they are to blame or because the responsibility is someone else’s, not ours. While different from hiding the ethical implications of our own actions, this form of self-deception removes the ethical decision from our backyard, thus watering down the ethical demands of the situation.
The whole style of writing popular amongst the "Big Idea" crowd pushes for errors of omission in favor of a tight story. It is such a minor sin, and one for which you almost certainly cannot be caught, that the allure of committing it probably overwhelms any inner voice of caution. But once you take that first deceptive step, you are statistically more likely to baby-step your way farther and farther in service of that tight story.

Until in the end you're plagiarizing your own copy, making up quotes, and lying to fellow journalists.

By this point you should be asking yourself, "How well does this Tenbrunsel and Messick paper represent our state of knowledge on this topic?"

So caveat lector. Listen to the notes that are being played, but listen more carefully to the notes that are not being played.

And for you writers and scientists out there, beware the easy allure of the deception ratchet.

Tenbrunsel, A. E., & Messick, D. M. (2004). Ethical Fading: The Role of Self-Deception in Unethical Behavior. Social Justice Research, 17(2), 223-236. DOI: 10.1023/B:SORE.0000027411.35832.53
Chang, L. J., Yarkoni, T., Khaw, M. W., & Sanfey, A. G. (2012). Decoding the Role of the Insula in Human Cognition: Functional Parcellation and Large-Scale Reverse Inference. Cerebral Cortex. PMID: 22437053