
Caveat lector: This blog is where I try out new ideas. I will often be wrong, but that's the point.


1.8.12

The deception ratchet

The recent admission by Jonah Lehrer that he fabricated quotes in his latest book has caused a lot of schadenfreude, bloviating about the "state of journalism", and so on. People are writing a lot about what this "means".


I've been critical of Jonah Lehrer in the past because of his seemingly blind exaltation of neuroscientific findings, but I noted that this is a symptom of the state of cognitive neuroscience research in general. After all these years, I believe I've finally identified the root source of my scientific frustration.

Lies of omission: leaving out critical information with the intent to deceive.

Again, this post isn't about Lehrer; it's about how Lehrer reflects the state of neuroscience. Specifically, this post is about the pervasiveness of, and the problems associated with, lies of omission. While I will discuss this in a neuroscientific and science reporting context, I think it is a more generalizable issue.

In the post linked above, my friend and fellow neuroscientist Dan left a very insightful comment, mirroring a discussion I had on Twitter with Noah.

Dan said:
1. [Lehrer] wrote many blog posts, long-form articles, and books. In all cases, he had the luxury of space to add nuance to his articles (my point in an earlier comment).
2. He clearly isn't a scientist, but he's shown that he can read and understand a large swath of the scientific literature. He read enough that he must have come across articles that contradicted his ideas.
3/4 is the big question. We know significant research exists that contradicts many of the stories he wanted to tell. When he came across those contradictory findings, did he convince himself that they weren't good science, or did he actively ignore them to cherry-pick the studies that fit his stories?
Dan's points 3/4 are basically talking about lies of omission. While Lehrer admitted to fabricating some quotes and then lying to cover up that fabrication, my deeper concern is how easy it is to commit a lie of omission and how hard such a lie is to uncover.

How can you possibly prove someone didn't know something when the accused can so easily feign ignorance?

Every time I hear about the insula being the "disgust" part of the brain, or oxytocin being the "love chemical", or dopamine being the "addiction chemical", I cringe. Normally I would chalk this up to my curmudgeonly annoyance at oversimplification, but as Dan noted, anyone who spends even a minute searching the literature on PubMed will find plenty of evidence contradicting the tidy narratives so often reported in media accounts.

For example, the insula is also associated with a whole barrage of behavioral and cognitive functions.
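To make that "one minute on PubMed" claim concrete, here is a minimal sketch written against NCBI's public E-utilities esearch endpoint. The endpoint and its parameters are the real API; the particular terms I pair with "insula" are just my illustrative picks.

    # Count how often the insula co-occurs in PubMed with functions
    # other than disgust. Python standard library only.
    import json
    import urllib.parse
    import urllib.request

    EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

    def pubmed_count(term):
        """Return the number of PubMed records matching a query string."""
        query = urllib.parse.urlencode({"db": "pubmed", "term": term, "retmode": "json"})
        with urllib.request.urlopen(f"{EUTILS}?{query}") as response:
            return int(json.load(response)["esearchresult"]["count"])

    # "Disgust" is only one of many functions tied to the insula.
    for function in ["disgust", "pain", "interoception", "attention", "language"]:
        print(f"insula AND {function}: {pubmed_count(f'insula AND {function}')} papers")

The exact counts don't matter; the point is that the co-occurrence counts for terms other than "disgust" come back nowhere near zero.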

There are two issues wrapped up here: one journalistic and one scientific.

Journalists are, in theory, supposed to fact-check, but how can they check facts on scientific minutiae in which they are not trained? Who is culpable? And are scientifically savvy journalists who follow the press release about a new scientific finding at fault if they don't address the complexities (when the allure of a neat behavioral narrative is so strong)?

Now the second problem comes down to evidence fishing. I'm sure most neuroscientists have encountered an anomaly when analyzing their data. Maybe an unexpected brain area showed significant activation during your task. Maybe your treatment group had an unusual response. How do you handle these discrepancies?

The wrong way is to go to PubMed, search for the name of the brain area along with the name of your task or drug or whatever, find some abstracts that mention the two together, cite those, pretend it's common knowledge that the insula "processes disgust", and leave it at that.

But I promise you this happens. A lot. I've seen it happen.

In both cases, journalist and scientist alike have committed sins of omission. The savvy journalist knows that their story is too simple, but man, it makes for such a better story. The scientist just needs to add that little citation to show that their annoying finding is already known, and therefore not worth discussing (because it doesn't fit their paper's narrative).

In a fantastic review of self-deception in unethical behavior, Tenbrunsel and Messick "...identify four enablers of self-deception, including language euphemisms, the slippery slope of decision-making, errors in perceptual causation, and constraints induced by representations of the self."

Honestly I can't recommend this paper highly enough; it's easy to read and fascinating.

The part that is most relevant to the discussion at hand, however, concerns the ethical "slippery slope", which I'm calling "deception ratcheting":
The second component of the slippery slope problem is what we call the “induction” mechanism. Induction in mathematics is as follows. If a statement is true for N = 1, and if the statement for N + 1 is true assuming the truth of N, then the statement is true for all N. The way this works in organizations is similar. If what we were doing in the past is OK and our current practice is almost identical, then it too must be OK. This mechanism uses the past practices of an organization as a benchmark for evaluating new practices. If the past practices were ethical and acceptable, then practices that are similar and not too different are also acceptable. If each step away from ethical and acceptable practices is sufficiently small, small enough not to appear qualitatively different, then a series of these small steps can lead to a journey of unethical and illegal activities.
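(As a gloss of my own, not the paper's: in formal notation, the induction principle they're borrowing is

    P(1) \land \forall n\,[P(n) \rightarrow P(n+1)] \;\Rightarrow\; \forall n\, P(n)

The ratchet substitutes "acceptable" for "true": if what we did before was acceptable, and today's practice differs only negligibly from it, then today's practice is judged acceptable too, for every today.)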
This happens in journalism just as it happens in science. If, as a scientist (or journalist), you witness some lazy brushing away of a surprising finding, then you, too, will be more likely to do the same. And when your future students (or assistants) see you doing it, they will become more comfortable with deceptions and lies of omission as well.

Tenbrunsel and Messick argue that such systematic, generational drift is caused by our large-scale errors in attributing causation to deception:
Acts of omission are a third reason as to why perceptions of causes are in error. Whereas lies of commission are direct misstatements of the truth, lies of omission are acts of deception that occur because someone withholds information that deceives the target. A failure to take an action, i.e., disclose the truth, makes it more difficult to ascertain responsibility, not only for others but also for oneself. Consider the situation in which you are selling a car. Is it your responsibility to inform the buyer that the car has had several unexplained malfunctions or is it the buyer’s responsibility to ask? Phrases or euphemisms such as “buyer beware” reveal the answer: moral responsibility shifts from the agent to the target under situations characterized by acts of omissions. Ritov and Baron’s (1990) work provides empirical support for this intuitive notion, demonstrating that acts of omission are viewed as more acceptable than acts of commission. Acts of omission, then, because they blur the assignment of responsibility, can create self-biased perceptions of causes, shifting blame from self to others. In such circumstances, it is highly likely that individuals’ propensity to engage in unethical behavior increases, because shifting responsibility to others allows one to divorce oneself from the moral implications of their actions.
Errors in perceptual causation allow us to distance ourselves from the ethical issues at hand. We erroneously believe that we cannot fix the problem, because it is a people and not a system issue. We also falsely believe that it is someone else’s problem, either because they are to blame or because the responsibility is someone else’s, not ours. While different from hiding the ethical implications of our own actions, this form of self-deception removes the ethical decision from our backyard, thus watering down the ethical demands of the situation.
The whole style of writing popular amongst the "Big Idea" crowd pushes for errors of omission in favor of a tight story. This is such a minor sin (one for which you almost certainly cannot be caught) that the allure of committing the lie probably overwhelms any inner voice of caution. But once you take that first deceptive step, you are statistically more likely to be willing to baby-step your way farther and farther in service of that tight story.

Until in the end you're plagiarizing your own copy, making up quotes, and lying to fellow journalists.
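For the computationally minded, here is a toy model of that ratchet. To be clear, this is my own sketch, not anything from Tenbrunsel and Messick; the step size and time horizon are arbitrary assumptions. The only structural claim is that each new act is benchmarked against the worst act already committed, rather than against the original honest baseline.

    # Toy model of the deception ratchet (illustrative only; the
    # numbers are arbitrary assumptions, not empirical values).
    import random

    random.seed(1)

    benchmark = 0.0  # "what we already do"; starts fully honest
    step = 0.05      # a deviation this small feels qualitatively the same

    for month in range(1, 61):
        # Each new act drifts at most `step` beyond whatever is now normal.
        act = benchmark + random.uniform(0.0, step)
        # The ratchet: the norm updates to the worst act and never resets.
        benchmark = max(benchmark, act)
        if month % 12 == 0:
            print(f"year {month // 12}: drift from honest baseline = {benchmark:.2f}")

No single act ever exceeds the small step, yet the drift from the baseline grows without bound; all of the work is done by the benchmark update that never lets the norm recover.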

By this point you should be asking yourself, "How well does this Tenbrunsel and Messick paper represent our state of knowledge on this topic?"

So caveat lector. Listen to the notes that are being played, but listen more carefully to the notes that are not being played.

And for you writers and scientists out there, beware the easy allure of the deception ratchet.

Tenbrunsel, A. E., & Messick, D. M. (2004). Ethical fading: The role of self-deception in unethical behavior. Social Justice Research, 17(2), 223-236. DOI: 10.1023/B:SORE.0000027411.35832.53
Chang, L. J., Yarkoni, T., Khaw, M. W., & Sanfey, A. G. (2012). Decoding the role of the insula in human cognition: Functional parcellation and large-scale reverse inference. Cerebral Cortex. PMID: 22437053

10 comments:

  1. Terrific article. This issue of omission is so pervasive, and I personally feel some learned helplessness about changing others' behavior. The "buyer beware / partner beware" dilemma is as destructive to our social fabric as, say, the fast pace at which industry is cutting down rainforests. Delayed feedback loops make it harder to stop!

    1. Thanks Trina! And yes, this is a socially very complex issue, unfortunately.

  2. Nice post. When writing up findings, I think we've all struggled with finding the right balance between telling a good story and telling the complete story. It's probably unrealistic and unproductive to always write a comprehensive review of the literature, but I agree with you that lies of omission are damaging (and probably pervasive). My approach is to try to be honest about what I am doing: if the overall argument I want to develop has flaws (and what argument doesn't?), I'll add a "Limitations" or "Open Questions" section and describe those flaws; if I'd like to speculate based on a few studies, then I'll make a "Speculations" section and not pass off those few studies as accepted facts. I'm pleased to say that reviewers have been generally receptive to this approach.

    1. Notably, I gave no solutions in this post. I think peer-reviewed journals requiring a "Limitations" or "Contrary Evidence" section would be a powerful step forward. I really like those ideas! You have a great approach.

      I'm trying to practice what I preach, too. I was recently asked to write a review on a topic in my field, and my plan is to break the paper into two sections: confirmatory and contrary evidence.

  3. Great post. Two thoughts.

    First, for those of us interested in science outreach, where is the line between nuanced simplification of the story for non-scientists to grasp and "lies of omission"? It seems the line is not as clear as you make it out to be. For example, if I'm, say, writing an outreach article on the motor system, I might make an oversimplification like "the basal ganglia act as a gate for actions and decisions, delegating what gets selected and what doesn't." Now there is a lot of controversy around that theory as well. Am I to always put an * by such statements?

    Second, there is a difference between "lies of omission" and "literature ignorance". Every one of us in neuroscience has framed arguments in our papers without knowing some segment of the vast literature that may be relevant or contradictory to our interpretations. Let's face it, the literature is exploding these days, even more so for those of us doing interdisciplinary research. While I do agree that if you know of a contradictory finding you must talk about it, I think a vast part of the omission problem in science papers comes from ignorance rather than from more malicious means.

    1. Thanks, Tim. I'm not sure where the line is, or even if there is a line. It's like pornography for the SCotUS: I know it when I see it.

      What's wrong with just adding a little asterisk though? Just a little line like, "look, this is really complicated, but here's one prevailing theory"? It emphasizes the truly dynamic nature of the scientific process.

      As for your second point, I agree! We *can't* know everything in the literature (that was kind of the point of brainSCANr in the first place). My point was there's no way to differentiate ignorance from intentional omission, which is what makes it so insidious.

    2. I see your point. I'm fine with a constant set of asterisks in my papers.

      Although, sometimes it depresses the crap out of me to think about how little we actually know about the brain, given the MASSIVE amount of effort we put into understanding it, and how easily that can get exploited.

    3. Roby Duncan:

      I know it is depressing sometimes, being confronted with the knowledge horizon of the field, but try and take heart from the fact that it is the only way to keep the field honest with itself.

    4. Like gravity, a law of nature is that "information is expensive." Plus, we are using the brain to study itself. In addition, natural language is becoming useless.

      Like the other real sciences, it's best to start moving to maths and calculus only. Natural language carries way too much ideological baggage. Higher-order concepts like personality, decision making, choice, emotions, etc. are a real waste of time and a distraction. We just saw an fMRI study of "empathy"; what a waste of time.

  4. I am a pro marketer and complete brain geek. Probably after we are all dead, brain science will be the basis for all knowledge; what's the alternative?

    I have two LinkedIn groups dedicated to the brain and marketing/business - NOT neuromarketing, which is a scam. I also proudly started the only "neuroscience" group on Meetup. But no one ever shows up for the meetings, predictably.

    Here is my experience:
    1. Brain stuff is complicated and hard to understand. You actually have to spend years studying it to understand the basic concepts. It is NOT pop science, any more than kidney physiology is amenable to pop sci.
    2. It challenges and debunks pretty much every belief and ideology. Free will? Out the window. Emotions mattering? Probably not. Decision making? Probably not, etc. Our brains do NOT like having cherished beliefs debunked.
    3. Anyone over 30 will never get it. There is a developmental brain reason for this.

    Brain research is really for the semi-pro and pro audience, and even then it's limited. I get into all sorts of dumb arguments with non-brain academics who refuse to accept the latest findings, of course. And I am an amateur, but I read a lot.

    BTW, I NEVER share what I learn with my social friends or family; it's a sure way to get blackballed!

    Bottom line: "popular science" is an oxymoron and an ideological scam. "Jetpacks for everyone!!" In fact, pop sci appears to cause more blowback than anything else. It's like pop medicine or pop engineering or pop jet piloting...
