Wednesday, May 7, 2008

Omission - A Cry for Action


Is withholding the truth as bad as lying?

In my first post, I presented two classic social biases: the outgroup homogeneity bias and the trait ascription bias.

If humans can gain much by becoming aware of their cognitive mechanisms, then certain behavioral patterns should be brought to awareness at all costs so that we avoid repeating the mistakes of the past. Errare humanum est, sed perseverare diabolicum: to err is human, but to persist in error is diabolical.

We have all heard about the sometimes irrational tendency to protect the status quo; I would like to cast light on a variant: the omission bias.

"We tend to judge harmful actions as worse than harmful inactions (omissions)."

Many people believe that withholding the truth is better than lying, even though the outcome is the same.

Here is an experiment conducted by Spranca, Minsk and Baron:

A group of subjects reads the following story:
"John, a tennis player, is about to play against a tough opponent in a decisive tournament match. He knows that the opponent is lactose intolerant."
The subjects are then presented with two conditions:
  1. John recommends food he knows contains dairy.
  2. The opponent orders the food himself, and John, despite knowing about the danger, does not warn him.
As the omission bias predicts, a majority of people found recommending the food more immoral than not informing the opponent.

Most of us have disdain for those who collaborate with repressive police states or accuse innocents. Are we really much better when we witness such things and do not react? If we know something terrible is going to happen and let it happen, we are criminals.

You might not like the direction I'm heading in with this post. Although this may be outside the scope of the omission bias, mere knowledge makes us all responsible. We know about catastrophes to come and injustices happening today, and we let everything happen. I myself have so far lacked the intelligence, maturity, courage and perspective to shake myself into action. I will fall asleep soundly tonight, and that should provoke disgust. Guilt, whining and transferring blame are worthless; we are all responsible, and if we want things to change we must act. The simple principle "do not that to another, which thou wouldst not have done to thyself" is a great place to start.

2 comments:

Tim said...

Agreed. However, I must say that there is a fundamental difference between your example and most of our everyday situations.

In your example everything is clear: a lack of action will harm someone. In our everyday lives, however, we do not have a clear stimulus telling us that something will go wrong. Normally, when one does not act it is because one is missing information, and acting on incomplete information can end up making the situation worse.

An example would be a motorcycle accident. Imagine a biker falls and passes out on the side of the road. Not calling an ambulance would be similar to the situation illustrated in the post; giving first aid and moving the biker to safety, however, is an entirely different matter. While proper first aid would probably help the biker survive until the ambulance arrives, inadequate treatment such as removing the helmet or moving the biker could kill him (a broken neck, for instance).

Once again we find ourselves with a thin line between right and wrong. Here again, everything falls back to knowledge, or rather the lack thereof.

This example is extreme, but the reasoning can be transferred to most issues in this world (including ecology). Unless one is truly aware of all the facts, one cannot truly be right or wrong.

One could argue, however, that not pursuing knowledge is wrong - but we would be getting into a new subject.

I'll just say that, if you have an adequate understanding of the situation and don't act, you're in the wrong. [and the term "adequate" can be debated too... argh]

Substance said...

"Unless one is truly aware of all the facts, one can not truly be just or wrong "

Yes!

The omission bias does imply clear knowledge of the consequences of inaction. John has to know that his opponent will be seriously handicapped after eating the dairy-containing dish.

I agree: the textbook omission bias can only appear in simple situations where knowledge can safely predict the consequences of both action and inaction.

Yes, reality is complex: a dynamical system with so many interacting parameters that predictions are at best "a good guess". In many situations, inaction cannot be reproached.

"One could argue, however, that not pursuing knowledge is wrong"

To cover a small part of that issue: it is, IMO, far too easy and cowardly to avoid knowledge out of what could be called a "fear of knowing".

For example, a person who believes in Intelligent Design might refuse to learn facts that would put the Theory of Evolution beyond question. In this case, the knowledge could shatter the person's world view, sense of identity, idea of God, and rapport with community and family.

When there is complexity, it is crucial to try to learn which parameters are significant and which are not. A decent evaluation of reality in its complexity enables us to interact with our world and to see how people and events fit into the big picture.

"I'll just say that, if you have adequate understand of the situation and don't act, you're in the wrong."

When is the understanding adequate enough to act?

In many situations, it never will be. However, by cultivating our experience, our sense of ethics, and our understanding of science, economics, psychology and so on, we can feel an impetus to act simply because our sense of identity and purpose has grown strong enough to give us a strong, measured and seasoned intuition that inaction will bring about negative consequences.