The Way Things Are, the Way Things Were, and What Is True

I think a lot about assertions, things that people assert as true, very often without acknowledging their personal bias. To be fair, most of us are so immersed in our ideologies that we’re not aware of how they are compelling us toward bias.

The title of this post refers to some of the kinds of assertions I hear, by which someone states something as a fact:

  • The way things are — some assertion about fact, whether it has to do with science, economics, politics, or some other sphere. One of my favorite manifestations is when someone begins an utterance with the stark word “Fact,” followed by a colon to emphasize the factiness of what follows, then followed by an unquestioned assertion.
  • The way things were — some statement about history or the past. For example, such and such Egyptian dynasty ruled in such and such time period, or some assertion about why humans came down from the trees to live on the savanna.
  • What is true — This is really akin to the other two kinds of assertions I’m pointing to, but maybe in this case I’m thinking about an assertion that goes beyond a mere statement of some fact. Some examples might be that God exists or that he doesn’t, or that evolution is an incontrovertible fact.

An assertion might be well supported, but what I’m trying to spotlight here is the common practice of making an assertion without acknowledging the background and context surrounding the assertion and the person making it. One result is that people get into fierce arguments even though they aren’t really arguing about the same thing.

Here are some of the kinds of influences that one might make clear to provide context to an assertion:

  • The lines of evidence behind the assertion — Is the assertion based on scientific or scholarly research? Sometimes a speaker will make an assertion, basing his or her statement on the consensus within a profession or academic field. (Academic or scientific consensus doesn’t always mean the same thing as the everyday understanding of what constitutes a consensus.) One of the problems here is that there may actually be a minority that disputes the consensus view. There might be a legitimate critique that isn’t getting acknowledged when the speaker makes the assertion.
  • Assumptions — Many assertions are based in part on ideas or constructs that are taken for granted. As with lines of evidence, there might be a legitimate minority critique of a given assumption. One example would be dating a past event based on the conventional chronologies hypothesized by historians and archaeologists.
  • Definition of terms — Often people get into arguments without establishing and agreeing on the meaning of the point they are discussing. For example, people argue about whether evolution is true without coming to a prior understanding of what they mean by evolution.
  • The ideological leanings of the speaker — For someone who wants to evaluate an assertion, it could be useful to know something about the speaker’s ideological convictions. Is the speaker a theist? An atheist? A free-market fundamentalist? An eco-socialist? One problem here is that many people don’t like to admit that they subscribe to an ideology or aren’t even aware of it.
  • The speaker’s authority for making the assertion — When evaluating an assertion, it can be useful to know the speaker’s credentials.
  • The speaker’s underlying agenda — As with ideology, many speakers don’t like to own up to their agendas, which are often political or ideological in nature.

As is often the case with this writing project, my purpose here is to set out some basic ideas with the intention of coming back later to revise and add ideas and examples.

ARB — 3 Oct. 2013


What the Climate Change Controversy Is Really About

I’ve discussed this question before over on ThomasNet Green & Clean — see my piece “The Climate Change Controversy — What’s It Really About?”

However, I’m meditating on a somewhat different way to articulate it. I should say that I don’t think this controversy is essentially about science. I’m not persuaded by ill-informed or politically motivated assertions, but I don’t use terms like “hoax,” “anti-science,” “pseudo-science,” or “denialism” in connection with the argument.

My current thinking is the following:

The climate change controversy is about a high-stakes struggle between science in the service of eco-socialism and misinformation in the service of free-market fundamentalism.

I’m engaged in an ongoing development of my thinking on this topic and will no doubt circle back to it. But I just wanted to pin down that idea.

ARB — 2 Nov. 2012

Religion and Political Correctness

I’ve noticed that some people think being religious requires you to hold certain political views. I’ve been meaning to put together a post collecting some of those ideas. I plan to add to this post as I encounter new and interesting expressions.


Gary Cass, head of the Christian Anti-Defamation Commission and a former member of the executive committee of San Diego’s Republican party, says you can’t be a Christian if you don’t own a gun:

You have not just a right to bear arms, you have a duty. How can you protect yourself, your family or your neighbor if you don’t have a gun? If I’m supposed to love my neighbor and I can’t protect him, what good am I?

via The Raw Story


Bryan Fischer of the American Family Association says it will hurt God’s feelings if we stop using fossil fuels:

“And you think, that’s kind of how we’re treating God when he’s given us these gifts of abundant and inexpensive and effective fuel sources,” Fischer added. “And we don’t thank him for it and we don’t use it.”

via The Raw Story


Radio show host Rush Limbaugh says if you believe in God you can’t believe in human-caused climate change:

See, in my humble opinion, folks, if you believe in God then intellectually you cannot believe in manmade global warming.

via Grist


How the Internet Reinforces Confirmation Bias

Recently I wrote about confirmation bias in connection with the climate change controversy — see my article at ThomasNet, “All This Wrangling Over Climate Change – What’s Up With That?” The Skeptic’s Dictionary refers to confirmation bias as “a type of selective thinking whereby one tends to notice and to look for what confirms one’s beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one’s beliefs.”

Today I ran across a thought-provoking TED Talk (TED hosts and posts video talks on innovative topics) by political activist Eli Pariser, who has some interesting things to say about how the algorithms used on websites such as Facebook and Google tend to reinforce our current thinking and filter out new ideas — see his talk, “Beware Online ‘Filter Bubbles’” — well worth watching, and only nine minutes long.

Pariser explains what he means by a filter bubble:

Your filter bubble is kind of your own personal, unique universe of information that you live in online … the thing is, you don’t decide what gets in, and more importantly, you don’t actually see what gets edited out.

If you and I both search for the same thing at the same time on Google, for example, we get different results. The danger of the filter bubble, says Pariser, is that

this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see.

He suggests that a personalization algorithm deciding what to show us needs to look not just at what it thinks is “relevant,” but at other factors too, such as those he lists in a slide from his presentation.
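To make Pariser’s suggestion concrete, here is a minimal sketch of what blending “relevance” with another signal might look like. Everything here — the item fields, the `novelty` signal, and the weights — is invented for illustration; real recommendation systems are far more elaborate.

```python
# Hypothetical sketch: rank items not only by predicted relevance
# (what the algorithm thinks we want) but also by a second signal
# such as novelty (what we haven't seen). All values are made up.

items = [
    {"title": "Story A", "relevance": 0.9, "novelty": 0.1},
    {"title": "Story B", "relevance": 0.6, "novelty": 0.8},
    {"title": "Story C", "relevance": 0.4, "novelty": 0.9},
]

def score(item, w_relevance=0.6, w_novelty=0.4):
    # Blend the two signals; a pure filter bubble is w_novelty = 0.
    return w_relevance * item["relevance"] + w_novelty * item["novelty"]

ranking = sorted(items, key=score, reverse=True)
print([item["title"] for item in ranking])  # -> ['Story B', 'Story C', 'Story A']
```

With novelty weighted at zero, Story A (most “relevant”) would win; giving novelty even modest weight surfaces the unfamiliar stories instead, which is exactly the trade-off Pariser is asking the platforms to make.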

This seems like a great insight. Anyway, I highly recommend this short video to get you thinking outside the box.

AB — 24 August 2011

Undo: One of the Greatest Innovations in Computing

The Undo function — a life-saver.

From “Behavioral issues in the use of interactive systems,” Lance A. Miller and John C. Thomas, International Journal of Man-Machine Studies, Sept. 1977:

A more complex situation, however, occurs … when a user wishes to “undo” the effects of some number of prior commands — as, for example, when a user inadvertently deletes all personal files. Recovery from such situations is handled by most systems by providing “back-up” copies of (all) users’ files, from which a user can get restored the personal files as they were some days previous. While this is perhaps acceptable for catastrophic errors, it would be quite useful to permit users to “take back” at least the immediately preceding command (by issuing some special “undo” command).
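The mechanism Miller and Thomas are asking for can be sketched in a few lines: keep a history of inverse operations so the most recent command can be taken back. This is a toy illustration of the general idea, not anything from the 1977 paper; the `Editor` class and its methods are my own invention.

```python
# Minimal sketch of an "undo" facility: each command records its
# inverse on a stack, and undo() replays the most recent inverse.

class Editor:
    def __init__(self):
        self.text = ""
        self._undo_stack = []  # inverse operations, most recent last

    def insert(self, s):
        self.text += s
        n = len(s)
        # The inverse of appending s is trimming the last n characters.
        self._undo_stack.append(lambda: self._trim(n))

    def _trim(self, n):
        self.text = self.text[:-n]

    def undo(self):
        if self._undo_stack:
            self._undo_stack.pop()()  # take back the most recent command

editor = Editor()
editor.insert("hello ")
editor.insert("world")
editor.undo()          # takes back the second insert
print(editor.text)     # -> "hello "
```

Keeping the whole stack, rather than just the last inverse, is what lets modern applications go beyond the single-command undo the paper proposes.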

Now if they would only invent an Undo button for one’s personal life.

AB — 15 April 2011

The Cowboy and the Frozen River

by Al Bredenberg

There we stood on the riverbank, me and the cow, Grateful. Confused, I looked all around, searching for the spot where the thirsty cow was supposed to drink, but I couldn’t find it. The problem: the river was covered by a thick layer of ice!

Suddenly the cow grew nervous and started toward the river. “No, Grateful!” I told her, but she paid no attention and stepped right out onto the frozen river! With a rising sense of panic, I stepped onto the ice too, trying to steer her away from the dangerous situation. “Ay-ay-ay!” I thought. “What happens if the ice breaks?”

I tried to push the cow back toward the bank, but a cow is a very large animal; she wouldn’t turn, and what’s more, she began walking farther out onto the ice.

“Grateful, stop!” I shouted, and at that moment the cow broke into a run! In total panic, I started chasing after her, yelling.

But how did I end up in that situation, running after a big cow on a frozen river in the middle of winter? I’ll explain, and I’ll also recall a lesson I learned about the capabilities of animals.

At the age of 24, I moved to the state of Vermont, in the United States, to live with some friends on their farm in the countryside beside a beautiful river. During the day I helped them in their cabinetmaking business. Early in the morning and at night we took care of the chores, including feeding and milking the cow, Grateful, who provided us with plenty of rich milk.

Grateful was an older cow my friends had acquired from a local dairy farmer. She got her name “Grateful” because she had only three teats and therefore gave less milk. But despite that, she hadn’t been slaughtered for meat!

Unfortunately, the water in the house stopped running during the winter because of the intense cold. So we decided to water the cow at the river, about a hundred meters from the barn. We left an axe on the bank so we could chop a hole in the ice each night to expose the water for the thirsty cow.

For a few weeks, though, I had taken on the assignment of tending the cow in the mornings, while someone else watered her at night.

On the night of the incident I mentioned above, all my companions were working late, so it fell to me to take care of Grateful.

But when we got to the river, I discovered that my companions had moved the watering spot. I searched all around, but I couldn’t see the hole in the ice or the axe for cutting it.

And now this crazy cow was walking out on the ice, determined to keep going, paying no attention to the human chasing her and shouting!

Well, in great fear, I kept running after Grateful for a minute or two, powerless to do anything, hoping the thick ice could hold up a heavy cow.

But surprisingly, after a few minutes, Grateful began to slow down, and then she returned to the bank quite calmly and waited for me. Apparently we weren’t in the middle of a stampede after all!

Panting, calming down, I looked around, and there, a little farther along the riverbank, I saw several piles of cow manure, a hole in the river, and the axe leaning against a tree.

At that moment I understood that over the past weeks my companions, without telling me, had changed the watering spot. It was farther along the bank. And I came to understand that the cow had remembered the right spot perfectly and had simply been trying to get there in spite of the foolishness of her human helper.

I grabbed the axe and started cutting through the thin layer of ice over the hole so the poor thirsty cow could drink.

But at the same time, a realization and a lesson came to me: many times, animals are smarter than we think!

AB — 18 January 2011

Why People Don’t Believe Scientists Even When There Is ‘Consensus’

An interesting study, soon to appear in the Journal of Risk Research, by Yale law professor Dan M. Kahan and colleagues, suggests that people tend to disbelieve scientists whose cultural values are different than theirs.

I’m not able to determine when this study will be published, but you can find an abstract at this link, and I was able to download a preliminary version of the whole article in PDF by clicking on the link on that page that says “One-Click Download.”

These conclusions shed light on the debate over anthropogenic climate change, Kahan tells Science Daily (“Why ‘Scientific Consensus’ Fails to Persuade“):

We know from previous research that people with individualistic values, who have a strong attachment to commerce and industry, tend to be skeptical of claimed environmental risks, while people with egalitarian values, who resent economic inequality, tend to believe that commerce and industry harms the environment.

(For more on the climate-change controversy, see my previous entry, “John Abraham’s Point-by-Point Rebuttal of Climate-Skeptic Monckton.”)

Kahan and colleagues based their study on the theory of the cultural cognition of risk. In his paper, Kahan says this theory “posits a collection of psychological mechanisms that dispose individuals selectively to credit or dismiss evidence of risk in patterns that fit values they share with others.”

The researchers surveyed a representative sample of 1,500 U.S. adults. They divided this sample into groups with opposite cultural worldviews, some favoring hierarchy and individualism, the others favoring egalitarianism and communitarianism.

They surveyed the respondents to determine their beliefs about what in fact is the scientific consensus on issues of climate change, disposal of nuclear waste, and concealed handguns, with particular focus on the associated levels of risk in each of those areas. Various statements were attributed to fictional personas portrayed as authors of books on these various issues. Respondents were asked to judge whether each author is really an expert or not.

Analysis of the results on the climate-change issue revealed that:

Disagreement was sharp among individuals identified (through median splits along both dimensions of cultural worldview) as “hierarchical individualists,” on the one hand, and “egalitarian communitarians,” on the other. Solid majorities of egalitarian communitarians perceived that most expert scientists agree that global warming is occurring (78%) and that it has an anthropogenic source (68%). In contrast, 56% of hierarchical individualists believe that scientists are divided, and another 25% (as opposed to 2% for egalitarian communitarians) that most expert scientists disagree that global temperatures are increasing. Likewise, a majority of hierarchical individualists, 55%, believed that most expert scientists are divided on whether humans are causing global warming, with another 32% perceiving that most expert scientists disagree with this conclusion.

The study revealed similar results around the issues of geologic isolation of nuclear wastes and concealed-carry laws.

So should we conclude that people are going to believe what they want to believe, and that’s all there is to it? The authors make an interesting statement about the implications for public presentation of scientific findings:

It is not enough to assure that scientifically sound information — including evidence of what scientists themselves believe — is widely disseminated: cultural cognition strongly motivates individuals — of all worldviews — to recognize such information as sound in a selective pattern that reinforces their cultural predispositions. To overcome this effect, communicators must attend to the cultural meaning as well as the scientific content of information.

The report suggests some ways that cultural meaning might be considered in communicating with the public. One such strategy is what the authors call narrative framing:

Individuals tend to assimilate information by fitting it to pre-existing narrative templates or schemes that invest the information with meaning. The elements of these narrative templates — the identity of the stock heroes and villains, the nature of their dramatic struggles, and the moral stakes of their engagement with one another — vary in identifiable and recurring ways across cultural groups. By crafting messages to evoke narrative templates that are culturally congenial to target audiences, risk communicators can help to assure that the content of the information they are imparting receives considered attention across diverse cultural groups.

AB — 24 Sept. 2010