I think a lot about assertions, things that people assert as true, very often without acknowledging their personal bias. To be fair, most of us are so immersed in our ideologies that we’re not aware of how they are compelling us toward bias.

The title of this post refers to some of the kinds of assertions I hear, by which someone states something as a fact:

  • The way things are — some assertion about fact, whether it has to do with science, economics, politics, or some other sphere. One of my favorite manifestations is when someone begins an utterance with the stark word “Fact,” followed by a colon to emphasize the factiness of what follows, then followed by an unquestioned assertion.
  • The way things were — some statement about history or the past. For example, such and such Egyptian dynasty ruled in such and such time period, or some assertion about why humans came down from the trees to live on the savanna.
  • What is true — This is really akin to the other two kinds of assertions I’m pointing to, but maybe in this case I’m thinking about an assertion that goes beyond a mere statement of some fact. Some examples might be that God exists or that he doesn’t, or that evolution is an incontrovertible fact.

An assertion might be well supported, but what I’m trying to spotlight here is the common practice of making an assertion without acknowledging the background and context surrounding the assertion and the person making it. One result is that people get into fierce arguments even though they aren’t really arguing about the same thing.

Here are some of the kinds of influences that one might make clear to provide context to an assertion:

  • The lines of evidence behind the assertion — Is the assertion based on scientific or scholarly research? Sometimes a speaker will make an assertion, basing his or her statement on the consensus within a profession or academic field. (Academic or scientific consensus doesn’t always mean the same thing as the everyday understanding of what constitutes a consensus.) One of the problems here is that there may actually be a minority that disputes the consensus view. There might be a legitimate critique that isn’t getting acknowledged when the speaker makes the assertion.
  • Assumptions — Many assertions are based in part on ideas or constructs that are taken for granted. As with lines of evidence, there might be a legitimate minority critique of a given assumption. One example would be dating a past event based on the conventional chronologies hypothesized by historians and archaeologists.
  • Definition of terms — Often people get into arguments without establishing and agreeing on the meaning of the point they are discussing. For example, people argue about whether evolution is true without coming to a prior understanding of what they mean by evolution.
  • The ideological leanings of the speaker — For someone who wants to evaluate an assertion, it could be useful to know something about the speaker’s ideological convictions. Is the speaker a theist? An atheist? A free-market fundamentalist? An eco-socialist? One problem here is that many people don’t like to admit that they subscribe to an ideology or aren’t even aware of it.
  • The speaker’s authority for making the assertion — When evaluating an assertion, it can be useful to know the speaker’s credentials.
  • The speaker’s underlying agenda — As with ideology, many speakers don’t like to own up to their agendas, which are often political or ideologically driven.

As is often the case with this writing project, my purpose here is to set out some basic ideas with the intention of coming back later to revise and add ideas and examples.

ARB — 3 Oct. 2013

I’ve discussed this question before over on ThomasNet Green & Clean — see my piece “The Climate Change Controversy — What’s It Really About?”

However, I’m meditating on a somewhat different way to articulate it. I should say that I don’t think this controversy is essentially about science. I’m not persuaded by ill-informed or politically motivated assertions, but I don’t use terms like “hoax,” “anti-science,” “pseudo-science,” or “denialism” in connection with the argument.

My current thinking is the following:

The climate change controversy is about a high-stakes struggle between science in the service of eco-socialism and misinformation in the service of free-market fundamentalism.

I’m engaged in an ongoing development of my thinking on this topic and will no doubt circle back to it. But I just wanted to pin down that idea.

ARB — 2 Nov. 2012

Just a note that I have written a post over at Tools for Thinkers with brief reviews of some of my favorite books about intelligence and how to improve it if you so desire. I review books by Tony Buzan, Jeff Hawkins, Daniel Goleman, and Joshua Foer. See “Some Great Books About Intelligence.”

ARB — 2 Nov. 2013

I’ve noticed that some people think being religious requires you to hold certain political views. I’ve been meaning to put together a post collecting some of those ideas. I plan to add to this post as I encounter new and interesting expressions.

+++++

Gary Cass, head of the Christian Anti-Defamation Commission and a former member of the executive committee of San Diego’s Republican party, says you can’t be a Christian if you don’t own a gun:

You have not just a right to bear arms, you have a duty. How can you protect yourself, your family or your neighbor if you don’t have a gun? If I’m supposed to love my neighbor and I can’t protect him, what good am I?

via The Raw Story

+++++

Bryan Fischer of the American Family Association says it will hurt God’s feelings if we stop using fossil fuels:

“And you think, that’s kind of how we’re treating God when he’s given us these gifts of abundant and inexpensive and effective fuel sources,” Fischer added. “And we don’t thank him for it and we don’t use it.”

via The Raw Story

+++++

Radio show host Rush Limbaugh says if you believe in God you can’t believe in human-caused climate change:

See, in my humble opinion, folks, if you believe in God then intellectually you cannot believe in manmade global warming.

via Grist

— ARB

Recently I wrote about confirmation bias in connection with the climate change controversy — see my article at ThomasNet, “All This Wrangling Over Climate Change – What’s Up With That?” The Skeptic’s Dictionary refers to confirmation bias as “a type of selective thinking whereby one tends to notice and to look for what confirms one’s beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one’s beliefs.”

Today I ran across a TED Talk (TED hosts and posts video talks on innovative topics) by political activist Eli Pariser, who has some interesting things to say about how the algorithms used on websites such as Facebook and Google tend to reinforce our current thinking and filter out new ideas — see his talk, “Beware Online ‘Filter Bubbles’” — well worth watching, at only nine minutes.

Pariser explains what he means by a filter bubble:

Your filter bubble is kind of your own personal, unique universe of information that you live in online … the thing is, you don’t decide what gets in, and more importantly, you don’t actually see what gets edited out.

If you and I both search for the same thing at the same time on Google, for example, we get different results. The danger of the filter bubble, says Pariser, is that

this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see.

He suggests that a personalization algorithm deciding what to show us needs to look not just at what it thinks is “relevant,” but at other factors too, such as those he lists in a slide from his presentation.

This seems like a great insight. Anyway, I highly recommend this short video to get you thinking outside the box.
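Pariser doesn’t spell out an algorithm, but just to make the idea concrete, here is a toy sketch of what a less myopic ranking function might look like. The factor names and weights are my own invented placeholders, not anything from the talk or from any real site’s code; the point is only that “relevance” becomes one input among several.

```python
# Toy illustration (not any real site's code): rank feed items by a blend of
# factors instead of predicted "relevance" alone. All factor names and
# weights below are invented placeholders.

from dataclasses import dataclass

@dataclass
class Item:
    title: str
    relevance: float   # how closely it matches what the user already likes (0-1)
    importance: float  # civic or editorial importance (0-1)
    challenge: float   # how far it sits from the user's usual viewpoint (0-1)

# Hypothetical weights; a pure filter bubble is the special case (1.0, 0.0, 0.0).
WEIGHTS = {"relevance": 0.5, "importance": 0.3, "challenge": 0.2}

def score(item: Item) -> float:
    """Blend several factors rather than optimizing for relevance alone."""
    return (WEIGHTS["relevance"] * item.relevance
            + WEIGHTS["importance"] * item.importance
            + WEIGHTS["challenge"] * item.challenge)

def rank(items):
    """Return items ordered from highest to lowest blended score."""
    return sorted(items, key=score, reverse=True)

if __name__ == "__main__":
    feed = [
        Item("Celebrity gossip you'll probably click", 0.9, 0.1, 0.1),
        Item("Report on a policy debate you usually avoid", 0.4, 0.9, 0.8),
    ]
    for item in rank(feed):
        print(f"{score(item):.2f}  {item.title}")
```

Tuning those weights is exactly the editorial judgment Pariser says the algorithms currently aren’t making.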

AB — 24 August 2011

Congratulations! You’ve never had an auto accident, even though you drive fast, tailgate, and weave in and out of traffic.

And you’re already 27 — way to go!

You will never have an auto accident. Until you do. Then you will.

Congratulations on your great skill and brilliance.

AB — 9 May 2011


The Undo function — a life-saver.

From “Behavioral issues in the use of interactive systems,” Lance A. Miller and John C. Thomas, International Journal of Man-Machine Studies, Sept. 1977:

A more complex situation, however, occurs … when a user wishes to “undo” the effects of some number of prior commands — as, for example, when a user inadvertently deletes all personal files. Recovery from such situations is handled by most systems by providing “back-up” copies of (all) users’ files, from which a user can get restored the personal files as they were some days previous. While this is perhaps acceptable for catastrophic errors, it would be quite useful to permit users to “take back” at least the immediately preceding command (by issuing some special “undo” command).
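What the authors are asking for is nowadays usually built as a command stack: each operation records how to reverse itself, and Undo simply pops and runs the most recent entry. Here is a minimal sketch of that idea (my own illustration, not anything from the paper), using a deleted-file example like the one they describe.

```python
# Minimal sketch of the command-stack idea behind Undo (my illustration,
# not from the 1977 paper): each command records how to reverse itself.

class UndoStack:
    def __init__(self):
        self._undos = []  # functions that reverse prior commands, newest last

    def record(self, undo_fn):
        self._undos.append(undo_fn)

    def undo(self):
        """Take back the immediately preceding command, if there is one."""
        if self._undos:
            self._undos.pop()()

# Example: "deleting" a file from a toy in-memory file store.
files = {"notes.txt": "remember the milk"}
history = UndoStack()

def delete(name):
    contents = files.pop(name)                               # do the command...
    history.record(lambda: files.update({name: contents}))   # ...and remember how to undo it

delete("notes.txt")
print(files)        # {} -- the file is gone
history.undo()
print(files)        # {'notes.txt': 'remember the milk'} -- restored
```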

Now if they would only invent an Undo button for one’s personal life.

AB — 15 April 2011