Is it true that the “Close Door” buttons on elevators don’t work?

Recently an acquaintance smugly told me that the “Close Door” buttons on elevators don’t work — they are just there as a psychological sop to make passengers think they actually have some control.

I didn’t contest this assertion — I had heard it before and wasn’t certain one way or the other. I was deeply suspicious, however — it smacked of the bogus rumors and conspiracy theories you hear all the time, or at least sounded like one of those things that everybody knows that just aren’t true.

So I was happy to learn today that Cecil Adams of The Straight Dope has already (in 1986) dealt with this critical question in his thorough and inimitable manner — he even interviewed representatives of the Otis elevator company and various elevator repairmen. See “Do ‘close door’ buttons on elevators ever actually work?”

The upshot is that the “Close Door” button is not an evil conspiracy to manipulate people into pushing a fake button hoping for a reward like Pavlov’s dogs. That’s not to say that they always work — they could be broken or disconnected at the request of the building’s owner. Here’s another reason Cecil gives as to why these buttons don’t always seem to work:

The button really does work, it’s just set on time delay. Suppose the elevator is set so that the doors close automatically after five seconds. The close-door button can be set to close the doors after two or three seconds. The button may be operating properly when you push it, but because there’s still a delay, you don’t realize it.
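Cecil’s time-delay explanation is easy to model. Here’s a minimal sketch of that logic — the timing constants and function are hypothetical, not taken from any real elevator controller:

```python
# Sketch of a door-close timer: the "Close Door" button doesn't slam the
# doors shut; it just switches to a shorter automatic delay. All timings
# here are made up for illustration.

AUTO_CLOSE_DELAY = 5.0    # seconds until the doors close on their own
BUTTON_CLOSE_DELAY = 2.0  # shorter delay used once the button is pressed

def seconds_until_close(elapsed: float, button_pressed: bool) -> float:
    """How much longer the doors stay open, `elapsed` seconds after opening."""
    delay = BUTTON_CLOSE_DELAY if button_pressed else AUTO_CLOSE_DELAY
    return max(0.0, delay - elapsed)

# Press the button one second after the doors open: they still stay open
# another second, so the press *seems* to have done nothing.
print(seconds_until_close(1.0, button_pressed=True))   # 1.0
print(seconds_until_close(1.0, button_pressed=False))  # 4.0
```

The remaining one-second wait is exactly the gap Cecil describes — the button worked, but the residual delay hides the effect.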

AB — 9 September 2011

How the Internet Reinforces Confirmation Bias

Recently I wrote about confirmation bias in connection with the climate change controversy — see my article at ThomasNet, “All This Wrangling Over Climate Change – What’s Up With That?” The Skeptic’s Dictionary refers to confirmation bias as “a type of selective thinking whereby one tends to notice and to look for what confirms one’s beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one’s beliefs.”

Today I ran across an interesting TED Talk (TED hosts and posts video talks on innovative topics) by political activist Eli Pariser, who has some interesting things to say about how the algorithms used on web sites such as Facebook and Google tend to reinforce our current thinking and filter out new ideas — see his talk, “Beware Online ‘Filter Bubbles’” — well worth watching, only nine minutes.

Pariser explains what he means by a filter bubble:

Your filter bubble is kind of your own personal, unique universe of information that you live in online … the thing is, you don’t decide what gets in, and more importantly, you don’t actually see what gets edited out.

If you and I both search for the same thing at the same time on Google, for example, we get different results. The danger of the filter bubble, says Pariser, is that

this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see.

He suggests that a personalization algorithm deciding what to show us needs to look not just at what it thinks is “relevant,” but at other factors too, such as those he lists in a slide from his presentation.
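Pariser’s suggestion amounts to scoring items on more than predicted relevance. Here’s a toy sketch of what that might look like — the factor names, weights, and example items are all illustrative, not drawn from any real ranking system:

```python
# Toy multi-factor ranker: instead of sorting purely by "relevant",
# score each item as a weighted sum over several editorial factors.
# Factors, weights, and items below are invented for illustration.

def score(item: dict, weights: dict) -> float:
    """Weighted sum over whichever factors we choose to value."""
    return sum(weights[f] * item.get(f, 0.0) for f in weights)

weights = {"relevant": 0.5, "important": 0.3, "challenging": 0.2}

items = [
    {"title": "Celebrity news",
     "relevant": 0.9, "important": 0.1, "challenging": 0.0},
    {"title": "Foreign policy report",
     "relevant": 0.4, "important": 0.9, "challenging": 0.8},
]

ranked = sorted(items, key=lambda it: score(it, weights), reverse=True)
print([it["title"] for it in ranked])
# ['Foreign policy report', 'Celebrity news']
```

With relevance alone, the celebrity item wins; once “important” and “challenging” carry weight, the harder story surfaces — which is Pariser’s point about what a filter optimized only for relevance edits out.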

This seems like a great insight. Anyway, I highly recommend this short video to get you thinking outside the box.

AB — 24 August 2011

Cisco’s Infographic About the Internet of Things

On the Cisco blog on July 15, 2011, Dave Evans, Cisco’s Chief Futurist in their Innovations Practice, posted the following infographic about the Internet of Things, which I’ve been writing about for a few years — see “Developing the Internet of Things and a Smarter Planet” and “Is an ‘Energy Internet’ Emerging?,” which touches on a similar idea.

Click on this image to link through to the full-size original:

Infographic about the Internet of Things

I’m as much interested in the infographic as a method for the visual presentation of information as I am about the particular content of any infographic — in examining any of these presentations, I think it’s important to understand the data sources and to recognize that these graphics are simplifications of research that is often quite complicated.

I notice that the author of this graphic says that by the end of 2011, “20 typical households will generate more Internet traffic than the entire Internet in 2008.” While the denizens of Casa Bredenberg no doubt generate a lot of traffic as Internet users, I doubt whether the objects in our house are right now generating 5 percent as much traffic as the 2008 Internet. Maybe if Progress Energy eventually gets its smart-grid rollout going …

AB — 18 July 2011

SEO Angst: The Secret of Search Engine Optimization

Many who manage web sites invest great effort and expense in search engine optimization (SEO), the practice of optimizing the content and format of a site and its pages so as to attract the most search engine traffic.

SEO is important to online businesses, because qualified web traffic can translate into eyeballs (if a site sells advertising) or sales (if it’s an e-commerce site) or potential clients (if the site is run by, say, a consulting firm).

I’ve been around the practice of SEO for about 15 years (before it was even called SEO), and I’ve come to believe in a central truth about it:

If you want search engine traffic, the first thing you have to do is deserve it.

This means providing honest, substantive content.

This also means offering well-executed services and a customer experience that serves the visitor well.

This concept is approximately equivalent to customer-centeredness in marketing or user-centered design in software development. A business has to make a profit, try to grow, strive for market share — but business success in the long term is hard to come by without a strong customer focus, or user focus in the case of web traffic.

By all means, optimize your site for search engine traffic, but be aware that few businesses make it for very long by tricking Google.

Do what you can to direct web traffic to your site, but make sure you deserve it.

AB — 5 May 2011

Undo: One of the Greatest Innovations in Computing

The Undo function — a life-saver.

From “Behavioral issues in the use of interactive systems,” Lance A. Miller and John C. Thomas, International Journal of Man-Machine Studies, Sept. 1977:

A more complex situation, however, occurs … when a user wishes to “undo” the effects of some number of prior commands — as, for example, when a user inadvertently deletes all personal files. Recovery from such situations is handled by most systems by providing “back-up” copies of (all) users’ files, from which a user can get restored the personal files as they were some days previous. While this is perhaps acceptable for catastrophic errors, it would be quite useful to permit users to “take back” at least the immediately preceding command (by issuing some special “undo” command).
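The command-level “take back” that Miller and Thomas call for is usually built as a stack of inverse operations. Here’s a minimal sketch in that spirit — the `Editor` class and its file model are invented for illustration, not from the paper:

```python
# Minimal command-level undo: each command records enough information
# to reverse itself, and undo() reverses the most recent command.
# The Editor class here is a hypothetical illustration.

class Editor:
    def __init__(self):
        self.files = {}       # filename -> contents
        self.undo_stack = []  # (action, filename, prior contents or None)

    def write(self, name, contents):
        # Record what was there before (None if the file is new).
        self.undo_stack.append(("write", name, self.files.get(name)))
        self.files[name] = contents

    def delete(self, name):
        # Record the contents so an undo can restore them.
        self.undo_stack.append(("delete", name, self.files[name]))
        del self.files[name]

    def undo(self):
        action, name, previous = self.undo_stack.pop()
        if action == "delete":
            self.files[name] = previous   # bring the deleted file back
        elif previous is None:
            del self.files[name]          # the write created the file
        else:
            self.files[name] = previous   # the write overwrote it

ed = Editor()
ed.write("notes.txt", "draft")
ed.delete("notes.txt")   # the inadvertent deletion from the paper
ed.undo()                # one command takes it back
print(ed.files)          # {'notes.txt': 'draft'}
```

Because every command pushes its inverse, repeated calls to `undo()` walk back through the whole history — a long way from restoring “the personal files as they were some days previous.”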

Now if they would only invent an Undo button for one’s personal life.

AB — 15 April 2011

Where the Big Green Copier Button Came From

Recently I’ve been studying the use of ethnography in large companies for product design and market strategy, which relates to some of the work I’ve done in usability and user experience.

In the course of that research, I ran across an interesting anecdote about how the “big green button” on copiers came about. I think it illustrates the value of video ethnography in product design, but, on an even more basic level, the value of simply watching how people live and work and use your product.

In a 1999 presentation for WPT Fest, Xerox PARC anthropologist Lucy Suchman described how she helped Xerox engineers understand how difficult copiers were to use:

Around this time [1979] a project began at PARC to develop an intelligent, interactive expert system that would provide instructions to users in the operation of a particular photocopier, just put on the market and reported by its intended users to be “too complicated.” With Austin Henderson, I initiated a series of studies aimed first at understanding what made the existing machine difficult to use, and later at seeing just what happened when people engaged in “interactions” with my colleagues’ prototype expert advisor.

In order to explore these questions in detail we got a machine ourselves and installed it in our workplace. I then invited others of my co-workers, including some extremely eminent computer scientists, to try using the machine to copy their own papers for colleagues, with the understanding that a video camera would be rolling while they did so. This resulted among other things in what has become something of a cult video that I produced for John Seely Brown for a keynote address to CHI in 1983, titled “When User Hits Machine.” This image, taken from a 3/4″ reel-to-reel video recording made in 1982, shows two of my colleagues using the machine to make two-sided copies of a research paper. The CHI audience would recognize Allen Newell, one of the founding fathers of AI. His PARC colleague is a brilliant computational linguist named Ron Kaplan.

Video ethnographer Susan Faulkner of Intel relates one of the interesting results of Suchman’s video:

The film was shown to researchers and engineers at Xerox, and it led to significant changes in interface design, including the addition of the now ubiquitous large green button that allows users to quickly and easily make a copy.

AB — 2 June 2010