The Hole in the Wall: Computing for India’s Impoverished

Below is a blog entry I posted a few years ago when I was working for TMCnet. I wanted to refer to it in an upcoming post, but it has disappeared from the TMCnet web site.

Transferred over on 31 March 2009 from Al Bredenberg’s VOIP & CRM Blog (linking here to the Wayback Machine’s archived version):

VoIP for the Developing World

Rich Tehrani wrote a fascinating blog entry today about the potential connection between MIT’s $100-laptop program and the future possibilities for VoIP in developing countries. See his essay at:

VoIP Helps the Needy

In part, Rich writes:

… imagine if there was a way to get computers into the hands of more children. What would this do for the world’s developing nations and how would it help children? Imagine they would now be able to compute inexpensively and have access to the Internet and also speak for free with others.

This is a huge deal because in many parts of the world there aren’t telephones or even telephone lines. Many children don’t even understand the concept of the telephone. What if we could get them to access the web, allow them to compose documents, blog and talk for free? What an amazing world that would be. What an exciting place to live. What a more interconnected planet we would live on.

This reminds me of the fascinating story of “The Hole in the Wall,” which I heard about a couple of years ago.

Sugata Mitra, a computer scientist in India, decided to place a computer with a high-speed Internet connection in a hole in the wall that separated the high-tech company he worked for from the slum next door. He found that the kids from the neighborhood, who had never seen a computer, very quickly figured out how to use it and how to perform complex tasks over the Internet. The last I heard, he was instituting a program to make public-access computers available in poor neighborhoods in many areas of India.

One of the incidents I recall from the story was that a reporter asked one of the kids how he learned to use a computer so well, and the kid answered, ‘What’s a computer?’

AB — 10/3/05

Phantom limbs and the wiring of the neocortex

A story from NPR this morning, “How Do You Amputate a Phantom Limb?,” got me thinking about the amazing integration of the body and the brain.

Most of us have heard about the phenomenon of phantom limbs experienced by many people who have had limbs amputated. On the NPR segment, Radio Lab interviewers review a story told by Dr. V.S. Ramachandran of the University of California at San Diego, who had a patient plagued by pain in his phantom arm. Ramachandran was able to use a mirror to simulate the missing limb and teach the patient’s brain that the limb was really gone, after which the pain ceased.

For me, this resonates with the brain theory propounded by Jeff Hawkins in his book On Intelligence. Hawkins is best known as the computer architect who founded Palm Computing, but he also trained as a neuroscientist. His book makes an interesting case that intelligent machines are possible but that the conventional artificial intelligence (AI) model is wrongheaded.

However, in discussing how brains work, he helps the reader to appreciate that the makeup of the brain of any creature is very much determined by that creature’s physical and sensory makeup.

The traditional, rather stereotyped idea of the brain is that brains are mapped out according to various cognitive and sensory functions. And on one level that is true. But Hawkins makes the point that the cortex is an extremely flexible structure:

… the wiring of the neocortex is amazingly “plastic,” meaning it can change and rewire itself depending on the type of inputs flowing into it. For example, newborn ferret brains can be surgically rewired so that the animals’ eyes send their signals to the areas of cortex where hearing normally develops. The surprising result is that the ferrets develop functioning visual pathways in the auditory portions of their brains … they see with brain tissue that normally hears sounds.

Commenting on human brains, Hawkins writes that

Adults who are born deaf process visual information in areas that normally become auditory regions. And congenitally blind adults use the rearmost portion of their cortex, which ordinarily becomes dedicated to vision, to read braille. Since braille involves touch, you might think it would primarily activate touch regions — but apparently no area of cortex is content to represent nothing. The visual cortex, not receiving information from the eyes like it is “supposed” to, casts around for other input patterns to sift through — in this case, from other cortical regions.

What this shows, he maintains, is that “brain regions develop specialized functions based largely on the kind of information that flows into them during development.”

Ramachandran’s results with phantom limb pain support the idea that the brain maintains its flexibility in adulthood and its sensory areas can be rewired in dramatic ways.

AB — 18 March 2009

SixthSense prototype portends “The Internet of Things”

Today I learned about SixthSense, a wearable gestural computer interface developed at MIT’s Fluid Interfaces Group, a research group devoted to the design of interfaces that are “more immersive, more intelligent, and more interactive.”

Here’s how the group describes the interface:

The SixthSense prototype is comprised of a pocket projector, a mirror and a camera. The hardware components are coupled in a pendant like mobile wearable device. Both the projector and the camera are connected to the mobile computing device in the user’s pocket. The projector projects visual information enabling surfaces, walls and physical objects around us to be used as interfaces; while the camera recognizes and tracks user’s hand gestures and physical objects using computer-vision based techniques.
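
Out of curiosity about how that camera-based gesture tracking might work, here is a minimal sketch in Python, assuming OpenCV is available: colored markers on the fingertips are segmented by hue and their centroids tracked frame to frame. The color ranges, camera index, and marker names are my own illustrative values, not anything taken from the MIT prototype.

```python
# Minimal sketch of the kind of computer-vision fingertip tracking the
# SixthSense description implies: colored fingertip markers are segmented
# by hue and their centroids tracked frame to frame.
# Assumes OpenCV 4.x (cv2) and a webcam at index 0; the HSV ranges below
# are illustrative, not the project's actual calibration.

import cv2
import numpy as np

# Hypothetical HSV ranges for two colored fingertip markers.
MARKER_RANGES = {
    "red":  ((0, 120, 120), (10, 255, 255)),
    "blue": ((100, 120, 120), (130, 255, 255)),
}

def find_marker(hsv_frame, lower, upper):
    """Return the (x, y) centroid of the largest blob in the given hue range."""
    mask = cv2.inRange(hsv_frame, np.array(lower), np.array(upper))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    for name, (lo, hi) in MARKER_RANGES.items():
        pos = find_marker(hsv, lo, hi)
        if pos:
            cv2.circle(frame, pos, 8, (0, 255, 0), 2)  # mark the tracked fingertip
    cv2.imshow("fingertip tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

In a full system like the one described above, the tracked fingertip positions would then be interpreted as gestures and combined with the projector to turn any surface into an interface.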

These images give you an idea how the prototype works and the kind of functionality it presages:

[Images: the SixthSense pendant prototype in use, projecting interfaces onto nearby surfaces and responding to hand gestures]

Here’s a link to a video that shows some great demos of SixthSense.

The Fluid Interfaces Group’s work makes me think of one of the best film portrayals of a futuristic computer interface: the one Tom Cruise uses in the film Minority Report. In the movie, Cruise’s character uses virtual-reality gloves to manipulate a large virtual interface in front of him — very exciting to see.

This work by the Fluid Interfaces Group touches on the “Internet of Things,” an idea I first heard put forward by the Auto-ID Labs, a group working in the area of networked RFID. One of our ILO Institute reports on new directions for RFID discussed some of the possibilities for this Internet of Things:

If miniature Web pages and servers could be embedded in building materials, components of vehicles and aircraft, furniture, appliances, apparel, and other places, this could have huge implications for marketing, communication, and provision of services, not to mention changing the very nature of the world around us.
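
Just to make the idea concrete, here is a toy Python sketch of my own (not from the report): a single object answering HTTP requests with its identity and status, standing in for a “miniature Web page” embedded in a thing. The appliance ID and fields are invented for the example.

```python
# A toy illustration of the "miniature Web pages and servers" idea: a tagged
# object exposing its identity and status as a tiny HTTP endpoint. In practice
# this would run on an embedded chip; here it is just Python's standard-library
# server on port 8080, and the appliance details are hypothetical.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical identity of the "thing" serving this page.
THING_STATUS = {
    "id": "appliance-0042",
    "type": "washing machine",
    "firmware": "1.0.3",
    "state": "idle",
}

class ThingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Every request returns the object's self-description as JSON.
        body = json.dumps(THING_STATUS).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8080), ThingHandler).serve_forever()
```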

MIT’s Sanjay Sarma tells ILO researchers that this Internet of Things is “going to have a huge impact,” and that RFID is one of the key enabling technologies. He points out that RFID creates a greatly increased connection between the physical world and the world of information by connecting more data to physical things and transferring it at much greater speeds in much greater volumes. “We used to connect data to the physical world through keyboards, but there’s only so much data you can get in through the keyboard. But with RFID it’s automatic and it’s happening all the time.”
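
To make that contrast concrete, here is a small, purely illustrative Python sketch (mine, not from the ILO report or Sarma): a simulated reader emits tag-read events continuously, and a bit of software folds them into a live “last seen” view, with no keyboard anywhere in the loop. The reader name and tag IDs are invented.

```python
# Illustrative sketch of automatic, continuous RFID data capture: a simulated
# reader yields tag-read events, and software aggregates them into a view of
# where each tagged item was last seen. All identifiers here are made up.

import random
import time
from dataclasses import dataclass

@dataclass
class TagRead:
    tag_id: str      # EPC-style identifier for the tagged object (hypothetical)
    reader_id: str   # which fixed reader saw the tag
    timestamp: float

def simulated_reader(reader_id, tag_ids, reads_per_second=5):
    """Yield an endless stream of tag reads, standing in for reader hardware."""
    while True:
        yield TagRead(random.choice(tag_ids), reader_id, time.time())
        time.sleep(1.0 / reads_per_second)

def track_last_seen(read_stream, max_reads=25):
    """Fold the event stream into a 'last seen at' view of the tagged items."""
    last_seen = {}
    for i, read in enumerate(read_stream):
        last_seen[read.tag_id] = (read.reader_id, read.timestamp)
        if i + 1 >= max_reads:
            break
    return last_seen

if __name__ == "__main__":
    tags = [f"EPC-{n:04d}" for n in range(3)]       # three hypothetical tagged items
    stream = simulated_reader("dock-door-1", tags)  # one hypothetical reader
    for tag, (reader, ts) in track_last_seen(stream).items():
        print(f"{tag} last seen at {reader}")
```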

Sarma says that the Internet of Things will allow you to “have control in your enterprise in a way that is completely unprecedented.” Sarma calls this control “high-resolution management—management with eyes everywhere, as opposed to management by gut reactions and guesswork.”

The high volume and extreme complexity of this Internet of Things presents unique opportunities and challenges for the technology provider. “If you are in this market,” says Sarma, “you should be looking more and more at distributed computation, and you should be looking at embedded computations, at areas related to distributed software, at software related to data acquisition, and at software related to process change. They’ll all be changing in the next ten years.”

(“Directions for New RFID Initiatives,” ILO Institute, Aug. 23, 2006)

AB — 17 March 2009

Best argument for remixing: Watch this video

Just today I saw a video that is probably the best argument I have ever seen in favor of remixing. Please watch and listen to Mother of All Funk Chords. Fantastic!

Here the author explains how he makes his remixed music videos.

If you prefer an intellectual argument over an experiential one, see this video lecture by Stanford law professor Lawrence Lessig. Lessig has spoken at some of our meetings at the Institute for Innovation in Large Organizations (ILO). See Lessig’s blog here.

Lessig is probably the best thinker on remixing around, and he is interesting to listen to. But watching Mother of All Funk Chords is a lot more fun.

AB — 10 March 2009

Bionic eye is restoring sight for the blind

BBC is carrying an interesting story today about some initial successes in trials of a bionic eye — see “Bionic eye gives blind man sight.”

Moorfields Eye Hospital in the UK is carrying out trials of the technology with three patients suffering from retinitis pigmentosa, an inherited disease that causes retinal degeneration. Eighteen patients around the world are participating in trials.

The bionic eye is able to send “meaningful visual stimuli” to the brains of patients, according to a retinal surgeon.

As an example of what it can do, one patient quoted in the BBC article says the device allows him to sort light from dark laundry.

Here’s a video that gives an idea how the technology works.

AB — 4 March 2009