Livewired Brain #3: Sensory Substitution
In Livewired, his book on the remarkable changeability of the brain, David Eagleman asks: how flexible is the brain, really? Can it learn to make sense of an altogether new format of data?
“Would a small electronic chip, speaking the dialect of Silicon Valley instead of the language of our natural biological sense organs, be understood by the rest of the brain?”
If that sounds like a question for the future, you’d be wrong. For people whose inner ear isn’t working, no amount of amplification will help. Instead, they’ve had the option of cochlear implants since 1982:
“This tiny device circumvents the broken hardware of the inner ear to speak directly to the functioning nerve just beyond it… (The implanted microcomputer) receives sound from the outside world and passes the information to the auditory nerve by means of tiny electrodes.”
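To make that concrete, here is a minimal Python sketch of the core idea: split incoming sound into frequency bands and derive one stimulation level per electrode. The function name, the number of electrodes, and the band edges are illustrative assumptions; real implants use far more sophisticated processing.

```python
import numpy as np
from scipy.signal import butter, sosfilt, hilbert

def audio_to_electrodes(signal, fs, n_electrodes=8, lo=200.0, hi=7000.0):
    """Toy filter-bank sketch of cochlear-implant-style processing.

    Splits the sound into n_electrodes frequency bands, extracts each
    band's envelope, and returns one stimulation level per electrode.
    """
    # Bands spaced logarithmically, mimicking the cochlea's tonotopic
    # layout (low frequencies at one end, high at the other).
    edges = np.logspace(np.log10(lo), np.log10(hi), n_electrodes + 1)
    levels = []
    for low, high in zip(edges[:-1], edges[1:]):
        sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
        band = sosfilt(sos, signal)
        envelope = np.abs(hilbert(band))   # slow-varying loudness of the band
        levels.append(envelope.mean())     # one stimulation level per electrode
    return np.array(levels)

# Example: a 1 kHz tone mostly excites the electrode whose band covers 1 kHz.
fs = 16000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 1000 * t)
print(audio_to_electrodes(tone, fs).round(3))
```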
A recipient of the implant said it took some time for his brain to make sense of the new format of data, but soon he could understand sentences, then hold a conversation, and eventually even get by in a loud bar. That’s impressive, you say, but that didn’t really sound like a change in data format…
Technology has already allowed us to test the idea of “sensory substitution”: data that would normally come through, say, the eye is instead sent via, say, the skin. In this example, you can’t just bombard the skin with photons and hope the subject learns to see. Instead, you translate the signal carried by the photons into a format the skin is already designed to detect and transmit:
“(They created) a grid of four hundred Teflon tips… The tips could be extended and retracted by mechanical solenoids. Over the blind man’s head a camera was mounted… The video stream of the camera was converted into a poking of the tips against the volunteer’s back.”
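As a rough illustration of that translation step (a toy sketch, not the researchers’ actual pipeline), here is how a grayscale video frame might be reduced to a 20×20 grid of tip extensions: four hundred values, one per Teflon tip.

```python
import numpy as np

GRID = 20  # 20 x 20 = four hundred tips, as in the experiment

def frame_to_tips(frame):
    """Toy sketch: reduce a grayscale video frame to a 20x20 grid of
    tip extensions (0.0 = retracted, 1.0 = fully extended).

    Each grid cell is driven by the average brightness of the image
    patch it covers -- bright regions poke the skin, dark ones don't.
    """
    h, w = frame.shape
    tips = np.zeros((GRID, GRID))
    for i in range(GRID):
        for j in range(GRID):
            patch = frame[i * h // GRID:(i + 1) * h // GRID,
                          j * w // GRID:(j + 1) * w // GRID]
            tips[i, j] = patch.mean() / 255.0
    return tips

# Example: a frame with a bright vertical bar becomes a column of extended tips.
frame = np.zeros((240, 320), dtype=np.uint8)
frame[:, 150:170] = 255
print(frame_to_tips(frame).round(1))
```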
What happened next was stunning:
“Over days of training, he got better at identifying objects by their feel… The experience wasn’t exactly like vision, but it was a start.”
Several other devices have been built for other forms of “sensory substitution”, and the brain does learn to make sense of them all. So why haven’t such devices caught on? They’re usually too big, too heavy, or too low-resolution, but the idea does work. So how can all these “strange approaches” possibly work?!
“Because inputs to the brain – photons at the eye, air compression waves at the ear, pressure on the skin – are all converted into the common currency of an electrical signal. As long as the incoming spikes carry information that represents something important about the outside world, the brain will learn to interpret it.”
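That “common currency” idea can itself be sketched in a few lines: whatever a level measures (pixel brightness, band loudness, skin pressure), encoding it as a firing rate makes every sense look the same downstream. This is a toy rate-coding model assuming Poisson spiking; all names and parameters are illustrative.

```python
import numpy as np

def to_spike_train(levels, duration=1.0, max_rate=100.0, dt=0.001, seed=0):
    """Toy rate-coding sketch of the brain's 'common currency'.

    Whatever the source -- pixel brightness, band loudness, skin
    pressure -- a normalized level in [0, 1] becomes a spike train
    whose firing rate carries the information.
    """
    rng = np.random.default_rng(seed)
    steps = int(duration / dt)
    # Poisson spiking: the probability of a spike in each small time
    # step is proportional to the input level.
    probs = np.clip(levels, 0.0, 1.0)[:, None] * max_rate * dt
    return rng.random((len(levels), steps)) < probs

# The same function encodes 'visual' and 'auditory' inputs alike:
vision_levels = np.array([0.1, 0.9, 0.5])  # e.g. three tactile-grid cells
sound_levels = np.array([0.8, 0.2, 0.4])   # e.g. three electrode bands
print(to_spike_train(vision_levels).sum(axis=1))  # spikes per second, per channel
print(to_spike_train(sound_levels).sum(axis=1))
```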
All of which is why Eagleman says, without too much exaggeration, that:
“The brain is a general-purpose computing device… (and the sense organs) are merely plug-and-play devices.”