What if a computer could let us “feel” the texture of a fabric before we buy clothes online? Or give us a whiff — or even a taste — of a meal we’re thinking of preparing? That’s pretty game-changing stuff. And it’s also within the realm
of possibility in the next 5 years, according to IBM’s list of technologies it
thinks are on the cusp of adoption.
Every year IBM polls its R&D braintrust about which technologies may have
“been at the hairy edge before but are now closer to the scalp,” IBM fellow and
VP of innovation Bernie Meyerson told me recently. This year those
“closer-to-the-scalp” technologies converge around computers’ growing ability
to handle richer, more diverse data and churn out more valuable output — such
as the feel of cloth, the smell or taste of food. The general premise is that
these sensory and cognitive technologies will convert computers from glorified
calculators into true thinking machines.
So, without further ado, here are IBM’s sixth annual 5 in 5 technology picks.
1: Computers with a sense of touch
Even people who love shopping online say that it’s hard to get a good read on the
finished product from a digital image alone. Most of us want to feel the
fabric before we buy a big-ticket item. So what if you could sample that
cashmere coat from your cell phone before adding it to your shopping cart?
Texture data fed into a machine’s piezoelectric drivers can re-create vibrations and temperature on a touch screen to simulate that feel, Meyerson
said. “Imagine you have very fine pixels and that each can be heated and
vibrated to mimic the sensation of the cloth,” he said.
Some of this capability is available now in rudimentary form in computer games where
the controller shakes to indicate an on-screen car collision.
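To make the heated-and-vibrated-pixel idea concrete, here is a minimal sketch of how texture data might be translated into per-pixel haptic driver settings. The value ranges and linear mappings are my own illustrative assumptions, not a real haptics API.

```python
# Hypothetical sketch: map a fabric's texture data to per-pixel haptic
# driver settings (vibration amplitude and surface temperature).
# The ranges and linear mappings are illustrative assumptions.

def haptic_settings(roughness, warmth):
    """Convert normalized texture properties (0.0-1.0) into driver values.

    roughness -> vibration amplitude in micrometers (0-50 um, linear)
    warmth    -> pixel temperature in Celsius (25-40 C, linear)
    """
    if not (0.0 <= roughness <= 1.0 and 0.0 <= warmth <= 1.0):
        raise ValueError("texture properties must be normalized to [0, 1]")
    amplitude_um = 50.0 * roughness
    temperature_c = 25.0 + 15.0 * warmth
    return amplitude_um, temperature_c

# A coarse "texture map" for a swatch of cashmere: each cell holds
# (roughness, warmth) for one haptic pixel.
cashmere = [[(0.2, 0.8), (0.25, 0.8)],
            [(0.2, 0.75), (0.3, 0.8)]]

# One frame of driver settings for the whole swatch.
frame = [[haptic_settings(r, w) for r, w in row] for row in cashmere]
```

The point of the sketch is only the data flow: a texture description downloaded with the product image gets turned into drive signals, pixel by pixel.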
2: Seeing the forest, not just the trees
If you have to rasterize an image in order to analyze it, any sort of correlation
will take a long time. If the computer can instead really see and understand
that image for what it represents — say, a child, as opposed to a bunch of
pixels — it can accelerate the whole process of analysis. That in turn will
make the parsing of things like medical images and traffic video much faster.
The difference here is between the computer viewing an image and understanding
that image without having to break it down into myriad components. That’s the
way humans deal with the world. Computers could monitor scanned images of a
person over time to watch for and detect changes that indicate a health
condition before it gets too serious, for example.
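A toy example of why this matters: once a computer represents a scan as labeled structures rather than raw pixels, comparing two scans over time reduces to comparing a handful of descriptors. The structure names and sizes below are invented purely for illustration.

```python
# Toy illustration: change detection over "understood" scans.
# Each scan is a dict of {structure_name: size_mm} rather than pixels,
# so comparing scans is cheap. All labels/measurements are made up.

def describe_changes(earlier, later):
    """List differences between two scans given as {name: size_mm} dicts."""
    changes = []
    for name, size in later.items():
        if name not in earlier:
            changes.append(f"new: {name} ({size} mm)")
        elif size != earlier[name]:
            changes.append(f"{name}: {earlier[name]} mm -> {size} mm")
    for name in earlier:
        if name not in later:
            changes.append(f"resolved: {name}")
    return changes

scan_2011 = {"nodule_a": 4}
scan_2012 = {"nodule_a": 6, "nodule_b": 2}
```

Pixel-level comparison would have to correlate millions of values; the semantic comparison touches only as many items as there are recognized structures.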
3: Hearing the whole story
Just as computers need to see images as whole entities, IBM thinks they also need to
hear total sounds — ambient noise, words, music, a lot of inputs to get the
full story. “It’s not necessarily just hearing words, hearing is also
background noise … if a cell phone caller is in a car with an engine running at
2,000 rpm, you might even be able to tell if the driver is stuck in traffic or
moving smoothly,” Meyerson said.
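Meyerson’s engine example can be sketched in a few lines: an engine turning at 2,000 rpm produces a fundamental tone near 33 Hz (rpm ÷ 60), which simple signal analysis can recover from audio. Here a synthesized sine wave stands in for a real recording, and a zero-crossing count stands in for real spectral analysis.

```python
import math

# Rough sketch of inferring engine RPM from the dominant frequency of
# background noise. The synthesized sine wave and the zero-crossing
# method are stand-ins for real audio and real spectral analysis.

SAMPLE_RATE = 8000  # samples per second (assumed)

def estimate_rpm(samples, sample_rate):
    """Estimate engine RPM from a mono audio buffer.

    Counts rising zero crossings (one per cycle) to get the fundamental
    frequency, then converts to revolutions per minute (freq_hz * 60).
    """
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if a < 0.0 <= b
    )
    duration_s = len(samples) / sample_rate
    freq_hz = crossings / duration_s
    return freq_hz * 60.0

# One second of "audio" from an engine idling at 2,000 rpm (~33.3 Hz).
engine = [math.sin(2 * math.pi * (2000 / 60) * t / SAMPLE_RATE)
          for t in range(SAMPLE_RATE)]
```

A real system would separate the engine tone from speech and other noise first, but the basic inference — sound feature in, situational guess out — is the same.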
By embedding sensors in flood-prone areas, this technology could warn users, based on what it’s learned from past sounds, whether a mudslide is likely. Computers could likewise learn from past experience when a baby’s cry is due to a wet diaper, teething, or something more serious.
4: Digitized taste buds
IBM’s brainiacs think that machines will increasingly be able to taste things — like chocolate or eggplant — and figure out why people do or don’t like that
taste. As Kevin Fitchard, GigaOM’s resident foodie, recently reported, some of
this is happening now. For example, researcher and app developer Foodpairing “has
broken down flavor to its molecular components and has compiled databases
that can match the flavor of those ingredients against other completely
different ingredients. By compiling “foodpairing trees” its technology can
identify vegetable or seafood ingredients that reinforce the flavor of
different meats, or in some cases, can act as a substitute for a meat
entirely.”
This understanding of the chemical elements of food could help people get healthier
by subbing in something that tastes like milk chocolate but is better for them.
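The matching idea in the quote above can be sketched simply: represent each ingredient as a set of flavor compounds, and pair ingredients that share the most. The ingredients and compound names below are made up for illustration; Foodpairing’s real databases are far richer.

```python
# Sketch of the "foodpairing" idea: ingredients as sets of flavor
# compounds, paired by overlap. All names below are illustrative,
# not real chemistry data.

FLAVORS = {
    "milk chocolate": {"vanillin", "furaneol", "maltol", "pyrazine"},
    "roasted carob":  {"furaneol", "maltol", "pyrazine"},
    "basil":          {"linalool", "eugenol"},
    "strawberry":     {"furaneol", "linalool", "mesifurane"},
}

def best_pairing(target, candidates):
    """Return the candidate sharing the most flavor compounds with target."""
    goal = FLAVORS[target]
    return max(candidates, key=lambda name: len(goal & FLAVORS[name]))

# A hypothetical healthier stand-in for milk chocolate:
substitute = best_pairing("milk chocolate",
                          ["roasted carob", "basil", "strawberry"])
```

The same overlap measure that finds a complementary pairing can also rank substitutes, which is the health angle described above.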
5: A nose that knows
Breath analysis can do more than keep drunk drivers off the road. What if your
smartphone could tell from your breath that you’re about to get a cold? It’s
conceivable that your doctor would be able to diagnose you remotely based on
that information and prescribe treatment. This technology could also sniff out
minuscule amounts of environmental toxins before they hit critical mass, which
could have broad public health ramifications.
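One way such a diagnosis might work is pattern matching: compare a breath sample’s chemical-sensor readings against known reference profiles and report the closest. The compounds, readings, and the premise that these markers indicate a cold are all illustrative assumptions, not medical fact.

```python
# Hedged sketch of breath-pattern diagnosis via nearest-neighbor
# matching. All compound names, values, and condition labels are
# invented for illustration only.

REFERENCES = {
    "healthy":     (0.1, 0.2, 0.1),  # (acetone, ammonia, nitric oxide), ppm
    "early cold":  (0.1, 0.2, 0.8),
    "dehydration": (0.9, 0.3, 0.1),
}

def classify_breath(reading):
    """Return the reference condition closest to the sensor reading."""
    def dist(ref):
        return sum((a - b) ** 2 for a, b in zip(reading, REFERENCES[ref]))
    return min(REFERENCES, key=dist)

sample = (0.15, 0.25, 0.7)  # a hypothetical breath reading
```

A real diagnostic would need far more markers, calibration, and clinical validation; the sketch only shows the sensor-pattern-to-label step.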
And then there’s just the quality of life aspect. “You can paint chemical sensors
on a surface and when they detect a pattern, they give off a smell — you could
make a rich paint with all sorts of sensors that mimic things that you like,”
Meyerson said.
So, how’s IBM doing as a soothsayer?
Since I’m still waiting for the jet packs we were promised decades ago,
I’m skeptical about technology predictions, but IBM’s list provides a good
starting point to track tech progress and priorities. It’s also fun to grade
its prognostication skills.
Looking at last year’s 5 in 5 predictions, it’s fair to say there are hits and misses.
For example, last year it said junk mail will get so targeted it will actually
cease to be junk at all. If that’s happening, I’m not seeing it.
Another 2011 prediction was that we’d get much better at capturing and using wasted kinetic energy — from people walking, riding bikes, running water, etc.
There is early traction there. Los Angeles is testing advanced flywheel
technology as a way to reap wasted energy from braking trains and re-apply
it when trains accelerate. And Pavegen is building sidewalk tiles
designed to capture energy from walking pedestrians.
As for mind-reading headsets that measure our brain activity and recognize our
facial expressions: Um, no, don’t think so. But to be fair, IBM has 4 more
years to make good.
Taking the longer view and looking at IBM’s inaugural list from 2006, it does better. It
was on the money with its call that people would be able to access healthcare
remotely. There are lots of tele-radiology options and doctors can even perform
surgery remotely. IBM also predicted real-time speech translation now
exemplified by products like Samsung’s Galaxy speech translation. Meyerson
admits to some less successful calls — especially one about hydrogen-powered
vehicles — but he’s pretty happy overall with IBM’s effort.