improves understanding of mobility problems
The software tool presents data visually and
this allows those without specialist
training – both professionals and older
people – to better understand and contribute
to discussions about the mechanics of
movement, known as biomechanics, when
carrying out everyday activities.
The software takes motion capture data and
muscle strength measurements from older
people undertaking everyday activities.
The software then generates a 3D animated
human stick figure on which the
biomechanical demands of the activities are
represented visually at the joints.
These demands, or stresses, are shown as a
percentage of maximum capability through a
colour gradient: green is 0 per cent, amber
is 50 per cent and red is 100 per cent or
more.
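The traffic-light colouring described above can be sketched as a simple interpolation over the demand-to-capability ratio. This is an illustrative assumption, not the tool's actual implementation; the function name `demand_colour` and the exact amber RGB value are invented for the example.

```python
def demand_colour(demand, capability):
    """Map a joint demand to an RGB traffic-light colour.

    demand / capability gives the load as a fraction of maximum capability:
    0.0 -> green, 0.5 -> amber, 1.0 or above -> red.
    (Illustrative sketch; the published tool's colour scheme may differ.)
    """
    pct = min(max(demand / capability, 0.0), 1.0)  # clamp to [0, 1]
    if pct <= 0.5:
        # interpolate green (0, 255, 0) -> amber (255, 191, 0)
        t = pct / 0.5
        return (int(255 * t), int(255 - 64 * t), 0)
    # interpolate amber (255, 191, 0) -> red (255, 0, 0)
    t = (pct - 0.5) / 0.5
    return (255, int(191 * (1 - t)), 0)

print(demand_colour(30, 100))   # modest load: green shading toward amber
print(demand_colour(120, 100))  # over capability: clamped to red, (255, 0, 0)
```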
The research shows the new software tool has
the potential to improve diagnostic,
therapeutic, communication and education
procedures by increasing the use and
integration of biomechanical expertise in
both design and healthcare practices.
The visualisation software could be used to
improve the designer's understanding of the
different needs when developing products for
older people, including enhancing the
ergonomic as well as the functional
attributes of products, and improving the
design of landscapes and buildings.
In a healthcare setting the tool could be
used as part of a range of assessment
techniques. It could improve the
understanding by different healthcare
professions of older people's mobility
challenges and improve communication across
these professions to provide a more
joined-up approach to clinical assessment,
diagnosis and rehabilitation.
Commenting on the research, Professor
Alastair Macdonald of the Glasgow School of
Art said: "The visualisation software is a
simple yet highly effective tool to help
older people and professionals explain,
discuss and address mobility problems.
Better understanding of older people's
mobility can help healthcare professionals
improve diagnosis or treatment of problems,
and help design professionals adapt the way
they design for older people."
Newswise, July 2010 — A unique device based on
sniffing—inhaling and exhaling through the
nose—might enable numerous disabled people
to navigate wheelchairs or communicate with
their loved ones.
Sniffing technology might even be used in the future to
create a sort of “third hand” to assist
healthy surgeons or pilots.
Developed by Prof. Noam Sobel, electronics
engineers Dr. Anton Plotkin and Aharon
Weissbrod, and research student Lee Sela in
the Weizmann Institute of Science’s
Department of Neurobiology, the new system
identifies changes in air pressure inside
the nostrils and translates these into
electrical signals.
The device was tested on healthy volunteers as well as
quadriplegics, and the results showed that
the method is easily mastered. Users were
able to navigate a wheelchair around a
complex path or play a computer game with
nearly the speed and accuracy of a mouse or
joystick.
Says Prof. Sobel, “The most stirring tests were those we
did with locked-in syndrome patients. These
are people with unimpaired cognitive
function who are completely
paralyzed—‘locked into’ their bodies.
With the new system, they were able to communicate with
family members, and even initiate
communication with the outside. Some wrote
poignant messages to their loved ones,
sharing with them, for the first time in a
very long time, their thoughts and
feelings.
Four of those who participated in the experiments are
already using the new writing system, and
Yeda Research and Development Company,
Ltd.—the technology transfer arm of the
Weizmann Institute—is investigating the
possibilities for developing and
distributing the technology.
Sniffing is a precise motor skill that is controlled, in
part, by the soft palate—the flexible
divider that moves to direct air in or out
through the mouth or nose.
The soft palate is controlled by several nerves that
connect to it directly through the
braincase. This close link led Prof. Sobel
and his scientific team to theorize that the
ability to sniff—that is, to control soft
palate movement—might be preserved even in
the most acute cases of paralysis.
Functional magnetic resonance imaging (fMRI) lent support
to the idea, showing that a number of brain
areas contribute to soft palate control.
This imaging revealed a significant overlap
between soft palate control and the language
areas of the brain, hinting to the
scientists that the use of sniffing to
communicate might be learned intuitively.
To test their theory, the researchers created a device
with a sensor that fits on a nostril’s
opening and measures changes in air
pressure. For patients on respirators, the
team developed a passive version of the
device, which diverts airflow to the
nostrils.
About 75 percent of the subjects on respirators were able
to control their soft palate movement to
operate the device. Initial tests, carried
out with healthy volunteers, showed that the
device compared favorably with a mouse or
joystick for playing computer games.
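The sensor's job of turning a stream of nasal pressure readings into discrete in/out sniff events can be sketched as simple thresholding. The threshold value and sign convention here are illustrative assumptions, not the published design.

```python
def sniff_events(pressures, threshold=0.2):
    """Classify pressure samples (relative to ambient) into sniff events.

    Positive pressure -> exhale ('out'), negative -> inhale ('in');
    values within +/- threshold count as rest. The threshold and the
    sign convention are assumptions made for this sketch.
    """
    events = []
    for p in pressures:
        if p > threshold:
            events.append("out")
        elif p < -threshold:
            events.append("in")
        else:
            events.append("rest")
    return events

print(sniff_events([0.05, -0.6, 0.01, 0.7]))  # ['rest', 'in', 'rest', 'out']
```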
In the next stage, carried out in collaboration with
Prof. Nachum Soroker of Loewenstein Hospital
Rehabilitation Center in Raanana, Israel,
quadriplegics and locked-in patients tested
the device.
One patient who had been locked in for seven months
following a stroke learned to use the device
over a period of several days, writing her
first message to her family.
Another, who had been locked in since a traffic accident
18 years earlier, wrote that the new device
was much easier to use than one based on
blinking. Another 10 patients, all
quadriplegics, succeeded in operating a
computer and writing messages through
sniffing.
In addition to communication, the device can function as
a sort of steering mechanism for
wheelchairs: Two successive sniffs in tell
it to go forward, two out mean reverse, out
and then in turn it left, and in and out
turn it right.
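The four two-sniff steering codes described above amount to a small lookup table. A minimal sketch, with the function name and the fallback behaviour for unrecognised pairs being assumptions of this example:

```python
# Sniff-code steering as described in the text: pairs of successive
# sniff directions map to wheelchair commands.
# 'in' = inhale through the nose, 'out' = exhale.
COMMANDS = {
    ("in", "in"):   "forward",
    ("out", "out"): "reverse",
    ("out", "in"):  "left",
    ("in", "out"):  "right",
}

def decode(sniff_pair):
    """Translate a pair of successive sniffs into a steering command."""
    # Treating unknown input as "stop" is an assumption, not the device's
    # documented behaviour.
    return COMMANDS.get(tuple(sniff_pair), "stop")

print(decode(["in", "in"]))   # forward
print(decode(["out", "in"]))  # left
```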
After 15 minutes of practice, a subject who is paralyzed
from the neck down managed to navigate a
wheelchair through a complex route—sharp
turns and all—as deftly as a non-disabled
person.
Sniffs can be in or out, strong or shallow, long or
short; and this gives the device’s
developers the opportunity to create a
complex “language” with multiple signals.
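Treating the three sniff attributes named above (direction, strength, length) as binary choices, the signal alphabet grows multiplicatively. A small enumeration sketch; real systems could of course grade strength and length more finely than two levels:

```python
from itertools import product

# Three binary sniff attributes, as named in the text.
directions = ["in", "out"]
strengths = ["strong", "shallow"]
lengths = ["long", "short"]

# Every combination of the three attributes is a distinct signal.
alphabet = list(product(directions, strengths, lengths))
print(len(alphabet))  # 8 distinct single-sniff signals
print(alphabet[0])    # ('in', 'strong', 'long')
```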
The new system is relatively inexpensive to produce, and
simple and quick to learn to operate in
comparison with other brain-machine
interfaces. Prof. Sobel believes that this
invention may not only bring new hope to
severely disabled people, but it could be
useful in other areas; for instance, as a
control for a “third arm” for surgeons and
pilots.
Prof. Noam Sobel’s research is supported by the Nella and
Leon Benoziyo Center for Neurosciences; the
J&R Foundation; and Regina Wachter, NY.