
Sensors for our Five Senses

F5
Published September 05, 2019

Back in 2015 I explored how our five primary senses—sight, smell, taste, touch, and hearing—were being re-created using sensors. Our senses are how we navigate life: they give us a perspective on the environment around us and help us interpret the world we live in. But we’re also limited by those senses. If a sense is diminished, there may be a way to approximate or enhance its effects (as we do with hearing aids) or to rely on another sense in a compensatory fashion (as with braille and sign language).

Today, gadgets (and IoT technologies) are being built that work in conjunction with, or completely replace, the capabilities of the eyes, ears, nose, tongue, and hands. Sensory receptors can be replaced with microchipped devices, attached to or integrated with our bodies, that perform the same functions.

The technology in 2015 was eye-opening (ha-ha), but I wanted to examine how much things have advanced over the past few years.

Sight: Remember Google Glass? Before its demise, engineers were working on eyeglasses that connected to automobiles and provided telemetry displays on the lens. Today you can get a device that beams such information onto the windshield or displays it using technology built into the glass. We also have technology that lets you ‘see’ through walls.

There are 285 million visually impaired people worldwide; among them, 39 million are totally blind. Sensor-based assistive devices for the blind used to be limited in their capabilities, typically only alerting the user to the presence of obstacles. Now researchers have developed wearable assistive devices that enable a person to sense their environment and move around more safely. These devices—currently available as a sonar-equipped wristband or a radar monitor—emit sound or radio waves and give feedback through either vibration or audio.
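To make the idea concrete, here is a minimal Python sketch of how such a device might turn a distance reading into haptic feedback. The range limits, the linear ramp, and the function names are assumptions for illustration, not details of any specific product.

```python
# Hypothetical sketch: map a sonar distance reading to haptic feedback strength.
# The sensor and motor interfaces are assumed, not taken from any real device.

MAX_RANGE_CM = 300        # assume obstacles beyond 3 m are ignored
MIN_RANGE_CM = 20         # closest distance the sensor reports reliably

def vibration_strength(distance_cm: float) -> float:
    """Return a motor duty cycle in [0, 1]; closer obstacles vibrate harder."""
    if distance_cm >= MAX_RANGE_CM:
        return 0.0
    clamped = max(distance_cm, MIN_RANGE_CM)
    # Linear ramp: full strength at MIN_RANGE_CM, silent at MAX_RANGE_CM.
    return (MAX_RANGE_CM - clamped) / (MAX_RANGE_CM - MIN_RANGE_CM)

# Example: an obstacle 60 cm away produces a fairly strong buzz.
print(round(vibration_strength(60), 2))   # -> 0.86
```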

There’s more, though: bionic eyes are being developed, and blind patients are testing bionic implants that rely on a brain-computer interface. These devices could bring back some vision in patients with certain genetic eye disorders. A camera, paired with an array of electrodes implanted around the eye and the retinal cells, can transmit visual information along the optic nerve to the brain, producing patterns of light in the patient’s field of view. The results aren’t perfect, but this does give hope to those with limited or declining vision.
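One reason the results aren’t perfect is resolution: a camera frame carries hundreds of thousands of pixels, while an electrode array can deliver only a coarse grid of stimulation levels. The toy sketch below shows that reduction; the 6x10 grid and the block-averaging scheme are assumptions for illustration, not the specifications of any real implant.

```python
# Illustrative only: shrink a camera frame to a coarse grid of stimulation
# levels, mimicking how an electrode array receives far fewer "pixels" than
# the camera captures. Grid size is an assumption, not a device spec.
import numpy as np

def frame_to_electrode_levels(frame: np.ndarray, rows: int = 6, cols: int = 10) -> np.ndarray:
    """Average image brightness into a rows x cols grid of levels in [0, 1]."""
    h, w = frame.shape
    levels = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            block = frame[r * h // rows:(r + 1) * h // rows, c * w // cols:(c + 1) * w // cols]
            levels[r, c] = block.mean() / 255.0
    return levels

# Example with a fake 480x640 grayscale frame: ~300,000 pixels become 60 levels.
fake_frame = np.random.randint(0, 256, size=(480, 640))
print(frame_to_electrode_levels(fake_frame).shape)  # (6, 10)
```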

Smell: From Smell-O-Vision and Smell-O-Rama back in the 1940s-50s to the little gadgets that connect to your mobile device to emit a scent, objects designed to create smells have been around for a while—as have devices designed to “smell” a substance in the air, such as smoke, radon, and carbon-monoxide detectors. Researchers have already developed wearable sensors that can smell diabetes by detecting acetone in the breath, and have figured out how to use a sensor to identify the odor from melanoma. Also, Apple is looking to add sensors to the iPhone and Apple Watch to detect low blood sugar based on body odor. Current electronic noses can smell more effectively than human noses, using an array of gas sensors with overlapping selectivities along with a pattern recognition component. The smell or flavor is perceived as a global fingerprint: the array generates a signal pattern (a set of digital values) that’s used to characterize the smell. What would “stink” be to the Nth power?
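A minimal sketch of that fingerprint idea: each smell produces a vector of readings across the sensor array, and an unknown sample is matched to the closest known pattern. The sensor values, labels, and the simple nearest-neighbor matching below are invented for illustration; real electronic noses use more sophisticated pattern recognition.

```python
# Toy "fingerprint" matching for a four-sensor gas array. All values invented.
import math

known_fingerprints = {
    "coffee":  [0.82, 0.10, 0.35, 0.60],
    "acetone": [0.15, 0.91, 0.22, 0.40],
    "smoke":   [0.55, 0.30, 0.88, 0.12],
}

def classify(sample: list[float]) -> str:
    """Return the label whose fingerprint is closest (Euclidean distance)."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(known_fingerprints, key=lambda label: distance(sample, known_fingerprints[label]))

# An unknown sample rich in the second sensor's target gas matches "acetone".
print(classify([0.20, 0.85, 0.25, 0.38]))
```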

Hearing: According to U.K. firm Wifore Consulting, hearing technology alone will be a $40 billion market by 2020. In 2018, it was $5 billion. We have alerting devices, cochlear implants, and a wearable vest that helps deaf people hear through a series of vibrations. A suite of sensors picks up sounds and vibrates, allowing the wearer to feel rather than hear them. The vibrations occur at the same frequencies as the sounds themselves. (Have you ever stood next to a thumping speaker at a concert and felt the sound? You don’t need to hear it to know the bass thump.)
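Here is a rough sketch of the feel-the-sound idea: split incoming audio into frequency bands and drive one vibration motor per band in proportion to that band’s energy. The sample rate, band edges, and motor count are assumptions for illustration, not details of any particular vest.

```python
# Rough sketch: per-band audio energy drives a row of vibration motors.
import numpy as np

SAMPLE_RATE = 16_000
BAND_EDGES_HZ = [0, 250, 500, 1000, 2000, 4000, 8000]   # assume 6 motors

def motor_levels(audio_chunk: np.ndarray) -> list[float]:
    """Return per-band energy normalized to [0, 1], one value per motor."""
    spectrum = np.abs(np.fft.rfft(audio_chunk))
    freqs = np.fft.rfftfreq(len(audio_chunk), d=1.0 / SAMPLE_RATE)
    levels = []
    for lo, hi in zip(BAND_EDGES_HZ, BAND_EDGES_HZ[1:]):
        band = spectrum[(freqs >= lo) & (freqs < hi)]
        levels.append(float(band.mean()) if band.size else 0.0)
    peak = max(levels) or 1.0
    return [lvl / peak for lvl in levels]   # scaled to drive the motors

# Example: a 100 Hz "bass thump" lights up the lowest-frequency motor hardest.
t = np.arange(0, 0.1, 1.0 / SAMPLE_RATE)
print(motor_levels(np.sin(2 * np.pi * 100 * t)))
```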

What about communicating with those who don’t know sign language? Prototype SignAloud gloves translate the gestures of American Sign Language into spoken English. There was some criticism of the device because of mistranslations, and because it didn’t capture the nuances of sign language—such as the secondary signals of eyebrow movements, shifts in the signer's body, and motions of the mouth—that help convey meaning and intent. With another glove, users can record and name gestures that correspond with words or phrases, sidestepping the need for those facial cues; another version can send translations directly to the wearer's smartphone, which can then enunciate the words or phrases.
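A hedged sketch of that record-and-name workflow: store a snapshot of the glove’s flex-sensor readings under a phrase, then match live readings to the nearest stored gesture. The five-sensor layout, the values, and the tolerance below are invented purely to illustrate the flow, not how any of these gloves actually work.

```python
# Toy record-and-match workflow for a hypothetical five-sensor glove.

recorded_gestures: dict[str, list[float]] = {}

def record_gesture(phrase: str, sensor_readings: list[float]) -> None:
    """Save the flex-sensor values captured while the user holds a gesture."""
    recorded_gestures[phrase] = sensor_readings

def match_gesture(sensor_readings: list[float], tolerance: float = 0.3) -> str | None:
    """Return the stored phrase closest to the live readings, or None if nothing is close."""
    best_phrase, best_dist = None, float("inf")
    for phrase, template in recorded_gestures.items():
        dist = sum(abs(a - b) for a, b in zip(sensor_readings, template))
        if dist < best_dist:
            best_phrase, best_dist = phrase, dist
    return best_phrase if best_dist <= tolerance else None

record_gesture("hello", [0.9, 0.8, 0.1, 0.1, 0.1])
record_gesture("thank you", [0.2, 0.2, 0.9, 0.9, 0.2])
print(match_gesture([0.88, 0.79, 0.12, 0.1, 0.15]))   # -> "hello"
```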

Touch: Back in 2013, researchers developed a flexible sensor able to detect temperature, pressure, and humidity simultaneously, a big leap toward imitating the sensing features of human skin. Elsewhere, the University of Pittsburgh Medical Center designed a robotic arm that allows the user to feel touch applied to the robotic fingers.

And now we have an artificial nerve! Similar to the sensory neurons embedded in our skin, a bendy, Band-Aid-like device detects touch, processes the information, and sends it off to other nerves. Rather than zeroes and ones, this nerve uses the same language as a biological nerve and can communicate directly with the body—whether that’s the leg of a cockroach or the residual nerve endings of an amputated limb.

Today’s prosthetics can read a user’s brain activity and move accordingly, but imagine the reverse: circuits that transform voltage into electrical pulses. The outputs of this artificial nerve are electrical patterns that the body can understand—the “neural code.” Forget computers; it’s time to go neural!
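As a toy illustration of rate coding (one simple version of such a neural code), the sketch below conveys how hard something presses by how often it fires pulses rather than by a numeric value. The pressure-to-rate mapping and the 100 Hz ceiling are assumptions for demonstration, not the published device’s actual encoding.

```python
# Toy rate coding: harder pressure -> more frequent pulses, not a bigger number.

MAX_FIRING_RATE_HZ = 100   # assumed ceiling for the pulse rate

def pressure_to_spike_times(pressure: float, duration_s: float = 1.0) -> list[float]:
    """Return evenly spaced pulse times (seconds); harder presses fire faster."""
    rate_hz = max(0.0, min(pressure, 1.0)) * MAX_FIRING_RATE_HZ
    if rate_hz == 0:
        return []
    interval = 1.0 / rate_hz
    return [round(i * interval, 3) for i in range(int(duration_s * rate_hz))]

# A light touch fires a handful of pulses; a firm press fires many more.
print(len(pressure_to_spike_times(0.1)))   # -> 10 pulses in one second
print(len(pressure_to_spike_times(0.9)))   # -> 90 pulses in one second
```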

Taste: The Internet of Food is expanding. I've written about smart chopsticks that can detect oils containing unsanitary levels of contamination, a fork that monitors how many bites you take, and a smart cup that counts how much you drink and the calories in it.

The field of chemosensory research focuses on identifying the key receptors expressed by taste cells and understanding how those receptors send signals to the brain. For instance, researchers are working to develop a better understanding of how sweet and bitter substances attach to their targeted receptors. What we think of as taste often comes from the molecular composition of a food ingredient along with smell. IBM’s Hypertaste uses “electrochemical sensors comprised of pairs of electrodes, each responding to the presence of a combination of molecules by means of a voltage signal…The combined voltage signals of all pairs of electrodes represents the liquid’s fingerprint,” according to the IBM Research Blog. It also needs to be trained, just like the human palate! Yet another taste-focused system uses sensors and electrodes that can digitally transmit the basic color and sourness of lemonade to a tumbler of water, making it look and taste like the summertime favorite.
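In the spirit of “it needs to be trained,” here is a minimal sketch of that workflow: treat each liquid’s combined electrode voltages as a feature vector and fit a small classifier to labeled samples. The voltages, labels, and the choice of a nearest-neighbor classifier are invented for illustration; they are not details of IBM’s actual system.

```python
# Toy training sketch: voltage "fingerprints" labeled by liquid, then matched.
from sklearn.neighbors import KNeighborsClassifier

# Each row: voltage signals from four electrode pairs (invented values).
training_fingerprints = [
    [0.12, 0.80, 0.33, 0.45],   # orange juice
    [0.15, 0.78, 0.30, 0.50],   # orange juice
    [0.70, 0.22, 0.61, 0.10],   # mineral water
    [0.68, 0.25, 0.60, 0.12],   # mineral water
]
labels = ["orange juice", "orange juice", "mineral water", "mineral water"]

model = KNeighborsClassifier(n_neighbors=1).fit(training_fingerprints, labels)

# An unknown sample is identified by the fingerprint it most resembles.
print(model.predict([[0.14, 0.79, 0.31, 0.48]]))   # -> ['orange juice']
```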

No matter what, all these technologies require an application, services, and some code to function properly. Science-fiction aside, who would have thought that an individual's perspectives would become embedded within software so quickly?