This nose-zapping wearable simulates smell using electricity

Jas Brooks, a long-haired engineer who looks like they might be moonlighting as a roadie in a heavy metal band, sat blindfolded in a room with electrodes in their nose and let people trigger smells in their nose over the Internet.

“It definitely looks … terrifying,” they told Digital Trends, comparing the experimental setup to the Milgram experiment, a controversial series of 1960s studies by a Yale University psychologist that tested how far people would go in administering electric shocks to others when instructed to by an authority figure.

In Stanley Milgram’s experiments, however, no one was actually shocked. Unbeknownst to the participants, what was really being tested was their willingness to obey an authority figure and do something they themselves might have considered unconscionable. In Brooks’ setup, by contrast, Brooks really did receive electric currents from the people at the control station. The currents just happened to register as warm, wasabi-like sensations or the sharp smell of vinegar rather than as shocks.

“It didn’t hurt for me,” Brooks said. “I was just sitting there thinking, ‘Oh yeah, I feel it. That’s what I’m perceiving right now.’ The basic setup was that I was blindfolded and there was a screen [that had instructions on it]. It was an interface designed by me with a picture of my nose, right and left buttons. They could virtually click on it to test the sensor.”

Sniffing out the future of technology

Brooks, a graduate student in the Department of Computer Science at the University of Chicago’s Human-Computer Integration Lab, is focused on the future of technology. And, at least based on this recent experiment, one form technology could take is a pair of electrodes held in place by tiny magnets that are inserted into users’ noses.

Picture a high-tech anti-snoring device, or a cyborg data-collection accessory that Twitter‘s Jack Dorsey might swap his Burning Man nose ring for. The tiny, battery-powered, wireless wearable detects when the user inhales and then uses its electrodes to stimulate the septum, the wall of cartilage that separates the nostrils.
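
The team hasn’t published firmware for the wearable, but the behavior described above (wait for an inhale, then fire a brief pulse at the septum) can be sketched in a few lines of Python. Everything here, from the sensor threshold to the function names and pulse parameters, is invented for illustration rather than taken from the actual device.

```python
import random
import time

# Hypothetical sketch of the wearable's control loop. None of these names come
# from the published work; they are stand-ins for "detect an inhale, then fire
# a brief electrical pulse at the septum."

INHALE_THRESHOLD = -0.15  # assumed pressure drop (arbitrary units) marking an inhale


def read_nasal_pressure() -> float:
    """Placeholder for a real breath/pressure sensor reading."""
    return random.uniform(-0.3, 0.3)


def stimulate_septum(duration_ms: int, intensity: float) -> None:
    """Placeholder for driving the electrodes; here we just log the pulse."""
    print(f"pulse: {duration_ms} ms at intensity {intensity:.2f}")


def control_loop(cycles: int = 20) -> None:
    for _ in range(cycles):
        if read_nasal_pressure() < INHALE_THRESHOLD:
            # Stimulating only on inhalation mimics how real odors arrive.
            stimulate_septum(duration_ms=200, intensity=0.5)
        time.sleep(0.05)  # poll the sensor roughly 20 times per second


if __name__ == "__main__":
    control_loop()
```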

Digital Trends previously reported on the Human-Computer Integration Lab’s work when its researchers (including Brooks) developed a technique for reproducing temperature in virtual reality: a low-power device attached to a VR headset pumped out odorless chemicals containing trace amounts of capsaicin and menthol to simulate sensations of heat and cold. This time, however, the team’s device involves no chemical stimulation at all. The user doesn’t actually smell anything external; instead, the device tickles one of the nerves associated with the sense of smell in a way that makes it feel as though they do.

“Most people know that we perceive smell through the olfactory bulb, but actually smell is a multimodal sensation,” Brooks says. “We have two systems that [contribute to] our perception of smell. We have the olfactory bulb and then the nerve endings in our nose that perceive things like the sharpness of vinegar, which is a very distinct sensation mediated by that nerve, as well as things like the refreshing aspect of mint.”

The wearable Bluetooth nasal device does its trick by buzzing this second system, the trigeminal nerve. This nerve cluster is far easier to reach than the olfactory bulb, which sits much deeper in the head, and it produces certain odor sensations of its own, which the brain then combines with data from the olfactory bulb into our overall perception of a smell.

A brief history of olfaction

The work done by Brooks and the rest of the team is cutting-edge. But it’s not the first time the world has thought about smell technology. On April 1, 2013, Google announced Google Nose, an initiative the tech giant said would expand search into the realm of olfaction. In a video produced by Google, product manager John Wooley explained that smell is an important part of how we navigate the world, yet one that search had so far ignored.

The idea behind Google Nose was to let users “search for smells,” drawing on Google’s Aroma database of 15 million “scents” from around the world. By tapping a new Google Smell button on a laptop, desktop, or mobile device, a user could, for example, hold their phone up to a flower and get a positive identification based on its scent. “By crossing photons with infrasound waves, Google Nose Beta temporarily aligns molecules to mimic a particular scent,” the video explained.

Unfortunately, it was an April Fools’ prank, not an actual product. It was a good joke, but it’s also indicative of how scent technology has often been treated. No one disputes that smell is a powerful sense (there’s a reason real estate agents talk about the smell of freshly baked bread when showing homes for sale), but it’s a difficult sense to harness in the way that we can, say, create personalized bubbles of sound with headphones or control what the eye sees with a video display.

Efforts to do so are regularly mocked by critics. Smell-O-Vision, for example, is often held up as the nadir of mid-20th-century cinema gimmickry, from an era when movie theaters were losing ground to television. For Smell-O-Vision’s first film, 1960’s Scent of Mystery, scents were automatically pumped to the theater seats through plastic tubes. Thirty different scents, ranging from shoe polish to wine, were timed to match what was happening on screen.

The movie’s tagline read: “First they moved (1895)! Then they talked (1927)! Now they smell (1960)!” As a gimmick, it stunk.

The smell of fresh rain?

The Human-Computer Integration Lab’s latest work makes olfactory control considerably more feasible. One of the device’s more unusual features, for instance, is that it can deliver smell in both stereo and mono. Because each electrode can be activated independently, Brooks’ virtual control panel described earlier had separate buttons for the left and right electrodes. Stereo sniffing is remarkable precisely because it isn’t part of how we normally experience scents in the real world.
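
To make the stereo/mono distinction concrete, here is a minimal Python sketch of a controller with independently addressable left and right channels, loosely mirroring the two-button interface Brooks describes. The class and method names are hypothetical; the real device’s control software hasn’t been published.

```python
# Hypothetical sketch of independently addressable electrodes. "Stereo" means
# firing one side on its own; "mono" means firing both sides together.

class NasalStimulator:
    CHANNELS = ("left", "right")

    def pulse(self, channel: str, intensity: float) -> None:
        """Fire one electrode independently (the 'stereo' case)."""
        if channel not in self.CHANNELS:
            raise ValueError(f"unknown channel: {channel}")
        print(f"{channel} electrode pulsed at {intensity:.2f}")

    def pulse_both(self, intensity: float) -> None:
        """Fire both electrodes together (the 'mono' case)."""
        for channel in self.CHANNELS:
            self.pulse(channel, intensity)


if __name__ == "__main__":
    device = NasalStimulator()
    device.pulse("left", 0.4)   # a sensation that seems to come from the left
    device.pulse_both(0.4)      # the same sensation, with no sense of direction
```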

However, don’t expect a nasal wearable to reproduce more complex smells. According to Brooks, mimicking a wider range of scents is possible, but not by stimulating the trigeminal nerve alone. The olfactory bulb has a much broader palette of sensations. The trigeminal nerve is more like the tongue, which can distinguish only five tastes: sweet, sour, salty, bitter, and umami. (Most of the subtleties of what we call taste are actually smells.) Likewise, trigeminal stimulation can produce strong sensations that we recognize as smells, but without any of the subtler notes. In other words, you can reproduce the tingle of vinegar vapor, but you can’t do the same with the smell of fresh rain.

Stimulating the olfactory bulb directly, meanwhile, would require reaching so far up the nose that a COVID test swab would feel gentle by comparison. Brooks noted that the optimal way to stimulate the olfactory bulb would be a tiny medical implant, though that is unlikely to appeal to most of us. There is also the problem of encoding odors in the first place. “We don’t know what the parameters have to be to code an odor digitally or electrically so that it is then properly decoded by the bulb,” they said.

The usefulness of olfactory technology

As for use cases, the most obvious is making virtual reality more immersive. No matter how good the graphics get, and no matter whether we master endless walking in VR or haptic technologies that let us feel virtual textures and objects, a VR pine forest will, for many, always feel incomplete if it doesn’t smell of pine.

But Brooks doesn’t see it solely as a gaming accessory. “We already have phenomenal sensations of smells that we might not pay much attention to, in real life, that are just super rich,” they say. “You’re walking down the street and some smell just strikes you. There’s a pretty famous chocolate factory in Chicago, and you just get clouds of that smell in the city. I imagine that could lead to a purely olfactory augmented reality … really transforming the way we interact with everyday smells instead of trying to create a new set of smells from scratch.”

That work is still in the future for the team, but it could be aimed at making our perception of smells smarter. Where is a particular smell coming from? Could you amplify a smell you like and dampen one you don’t? How about odor notifications: Who wouldn’t want a pungent hit of wasabi in the nostrils when the boss messages them on Slack? Or, more seriously, could you be made to smell a deadly, otherwise odorless gas such as carbon monoxide? Carbon monoxide detectors already do that job without requiring electrodes in your nose, but such a tool could still prove useful in certain scenarios, such as for rescue workers.

“One of the things we’re thinking about is, can we use this as an intervention technology, like hearing aids for people with odor loss?” Brooks said, pointing out that this could become even more relevant in a post-pandemic world, where loss of smell has emerged as a prevalent side effect for many people.

And, of course, there’s always the possibility of other kinds of sensory entertainment beyond VR and gaming. “The chemical senses are so intense that it’s hard to imagine, for example, a three-hour olfactory opera that constantly stimulates you for those three hours and doesn’t give you a break,” Brooks says. But the idea is certainly tempting. “For the last year and a half, I’ve been thinking about how much I would personally enjoy a smell Walkman.”

The idea of calling up a playlist of smells, from tomatoes on the vine to fabric softener, and playing each one on demand is the stuff technologists dream of. A far-fetched dream, perhaps. But not an impossible one. “It’s definitely not impossible,” Brooks says.

A paper describing the team’s work was recently presented at the 2021 CHI Conference on Human Factors in Computing Systems. Lab director Pedro Lopes, along with Romain Nith, Shan-Yuan Teng, Jingxuan Wen, and Jun Nishida, worked with Brooks on the project.
