Produced by the Office of Marketing and Communications

Scents of Things to Come

Researchers Add Smell to Virtual Reality to Better Understand Data

By Maria Herd M.A. ’19

Photo courtesy of Andrea Batch and Biswaksen Patnaik

A prototype device attached to a virtual reality headset wafts scents toward the nose of Andrea Batch, a doctoral student in information studies developing technology to integrate smell into VR technology.

Imagine running through a dark forest in a virtual reality video game, and being able to smell the crisp scent of pine needles all around you.

Or what if you were analyzing a complex data set, and could associate specific scents with data points in order to better track and recall the information?

These ideas are now becoming a reality through innovative research by graduate students in the University of Maryland’s Human-Computer Interaction Lab (HCIL).

Biswaksen Patnaik, a second-year master’s student in human-computer interaction, and Andrea Batch, a third-year doctoral student in information studies, are exploring ways to convey information with scent as a complement to the visual representation of data sets.

The students recently presented their paper on “information olfactation”—a term they derived by combining olfaction, which is the sense of smell, with information visualization—at a conference in Berlin.

“This was easily our most crazy idea to date,” says the students’ adviser and paper co-author, Niklas Elmqvist, a professor of information studies who is the HCIL director and has an appointment in the University of Maryland Institute for Advanced Computer Studies (UMIACS).

The paper describes two prototypes the team has built—one for a desktop computer and one for a virtual reality headset—that disperse essential oils through diffusers. They also designed three different types of graph layouts in which smell can assist in conveying data visually to the user.

“You need a playground so that you can play around with producing different kinds of stimulus, which is why we made this olfactory display machine,” says Patnaik, who drew on a background in digital taste and smell technologies to build the prototypes. Batch, whose research focuses on immersive environments in virtual and augmented reality, developed the software.

One of the network visualizations designed by Patnaik and Batch consists of a Bitcoin data set. Each colored node represents the average transaction rating of a Bitcoin holder, which is also associated with a specific smell. The nodes are linked to represent transactions between users.
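The mapping described above—node attributes driving scent selection and diffuser intensity—can be sketched in a few lines. This is a hypothetical illustration only; the node names, rating scale, scent palette, and `scent_for_rating` function are assumptions for the example, not the researchers’ actual software or hardware interface.

```python
# Hypothetical sketch: map each node's average transaction rating in a
# small network to a scent channel and a diffuser intensity in [0, 1].
# All names and values here are illustrative assumptions.

# Nodes: Bitcoin holders with an assumed average transaction rating in [-10, 10]
nodes = {
    "alice": 8.2,
    "bob": -3.5,
    "carol": 1.0,
}

# Edges: transactions between users
edges = [("alice", "bob"), ("bob", "carol")]

# An assumed palette of essential-oil channels, one per rating band
SCENTS = ["lavender", "citrus", "pine"]

def scent_for_rating(rating, lo=-10.0, hi=10.0):
    """Map a rating to a (scent name, diffuser intensity) pair."""
    # Clamp the rating and normalize it into [0, 1]
    t = (min(max(rating, lo), hi) - lo) / (hi - lo)
    # Pick a scent band; use the position within the band as intensity
    band = min(int(t * len(SCENTS)), len(SCENTS) - 1)
    within = t * len(SCENTS) - band
    return SCENTS[band], within

for user, rating in nodes.items():
    scent, intensity = scent_for_rating(rating)
    print(f"{user}: {scent} at intensity {intensity:.2f}")
```

A real olfactory display would replace the `print` call with commands to a physical diffuser, but the core idea—a deterministic function from a data attribute to a scent and intensity—stays the same.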

Elmqvist says that similar network visualizations could be built using social networking data, such as Facebook friends. He stresses that scent is meant to complement visual information, not replace it.

“Smell is powerful but not nearly as powerful as vision—that’s why we always combine it,” he says.

Scientific literature shows that fragrance associated with an emotionally significant event can help with information recall, says Batch. She theorizes that a molecular bouquet—a certain set of combined scents—may also improve a user’s ability to recall information.

“If I asked you, ‘What does your favorite café smell like?’ It’s not a specific fragrance. It’s not citrus, not lavender, but a combination of smells, and that’s how you identify it,” says Patnaik, explaining the molecular bouquet concept. “This is my café; this is what it smells like.”



Maryland Today is produced by the Office of Marketing and Communications for the University of Maryland community on weekdays during the academic year, except for university holidays.