Scentery

Publication

Elle Luo, Katia Vega

The ACM International Conference on Mobile Human-Computer Interaction

Year

2018

Scentery proposes a novel approach to creating calming multisensory environments by displaying visualizations, playing audio, and releasing olfactory sensations. Scentery's users switch between different multisensory scenarios that promote a sense of calm. Scentery was developed with Unity 3D for creating the 3D scenarios, Unity Remote for camera control and the viewer's perspective, and a microcontroller for triggering the scents in the vaporizer.

Technology Description

To run the prototype, we insert a smartphone into a VR headset, which serves as the viewer displaying the scenarios in the virtual reality environment. To continuously track changes in the user's head rotation, we wrote a script and attached it to the camera in the scene: while the smartphone is connected to Unity 3D, the scripted camera rotates in step with the readings of the phone's accelerometer. To create a stereoscopic view on the smartphone, we render the scene with two cameras placed at slightly different angles, producing an illusion of depth and perspective. We also use an Arduino Uno, a microcontroller board, alongside the VR program to activate triggers programmed in the VR environment.

Scent

The scent is released by triggers programmed in the microcontroller, which is connected to Unity 3D. For the mechanics of the olfactory device, the water vaporizer agitates the oil-based scents at high pressure to produce atomized, vaporized water. Two vaporizers are connected to the microcontroller. Each vaporizer converts the scented water into vapor and emits the scent for 10 seconds.