Glade: Museum of Feelings

Joyful Room: Processing (library), Java, OSC, DMX, Arduino | CityFeels API: NodeJS

The Museum of Feelings was an interactive, immersive experience that Radical Media created for Glade. The goal was to elicit the emotions connected to Glade scents through a series of 5 interactive rooms. The experience ran for 3 weeks in December 2015. We received over 70,000 visitors, generated millions of shares on social media, and had 6-hour lines almost every day.

Joyful Room

The Joyful Room was the second room in the experience. Hundreds of fiber optic strands hung from the ceiling, undulating with light in response to the movement of the people in the room. I built a program that translated movement events from Doppler sensors into light patterns animating across the hanging strands.
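The production program was written in Processing (Java) and spoke OSC and DMX; the core idea can be sketched in a few lines of JavaScript. In this illustrative (hypothetical) model, each motion event spawns a "ripple" whose brightness decays as it propagates outward across the strands, and all active ripples are combined into one frame of per-strand levels. The names `Ripple`, `brightnessAt`, and `renderFrame` are my own, not from the installation code.

```javascript
// Illustrative sketch, not the production Processing/Java code.
// A motion event spawns a ripple that travels outward across the strands.
class Ripple {
  constructor(origin, startTime) {
    this.origin = origin;       // strand index where motion was sensed
    this.startTime = startTime; // event time in ms
    this.speed = 0.02;          // wavefront speed, strands per ms
    this.falloff = 0.1;         // brightness lost per strand behind the wavefront
  }

  // Brightness contribution (0..1) of this ripple at a given strand.
  brightnessAt(strand, now) {
    const radius = (now - this.startTime) * this.speed;
    const dist = Math.abs(strand - this.origin);
    if (dist > radius) return 0; // wavefront hasn't arrived yet
    return Math.max(0, 1 - (radius - dist) * this.falloff);
  }
}

// Combine all active ripples into one frame of per-strand levels,
// ready to be packed into a DMX universe (one 0..255 channel per strand).
function renderFrame(ripples, strandCount, now) {
  const frame = [];
  for (let s = 0; s < strandCount; s++) {
    const level = ripples.reduce(
      (max, r) => Math.max(max, r.brightnessAt(s, now)), 0);
    frame.push(Math.round(level * 255));
  }
  return frame;
}
```

Taking the max across ripples (rather than summing) keeps channels from clipping when several people move at once.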


1. Responsibilities:

  1. Engineering lead for the "Joyful" room, 1 of 5 main rooms in the experience.
  2. Engineering lead for the "CityFeels" API, the data feed for all digital OOH and interactive data viz in the experience; the same feed also drove the creative on the exterior of the building.

2. Takeaways:

  1. I was responsible for developing both projects from comps in a deck to fully fleshed-out pieces of software. This meant:
    • educating our creative director on the limitations of our technologies and guiding him through the development process.
    • providing recommendations on creative in order to get the most out of our opportunity.
    • setting a development timeline and managing/protecting milestones.
  2. I had to learn a few new technologies for this job, mainly how to:
    • Build out sustainable API practices and maintain NodeJS servers.
    • Build 3D models in Processing.

UI & Visualization - Joyful

Using Processing, I built a 3D model of the room to help the designers visualize how the strands would look when animating.

This was my first experience working with 3D objects. I wrote a custom script to generate a 3D model in Processing from the coordinates of a Rhino 3D object. I then built out the custom animations and a UI for triggering them.
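The Rhino-to-Processing step boils down to exporting the strand anchor points and reading them back in. A minimal sketch of that import, assuming a CSV export of `x,y,z` coordinates (the format and field names here are hypothetical):

```javascript
// Hypothetical sketch: parse x,y,z anchor points (e.g. exported from
// Rhino as CSV) into strand positions a renderer could draw.
function parseAnchors(csv) {
  return csv
    .trim()
    .split('\n')
    .map((line) => {
      const [x, y, z] = line.split(',').map(Number);
      return { x, y, z };
    });
}
```

In the real pipeline each parsed point became the ceiling anchor of one fiber optic strand in the 3D preview.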


CityFeels API

Every 15 minutes, the CityFeels API collected data from Twitter, weather, traffic, and stock market (S&P 500) APIs for 60+ cities around the world. I weighted the results from those API calls to assign a 'Feeling' to each city. That 'Feeling' value was made accessible to our interactive exhibits via a private server; as the value changed, so did all our data visualizations, the colors of the exterior of the museum, and even a massive digital billboard outside Penn Station.
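The weighting scheme amounts to a weighted average of normalized signals, bucketed into a named 'Feeling'. The weights, signal names, and thresholds below are hypothetical stand-ins for the production values; each signal is assumed to be pre-normalized to 0..1, where 1 is "positive".

```javascript
// Illustrative weights only, not the production values.
const WEIGHTS = {
  twitterSentiment: 0.4,
  weather: 0.3,
  traffic: 0.15,
  market: 0.15, // e.g. S&P 500 movement rescaled to 0..1
};

// Weighted average of the normalized signals for one city.
function moodScore(signals) {
  let score = 0;
  for (const [name, weight] of Object.entries(WEIGHTS)) {
    score += weight * signals[name];
  }
  return score;
}

// Bucket the 0..1 score into a named 'Feeling' (labels are illustrative).
function feelingFor(score) {
  if (score >= 0.75) return 'joyful';
  if (score >= 0.5) return 'optimistic';
  if (score >= 0.25) return 'calm';
  return 'anxious';
}
```

Serving the resulting label from one private endpoint meant every exhibit, the building exterior, and the billboard all stayed in sync without knowing anything about the upstream APIs.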

UI & Visualization - Building Exterior

I branched and refactored the visualizer from the Joyful room to control the lighting on the exterior of the museum. As the CityFeels API updated the 'Feeling' of NYC, we transitioned the colors on the outside of the building accordingly.
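The color transition itself is a crossfade: when the 'Feeling' changes, interpolate the current exterior color toward the palette entry for the new value. The palette and colors below are illustrative, not the production design:

```javascript
// Hypothetical palette mapping each 'Feeling' to an RGB color.
const PALETTE = {
  joyful: [255, 200, 0],
  optimistic: [0, 180, 255],
  calm: [120, 80, 255],
  anxious: [255, 60, 60],
};

// Linear interpolation between two RGB colors; t runs 0..1 over the fade.
function lerpColor(from, to, t) {
  return from.map((c, i) => Math.round(c + (to[i] - c) * t));
}
```

Stepping `t` each frame (e.g. over a few seconds) gives the slow wash between moods rather than a hard cut.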

We also triggered thirty 2-story LED beams to fire every time we received a tweet.