For the Google Global Partner Summit 2014, Helios was commissioned to develop an interactive light sculpture that would help Google partners visualize, in a fun and informative fashion, what the Cloud Platform could do for their businesses and customers. We partnered with a fabrication company and some of our favorite visual programming geniuses to create a Nexus 10 tablet interface to a physical cloud sculpture of LEDs, metal, and polypropylene balls, driven by 3,700 individually addressable lights.

Guests were encouraged to draw on a tablet grid that represented one quarter of the lights on the physical cloud. They could choose colors, make dots, or paint shapes, and their creations appeared in color through the LEDs on the sculpture. When finished, guests could upload their session to the Google Cloud Platform, which stored it; whenever the tablet went idle, the device randomly fetched an existing session and played it back on the sculpture, so the cloud was always in motion. Another component collected and visualized those user sessions on an external touch-screen display: stats such as number of users, average session duration, total time played, and colors used were calculated from the session data and fed into an interactive visualization application.
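Because the tablet grid represented only a quarter of the cloud's lights, each drawing had to be expanded to cover the full sculpture. A minimal sketch of one plausible approach, a four-way mirror tiling of the quadrant's pixel colors (the dimensions, names, and mirroring scheme here are illustrative assumptions, not the project's actual mapping, which was custom-calibrated to the physical LED layout):

```javascript
// Hypothetical quadrant size in grid cells.
const QUAD_W = 16;
const QUAD_H = 16;

// quad is a QUAD_H x QUAD_W array of [r, g, b] colors drawn on the tablet.
// Returns a (2*QUAD_H) x (2*QUAD_W) grid with the quadrant mirrored into
// all four quadrants of the full sculpture grid.
function tileQuadrant(quad) {
  const full = [];
  for (let y = 0; y < QUAD_H * 2; y++) {
    const row = [];
    for (let x = 0; x < QUAD_W * 2; x++) {
      // Reflect coordinates in the lower/right halves back into the
      // source quadrant so the pattern mirrors across both axes.
      const qy = y < QUAD_H ? y : QUAD_H * 2 - 1 - y;
      const qx = x < QUAD_W ? x : QUAD_W * 2 - 1 - x;
      row.push(quad[qy][qx]);
    }
    full.push(row);
  }
  return full;
}
```

A simple tiling like this keeps the tablet interaction small and legible while still lighting the entire sculpture.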

Credits

Client: Google
Partners: MAS, Hot Rod Shop
Launched: March 5, 2014
Event Duration: 2 days
Project Video Music: “To The Author” by The Sea and Cake

UI Design: Helios Interactive
Front End Development: Helios Interactive, MMMLabs
Technology Integration: Helios Interactive

Photos

Process

The Google Cloud Light Sculpture presented Helios with a unique opportunity to combine devices, physical computing, and data visualization under a single theme. Our initial concepts revolved around an “interactive cloud” that could play back user-generated content when not in use. We originally thought this sort of light show might be possible with Philips Hue bulbs. These can be a great addition to a home or office, but our initial testing showed they were not responsive enough, with too many bottlenecks for real-time data processing. Those limitations were too constraining for the engaging experience we had envisioned, so we pivoted to wired, individually addressable LED strips, using several PixelPushers to control thousands of individual LEDs.

After a few shipping delays, we were on an accelerated path for the physical build-out of the lights and for optimizing the code base to handle such a large collection of lighting strips. We created a custom tool that let the team calibrate the individual strips into the grid users would interact with. Every brush stroke was recorded and uploaded to a local server; when the sculpture was in passive mode, previous sessions played back, enticing others to participate. All the sessions were combined and served to a local data visualization showing average engagement, popular brush choices, and grid positions. On site, our biggest bottleneck was sending stroke data from the tablets over local Wi-Fi in a crowded space – when in doubt: use a hardwired connection.
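The visualization stats described above reduce to a single pass over the stored sessions. A minimal sketch, assuming a hypothetical session record shape (the `durationMs`, `strokes`, and `color` field names are illustrative, not the actual stored schema):

```javascript
// Sketch: aggregate engagement stats from stored drawing sessions.
// Each session is assumed to look like:
//   { durationMs: number, strokes: [{ color: "#rrggbb", cell: [x, y] }, ...] }
function summarizeSessions(sessions) {
  const colorCounts = {};   // how often each brush color was used
  let totalMs = 0;
  let totalStrokes = 0;
  for (const s of sessions) {
    totalMs += s.durationMs;
    for (const stroke of s.strokes) {
      colorCounts[stroke.color] = (colorCounts[stroke.color] || 0) + 1;
      totalStrokes++;
    }
  }
  const n = sessions.length;
  return {
    users: n,
    totalTimeMs: totalMs,
    avgSessionMs: n ? totalMs / n : 0,
    avgStrokesPerSession: n ? totalStrokes / n : 0,
    colorCounts,
  };
}
```

The same reduction extends naturally to grid-position heatmaps by counting `cell` coordinates instead of colors.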

Technology

Hardware: 3,700 LEDs, PixelPushers, local server
Software: openFrameworks, Processing, JavaScript, HTML
Screens: Android tablets, HP Touchsmart

Project Development: 3-4 weeks