As part of Intel’s booth at CES 2015, we created a 110-foot-long interactive video wall comprising three distinct scenes: Ice, Sand, and Space. The wall lived in a tunnel that gave participants an enriching and playful experience, isolated from the many overwhelming tech booths common to CES.
I focused on programming the Space scene, which featured gas clouds, interactive stars, nebula silhouettes, and asteroids you could reach out and explode with your hands.
The video wall contained three separate sections, each displaying a 7560x1920 pixel window using NVIDIA Mosaic and running on dual Quadro K5200 graphics cards. For depth capture, we used 14 RealSense depth cameras (still pre-release at the time of this project) per section, stacking two on top of each other to widen the vertical field of view. The capture feeds were handled by secondary PCs, which filtered the depth data, compressed it, and sent the streams over CAT5 cables to the main display PC.
Because we were rendering such a large canvas at 30fps, with potentially dozens of participants interacting with each scene at any given moment, the scene was composed of a few assets that could be loaded at launch, combined with procedural rendering techniques. I’ll break down the main elements below.
Background gas clouds - a full-screen, artist-created image, modulated at run-time with FBM (fractal Brownian motion) noise to animate the clouds. This was the only non-interactive component.
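The actual modulation ran in a fragment shader, but the FBM technique itself can be sketched on the CPU. This is a minimal Python version using value noise; the octave count, lacunarity, and gain values are illustrative, not the production settings.

```python
import math

def value_noise_2d(x, y, seed=0):
    """Deterministic value noise: hash the surrounding lattice points,
    then bilinearly interpolate with a smoothstep fade."""
    def hash2(ix, iy):
        h = (ix * 374761393 + iy * 668265263 + seed * 144665) & 0xFFFFFFFF
        h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
        return ((h ^ (h >> 16)) & 0xFFFF) / 0xFFFF  # value in [0, 1]

    x0, y0 = math.floor(x), math.floor(y)
    fx, fy = x - x0, y - y0
    # smoothstep fade avoids visible seams at cell borders
    ux, uy = fx * fx * (3 - 2 * fx), fy * fy * (3 - 2 * fy)
    a, b = hash2(x0, y0), hash2(x0 + 1, y0)
    c, d = hash2(x0, y0 + 1), hash2(x0 + 1, y0 + 1)
    top = a + (b - a) * ux
    bot = c + (d - c) * ux
    return top + (bot - top) * uy

def fbm(x, y, octaves=4, lacunarity=2.0, gain=0.5):
    """Sum octaves of noise; each octave doubles frequency, halves amplitude."""
    amp, freq, total, norm = 1.0, 1.0, 0.0, 0.0
    for _ in range(octaves):
        total += amp * value_noise_2d(x * freq, y * freq)
        norm += amp
        amp *= gain
        freq *= lacunarity
    return total / norm  # normalized back to [0, 1]
```

In use, each pixel of the artist-created cloud image would be scaled by something like `0.5 + 0.5 * fbm(u * scale + time, v * scale)`, with `time` drifting the noise field so the clouds appear to churn.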
Stars - a combination of a static image and a simple particle system with animated point sprites. The animation was done by taking two different star sprites at random and rotating one image against the other in the fragment shader. The interactive particles responded to ‘blobs’ that were post-processed from the captured depth buffer. Each star particle chose a nearby blob as its attractor, while also being propelled away from other nearby stars.
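The attract/repel behavior described above amounts to a simple per-particle force integration. This is a hedged sketch of one update step; the force constants, repulsion radius, and time step are hypothetical, and the real system ran on the GPU across many particles.

```python
import math

def update_star(star_pos, star_vel, blobs, neighbor_stars,
                attract=0.8, repel=1.2, dt=1 / 30):
    """One integration step for a star particle: pull toward the nearest
    blob (its attractor), push away from very close neighboring stars.
    All tuning values here are illustrative placeholders."""
    sx, sy = star_pos
    vx, vy = star_vel
    if blobs:
        # choose the nearest detected blob as this star's attractor
        bx, by = min(blobs, key=lambda b: (b[0] - sx) ** 2 + (b[1] - sy) ** 2)
        d = math.hypot(bx - sx, by - sy) or 1e-6
        vx += attract * (bx - sx) / d * dt
        vy += attract * (by - sy) / d * dt
    for nx, ny in neighbor_stars:
        d = math.hypot(sx - nx, sy - ny)
        if 0 < d < 0.5:  # only repel very close neighbors
            vx += repel * (sx - nx) / (d * d) * dt
            vy += repel * (sy - ny) / (d * d) * dt
    return (sx + vx * dt, sy + vy * dt), (vx, vy)
```

The inverse-square falloff on the repulsion term keeps stars from clumping without affecting distant particles.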
Asteroids - we designed five full-sized asteroids in Cinema4D, along with a number of smaller ‘shards’ that were used during an explosion. The asteroids were imported as .obj files and rendered with normal mapping and two directional lights. Participants could cause an asteroid to explode by waving their hands, which was achieved simply by comparing the bounding box of the asteroid to the nearby blobs - if enough blob data was detected nearby, it would explode. As the tunnel was usually full of people with their hands in the air, and most asteroids animated in from near the top of the screen, they tended to explode on collision with a user's silhouette.
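The bounding-box collision test described above reduces to counting blob points inside the asteroid's screen-space rectangle. A minimal sketch, with a hypothetical hit threshold:

```python
def should_explode(asteroid_bbox, blob_points, threshold=25):
    """Count depth-blob points inside the asteroid's screen-space bounding
    box; trigger an explosion when enough of a silhouette overlaps it.
    threshold is an illustrative value, not the production tuning."""
    x0, y0, x1, y1 = asteroid_bbox
    hits = sum(1 for (px, py) in blob_points
               if x0 <= px <= x1 and y0 <= py <= y1)
    return hits >= threshold
```

A count threshold, rather than a single-point test, avoids false triggers from depth-camera noise at the blob edges.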
The asteroids exploded into smaller .obj meshes that flew off the screen, while a ‘supernova’ light ring emanated across the entire canvas. Each light ring was procedurally rendered on a rotated plane, using noise functions in polar coordinates.
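The polar-coordinate noise trick can be sketched as a per-pixel intensity function: convert each pixel to polar coordinates around the ring center, perturb the target radius with noise sampled by angle, and fall off from the ring edge. The falloff shape and parameter names here are assumptions; the real version ran as a fragment shader.

```python
import math

def ring_intensity(px, py, cx, cy, radius, thickness, noise_fn, time):
    """Intensity of a noisy ring at pixel (px, py), centered at (cx, cy).
    noise_fn(a, b) -> [0, 1] perturbs the ring radius per-angle so the
    edge shimmers; sampling on the unit circle keeps the noise seamless
    across the theta = +/-pi wraparound."""
    dx, dy = px - cx, py - cy
    r = math.hypot(dx, dy)
    theta = math.atan2(dy, dx)
    wobble = (noise_fn(math.cos(theta) + time, math.sin(theta)) - 0.5) * thickness
    dist = abs(r - (radius + wobble))
    # soft linear falloff from the (perturbed) ring edge
    return max(0.0, 1.0 - dist / thickness)
```

Animating `radius` outward each frame while feeding `time` into the noise gives the expanding, flickering supernova effect.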
Nebula (silhouettes) - the silhouettes were rendered with a GPU particle system of blurry point sprites. Particles were birthed by randomly sampling locations in the depth buffer and, wherever something was present, mapping the sample into the 3D scene. Initially we attempted to animate the particles with a Navier-Stokes fluid simulation, but we hit performance limitations at the full canvas size and decided to save that for a future project.
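The random-sampling birth step can be sketched as follows. The `near`/`far` interaction range and the flat row-major buffer layout are assumptions for illustration; the production version did this on the GPU.

```python
import random

def birth_particles(depth_buffer, width, height, n_samples,
                    near=0.5, far=2.5, rng=None):
    """Randomly probe the depth buffer; spawn a particle wherever a valid
    depth reading falls inside the interaction volume. depth_buffer is a
    flat row-major list of depths in meters (0 = no reading); near/far
    are hypothetical bounds on the interaction range."""
    rng = rng or random.Random()
    born = []
    for _ in range(n_samples):
        x, y = rng.randrange(width), rng.randrange(height)
        z = depth_buffer[y * width + x]
        if near <= z <= far:
            # map the 2D sample plus its depth into normalized 3D scene space
            born.append((x / width, y / height, z))
        # invalid (0) or out-of-range depths spawn nothing
    return born
```

Sampling randomly rather than scanning every pixel keeps the per-frame birth cost constant regardless of how much of the silhouette is visible.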