The Very Large LED Array

A bunch of LEDs mounted on a 4x8' sheet of plywood

Driving an LED Array from Shaders Running on a Raspberry Pi

The Very Large LED Array, or VLLA for short, is a bunch of LEDs mounted on a 4x8' sheet of plywood, driven by two Teensy microcontrollers and a Raspberry Pi. The hardware was designed and constructed by friends of mine for our hall in our dorm at MIT with funding from Texas Instruments. The goal was to create an impressive light show for parties.

I was interested in the software driving the display, and at the end of the second semester of my junior year I had a desperate need to obsess over something not school-related, so I developed a series of tools for creating effects for the display. In its final form the system could run effects written in JavaScript, Lua, or GLSL, accept frames piped in over UDP, play GIFs, or mirror a 64x32 rectangle of a Linux desktop (via X11). There was also an orbital defense game that people sitting under the display in the lounge could play by whistling (the whistle's frequency set the angle of aim).
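The UDP path was the simplest way to drive the display from an external program: render a frame, pack it into a datagram, and send it to the Pi. Here is a minimal sketch of that idea; the port, address, and raw 64x32 RGB frame layout are illustrative assumptions, not the VLLA's actual wire format.

    /* Sketch: push one frame to the display over UDP.
     * Assumed (not from the original writeup): the receiver listens on
     * port 5555 and a frame is one datagram of 64*32 RGB triples. */
    #include <arpa/inet.h>
    #include <sys/socket.h>
    #include <unistd.h>

    #define VLLA_W 64
    #define VLLA_H 32

    int main(void) {
        unsigned char frame[VLLA_W * VLLA_H * 3];

        /* Fill the frame with a simple gradient as placeholder content. */
        for (int y = 0; y < VLLA_H; y++)
            for (int x = 0; x < VLLA_W; x++) {
                unsigned char *px = &frame[(y * VLLA_W + x) * 3];
                px[0] = (unsigned char)(x * 4);  /* red ramps left to right */
                px[1] = (unsigned char)(y * 8);  /* green ramps top to bottom */
                px[2] = 128;                     /* constant blue */
            }

        int sock = socket(AF_INET, SOCK_DGRAM, 0);
        struct sockaddr_in addr = {0};
        addr.sin_family = AF_INET;
        addr.sin_port = htons(5555);                     /* assumed port */
        inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr); /* assumed Pi address */

        sendto(sock, frame, sizeof frame, 0,
               (struct sockaddr *)&addr, sizeof addr);
        close(sock);
        return 0;
    }

A program that loops this at the display's frame rate can be written in any language with sockets, which is what made the UDP path handy for quick experiments.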

One of the first effects I developed for the VLLA was directly inspired by a loading screen in a demo by Fairlight called We Are New. In the demo, a pistol-wielding panda, rendered in chunky volumetric form, rotates in 3D space.

Experiments

At one point I was learning how X11 works, so I wrote a little program to stream a 64x32 region of pixels from my desktop to the VLLA. Here's an example of doing that while running a particle simulation screensaver:
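The grabbing side of that program boils down to a few Xlib calls per frame. A rough sketch follows; the capture origin, the 0x00RRGGBB pixel packing, and what happens to the pixels afterward are assumptions for illustration.

    /* Sketch: grab a 64x32 region of the desktop with Xlib.
     * Build with -lX11. The real tool would send the pixels on to the
     * VLLA (e.g. over UDP) instead of just reporting the grab. */
    #include <X11/Xlib.h>
    #include <X11/Xutil.h>
    #include <stdio.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        Window root = DefaultRootWindow(dpy);

        /* Capture a 64x32 rectangle at an assumed (0, 0) origin. */
        XImage *img = XGetImage(dpy, root, 0, 0, 64, 32, AllPlanes, ZPixmap);

        unsigned char frame[64 * 32 * 3];
        for (int y = 0; y < 32; y++)
            for (int x = 0; x < 64; x++) {
                /* Assumes a typical 24-bit visual packed as 0x00RRGGBB. */
                unsigned long p = XGetPixel(img, x, y);
                unsigned char *px = &frame[(y * 64 + x) * 3];
                px[0] = (p >> 16) & 0xff;
                px[1] = (p >> 8) & 0xff;
                px[2] = p & 0xff;
            }

        printf("grabbed %dx%d region\n", img->width, img->height);

        XDestroyImage(img);
        XCloseDisplay(dpy);
        return 0;
    }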

For a while the VLLA lived on my dorm room wall. One of my favorite features was playing GIFs. I added shortcuts to my window manager to let me play one of a collection of GIFs at a moment's notice:

Sound Reactive Effects

An obvious application of a giant panel of LEDs is parties. Whenever my living group hosted a party, the VLLA was mounted above the dance floor. For the best effect, the visuals had to sync up with the music, so I plugged a USB mic into the Raspberry Pi and made the audio data accessible to the effects.

To implement sound-reactive effects, I ran an FFT on audio captured through ALSA, piped the result into a texture, and fed that texture into fragment shaders written in GLSL. The rendered framebuffer was also fed back into the shader on the next frame, which made feedback effects possible.
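The CPU side of that pipeline looks roughly like the sketch below: capture a window of samples from ALSA, FFT it, and collapse the spectrum to one value per display column. The device name, sample rate, window size, and 64-bin reduction are assumptions for illustration (the real code may well have differed), and the texture upload is left as a comment.

    /* Sketch: ALSA capture -> FFTW -> 64 spectrum bins.
     * Build with -lasound -lfftw3 -lm. */
    #include <alsa/asoundlib.h>
    #include <fftw3.h>
    #include <math.h>

    #define N    1024   /* samples per FFT window (assumed) */
    #define COLS 64     /* VLLA width: one bin per column */

    int main(void) {
        snd_pcm_t *pcm;
        snd_pcm_open(&pcm, "default", SND_PCM_STREAM_CAPTURE, 0);
        snd_pcm_set_params(pcm, SND_PCM_FORMAT_S16_LE,
                           SND_PCM_ACCESS_RW_INTERLEAVED,
                           1, 44100, 1, 500000);  /* mono, 44.1 kHz, 0.5 s latency */

        short samples[N];
        double *in = fftw_alloc_real(N);
        fftw_complex *out = fftw_alloc_complex(N / 2 + 1);
        fftw_plan plan = fftw_plan_dft_r2c_1d(N, in, out, FFTW_ESTIMATE);
        float bins[COLS];

        for (;;) {
            if (snd_pcm_readi(pcm, samples, N) != N)
                continue;  /* skip short reads / overruns in this sketch */

            for (int i = 0; i < N; i++)
                in[i] = samples[i] / 32768.0;  /* normalize 16-bit samples */

            fftw_execute(plan);

            /* Collapse the N/2 magnitudes down to COLS bins. */
            int per = (N / 2) / COLS;
            for (int c = 0; c < COLS; c++) {
                double sum = 0;
                for (int k = 0; k < per; k++) {
                    double re = out[c * per + k][0], im = out[c * per + k][1];
                    sum += sqrt(re * re + im * im);
                }
                bins[c] = (float)(sum / per);
            }

            /* ...here `bins` would be uploaded into a small texture
             * (e.g. 64x1) for the GLSL effects to sample each frame... */
        }
    }

On the GPU side, a fragment shader samples that spectrum texture, and rendering into a texture that is sampled again on the next frame is what produces the feedback effects shown below.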

An awesome effect developed by my friend Corey
An example of a feedback effect
Combining the FFT and feedback to make a waterfall plot