Workshop: Synthesizing Human and Robot Movements for Art Production

During the 2019 Independent Activities Period (IAP) at MIT, I helped set up and run a workshop on art production with a UR5 industrial robot arm, held in the MIT Museum Studio. Ten participants stuck with it through to the end, creating five final projects that were exhibited in the space, plus many interesting experiments along the way.

For the workshop I developed a plugin (on GitHub) for the V-REP robot simulator that made it possible to synchronize a virtual and a real UR5 robot arm in real time. V-REP embeds a Lua scripting environment, which makes it fast and easy to prototype new behaviors for simulated robots. Because we expected most participants not to have a programming background, I developed runnable sample projects that controlled the robot from Rhino/Grasshopper, Arduino, Processing, Touch Designer, and an Xbox controller. My brother developed a Unity scene with a virtual UR5 that could sync up with the real robot through the V-REP-based system.
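
To give a sense of how quickly a behavior can be sketched in the simulator, here is a minimal, hypothetical non-threaded child script in Lua; it is not one of the workshop samples. The joint names assume the stock UR5 model that ships with V-REP (UR5_joint1 through UR5_joint6):

```lua
-- Minimal V-REP child script: sweep the UR5's base joint with a sine
-- wave. Joint names assume the stock UR5 model bundled with V-REP.
function sysCall_init()
    joints = {}
    for i = 1, 6 do
        joints[i] = sim.getObjectHandle('UR5_joint'..i)
    end
end

function sysCall_actuation()
    -- Called once per simulation step; drive the base joint back and
    -- forth through +/- 45 degrees.
    local t = sim.getSimulationTime()
    sim.setJointTargetPosition(joints[1], math.sin(t) * math.pi / 4)
end
```

With a sync plugin like the one described above running, motion produced by a script like this in the simulator would be mirrored by the real arm.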

Experiments synchronizing robot motion with a haptic feedback pen. Data flow: Pen -> Unity -> V-REP -> Robot.

I was interested in real-time control from sensors and of tools mounted to the end of the robot, so as part of the design I included a BeagleBone affixed to the robot's tool plate and connected to the robot's tool I/O port. Through that connection the robot could power and sync with the BeagleBone, and the BeagleBone could join the Wi-Fi network, so it could be accessed over SSH for programming.
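
As an illustration of the kind of end-of-arm sensing this setup enables, here is a hypothetical BeagleBone-side sketch in Lua (using the luasocket library) that reads the board's ADC through sysfs and streams the values over Wi-Fi via UDP. The sysfs path, host address, and port are placeholder assumptions, not the workshop's actual configuration:

```lua
-- Hypothetical BeagleBone-side script: stream a tool-mounted sensor
-- reading to a listener on the network. All addresses and paths below
-- are placeholders.
local socket = require("socket")

local HOST, PORT = "192.168.1.10", 9000  -- placeholder listener endpoint
local ADC = "/sys/bus/iio/devices/iio:device0/in_voltage0_raw"

local udp = assert(socket.udp())

while true do
    local f = io.open(ADC, "r")
    if f then
        local raw = f:read("*n")  -- 12-bit reading, 0..4095
        f:close()
        if raw then
            udp:sendto(tostring(raw), HOST, PORT)
        end
    end
    socket.sleep(0.01)  -- ~100 Hz update rate
end
```

A listener on the V-REP side could then map readings like these onto joint targets or tool commands.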

Robot controlled by sound in Touch Designer