I spent some time collaborating with the artist Jeff Leonard (Instagram), who has built fantastic painting machines for himself in the course of developing his art practice. The machines are mostly controlled by Arduino boards directly from sensors and devices like potentiometers, buttons, joysticks, and sliders. I was interested in mapping human motion directly to machine motion, so I wrote software to connect a Vive VR system to the machine, along with a tool that could orient itself.
The software stack used C# in Unity and SteamVR to capture the tracking data from a Vive tracker and the control inputs from a regular Vive controller. That data was passed over UDP to a Processing sketch, which visualized it and mapped it to the machine's coordinate space. The Processing sketch communicated over a USB serial connection with the Arduino controlling the machine, and I wrote firmware to drive the machine's motors based on the messages received.
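The UDP hop in that pipeline can be sketched as follows. This is an illustrative example only: the message format (a comma-separated pose line) and the port number are assumptions, not the project's actual wire protocol, and the receiver here stands in for the Processing sketch.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;
import java.util.Locale;

public class PoseLink {
    // Format a tracker pose (position + orientation quaternion) as one
    // text datagram. The "POSE,..." layout is a hypothetical format.
    static String formatPose(float x, float y, float z,
                             float qx, float qy, float qz, float qw) {
        return String.format(Locale.ROOT,
                "POSE,%.4f,%.4f,%.4f,%.4f,%.4f,%.4f,%.4f",
                x, y, z, qx, qy, qz, qw);
    }

    public static void main(String[] args) throws Exception {
        // Receiver plays the role of the Processing sketch.
        DatagramSocket receiver = new DatagramSocket(9000);
        DatagramSocket sender = new DatagramSocket();

        byte[] payload = formatPose(0.1f, 1.2f, -0.3f, 0f, 0f, 0f, 1f)
                .getBytes(StandardCharsets.UTF_8);
        sender.send(new DatagramPacket(payload, payload.length,
                InetAddress.getLoopbackAddress(), 9000));

        byte[] buf = new byte[256];
        DatagramPacket in = new DatagramPacket(buf, buf.length);
        receiver.receive(in);
        String msg = new String(in.getData(), 0, in.getLength(),
                StandardCharsets.UTF_8);
        System.out.println(msg); // the pose line as the sketch would see it

        sender.close();
        receiver.close();
    }
}
```

A plain-text datagram like this is easy to inspect and debug, which matters when three programs in three environments have to agree on a format; dropped packets are acceptable here because each pose update supersedes the last.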
After some calibration to set the workspace and rotate the coordinate frames so that human motion roughly matched machine motion, it was possible to carry out surprisingly subtle motions. This experiment was a prelude to a lot of later work I did (and am still doing, at the time of writing) with AR/VR and robotics in art.
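The frame-alignment step can be sketched minimally as one rotation about the vertical axis plus an origin offset. The actual calibration procedure isn't detailed above, so the angle and offset values here are placeholders, not the project's real parameters.

```java
public class Calibration {
    // Rotate (x, z) about the vertical (y) axis by 'yaw' radians, then
    // translate by (ox, oy, oz) so the tracker's origin lines up with
    // the machine's workspace origin.
    static double[] toMachineFrame(double x, double y, double z,
                                   double yaw,
                                   double ox, double oy, double oz) {
        double mx =  Math.cos(yaw) * x + Math.sin(yaw) * z + ox;
        double mz = -Math.sin(yaw) * x + Math.cos(yaw) * z + oz;
        return new double[] { mx, y + oy, mz };
    }

    public static void main(String[] args) {
        // A 90-degree yaw maps the tracker's +x axis onto the machine's -z.
        double[] p = toMachineFrame(1.0, 0.0, 0.0, Math.PI / 2, 0, 0, 0);
        System.out.printf("%.3f %.3f %.3f%n", p[0], p[1], p[2]);
    }
}
```

In practice the yaw and offset would be found by recording a few known points in both frames and solving for the transform that lines them up; once they're fixed, every incoming pose passes through this mapping before being sent to the machine.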