Robotically Assisted Photogrammetry

Exploring the potential of automated photogrammetry captures using a UR5 robot arm.

Capture Pipeline

I made a simple modification to an infrared remote control I bought from Amazon so that I could trigger my camera via a serial message sent over USB to an Arduino. I found the pin on the remote's built-in controller IC that connects to the shutter button and wired it to a digital I/O pin on the Arduino. Setting that pin's mode to output and driving it low made the remote send the camera trigger signal.

I planned out the locations and angles for the images by "teaching" the robot. The UR5 is a collaborative robot with torque sensors in every joint, so the process was as easy as dragging it around through space and saving the resulting UR Script file.

A crude Python script on a laptop orchestrated the capture; the laptop was plugged into both the Arduino that triggers the camera and the network the robot was on. For each line of the input script, it generated a new script telling the robot to run that one line and then establish a TCP socket connection back to the control PC. When the Python script saw the connection from the robot, it knew the robot had moved the camera into position for the next picture. It then sent a byte to the Arduino over serial, which triggered the IR burst that fired the camera.
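The loop above can be sketched roughly as follows. The IP addresses, ports, and helper names here are assumptions for illustration, not the original code; port 30002 is the UR controller's secondary script interface, and `send_trigger` stands in for the serial helper that fires the camera:

```python
# Hypothetical capture orchestrator, following the loop described in
# the text. Network addresses and ports are placeholders.
import socket
import time

ROBOT_IP = "192.168.1.10"   # assumed robot controller address
URSCRIPT_PORT = 30002        # UR secondary script interface
HOST_IP = "192.168.1.2"     # assumed laptop address
HANDSHAKE_PORT = 50000       # arbitrary port the robot calls back on

def wrap_move(move_line):
    """Wrap one taught move command in a tiny program that performs
    the move, then opens a TCP connection back to the control PC so
    we know the camera is in position."""
    return (
        "def step():\n"
        f"  {move_line}\n"
        f'  socket_open("{HOST_IP}", {HANDSHAKE_PORT})\n'
        "  socket_close()\n"
        "end\n"
    )

def run_capture(move_lines, send_trigger):
    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.bind((HOST_IP, HANDSHAKE_PORT))
    listener.listen(1)
    for line in move_lines:
        # Send the one-move program to the robot controller.
        with socket.create_connection((ROBOT_IP, URSCRIPT_PORT)) as robot:
            robot.sendall(wrap_move(line).encode())
        # Block until the robot connects back: the move is done.
        conn, _ = listener.accept()
        conn.close()
        send_trigger()        # byte over serial -> IR burst -> shutter
        time.sleep(1.0)       # give the camera time to expose and write
    listener.close()
```

Blocking on `listener.accept()` is what keeps the camera and robot in lockstep: the shutter never fires while the arm is still moving.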

Pictures accumulated on the camera's SD card. When the capture finished, I transferred them to a PC with Reality Capture installed for the actual model reconstruction, then took the resulting 3D model into Blender to render some simple animations.

Rendered from Blender

Code to Choreograph Camera and Robot