As is usually the case, Interlock managed to cajole/bribe the organizers of BarCamp Rochester into giving us a table in the atrium, upon which we could set up our wares and lure in unsuspecting geeks. The conference itself was really great, with lots of interesting talks, and lots of attendees and traffic by our table. Everybody seemed quite excited about the “Skeletonizing Carcasses with Flesh-eating Beetles” talk, myself included.
In the past, we’ve had a slight lack of table-sized projects that moved, made noise, or otherwise stimulated people to come talk to us and see what Interlock is all about… but no longer!
After becoming a little obsessed with old pen plotters over the past few months, I decided I’d like to try assembling my own drawing robot. The main goal, again, was to have something small, cool, and interactive to attract folks at events where we have a table or booth. So about a month ago, our journey started with destruction… one printer and one printer/scanner gave their slightly non-functional lives to this project. Anything slightly cool was saved, with the precision rods, stepper motors, and timing belts being the main prizes. I wanted this to use pin-feed card stock, so an old dot-matrix printer was also sacrificed.
I was without a camera for most of the month, so documentation is non-existent. Regardless, the documentation would have been something like this: “!@#$!@#$ MORE EPOXY! !@#%!#$”, along with pictures of me looking frustrated. Let’s just say, this machine is a hack, on top of a kludge, wrapped in a cob-job. We ended up with the paper-feed mechanism from the dot-matrix printer acting as the Y axis, and a small solenoid from Adafruit riding along on the X axis with a wobbly pen-holder (and some tape (and epoxy!)). This was all hooked up to a rickety breadboard (I designed and ordered an Arduino shield via BatchPCB, but it didn’t arrive in time) with two Pololu stepper drivers, an Arduino, and a simple transistor doohickey for toggling the solenoid. We ran grbl on the Arduino, and after tracking down a bug in said code and reflashing the firmware, we were well on our way. I learned a lot, stressed a bit, and the morning of BarCamp we barely managed this:
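For the curious: grbl just eats plain-text gcode, so driving a pen plotter mostly means emitting moves plus some signal for the pen. Here’s a minimal sketch (a hypothetical helper, not our actual code) that turns a polyline into gcode, assuming the pen solenoid is wired so the spindle commands M3/M5 drop and lift the pen:

```python
def polyline_to_gcode(points, feed=1000):
    """Convert a list of (x, y) points (in mm) into gcode for a
    grbl-driven pen plotter. Assumes the pen solenoid is driven by
    the spindle-enable output, so M3 = pen down, M5 = pen up."""
    x0, y0 = points[0]
    lines = [
        "G21",                       # millimeter units
        "G90",                       # absolute positioning
        "M5",                        # pen up before travelling
        f"G0 X{x0:.2f} Y{y0:.2f}",   # rapid move to start of stroke
        "M3",                        # pen down
    ]
    for x, y in points[1:]:
        lines.append(f"G1 X{x:.2f} Y{y:.2f} F{feed}")  # draw segment
    lines.append("M5")               # pen up when the stroke is done
    return lines
```

Streaming those lines down the serial port one at a time, waiting for grbl’s “ok” after each, is the other half of the job.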
But it got better throughout the day, with some live on-the-scene hacking. I managed to get a toolchain set up to get webcam input traced and plotted thusly:
That toolchain starts with OpenCV handling the webcam and doing a “trace outlines” sort of procedure. From there, a PNG is saved, converted to vectors by autotrace, converted from eps to hpgl (the language of old-timey plotters) by pstoedit, and slurped back into Python via the Chiplotle HPGL library, where a few routines scale and optimize the tool path; then we output some ugly gcode and stream it to the Arduino. Phew.
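The “optimize the tool path” step is worth a word: the vectorized drawing comes back as a pile of unordered strokes, and a naive plot wastes most of its time on pen-up travel between them. A greedy nearest-neighbor reordering (a rough sketch of the idea, not the actual routine) goes a long way:

```python
import math

def order_strokes(strokes):
    """Greedy nearest-neighbor ordering of pen strokes to reduce
    pen-up travel. Each stroke is a list of (x, y) points; a stroke
    is reversed when its far endpoint is closer to the pen."""
    remaining = list(strokes)
    ordered = []
    pos = (0.0, 0.0)  # pen starts at the origin
    while remaining:
        # Pick the stroke whose nearest endpoint is closest to the pen.
        def cost(s):
            return min(math.dist(pos, s[0]), math.dist(pos, s[-1]))
        best = min(remaining, key=cost)
        remaining.remove(best)
        if math.dist(pos, best[-1]) < math.dist(pos, best[0]):
            best = best[::-1]  # enter the stroke from its closer end
        ordered.append(best)
        pos = best[-1]
    return ordered
```

Not optimal (that’s a travelling-salesman problem), but for a robo-portrait’s worth of strokes, greedy is plenty.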
It’s a bit roundabout. But it worked, and it made people smile and wander over to talk to us… and they got some cool robo-portraits out of it. I’ll leave you with another image and video of the bot doing its thing. A few more can be found in my Flickr gallery.