Surely this will be my last poorly documented project: a hands-free, foot-operated documentation camera. I thought it’d be handy, and I had an old broken desk lamp kicking around, so of course, the two must be mushed together.
I started out with an old Canon SD1000, with CHDK installed of course. For those in the dark, CHDK is an alternate firmware that works on lots of point-and-shoot cameras (not just Canons anymore), and it lets you run scripts, save pictures as raw files, and tweak every setting you could ever possibly want to. It’s awesome, and I needed it so I could trigger the shutter via an external button (you basically toggle +5v on one of the USB port pins).
I didn’t want to worry about running out of batteries mid-shoot, so I printed a dummy battery at Shapeways, ran some wires through it, then tinned and bent over the ends of them to make some pseudo “terminals”. These went out through a hole I drilled in the battery door, through the lamp, to a simple 5v voltage regulator, wired up straight out of the datasheet. This 5v source also goes through an old lamp foot switch to the camera’s USB port, for the external trigger.
Finally, I took to the lathe to make an adapter between the weird lamp thread and the standard camera tripod thread (1/4″ diameter, 20 threads per inch). I used a bit of round Delrin stock, bored out the appropriate diameters, and just cranked a bolt through rather than threading things properly.
I tried a few test shots last night while I put together MightyOhm’s Geiger Counter Kit. It works pretty well, but can’t really zoom in close enough to document very fine work. Hopefully it will still be useful for documenting other tasks that require both your hands in frame. Failing that, I’m sure there will be other uses for a scriptable camera attached to a flexible-yet-solid base… perhaps it will turn into a time-lapse-bot.
I’ve been taking up quite a bit of space at Interlock lately with my new toy, a Roland DPX-3300 pen plotter… delivered via ebay from the magical futurepast of the early nineties. For those who are out of the loop with the past century, pen plotters are two-axis robo-thingamajigs that basically pick up pens and draw on paper with them. They’ve been replaced by wide-format inkjet printers, but they used to be quite popular with anybody who wanted to plot out maps, blueprints, and other large diagrams.
Anyways, somehow I caught the plotter bug, even going so far as to learn a bit of python after finding this neat library called Chiplotle that handles the basics of creating and streaming artwork to plotters using their native language of HPGL. They have a mailing list where they often point out good plotter deals on ebay, and I foolishly/accidentally purchased perhaps the biggest flatbed plotter out there. 3×2 feet of plotting goodness, 100 pounds of steel and stepper motors delivered to my door by a grumpy UPS driver.
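HPGL is refreshingly simple as native languages go: a handful of two-letter commands like PU (pen up), PD (pen down), and SP (select pen), each followed by coordinates in plotter units. Here’s a minimal sketch of generating it by hand in plain Python — this isn’t Chiplotle’s API, just an illustration of the format, with made-up coordinates:

```python
def polyline_to_hpgl(points, pen=1):
    """Build an HPGL command string that draws one polyline.

    HPGL coordinates are in plotter units; the values used below
    are purely illustrative.
    """
    x0, y0 = points[0]
    cmds = [
        "IN;",                  # initialize the plotter
        "SP%d;" % pen,          # grab a pen from the carousel
        "PU%d,%d;" % (x0, y0),  # pen up, travel to the first point
    ]
    for x, y in points[1:]:
        cmds.append("PD%d,%d;" % (x, y))  # pen down, draw to the point
    cmds.append("PU;SP0;")      # lift the pen and put it away
    return "".join(cmds)

# A 1000x1000-unit square:
square = [(0, 0), (1000, 0), (1000, 1000), (0, 1000), (0, 0)]
print(polyline_to_hpgl(square))
```

Streaming that string out the serial port is basically all a library like Chiplotle is doing under the hood, plus flow control so you don’t overrun the plotter’s tiny buffer.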
After realizing it doesn’t fit anywhere in my house, I brought it to Interlock and have been tinkering with it since. The first step was to convert old dried out plotter pens into pen holders that would allow me to use Sharpies or other art pens with the device. That involved a quick trip to the lathe, which I hope to document more fully in the future.
Having figured out an inexpensive pen solution, I really wanted to make drawings based on webcam input… so I trawled around the internet and mailing lists until I could piece together a passable solution using OpenCV’s python bindings, and a series of nerdy unixy commands (convert, autotrace, and pstoedit) to trace, mush, and output the data. You can find the relevant code up on my Ronald Toys Github repository.
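The pipeline glue amounts to shelling out to each tool in turn. A rough sketch of the sequence is below — the filenames are hypothetical and the exact flags are from memory, so check them against each tool’s man page before trusting it:

```python
import subprocess

# Hypothetical filenames; flags are my best recollection of the
# convert / autotrace / pstoedit invocations, not gospel.
steps = [
    # 1. ImageMagick: threshold the webcam frame down to a 1-bit bitmap
    ["convert", "snapshot.png", "-threshold", "50%", "snapshot.pbm"],
    # 2. autotrace: trace the bitmap into vector outlines (EPS)
    ["autotrace", "-output-format", "eps",
     "-output-file", "snapshot.eps", "snapshot.pbm"],
    # 3. pstoedit: convert the EPS into HPGL the plotter understands
    ["pstoedit", "-f", "hpgl", "snapshot.eps", "snapshot.hpgl"],
]

def run_pipeline(dry_run=True):
    for cmd in steps:
        if dry_run:
            print(" ".join(cmd))      # just show the commands
        else:
            subprocess.run(cmd, check=True)

run_pipeline()  # dry run: print each stage instead of executing it
```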
Having conquered such menial tasks, I decided that the default route the plotter was taking was quite inefficient, with lots of time spent seeking back and forth between lines. My precious robot was spending half its time flailing due to a poorly composed HPGL file. Surely there was a better way.
Tada! I wrote a really dumb sort routine that attempts to minimize seeking between lines, instead of blindly accepting the order that autotrace wrote them in. You can see the difference in the following two time-lapse videos. The first shows the default route that autotrace produces, and the second is my optimized version.
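The “really dumb” part is just greedy nearest-neighbor ordering: from wherever the pen is, pick the line with the closest endpoint, reversing the line if its far end happens to be the nearer one. A minimal sketch of that idea (not the actual code from the repo):

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def greedy_order(lines):
    """Greedily reorder (and possibly reverse) polylines so each one
    starts near where the previous one ended, cutting pen-up travel.
    """
    remaining = list(lines)
    ordered = []
    pos = (0, 0)  # assume the pen starts at the origin
    while remaining:
        # Pick the line whose nearer endpoint is closest to the pen.
        best = min(remaining,
                   key=lambda ln: min(dist(pos, ln[0]), dist(pos, ln[-1])))
        remaining.remove(best)
        # Reverse it if its last point is actually the closer end.
        if dist(pos, best[-1]) < dist(pos, best[0]):
            best = best[::-1]
        ordered.append(best)
        pos = best[-1]  # pen now sits at the end of this line
    return ordered

lines = [[(9, 9), (10, 10)], [(0, 0), (1, 1)], [(2, 2), (1, 2)]]
for ln in greedy_order(lines):
    print(ln)
```

Greedy ordering isn’t optimal — that’s the traveling salesman problem in disguise — but it’s a huge improvement over whatever order autotrace happens to emit.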
Next step for this project: some more sophisticated drawing routines, including some cross-hatched shading, color, and maybe even some 3D input from a Kinect. Stop in for a Tuesday Open Night and maybe you can be a robo-portrait guinea pig!
Join us this Friday, November 12th at 8pm for another stimulating night of lightning talks at Interlock! Lightning talks are five-minute(-ish) presentations on whatever topic you’re passionate about, and we try to get through fifteen or twenty of them in a night. The event is free and open to the public, and anybody is welcome to present. We’ll have some food and beverage on hand to stimulate digestion and discussion.
If you’re going to present and need to get slides situated, please show up a little early and/or mail your slides to firstname.lastname@example.org sometime before the event. We encourage a wide variety of topics… not just tech-oriented subjects. Previous talks have been about lockpicking, homebrewing, bookbinding, artificial intelligence, basic electronics, laser cutting, and on and on. Teach us something new!
Here is our Googley Maps page where you can get directions to the space. Once you get into the parking lot of the Hungerford building, you’ll want to look for Door #1 and head down into the basement to find us.
Also, if you are on the Facebooks, please do RSVP to our event on said site, so we can calculate pizza ingestion and seating requirements. See you this Friday!