The Twenty Dollar 3D Scanner and Cloning Cacti

October 19, 2013

With a newfound ability to take digital designs and make physical objects, it’s only natural to find a way to go the other direction. 3D scanning is the technique that closes the gap in the promise of 3D printers as replicator machines. In fact, the ‘Makerbot Replicator’ has a companion: the $1400 “Makerbot Digitizer.” Essentially, it’s a motorized turntable, two lasers illuminating either side of the object-to-be-digitized, and a camera with a live feed to fine-tuned software that gives you a 3D model ready to print inside of ten minutes.

Makerspace Urbana has a mission of technology proliferation to people of all classes and creeds, and at $1400, the Makerbot Digitizer is another piece of new technology that’s priced out of reach of the general population. So I was very impressed to see James, a fellow member of Makerspace Urbana, playing with a different set of hardware to achieve the same result — a simple handheld laser and an old webcam (specifically a Playstation Eyetoy — talk about repurposing).

This blue laser isn’t something many people would have sitting around, but it can be ordered online for about $10 before shipping. It’s 5 mW at a 405 nm wavelength. A simple filter that can be ordered along with it spreads the dot into a sharp straight line. However, any old red laser would work as well (though it would require very dim lighting; higher-power lasers work much better), and it can be converted from a red dot to a red line using a small plastic cylinder, for instance: a Lego lightsaber!

So the components for a 3D scanner can be hacked together (perhaps you have a busted CD or Blu-ray player that could have its laser harvested) — but what about the software? James had been using the free trial of the DAVID3 laser scanning software. It offered a very intuitive scanning workflow, but would only let you save one side of the object at a time unless you ponied up hundreds of dollars for a license.

This strategy of digitizing an object works by using a calibration pattern that the software recognizes (that’s the piece of paper with dots printed on it) to determine how far away the background is from the camera. When a laser-line is projected across the object, the line takes the shape of the object. The software compares the form-fitting shape of the laser line with the straight line that hits the background, and creates a “point cloud” representing the one side of the object the camera can see.
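To make the triangulation concrete, here is a toy sketch of the geometry under a simplified pinhole model. All parameter names are my own illustration — real scanning software calibrates the camera and laser-plane geometry from the dot pattern rather than taking them as inputs:

```python
import numpy as np

def triangulate_laser(u_px, v_px, f_px, baseline, laser_angle):
    """Recover 3D points from the lit pixels of a laser line.

    Toy pinhole model: the camera sits at the origin looking down +z
    with focal length f_px (in pixels). The laser sheet crosses the
    x axis at x = baseline and is tilted by laser_angle, i.e. it is
    the plane x = baseline - z * tan(laser_angle). Intersecting each
    pixel's viewing ray x = z * u / f with that plane gives the depth
    at which the laser struck the object.
    """
    u = np.asarray(u_px, dtype=float)
    v = np.asarray(v_px, dtype=float)
    z = baseline / (u / f_px + np.tan(laser_angle))
    # Back-project the pixel to a 3D point at that depth.
    return np.stack([z * u / f_px, z * v / f_px, z], axis=-1)
```

Where the laser line bends away from its straight background position, `u` shifts, and the recovered depth shifts with it — that displacement is the whole trick.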

The Makerbot Digitizer has the turntable wired up to the computer, so it can scan the object while it rotates. But without this integrated turntable, we have to scan the object one side at a time — and manually piece the point clouds together after the fact. The DAVID3 software automatically aligns these point clouds, but saving the result is a privilege of the paid version.

From left to right: aligning two point clouds, the completed 8-sided point cloud and all its associated noise, and the surface reconstruction ready to print.

But no matter: the free and open-source “MeshLab” allows you to align point clouds semi-automatically. For each of the 8 angles captured, you give MeshLab some hints as to how neighbouring scans line up, and it uses its fancy algorithms to piece each pair together. Here is (someone else’s) video tutorial that shows the whole process.
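Under the hood, the refinement after your rough hints is an iterative-closest-point (ICP) style alignment. A toy numpy version of the idea — brute-force nearest neighbours and a Kabsch/SVD fit, nothing like MeshLab’s actual implementation, and all function names my own:

```python
import numpy as np

def best_rigid_transform(P, Q):
    """Least-squares rotation R and translation t mapping P onto Q (Kabsch)."""
    cP, cQ = P.mean(0), Q.mean(0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

def icp(P, Q, iters=30):
    """Iteratively align point cloud P onto point cloud Q."""
    R_tot, t_tot = np.eye(3), np.zeros(3)
    for _ in range(iters):
        # Brute-force nearest neighbour in Q for every point of P
        # (fine for toy clouds; real tools use spatial indexes).
        d2 = ((P[:, None, :] - Q[None, :, :]) ** 2).sum(-1)
        matches = Q[d2.argmin(1)]
        R, t = best_rigid_transform(P, matches)
        P = P @ R.T + t
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
    return R_tot, t_tot, P
```

This is why the manual hints matter: ICP only converges when the two clouds start roughly aligned, which is exactly what your clicked point-pairs provide.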

I used that technique to piece together 8 scans of my cactus, and was able to create a ‘watertight’ mesh (a continuous volume without any holes) using a MeshLab filter called “Surface Reconstruction: Poisson.”

The generated mesh is solid, seamless, and ready to print. While the general likeness was captured, I’m not so satisfied with the detail. Perhaps a higher-resolution camera would help, but I think most of the detail was lost in the noise from the laser’s light-scatter — a result of the material bending and blurring the laser line. So whatever laser scanner you use, the detail you capture depends on how sharply the object reflects the laser.
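Some of that scatter shows up as stray points floating off the surface, which can be culled before meshing. MeshLab ships cleaning filters for this; a toy numpy version of the usual statistical-outlier test (the function name and thresholds are my own illustration) looks like:

```python
import numpy as np

def remove_outliers(points, k=8, std_ratio=2.0):
    """Drop points whose mean distance to their k nearest neighbours is
    more than std_ratio standard deviations above the cloud's average.
    Brute-force all-pairs distances: fine for small scans only."""
    d = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(-1))
    d.sort(axis=1)
    mean_knn = d[:, 1:k + 1].mean(axis=1)  # column 0 is self-distance, skip it
    keep = mean_knn < mean_knn.mean() + std_ratio * mean_knn.std()
    return points[keep]
```

Cleaning first also helps the Poisson filter, which otherwise tries to wrap a surface around the noise.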

After trying this method out on a few different objects, I came across 123D Catch: a cloud service that generates meshes from photographs. I’ve found it vastly more practical than setting up and calibrating lasers and cameras — even with a couple dozen pictures from my camera phone I can get very detailed meshes, with the photographic data applied to the surface. You can download the results to use however you please (under a non-commercial agreement), but it is a free service from Autodesk that they can pull at any time. Since then I’ve learned to use Agisoft — equally powerful photogrammetry software, for $60 with an education discount. At least it’s something you can own and run on your own computer!