Hello Hololens! Creating holographic animations with Python & Blender

I’ve always loved holograms, those virtual objects intermingled with the real world in so many of my favorite films, from Star Wars to Zenon: Girl of the 21st Century. But I wasn’t looking forward to learning Unity game development and Windows system APIs before I could make my first animation for the hololens platform.

So while I still have to dig into Windows-Guts and Direct3D to make interactive applications, I was excited to find a straightforward way to generate hololens-compatible animations with just a few lines of python.

Using the python embedded in the open source Blender project, we can read files, make database queries, do whatever data science we want to do, and then import meshes or generate cubes, apply colors and materials, define keyframes, and export an FBX file that can be displayed in Hololens or on web pages. Hololens lets you open multiple animations and place them around the room, so while I can’t define any gaze-and-click behaviors, I can still move and scale and rotate my animation in the mixed-real world.

The first step, of course, is to head to blender.org and download the latest. I’m going to be really verbose and try to help out people on both Windows and Mac, because I had to figure it out on both myself. Follow all the default options, and on Mac move blender.app to /Applications like you normally do. From here, you can check out a million youtube videos on what Blender is for (That’s how I learned it!) but next we’re going to add ‘blender’ to our path so we can execute it from the command line anywhere in our system and not even have to learn how to use the GUI.

For MacOS, you can run this in your terminal, or add it to your ~/.bash_profile:

export PATH="/Applications/blender.app/Contents/MacOS:$PATH"

For Windows, open up powershell (right click, ‘run as administrator’, which you’ll need to do for any commands that save to disk) and run:

[Environment]::SetEnvironmentVariable("Path", $env:Path + ";C:\Program Files\Blender Foundation\Blender", [EnvironmentVariableTarget]::Machine)

Now anytime you open your terminal/powershell, you can type ‘blender’ to launch blender. (On MacOS, you may need to restart your shell for it to take effect, or type ‘source ~/.bash_profile’ to force bash to reload your saved preferences, including your path.) (Another option for Windows is to search for ‘edit system variables’ from the Start menu, click ‘Environment Variables’, and use the GUI to add C:\Program Files\Blender Foundation\Blender to your Path.)

But going through all that trouble just to open Blender isn’t the point. The point is now we can save python files and pass them to Blender as a command line argument. Before we do tho, there’s one last step: we have to re-save the start-up scene to be a blank slate, otherwise our animations will all have a cube sitting in the middle of the scene (Blender’s default start up scene).

So open Blender, and hit Select → (De)select All, then Object → Delete, confirm. (or just ‘a’, ‘x’, ‘Enter’ if you want to use keyboard shortcuts). Next click File → Save Startup File. We’re finally ready to generate some animations with python!
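
(If you’d rather not overwrite your startup file, a couple of lines at the top of each script can clear the default scene instead; a minimal sketch using the Blender 2.7x-era operators:)

import bpy

# select everything in the default scene and delete it,
# so each script starts from a blank slate
bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.delete()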

Save the code below as helloworld.py, navigate in your terminal to the folder where you saved it, and type:

blender -b -P helloworld.py

And you should see Blender printing out a load of information about what’s happening, then it should create an fbx file and exit.

Check out the inline comments for a little bit of understanding of what the code is doing, and I’ll be writing more example code in future blogs.
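
A minimal sketch of what a helloworld.py along these lines can look like, written against the Blender 2.7x python API (the material color, frame count, and output filename are just placeholders to change):

import bpy

# start from an empty scene (harmless if the startup file is already blank)
bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.delete()

# add a cube at the origin and grab a reference to it
bpy.ops.mesh.primitive_cube_add(location=(0, 0, 0))
cube = bpy.context.object

# give the cube a simple colored material
material = bpy.data.materials.new(name="helloRed")
material.diffuse_color = (1.0, 0.1, 0.1)
cube.data.materials.append(material)

# keyframe a little hop: up two units and back down over 60 frames
cube.location = (0, 0, 0)
cube.keyframe_insert(data_path="location", frame=1)
cube.location = (0, 0, 2)
cube.keyframe_insert(data_path="location", frame=30)
cube.location = (0, 0, 0)
cube.keyframe_insert(data_path="location", frame=60)
bpy.context.scene.frame_end = 60

# export the scene as an FBX next to wherever blender was launched from
bpy.ops.export_scene.fbx(filepath="helloworld.fbx")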

And of course, the answers to all your questions are in the docs!

To make it easy to deploy these to Hololens, I just drop them in the onedrive belonging to the account I created with the hololens. There’s a python API for uploading direct to onedrive, too, so I’ll probably explore that in the future.

Optional: import interesting python modules like numpy and pandas and all the rest.

Blender comes packaged with its own python executable hooked into all of Blender’s guts, so to install any modules that aren’t built into Python 3 we need to run get-pip.py with Blender’s python. First, download get-pip.py; I’ll put it on my desktop. Then, navigate to blender’s python executable.

command on Windows:

cd "C:\Program Files\Blender Foundation\Blender\2.78\python\bin"

command on MacOS (assuming Blender was moved to Applications):

cd /Applications/blender.app/Contents/Resources/2.78/python/bin

(Note that 2.78 is the blender version, not the python version.) Run ‘ls’ and take note of the python executable’s name. On Windows it was just python.exe, on Mac it was python3.5m. Once you’ve cd’d into python/bin:

Install Pip on MacOS:

./python3.5m ~/Desktop/get-pip.py

For Windows I had to include the full path to get-pip.py. This also requires administrator privileges, so run powershell “As Administrator”:

./python "C:UsersColten JacksonDesktopget-pip.py"

If that exits successfully, you can run ‘pip install pandas’ and whatever python modules you want to use from within Blender. On Mac, pip added itself to my path and I could use it right away, but on Windows I had to reference the pip.exe to run it from the python directory, so it ends up looking something like this (again as Admin so pip can save files to disk, your permissions may vary):
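
Something like the following should work; the exact location of pip.exe depends on where get-pip.py installs it (check its output), but it typically lands in a Scripts folder under the python directory:

& "C:\Program Files\Blender Foundation\Blender\2.78\python\Scripts\pip.exe" install pandas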

Let me know what questions you have, tho of course I would appreciate it if you googled it first!

More examples and explanations here:

https://medium.com/@colten_jackson/producing-data-driven-holograms-with-python-blender-fbx-252cea370d51

PS all my gifs are in black and white because I don’t know how to do color correction, so I just get rid of the color 🙂

Also, shout out to my primary sources that taught me all this!


Producing data-driven holograms with Python/Blender/FBX

Here’s the situation I found myself in: Annalect Labs acquired a Hololens for purposes of producing holographic data visualizations and here I am, a lowly javascript developer wishing he could just make holograms with CSS!

Reviewing the options for game development with Unity and Visual Studio was intimidating to say the least. It looked like I would need to learn C# while adjusting to the Unity → export to Visual Studio → compile to Hololens toolchain. Worse, while I could follow along the Unity introduction to make a ball roll around a plane, I wasn’t able to export this basic demo to Hololens after hours of re-installing various versions of software.

Knowing that Blender allows for python scripting, I started wondering if it might let me produce animations for hololens, which might allow my pythonic-data-scientist-coworkers to jump into producing holograms much faster than writing our own video game without any game engine experience.

Now, there are many guides on how to design things in Blender and import the meshes and actions into Unity, which can then export the project to Visual Studio for compilation. I hate the idea of dealing with that toolchain, with incompatibilities and export quirks at each step of the way. I want to write a python script that produces an animation for Hololens.

With a little googling, I discovered that Hololens can indeed display animations in the FBX file format, which Blender is happy to export, so I started playing around with the following code to make some cubes dance around:
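
A minimal sketch of that kind of script (the grid size, bounce height, and output filename are arbitrary placeholders; this targets the Blender 2.7x API, where the cube operator takes a radius argument):

import bpy
import math

# clear the default scene
bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.delete()

# build a 4 x 4 grid of small cubes
cubes = []
for x in range(4):
    for y in range(4):
        bpy.ops.mesh.primitive_cube_add(radius=0.4, location=(x * 2, y * 2, 0))
        cubes.append(bpy.context.object)

# keyframe each cube bouncing up and down, offset by its index so the wave ripples
for index, cube in enumerate(cubes):
    for frame in range(1, 121, 10):
        cube.location.z = abs(math.sin((frame + index * 5) / 20.0)) * 2
        cube.keyframe_insert(data_path="location", index=2, frame=frame)
bpy.context.scene.frame_end = 120

# write the animation out as an FBX for the 3D Viewer on Hololens
bpy.ops.export_scene.fbx(filepath="dancing_cubes.fbx")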

The FBX file produced by this code can be opened directly in the Hololens app 3D Viewer (I just move my fbx files into a folder sync’d to onedrive to make it easy to access them on Hololens).

https://skfb.ly/6qPED

How about something a little more interesting: arranging a subset of the collection in a line. From here I hope you can use some imagination on how you could tie this in with data retrieval and visualization.

For Annalect Labs’ minimum viable hologram, we’re interested in visualizations of populations, and talking about subsets. So I’ll adjust the previous code to use meshes representing people. There’s a million ways to make meshes, but there’s a tool perfectly suited for my task:

http://www.makehuman.org/

I won’t say much about MakeHuman cause I think it’s pretty intuitive. You can customize everything about your character and use some pre-loaded outfits and poses, and then export the mesh as a .obj. To make it easy on myself I saved these obj files (man.obj, woman.obj) in the same directory I’ll output my animations to. Now, these objs are made up of multiple meshes and are fairly high resolution, so there’s a lot of code to work through, modifying them and reducing the poly-count to get to an acceptable file size for the default FBX file viewer on hololens, but the result is a lot of fun:

The main flow of the program is like this (a rough sketch in code follows the list):

  1. Import an object, which will select all 4 meshes of that object and join them.
  2. Decimate the mesh so it isn’t so high resolution.
  3. Create copies of each mesh.
  4. Create an array of x,y coordinates along a normal distribution.
  5. Loop through an array of all the objects and set their location to the next random coordinate.
  6. Select the first 5 objects in the shuffled array and set their keyframes to animate them as they move into a line elevated from the group.
  7. Do the same with the next 5 objects, but to the other side of the plane.
  8. Save a .blend file and a .fbx animation to move to Hololens.
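
Here’s a rough sketch of that flow, assuming a man.obj exported from MakeHuman sits in the directory you run the script from (the object counts, coordinates, and decimate ratio are placeholders to tune; this targets the Blender 2.7x API):

import bpy
import random

# start from an empty scene
bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.delete()

def import_and_join(filepath):
    """Import an .obj and join its meshes (MakeHuman exports several) into one object."""
    bpy.ops.object.select_all(action='DESELECT')
    bpy.ops.import_scene.obj(filepath=filepath)  # imported meshes come in selected
    bpy.context.scene.objects.active = bpy.context.selected_objects[0]
    bpy.ops.object.join()
    return bpy.context.object

def decimate(obj, ratio=0.1):
    """Knock the poly count down so the exported FBX stays small enough for 3D Viewer."""
    mod = obj.modifiers.new(name="decimate", type='DECIMATE')
    mod.ratio = ratio
    bpy.context.scene.objects.active = obj
    bpy.ops.object.modifier_apply(modifier=mod.name)

person = import_and_join("man.obj")
decimate(person)

# make a small crowd by copying the decimated mesh
crowd = [person]
for _ in range(14):
    copy = person.copy()
    copy.data = person.data.copy()
    bpy.context.scene.objects.link(copy)
    crowd.append(copy)

# scatter the crowd along a normal distribution and keyframe the starting positions
random.shuffle(crowd)
for obj in crowd:
    obj.location = (random.gauss(0, 3), random.gauss(0, 3), 0)
    obj.keyframe_insert(data_path="location", frame=1)

# lift the first five into an elevated line on one side of the plane...
for i, obj in enumerate(crowd[:5]):
    obj.location = (i * 2 - 4, 6, 2)
    obj.keyframe_insert(data_path="location", frame=60)

# ...and the next five to the other side
for i, obj in enumerate(crowd[5:10]):
    obj.location = (i * 2 - 4, -6, 2)
    obj.keyframe_insert(data_path="location", frame=60)

bpy.context.scene.frame_end = 60

# save a .blend for later editing and export the .fbx for Hololens
bpy.ops.wm.save_as_mainfile(filepath="crowd.blend")
bpy.ops.export_scene.fbx(filepath="crowd.fbx")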

This is my first few days using the python API to Blender, so there’s surely a better way to make all the selections and de-selections, and if you know how please let me in on it! In any case, I hope this gives you a glimpse of the kind of scripted animations Blender can help you with, and there’s a million other things in the docs I haven’t touched yet. In the future I’ll have some examples of how to hook these animations up to pandas dataframes for dealing with data retrieved from SQL queries 😀

Ready to get started?

Here’s a guide to setting up your blender and python environment: see “Hello Hololens! Creating holographic animations with Python & Blender” above.

Moving in to Fab Lab Tuyuryaq

Sunrise on Bristol Bay in Togiak, Alaska

Hackerspaces and FabLabs get started in lots of different ways. Often it’s a group of friends that already have a lot of tools and decide to rent a space to pool their resources. Others are built into institutions, like universities and libraries.

Fab Lab Tuyuryaq (that’s the Yup’ik spelling of Togiak) has a more peculiar start: it’s one of a few labs around the world that were established out of an NSF grant as the result of a research proposal. I wasn’t around for the years of coordination that followed a 2010 FabLab demonstration at a convention of the Alaska Federation of Natives. It was the meeting of minds between tribal council members of Togiak and people from the University of Alaska and University of Illinois that got the ball rolling on a grant proposal that would finally get funding in 2014. I was asked to take the lead on ordering the equipment and materials to fully stock a lab within our budget, and finally to fly out and unbox, assemble, and test all the equipment.




After flying from Chicago to Anchorage on a 737, I took a little Saab 340 turboprop (where they hand out earplugs instead of snacks) for the 90 minute trip to Dillingham, and finally, one of the daily cargo planes took me over the Alaskan marshes to Togiak. I got to peek over the pilot’s shoulder on that one. Just me and a bunch of Amazon Prime packages in the back.

The space was perfect for the lab: the backbone of a traditional canoe hung from the ceiling, and the building was situated in the middle of downtown, right between the post office and the general store. At first it was packed full of dozens of boxes large and small that had been shipped over the past few months, but over the course of the first week it took the shape of a computer lab. Local handymen built sturdy hardwood tables and stools, and other furniture was rummaged from the local ‘Office Depot’ — an abandoned school down the street that was full of tables and chairs and other office supplies, open for the picking if you’re friends with the mayor.




Before it was a FabLab, this space was occupied by the local Boys and Girls club, so groups of kids knew it well and would knock on the doors after school. Though it was often a nuisance to supervise a bustling building of middle schoolers, I was glad to teach them how to use the sticker cutters and 3D printers — they showed up almost every day wanting to make something and I’m not the kind of person that says ‘no’ to people who want to learn.

But sometimes they were interrupting and just wanting to play games. Once when it was getting out of hand, a young girl noticed that I wasn’t as bossy as I was 5 minutes ago, and she hypothesized, “You’re one of those people that just gets quiet when you’re upset, huh?” And I nodded my head yes and she decided to take over for me and raised her voice to tell people to clean up after themselves. Keep taking command, Piccola!

Another night, I finally met some of the older kids that were going to high school there. They were quiet, but picked up skills really quickly. It was crowded when it was getting close to curfew, and a police officer stopped in to see why so many people were crowded in the building. After a quick tour of the lab he revealed he went to art school in Phoenix before getting a job as a policeman in Alaska, and he said he hoped he would get to use the equipment too.




Consider this a draft. I want to sit down and flesh this idea out, but:

An investment in fabrication equipment may be an equalizing factor: equipped as well as the electronics hobbyists and inventors in Shanghai or Chicago, rural areas can have the same opportunities as dense urban ones.

A big idea behind establishing this lab is that it may allow locals to develop skills they can be paid for as remote workers. Instead of learning a skill and getting hired away from town, locals could bring income into the community by working from the lab.

Some replacement parts for salvaged work tables. Feet and cable guides.


Digital embroidery was really popular. I love setting up classrooms with sewing machines. A couple of kids in any group have likely dealt with threading a sewing machine so they can become helpers very quickly.




Some fun with broken glass:


Playing with broken glass: Laser etchings make ghosts over Togiak

A picture we received a few months after setting up the lab: a village elder’s portrait laser etched into a whale bone.


A couple of examples of Bristol Bay locals using 3D scanning and printing before this lab was set up. Bones from a whale carcass are being meticulously scanned in full color with a NextEngine laser scanner. A machinist explained to me how useful it is to print a block of plastic to test dimensions. The plastic is much easier to machine or just sand down until it fits than using a block of steel. Once he’s happy with the fit, he can machine it manually from a block of steel.



Simple, free screen capture with FFMPEG

Once you have ffmpeg installed and added to your PATH (see my previous guide), screen capture with audio is simple. Well, hopefully it’s simple following these directions, it took a while for me to figure it out.

On Windows 7/8/10

On Windows, ffmpeg uses a capture device called ‘gdigrab’ to do screen capture, and ‘dshow’ to capture audio and video input devices. Open your powershell, and type the following command to find out what your audio device is called.

ffmpeg -list_devices true -f dshow -i dummy

You should see output like the following. Notice that in the sound settings (you can get there by right clicking your volume icon on the taskbar and listing Playback or Recording devices), two devices are listed. Unless the device is plugged in and ready to go, ffmpeg doesn’t see it. It also truncates the name, so “Microphone (High Definition Audio Device)” becomes “Microphone (High Definition Aud” — it’s this exact string, quotes and all, as returned on the command line, that matters for the audio capture command.



Once you know what your audio device is called, you can stream it as input alongside your screen capture. First, though, I highly recommend changing your screen resolution to 1280 x 720 or 1920 x 1080, the high-def resolutions that youtube supports. This will simplify the transcoding process and result in a sharp video (as opposed to asking FFMPEG to downsample my 3000 x 2000 screen, which would screw up the aspect ratio; it’s much simpler to just record a screen already set to a useful HD resolution). When you’re all set to record, run this command (with your audio device name after -i audio=):

ffmpeg -f gdigrab -i desktop -f dshow -i audio="Microphone (High Definition Aud" -vcodec libx264 YOUR_NAME_HERE.mp4

That runs ffmpeg with the desktop as one input (-i), and the microphone as the second input (-i). The vcodec libx264 flag uses the h.264 mpeg encoder, which I found necessary to get a sharp, unpixelated video.

If you have no interest in recording audio, you can omit the second input device, and your command will look something like:

ffmpeg -f gdigrab -i desktop -vcodec libx264 YOUR_NAME_HERE.mp4

If you’re counting pennies and want to limit your file size, using the framerate flag (-framerate) to grab 5 fps or 10 fps is a great way to do so (it’s an input option for gdigrab, so it goes before -i desktop). Lowering your screen resolution is another option to limit your file size. The official documentation has a million options to peruse.

ffmpeg -f gdigrab -framerate 10 -i desktop -vcodec libx264 YOUR_NAME_HERE.mp4

Once you’ve started recording, your powershell will fill with status updates and error messages. You can minimize it while you work. When you want to stop recording, bring the same powershell back up and hit ‘Q’ on your keyboard to end the capture. Powershell may ‘hang’ (aka lock up) for a moment or two while it finishes processing. The video will now exist as the named mp4. Oh, and this happens in whatever directory powershell is focused on, unless you specify a full path for that mp4 (ex. C:\Users\FabLab\Videos\vid.mp4).


After hitting ‘q’ to quit (you might hit it twice if it doesn’t respond) you’ll see this:


“exiting normally” — good sign! Go into the directory you ran the command in (C:\Users\Colten Jackson for me) and find your video. Should be ready to upload to youtube right away!

PomPomBots: Creative Robots for ages 8+ (To be Continued…)

Contributions by Virginia McCreary, Judy Lee, and more FabLabbers

With a “Plushy Robots” all-girls summer camp coming up, Team FabLab was tasked with creating a way for each camper to go home with a robot they programmed themselves, with just a couple of constraints: it had to be cheap, so we could make 10 of them, and it had to be made of soft materials, since we promised Plushy Robots.

Sketching out different ideas of what could be done with Arduino’s basic servo motor program, I vividly remember Judy (@judieelee) exclaiming in a wide-eyed eureka moment: “Pom Pom Bots”




Since then, we’ve run this activity with hundreds of kids and seen every one of them get comfortable with plugging in the wires of the servo and changing the speed and movement of that servo by adjusting Arduino code. What follows is a lesson plan detailing the steps to run this activity yourself.

Setup & Introduction

If you’re not familiar with poms, they’re the colorful soft balls you can buy in big bags at your local craft store. Together with some googly eyes and hot glue, you can make some pretty adorable creatures. Once you add felt and fuzzy sticks (pipe cleaners) to the mix, the sky’s the limit.



The materials per student for this activity include:

  • Arduino Uno with USB cable
  • Microservo (9 gram)
  • 3 jumper cables

The materials you should have piles of for students to share include:

  • PomPoms of many sizes and colors, Fuzzy Sticks, Felt
  • Googly eyes
  • Scissors for fabric, wire snips for fuzzy sticks.
  • LOW TEMP hot glue guns

In addition to these materials, you will also need a nearby computer lab (we used laptops) to program the Arduinos. This activity involves constructing the robot and programming the robot. How you set up your space will depend on how many participants you have and how many helpers you have.

We had three 8 foot tables for hot gluing and pipe-cleaner-sculpting and two 8 foot tables set up with 10 laptops for a group of 20 kids. We had 6 instructors (sometimes a couple more volunteers) for 20 kids, which was great. I can’t recommend going into an Arduino-programming activity without at least one instructor per 2 or 3 computers — it’s really important to guide students through the lesson on an individual basis. The success of this activity has come in large part from instructors being familiar enough with Arduino code to spot typos and common errors right away. Learning perseverance in debugging can wait for another day — we’re trying to get these kids comfortable with programming, to be able to see themselves as programmers, and staring at a screen for more than a few minutes without knowing what’s wrong can be very discouraging.


Start the class by gathering students around a table with some example bots you built yourself. Have the bot running so kids see the result of the activity and get their gears turning. We often use this time to egg the kids into a conversation on robots in general. You get to ask kids, “What is a robot, anyway? Is it a machine that does something for us? At what point does a machine become a robot? Does it have to make decisions for itself? What’s the difference between following instructions and making decisions?”

I encourage you to ask kids what they think of robots and what they think the future holds any chance you get; you’ll be surprised with what they come up with. My favorite exchange that’s happened during this activity:

At what point would you call a computer intelligent?

“When it makes you happy”

Whoaaaa.



The Occasional Inspiration






An antipodist spinning a parasol with a pair of puppets on her feet. The puppets’ expressions of excitement and frustration were controlled by the juggler’s toes. A cowgirl on a pogo stick whose lasso doubled as her jump rope.




One minute the ringmaster is coaxing cupcakes and cookies from children in the front row, and the next she accepts a beer from the crowd (“Can you keep it closed for now? I’m still working on this one.” she tells them.)

Soft Circuit Swatches with Surface Mount LED



Creating small electronics out of uncommon materials is a hobby of mine. I think it’s fun to show that electronics isn’t limited to circuit boards, and computers don’t have to be stuffed inside boxes. Especially when FabLab is exhibiting for school-age kids who might change their minds about whether they’re interested in CS and ECE, I’m happy to show off a wide variety of computers, usually various Arduino-powered objects.

Aside from having interesting widgets to show off, I mostly enjoy the challenge of building something with materials I haven’t seen used before. Instead of sewing LEDs on top of fabric, I wanted to integrate the LEDs into the fabric, so I used a hand loom to make a patch of fabric, using stainless steel thread as my weft. Knots have never made sense to me, so setting up the loom was especially challenging.

I really can’t think of anything to do with this, so for now it’s just a tech-demo. I’d love to hear your ideas.





I really appreciate what Adafruit and Sparkfun do to make small electronics available in sew-able packages (such as the Flora and Gemma boards from Adafruit, and the LilyPad from Sparkfun), but I never really liked that the components were still bonded to a chunk of fiberglass circuitboard — it limits the flexibility of the fabric where it’s sewn and it looks bulky. With that and the extra expense of those products in mind, I wanted to come up with a strategy that allows surface mount components to be embedded into a flexible material. SMD is the cheapest package for many parts — we order these 1206 size LEDs at 2 or 3 cents apiece on eBay.

Stranded copper wire soldered directly to these LEDs is sewn into leather which was lasered to punch holes and provide guides for the wire. This is just a test swatch; I have it in mind that lasering the holes and traces like this would provide a way to wire fairly complex 2-layer circuits (since traces could cross to opposite sides of the leather to avoid intersecting). The parts can sit flush with the surface of the leather, and the leather can flex without dislodging the parts — they aren’t bonded to the substrate, just pulled against it by the copper stitches. This also makes it surprisingly repairable. At one point I melted an LED after stitching everything together, and it only took a minute to un-solder it and drop in a fresh LED.








For the leather circuits, I’m looking forward to trying to do 7-segment displays with the same LEDs, but embedding the microcontroller and battery holder directly into the material. I’m imagining interesting tabletop radios and clocks that exhibit their circuitry on their surface.

A quick demonstration of both circuits:

A night on the edge of Paolo Soleri’s arcology.

OR, I visited utopia and just have some cameraphone pictures to show for it.

Mediterranean Cypress trees make up just as much of the skyline as Arcosanti’s poured concrete habitats. I love it so much.

I don’t remember where I first heard about Arcosanti, but at some point I came across Paolo Soleri’s drawings: reimaginings of what cities could look like. I read about his philosophy of culture as an emergent property of architecture: that it is the tight integration of live-work spaces and the multitude of possible pathways that exist in urban settings that give rise to creative exploits — in the arts as well as politics and individuals’ organization of their own lives.

I looked at masterful drawings of megacities that reminded me of my own childhood drawings of space colonies. I was into sci-fi and the prospect of interplanetary civilizations, or merely humanity on the run in giant tin cans lined with pasture and crops lit by artificial skies. But Soleri’s visionary megacities were earthbound, situated on cliffs, in deserts and other harsh environments. More than just drawings tho, he actually went and built one. An experimental city in the Arizona desert, halfway between Phoenix and Flagstaff. I visited in September of 2014.

Paolo Soleri was a student of Frank Lloyd Wright. Student. Even though he had already earned his Ph.D in architecture in Italy, he became an apprentice at Taliesin West, Frank Lloyd Wright’s desert school of architecture. FLW is highly regarded for his blending of the outdoors with indoors, creating homes that echoed their landscape and brought copious sunlight and scenery into living spaces. But he also had a less highly lauded vision, one that celebrated personal freedom afforded by the personal car — a new machine invented and popularized within the span of FLW’s career. “Broadacre City” sought to give everyone their personal space. An illuminating critique can be found on PaleoFuture. Apparently, Frank Lloyd Wright considered Urbanism a scourge on humanity.


His apprentice Paolo Soleri, however, considered Suburbanism a scourge on nature, and didn’t think it did much good for people, either. When I visited Arcosanti last year, the spaces he built made a point. You don’t have to give up the feeling of wide open space to live in an urban environment. Soleri’s vision was for humans to build their steel and concrete all in one, uninterrupted mass — and outside of that, leave nature uninterrupted as well. Whether walking along the edge of the city, having a meal in the cafeteria, or getting into bed in one of the guest rooms, there is always an unending view of wilderness, from the city’s edge to the horizon. Just as Frank Lloyd Wright managed to integrate the landscape with the interior of homes, Paolo Soleri brought wilderness and sunlight into the design of an entire city.

Jessica and I climbed on top of a concrete dome (the roof of a workshop) and talked as we watched desert thunderstorms roll by in the distance. There was absolute quiet here. Instead of the droning sounds of cars and trains and climate control, the night air was punctuated by distant laughter, or the occasional sneeze. The architecture had invited us to climb onto roofs by virtue of how smoothly the sidewalk transitioned onto them — it was all poured concrete, and nothing really denoted public property from private property. The apartments opened up to balcony-like spaces that were accessible from the sidewalk just as well, which gave the city a strong sense of horizontalism and equality. We were all guests on someone else’s creation.


The main economy of Arcosanti at present is giving tours of the city and the casting of artistic bronze bells. There’s a poetry to the bells’ sand casting process: it is essentially a miniature version of how many of the city’s major structures were poured.



An economy run on bronze bells doesn’t bode too well for the city’s population of 50–60 artists and architects, but I think Soleri’s vision still holds a lot of weight. His drawings often include “automated production” deep in the guts of the megacities, and I like to imagine what that would look like in the Hackerspace/FabLab context of humans partnering with robots to generate the stuff we enjoy and consume. Take the concrete planters outside the guestrooms: iron rails on the edge, just begging for a CNC gardening machine to zoom back and forth every hour checking for crop growth and disease. Perhaps an atmosphere of invention and experimentation could bring more tourists through the city, and perhaps the production of more than just bells (beautiful as they are) could start an economy and a culture that starts nudging the 1970s monument towards its visionary goals.


Drawings of Kowloon Walled City (via Spoon-Tomago) vs drawings from Soleri’s The City in the Image of Man




Building Brackets around the C-LEG prosthetic

This is Part Three. See: Part One, Part Two

Started out with some photogrammetry to capture the geometry of the C-LEG, which will hopefully allow me to 3D print a bracket that fits the contours of the C-LEG precisely.

This first scan was enough to play around with, but ultimately the glossiness and the bright sunlight caused enough gaps and distortions that I had to do a photoshoot later that night using our CNC machine as a light box. The even lighting from the LED rope was just the trick.

The next step was selecting a portion of the C-LEG’s surface to extrude into a form-fitting shell. Blender was used to create a mirror image of the scan, and MeshLab was used to align the two sides and fill in the holes so I had a reconstruction of the entire C-LEG (Agisoft was only able to reconstruct one side of it — I could have gone back and tried another photoshoot, but decided it would be faster to just duplicate the half that worked). In the video you can see the mesh of the whole C-LEG next to the original scan.

Blender and MeshLab were used back and forth here: Blender allowed me to select a portion of the mesh freehand and export it as a separate STL. MeshLab allowed me to offset this surface using ‘Uniform Mesh Resampling’ and then construct a volume around the surface using Uniform Mesh Resampling with ‘Absolute Distance’ checked. This created an excessive and messy edge, however, so I brought it into Blender to perform a boolean intersection, extruding the surface that I selected earlier outward to overlap with the portion of the new mesh that I wanted to keep. With that cut performed, I used MeshLab one last time to perform a ‘Surface Reconstruction: Poisson’ to smooth the corners. To cut a slit in the back of the model I used Tinkercad, because it’s quicker to align and subtract a cube, knowing what I know.
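
That cut was done through Blender’s UI, but the boolean modifier is also available from python if you’d rather script it; a minimal sketch, where the object names are placeholders for the extruded surface and the trimming volume:

import bpy

# placeholder names: "shell" is the extruded surface to keep,
# "cutter" is the volume used to trim away the messy edge
shell = bpy.data.objects["shell"]
cutter = bpy.data.objects["cutter"]

# add a boolean modifier set to INTERSECT and apply it
mod = shell.modifiers.new(name="trim", type='BOOLEAN')
mod.operation = 'INTERSECT'
mod.object = cutter
bpy.context.scene.objects.active = shell
bpy.ops.object.modifier_apply(modifier=mod.name)

# export just the trimmed shell as an STL for printing
bpy.ops.object.select_all(action='DESELECT')
shell.select = True
bpy.ops.export_mesh.stl(filepath="bracket_shell.stl", use_selection=True)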

And it actually clipped on the way I had hoped, wrapping around the edges — but there was a considerable gap. The inner diameter of the print was 60mm, while the C-LEG is 55mm wide, so I uploaded the STL to Tinkercad at 91% of the original size to continue with prototype #2:

I used some cylinder, cube, and hexagon shapes to throw together clamps that I can add nuts and bolts to for this print, to see if I can really clamp down on the C-LEG enough to hang some weight off of it.

Ended up printing copies at 93% and 96% of the original size. It is not a perfect fit, but once tightened down with bolts it holds on pretty well. This one cracked due to the nut turning against the plastic — the white ABS must have shrunk more than the grey ABS, which had holes big enough for the nuts to sink into without forcing them.