I’ve been computerizing different patterns I learned in my geometry class this spring, but this design is the first original work I can put my name on. I wanted to illustrate how the complex triangles in the lowest triangle are derived from the “flower of life” in the top triangle.
In the top left, I’ve highlighted the triangles that are inscribed by circles; in the top right, I’ve highlighted the triangles between the circles’ midpoints. On the lower left and right, you can see the smaller triangles are rotated 30 degrees from the larger ones. The bottom triangle is the intersection of both of these patterns. In the video below you can see the Inkscape operations used to intersect and arrange these divisions.
The SuperHex tessellates into a visually dense wallpaper. I’m excited to see what it looks like as a fabric, so I’ve uploaded it to spoonflower.com. Once I receive the sample I’ll be able to put it up for sale, the first of many designs I hope!
I always wanted to make a globe. This is the “AirOcean World Map”, a design by Buckminster Fuller and cartographer Shoji Sadao, which aims to depict earth as “one island in one ocean”, and distributes geographic distortion so no country appears to be much larger than it really is.
I decided I could make my own after I noticed a familiar shape while practicing a simple sacred geometry pattern of tessellated equilateral triangles. Once the first grid of triangles is made, 3 further cuts are made across the grid, from each point of a triangle to the opposite midpoint, dividing each triangle into sixths. That’s when I was reminded of the Dymaxion map, with a unique subdivided triangle on one of its edges. (You can see a better version of this grid in my blog on SuperHex.)
Here’s the time lapse of sketching the coastlines… I felt like a real Slartibartfast if you know what I mean.
As time goes by, I find myself interacting more often with giant numbers encoded in hexadecimal strings that are hard to tell apart. There are git commit hashes, API keys, magnet links, public/private key pairs, transaction IDs, and with IPv6, even IP addresses are 128-bit numbers written as hexadecimal strings.
Random number generators and hash functions used for unique IDs share two useful qualities:
With a large enough bitspace, the resulting hash can be assumed to be universally unique (the chance of collision is infinitesimal)
The output is uniformly distributed: if the input changes by even one bit, the output will be drastically different.
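Both properties are easy to demonstrate with Python’s hashlib: the two inputs below differ by a single bit, yet roughly half of the 256 output bits flip.

```python
import hashlib

# Two inputs that differ by exactly one bit ('d' is 0x64, 'e' is 0x65)
a = hashlib.sha256(b"hello world").hexdigest()
b = hashlib.sha256(b"hello worle").hexdigest()

print(a)
print(b)

# Count how many of the 256 output bits differ: close to half, as
# expected for a uniformly distributed hash function.
diff = bin(int(a, 16) ^ int(b, 16)).count("1")
print(diff, "of 256 bits flipped")
```

The same avalanche behavior is why two adjacent git commits get totally unrelated hashes.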
However, while it’s easy enough to look at a list of git hashes and remember the first few letters while you’re checking out commits, you’ll forget them as soon as you have to remember the next one. Hexadecimal numbers are unique, but they all look the same.
In the interest of making the distinction between large numbers more visual, I started using geometric tessellations, where I saw a similarly infinite range of possibilities. In this case, with 5 × 24-bit colors (between #000000 and #ffffff) and 8 bits to encode edge thickness, you get a large array of visually distinct patterns you could use as header or background images, and this is just with the same 12-pointed star. There are a very large number of possibilities with 4-, 5-, and 6-fold symmetries and different choices of when to use space and when to alternate stars…
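To make the bit arithmetic concrete, here’s a sketch (my own illustration, not the generator’s actual code) of unpacking 128 bits into five 24-bit colors plus an 8-bit edge thickness:

```python
import secrets

def unpack_mosaic_bits(n):
    """Split a 128-bit integer into five 24-bit hex colors and an
    8-bit edge thickness: 5 * 24 + 8 = 128 bits total."""
    thickness = n & 0xff           # low 8 bits encode edge thickness
    n >>= 8
    colors = []
    for _ in range(5):             # five 24-bit color channels
        colors.append("#{:06x}".format(n & 0xffffff))
        n >>= 24
    return colors, thickness

n = secrets.randbits(128)          # "flip 128 coins"
colors, thickness = unpack_mosaic_bits(n)
print(colors, thickness)
```

Every distinct 128-bit input yields a distinct combination of colors and thickness, which is exactly what makes the patterns usable as a visual fingerprint.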
You can try out randomly generated mosaics here by clicking on the “flip 128 coins” link. That’s one coin for each bit of information encoded by the pattern! The URL is updated each time so you can share a pattern, and you can download the SVG ready to use as a background image.
On Twitter, Medium, and Instagram, everyone’s feed looks the same and I can never remember where I saw some piece of information. Maybe by allowing a geometric motif generator to customize a page, or just hashing the contents of a page into a mosaic, we can provide a more visual signal that something on a page changed since you last looked at it, and a way to jog your memory when you do see the same pattern.
This project came out of writing my own framework to create components that each open some file on disk, whether it was Markdown, source code, or media files. The priority was to make it easy to extend components by adding “actions” and “reactions” to a dropdown menu for each component.
In the video, I open a ‘library’ component that knows how to render icons and metadata from the filesystem. Then I open a file in a textarea component, which is the prototype for a component that renders markdown and another that becomes a code editor (using the showdown and codemirror libraries).
Source code for the proto component at the top of the inheritance tree can be found here and implements all the lifecycle callbacks provided by custom elements. For CodeMirror, the actions and reactions just allow updating the key binding to vim and changing the syntax highlighting, but options to download, overwrite, delete, and toggle word wrap are provided by the TextArea component.
Conda is an environment manager similar to virtualenv, but it features automatic dependency resolution that has made it a joy for me to use. To get a python environment set up with conda, it’s only a matter of running something like:
conda create --name whateveryouwant python=2.7 <a list of modules you need>
You can then enter this environment at any time by calling source activate whateveryouwant and then run scripts from there. You can also print a “requirements.txt” file that locks in the dependencies and versions so that the next person running your script can reproduce the exact environment on their system — which would ease a lot of pain in trying to run random python examples off the internet.
So what do I need CondaVision for?
But what if I don’t want to run scripts interactively? I have Python scripts that I want to execute as a web API, but they all require different versions of Python and Python packages, so I needed a way to automate creating environments at runtime.
Another tool aims to solve the same problem, but it introduces a new syntax to declare dependencies as a comment inside python files, and I couldn’t get it to recognize that I needed a Python version less than 3, so I went ahead and wrote a bash script that does the following:
Scan through files in PYTHONPATH to get the names of modules that might be defined locally
Combine that with the list of modules Python includes to get a list of modules I don’t want to ask conda to install (conda create will throw an error if you say you need the built-in sys module, for example)
Perform a regex on the python file I want to execute (and all python scripts in PYTHONPATH) to extract all the modules required
Compare the results of the regex with the list in step 2 to get a new list of modules I need to ask conda to take care of for me
Create a hash representing the combination of modules so I can compare it with environments that were created earlier
If there is no existing conda environment that matches that hash, create one
Then activate the necessary environment and, in that environment, execute the script
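The actual script is bash, but the core of steps 3 through 5 (import extraction and environment hashing) can be sketched in Python. The regex is simplified, and sys.builtin_module_names only covers the compiled-in builtins, not the whole standard library, so treat this as an outline of the logic rather than the real thing:

```python
import hashlib
import re
import sys

# Matches "import foo" and "from foo import bar" at the start of a line
IMPORT_RE = re.compile(r"^\s*(?:import|from)\s+([A-Za-z_][A-Za-z0-9_]*)", re.M)

def required_modules(source, local_modules):
    """Steps 3-4: extract imported module names, then drop local modules
    and built-ins that conda shouldn't be asked to install."""
    found = set(IMPORT_RE.findall(source))
    return sorted(found - set(local_modules) - set(sys.builtin_module_names))

def env_hash(modules):
    """Step 5: a stable hash for this combination of modules, usable as
    an environment name so identical requirements reuse one environment."""
    return hashlib.sha1(" ".join(modules).encode()).hexdigest()[:12]

source = "import numpy\nfrom pandas import DataFrame\nimport sys\nimport mylocal\n"
mods = required_modules(source, ["mylocal"])
print(mods)              # modules conda should install
print(env_hash(mods))    # deterministic environment name
```

Because the hash is derived only from the sorted module list, any two scripts with the same dependencies map to the same environment name, which is what makes step 6’s reuse check possible.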
I really like using the fetch API packaged in evergreen browsers, but I was getting annoyed with having to set the credentials, redirect, and method options all the time. Plus, it always takes a bit of code to format the querystring correctly, and it’s annoying to set headers and stringify JSON objects that I want to PUT to my server. So I wrote Kvetch. The first argument is the URL, which gets passed directly to fetch. The second argument is an optional query object. This can have as many key-value pairs as you want; it just gets URI-component encoded, joined with &s and =s, and appended to the URL after a ‘?’ (so don’t put the ? in yourself). The third argument is the body, a.k.a. the request payload. It can be an object, a string, an ArrayBuffer (i.e. binary data), or FormData.
A lot of people use axios, request, or other libraries on npm, but I didn’t want to add extra features and a bunch of dependencies; I just wanted to avoid repeating myself while using the native API.
If you need it to work on browsers without fetch, just bring in a fetch polyfill and define that first; kvetch will use window.fetch just fine.
You can leave the QueryObject and the Body blank if you don’t need them.
You can pass a falsy argument (null, undefined, etc.) as a QueryObject if you only need a Body.
If you give an Object as a Body, it will be JSON stringified and sent with an application/json Content-Type. If you send FormData (including files), the body is handed directly to fetch, which figures out what to do. If you pass a string, it will be sent untouched with a Content-Type of text/plain. ArrayBuffers get sent with application/octet-stream, but I haven’t actually tested this and don’t know if it’s appropriate.
The purpose of this script is to provide an interface for data scientists to interact with parameters submitted via an HTML form. By defining a schema at the top of a python script, the named parameters will either be of the correct type and validated against a regex, or the program will fail at the validation step. The schema can also be returned as JSON so a consumer of an API can understand what is required.
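As an illustration of the idea (the schema format and field names here are hypothetical, not the script’s actual syntax), fail-fast validation might look like:

```python
import re

# Hypothetical schema: each named parameter declares a type and a regex
SCHEMA = {
    "email": {"type": str, "pattern": r"^[^@\s]+@[^@\s]+$"},
    "age":   {"type": int, "pattern": r"^\d{1,3}$"},
}

def validate(params):
    """Check every form parameter against the schema, raising on the
    first failure so the script never runs with bad input."""
    clean = {}
    for name, rules in SCHEMA.items():
        raw = params[name]                      # a missing key raises KeyError
        if not re.match(rules["pattern"], raw):
            raise ValueError("invalid value for %s: %r" % (name, raw))
        clean[name] = rules["type"](raw)        # coerce to the declared type
    return clean

print(validate({"email": "ada@example.com", "age": "36"}))
```

Since the schema is plain data, it can also be serialized as JSON for the API consumer, as described above.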
It also provides methods to read from and write to MySQL or PostgreSQL, and to write results to AWS S3 buckets via Boto 3.
A TODO for this project is to create an endpoint that builds the web form based on the schema of any requested script, since the programmer will have already defined the input type and name attributes as part of the schema.
This was written back when Python 2.7 was cool, so I’ll have to update it to replace ConfigParser and StringIO with the modern equivalents.
And for today’s episode of “I don’t know why it’s not working for you, it works on my system!” I was just testing this for the first time in 2 years and the files it created were empty. Looks like when running python on Windows using the io module, I have to explicitly call valid.file.close() for the file to finish writing. I’ll have to test and update the rest of my example files.
Worse than that is having to log into wordpress and edit a blog post to make adjustments to the code — once code has been pasted into a blog, when will you notice that it has a typo?
I was happy to find the WordPress 5 compatible plug-in called Gist Github Shortcode. I’m surprised it only has a few hundred active installations, because it works great except for one thing: the styling on line numbers was broken! The line number is a :before pseudo-element, and the gutter that is supposed to contain it is table data (<td>), but the pseudo-element got painted outside of the layout “flow” and gets clobbered by the source code, which is painted right on top.
After fiddling with the display rules for a while, I found that applying display: flex to the table row expanded the gutter to almost fit the number; I then had to apply position: relative; right: 0.5em; to shift the number back to the center.
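Something along these lines (the selectors here are placeholders; the exact class names depend on the plugin’s markup):

```css
/* Expand the gutter cell so the :before line number fits */
.gist tr {
    display: flex;
}

/* Shift the line number back into the gutter */
.gist td:first-child::before {
    position: relative;
    right: 0.5em;
}
```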
I added this CSS snippet to the “Additional CSS” form on Dashboard → Customize Your Site and was good to go. Here’s a live embedding of the gist shortcode with the style applied:
Thankfully, since the plug-in itself has a page on GitHub, I was able to report issue #5 to the maintainer. Looking at the changelog, I just now realize this library hasn’t required an update in 5 years, and still works out of the box! Being a wordpress plug-in that loads content with the GitHub API, both sides of these dependencies have remained stable enough over the last 5 years that old, useful software still works. Cheers to that!
I’ve always loved holograms, those virtual objects intermingled with the real world in so many of my favorite films, from Star Wars to Zenon: Girl of the 21st Century. But I wasn’t looking forward to learning Unity game development and Windows system APIs before I could make my first animation for the hololens platform.
So while I still have to dig into Windows guts and Direct3D to make interactive applications, I was excited to find a straightforward way to generate hololens-compatible animations with just a few lines of python.
Using the python embedded in the open source Blender project, we can read files, make database queries, do whatever data science we want to do, and then import meshes or generate cubes, apply colors and materials, define keyframes, and export an FBX file that can be displayed in Hololens or on web pages. Hololens lets you open multiple animations and place them around the room, so while I can’t define any gaze-and-click behaviors, I can still move and scale and rotate my animation in the mixed-real world.
The first step, of course, is to head to blender.org and download the latest version. I’m going to be really verbose and try to help out people on both Windows and Mac, because I had to figure it out on both myself. Follow all the default options, and on Mac move blender.app to /Applications like you normally do. From here, you can check out a million youtube videos on what Blender is for (that’s how I learned it!), but next we’re going to add ‘blender’ to our path so we can execute it from the command line anywhere in our system without even having to learn how to use the GUI.
For MacOS, you can run this in your terminal, or add it to your ~/.bash_profile:
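A line like the following works, assuming Blender.app was moved to /Applications as described above (the blender binary lives inside the app bundle):

```shell
export PATH="/Applications/Blender.app/Contents/MacOS:$PATH"
```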
Now anytime you open your terminal/powershell, you can type ‘blender’ to launch blender. (On MacOS, you may need to restart your shell for it to take effect, or type ‘source ~/.bash_profile’ to force bash to reload your saved preferences, including the path.) (Another option for Windows is to search for ‘edit system variables’ from the start menu, click ‘environment variables’, and use the GUI to add C:\Program Files\Blender Foundation\Blender to your Path.)
But going through all that trouble just to open Blender isn’t the point. The point is that now we can save python files and pass them to Blender as a command line argument. Before we do, though, there’s one last step: we have to re-save the start-up scene to be a blank slate, otherwise our animations will all have a cube sitting in the middle of the scene (Blender’s default start-up scene).
So open Blender, hit Select → (De)select All, then Object → Delete, and confirm (or just ‘a’, ‘x’, ‘Enter’ if you want to use keyboard shortcuts). Next, click File → Save Startup File. We’re finally ready to generate some animations with python!
Save the following code as helloworld.py, navigate to the folder you saved it in using the terminal, and type:
blender -b -P helloworld.py
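A minimal helloworld.py in the spirit of the steps described above (add a cube, keyframe it, export FBX) might look like this. It’s a sketch, not the original listing, and it only runs inside Blender via the command above, since the bpy module isn’t available to a standalone Python:

```python
import bpy

# Start from the (now empty) startup scene and add a cube at the origin
bpy.ops.mesh.primitive_cube_add(location=(0, 0, 0))
cube = bpy.context.object

# Keyframe a simple spin: frame 1 at 0 radians, frame 60 at a full turn
cube.rotation_euler = (0, 0, 0)
cube.keyframe_insert(data_path="rotation_euler", frame=1)
cube.rotation_euler = (0, 0, 6.28319)  # ~2*pi radians
cube.keyframe_insert(data_path="rotation_euler", frame=60)

# Export the animated scene as an FBX next to where Blender was launched
bpy.ops.export_scene.fbx(filepath="helloworld.fbx")
```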
And you should see Blender printing out a load of information about what’s happening, then it should create an fbx file and exit.
Check out the inline comments for a little bit of understanding of what the code is doing, and I’ll be writing more example code in future blogs.
And of course, the answers to all your questions are in the docs!
To make it easy to deploy these to Hololens, I just drop them in the onedrive belonging to the account I created with the hololens. There’s a python API for uploading direct to onedrive, too, so I’ll probably explore that in the future.
Optional: import interesting python modules like numpy and pandas and all the rest.
Blender comes packaged with its own python executable hooked into all of Blender’s guts, so to install any modules that aren’t built into Python3 we need to run get-pip.py with Blender’s python. First, download get-pip.py; I’ll put it on my desktop. Then, navigate to blender’s python executable.
command on windows:
cd "C:\Program Files\Blender Foundation\Blender\2.78\python\bin"
command on MacOS (assuming Blender was moved to Applications):
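Assuming the default bundle layout, the command would be along these lines:

```shell
cd /Applications/Blender.app/Contents/Resources/2.78/python/bin
```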
(Note that 2.78 is the blender version, not the python version.) Run ‘ls’ and take note of the python executable’s name. On windows it was just python.exe; on mac it was python3.5m. Once you’ve cd’d into python/bin:
Install Pip on MacOS:
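Using the executable name noted above (and assuming get-pip.py landed on the desktop), that would be something like:

```shell
./python3.5m ~/Desktop/get-pip.py
```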
For Windows, I had to include the full path of get-pip.py. This also requires administrator privileges, so run powershell “As Administrator”:
./python "C:\Users\Colten Jackson\Desktop\get-pip.py"
If that exits successfully, you can run ‘pip install pandas’ and whatever python modules you want to use from within Blender. On Mac, pip added itself to my path and I could use it right away, but on Windows I had to reference the pip.exe to run it from the python directory, so it ends up looking like this (again as Admin so pip can save files to disk; your permissions may vary):
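Something like the following, though the exact location pip.exe was installed to may vary on your system:

```shell
./Scripts/pip.exe install pandas
```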
Let me know what questions you have, though of course I would appreciate it if you googled it first!