A review of graph / node based logic declaration through Blender

Recently, Blender started their Everything Nodes project. The first output of this project is their fantastic geometry nodes system (which debuted in Blender 2.92, and is still under development), which allows the geometry of a mesh (and the materials it uses) to be dynamically modified to apply procedural effects - or even to generate entirely new geometry!

I've been playing around with and learning Blender a bit recently for fun, and as soon as I saw the new geometry nodes system I knew it would enable powerful new techniques. In this post, I want to talk more generally about node / graph-based logic declaration, and why it can sometimes make a complex concept like modifying geometry much easier to understand and work with efficiently.

(Above: Blender's geometry nodes at work.)

Manipulating 3d geometry carries more inherent complexity than its 2d counterpart - programs such as Inkscape and GIMP have the latter pretty much sorted. To this end, Blender supplies a number of tools for editing 3d geometry, like edit mode and a sculpting system. These are powerful in their own right, but what if we want to do some procedural generation? Suddenly these feel far from the right tools for the job.

One solution here is to provide an API reference and allow scripts to be written to manipulate geometry. While Blender does this already, it's inaccessible to those who aren't proficient programmers, and large APIs often come with a steep learning curve (and a higher cognitive load). It can also be a challenge to "think in 3d" while programming (I know that when I was doing the 3d graphics module at University, this took some getting used to!).
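To illustrate what scripted geometry manipulation tends to look like, here's a minimal sketch in plain Python. Note that this is *not* Blender's actual `bpy` API - the mesh representation and the `displace` function are invented for illustration - but it captures the "think in 3d" flavour of working with vertex data in code:

```python
import math

# A hypothetical mesh, represented simply as a list of (x, y, z) vertex
# tuples on a flat 4x4 grid. Blender's real scripting API (bpy) is far
# larger and richer than this.
vertices = [(x * 0.5, y * 0.5, 0.0) for x in range(4) for y in range(4)]

def displace(verts, amplitude=0.25):
    """Push each vertex along the z axis by a sine wave over x -
    a tiny procedural effect, expressed as code."""
    return [(x, y, z + amplitude * math.sin(x * math.pi))
            for (x, y, z) in verts]

bumpy = displace(vertices)
```

Even for a toy effect like this, you have to hold the whole coordinate system in your head - there's no instant visual feedback the way there is with nodes.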

In a sense, node based programming systems feel a bit like a functional programming style. Their strength is composability, in that you can quickly throw together a bunch of different functions (or nodes in this case) to get the desired effect. This reduces cognitive load (especially when there's an instantly updating preview available) as I mentioned earlier - which also has the side effect of reducing the barrier to entry.
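The node-as-function analogy can be made concrete with a short sketch. Here each "node" is a pure function and a node graph is just a chain of them wired together left-to-right - the node names (`subdivide`, `shade_smooth`) are invented for illustration, not real Blender API calls:

```python
# Each "node" takes a value and returns a new one, like data flowing
# through a socket. A graph is then just function composition.
def subdivide(level):
    """Node factory: returns a node that quadruples the face count per level."""
    return lambda mesh: {**mesh, "faces": mesh["faces"] * (4 ** level)}

def shade_smooth(mesh):
    """Node that flags the mesh for smooth shading."""
    return {**mesh, "smooth": True}

def compose(*nodes):
    """Wire nodes together in order, like connecting sockets in the editor."""
    def run(value):
        for node in nodes:
            value = node(value)
        return value
    return run

pipeline = compose(subdivide(2), shade_smooth)
result = pipeline({"faces": 6, "smooth": False})
# result["faces"] == 96, result["smooth"] == True
```

Swapping, reordering, or removing a node is just editing the `compose(...)` call - which is exactly the kind of cheap experimentation that makes node editors pleasant to use.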

Blender's implementation

There's a lot to like about Blender's implementation of a node-based editor. The visual cues for both the nodes themselves and the sockets are great. Nodes are colour coded to group them by related functionality, and sockets are coloured according to data type. I would be slightly wary of issues for colourblind users though - while it looks like this has been discussed already, it doesn't seem like an easy solution has been implemented yet.

This minor issue aside, in the new geometry nodes feature Blender has also made use of socket shape to distinguish between single values and values that can vary per instance - which feels intuitive.

When implementing a UI like this - as in API design - the design of the user interface needs to be carefully considered and polished. This is the case for Blender's implementation - something that only became apparent when I tried Material Maker's node implementation. While Material Maker is cool, I encountered a few minor issues which made the UI feel "clunky" when compared to Blender's implementation.


Blender's implementation of a node-based editor isn't all perfect though. Now that I've used it a while, I've observed a few frustrations I (and I assume others) have had - starting with the names of nodes. When you're first starting out, it can be a challenge to guess the name of the node you want.

For example, the switch node functions like an if statement, but I didn't immediately think of calling it a switch node - so I had to do a web search to discover this. To remedy this issue, each node could have a number of hidden alias names that are also searched, or perhaps each node has a short description in the selection menu that is also searched.
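The if-statement analogy is worth spelling out, since it's the connection I failed to make when searching. A sketch of what the switch node does, in plain Python (the function name and arguments here are my own, not Blender's):

```python
# The geometry nodes "switch" node forwards one of two inputs depending on
# a boolean - effectively a ternary expression made visual.
def switch(condition, when_true, when_false):
    """Pass through `when_true` if `condition` holds, else `when_false`."""
    return when_true if condition else when_false

print(switch(True, "sphere", "cube"))   # → sphere
print(switch(False, "sphere", "cube"))  # → cube
```

Knowing that it's "just" a ternary makes the node obvious in hindsight - which is exactly why searchable aliases like "if" or "condition" would help.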

Another related issue is that nodes don't always do what you expect them to, or you're completely baffled as to what their purpose is in the first place. This is where great documentation is essential. Blender has documentation on every node in all their node editors (shader, compositor, and now geometry), but they don't always give examples as to how each node could be used. It would also be nice to see a short tooltip when I hover over a node's header explaining what it does.

In the same vein, it's also important to ensure a measure of consistency if you have multiple node editors. While this is mostly the case with Blender, I have noticed that a few nodes have different names across the compositing, shading, and geometry nodes workspaces (the switch node), and some straight up don't exist in other workspaces (the curve nodes). This can be the source of both confusion and frustration.


In conclusion, node-based editors are cool, and a good way to present a complex set of options in an easy-to-understand interface. While we've looked at Blender's implementation of a node-based editor, others do exist, such as Material Maker.

Node-based interfaces have limitless possibilities - for example the Web Audio API is graph-based already, so I can only imagine how neat a graphical node-based audio editor could be - or indeed other ideas I've had including a node-based SVG generator (which I probably won't get around to experimenting with for a while).

As a final thought, a node-based flowchart could potentially be a good first introduction to logic and programming. For example, something a bit like Scratch or some other robotics control project - I'm sure something like this exists already.

If you know of a cool node-based interface, do leave a comment below.
