Visual live-coding proof-of-concept

This is a first step toward live-coding GLSL and 3D in vscode using web technologies.

Here is the full article:

Any feedback is very much welcome!


awesome! Do you have thoughts about a scene manager or how to manage scene graphs? OH, or could someone define a new object on-the-fly and then build with it?

Hi!

For scene graphs, we simply use the context (the parent object changes the “object3D” value and children act on this).
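For illustration, here is a minimal sketch of how context-based parenting might work. The `object3D` key is from the post above, but the shape of the context and the `makeContext` helper are assumptions, not the project's actual API:

```javascript
// Hypothetical sketch: a parent node publishes its object through the
// context; children read "object3D" and attach themselves to it.
function makeContext(parent = {}) {
  // Each child context inherits from its parent's context, so overriding
  // "object3D" only affects that subtree.
  return Object.create(parent)
}

const root = makeContext()
root.object3D = { name: 'scene', children: [] } // stand-in for a THREE.Scene

// A "group" node overrides object3D for its children.
const groupCtx = makeContext(root)
groupCtx.object3D = { name: 'group', children: [] }
root.object3D.children.push(groupCtx.object3D)

// A leaf node does not override object3D, so it sees the group's
// value through the prototype chain and simply acts on it.
const leafCtx = makeContext(groupCtx)
const mesh = { name: 'mesh' }
leafCtx.object3D.children.push(mesh)
```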

As for what “nodes” are, these are simply references to JS source. So we can simply create a node and start writing JS code (or anything that compiles to JS).

OK, so three.js parent-child stuff.

So, someone could pop on or off geo, vertex, and fragment shaders and define those uniforms while running?

OH, do you build your changes to an offscreen buffer then swap render contexts when the new code successfully compiles and runs? So like if you’re writing errors the simulation will run fine until either you submit new code or it auto-compiles and runs?

Yes, things can get plugged in and out.

I have tried different strategies to handle compilation errors and hot-replace in different projects. Here I wanted to have near-zero overhead in production runtime and this means no wrapping.

So my current goal is to have two strategies, one for “production” and one for “development”. In “dev” mode, nodes are wrapped to enable:

  1. try/catch protection
  2. fast hot-replace (no need to link more than the replaced node)
  3. scrubbing (this needs a source transform to replace literal values with a mutable array)
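For item 3, here is a rough sketch of what the source transform could look like. The regex approach and the `__scrub` name are assumptions for illustration; a real implementation would use a proper JS parser rather than a regex:

```javascript
// Hypothetical scrubbing transform: replace numeric literals with reads
// from a mutable array, so the editor can tweak values without re-linking.
function scrubTransform(source) {
  const values = []
  const transformed = source.replace(/\b\d+(\.\d+)?\b/g, (match) => {
    values.push(parseFloat(match))
    return `__scrub[${values.length - 1}]`
  })
  return { transformed, values }
}

const { transformed, values } = scrubTransform('mesh.rotation.y += 0.01')
// transformed === 'mesh.rotation.y += __scrub[0]'
// values[0] === 0.01

// While the code runs, scrubbing a literal in the editor only needs to
// mutate the array, not recompile the node:
const __scrub = values.slice()
__scrub[0] = 0.05 // e.g. dragging a slider
```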

With this try/catch strategy, if a call fails, we run it again on the previous “sane” node and thus avoid missing renders.

// This is a simplified view of this idea:

// Production mode:
child.update = link(context, newNode).update // no wrapping

// Dev mode
try {
  const update = link(context, newNode).update
  // init did not fail
  child.update = () => {
    try {
      update()
    } catch (err) {
      // notify update error
      // re-link previous node
      child.update = link(context, previousNode).update
      // call previous (sane) update
      child.update()
    }
  }
} catch (err) {
  // init failed
  // re-link previous node
  child.update = link(context, previousNode).update
}

Oh those are pretty good ideas. I like #3 especially. All of the hot-swapping I’ve done has been on shaders. I’ve not tried JS object swapping. Everything I’ve done has been for production/performance. Dev mode would be a good idea with the try/catch.

I’ve done experiments with scrubbing and it is really hard to make it work unless the shader coding editor runs in the same memory space as the renderer. The other issue is that vscode does not allow mouse events on the code, so this means using some other editor just for scrubbing. But it’s a must-have feature for shaders.

This week’s work: :blush:

Render graph as a vscode extension.

OH, DnD live-coding like that would be pretty sweet. Can you use pop-up widgets like Patricio uses in The Book of Shaders? https://thebookofshaders.com/02/ — in the Hello World example on that page, click in the vec4 for the color and it pops up a color chooser. Or this one: https://thebookofshaders.com/03/ — go down and click on one of the numbers in the vec4 and it gives you a slider.

There is also Jack’s Visor, https://www.visor.live — it’s written in Ruby.

One of my main goals with this project is to enable proper code reuse in artsy 3D coding, hence the visual blocks. The other important aspect is the short feedback loop, and in this regard the “sliders in editor” are really important (especially for shaders). But I’ll probably integrate MIDI and ways to sync the rendering state with a DAW timeline. The “timeline” is the killer feature that all demo scene and live performance artists end up using. Creating a render tree that can hold a 2-hour show with many scenes is really important: handling scene transitions smoothly, enabling scene debugging without a complete show replay, reusing effects between scenes, etc.

The goal is to help create complex interactive visuals for stage performances, in the vein of vvvv, Field from OpenEndedGroup, or Isadora. Or maybe VR experiences for musical albums :blush:. Visor is more like Modul8.

Drag & drop working (first time today):

Scrubbing experiment (in previous twitter post: cannot put second link)

Sweet! I’m sure you’ve heard of Scratch and Alice 3D. Both are DnD programming. Similar… hmm, definitely not the same as what you have going.

Nice. I’ve been livecoding the TouchDesigner scene network with python and GLSL recently.

Yeah, the idea here is to create a bridge between two kinds of brains (whether or not they live in the same person): the creative/aesthetic brain (visual coding) and the analytic/coder brain (source code), making it easy for these two worlds to work together. I think coding will inevitably be a required skill for these kinds of things, but it does not have to be all the time… :slight_smile: