HLCI ’20 Session 3: Instruments #2

Hybrid Live Coding Interfaces: performance & craft

Workshop session 3 of 4: Instruments #2


  • 2020-07-29T15:40:00Z - Victor Zappi
  • 2020-07-29T16:00:00Z - Mark Santolucito
  • 2020-07-29T16:15:00Z - Kate Sicchio
  • 2020-07-29T16:30:00Z - Simon Blackmore
  • 2020-07-29T16:45:00Z - Qichao Lan
  • 2020-07-29T17:00:00Z - Sol Bekic


Please feel free to ask questions here (or via YouTube / Twitter / hybridlive_@jarm.is ) during the live stream, and to continue the discussion after the event.

Hi all, please note we’ll start 20 minutes earlier than originally planned, to fit in a talk from Victor Zappi that was postponed due to tech issues yesterday.

Live Coding a Vintage Chip Synth

Victor Zappi



  • The SID chip is quite old and hard to get hold of, any problems with that?
  • Was there any nerdy reverse engineering involved in this project?

Cryptoguitar – Live coding with a guitar

Simon Blackmore



  • How do you feel about improvisational fluency with this system?
  • This sounds beautiful, but it’s also very nice to understand the technical underpinnings. With the drumming performance at algomech festival you didn’t explain what was going on, and it was quite nice to hear people guessing in the audience… How do you feel about audience interpretation?
  • Your system is extremely constrained in an interesting way, how did you arrive at this level of constraint (not more or less)?
  • Do you see this system as something more suited to a specific performance, a single piece (which can be rehearsed, and maybe solves the cognitive overload problem), or as an instrument you gain fluency on over a long period of time?

Live Coding Sequencers

Mark Santolucito



  • Can you talk about the motivations for the project?
  • Have you thought about making a DSL for rhythm rather than using JavaScript?
  • I’m curious what you think the patterns say to the performer - how you recognise meta-patterns and similarities (upbeats/downbeats), and whether there’s a way to specify that using a different, maybe more domain-specific grammar, or just musical/pattern-like terms

Agency in Live Coded Choreography

Kate Sicchio



  • Do you have discussions with the dancer beforehand or leave it up to them to improvise?
  • Interested in the fake mainframe! In algorithmic music the human is sometimes really underplayed, with composers like David Cope pretending their software is autonomous when they’re actually selecting and editing the generated output. Are there examples of ‘fake’ dance performances? (sorry that q is a bit scrambled, trying to keep it short, but interested in the question of authenticity in algorithmic choreography)
  • How do the dancers handle the “interface” between you and them? Is the screen a good interface, and have you explored other options (maybe an earpiece, Google Glass, wearable haptic feedback)?

Embodied Pattern Writing in Live Coding

Qichao Lan and Alexander Refsum Jensenius



  • I missed the part about the ancient Chinese drum score influence which looked super interesting - could you explain that a bit more please?
  • awesome! thanks so much… this will help me a lot to solve some things related to patterns in the quipu I am developing… which, btw, I should point out there were also similar knot-based devices found in China…
  • Isn’t what you’re doing similar to quantisation - going from continuous to discrete?

Persistent Expressions and Editor Integration

Sol Bekic



  • What do you think about the “downsides” of having a parentheses-heavy approach?
  • Super exciting to see more exploration away from the REPL! Have you performed with it yourself, or shared it with others yet?
  • Yes, do you plan to share it, and if so, when? 🙂