Live coding IDEs?

Can live coders interact with IDEs (Integrated/Intelligent Development Environments) to play music? The proposed system lets you interact with your editor to generate short snippets of code, often referred to as “musical simples”.
Example code snippets: create an instrument, progression, beat, transition, automation, or any other specific idea.
Musical simples are pieces of music that can stand on their own and make a satisfying loop. They should be catchy, attractive, and (ideally) already familiar, and between one and four measures long.
More at: http://www.ethanhein.com/wp/2015/musical-simples/

Background: I’m a novice live coder (~3 months in, never played a live coding show) and a music technology student. Collaborative live coding has been by far the most inspiring prospect for me. I currently use a slightly hacked version of Troop as my collaborative IDE.

IDE: The only change I’ve made to this IDE is adding a key binding (Cmd+T/Ctrl+T) to “interact” with the system in natural English by executing text comments. In return, the system produces a snippet of code (saved by me) that I can quickly execute. My in-progress version of Troop: https://github.com/sandcobainer/Troop. I initially came up with this setup to simplify live coding for myself, avoid errors, and execute ideas without constantly consulting the documentation. For example, in FoxDot, a Python interface for SuperCollider, executing (Cmd+T) the following comment would give me a snippet of code that I can quickly edit and execute.
// play pluck
p1 >> pluck([0, 2, 3, 5, 7], dur=[1/2, 1/4, 1/2, 1/8, 1/8]).every(3, "shuffle")
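
To make the interaction concrete, here is a rough sketch of how such a binding might look in a Tkinter-based editor like Troop. Everything in it (on_interact, the placeholder snippet_for) is hypothetical, not the actual Troop code:

    import tkinter as tk

    def snippet_for(comment):
        # Placeholder lookup; a keyword-based version is sketched further down
        return 'p1 >> pluck([0, 2, 3, 5, 7], dur=[1/2, 1/4, 1/2, 1/8, 1/8])'

    def on_interact(event):
        # Read the comment line under the cursor and append the generated snippet
        widget = event.widget
        line = widget.get("insert linestart", "insert lineend")
        if line.lstrip().startswith("//"):
            widget.insert("insert lineend", "\n" + snippet_for(line))
        return "break"  # suppress Tk's default handling of the keystroke

    root = tk.Tk()
    editor = tk.Text(root)
    editor.pack()
    editor.bind("<Control-t>", on_interact)  # a macOS build would also bind Cmd+T
    root.mainloop()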

There are multiple forums and discussions on “starting from scratch vs. prepared material” that share common concerns, and a common trend in these discussions is finding a middle ground. Analogous to producers using quality plugins and sample packs to quickly design sounds (which they could have synthesized from scratch) and execute their ideas, live coders should be equipped with tools to execute theirs. Such a tool could improve a live coder’s creative workflow and organisation without raising the usual concerns about performing copy-pasta material, while adding structure and quality to code that the coder can quickly edit to create sections, drops, etc.

To fully understand and interact with the user, the IDE will require some level of Natural Language Processing on musical terms and context. NLP intent and context models are their own area of research, mostly focused on e-commerce, but they can be adapted to understand musical terms.
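
As a rough illustration of the simplest possible version, a keyword lookup could map comments to saved snippets; the table, the keywords, and snippet_for are all hypothetical stand-ins for a real intent model:

    # Minimal keyword-to-snippet matcher; a trained intent/context model
    # would replace this lookup in a real system.
    SNIPPETS = {
        "pluck": 'p1 >> pluck([0, 2, 3, 5, 7], dur=[1/2, 1/4, 1/2, 1/8, 1/8])',
        "beat":  'd1 >> play("x-o-")',
        "bass":  'b1 >> bass([0, 0, 3, 4], dur=1)',
    }

    def snippet_for(comment):
        """Return the first saved snippet whose keyword appears in the comment."""
        text = comment.lower().lstrip("/ ")
        for keyword, code in SNIPPETS.items():
            if keyword in text:
                return code
        return "# no matching snippet"

    print(snippet_for("// play pluck"))
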
Some general ideas for code generation:

  1. // set up template with intro, verse, chorus, bridge, verse, chorus, outro
    Commented session generation: set up a template for a live performance, practice session, composition, etc. The template should be able to generate commented sections of code, much like an IDE generates an empty class, constructors, etc. (see the sketch after this list).
  2. Understand and generate musical simples on the fly
  3. Assign collaborators a section of code with a template to start with
  4. Transform a few lines of code into a function with parameters (encapsulation)
  5. Generate call-and-response code snippets using machine learning (reinforcement learning) (advanced feature)
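
As a sketch of idea 1, a generator might expand the template comment above into empty, commented sections; the function, section names, and layout are illustrative only:

    # Hypothetical commented-session generator
    def generate_template(sections=("intro", "verse", "chorus", "bridge",
                                    "verse", "chorus", "outro")):
        """Emit empty, commented sections for the performer to fill in live."""
        lines = []
        for i, name in enumerate(sections, 1):
            lines.append("// --- %d. %s ---" % (i, name))
            lines.append("")  # space for player definitions
        return "\n".join(lines)

    print(generate_template())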

This system raises some interesting questions regarding workflow, live coding performance, and composition.

  • What other platforms or editors can be used? Emacs, Atom? I chose Troop because it supports collaborative editing and multiple audio languages: FoxDot, Sonic Pi, Tidal Cycles and, of course, SuperCollider.
  • Do you see a need for structured code in your performances or other live performances?
  • Can coding platforms be used to compose, produce or arrange music?
  • Improvising ideas on the fly from scratch vs relying on tools that generate general musical simples. General thoughts on this approach?
  • Using ML to create sequences of notes and patterns in generated code
  • Comments from professional live coders. Why would you use an interface when you know a language inside out?
  • Any thoughts on collaborative ensembles in education?
  • How does this change an art form that is traditionally a solo performance?

I don’t have any opinions on NLP or ML for this.

Snippets and auto-completion are fair game IMO. In my system, if I want to add an effect to a player, there’s a certain amount of boilerplate code that’s always the same. I don’t think it’s cheating to call that up in a half dozen keystrokes.

One other IDE feature that interests me is syntax-aware navigation. I implemented this in my system (SuperCollider + extensions), with some limitations – it has to be in a GUI window TextView, and the statement you want to navigate through has to be complete and syntactically correct. But it’s pretty cool: I can hit a hotkey on one of these statements, and the lowest-level syntactic element at the cursor position is highlighted. Up arrow expands the selection to the parent element, down goes to one of the children, and left/right go to the previous/next sibling at the same level. With a little practice, I can zip the cursor through a long line of code rather quickly.
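
For a rough idea of the mechanism, here is the same parent/child selection logic sketched with Python’s ast module (my implementation is SuperCollider-specific, so this is only an analogy and the names are illustrative):

    import ast

    def selection_path(source, line, col):
        """Source spans of nodes enclosing (line, col), innermost first;
        'up arrow' would walk outward through this list."""
        spans = []
        for node in ast.walk(ast.parse(source)):
            if not hasattr(node, "col_offset"):
                continue
            start = (node.lineno, node.col_offset)
            end = (node.end_lineno, node.end_col_offset)
            if start <= (line, col) <= end:
                seg = ast.get_source_segment(source, node)
                if seg is not None:
                    spans.append(seg)
        return sorted(set(spans), key=len)  # smaller spans are innermost

    code = 'p1 >> pluck([0, 2, 3, 5], dur=[1/2, 1/4])'
    for span in selection_path(code, 1, 13):
        print(repr(span))  # '0', then '[0, 2, 3, 5]', the call, the whole line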

Many editors’ “forward/back word” bindings are really terrible for code. If text is to be a fluent interface for musical improvisation, this deserves attention in editor design.
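
For instance, a code-aware “forward word” might jump between token starts rather than whitespace-delimited words; a rough heuristic in Python (the regex and stop rule are only illustrative):

    import re

    # Stop at identifiers, numbers, and runs of punctuation
    TOKEN = re.compile(r"[A-Za-z_][A-Za-z0-9_]*|\d+|[^\sA-Za-z0-9_]+")

    def next_stop(line, col):
        """Column of the next token start after col (end of line if none)."""
        for match in TOKEN.finditer(line):
            if match.start() > col:
                return match.start()
        return len(line)

    line = 'p1 >> pluck([0, 2], dur=1/2)'
    col, stops = 0, []
    while col < len(line):
        col = next_stop(line, col)
        stops.append(col)
    print(stops)  # cursor stops: each token start, then end of line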

hjh