Please feel free to ask questions here (or on YouTube / Twitter / hybridlive_@jarm.is) during the live stream, and to continue the discussion after the event.
How do you feel about improvisational fluency with this system?
This sounds beautiful, but it’s also very nice to understand the technical underpinnings. With the drumming performance at the AlgoMech festival you didn’t explain what was going on, and it was quite nice to hear people in the audience guessing… How do you feel about audience interpretation?
Your system is extremely constrained in an interesting way. How did you arrive at this level of constraint (not more, not less)?
Do you see this system as something more suited to a specific performance, a single piece (which can be rehearsed, and maybe solves the cognitive overload problem), or as an instrument you gain fluency on over a long period of time?
Can you talk about the motivations for the project?
Have you thought about making a DSL for rhythm rather than using JavaScript?
I’m curious what you think the patterns say to the performer: how you recognise meta-patterns and similarities (upbeats/downbeats), and whether there’s a way to specify that using a different, maybe more domain-specific grammar, or just musical/pattern-like terms.
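As a purely illustrative sketch of what such a domain-specific grammar might look like, assuming a hypothetical mini-notation and a parsePattern() helper embedded in JavaScript (not the notation of the system being discussed):

// Hypothetical rhythm mini-notation: "x" = hit, "." = rest,
// each symbol occupying one step of the cycle.
function parsePattern(str) {
  const steps = str.split(" ");
  return steps
    .map((s, i) => ({ onset: i / steps.length, hit: s === "x" })) // onset is 0..1 within the cycle
    .filter(step => step.hit);
}

// A string like "x . . x . . x ." arguably exposes upbeats and downbeats
// at a glance, compared with a hand-written list of onset times.
console.log(parsePattern("x . . x . . x ."));
// -> [ { onset: 0, hit: true }, { onset: 0.375, hit: true }, { onset: 0.75, hit: true } ]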
Do you have discussions with the dancer beforehand or leave it up to them to improvise?
Interested in the fake mainframe! In algorithmic music the human role is sometimes really underplayed, with composers like David Cope pretending their software is autonomous when they’re selecting and editing the generated output. Are there examples of ‘fake’ dance performances? (Sorry that question is a bit scrambled, trying to keep it short, but I’m interested in the question of authenticity in algorithmic choreography.)
How do the dancers handle the “interface” between you and them? Is the screen a good interface? Have you explored other options (maybe an earpiece, Google Glass, wearable haptic feedback)?
I missed the part about the ancient Chinese drum score influence which looked super interesting - could you explain that a bit more please?
Awesome! Thanks so much… this will help me a lot in solving some things related to patterns in the quipu I am developing… which, by the way, I should point out: similar knot-based devices have also been found in China…
Isn’t what you’re doing similar to quantisation - going from continuous to discrete?
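For reference, a minimal sketch of quantisation in that sense: snapping a continuous time value onto a discrete grid. The quantise() helper and the grid size are assumptions for illustration, not taken from the system itself.

// Snap a continuous time (in beats) to the nearest position on a
// discrete grid of stepsPerBeat subdivisions.
function quantise(timeInBeats, stepsPerBeat = 4) {
  return Math.round(timeInBeats * stepsPerBeat) / stepsPerBeat;
}

console.log(quantise(1.13)); // -> 1.25
console.log(quantise(2.61)); // -> 2.5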