Hydra in the wild

Hi all! I would love to have a place for sharing documentation of live performances using hydra. I know some of these end up buried in closed platforms but I would love to try to share them anyways. Please respond w/ any photo or video documentation you have from past performances, or share info about upcoming performances.


hi all! loving this thread <3

With c0d3 p03try we explored using the webcam with a macro lens, capturing things in extreme close-up.

Also drawing with a Wacom tablet as if it were the mouse: https://youtu.be/f_2OU-DZASk?t=381

We also used feedback from another browser tab, for example https://dddance.party/ in https://www.youtube.com/watch?v=NgLAFjuvQPo
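For anyone curious how tab feedback like this can be set up: Hydra has built-in screen capture. A minimal sketch, assuming the Hydra editor (where `s0`, `osc`, and `initScreen()` are available); the modulation parameters here are just my own guesses to taste, not from the performance:

```javascript
// Ask the browser for a screen/tab to capture
// (select the other tab, e.g. dddance.party, in the picker)
s0.initScreen()

// Feed the captured tab into the output with some movement
src(s0)
  .modulate(osc(6, 0.1), 0.05) // hypothetical parameters, tweak to taste
  .out(o0)
```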

Poems on sticky notes on the desktop: https://www.youtube.com/watch?v=f_2OU-DZASk

We realized that hydra itself can be used and performed like a visual installation. We were at a gig where a person's poetry was meant to be the constraint and the trigger for ideas, so in hydra we fed some poems into a feedback loop and let the text move, regenerate, etc… the result was beautiful: https://c0d3-p03try.neocities.org/assets/images/rumordelluvia.jpg

We thought a lot about feedback not only as a digital resource but also as a conceptual idea for an artistic practice: the loop through the hardware and the software, the material, the body, the idea, and the visual result. Like a kaleidoscope of a global thing.



hi all. I love Hydra for its flexibility and the philosophy behind its development. thank you @ojack for developing this tool.

I have been exploring the software by playing with the sample shaders that are shared when you start the Hydra editor. I am slowly becoming more comfortable with the coding aspect. But I feel the idea of software is much more than that, it is the role it plays as a kind of open-ended connector of media. I deeply appreciate the fluidity and flexibility of the software and have been slowly but surely figuring ways to use it in live shows.

You can find a short video I captured of myself using these techniques at a show last night, here: Visuals01_Mix on Vimeo (apologies for the quality; it was originally captured as Instagram stories).

Just to list some things I practiced last night and will continue to work on:

  • Connecting Hydra to a Kinect via a Processing sketch.
  • Using the live Kinect depth data as a mask to layer different visuals.
  • Creating a simple fader knob with Arduino that controls the grayscale of a Processing sketch. Using the Processing sketch as cross-fader mask controlled in realtime by the analog knob.
  • Stretching the Hydra window across the extended monitor and the projector screen so that the code was visible on the monitor and the visuals were on the projector.
  • Using the monitor as a staging area to set up visuals before dragging them over to the projector screen.
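On the Hydra side, once the Processing depth image is brought in (I use screen capture for this, see below), it can act as a layer mask. A rough sketch, assuming the Hydra editor and that `s0` is capturing the Processing window; the specific sources (`osc`, `noise`) are placeholders for whatever visuals you want to layer:

```javascript
// Capture the screen region showing the Processing depth image
s0.initScreen()

// White pixels (inside the depth range) show the oscillator,
// black pixels show the noise layer instead
osc(10, 0.1, 1.2)
  .mask(src(s0))
  .add(noise(3).mask(src(s0).invert()))
  .out(o0)
```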

I guess I could go on. Perhaps my words are confusing without directly seeing the examples. I will make more screen captures as I continue exploring the software and how I can connect different software, hardware, media… The possibilities are infinite! Thanks again @ojack for developing the type of software that I’ve been looking for for some time! :slight_smile:


I used Hydra during a noise-oriented chiptune show with three artists: Cyanide Dansen, RRayen (from Feminoise Latinoamerica), and Siren’s Carcass, everything on LED panels. You can’t see much of it in the pics, but it was a lot of fun. I’ll use it again soon at a party involving shaders, and I’ll post pics as well. Can’t wait!


oh wow it looks amazing! did you use screen capture to bring in the kinect from processing? or something else?


Really appreciate hearing these thoughts. The photo with the poetry and books is so beautiful! I am really captivated by feedback as well and feel like I am only beginning to understand it, as a technical/mathematical concept as well as an artistic practice and a way of adapting to and performing with my computer.


I used screen capture. Below is the Processing code (lightly edited from a Shiffman tutorial).

// Daniel Shiffman
// Kinect Point Cloud example

// https://github.com/shiffman/OpenKinect-for-Processing
// http://shiffman.net/p5/kinect/

import org.openkinect.freenect.*;
import org.openkinect.processing.*;

// Kinect library object
Kinect kinect;

// Black-and-white mask image built from the depth data
PImage img;

// Depth thresholds (raw Kinect depth values)
float minThresh = 100;
float maxThresh = 600;

void setup() {
  // Rendering in P3D
  size(640, 480, P3D);
  kinect = new Kinect(this);
  kinect.initDepth();
  img = createImage(kinect.width, kinect.height, RGB);
}

void draw() {
  background(0);

  // Get the raw depth as an array of integers
  int[] depth = kinect.getRawDepth();

  img.loadPixels();
  for (int x = 0; x < kinect.width; x++) {
    for (int y = 0; y < kinect.height; y++) {
      int offset = x + y * kinect.width;

      // Pixels inside the depth range become white, everything else black
      int rawDepth = depth[offset];
      if (rawDepth > minThresh && rawDepth < maxThresh) {
        img.pixels[offset] = color(255, 255, 255);
      } else {
        img.pixels[offset] = color(0, 0, 0);
      }
    }
  }
  img.updatePixels();

  // Draw the mask so it can be screen-captured into Hydra
  image(img, 0, 0);
}

@ojack I saw that you mentioned Regl in the Hydra documentation as a library used in its development. I am very curious about the use of other WebGL features with Hydra. For instance, would it be possible to draw a bunny mesh and add lighting or a texture?

I have more than enough to play with and learn with Hydra now, of course. I am just curious and excited for the possibilities. :stuck_out_tongue:

BTW I’m trying to share the software with other visual folks in my city. Some were looking over my shoulder when I was doing a show and were really surprised at how easily I could pull in media from a wide variety of sources.

I recently stumbled across the work of eerieear. Beautiful colorful noise.

They work with hydrasynth and synth software. They post recordings of their performances on YouTube and social media.

I played with him in Paris this April, stunning sound and visuals


Glad I found this thread. I am curious about Hydra and have some basic questions about how it is used/implemented. Do you generally code the visuals live? Or does it act as a responsive visual to your audio stream? If the former, doesn’t it get confusing managing both audio and video at the same time?

Can it be used with any input stream? If I am not using Tidal, can I integrate hydra with any custom live coding environment I’m using?


@headlessghost @Pulsaare
Thank you so much for the kind words : )
They are more heart-warming than you can imagine !


Hi all !
I arrive a bit late to this forum. Very excited to be part of this beautiful community.
Recent visuals from two events here in Europe.
One was for Pantropical in Rotterdam with live performances by NAAFI artists
(I was asked to not disclose this one, so discretion please…)
Screen capture:

DJ set that night:

and then in JSnation last week in Amsterdam

I have been testing some new stuff following up on a script by @flordefuego4 that uses noise to merge layers smoothly.

https://youtu.be/xGqp_jsDOUQ /
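One way to merge two layers smoothly with noise, in the same spirit (a sketch of the idea, assuming the Hydra editor; this is not the actual script by @flordefuego4, and the specific sources are placeholders):

```javascript
// A soft, animated noise field acts as a crossfade mask between two layers
osc(20, 0.05, 0.8)
  .mask(noise(2, 0.2).thresh(0.5, 0.2))     // layer A where the noise is bright
  .add(voronoi(5)
    .mask(noise(2, 0.2).thresh(0.5, 0.2).invert())) // layer B everywhere else
  .out(o0)
```

Raising the second argument of `thresh()` widens the soft edge between the layers, which is what makes the merge feel smooth rather than a hard cut.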

Need more time to develop it, but as was discussed above, there are just so many possibilities!
I have said this before several times, but it feels like it's never enough. Thanks @ojack !