Hi all! I would love to have a place for sharing documentation of live performances using hydra. I know some of these end up buried in closed platforms but I would love to try to share them anyways. Please respond w/ any photo or video documentation you have from past performances, or share info about upcoming performances.
hi all! loving this thread <3
With c0d3 p03try we explored using the webcam with a macro lens, capturing things in a very zoomed-in view.
Also drawing with a Wacom as if it were the mouse https://youtu.be/f_2OU-DZASk?t=381
We used feedback from another web tab, for example https://dddance.party/ in https://www.youtube.com/watch?v=NgLAFjuvQPo
Poems in sticky notes on the desktop https://www.youtube.com/watch?v=f_2OU-DZASk
We realized that hydra itself can be used and performed like a visual installation. We were at a gig where one person's poetry was meant to be the constraint and the trigger for ideas, so in hydra we fed back some poems and let the text move/regenerate/etc… the result was beautiful https://c0d3-p03try.neocities.org/assets/images/rumordelluvia.jpg
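For anyone curious, the screen-feedback trick can be reproduced with hydra's screen capture source. This is only a rough sketch of the idea, not our exact code, and the parameters are arbitrary:

s0.initScreen()          // pick the window or tab to capture (text, sticky notes, dddance.party…)

src(o0)                  // previous frame: hydra's own feedback loop
  .scale(1.01)           // slow zoom so the captured text drifts and regenerates
  .rotate(0.002)
  .blend(src(s0), 0.1)   // keep folding the captured screen back into the loop
  .out(o0)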
We thought a lot about feedback, not only as a digital resource but also as a conceptual idea for an artistic practice: the back and forth between the hardware and the software, the material, the body, the idea and the visual result. Like a kaleidoscope of the whole thing.
Bests,
Rapo
hi all. I love Hydra for its flexibility and the philosophy behind its development. thank you @ojack for developing this tool.
I have been exploring the software by playing with the sample shaders that are shared when you start the Hydra editor. I am slowly becoming more comfortable with the coding aspect. But I feel the idea of the software is much more than that: it is the role it plays as a kind of open-ended connector of media. I deeply appreciate the fluidity and flexibility of the software and have been slowly but surely figuring out ways to use it in live shows.
You can find a short video I captured of myself using the techniques I practiced at a show last night here: Visuals01_Mix on Vimeo (apologies for the quality, it was originally Instagram story captures).
Just to list some things I practiced last night and will continue to work on:
- Connecting Hydra to a Kinect via a Processing sketch.
- Using the live Kinect depth data as a mask to layer different visuals.
- Creating a simple fader knob with Arduino that controls the grayscale of a Processing sketch, then using that Processing sketch as a cross-fader mask controlled in real time by the analog knob (see the rough sketch after this list).
- Stretching the Hydra window across the extended monitor and the projection screen so that the code was visible on the monitor and the visuals were on the projector.
- Using the monitor as a staging area to set up visuals before I drag them over to the projector screen.
…
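For the cross-fader mask idea, the hydra side would be roughly like this (just a sketch with placeholder visuals; s1 stands for the captured grayscale Processing window, and the numbers are arbitrary):

s1.initScreen()                       // select the grayscale Processing window when prompted

osc(10, 0.1, 1.2).out(o1)             // first visual
noise(3).color(1, 0.3, 0.6).out(o2)   // second visual

// the captured grayscale acts as a per-pixel crossfade:
// bright areas show o1, dark areas show o2
src(o1).mult(src(s1))
  .add(src(o2).mult(src(s1).invert()))
  .out(o0)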
I guess I could go on. Perhaps my words are confusing without directly seeing the examples. I will make more screen captures as I continue exploring the software and how I can connect different software, hardware, media… The possibilities are infinite! Thanks again @ojack for developing the type of software that I’ve been looking for for some time!
I used Hydra during a noise-oriented chiptune show with three artists: Cyanide Dansen, RRayen (from Feminoise Latinoamerica) and Siren’s Carcass, everything on LED panels. You can’t see much of it in the pics, but it was a lot of fun. I’ll use it again soon at a party involving shaders and will post pics as well, can’t wait!
Oh wow, it looks amazing! Did you use screen capture to bring in the Kinect from Processing, or something else?
Really appreciate hearing these thoughts. The photo with the poetry and books is so beautiful! I am really captivated by feedback as well and feel like I am only beginning to understand it, as a technical/mathematical concept as well as an artistic practice and way of adapting to and performing with my computer.
I used screen capture. Below find the Processing code. (Basically lightly edited from a Shiffman tutorial.)
// Daniel Shiffman
// Kinect Point Cloud example
// https://github.com/shiffman/OpenKinect-for-Processing
// http://shiffman.net/p5/kinect/

import org.openkinect.freenect.*;
import org.openkinect.processing.*;

// Kinect Library object
Kinect kinect;

PImage img;
float minThresh = 100;
float maxThresh = 600;

void setup() {
  // Rendering in P3D
  size(640, 480, P3D);
  kinect = new Kinect(this);
  kinect.initDepth();
  kinect.initVideo();
  img = createImage(kinect.width, kinect.height, RGB);
}

void draw() {
  background(0);
  img.loadPixels();

  PImage dImg = kinect.getDepthImage();
  PImage vImg = kinect.getVideoImage();

  // Get the raw depth as array of integers
  int[] depth = kinect.getRawDepth();

  for (int x = 0; x < kinect.width; x++) {
    for (int y = 0; y < kinect.height; y++) {
      int offset = x + y * kinect.width;
      // Convert kinect data to world xyz coordinate
      int rawDepth = depth[offset];
      if (rawDepth < maxThresh && rawDepth > minThresh) {
        img.pixels[offset] = color(255, 255, 255);
      } else {
        img.pixels[offset] = color(0, 0, 0);
      }
    }
  }

  img.updatePixels();
  image(img, 0, 0);
}
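On the hydra side I then pull that window in with screen capture and use the white silhouette as a mask. Roughly like this (a simplified sketch with placeholder textures, not the exact patch from the show):

s0.initScreen()             // pick the Processing sketch window when prompted

osc(30, 0.05, 1).out(o1)    // texture that should appear inside the silhouette
voronoi(5, 0.3).out(o2)     // background texture

src(o2)
  .layer(src(o1).mask(src(s0)))   // o1 shows only where the depth mask is white
  .out(o0)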
@ojack I saw that you mention Regl in the Hydra documentation as a library used in development. I am very curious about the use of other WebGL features with Hydra. For instance, would it be possible to draw a bunny mesh and add lighting or a texture?
I have more than enough to play with and learn with Hydra now, of course. I am just curious and excited for the possibilities.
BTW I’m trying to share the software with other visual folks in my city. Some were looking over my shoulder when I was doing a show and were really surprised at how easily I could pull in media from a wide variety of sources.
I recently stumbled across the work of eerieear. Beautiful colorful noise.
They work with hydrasynth and synth software. They post recordings of their performances on YouTube and social media.
I played with him in Paris this April, stunning sound and visuals
Glad I found this thread. I am curious about Hydra and have some basic questions about how it is used/implemented. Do you generally code the visuals live? Or does it act as a responsive visual to your audio stream? If the former, doesn’t it get confusing managing both audio and video at the same time?
Can it be used with any input stream? If I am not using Tidal, can I integrate hydra with any custom live coding environment I’m using?
Thanks!
@headlessghost @Pulsaare
Thank you so much for the kind words : )
They are more heart-warming than you can imagine !
Hi all !
I arrive a bit late to this forum. Very excited to be part of this beautiful community.
Recent visuals from two events here in Europe.
One was for Pantropical in Rotterdam with live performances by NAAFI artists
(I was asked to not disclose this one, so discretion please…)
Screen capture:
DJ set that night:
And then at JSnation last week in Amsterdam
Capture:
I have been testing some new stuff following up on a script by @flordefuego4, using noise to merge layers smoothly.
captures:
https://youtu.be/xGqp_jsDOUQ /
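The basic idea, as far as I understand it (just my own rough sketch, not @flordefuego4’s actual script, with placeholder layers), is to let a noise field decide where one layer shows through the other, so the transition stays soft and keeps moving:

osc(20, 0.05, 0.8).out(o1)   // layer A
voronoi(8, 0.4).out(o2)      // layer B

src(o1)
  .layer(src(o2).mask(noise(3, 0.1).thresh(0.5, 0.3)))   // noise decides where B melts through A
  .out(o0)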
Need more time to develop it, but as was being discussed above, there are just so many possibilities!
I have said this before several times, but it never feels like enough: thanks @ojack!
Hello there!
I’ve just discovered this thread, so I would like to share some experiences with this amazing livecoding tool.
When I discovered Hydra, I was and still am amazed by its visuals and possibilities - abstract patterns, animations, complex movements, playful colouring - that simply open a door to endless possibilities for all kinds of artistic expression!
By the way, I often play music in the background when livecoding in Hydra - such a symbiosis is satisfying.
Here is a capture of a session where I made visuals react to sound, with the help of my setup recording from a virtual audio cable:
https://youtu.be/rI-kKB-ejjw?feature=shared
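In case it is useful, the core of the sound-reactive part looks roughly like this (a simplified sketch; the virtual audio cable just has to be routed as the browser’s audio input, and the exact numbers are placeholders):

a.show()        // display the FFT bins to find useful thresholds
a.setBins(4)    // split the spectrum into 4 bands

osc(10, 0.1, 1.4)
  .scale(() => 1 + a.fft[0] * 0.5)   // low band pumps the zoom
  .rotate(() => a.fft[3] * 0.2)      // high band nudges the rotation
  .out(o0)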