Event streaming backend and frontend development + archiving automation

Hi! Let’s use this thread to discuss further development of the events streaming backend (Muxy), events website (https://sun.tidalcycles.org/) and archiving related stuff.

For those who are unaware: based on some brainstorming with @iSaladino and a discussion on the Discord chat about the way we were handling these streaming events (a single, shared stream key), we started developing a streaming backend system for the last TidalCycles-related Solstice Marathon stream. It manages stream keys for streamers of a shared online event stream. The main problem it solves is that each user now has their own unique stream key, which only works during the time slot they booked. It’s developed in Python using Django and Django REST Framework, and the source code is already available here.
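As a rough illustration of the core idea (the names and fields here are my guesses, not Muxy’s actual schema): each booking gets its own key, and the key is only valid inside the booked timeframe.

```python
# Sketch only: plain Python instead of a Django model, with hypothetical names.
import uuid
from datetime import datetime, timezone


class StreamSlot:
    """A booked time slot with its own unique stream key."""

    def __init__(self, starts_at, ends_at):
        self.starts_at = starts_at
        self.ends_at = ends_at
        self.stream_key = uuid.uuid4().hex  # unique per booking

    def key_is_valid_now(self):
        """The key only works while the current time is inside the slot."""
        now = datetime.now(timezone.utc)
        return self.starts_at <= now < self.ends_at
```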

Related to this, @sfradkin has been working for some time on archiving the stream recordings of past events, and is now automating parts of the process, so it would be nice to make sure this system helps ease that work. There’s already an issue for fixing a few things to help him with this.

One of the things we discussed on the chat was to integrate the Solstice website (developed by @mashaal - looks like he isn’t here?) with Muxy. Right now the website and Muxy have their own databases (the website uses Firebase Realtime DB, Muxy a SQLite DB, soon to be replaced I think…) and we had to sync them up manually for the last event due to lack of time, but ideally we would be using a single database as the source of truth. Muxy exposes a REST API, so I guess the easiest way to do this is to make the website use this API for sign ups and slot creation, editing and removal. I know @mrreason and @mashaal wanted to contribute here, maybe someone else wants to as well?

Do you think it’s best to discuss, suggest and organize here and then use a chat (like the new TOPLAP Discord server or chat.toplap.org) for more pair programming/discussion stuff?

7 Likes

Thanks @munshkr, and for your work on muxy - it went really smoothly for the solstice stream.
It’d be great to use the next iteration for the big TOPLAP birthday stream planned around the weekend 20/21 Feb.

I thought I’d just point to this:
https://openstreamingplatform.com/

If we wanted to get away from youtube, it might be an option, although I think my server probably couldn’t handle an audience of more than 50, depending on the bitrate.

1 Like

Thanks for the invite @munshkr.

Happy to use the new REST API – I wasn’t very happy with the Firebase integration at the end of the day. I can strip all that stuff out and use the new one pretty quickly, and I can post the repo to a Tidal Cycles GitHub account?

There’s only a few API calls, the public one is probably the most important one to get right.

https://sun.tidalcycles.org/api

From this GET request, we should be able to see all public information, name/description/location/timezone/cycle number/etc – and it should strip out any email and key information. I chose to index the array by cycle number, but I’m not sure it was the best idea going forward – but I do think we need to have a way to infer which cycle numbers are taken and which ones aren’t, without doing a lot of weird UTC stuff…

Other than that, I had a POST to book a cycle and a DELETE to remove a cycle; we could probably add a PUT to update a cycle as well…
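On the taken-vs-free question: since the response is indexed by cycle number, free cycles can be inferred client-side without any UTC arithmetic. A minimal sketch, assuming the GET response is an object keyed by cycle number (the field names and sample data are my assumptions, not the actual API contract):

```python
# Sketch: derive free cycle numbers from a response keyed by cycle number.
# `booked` stands in for the parsed JSON of the GET request described above.

def free_cycles(booked, first, last):
    """Return cycle numbers in [first, last] that have no booking yet."""
    taken = {int(n) for n in booked}
    return [n for n in range(first, last + 1) if n not in taken]


booked = {"2": {"name": "alice"}, "4": {"name": "bob"}}  # illustrative data
print(free_cycles(booked, 1, 5))  # cycles 1, 3 and 5 are still open
```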

Happy to chat on discord.

1 Like

An option that we could look into for streaming is using AWS or one of the other cloud providers. That should give considerable bandwidth and allow us to keep the data. AWS storage is cheap, but the servers aren’t necessarily cheap. However, we would only need the servers for a relatively short period of time to live stream.

Something to think about.

1 Like

Also from me thanks for being invited @munshkr

I have a few more points on my mind from last time. Should we secure the server with HTTPS? It should be no problem to use letsencrypt for this. I’d also like us to come up with a better way to retrieve the stream keys. And then I was wondering whether we should think about contiguous slots, and how or under what circumstances slots could be merged. The point about merging is related to the question of whether performances and talks can differ in length. If so, I think it would make sense to be able to use one stream key over several contiguous slots.
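One possible merging rule: two slots may share a single stream key only when they belong to the same streamer and are back to back. A sketch, assuming slots are simple (start, end, streamer) tuples rather than whatever Muxy actually stores:

```python
# Sketch of a possible slot-merging rule (not Muxy code).
# Slots are (start, end, streamer) tuples; start/end can be any
# comparable values (datetimes, minutes-since-midnight, ...).

def can_merge(slot_a, slot_b):
    """Slots can share one stream key if the same streamer booked
    them and slot_b starts exactly when slot_a ends."""
    start_a, end_a, who_a = slot_a
    start_b, end_b, who_b = slot_b
    return who_a == who_b and end_a == start_b
```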

1 Like

Webtorrent could be an alternative to using AWS… I saw that @ojack is experimenting with this

1 Like

Yes! Friends at scanlines recently self-hosted a streaming event for Vidicon using this setup (not using webtorrent, just HLS): https://github.com/langolierz/scanlines-technical-details/blob/master/streaming-setup.md and have also been hosting periodic streams.
I was curious how it would work to scale this up using webtorrent, as it seems like one of the main barriers to self-hosting streaming is when you want to scale it up past 30 or so viewers.

I just tried installing nginx-rtmp on a Digital Ocean server configured for HLS following the instructions above, and then connected it to the p2p media loader. It actually just…works! I logged the p2p connections so I could tell that they were actually being used. It seemed occasionally choppy, but I haven’t played with any of the parameters for HLS or for the loader yet. Here is a link to the code for the p2p version of the player: https://glitch.com/edit/#!/p2p-hls-loader?path=index.html%3A42%3A11
and for the regular HLS: https://glitch.com/edit/#!/p2p-hls-loader?path=base.html%3A1%3A0
(you won’t see anything unless I’m streaming something at the moment)
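For anyone who wants to try the same thing, the basic shape of an nginx-rtmp ingest that also serves HLS looks roughly like this (directive values here are illustrative defaults, not the exact ones from the scanlines write-up):

```nginx
# Illustrative nginx-rtmp config: accept RTMP and emit HLS segments.
rtmp {
    server {
        listen 1935;
        application live {
            live on;
            hls on;
            hls_path /tmp/hls;          # where .m3u8/.ts files are written
            hls_fragment 3s;            # segment length
            hls_playlist_length 60s;    # rolling playlist window
        }
    }
}
```

The HLS output can then be served as static files over plain HTTP(S), which is what the p2p media loader wraps.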

I don’t have a lot of time for this but hoping to document research as I go along. Maybe we could do some load tests if people are interested.

3 Likes

I think we could also embed an open rocketchat channel or something similar to have a relatively straightforward chat along with the stream.

2 Likes

I’m a bit busy this week, will have more time next week…

I thought I’d share the lineup for 19-21st Feb so far though (times are in UTC, and tbc):

So the majority of the time that weekend is taken up by locally organised events, but we can make other slots those days available for individual (20 minute?) performances. For the frontend+backend this adds some complexity though:

  • longer pre-registered events, potentially with subevents of different durations - maybe some will stream from the same location / streamkey, but I’m guessing some will need multiple keys…
  • shorter open slots in between

Still I think if we can keep it as simple as possible that would be best …

We can manually create a single long-duration slot for each node, in case their performances are going to be streamed from only one computer (so they have a single streaming key set up). On the other hand, I guess we could allow creating different slots with the same streaming key for these “subevents”; I’d probably need to remove a unique key constraint and check a few things to make sure it still works OK.

Maybe Events in Muxy should have a “slot duration” field, which would be 20 or 30 minutes for example, and the API should generate a list of available contiguous slots from the start and the end of the event. That should simplify things a lot for the frontend (no need to do any calculation, just list the free and registered slots in order from the API response), and would allow free slots in between these pre-registered node event slots.
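The proposed behaviour could be sketched like this (function and field names are hypothetical, not Muxy code): given the event bounds and a slot duration, the API would enumerate every contiguous slot and mark each one free or taken.

```python
# Sketch of the proposed "slot duration" behaviour (hypothetical names).
from datetime import timedelta


def generate_slots(event_start, event_end, slot_minutes, booked_starts):
    """Return (start, end, is_free) for every contiguous slot of the event,
    in ascending datetime order, so the frontend needs no calculation."""
    slots = []
    step = timedelta(minutes=slot_minutes)
    t = event_start
    while t + step <= event_end:
        slots.append((t, t + step, t not in booked_starts))
        t += step
    return slots
```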

I think one streamkey for every node would be good, we as NL_CL will probably not be together and stream from 1 computer the entire time.

I haven’t seen Muxy in action, nor have I tried it myself, but I was wondering how it handles going from one stream to another if the same stream key is used and one performer starts their stream before the other one has finished? Is there some form of cross-fading possible?

I’m not sure about 20 minutes for the individual open slots. I find it a bit short and also a little unfair considering the nodes had first pick and can decide amongst themselves how time is distributed. I think this should be 30 minutes. Maybe we can include Thursday? Or start a pre-party on the weekend before?

(This last reply/discussion should maybe be moved to https://forum.toplap.org/t/first-transnodal-new-year-stream/1547/27 ?)

Yeah… actually I’ve just re-read what I said and that makes no sense :sweat_smile:
The server can’t distinguish two streams with the same key, so if there are two people streaming with the same key, one “overwrites” the other (although the first connection has priority, so if the first one forgets to disconnect when their time is up, the new one will be queued, that was actually one of the issues we had in previous events).

Muxy behaves as a kind of “authentication backend” for the RTMP server. It allows or disallows a connection attempt based on the stream key, by looking up the slot information (using the key) and checking whether the current time is inside the slot timeframe. Every 10 seconds it checks if this is still true and forcefully disconnects the client otherwise. Meaning, if you are streaming and your time is up, you will get disconnected, allowing other streamers to push to the server. You can also start streaming 5 min before your slot starts; in this case, you will briefly get disconnected and reconnected again at the start of your slot.

So, there is no cross fade, the server only receives and forwards a single stream to Youtube/Twitch.
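The checks described above might look roughly like this (a sketch with hypothetical names; the real code is in the Muxy repo):

```python
# Sketch of the slot checks Muxy performs (hypothetical names, not Muxy code).
from datetime import timedelta

EARLY_GRACE = timedelta(minutes=5)  # streamers may connect this early


def may_publish(slot_start, slot_end, now):
    """Allow a connection (and keep it alive on each periodic re-check)
    only while `now` falls inside the booked slot."""
    return slot_start <= now < slot_end


def may_connect_early(slot_start, now):
    """Early connections within the grace window are accepted, then briefly
    dropped and re-accepted when the slot actually starts."""
    return slot_start - EARLY_GRACE <= now < slot_start
```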

1 Like

@mashaal What are your thoughts on this? Do you think it would help you having the API list all free and taken slots/cycles in datetime ascending order? I can work on this next weekend probably.

A post was merged into an existing topic: First Transnodal New Year Stream

Hey all as I mentioned elsewhere I think it’s simplest if we make the TOPLAP birthday stream just with locally organised/sub events. There is still some spare time in the schedule but I think it doesn’t make sense to open that up for individual slots as a lot of people would miss out. However if someone wants to organise a sub event in that spare time they’d be free to do what they wanted with it, if that makes sense.

We could do a big open stream in March instead… I guess we wouldn’t need the next iteration of the front end until then? Although we will need some way for the sub-event organisers to programme slots in their bit of the stream, and get stream keys to their participants… I’m guessing most will not be in a real venue due to covid19 restrictions.

Hey friends, it’s very nice to see you all here building software for this event. Let me check if I got this right: every organiser will receive a unique stream key by mail according to the timeframe they reserved for their origin nodes. That stream key should be shared by all the live coders (in situ for the nodes that are able to perform in a real place and have a system that lets them capture image with a camera (not anyone’s screen) plus sound, and remote for the nodes that couldn’t make it… or that will have mixed performances). But the event’s website will also have slots available to pick individually (using the scheduling and stream key generator that @mashaal and @munshkr created)… Please let me know if I got this right! Thanks in advance!

1 Like

That’s right. Except that @yaxu suggested not opening the schedule for individual streams (as in New Moon/Solstice), and leaving that to another event in March.

In that case, the simplest way to do this would be to create a single stream key for each node event, so all subevents use the same key (even though that kind of defeats the purpose of using Muxy in the first place). The other option would be to create keys for each individual streamer, but that would probably require tight coordination between a Muxy admin and someone representing each node event.