Algorithmically curated birdsong webradio using the xeno-canto database.
Up to three recordings (with a CC BY-SA(-NC) license and a rating better than C) play in parallel, drawn from the region currently at the longitude where the local time is 6 am. The region moves around the globe, updating every few minutes, wandering a little north or south at random as it goes, trying to find somewhere with enough good recordings to play.
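The "longitude of 6 am" follows from the current UTC time: local solar time is roughly UTC plus one hour per 15° of longitude east. A minimal Python sketch of the region's movement (function names and the step size of the latitude walk are illustrative, not taken from the actual Node.js implementation):

```python
import random
from datetime import datetime, timezone

def dawn_longitude(now=None):
    """Longitude (degrees, -180..180) where local solar time is ~6 am."""
    now = now or datetime.now(timezone.utc)
    utc_hours = now.hour + now.minute / 60
    lon = (6 - utc_hours) * 15  # solar time shifts one hour per 15 degrees
    return (lon + 180) % 360 - 180  # wrap into -180..180

def wander(lat, step=2.0):
    """Random walk a little north or south, clamped to birdy latitudes."""
    return max(-60.0, min(60.0, lat + random.uniform(-step, step)))
```

Every few minutes the radio would recompute `dawn_longitude()` and nudge the latitude with `wander()`, keeping the region riding the sunrise.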
The webradio runs on Icecast2, fed by the amazing Liquidsoap, which in turn is controlled by a Node.js process writing to a MariaDB server.
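Candidate recordings can be fetched from xeno-canto's public API (v2). A hedged sketch in Python rather than the actual Node.js code, using the `box:` (bounding box) and `q>:` (rating better than) tags from xeno-canto's documented query syntax; license filtering via the `lic:` tag is omitted here:

```python
import urllib.parse

# Assumed endpoint: the xeno-canto public API, version 2.
XC_API = "https://xeno-canto.org/api/2/recordings?query="

def xc_query_url(lat, lon, span=5.0):
    """Build a query URL for recordings rated better than C
    inside a bounding box around (lat, lon)."""
    query = f"box:{lat - span},{lon - span},{lat + span},{lon + span} q>:C"
    return XC_API + urllib.parse.quote(query)
```

The JSON response carries a `recordings` array, from which up to three entries would be picked and handed to Liquidsoap via the database.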
The Robot Folk Dance Experience (RFDE) is a live coding performance/toolkit using the patcher language Pure Data and the library iemguts. It is built around semi-autonomous agents that move on a canvas and automatically connect to nearby agents. Each agent has different sonic and/or control properties and connects to its closest neighbour. The performance starts with a blank canvas, onto which the performer places agents that communicate with their neighbours: some provide a beat or a subdivision of it, some sample playback, some reverb. The performer builds a network of these agents and eventually kicks off their autonomy by starting their movement. Through cutting, pasting, live-modifying the agents' guts, starting and stopping their movement, and manually moving agents around, a unique soundscape evolves.
Iemguts enables limited introspection into Pd patches (e.g., it gives an abstraction access to its physical position on the canvas, or a means to send messages to the canvas containing it) and makes it easier to create and change patches programmatically. My performance/toolkit expands on performances by iemguts creator IOhannes m Zmölnig. He first used these techniques in "do sinusoids dream of electric sweeps" (2008) (video), and later in his performance "pointillism" (info, video). In these performances, he uses iemguts to make abstractions move around in random walks and automatically connect themselves to nearby abstractions. His performance system thus has a degree of independent action by the abstractions built in. Due to the graphic nature of patcher languages, and specifically the use of iemguts to make the objects float around, connect and disconnect, this approach produces both a visually interesting (and to a degree meaningful) output for the audience and a primitive algorithmic co-improviser for the performer.
Further precedents are the reacTable [Jordà 2007], where players manipulate physical blocks on a table representing various waveforms, filters, and rhythms, which autoconnect based on proximity, and McLean's TidalCycles interface Texture [McLean 2011], where Haskell code auto-assembles in 2D space based on proximity and the argument types expected by the functions used.
As these previous works show, the two-dimensional layout of synthesis graphs and program code has been explored in live coding contexts before. Popular patcher languages (Max and Pure Data) are not strictly two-dimensional in nature, as the spatial position of objects in most cases does not influence the flow of execution. By autoconnecting based on proximity, RFDE (like the performances mentioned above) assigns more than decorative meaning to two-dimensional layout.
RFDE explores some extensions to the ideas outlined above: each agent has a specific way of moving over the canvas (e.g., in a circle, in a straight horizontal line, or randomly) and can have inputs and outputs that accept only specific signal types: control signals, audio signals, or both.
Agents can excite other agents to produce sound via synthesis or sample playback, receive and process sound, or manipulate control signals that determine synthesis parameters. Their placement on the canvas determines the panning of the emitted audio and, in some cases, other parameters such as the playback speed of samples. The performer places these agents in the scene and can manipulate their initial parameters, live-code their internals, or create new kinds of agents. By starting and stopping the movement of agents, interesting constellations can be frozen or destroyed.
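The agent behaviour described above can be caricatured outside of Pd. The following Python toy (all names hypothetical; the real agents are Pure Data abstractions wired up via iemguts) shows the movement styles, the nearest-neighbour connection rule, and panning derived from canvas position:

```python
import math, random

class Agent:
    def __init__(self, x, y, move="random"):
        self.x, self.y = x, y
        self.move = move     # "circular", "horizontal" or "random"
        self.moving = False  # agents start frozen until kicked off
        self.phase = 0.0

    def step(self, canvas_w=800):
        if not self.moving:
            return
        if self.move == "circular":
            self.phase += 0.1
            self.x += math.cos(self.phase)
            self.y += math.sin(self.phase)
        elif self.move == "horizontal":
            self.x = (self.x + 1) % canvas_w
        else:
            self.x += random.uniform(-2, 2)
            self.y += random.uniform(-2, 2)

    def pan(self, canvas_w=800):
        """Map canvas x position to a stereo pan value in -1..1."""
        return 2 * self.x / canvas_w - 1

def nearest(agent, others):
    """Each agent connects to its closest neighbour on the canvas."""
    return min(others, key=lambda o: math.hypot(o.x - agent.x, o.y - agent.y))
```

Freezing a constellation corresponds to setting `moving = False` on the agents; in the actual performance this is done by message-passing to the abstractions.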
The performance was presented at the Algofive and Algosix birthday streams (2017/2018), at the Chemical Algorave (2017, Newcastle upon Tyne, UK), and at Algorave Sheffield (2018). It was submitted to ICLC 2017, but a lack of funding prevented a presentation there.
We are four guys doing live coding. We have also performed live soundtracks to two films; my favourite is this one:
More info and links on our website
SCGraph was a graphics server for SuperCollider. I adopted the project, extended it, and ported it to macOS and more recent SuperCollider versions.
I also added support for video playback, text rendering, and various graphics primitives.
It never became really easy to install, and my VJ career never took off. Life came along and took the time away.