Hello

My name is Holger Ballweg. Welcome to my web page. I had a wordpress site before, but it was hacked and I decided to do something different and go back to plain HTML and CSS, following the permaculture principle of 'Use small and slow solutions'.

I've been working as a front-end developer for the last 2.4 years, building web interfaces with React.js.

I finished a PhD at Northumbria University in 2019; my thesis was titled 'Ways of Guided Listening: Embodied approaches to the design of interactive sonifications'. You can find it on Northumbria Research Link.

Before that, I did a Master of Arts at Karlsruhe University of Music. As part of my thesis I developed a live coding environment in Clojure that experimented with machine learning. It was called frankentone.

While I was studying in Karlsruhe, I became part of a live coding band called Benoît and the Mandelbrots.

News

27/3/2020

Made a simple website and added some projects.

Links to myself

holger.ballweg@gmail.com

GitHub

LinkedIn

Project: RFDE (2017-)

Description

The Robot Folk Dance Experience (RFDE) is a live coding performance/toolkit built with the patcher language Pure Data and the library iemguts. It is based around semi-autonomous agents that move on a canvas, each with different sonic and/or control properties, automatically connecting to the closest agent. The performance starts with a blank canvas, onto which the performer places agents that communicate with their neighbours: some provide a beat or a subdivision of it, others sample playback or reverb. The performer builds a network of these agents, eventually kicking off their autonomy by starting their movement. Through cutting, pasting, and live-modifying agents’ guts, starting and stopping their movement, and manually moving agents around, a unique soundscape evolves.

Influences/Prior art

Iemguts enables limited introspection into Pd patches (e.g., giving you access to the physical position of an abstraction on the canvas, or a means to send messages to the canvas containing it) and makes it easier to create and change patches programmatically. My performance/toolkit expands on performances by iemguts creator IOhannes m Zmölnig. He first used these techniques in "do sinusoids dream of electric sweeps" (2008) (video), and later in his performance “pointillism” (info, video). In these performances, he uses iemguts to make abstractions move around in random walks and automatically connect themselves to nearby abstractions. His performance system therefore has a degree of independent action by the abstractions built in. Due to the graphic nature of patcher languages, and specifically the use of iemguts to make the objects float around, connect, and disconnect, this approach creates both a visually interesting – and to a degree meaningful – output for the audience and a primitive algorithmic co-improviser for the performer.

Further precedents are the reacTable [Jordà 2007], where players manipulate physical blocks on a table representing various waveforms, filters, and rhythms, which autoconnect based on proximity, and McLean’s TidalCycles interface texture [McLean 2011], where Haskell code autoassembles in 2D space based on proximity and the argument types expected by the functions used.

As these previous works show, the two-dimensional layout of synthesis graphs and program code has been explored in a live coding context before. Popular patcher languages (Max and Pure Data) are not strictly two-dimensional in nature, as the spatial position of objects in most cases does not influence the flow of execution. By autoconnecting based on proximity, RFDE (like the other performances mentioned above) assigns more than decorative meaning to two-dimensional layout.

RFDE explores some extensions of the ideas outlined above: each agent has a specific way it moves over the canvas (e.g., circularly, in a straight horizontal line, or randomly) and can have inputs and outputs that accept only specific signals – control signals, audio signals, or both.
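To illustrate, the movement styles could be sketched like this in Python. This is only an illustrative sketch, not the actual Pd/iemguts implementation – the function names and the normalised [0, 1] canvas coordinates are my own assumptions:

```python
import math
import random

# Illustrative sketch only: each movement style maps a time step (or the
# current position, for the random walk) to an (x, y) canvas position.
# Coordinates are assumed normalised to [0, 1]; the real patches differ.

def circular(t, cx=0.5, cy=0.5, r=0.25, speed=1.0):
    """Move in a circle of radius r around (cx, cy)."""
    return (cx + r * math.cos(speed * t), cy + r * math.sin(speed * t))

def horizontal(t, y=0.5, speed=0.05):
    """Move in a straight horizontal line, wrapping at the canvas edge."""
    return ((speed * t) % 1.0, y)

def random_walk(pos, step=0.02):
    """Take a small random step from the current position, clamped to the canvas."""
    x, y = pos
    x = min(1.0, max(0.0, x + random.uniform(-step, step)))
    y = min(1.0, max(0.0, y + random.uniform(-step, step)))
    return (x, y)
```

Calling one of these per animation tick and moving the abstraction to the returned position gives the kinds of trajectories described above.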

Agents can excite other agents to produce sound using sound synthesis or sample playback, receive sound and process it, or manipulate control signals determining sound synthesis parameters. Their placement on the canvas determines the panning of the emitted audio and, in some cases, other parameters, such as the playback speed of samples. The performer places these agents in the scene and can manipulate their initial parameters, as well as live-coding their internals or creating new kinds of agents. By starting and stopping the movement of agents, interesting constellations can be frozen or destroyed.
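The connection rule – each agent links to the nearest agent whose input accepts its output type, and the x position sets the pan – could be sketched as follows. The `Agent` class, its attribute names, and the type labels are hypothetical, invented for this sketch rather than taken from the RFDE patches:

```python
import math

# Hypothetical sketch of the connection logic described above; names and
# signal-type labels ('audio'/'control') are assumptions, not RFDE's own.

class Agent:
    def __init__(self, name, x, y, out_type, in_type=None):
        self.name = name
        self.x, self.y = x, y
        self.out_type = out_type  # what this agent emits: 'audio' or 'control'
        self.in_type = in_type    # what it accepts, or None if it has no input

    def pan(self):
        # Map x in [0, 1] to a stereo pan in [-1, 1] (left to right).
        return 2.0 * self.x - 1.0

def nearest_compatible(agent, others):
    """Find the closest other agent whose input accepts this agent's output."""
    candidates = [o for o in others
                  if o is not agent and o.in_type == agent.out_type]
    if not candidates:
        return None
    return min(candidates,
               key=lambda o: math.hypot(o.x - agent.x, o.y - agent.y))
```

For example, a beat agent emitting control signals would connect to a nearby sampler that accepts control input, while the sampler's audio output would in turn connect to the closest reverb; re-running `nearest_compatible` as agents move re-wires the network on the fly.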

The performance was presented at the Algofive and Algosix birthday streams (2017/2018), the Chemical Algorave (2017, Newcastle upon Tyne, UK), and Algorave Sheffield (2018). It was submitted to ICLC 2017, but a lack of funding prevented a presentation there.

References

[Jordà 2007] Jordà, S., Geiger, G., Alonso, M., & Kaltenbrunner, M. (2007). The reacTable: Exploring the Synergy between Live Music Performance and Tabletop Tangible Interfaces. Proceedings of TEI '07.

[McLean 2011] McLean, A. (2011). Artist-Programmers and Programming Languages for the Arts. PhD thesis, Goldsmiths, University of London.

Band: Benoît and the Mandelbrots (2009-)

We are four guys doing live coding. We have also performed live soundtracks to two films; my favourite is this one:

More info and links on our website

Project: Maintaining SCGraph (2012-2016)

SCGraph was a graphics server for SuperCollider. I adopted the project, extended it, and ported it to Mac and more recent SuperCollider versions.

I also added support for video playback, text rendering and various graphics primitives.

It never really became easy to install and my VJ career never took off. Life came and took the time away.