Since 2022 we have been building a control interface in TouchDesigner that lets us drive our visuals and add audio reactivity (or MIDI) to any control parameter. Most importantly, it lets us switch between all the animation .tox files we created, across 8 different layers. We developed a random system that opens up new animation possibilities, and a preset system that can be sequenced to the BPM. Everything is controlled through MIDI controllers or directly with the mouse from the interface, and we added PS5 controller integration to move inside the visuals & point clouds.
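The BPM-sequenced preset logic can be sketched in plain Python, outside TouchDesigner. This is a minimal sketch, not our actual network: the function name and the `beats_per_step` parameter are hypothetical, but it shows the idea of stepping through a preset bank in time with the music.

```python
def bpm_step_index(elapsed_s, bpm, n_presets, beats_per_step=4):
    """Map elapsed time to a preset slot, advancing every `beats_per_step` beats.

    elapsed_s: seconds since the sequence started
    bpm:       tempo from the audio analysis (or tapped manually)
    """
    beat_len = 60.0 / bpm                      # seconds per beat
    step = int(elapsed_s / (beat_len * beats_per_step))
    return step % n_presets                    # cycle through the preset bank
```

At 120 BPM with 4 beats per step, the active preset changes every 2 seconds and wraps around once the bank is exhausted.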
The interface has 2 preview modes: Live mix (two layers side by side + master), which lets us prepare animations before sending them to the master, and a preview mode that shows the visuals full screen behind the buttons of the interface.
At first we chose to build the control interface only to preview what we were doing and see the names of the effects in a more efficient layout. Then we wanted to be able to work on the visuals without having to connect our controllers, which lets us work in many more setups (on the train, on holidays…). Lately, we chose to turn it into something closer to standalone software. From the interface, it is now possible to build presets for each .tox animation and play randomly between them. We can also right-click on the main knobs to add audio reactivity (low, mid, high, bpm) to any setting. The goal is firstly to get more randomness into our animations (there is even a random button that creates random presets, inspired by the amazing zerror.tox by 404 zero). Secondly, it works as a backup system: if one of our controllers stops working, we can always change any setting directly on the Mac.
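The random-preset button boils down to picking a random value inside each parameter's valid range. Here is a small sketch of that idea in plain Python; the parameter names and ranges are made up for illustration, and a real version would read them from the .tox's custom parameters.

```python
import random

def random_preset(param_specs, seed=None):
    """Build a random preset: one random value per parameter, within its range.

    param_specs: maps parameter name -> (min, max) tuple.
    Passing a seed makes the preset reproducible (handy for recalling a favourite).
    """
    rng = random.Random(seed)
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in param_specs.items()}

# Hypothetical parameter ranges for one animation .tox
specs = {"feedback": (0.0, 1.0), "speed": (0.1, 4.0), "hue": (0.0, 360.0)}
preset = random_preset(specs, seed=42)
```

Seeding the generator is the design choice that turns "pure chaos" into something you can save and sequence: the same seed always rebuilds the same preset.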
The system is divided into 5 different parts: Audio input, analysis and settings / MIDI inputs and control settings / Resolution settings / Layer & FX / Outputs & 3D preview systems, plus the control interface component. Separating the systems into different components keeps us organised and lets several people work on the same project - we only need to replace what has been changed.
The audio inputs feed an analysis system built by combining a low-resolution waveform TOP with a different white rectangle for each analysed frequency band. By moving a rectangle to isolate part of the waveform, we can then analyse the quantity of white inside the rectangle / waveform overlap and turn it into a control value. Huge thanks to B2BK for the idea!
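Outside TouchDesigner, the "quantity of white" trick is just averaging the bright pixels inside a region of the rasterised waveform. A minimal sketch, assuming the waveform has already been rendered to a 2D grid of 0/1 pixel values (in TouchDesigner this average would come from an Analyze TOP instead):

```python
def band_level(waveform_px, x0, x1, y0, y1):
    """Average the 'white' pixels of a rasterised waveform inside a rectangle.

    waveform_px: 2D list of 0/1 pixel values (rows = y, columns = x)
    (x0, x1, y0, y1): the rectangle isolating one frequency band
    Returns a 0.0-1.0 control value.
    """
    total = count = 0
    for y in range(y0, y1):
        for x in range(x0, x1):
            total += waveform_px[y][x]
            count += 1
    return total / count if count else 0.0
```

The nice property is that moving or resizing a rectangle retunes a band live, with no FFT settings to touch.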
The MIDI inputs component handles everything related to controllers: APC40MK2, Midi Fighter, Midimix, LaunchpadXL, PS5 controllers, and everything related to mouse control through the interface. Later we wish to add the possibility to remap every control to any controller. We also built a system that can receive MIDI signals from sound artists, like with Owelle on our project Echoes of Archs.
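A controller-agnostic remap layer can be sketched as a lookup table from (device, CC number) to a target parameter. This is only an illustration of the remapping idea we'd like to add; the device names and parameter names are hypothetical.

```python
def route_midi(mapping, device, cc, value, params):
    """Apply an incoming CC message to whichever parameter it is mapped to.

    mapping: {(device_name, cc_number): parameter_name}
    params:  dict of current parameter values, updated in place
    MIDI values (0-127) are normalised to 0.0-1.0.
    """
    target = mapping.get((device, cc))
    if target is not None:
        params[target] = value / 127.0
    return params

# Hypothetical mapping: knob 16 on the APC40 drives the feedback amount
mapping = {("apc40", 16): "feedback"}
```

Because the table is just data, swapping a broken controller for another one only means loading a different mapping, not rewiring the network.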
The resolution settings component only provides master resolution and height values, later used by each .tox animation to adapt everything procedurally to the venue scenography.
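Deriving each layer's render size from the master values can be sketched like this. The function and its `scale` / `max_w` parameters are assumptions for illustration; the point is that one master resolution procedurally drives every layer while preserving the venue's aspect ratio.

```python
def layer_resolution(master_w, master_h, scale=1.0, max_w=4096):
    """Derive a layer's render resolution from the master venue resolution.

    scale: per-layer quality factor (e.g. 0.5 for a cheap background layer)
    max_w: hard cap to protect GPU memory on very wide LED walls
    """
    w = min(int(master_w * scale), max_w)
    h = int(w * master_h / master_w)    # preserve the venue aspect ratio
    return w, h
```

For example, a half-resolution layer on a 1920x1080 master renders at 960x540, and an 8000-pixel-wide wall gets clamped to the cap.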
Layer & FX outputs is the main component where we store all the animations and build the mix FX between the layers. We use a component that lets us switch between the .tox files in our animations folder, for more flexibility during a live set. The master FX settings also live there, along with a custom time displacement .tox that lets us mix the layers together at different time values.
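The time displacement mix can be sketched as a ring buffer of recent frames, with each layer reading from a different delay before the crossfade. This is a simplified plain-Python model (frames as flat lists of pixel values); in TouchDesigner the same idea would use a frame cache, and the class name is hypothetical.

```python
from collections import deque

class TimeDisplacedMix:
    """Keep the last `depth` frames and mix two taps at different delays."""

    def __init__(self, depth=30):
        self.frames = deque(maxlen=depth)   # ring buffer of recent frames

    def push(self, frame):
        self.frames.append(frame)

    def mix(self, delay_a, delay_b, blend=0.5):
        """Crossfade the frame from `delay_a` frames ago with the one
        from `delay_b` frames ago (0 = the most recent frame)."""
        a = self.frames[-1 - delay_a]
        b = self.frames[-1 - delay_b]
        return [blend * x + (1 - blend) * y for x, y in zip(a, b)]
```

Pushing the delays apart smears a layer against its own past, which is where the ghosting/echo look of the effect comes from.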
Finally, the master output holds all the output settings: color correction, video mapping tools and different preview presets for immersive spaces, architecture mapping, etc.
We still need to organise the project and work a lot on optimisation and new features, and in the coming years we are seriously thinking about sharing the project with the public.