Join us for an immersive workshop blending technology, art, and interaction, designed for artists, designers, musicians, performers, and curious minds who want to experiment with interactive technology and visual creativity.


LIVE EVENT SOLD OUT.

The workshop recording will be uploaded on 11.04.2025.
Register here to get access to the full recording.


    During the workshop we will explore the integration of TouchDesigner, Max MSP, and interactive lighting control to create a dynamic audiovisual experience.

    Software that will be used:
    – TouchDesigner (managing the AI visuals and integrating the Kinect and lights)
    – Stable Diffusion (generating AI images in real time)
    – Max MSP (sound design and audio-light synchronization)

    We will start by using the silhouette captured by the Kinect to generate reactive AI visuals through Stable Diffusion, which change in real time with the participants' movements. Next, we will integrate the lights, controlling them through body movement and creating a direct interaction with the environment.
    Max MSP adds a sound dimension, influencing both the visuals and the lights. The goal is to create an interactive experience in which body, light, and sound merge through advanced technologies.
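Audio-light synchronization of this kind is typically done over OSC: Max MSP's `udpsend` object emits OSC packets that TouchDesigner's OSC In operators receive. As a rough, hedged illustration, the snippet below hand-encodes a single-float OSC 1.0 message using only the standard library; the `/amp` address is a hypothetical example, not an address used in the workshop.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, per OSC 1.0."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_float_message(address: str, value: float) -> bytes:
    """Encode an OSC message carrying one float32 argument,
    e.g. the current audio amplitude sent as '/amp 0.8'."""
    return (osc_pad(address.encode()) +  # padded address string
            osc_pad(b",f") +             # type tag: one float
            struct.pack(">f", value))    # big-endian float32 payload

msg = osc_float_message("/amp", 0.5)
print(len(msg))  # prints 16 (all OSC messages are 4-byte aligned)
```

Sending `msg` through a UDP socket to TouchDesigner's listening port would then let the audio level drive visual and lighting parameters in real time.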

    With:

    Nima Gazestani
    Interaction Designer, Creative Technologist, and University Lecturer
    Jack Sapienza
    Sound Designer, Music Producer, and Co-founder of RKH Studio

    Get hands-on with the system and experience how your movements generate real-time images, sounds, and lighting effects.