aux synesthesia

Interdisciplinary
project team

B.A. Product Design
Kunsthochschule Berlin-Weißensee

Jan Batelka

B.A. Computer Science
Freie Universität Berlin

Thushan Satkunanathan

B.A. Computer Science
Freie Universität Berlin

Description

This project creates an interface between the real and the artificial world. Acoustic signals traverse space and are absorbed, reflected, or changed in quality. These auditory dimensions are translated into an extraordinary virtual space that is based on physical reality, reducing it and reinterpreting it for our consciousness as a narrative and intuitive experience.

This augmented reality is an abstracted reflection of the true world, reduced to the details we most need to orient ourselves in our environment. These details appear as digital sound waves that react differently to certain materials, moving objects, or changing situations.

Implementation

Aux synesthesia acts as an interface between physical and virtual reality, creating an installation on a performative level.

The experience limits the world to what we need to see and deliberately omits the rest, a response to the constant sensory overload to which our environment exposes us.

To develop this augmented exploration, we created a physical and a virtual prototype to explore and convey our vision: the physical prototype recognises its surroundings in real time. The captured data is translated into a stereo audio signal whose frequency increases or decreases as an object comes closer or the subject's position changes.
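
To make this concrete, the following is a minimal Python sketch of a depth-to-frequency mapping of the kind described above. The frequency range, distance limits, and function names are illustrative assumptions, not the prototype's actual tuning.

    import numpy as np

    SAMPLE_RATE = 44100           # audio sample rate in Hz
    F_NEAR, F_FAR = 880.0, 110.0  # assumed range: close objects sound higher

    def depth_to_frequency(depth_m, d_min=0.5, d_max=4.0):
        """Map a distance in metres to a tone frequency; the closer the
        object, the higher the pitch (range and curve are assumptions)."""
        d = min(max(depth_m, d_min), d_max)
        t = (d - d_min) / (d_max - d_min)   # 0 = nearest, 1 = farthest
        return F_NEAR + t * (F_FAR - F_NEAR)

    def stereo_tone(depth_left_m, depth_right_m, duration_s=0.05):
        """Render a short stereo buffer; the left/right channels encode
        the nearest depth in the left/right half of the field of view."""
        t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
        left = np.sin(2 * np.pi * depth_to_frequency(depth_left_m) * t)
        right = np.sin(2 * np.pi * depth_to_frequency(depth_right_m) * t)
        return np.column_stack([left, right]).astype(np.float32)

    # Example: an object 0.8 m away on the left, open space on the right
    buffer = stereo_tone(0.8, 3.5)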

Technology

The handset is the interface between the real and the virtual world. The device is guided by hand in any direction and recognises changing spatial situations.


Inside the handset are a Kinect for depth detection, a Raspberry Pi for data processing, and a step-up converter for the power supply.
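
As a sketch of the read-in loop on the Raspberry Pi: the snippet below assumes the open-source libfreenect Python bindings for the Kinect (the project's actual software stack is not documented here) and reduces each depth frame to the nearest reading in the left and right halves of the field of view, ready to drive a stereo signal.

    import freenect    # libfreenect Python bindings (an assumption)
    import numpy as np

    def nearest_depths():
        """Grab one depth frame and return the nearest raw reading in
        the left and right halves of the Kinect's field of view."""
        depth, _ = freenect.sync_get_depth()  # 480x640 array of 11-bit values
        depth = depth.astype(np.float32)
        depth[depth == 2047] = np.nan         # 2047 marks 'no reading'
        half = depth.shape[1] // 2
        return np.nanmin(depth[:, :half]), np.nanmin(depth[:, half:])

    left, right = nearest_depths()  # feed into the audio mapping above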

Virtual reality

The data read in by the handheld device is output as a fractal visualization within the virtual environment. Patterns form that vary from object to object, depending on materiality and nature, and support the auditory signal.


The virtual simulation acts as a full representation of the real environment – coded in Unity and ready for the HTC Vive. A sound wave layer is “placed” over the real world and appears with every auditive interaction as a fractal visual echo.
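
The installation's own visual layer is built in Unity; purely as a language-neutral illustration of the wave-layer idea, the Python sketch below renders one frame of a ripple spreading from an interaction point, where an assumed "materiality" value mixes in a higher harmonic so that different surfaces produce differently textured echoes.

    import numpy as np

    def wave_frame(size=256, t=0.3, materiality=0.5):
        """Height field of a ripple expanding from the centre at time t.
        'materiality' (0..1) adds a higher harmonic, so harder surfaces
        yield a more jagged, fractal-looking echo (an assumption)."""
        y, x = np.mgrid[-1:1:size * 1j, -1:1:size * 1j]
        r = np.hypot(x, y)
        envelope = np.exp(-4 * r)                   # echo fades with distance
        base = np.sin(40 * (r - t)) * envelope      # travelling ring
        detail = np.sin(120 * (r - t)) * envelope   # higher harmonic
        return base + materiality * 0.5 * detail

    frame = wave_frame()  # e.g. display with matplotlib's imshow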

This project is supported by