30 August - 1 September 2023
Edinburgh Napier University, United Kingdom

Embodied Sound in the Virtual

Demo Program

Paraslap: Bass string instrument designed for the slap technique

Rotem Ifrach, Ezri Tarazi and Lior Arbel (Technion, Israel Institute of Technology)

Since its introduction by Leo Fender in the 1950s, the electric bass guitar has become immensely popular among musicians and fans alike. To this day it plays a significant role in modern Western music across a wide range of genres and styles. The slap playing technique, which involves forcefully hitting the strings with the thumb, has gained significant popularity over the years and has become the signature sound of genres such as jazz and funk. However, no instrument has been specifically designed to accommodate the slap technique. This work describes the Paraslap, an electroacoustic string instrument designed specifically for slap playing. The instrument features eight open bass strings mounted on a central pillar, allowing both hands to slap the strings and produce more elaborate slap phrases.

A demonstration: real-time emotionally adaptive music in a VR immersive environment

Marta Rossi (Abertay University)

In this demo, the author presents a workflow for creating an immersive, auto-generative VR project using Max/MSP and Unreal Engine 5. Although much has been written and shown about immersive techniques for asynchronous VR projects, very little can be found on how to set up a real-time immersive VR space. EEG sensors on an Arduino drive emotionally adaptive music generated in Max/MSP and on a hardware modular synthesiser; the sounds are encoded in Higher-Order Ambisonics (HOA) with the Max/MSP Spat library and rendered in real time in binaural format for the VR headset's headphones. Headset tracking data is gathered separately in Max/MSP and UE5 to minimise latency. The sensor data, sent to UE5 via OSC, also modifies the 3D environment and Niagara systems in Unreal Engine 5 through bespoke referencing of Unreal Blueprints. The audience will learn a method for integrating tools for real-time sound generation and spatialisation in Unreal Engine, and for creating interactive VR installations that challenge the interaction between user and artwork, destabilising the subject-object hierarchy.
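The sensor-to-UE5 OSC link described above can be sketched in a few lines. The fragment below is a minimal, hedged illustration of the OSC wire format itself (a padded address string, a ",f" typetag, and a big-endian float) sent over UDP; the `/eeg/alpha` address, port 8000, and the normalisation range are illustrative assumptions, not details from the author's actual patch, which uses Max/MSP and Blueprint OSC facilities rather than Python.

```python
import socket
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes.
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Encode a single-float OSC message: padded address, ",f" typetag,
    then the value as a 32-bit big-endian float."""
    return _pad(address.encode()) + _pad(b",f") + struct.pack(">f", value)

def normalise(raw: int, lo: int = 0, hi: int = 1023) -> float:
    """Map a raw Arduino ADC reading (0-1023 assumed) into 0.0-1.0."""
    return max(0.0, min(1.0, (raw - lo) / (hi - lo)))

if __name__ == "__main__":
    # Hypothetical receiver: a UE5 OSC server listening on localhost:8000.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for name, raw in {"alpha": 512, "beta": 800}.items():
        msg = osc_message(f"/eeg/{name}", normalise(raw))
        sock.sendto(msg, ("127.0.0.1", 8000))  # would drive Niagara parameters
```

A dedicated OSC library (e.g. python-osc) would normally handle the encoding; the hand-rolled version is shown only to make the protocol step of the pipeline concrete.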