Doors open at 7 pm, with the concert starting at 7:30 pm.
The concert will be hosted at the Glassroom, Edinburgh Napier University, Merchiston Campus (directions). Transport from the conference's Craiglockhart Campus to Merchiston Campus is provided; the pickup time is 6:30 pm. Please note that the bus does not make any stops; if you wish to stop at your hotel or take a different route, you will have to make your own arrangements.
Goni Peles (Bath Spa University) and Yuval Adler (McGill University)
ScoreCraft is a multiplayer music game exploring online music making mediated through gameplay. Each player controls the game by producing sounds; therefore, by playing the game, the players are making music. ScoreCraft is structured as a modular environment consisting of a collection of mini-games and scenarios. Mini-games require the players to produce a particular set of sounds in order to interact with the game, shaping musical material. Scenarios determine how the mini-games are arranged, organising the musical material into larger musical forms. We present a performance of the ScoreCraft scenario ‘Fall’, which is based on the mini-game ‘Gaps’. The scenario consists of a series of race tracks, each containing barriers with gaps in them, which the players must pass through in order to reach the bottom of the track and complete it. Whenever a player completes a track, all the players move to the next track. The player who has passed through the highest number of gaps by the end of the scenario wins.
Emma Margetson (University of Greenwich)
A work combining close-up microphone recordings of a bicycle and field recordings from the Elbe Tunnel in Hamburg, a 426m tunnel for pedestrians, bikes and cars. Drawing on the cyclic, continuous nature of the material, the evolving sounds reference the never-ending structures found within such spaces. This is a 3D spatial composition composed for 16 loudspeakers. The immersive spatial environment subtly transitions and morphs between the two sound sources: the bicycle and the tunnel. This immersive environment, with its spatial transitions, takes the listener on a journey to different places, drawing on the philosophy of reality tunnels and the poem by Atticus:
“Life is a tunnel
not a cave
and you will find again
what you left behind.”
Gravity and Density
Anthony T. Marasco (University of Texas Rio Grande Valley) and Jesse Allison (Louisiana State University)
gravity | density is a work for cyber-hacked devices and web audio applications. Our goal is to develop systems that merge repurposed and hacked pieces of hardware into the networked world of web art. While the electronic sophistication of mobile devices and the flexibility of web applications allow artists to create immersive audiovisual environments without the use of traditional music hardware, we believe that digital artists should not cast aside the tools of the past, but rather find new and creative ways of modifying them so that they can inform the ways in which we explore and create with new digital, web-based tools. Through these new hybrid systems, we can both embrace the limitations and push the boundaries of any hardware we use for the purpose of creating collaborative sonic environments.
In gravity | density, we begin by manipulating fixed-audio sources through the performance of hacked CD players. The sonic results of this mangled audio are sampled and then distributed to the audience’s mobile devices in both passive and interactive manners. Passive distributions allow us to create an intricately spatialized rhythmic interplay between the glitching CD players and the blanket of overlapping samples dispersed throughout the networked audience. Active distributions enable the audience to join in our performance: audience members choose small portions of the audio sent to them and send these selected samples back to us; we string this audio together and feed it into a cyber-controlled distortion pedal before sending it back to the audience for further manipulation. This results in overlapping cycles of control and audio generation between performer, audience, network, and machine.
VRitual AV: Tactile Micro Interface
Steve Gibson (Northumbria University)
Continuing the motion-based work of Virtual VJ and Opto-Phono-Kinesia, I have developed VRitual AV. In response to the limitations introduced by the COVID-19 pandemic, I utilised a more portable and simpler interface than the room-scale GAMS motion-tracking system used in the above projects. The key aim is to create a tactile micro interface that is easy to use and, most importantly, easy to transport. The interface serves as a solution for older body-based performers (including myself) who may physically struggle with larger-area interaction. A secondary aim is to provide a model for gestural control of mixed media (sound, light, video) in a single form that is repeatable yet complex enough for genuine expert performance. VRitual AV is a body-based audio-visual performance that deliberately diverges from much live audio-visual work, in which the performer sits behind a desk and controls audio-visuals by interacting with a traditional computer interface. Here, the computer itself is deliberately hidden, and the interface is contained within the gestural actions and movements of the performer’s body.
Atau Tanaka (Goldsmiths, University of London)
Myogram is a concert work for performer and the EAVI bio-electrical interface as musical instrument. The sensors capture electromyogram (EMG) signals reflecting muscle tension. The system renders the performer’s own body as a musical instrument, allowing him to articulate sound through concentrated gesture. It is a direct sonification of muscle activity, in which we hear the neural impulses of muscle exertion as data. Throughout the piece, the raw data is first heard unprocessed, then filtered, and finally used to excite resonators and filters in a modular synthesizer system.