
Hypernature

by WZY and aleo on 31 May 2023 for Rookie Awards 2023


Hypernature - An encounter with the soundscape of the future 

Ziyue Wang (CN), Alicia Leonie Waibel (DE)


How could we speculate about the sound of the future? 

Hypernature reveals the fragility of our sound ecology through an immersive audio-visual experience, simulating the evolution of a natural soundscape under the effects of global warming.

The project is based on speculative data showing how a forest evolves under human-made climate change. Changes in localised meteorological conditions and the associated foliage densities directly affect bird song, leading to the adaptation of new communication typologies.

The project represents these emergent soundscapes with twelve synthetic birds arranged in a spherical coordinate system, moving around a listener. The simulated bird song changes through adaptive frequency modulation synthesis; the output is an audible evolution in site-specific communication.
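
To make the spatial idea concrete, here is a minimal Python sketch; the installation itself positions the birds in Unity and spatialises them over an ambisonic speaker array, so the radius, elevations and function names below are purely illustrative. It distributes twelve sources on a sphere around a listener at the origin.

```python
import math

def spherical_to_cartesian(radius, azimuth, elevation):
    """Convert spherical coordinates (radians) to Cartesian x, y, z."""
    x = radius * math.cos(elevation) * math.cos(azimuth)
    y = radius * math.cos(elevation) * math.sin(azimuth)
    z = radius * math.sin(elevation)
    return x, y, z

# Place 12 virtual birds evenly around the listener at alternating heights.
NUM_BIRDS = 12
birds = []
for i in range(NUM_BIRDS):
    azimuth = 2 * math.pi * i / NUM_BIRDS           # spread around the listener
    elevation = math.radians(15 if i % 2 else -15)  # alternate above/below ear level
    birds.append(spherical_to_cartesian(radius=5.0, azimuth=azimuth, elevation=elevation))
```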

The interaction with the virtual landscape and the resulting change in sound are a metaphor for human intervention in the landscape and its effect on avian communication. The user experiences the future soundscape in a direct and immersive manner.

Hypernature demonstrates the impact of environmental change on a soundscape. The design project simulates causes and effects that are normally intangible due to the vast timescales of natural evolution.

Point cloud translation in Unity as a VFX particle system; we scanned parts of Epping Forest near London with a LiDAR 3D scanner.

Synthesised bird sound: this soundtrack presents the different stages of bird evolution towards extinction.

Technical Description

Hypernature is a medium-sized audio-visual interactive installation containing virtual and physical elements. The virtual part of the installation is created with Unity and Max MSP (Max Signal Processing); it provides an interactive environment experienced by a single participant, which is also projected into the physical space for the benefit of a potentially unlimited audience. We use the Oculus system and its inbuilt cameras. The physical part of the installation consists of an ambisonic speaker array (six speakers in total) accompanied by three suspended fabric panels onto which the virtual environment is projection mapped from the rear, preventing the participant from casting shadows onto the projection. The speakers that make up the ambisonic sound environment surround the participant at different vertical levels, increasing the immersive nature of the experience. The projection mapping shows a live feed of the virtual environment the participant is navigating, so the external audience can follow the story and the participant's reactions.

The virtual forest environment responds to the movements of the participant, which in turn influence the sounds emitted by the virtual birds that inhabit this forest. The environment represents the evolution the forest is likely to go through in future years, prompting the bird sound to adapt, and it translates researched effects of different meteorological conditions and forest characteristics on how sound waves propagate. The synthesis in Max MSP enables the digital birdsong to adapt and evolve: the frequency, amplitude, rhythm and song syllable duration of every bird are controlled by the participant's presence in the virtual environment.
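
As an illustration of this kind of control mapping, a minimal Python sketch is given below, assuming a simple distance-based rule. The actual mapping is implemented in Max MSP; the function name, ranges and directions here are hypothetical.

```python
def map_presence_to_song(distance_m, max_distance_m=10.0):
    """Map the participant's distance to a bird to hypothetical song parameters.

    In this sketch, a closer participant pushes the song toward higher pitch,
    louder amplitude and shorter, more hurried syllables - a stand-in for the
    adaptive behaviour described above.
    """
    proximity = max(0.0, 1.0 - distance_m / max_distance_m)  # 0 = far, 1 = close
    return {
        "frequency_hz": 2000 + 3000 * proximity,  # base pitch rises when approached
        "amplitude":    0.2 + 0.8 * proximity,    # louder when approached
        "syllable_s":   0.30 - 0.20 * proximity,  # shorter syllables when approached
        "rhythm_bpm":   60 + 90 * proximity,      # faster song rhythm
    }
```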

The sonic component of the experience is characterised by a flock of synthesised birds. The birds are driven by several clockers in Max MSP, which control the triggers for each of the 12 birds, the duration of their songs, their order of singing and their reduction in number. ADSR (attack, decay, sustain, release) envelopes linked to layered carrier waves produce the bird-like sound. The envelopes shape the sound wave into forms derived from dawn-chorus syllables of real bird song recorded in Epping Forest. We analysed the song strings of the different birds appearing in our recordings after transforming them into spectrograms.
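
A rough Python/NumPy sketch of that synthesis idea follows; the project realises it in Max MSP, and the envelope times, carrier frequencies and modulation values below are hypothetical. An ADSR envelope shapes layered, frequency-modulated sine carriers into a single bird-like syllable.

```python
import numpy as np

SR = 44100  # sample rate in Hz

def adsr(n, attack=0.01, decay=0.05, sustain=0.6, release=0.08):
    """Piecewise-linear ADSR envelope over n samples (times in seconds)."""
    a, d, r = int(attack * SR), int(decay * SR), int(release * SR)
    s = max(n - a - d - r, 0)
    return np.concatenate([
        np.linspace(0, 1, a, endpoint=False),        # attack ramp
        np.linspace(1, sustain, d, endpoint=False),  # decay to sustain level
        np.full(s, sustain),                         # sustain plateau
        np.linspace(sustain, 0, r),                  # release ramp
    ])[:n]

def bird_syllable(duration=0.18, carrier=3200.0, mod=40.0, index=600.0):
    """One frequency-modulated, envelope-shaped syllable (all values hypothetical)."""
    t = np.arange(int(duration * SR)) / SR
    inst_freq = carrier + index * np.sin(2 * np.pi * mod * t)  # FM chirp around the carrier
    phase = 2 * np.pi * np.cumsum(inst_freq) / SR              # integrate frequency to phase
    tone = 0.7 * np.sin(phase) + 0.3 * np.sin(2 * phase)       # layered carrier waves
    return adsr(len(t)) * tone

syllable = bird_syllable()  # an array of samples at SR, ready to write to a buffer or file
```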

The forest environment is generated from a point cloud of Epping Forest, captured with a FARO laser scanner. The point cloud is manipulated in Unity in response to the proximity of the participant's hands to several trigger areas, which enables a non-linear experience. Hand proximity influences the shape (defoliation) and colour (green to red, simulating temperature change) of the forest VFX patterns. Unity and Max MSP are linked via an OSC connection to build up the user-interaction part of the project.
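
As a hedged illustration of that OSC link, the Python sketch below (using the python-osc library rather than the project's actual Unity-to-Max MSP setup) forwards normalised interaction values to a sound engine; the host, port and OSC addresses are invented for the example.

```python
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical OSC bridge: the visual side would send interaction data like this
# to the synthesis patch. Host, port and address names are illustrative only.
client = SimpleUDPClient("127.0.0.1", 7400)

def send_interaction(hand_proximity, defoliation, temperature_norm):
    """Forward normalised interaction values (0..1) to the sound patch."""
    client.send_message("/hypernature/hand/proximity", float(hand_proximity))
    client.send_message("/hypernature/forest/defoliation", float(defoliation))
    client.send_message("/hypernature/forest/temperature", float(temperature_norm))

send_interaction(hand_proximity=0.8, defoliation=0.4, temperature_norm=0.6)
```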

For a more sensory and natural experience we chose not to use the default VR gaming controllers. Instead, we worked with hand movement tracking through the Oculus system and its inbuilt cameras. The participant can reach for leaves with their fingers.

Research Background

Visual representation and interaction design - The forest is the visual translation of landscape change driven by global warming. It gives visual form to the change in sound and to the soundscape of the future.

Sound representation and soundscape design - The audio is represented through bird sound that reacts to global warming and the change in landscape, as well as to the impact of human noise on avian communication. It imagines future changes in birdsong caused by human interference.

Exhibitions & Scientific references

