Despite widespread awareness of climate change, environmental degradation continues, and new modes of engagement are needed to reframe ecological discourse. Creative practice, particularly sound art, offers unique ways to embody and communicate the sensory dimensions of environmental change. Synthetic Ornithology situates itself at the intersection of sonic arts, machine learning, and climate change communication, responding to the growing need for affective approaches to communicating complex environmental challenges.
This interactive sound installation uses a custom generative machine learning model, EAGLE (Environmental Audio Generation for Localised Ecologies), to synthesise hyperrealistic birdsong-centred soundscapes based on future climate scenarios. Trained on thousands of recordings from across Australia tagged with temporal, geographic, and climate data, the model maps correlations between ecological audio features and environmental conditions. Audiences engage via a touchscreen interface, selecting time, location, and climate parameters to generate speculative sonic ecologies that reflect potential futures shaped by climate change.
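The interaction described above can be sketched in code. The snippet below is purely illustrative: EAGLE's actual architecture and parameters are not detailed here, so the `ClimateScenario` fields and the heuristic mapping are hypothetical stand-ins for the learned correlations between climate conditions and audio features.

```python
from dataclasses import dataclass

@dataclass
class ClimateScenario:
    # Hypothetical conditioning parameters, analogous to those
    # audiences set on the installation's touchscreen.
    year: int          # e.g. 2050
    latitude: float    # location within Australia
    longitude: float
    warming_c: float   # projected warming in degrees Celsius

def soundscape_settings(s: ClimateScenario) -> dict:
    """Toy stand-in for EAGLE's learned mapping from climate
    scenario to synthesis controls. Here, warmer scenarios thin
    out call density and shift pitch upward, as an illustrative
    heuristic only."""
    call_density = max(0.0, 1.0 - 0.15 * s.warming_c)
    pitch_shift = 0.02 * s.warming_c
    anthropogenic_mix = min(1.0, 0.1 + 0.05 * (s.year - 2025) / 10)
    return {
        "call_density": round(call_density, 3),
        "pitch_shift": round(pitch_shift, 3),
        "anthropogenic_mix": round(anthropogenic_mix, 3),
    }

# A Melbourne-area scenario with 2 °C of warming by 2050.
print(soundscape_settings(ClimateScenario(2050, -37.81, 144.96, 2.0)))
```

In the installation itself, such conditioning values would steer a generative audio model rather than a hand-written heuristic; the sketch only shows the shape of the parameter-to-soundscape pipeline.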
By transforming abstract climate data into immersive auditory experience, Synthetic Ornithology invites critical reflection on environmental futures through sonic, parafictional realism. Subtle shifts in the soundscapes, such as altered birdcalls or unexpected combinations of biotic and anthropogenic sounds, create a tension between the familiar and the uncanny, making climate change perceptible at a sensory level.
Presented at Phoenix Gallery (Melbourne, 2025) as part of doctoral research at Deakin University, the work contributes to emerging dialogues in practice-led research, sonic arts, and climate change communication.