The creative team behind Wombtunes, Caroline Pegram’s innovative art-meets-science project, took to the Discovery Stage at SXSW Sydney to officially launch the unique data sonification platform. Pegram created Wombtunes during her time as a Cybernetic Imagination Resident at the ANU School of Cybernetics.
Wombtunes is a creative work that uses computational models to translate foetal ultrasounds into music. The project invites us to explore how listening and sound perception can help us engage differently with our data and make sense of it in new ways.
Users can select or upload an ultrasound image, and Wombtunes creates a unique ‘sonic fingerprint’ from the sonogram (the original image is not stored on the platform). Users are then guided to interact with the sonic fingerprint, steering the final music composition based on their preferences, resulting in an incredibly personal and specialised sound experience.
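As a way of picturing the ‘sonic fingerprint’ step, the sketch below is a hypothetical illustration only: it reduces an image to a handful of musical parameters derived from simple image statistics, so only those numbers, not the scan itself, need to be kept. The parameter names, mappings, and ranges are invented for this example and are not drawn from the Wombtunes platform.

```python
# Hypothetical illustration only: one way a platform *could* reduce an image
# to a small set of musical parameters (a "sonic fingerprint") without keeping
# the image itself. None of these mappings are taken from Wombtunes.
from dataclasses import dataclass
import numpy as np

@dataclass
class SonicFingerprint:
    tempo_bpm: int       # driven by overall brightness
    key_index: int       # driven by contrast; indexes a list of 12 musical keys
    note_density: float  # driven by edge activity in the scan

def fingerprint(image: np.ndarray) -> SonicFingerprint:
    """Summarise a 2D grayscale image (values 0-255) as musical parameters."""
    brightness = image.mean() / 255.0
    contrast = image.std() / 128.0
    edges = np.abs(np.diff(image.astype(float), axis=1)).mean() / 255.0
    return SonicFingerprint(
        tempo_bpm=int(60 + brightness * 80),         # 60-140 bpm
        key_index=int(min(contrast, 1.0) * 11),      # 0-11
        note_density=round(min(edges * 4, 1.0), 2),  # 0.0-1.0
    )

# Only the fingerprint is retained; the image array can be discarded immediately.
```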
“How special is it that you can really take something… special to you… and then have this beautiful sonification and emotion evoked from something that connects us all.”
Dr Kirsten Banks describing the meaningful connection that Wombtunes builds with users.
Wombtunes invites important conversations about data accessibility, music rights and royalties in the context of artificial intelligence (AI), and the ethical use of medical data in computational systems. In working with images associated with a major life event, it also raises questions about the responsibility that comes with using technology to elicit the full range of human emotions in the people who use it.
To launch Wombtunes at SXSW Sydney in October 2024, Pegram was joined on a panel by collaborators Tushar Apte, an LA-based music producer, composer, and tech advisor; Adrian Schmidt, KopiSu creative technologist; Lara Nakhle, registered music therapist; and guide dog Maxwell. The panel was moderated by astrophysicist and science communicator Dr Kirsten Banks.
Dr Banks guided the panel through a discussion about the Wombtunes project, its collaborative development, and the questions it raises around accessibility, innovation, creative technology, ethical practice, and intellectual property.
Nakhle, a musician and registered music therapist who was born completely blind, reflected that whilst there are many apps that can assist people who are blind by describing photos as text, to “actually know the mood or vibe of an image is a totally different level altogether.” Apte agreed, noting that a potential “huge benefit of sonification” lies in how you can “represent data [like images] for the visually impaired”.
Pegram also reflected on her journey developing Wombtunes whilst a Resident at the ANU School of Cybernetics. “This residency is about giving people an opportunity to explore a topic. I love sonification as a concept, and the residency provided funding, networks, and an invitation to develop a new creative work exploring the potentials of data sonification in our everyday lives.”
What is data sonification?
Data sonification is gaining increasing attention in a range of fields. It is used at NASA, where astronomers turn images from space into otherworldly symphonies to make astronomy more inclusive, and at MIT, where an engineer developed a musical representation of the spike protein of the virus that causes COVID-19 to gain a different understanding of the virus. This transformation of complex data into sound allows us to connect with information in different ways, leveraging the potential for sound to engage emotion and creativity, to create a shared experience, and to be accessible to new audiences.
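To make the general idea concrete, here is a minimal sonification sketch in Python. It is not the Wombtunes implementation or any of the NASA or MIT pipelines; the stand-in image, the pentatonic scale, and the brightness-to-pitch mapping are illustrative assumptions chosen to show the basic pattern of turning a data series into sound.

```python
# A minimal data-sonification sketch (not the Wombtunes implementation):
# map the brightness profile of a grayscale image to a sequence of sine tones.
import wave
import numpy as np

SAMPLE_RATE = 44100
NOTE_SECONDS = 0.25
# A pentatonic-style scale (frequencies in Hz) keeps any mapping sounding musical.
SCALE = [261.63, 293.66, 329.63, 392.00, 440.00, 523.25]

def sonify(image: np.ndarray) -> np.ndarray:
    """Turn a 2D grayscale image (values 0-255) into a mono audio signal."""
    # One note per image column, pitched by that column's mean brightness.
    brightness = image.mean(axis=0) / 255.0
    notes = []
    for level in brightness:
        freq = SCALE[int(level * (len(SCALE) - 1))]
        t = np.linspace(0, NOTE_SECONDS, int(SAMPLE_RATE * NOTE_SECONDS), endpoint=False)
        tone = np.sin(2 * np.pi * freq * t)
        tone *= np.hanning(len(tone))  # fade in/out to avoid clicks
        notes.append(tone)
    return np.concatenate(notes)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_scan = rng.integers(0, 256, size=(128, 32))  # stand-in for any image data
    signal = sonify(fake_scan)
    with wave.open("sonification.wav", "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(SAMPLE_RATE)
        f.writeframes((signal * 32767).astype(np.int16).tobytes())
```

Richer systems replace these toy mappings with more expressive ones (timbre, rhythm, harmony) and let listeners steer the result, which is where projects like Wombtunes take the idea.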
People were at the centre of the platform’s development, iteration, and testing. Emerging technologies such as Wombtunes become stronger and more meaningful when they value the ‘human(s)-in-the-loop’ and keep humanity at the core of the project. In developing Wombtunes, Pegram consulted with a broad range of people, including creative technologists, human research ethics experts, medical imaging professionals, medical practitioners, designers, musicians, music rights experts, engineers, people with different accessibility requirements, and people whose families had experienced successful or unsuccessful pregnancies. Their collaboration and advice made the project possible.
“What can we hear that we can’t see in data and systems? Wombtunes is just one way to answer this question, and it is a very human way to tell a data story.”
Caroline Pegram summarises the Wombtunes project.
Wombtunes is currently under development for wider release and will soon be on display in the Birch Building at the Australian National University. Join the ANU School of Cybernetics mailing list for updates.