Symphony of the Cell (Data Sonification Software)
Custom interactive software was developed that uses data sonification to analyze protein amino acid sequences and identify structural and functional changes in proteins. This approach aims to enhance understanding of the molecular basis of diseases, including sickle cell anemia, Parkinson’s disease, Alzheimer’s disease, cancer, and others.
Methods include mapping the 20 amino acid types to pitches spanning a 3-octave range and using the amino acid hydrophobicity scale (a measure of each residue’s affinity for water) to generate rhythmic information, note durations, amplitude, and panning. A “healthy” amino acid chain plays simultaneously with mutated variants, allowing users to identify when, where, and how often mutations occur. These mutations create changes, or “mistakes,” in pitch, rhythm, and timbre, which can significantly distort the musical relationships between the voices, producing a musically catastrophic effect. Each sequence can be assigned a user-defined instrument to make individual sequences as distinct as possible.
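As an illustration, a minimal sketch of one possible version of this mapping is shown below. The Kyte-Doolittle hydrophobicity values, the alphabetical pitch ordering, and the duration, amplitude, and pan ranges are assumptions made for the example; the software's actual scales and parameters may differ.

```python
# Illustrative sketch only: map the 20 amino acids to pitches across 3 octaves
# and derive duration, amplitude, and pan from Kyte-Doolittle hydrophobicity.
KYTE_DOOLITTLE = {
    'I': 4.5, 'V': 4.2, 'L': 3.8, 'F': 2.8, 'C': 2.5, 'M': 1.9, 'A': 1.8,
    'G': -0.4, 'T': -0.7, 'S': -0.8, 'W': -0.9, 'Y': -1.3, 'P': -1.6,
    'H': -3.2, 'E': -3.5, 'Q': -3.5, 'D': -3.5, 'N': -3.5, 'K': -3.9, 'R': -4.5,
}
AMINO_ACIDS = sorted(KYTE_DOOLITTLE)               # fixed ordering of the 20 residues
BASE_PITCH = 48                                    # MIDI C3; 3 octaves up to C6

def residue_to_note(aa: str):
    """Return (midi_pitch, duration_s, amplitude, pan) for one residue."""
    pitch = BASE_PITCH + round(AMINO_ACIDS.index(aa) * 36 / len(AMINO_ACIDS))
    norm = (KYTE_DOOLITTLE[aa] + 4.5) / 9.0        # normalize -4.5..4.5 to 0..1
    duration = 0.125 + norm * 0.5                  # more hydrophobic = longer note
    amplitude = 0.3 + norm * 0.7
    pan = norm * 2.0 - 1.0                         # -1 (left) .. +1 (right)
    return pitch, duration, amplitude, pan

# Compare a healthy fragment with the sickle-cell (E6V) mutation in beta-globin:
for healthy_aa, mutated_aa in zip("VHLTPEEK", "VHLTPVEK"):
    print(healthy_aa, residue_to_note(healthy_aa), "|",
          mutated_aa, residue_to_note(mutated_aa))
```

Played side by side, the two voices diverge only where the substitution occurs, which is what makes the mutation audible.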
Data Driven Sound Spatialization (Data Sonification Software)
This software builds on the Symphony of the Cell project to explore how DNA sequence mutations alter the physical positioning of atoms in proteins and amino acid chains. By examining how these mutations affect protein folding and repositioning, we can gain deeper insights into the diseases that result. Atomic coordinates and other data are used to spatially position and move sounds in 360-degree configurations around the listener, revealing patterns within complex datasets.
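A minimal sketch of how atomic coordinates can drive spatial positioning is shown below; it assumes a listener at the origin and a simple spherical-coordinate conversion, while the software's actual coordinate source and spatial audio engine are not specified here.

```python
# Illustrative sketch only: convert atomic coordinates to azimuth/elevation/distance
# around a listener at the origin, one possible way to drive 360-degree spatialization.
import math

def atom_to_spherical(x: float, y: float, z: float):
    """Return (azimuth_deg, elevation_deg, distance) relative to the listener."""
    distance = math.sqrt(x * x + y * y + z * z)
    azimuth = math.degrees(math.atan2(x, y))       # 0 = front, +90 = right
    elevation = math.degrees(math.asin(z / distance)) if distance else 0.0
    return azimuth, elevation, distance

# Placeholder alpha-carbon positions (angstroms, centered on the protein):
for x, y, z in [(1.2, 3.4, -0.5), (-2.0, 0.8, 1.1), (0.3, -1.7, 2.4)]:
    az, el, dist = atom_to_spherical(x, y, z)
    print(f"azimuth {az:7.1f}  elevation {el:6.1f}  distance {dist:4.2f}")
```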
By representing this data in three-dimensional space, we create an immersive experience that allows users to step inside the data itself, uncovering hidden aspects of the natural world and mathematical structures in novel and impactful ways. This project aims not only to develop a scientific tool for understanding complex information but also to explore creative possibilities in data-driven music composition, sound design, and audiovisual art.
Data Sonification of Genome Profiles and Mutational Signatures in Human Cancers (Data Sonification Software)
This software leverages groundbreaking cancer research from the University of Cambridge, developed in collaboration with Dr. Serena Nik-Zainal and her team at the Early Cancer Institute, to create multi-dimensional data analysis methods. Using data sonification, it aims to advance our understanding of mutational patterns in the DNA of cancer patients. By translating complex genomic data into sound, this project provides an intuitive way to decipher patterns and trends in DNA mutations that might be overlooked through visual analysis alone.
The software uses cancer genome profiles and mutation signatures to manipulate and mutate existing music. Each type of cancer exhibits specific mutation signatures or probabilities, which are then applied as motivic and rhythmic alterations to a user-defined musical score. In essence, each cancer profile acts as a filter applied to music in real time, allowing us to “hear” the mutated DNA sequence.
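A minimal sketch of this idea is shown below; it assumes a signature has been reduced to simple per-note substitution and rhythm probabilities, whereas the software's actual signature data and musical mappings are not reproduced here.

```python
# Illustrative sketch only: treat a mutational signature as per-note probabilities
# and apply it to a score as pitch and rhythm alterations.
import random

def apply_signature(notes, p_pitch=0.10, p_rhythm=0.05, seed=None):
    """notes: list of (midi_pitch, duration_s) pairs; returns a 'mutated' copy."""
    rng = random.Random(seed)
    mutated = []
    for pitch, dur in notes:
        if rng.random() < p_pitch:                 # substitution: shift pitch a semitone
            pitch += rng.choice([-1, 1])
        if rng.random() < p_rhythm:                # rhythmic alteration: halve or double
            dur *= rng.choice([0.5, 2.0])
        mutated.append((pitch, dur))
    return mutated

score = [(60, 0.5), (62, 0.5), (64, 1.0), (65, 0.5), (67, 1.0)]
print(apply_signature(score, p_pitch=0.2, p_rhythm=0.1, seed=1))
```

Different cancer profiles would correspond to different probability settings, so each profile "filters" the same score in its own characteristic way.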
Aufbau (Real-time Data Sonification & Visualization)
Excerpts/Experiments - Work in Progress
Aufbau builds upon the Data Driven Sound Spatialization project, integrating protein and amino acid data from the human body to generate musical and visual information spatialized across an 8-speaker setup. This piece focuses on aesthetic exploration rather than scientific inquiry.
In this work, protein data influences image selection, rhythmic elements, and image manipulation, creating a unique blend of sound and visual expression.
Recycled Linoleum (Data Sonification | Electroacoustic Composition | Audiovisual Media)
The sonic material for Recycled Linoleum was generated using a data sonification method that converts image, text, and video into sound. In this case, much of the raw audio came from sonifying a PDF of an old Pro Tools manual. This unconventional conversion of media produced extremely harsh, digital, yet surprisingly rhythmic results. Hours of material were generated by this process, most of it unusable white noise. Scattered sporadically throughout this vast amount of sonic information, however, were brief, rhythmically and timbrally striking events, and these were used as the motivic figures of the piece. The aesthetic goal of the piece is to create a stark, harsh, sterile sonic experience, and the visual component was created to reflect and emphasize this aesthetic.
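The exact conversion method is not detailed here; one common databending approach, sketched below, is to interpret a file's raw bytes directly as audio samples. The file names and sample format are placeholders.

```python
# Illustrative sketch only: read a file's raw bytes and write them out as
# 8-bit mono PCM, so the document's byte patterns become audible texture.
import wave

def bytes_to_wav(source_path: str, dest_path: str, sample_rate: int = 44100):
    with open(source_path, "rb") as f:
        raw = f.read()
    with wave.open(dest_path, "wb") as w:
        w.setnchannels(1)            # mono
        w.setsampwidth(1)            # 8-bit unsigned samples
        w.setframerate(sample_rate)
        w.writeframes(raw)           # every byte becomes one audio sample

# Example with placeholder file names:
# bytes_to_wav("pro_tools_manual.pdf", "raw_material.wav")
```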
The 3D material for the video was procedurally generated using custom software developed in Houdini and assembled in Premiere Pro. As in Golden Cuttlefish, organic motion was juxtaposed with synthetic, mechanical imagery.
Iridium (Real-time Data Sonification & Visualization | Live Performance)
Performers: Timothy Moyers & Michael Olsen