This summer’s discovery of the Higgs boson — or, more accurately, a “Higgs-like particle” — definitely inspired physicists, who waited decades for that momentous occasion. But it’s also inspiring musicians to create ethereal-sounding music.
The process is called sonification: taking raw data and transforming it into sound while still retaining the information contained therein. The sensor in your car that beeps faster and faster the closer you get to another object is an example of sonification, albeit a very simplistic one. It transmits information about the distance between your car and the object through sound. Ditto for a Geiger counter, except it carries information about radiation levels.
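At its simplest, sonification is just a mapping from data values to sound parameters such as pitch. Here is a minimal sketch of that idea in Python; the function name and frequency range are my own illustrative choices, not from any of the projects discussed here:

```python
# Minimal sonification sketch: map a numeric data series onto pitches.
# The names and the 220-880 Hz range are illustrative assumptions.

def data_to_frequencies(values, f_lo=220.0, f_hi=880.0):
    """Linearly map numeric values onto a frequency range (Hz).

    Higher data values become higher pitches, so a trend in the data
    is audible as a rising or falling melody: the information survives
    the translation into sound, which is the whole point.
    """
    v_min, v_max = min(values), max(values)
    span = (v_max - v_min) or 1.0  # avoid dividing by zero for flat data
    return [f_lo + (v - v_min) / span * (f_hi - f_lo) for v in values]

# A steadily rising data series becomes a steadily rising pitch sequence.
readings = [0.0, 2.5, 5.0, 7.5, 10.0]
print(data_to_frequencies(readings))  # [220.0, 385.0, 550.0, 715.0, 880.0]
```

Real projects use far richer mappings (timbre, duration, spatial position), but the principle is the same: the sound is a faithful, information-preserving transform of the data.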
There has been quite a bit of work done over the years using sonification to study the atmospheres of Mars and Venus, certain stars, black holes, the Northern Lights, Cassiopeia A, elementary particles, and the rings of Saturn.
The sounds of Saturn’s rings even inspired a couple of musical compositions, while the sounds of Fermilab inspired an “Alternative Energy” symphony.
British composer and network engineer Domenico Vicinanza's sonification of particle tracks in cloud chambers was highlighted by Wired — a project similar in concept to a 2011 musical collaboration orchestrated by Alexis Kirke to create a duet between a live violinist and radioactive subatomic particles produced inside a cloud chamber. Vicinanza has also composed "music" from sonified data of volcanic activity, which made it easier for seismologists to track possible eruptions.
Most recently, he created a symphony based on data collected by CERN’s ATLAS experiment. Per Symmetry Breaking:
You can hear the full ATLAS "symphony" at SoundCloud. Meanwhile, check out the video below, which has been making the rounds of late. The creator goes by the moniker benmccormack91, identified on SoundCloud as Benjamin Doyle. The data for the composition came from Vicinanza's work. Per Doyle:
There is an even more extensive collaboration between particle physicists, software developers, artists, and musicians called LHCSound, a project founded by CERN physicist Lily Asquith to sonify data from the ATLAS detector at the Large Hadron Collider.
Data from collisions reveals information about position and direction, but there is a lot of it, and physicists generally rely on artificial neural networks to analyze it all. Sound might offer a new tool to represent all that complexity, since the human ear is capable of distinguishing the source and location of sounds over a broad range of frequencies, as well as very slight changes in pitch. As Asquith wrote recently in The Guardian:
The growing sound library includes the sound of a top quark jet, and the sound of a detector sweep "played" on a synthesized marimba. And in 2010, Asquith and her team produced this stunning video simulating what sounds Higgs bosons would be likely to make when they are produced at the LHC:
“We can hear clear structures in the sound, almost as if they had been composed,” Richard Dobson, a composer with the LHCSound project, told BBC News. “They’re so dynamic and shifting all the time, it does sound like a lot of the music that you hear in contemporary composition.”
Ultimately the goal is as much scientific as artistic, according to Asquith: "Sonification doesn't just mean turning numbers into sound. We want to make a sound that has some information in it." The hope is that LHCSound's sonification of ATLAS data could be a useful supplement to more traditional methods of data analysis, eventually enabling physicists to detect candidate events by ear.
Image: Simulated particle track after a collision in the LHC. A Higgs boson is produced in the collision of two protons at 14 TeV and quickly decays into four muons. Credit: CERN.