Blog Archives

Adventures in acoustic cosmology

00:01 by Gavin Starks


Adventures in acoustic cosmology

RAS PR 17/30 (NAM 13)

3rd July 2017

A project that explores whether there is a musical equivalent to the curvature of spacetime will be presented on Thursday 6th July by Gavin Starks at the National Astronomy Meeting at the University of Hull.

Starks, who has a background in radio astronomy and electronic music, has been working on developing an ‘acoustic cosmology’ for more than 20 years in collaboration with Prof Andy Newsam of Liverpool John Moores University. Their aim is to test whether mathematical relationships that describe cosmology and quantum mechanics can be applied to a sonic universe, or ‘soniverse’.

Starks explains: “If we look at the way that music has evolved from mediaeval plainsong to the algorithms that generate current chart-hits, we can see parallels developing between the way we describe music and the way we describe our perception of the universe. We can now create new types of sound from scratch – electronic sounds that simply couldn’t have existed before. It leads us to think about a digital sound world that we can’t enter, because it physically doesn’t exist. The question is – what next?

“We are starting to develop completely new forms of manipulating the microstructure of sound, as well as the macroenvironment in which we experience it. This raises questions about whether we can create a soniverse based on a set of fundamental equations, in the same way that we can create mathematical models of the universe.”

Starting with a single wavelength ‘sonon’, a fundamental particle in the soniverse equivalent to a photon, Starks has attempted to define its properties and the physics that may apply to it. The project’s initial model of ‘wave-time’ has three independent dimensions: the individual sonon wavelength, instrument time (the duration that an individual instrument plays) and performance time (duration equal to the length of an individual piece).

Some of the relationships explored to date are causal (i.e. the physics is consistent within the soniverse) and some are aesthetic (i.e. they describe a subjective musical construct). Many have direct parallels in the physical universe. For example, the listener in the soniverse is analogous to the observer in quantum mechanics: a sonon is only rendered musical or not when it is heard. A temporal gravity allows the clustering of sonons to create rhythms or musical phrases. Wave-time can be bent by clusters of sonons, in the same way that gravity distorts space-time in the universe.

Starks believes that bringing together cutting-edge science and understanding of musical structure creates the opportunity for discovery: “There’s a long common history between physics and music, for instance people built columns in cathedrals at a height linked to the resonant frequency, even before they understood the nature of pressure dynamics. It’s a relatively recent phenomenon that art and science are treated as different disciplines. By bringing them back together, and creating a common language, we can find different ways of interpreting and thinking about both music and cosmology.”

Newsam adds: “As astronomers, our experience of the universe is essentially visual – images, graphs, and so on. With the soniverse, we hope to create a new way to appreciate the cosmos, using our instinctive grasp of music and tone to explore relationships between different objects and cosmological models.”

Media contacts

NAM press office

Robert Massey, Royal Astronomical Society

Anita Heward, Royal Astronomical Society

Morgan Hollis, Royal Astronomical Society

Science contacts

Gavin Starks, Dgen, @agentGav

Andy Newsam, Professor of Astronomy Education and Engagement, Director of the National Schools’ Observatory, Liverpool John Moores University


Blog post with embedded player, video and more information: 

This data-cube is based on an optical image of the “Antennae Galaxies” colliding, as taken by the Hubble Space Telescope, and a radio-image taken by ALMA. Each pixel actually represents a spectrum of frequencies across the electromagnetic radio spectrum.

The data-cube works in two ways.

  • Firstly, the radio frequencies have been transformed into visible colours, so you can see a slice of the cube.
  • Secondly, the electromagnetic spectrum has been transformed into an acoustic spectrum. Remember light≠sound: the frequency of electromagnetic radiation (‘light’) has been transformed into a frequency of pressure wave (sound).

By clicking the image and moving your cursor around you can “play” a spectrum of the colliding galaxies. Underneath, you can see a visual representation of the frequency spectrum. Spend some time moving slowly around the red (redshifted) areas – there is a surprising richness to the harmonics for such a simple sonification.

Note: the data-cube is 8MB and could take from 20 seconds to several minutes to appear if you are on a slow connection.

Update: the presentation slides used are now online and embedded below.



The piece was inspired by the discovery of the first double-pulsar system (ranked as the 6th most important scientific discovery of 2004), and was specially written to celebrate the 10th anniversary of RT32, a reclaimed 32m radio telescope in the middle of the Latvian forests (the VIRAC radio telescope in Irbene). Brought back to life over a decade after being trashed by the Soviet military, it is the only radio telescope in the world dedicated to both science and art.

The full name of the piece is ds² – series 1 (PSR J0737-3039B) 

Further information:

Further information on the project can be found at:

Notes for editors

Running from 2 to 7 July, the RAS National Astronomy Meeting 2017 (NAM 2017) takes place this year at the University of Hull. NAM 2017 will bring together around 500 space scientists and astronomers to discuss the latest research in their respective fields. The conference is principally sponsored by the Royal Astronomical Society and the Science and Technology Facilities Council.


The Royal Astronomical Society (RAS), founded in 1820, encourages and promotes the study of astronomy, solar-system science, geophysics and closely related branches of science. The RAS organises scientific meetings, publishes international research and review journals, recognises outstanding achievements by the award of medals and prizes, maintains an extensive library, supports education through grants and outreach activities and represents UK astronomy nationally and internationally. Its more than 4000 members (Fellows), a third based overseas, include scientific researchers in universities, observatories and laboratories as well as historians of astronomy and others.



The Science and Technology Facilities Council (STFC) keeps the UK at the forefront of international science. It has a broad science portfolio and works with the academic and industrial communities to share its expertise in materials science, space and ground-based astronomy technologies, laser science, microelectronics, wafer scale manufacturing, particle and nuclear physics, alternative energy production, radio communications and radar.

STFC’s Astronomy and Space Science programme provides support for a wide range of facilities, research groups and individuals in order to investigate some of the highest priority questions in astrophysics, cosmology and solar system science. STFC’s astronomy and space science programme is delivered through grant funding for research activities, and also through support of technical activities at STFC’s UK Astronomy Technology Centre and RAL Space at the Rutherford Appleton Laboratory. STFC also supports UK astronomy through the international European Southern Observatory.


Listen to the radio-cube of the Antennae Galaxies

00:20 by Gavin Starks

Below is the optical image of the “Antennae Galaxies” colliding, as taken by the Hubble Space Telescope.


Overlaid below is a radio-image taken by ALMA. In this case, each pixel of the ‘image’ actually represents a spectrum of frequencies across the electromagnetic radio spectrum: the ‘image’ is actually a data-cube.

We have transformed this data-cube in two ways.

Firstly, so you can see it on the overlay, we’ve transformed the radio frequencies into visible colours, so you can see a slice of the cube.

Secondly, we’ve transformed the e-m spectrum into an acoustic spectrum. Remember light≠sound: what we are doing is translating the frequency of e-m radiation (‘light’) into a frequency of pressure wave (sound).
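To make that translation concrete, here is a minimal sketch (not the actual code behind the player) of one way to map a radio frequency into the audible range. The band limits and audio range are illustrative values of my own choosing, scaled logarithmically so that equal frequency ratios in the radio band become equal musical intervals:

```python
import numpy as np

def em_to_audio(f_em_hz, em_range=(86e9, 116e9), audio_range=(220.0, 880.0)):
    """Map an electromagnetic frequency into the audible range.

    Hypothetical mapping: logarithmic, so a doubling in the radio band
    comes out as an octave. The default em_range loosely reflects a
    millimetre-wave observing band and is illustrative only.
    """
    lo, hi = np.log(em_range)
    a_lo, a_hi = np.log(audio_range)
    t = (np.log(f_em_hz) - lo) / (hi - lo)   # 0..1 position within the band
    return float(np.exp(a_lo + t * (a_hi - a_lo)))

# One pixel's spectral lines become a chord of audible partials:
spectrum = [90e9, 100e9, 110e9]              # made-up line frequencies
chord = [em_to_audio(f) for f in spectrum]
```

The logarithmic choice matters musically: a linear mapping would distort harmonic relationships, while this one preserves them as intervals.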


Below is the data-cube on its own in a playable format: to view and listen, wait a moment (20–30 seconds) while the player loads. NB: works best on Chrome or Firefox, not IE or Safari.

By clicking the image and moving your cursor around you can “play” a spectrum of the colliding galaxies. Underneath, you can see a visual representation of the frequency spectrum. Spend some time moving slowly around the red (redshifted) areas – there is a surprising richness to the harmonics for such a simple sonification.

Note: this loads an 8MB data-cube before displaying… it could take from 20 seconds to several minutes to appear if you are on a slow connection: but note it did take these photons 70 million years to reach us, so please be patient while they go the last few bit-miles!

If that doesn’t work, here’s a short video of it in action (make sure subtitles are switched on in your YouTube player to see a description).


Sources: HST image, National Geographic, WIRED


09:09 by Gavin Starks

Interesting to compare: electromagnetic measurements of Earth transformed directly into sound, and then slowed down,


Slowed down:

with this excerpt of a transitional piece I wrote when I was 19 (I was moving from traditional tone/rhythm-based music to electronic).

Tempest — download now available

15:51 by Gavin Starks



Tempest is now available for download at:  for free or, if you wish, for something.

Escape into the multiverse

01:39 by Gavin Starks

[10 min, via Thread]

Over the past 20 years, my stupid, grasping mind has been trying to find ways to combine music and cosmology. This began with writing music that I (and others) have struggled to describe verbally.

When I first stumbled into the methods of creating this highly textural form, I knew I’d laid the foundations for all the music I would write in the future. It was a moment I found so exciting I had to immediately find a phone box (yes, that long ago) to call my friend Andy to tell him. We have collaborated ever since.

For me, music, like mathematics, is a non-verbal language. It is experiential.


A recurring theme for me is to begin by creating common languages. This work has evolved to explore the articulation and progression of musical language, one that has mirrored our understanding of the universe for centuries. I’ve tried to show this evolution, in broad terms, in the image above.

We used to believe there was a fundamental relationship between the distances between the planets and music. We called this Musica Universalis or ‘Music of the Spheres’. We know now there isn’t a causal relationship, but our emotional relationship with this idea remains profoundly powerful.

My question was ‘is there an equivalent in contemporary physics and electronic music?’. Is there a ‘Music of the Hypersphere’? Or, bringing in more specific language, is there an ‘Acoustic cosmology in Hilbert Space’? You might also ask ‘What is the shape of music?’.

My explorations led me to try to find relationships (mathematical or experiential, not causal) between the mathematics used in astrophysics (specifically cosmology and quantum mechanics) and the codified mathematics-as-algorithms in the realms of computer- and electronic-music. For those who understand science and/or music, some of this will make no sense in either of those domains. This is intentional: we are exploring. This isn’t a small project.

We can draw interesting parallels between the macro-languages of music over centuries and the evolution of our understanding of the complexity of the universe.

Starting about a thousand years ago with PITCH and DURATION in monophonic music (polyphony was considered ungodly), through the development of HARMONY in the baroque, to the ROMANTIC and COMPLEX symphony, into REDUCTIONIST musique concrète, and from ATOMISED computer-music to the contemporary GENERATIVE algorave, we can see clear parallels between the language used to describe our perception of the universe and the words used to describe music.

Our latest iterations can reduce the sonic universe (soniverse) to a set of fundamental equations (the equivalents to Maxwell’s equations) from which all other sounds can be produced. From additive synthesis to neural network-generated chart hits, it seems to me that these very different worlds form part of our convergence of understanding: art and science are reconvening.

This has led me to ask: is there a musical equivalent to the CURVATURE of space-time? Can we operate natively in a frequency-based, rather than a time-based, domain? What might DIMENSIONALITY mean in a wave-time cosmology?

My aim is to test whether there exists, or whether we can create, a soniverse in which the language of science is relevant. It may be relevant in a causal sense (for example, the physics may be self-referentially consistent). It may be relevant aesthetically (for example, it may describe a subjective musical construct). Our first principle is to create a fundamental particle, which I will call a sonon — the equivalent to a photon in this sonic universe (and not to be confused with a phonon). Our next challenge is to start to define its properties, and then to define the physics that may apply to this universe.

Our single sonon is one wavelength. Time is, as-yet, undefined.
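One possible way to hold this definition in code (a sketch of my own, not part of the project's model): a sonon as an immutable record of one wavelength and a classical phase. The `frequency` reading assumes the wavelength is interpreted as a period in seconds, which the text deliberately leaves open:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Sonon:
    """Hypothetical fundamental particle of the soniverse: one wavelength."""
    wavelength: float   # duration of the single cycle; units deliberately open
    phase: float = 0.0  # radians, classical phase (sound has no polarisation)

    @property
    def frequency(self) -> float:
        """Frequency, if wavelength is read as a period in seconds."""
        return 1.0 / self.wavelength
```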



There is a lovely visual similarity between how navigators and astronomers have been mapping the heavens for centuries (the Celestial Sphere), and the way scientists model quantum mechanics (the Bloch Sphere).

Whereas a Celestial Sphere helps us map the entirety of the heavens by imagining you as the observer (O) at the center of the universe, a Bloch Sphere is a geometric representation of the complex mathematics of qubits, used in quantum computing. In ‘normal’ computing, a bit is either a one or a zero. A qubit can kind of be both at the same time (see “you guys put complex numbers in your ontologies?”). And if that sounds beautifully confusing, it is.



As I’ve been looking at these ways of viewing the universe at such wildly different scales, and how Bloch Spheres are used to help describe photons, I wanted to attempt to create an equivalent of a Bloch Sphere to define dimensions of influence in our sonon. As I’m also trying to bring in the large-scale cosmological mathematics, it seemed to fit to deal with these confusing extremes in parallel.

We begin with phase in a classical sense. We then consider amplitude to be defined by the number of sonons counted (as in classical physics; we’ve not yet considered quantum-state probability amplitudes). In sound, classically defined in our universe as a pressure wave, there is no polarisation. This makes it hard to create any mapping that might lead to concepts of entanglement, or that captures the strangely observer-centric nature of photons.

Instead we create the notion of temporal musicality. A sonon is and is not musical if it is embodied in an observable frame of reference that is musical (e.g. an observable frame of reference could be a piece performed in a rendered space for a listener). We define ‘musical’ as ‘an emotive response’ over a period of time (performance time). A sonon may be considered musical if it lies on the surface of the Bloch Sphere. Inside the Bloch Sphere we admit that we are not sure if it is.
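A toy version of this classification, treating the sonon’s state as a point in the unit ball (the coordinate names and the radius test are my illustrative choices, not an established formalism):

```python
import math

def musicality(state, eps=1e-9):
    """Classify a sonon state on a hypothetical 'musicality' Bloch sphere.

    `state` is an (x, y, z) point: radius 1 (the surface) means musical,
    radius < 1 (the interior) means we admit we are not sure.
    """
    r = math.sqrt(sum(c * c for c in state))
    if abs(r - 1.0) <= eps:
        return "musical"
    if r < 1.0:
        return "undetermined"
    return "outside model"   # the text does not define states beyond the sphere
```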

Whether or not it is, is dependent on its observation—which embodies the entire context and structure in which it is heard, and the listener. It is impossible to recreate a sonon since it is, by definition, only rendered in a temporal space as an auditory event.

We may or may not be able to model context: a combination of sonons in an acoustic environment. We may or may not be able to model the listener: which could range from a microphone to a human, from a bird to a black-hole, and therefore may be unknowable. We may be unable to differentiate between a sonon and a rendered sonon (e.g. if time runs backwards, is it a different sonon?).

We create a parallel of wavefunction collapse as this temporal rendering. A sonon can exist in many states prior to being heard. It is only the act of hearing that renders it as having a musical response (cf. in quantum mechanics it is not possible to know the state of a photon until it is observed). (I leave the idea that consciousness is prerequisite for ‘hearing’ as an exercise for another day. It’s also interesting to think about how you might render Cage’s 4’33” in this context and whether a sonon could be considered musical without any other context.)

Having defined a fundamental particle, we then consider what physics may exist in this soniverse. We first must create time. We are drawn to ideas of ‘causality’: in this context causality is only rendered by the listener. Our soniverse may end up only being rendered temporal through experiencing it. It is multi-directional: conversations can be run backwards as well as forwards.

Our big-bang event is the spontaneous creation of all sonons that can exist. We expand the event horizon of the soniverse exponentially and introduce spontaneous random variations. Our parameters of expansion are time, phase and wavelength. We do not define any physical space (there is no space-time). In our initial model, distance is only measured in sonon wavelengths. Analogous to the relativistic perspective of a photon, its position is ‘everywhere’ and its ‘distance’ is ‘nothing’ in between. It may be that we begin in zero or one dimension.

We define Time to have three independent dimensions: the individual sonon wavelength (=1?), the duration an individual instrument is rendered (the instrument time), and a duration equal to the length of an individual piece (the performance time). An instrument time can be longer than a performance time (for example, the 1000-year piece Longplayer, against which any performance could be considered transitory: an individual’s snippet of conversation could be considered a shard of the instrument that is every word they ever utter).
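One way to hold these three dimensions in a model (a sketch only; reading the units as seconds is my assumption):

```python
from dataclasses import dataclass

@dataclass
class WaveTime:
    """Three independent time dimensions of the soniverse (illustrative)."""
    sonon_wavelength: float   # duration of one sonon cycle (= 1?)
    instrument_time: float    # how long an individual instrument is rendered
    performance_time: float   # length of the individual piece

# Longplayer-style example: the instrument outlives the performance.
wt = WaveTime(sonon_wavelength=0.001,
              instrument_time=1000 * 365 * 24 * 3600,  # ~1000 years in seconds
              performance_time=600)                    # a 10-minute piece
```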

In our wave-time, wavelength, phase and Time may vary over time. We define an equivalent of the inverse square law: sonons diminish with the square of their wavelength. We define a temporal gravity whereby harmonics may cluster at intervals proportional to their relative density at temporal wavelengths (i.e. repetition of sonons within a localised event horizon may cause clustering—’mass’). This could, for example, be an equivalent to a ‘rhythmic or musical phrase’ in classical music. Sonons attract if there is a temporal ‘harmony’.
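A toy sketch of these two rules (the weighting function and the ‘event horizon’ parameter are illustrative stand-ins, not derived quantities):

```python
def sonon_weight(wavelength):
    """Hypothetical inverse-square rule: a sonon's influence
    diminishes with the square of its wavelength."""
    return 1.0 / wavelength ** 2

def temporal_clusters(onsets, horizon):
    """Toy temporal gravity: onset times whose gaps fall within a
    localised 'event horizon' clump together into one cluster ('mass')."""
    onsets = sorted(onsets)
    clusters = [[onsets[0]]]
    for t in onsets[1:]:
        if t - clusters[-1][-1] <= horizon:
            clusters[-1].append(t)   # within the horizon: same cluster
        else:
            clusters.append([t])     # beyond it: start a new cluster
    return clusters
```

Repeated onsets close together in time thus form the clusters that could read as a rhythmic or musical phrase.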

We define an Epoch to be the finite age of this soniverse. It is equal to or longer than the performance time.

We define a curvature of wave-time. This curvature is in frequency space (Sμʊ;ג) and is time-variant based on where we are in this epoch.

We can also attempt to visualise the soniverse in our four-dimensional physical space-time. We can develop new modes of listening, and new modes of interaction between these two universes.

We can take an individual sonon cluster and redshift it. We can take clusters of our virtual sonons and vary their localised gravity based on where our ‘projected hand of god’ creates motion in a sonon cluster.
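Redshifting a cluster can be sketched as the same frequency scaling light undergoes in an expanding universe (a hypothetical reading; the value of z and the cluster itself are illustrative):

```python
def redshift(frequencies, z):
    """Apply a cosmological-style redshift to a cluster of sonon
    frequencies: each frequency is divided by (1 + z), as for light
    in the expanding universe."""
    return [f / (1.0 + z) for f in frequencies]

# An A-major-ish cluster, shifted down by z = 0.1:
cluster = [440.0, 550.0, 660.0]
shifted = redshift(cluster, 0.1)   # each partial lowered by a factor of 1.1
```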

We can enquire if there is a concept of ‘aubit’ — the equivalent of a physical orbit. For example, if we can define the notion of ‘clustering’, then we can define clustering in relation to a notion of gravitation that operates in a cosmological context (the wave-time is bent in the soniverse as space-time bends in the universe).

We can create singularities: supermassive, hyper-localised gravity that acts to distort the surrounding wave-time akin to gravitational lensing. A listener may traverse a timeline in any direction.

We can define entropy, and declare it to increase and decrease as the soniverse expands and contracts. Entropy does not really derive from the expansion/contraction of the universe (they are related, but not consequential) but I have always assumed them to be quite deeply linked. I couldn’t explain to you why, but I like the idea of either decay-to-noise, or crystallisation-from-noise, and to me these are linked to the idea of big bang and big crunch in an oscillating universe.

In the beginning there was nothing. And a god said, “Let there be noise”.

We have yet to define other particles, such as electrons or quarks, or other atomic structures. If I were an academic, at this point I would say something like “we leave molecular interactions as an exercise for the reader”; however, I’m much more interested in the actions below.

We’ve started modelling this here:


  1. If you have no idea about any of the science, I’d love to hear your feedback about the ideas you took away from reading this. Over time I want to find ways of trying to write about this in an accessible way.

Acoustic cosmology at EMF camp 2016

03:08 by Gavin Starks

Evolving language

21:01 by Gavin Starks

Evolving my thoughts on the language of music, I wonder if our reductionist adoption of the language of physics (cf. Wishart, On Sonic Art) should extend to the language of cosmology and quantum mechanics?

Acoustic Cosmology - evolving language

Binary Dust at ISEA 2012

21:12 by Gavin Starks
Here’s a copy of the presentation I gave at ISEA 2012.


Heavenly Discourses – Acoustic Cosmology presentation

03:43 by Gavin Starks
Here’s a copy of the presentation I gave at Heavenly Discourses. I’ll post the video when I get it.

View more presentations from Gavin Starks

Listen to the colliding “Antennae Galaxies”

16:22 by Gavin Starks

Cross posting between my two blogs (not sure what the best approach is for this kind of thing, which is quite apt given the collision-nature of the content), but here’s a sneak preview of some of the work I’ll be presenting on Sunday…