Quantum Soundtrack

Allon Goldberg
4 min read · Apr 16, 2021

In the Fall of 2020, a team of students at Parsons School of Design in New York City showed how quantum behavior can be interpreted as artistic behavior using IBM’s cloud quantum computing servers and music.

Quantum computers leverage specific quantum-mechanical phenomena, namely superposition and entanglement, to do dense computation. Today’s technology is functionally useful yet imperfect: it battles hardware and manufacturing constraints as well as environmental electromagnetic interference, which shows up in results as noise. The Quantum Soundtrack project leverages superposition, entanglement, and that experimental interference to make music.

Many people know that the inner workings of “regular computers” are binary — everything boils down to 0s and 1s, or bits. Quantum computers are fundamentally different. They compute with qubits — “quantum bits” — which can be 0, 1, or any normalized combination of part 1 and part 0. So instead of 2 possible values, a qubit has an infinite number of potential states and can, in a sense, be 2 things at once.

Superposition describes how two potential outcomes of an event coexist before the event as probabilities in a probability wave. The wave cannot make up its mind on its own; it needs help collapsing out of waveform, and that is where you come in. By measuring the system, you force it to decide. The measured value, or quantum state, is the outcome the wave collapsed to. Which outcome you get is random, with odds set by the state itself.
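The collapse just described can be imitated classically: sample outcomes with probabilities given by the squared magnitudes of the amplitudes (the Born rule). A minimal sketch in Python — the example state and trial count are illustrative, not from the project:

```python
import random

def measure_qubit(alpha, beta, trials=10_000):
    """Classically simulate measuring a qubit in the state alpha|0> + beta|1>.

    The Born rule says each outcome occurs with probability equal to the
    squared magnitude of its amplitude; measurement "collapses" the wave
    to the observed value.
    """
    p0 = abs(alpha) ** 2  # probability of collapsing to 0
    counts = {0: 0, 1: 0}
    for _ in range(trials):
        outcome = 0 if random.random() < p0 else 1
        counts[outcome] += 1
    return counts

# An equal superposition (|alpha|^2 = |beta|^2 = 0.5) collapses to
# 0 and 1 roughly half the time each:
counts = measure_qubit(2 ** -0.5, 2 ** -0.5)
```

Run repeatedly, this is exactly the kind of weighted coin flip a real measurement performs — except on hardware, the randomness is physical rather than pseudorandom.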

Art as a practice boils down to an artist making decisions. Considering quantum randomness, the team asked: is a quantum computer an artist? They used music to argue yes. Just as a pianist chooses notes on a given scale, the quantum computer chooses a note by measuring the state of a qubit in superposition. This method produces songs that let listeners hear quantum behavior.

The project starts with a custom quantum circuit, the quantum analogue of a program in classical computing (see below).

4-Qubit Circuit, Screenshot IBM Circuit Composer

The circuit is made of 4 qubits, meaning it can output 1 of 16 (2⁴) potential numbers, with randomness specified by the circuit. It was designed to output only 1 of 12 numbers, but because of noise, the team knew they would also see the other 4 numbers in the results (see below).
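That behavior can be imitated classically. The sketch below is a stand-in for running the real circuit on IBM's hardware, not the team's code; the 5% noise rate and the uniform spread over the 12 designed outcomes are assumptions for illustration:

```python
import random

# Hypothetical stand-in for the team's circuit: of the 16 possible
# 4-bit outcomes, 12 are "designed" outcomes and the remaining 4
# appear only through hardware noise.
DESIGNED = list(range(12))        # outcomes the circuit targets
NOISE_ONLY = list(range(12, 16))  # outcomes that leak in via noise
NOISE_RATE = 0.05                 # assumed noise probability

def run_shot():
    """Sample one 4-qubit measurement, noise included."""
    if random.random() < NOISE_RATE:
        return random.choice(NOISE_ONLY)
    return random.choice(DESIGNED)

# A "run" of the circuit is many shots; each shot yields one number.
shots = [run_shot() for _ in range(2000)]
```

On real hardware the designed outcomes would not be uniformly distributed — their probabilities are shaped by the gates in the circuit — but the structure is the same: mostly intended values, plus a noisy tail.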

Using Python, the team transposed each numerical output onto a musical scale. In the first iteration, they ran the circuit a couple thousand times and got a unique, one-time song in MIDI format, a couple thousand notes long, that the quantum computer composed in around 30 seconds. At 120 beats per minute, that made a song over an hour in length. Loaded into a digital audio workstation (DAW) such as Ableton Live or Logic Pro X, the MIDI data can be heard as sound.
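One way to do such a transposition is to wrap each outcome onto a scale, climbing an octave when the outcome runs past the scale's length. The scale choice and the mapping below are assumptions for illustration, not the team's actual code:

```python
# Hypothetical mapping from circuit outcomes to MIDI pitches on a
# C major scale; the scale and octave behavior are illustrative.
C_MAJOR = [60, 62, 64, 65, 67, 69, 71]  # MIDI: C4 D4 E4 F4 G4 A4 B4

def outcome_to_midi(outcome):
    """Wrap an outcome onto the scale, climbing an octave as needed."""
    octave, degree = divmod(outcome, len(C_MAJOR))
    return C_MAJOR[degree] + 12 * octave

# Five sample outcomes become five pitches:
melody = [outcome_to_midi(n) for n in [0, 4, 7, 11, 2]]
# -> [60, 67, 72, 79, 64]
```

A list of MIDI note numbers like this, paired with timing information, is all a DAW needs to play the song back.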

Using common musical effects like compression, distortion, reverb, delay, and modulation, the team took other data from the circuit, such as the noise and the complex numbers attached to each quantum state, and used it to texturize the music, making it sound more dynamic, i.e., less robotic (see below).

Individual Qubit States, Phases

To add depth, the team used every 8th musical note as the bass note for the following bar and added rhythm with human-played percussion. The main melody is a summing instrument made of a flute, a piano, a trumpet, and a guitar, each representing a qubit with its own texture and phase (see above). Just as the four qubits in the circuit combine to form the possible music notes, the four texturized and phased instruments play the notes together.
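The every-8th-note bass rule can be sketched as a simple list transformation; the octave drop below is an assumption added for audibility, not a detail from the project:

```python
def add_bass(melody, bar_len=8):
    """Pair each bar of the melody with a bass note: the last (8th)
    note of each bar becomes the bass held under the *next* bar,
    dropped an octave (12 MIDI semitones) -- the octave shift is an
    illustrative assumption."""
    bars = [melody[i:i + bar_len] for i in range(0, len(melody), bar_len)]
    bass = [None]  # no bass under the first bar
    for bar in bars[:-1]:
        bass.append(bar[-1] - 12)  # 8th note of the previous bar, octave down
    return list(zip(bars, bass))

# Two bars of sample melody; the second bar carries bass 60
# (the 8th note, 72, dropped an octave):
paired = add_bass([60, 62, 64, 65, 67, 69, 71, 72,
                   74, 76, 77, 79, 81, 83, 84, 86])
```

Each `(bar, bass)` pair can then be written out as two parallel MIDI tracks.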

The team also made experiment-specific album covers from the same data as the song. They traded musical notes for colors and rendered a visual with the common Python plotting library matplotlib, coloring the noise data on a separate color scale. In theory, you could follow the song as you listen by looking at the artwork (below).
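A note-to-color rendering in that spirit might look like the sketch below. The hue mapping (one hue per chromatic note) and the grid width are assumptions, not the team's actual generation method:

```python
import colorsys
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

def note_to_rgb(note):
    """Map one of 12 chromatic notes to a hue on the color wheel
    (the note-to-hue assignment is an illustrative assumption)."""
    return colorsys.hsv_to_rgb(note / 12.0, 1.0, 1.0)

def album_cover(notes, width=30, path="cover.png"):
    """Lay the song's notes out row by row as colored cells and save
    the result as an image; returns the RGB grid that was drawn."""
    full_rows = [notes[i:i + width]
                 for i in range(0, len(notes) - width + 1, width)]
    grid = [[note_to_rgb(n) for n in row] for row in full_rows]
    plt.imshow(grid)
    plt.axis("off")
    plt.savefig(path, bbox_inches="tight")
    plt.close()
    return grid

# 120 notes cycling through the chromatic scale -> a 4x30 cover:
grid = album_cover([n % 12 for n in range(120)])
```

Reading the cover left to right, top to bottom then retraces the song note by note, which is what makes "following along" possible.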

Album Cover Generation Method



Allon Goldberg

Engineering Designer | Creative Problem Solver | IBM Quantum Practitioner Certified