Designing algorithms that distribute agency: On 'Lotteries We All Lose'

February 23, 2024

In short: How can algorithms distribute agency and magnify curiosity? Glenn McDonald draws from his experience designing music algorithms at Spotify to examine how “systemically moral” algorithms can shift cultural validation from lotteries toward meritocracies.

At Imbue, we think a lot about how technology can give people more agency rather than acting as a subtle mechanism of control. One way this shows up in our lives is through music recommendation algorithms, which purport to introduce new artists aligned with our tastes, but instead often serve us platter after platter of acoustically similar songs, with little ability for artists or listeners to steer the recommendations.

In “Lotteries We All Lose,” Glenn McDonald, creator of the incredible viral website “Every Noise at Once,” explains that music algorithms act like lotteries, scattering streams to less popular artists essentially at random. But a lottery is not a morally progressive system, because these algorithms do not support agency or community:

Disproportionately concentrating streams among the most popular artists is straightforwardly regressive, but distributing streams to less popular artists is not itself necessarily progressive. A morally progressive algorithm distributes agency: it gives listeners more control, or it encourages and facilitates their curiosity; it helps artists find and build community and thus career sustainability. Holistically, it rewards cultural validation, and thus shifts systemic effects from privilege and lotteries towards accessibility and meritocracies.

While using ML to recommend similar-sounding music may seem desirable, it places artists in a lottery-like system that offers a “fleeting illusion of access” rather than a sustainable career. Music recommendation becomes a “chaos engine printing rigged lottery tickets” in which artists cannot express their creativity and listeners cannot exercise the agency to hone their tastes.

How can we escape this chaos engine? McDonald suggests that magnifying curiosity can be a means of decentralizing power:

The algorithms I wrote to generate playlists for the genre system I used to run at Spotify were not explicitly conceived as moral machines, but they inevitably expressed things I believed by virtue of my involvement, and thus were sometimes part of how I came to understand aspects of my own beliefs. They were proximally motivated by curiosity, but curiosity encodes an underlying faith in the distribution of value, so systems designed to reflect and magnify curiosity will tend towards decentralization, towards resistance against the gravity of power even if they aren’t consciously counterposed, ideologically, against the power itself. The premise of the genre system was that genres are communities, and so most of its algorithms tried to use fairly simple math to capture the collective tastes of particular communities of music fans.  

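To make the “fairly simple math” of that quote concrete, here is a minimal, purely illustrative sketch in Python, not McDonald’s or Spotify’s actual code, of one way to capture a community’s collective taste: rank tracks by how disproportionately the members of a community play them compared to listeners overall. All of the data and names below are hypothetical.

```python
from collections import Counter

# Hypothetical listening logs: each listener's recently played tracks.
plays = {
    "listener_a": ["track_1", "track_2", "track_3"],
    "listener_b": ["track_2", "track_3", "track_4"],
    "listener_c": ["track_3", "track_5"],
    "listener_d": ["track_6", "track_7"],
}

# A "genre community" here is simply a set of listeners who share a taste.
community = {"listener_a", "listener_b", "listener_c"}


def community_playlist(plays, community, top_n=5):
    """Rank tracks by how distinctive they are to a community: the share of
    community members who play a track, minus its share among all listeners,
    so globally popular tracks don't automatically dominate the playlist."""

    def listener_share(listeners):
        counts = Counter()
        for listener in listeners:
            counts.update(set(plays[listener]))  # count each listener once per track
        return {track: n / len(listeners) for track, n in counts.items()}

    in_community = listener_share(list(community))
    overall = listener_share(list(plays))

    # A positive score means the community loves the track disproportionately.
    scores = {t: share - overall.get(t, 0.0) for t, share in in_community.items()}
    # Break ties alphabetically so the ranking is deterministic.
    return sorted(scores, key=lambda t: (-scores[t], t))[:top_n]


print(community_playlist(plays, community))
# ['track_3', 'track_2', 'track_1', 'track_4', 'track_5']
```

The point of this toy example is the design choice it embodies: the playlist is derived from what a community of fans collectively does, not from acoustic similarity alone, which is how an algorithm can reflect a community’s taste rather than merely echoing each listener’s past plays.
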
Right now, algorithms approximate love by showing us more of what we loved before. But McDonald points out that “machines that do not run on love will not produce it.” As his piece reminds us, real love is not deciding for someone else what is worth loving; it is giving them the freedom, power, and resources to figure it out on their own.