Biography, CV, publications, activities, links

I am a researcher and maker working with creative technologies. I come from a highly diverse academic background spanning social anthropology, evolutionary and adaptive systems, music informatics and interaction design, alongside a parallel career of over 15 years in electronic music and digital art. I am interested in how artists, designers and musicians can use advanced computing technologies to produce complex creative works. My current active research areas include media multiplicities, musical metacreation, the theories and methodologies of computational creativity, new interfaces for musical expression, and multi-agent models of social creativity.

Following a bachelor's degree in mathematics and social anthropology at the University of Cambridge and a master's in evolutionary and adaptive systems at the University of Sussex, I undertook a PhD on simulation models of the evolution of human musical behaviour at the Intelligent Sound and Music Systems group, Goldsmiths College, London, under the supervision of Professor Geraint Wiggins. In 2008 I moved to Melbourne, Australia, to work as a postdoctoral researcher with Jon McCormack at Monash University on an ARC Discovery Project exploring ecosystemic approaches to digital creativity. In 2011 I took up a lectureship at the Design Lab at the University of Sydney, where I currently hold a postdoctoral fellowship.

During this time I have played in a number of improvised music ensembles, most notably my electronic music duo Icarus (with Sam Britton), the Not Applicable Artists Collective (with Sam Britton, Tom Arthurs, Lothar Ohlmeier, Maurizio Ravalico, Britt Hatzius, Martin Hampton, Rudi Fischerlehner and Oliver Dukert), and Tangents (with Peter Hollo, Adrian Lim-Klumpes, Shoeb Ahmad and Evan Dorrian). I am also a collaborating member of the digital interactive art group Squidsoup, and have collaborated with many artists, musicians and designers on digital interactive artworks.

I am a founding member of the Musical Metacreation Research Network, a member of the steering committee for the International Conference on Computational Creativity, and a creative advisor to the University of Sydney Vivid Festival.

I have performed, composed and created interactive works in a number of countries, contexts and collaborations, such as...

Sonic Acts Festival Amsterdam, Vivid Sydney, AudioVisiva Milan, Bimhuis Amsterdam, Scala London, Oslo Lux, Kinetica Art Fair, Powerhouse Sydney, ISEA, Four Tet (remix), Murcof (remix), The Leaf Label, Output Recordings, Cafe Oto, Adem Ilhan (duo), Futuresonic, Dysfunktional Beats, The NOW Now, Eclectic Method feat. Chuck D (remix), ABC New Music Up Late, ABC Sound Quality, Codame Festival San Francisco, ABC Catalyst, Science Museum London, FBi Ears Have Ears, Weirdcore, Shepherds Bush Empire London, The Monastery of Sound, JazzHouse Copenhagen, Norberg Festival Sweden, Rump Recordings, Jazzjuice Festival Aarhus, HellosQuare, Sage Gateshead, North Sea Jazz Festival, King's Place London, Brussels Planetarium, The Wire Magazine, BBC Mixing It, BBC Late Junction, Shunt London, Cube Cinema Bristol, Temporary Residence, Siouxsie Sioux (remix), Sonic Arts Network Expo, Ars Electronica, Transmediale Festival, Cave12 Geneva, Caribou (remix), Aphex Twin (software development), Lux Cinema London, PRS Foundation for New Music, STEIM Amsterdam, xCoAx, Dispatch Festival Belgrade, Seymour Centre, Australia Council, British Council.


“Just by rotating this knob, any one of you can produce up to three sonatas per hour. Yet consider how hard it was for your ancestors. They could be creative only by driving themselves into fits of ‘inspiration’— an unknown form of epilepsy.” Yevgeny Zamyatin, "We".

Aphex Twin Remote Orchestra

In 2012 Icarus created the software for Aphex Twin's notorious Remote Orchestra project at the Barbican Hall.

In 2012 Icarus (Ollie Bown and Sam Britton) were commissioned by the legendary electronic music producer Aphex Twin to create the live performance software for his Remote Orchestra project. The concept involved giving every member of a 32-piece orchestra headphones, over which simple tones could be sent for the players to match. Aphex wanted an interface that would allow him to meta-control the 32 frequencies by manipulating their distribution in an intuitive (and outrageously atonal) way. The work was performed to a sell-out audience at the Barbican Hall in October 2012.
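
Icarus's actual software is their own; purely as an illustration of what meta-controlling a frequency distribution could mean, the hypothetical sketch below derives all 32 player tones from just two controls, a centre frequency and a spread:

    // Hypothetical sketch: deriving 32 player frequencies from two
    // meta-parameters, a centre frequency and a spread exponent.
    // Illustrative only; not the software used at the Barbican.
    public class ToneDistribution {

        static final int PLAYERS = 32;

        // centreHz anchors the distribution; spread warps how the players
        // sit around the centre (1.0 = even, >1.0 = clustered centrally).
        static double[] frequencies(double centreHz, double spread) {
            double[] freqs = new double[PLAYERS];
            for (int i = 0; i < PLAYERS; i++) {
                // position of this player in [-1, 1] around the centre
                double t = 2.0 * i / (PLAYERS - 1) - 1.0;
                // warp the position, keeping its sign
                double warped = Math.signum(t) * Math.pow(Math.abs(t), spread);
                // map to +/- two octaves around the centre frequency
                freqs[i] = centreHz * Math.pow(2.0, 2.0 * warped);
            }
            return freqs;
        }

        public static void main(String[] args) {
            // one control gesture: cluster the orchestra around concert A
            for (double f : frequencies(440.0, 1.5)) {
                System.out.printf("%.1f Hz%n", f);
            }
        }
    }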


Distributed Interactive Audio Devices

The Distributed Interactive Audio Devices (DIADs) are an experimental platform for exploring sound and interaction in the context of networked portable systems.

In their use in music performance, the categories of performance device, loudspeaker system and compositional structure become entangled. The members of the audience are invited to be a part of the piece through their interaction with a set of DIADs. In taking hold of the individual sculptural and mechanical units, they find themselves holding and passing around the loudspeaker system itself, so that rather than the audience being immersed in surround sound, the sound sits amongst the audience. The work transforms how we perceive live electronic music, bringing the audience into the performance and allowing non-linear, game-based flows in the compositional structure, driven by interaction. It extends collaborative and game-based approaches to composition and performance, as seen in the work of artists such as The Hub, Nick Didkovsky, Jon Rose and John Zorn, but also involves the audience in the gameplay.

The work takes the form of a live electronic music performance, with a performer on stage remote-controlling the DIADs system, and a set of 8 to 15 interactive, wirelessly connected sound devices. The musical sound is dispersed across the devices, which are embedded in curious tactile shell designs. The audience is invited to take the DIADs in their hands, pass them around, and discover gestures that provoke responses from the units. The gestures cause changes in the structure of the overall piece, producing a game-like interplay between the audience and the performer.

Each ball contains a small WiFi-enabled computer, with loudspeaker and sensors (accelerometer and gyroscope). The balls can be ‘live coded’ from a host computer, running synchronised real-time generative audio software that incorporates the sensors’ activity into the sonic behaviour. They are fully portable and interactive, and are encased in 3D printed shells by architects reinhardtjung. Woolly covers by crochet artist Kirsten Fredericks are optional.
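
As a rough illustration of the kind of sensor-to-sound loop each unit runs (the real DIAD software is more elaborate, and the sensor call below is a hypothetical stub):

    import java.util.Random;

    // Hypothetical sketch of a DIAD-style control loop: poll the
    // accelerometer, detect a 'shake' gesture, and advance the piece's
    // state so the generative audio layer changes behaviour.
    public class DiadLoop {

        enum Section { DRONE, PULSE, SCATTER }

        private Section section = Section.DRONE;
        private final Random stubSensor = new Random(); // stands in for real hardware

        // Stub: a real unit would read its accelerometer here.
        double accelMagnitude() {
            return stubSensor.nextDouble() * 3.0; // pretend g-force
        }

        void step() {
            if (accelMagnitude() > 2.5) { // crude shake threshold
                // advance to the next section of the piece
                section = Section.values()[(section.ordinal() + 1) % Section.values().length];
                System.out.println("Gesture detected, now playing: " + section);
                // ...and notify the host and the synthesis layer here
            }
        }

        public static void main(String[] args) throws InterruptedException {
            DiadLoop diad = new DiadLoop();
            while (true) {
                diad.step();
                Thread.sleep(20); // ~50 Hz polling
            }
        }
    }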

Musical Metacreation

Research into software that actively supports creative practice.

Musical Metacreation (MuMe) is an emerging term for the body of research concerned with the automation of any or all aspects of musical creativity. It looks to bring together and build upon existing academic fields such as algorithmic composition, generative music, machine musicianship and live algorithms. It is understood as a branch of computational creativity: the study of autonomous systems that produce outputs that, had a human produced them, would be deemed creative. It involves a broad community across research, practice and industry, with a growing body of literature showing the automation of creative processes to be an active reality, not a distant research objective. The consolidation of these various strands of research in an active community is timely, as creative music software flourishes, creative digital music practice becomes ever more nuanced and innovative, and big data and artificial intelligence expand their reach into a growing number of areas of human activity, the creative arts being no exception. These developments are set against a backdrop of philosophical enquiry into how the automation of creativity sheds light on human creativity and the possibility of artificial creativity.

The MuMe Research Network

The Musical Metacreation Research Network was founded by Philippe Pasquier, Arne Eigenfeldt and Ollie Bown. It runs an annual workshop as part of the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE), and regular concerts, tutorials and other events around the world. In 2015 we introduced a series of events called Musebot Ensembles, in which researchers' autonomous musical agents are brought together in collaborative music making.

Musical Metacreation Research Network

System Development

Since 2010 I have been working on a system for live improvisation with human performers called Zamyatin.

Zamyatin is part of an ongoing study into software systems that act in performance contexts with autonomous qualities. The system comprises an audio analysis layer, an inner control system exhibiting a form of complex dynamical behaviour, and a set of "composed" output modules that respond to the patterned output from the dynamical system. The inner system consists of a bespoke "decision tree" that is built to feed back on itself, maintaining both a responsive behaviour towards the outside world and a generative behaviour driven by its own internal activity. The system has been evolved using a database of previous work by the performer, to find interesting degrees of interaction between this responsivity and internal generativity. Its output is 'sonified' through different output modules: mini generative algorithms composed by the author. Zamyatin's name derives from the Russian author whose dystopian vision included machines for systematic composition that removed the savagery of human performance from music. Did he ever imagine the computer music free-improv of the early 21st century?
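
A toy version of that feedback arrangement might look like the following, where a stand-in threshold rule replaces the evolved decision tree:

    // Toy sketch of Zamyatin's feedback arrangement: the decision layer
    // sees both incoming audio features and its own previous output, so it
    // can be responsive and self-driven at once. The actual system uses an
    // evolved decision tree; the comparison rule below is a stand-in.
    public class FeedbackDecider {

        private double[] lastOutput = new double[4];

        double[] decide(double[] audioFeatures) {
            // concatenate external input with the system's own prior state
            double[] input = new double[audioFeatures.length + lastOutput.length];
            System.arraycopy(audioFeatures, 0, input, 0, audioFeatures.length);
            System.arraycopy(lastOutput, 0, input, audioFeatures.length, lastOutput.length);

            double[] output = new double[4];
            for (int i = 0; i < output.length; i++) {
                // stand-in decision rule: compare two input dimensions and
                // emit a control value; an evolved tree would branch here
                output[i] = input[i % input.length] > input[(i + 3) % input.length] ? 1.0 : 0.0;
            }
            lastOutput = output; // feed back for the next time step
            return output;       // routed on to the composed output modules
        }

        public static void main(String[] args) {
            FeedbackDecider d = new FeedbackDecider();
            double[] features = {0.2, 0.9, 0.1, 0.5}; // e.g. loudness, pitch, noisiness, onsets
            for (int step = 0; step < 5; step++) {
                System.out.println(java.util.Arrays.toString(d.decide(features)));
            }
        }
    }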

Zamyatin has performed with Finn Peters, Tom Arthurs, Lothar Ohlmeier, François Houle, Ben Carey, Peter Hollo, Evan Dorrian, Steffan Ianigro and Roger Dean in Europe, North America and Australia.


Beads

Realtime audio for Java/Processing

Beads is a software library written in Java for realtime audio. It was started by Ollie Bown in 2008. It is an open source project and has been developed with support from Monash University in Melbourne, via the Centre for Electronic Media Art's ARC Discovery Grant Project "Creative Ecosystems", and a Small Grant for Early Career Researchers from the Faculty of Information Technology. Beads contributors include Ollie Bown, Ben Porter and Benito.
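
A minimal Beads program, along the lines of the library's canonical sine-tone example:

    import net.beadsproject.beads.core.AudioContext;
    import net.beadsproject.beads.data.Buffer;
    import net.beadsproject.beads.ugens.Gain;
    import net.beadsproject.beads.ugens.WavePlayer;

    // Minimal Beads example: a 440 Hz sine tone through a gain stage.
    public class HelloBeads {
        public static void main(String[] args) {
            AudioContext ac = new AudioContext();
            WavePlayer wp = new WavePlayer(ac, 440.0f, Buffer.SINE);
            Gain g = new Gain(ac, 1, 0.1f); // 1 channel, gentle level
            g.addInput(wp);
            ac.out.addInput(g);
            ac.start();                     // audio runs on its own thread
        }
    }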

An introductory guide to Beads, called Sonifying Processing, has been contributed by Evan Merz.

Beads Homepage.

Solstice LAMP

Large scale digital interactive artwork at Sydney's Vivid Festival in 2013.

Solstice LAMP was commissioned by AMP for the Amplify and Vivid Festivals, and was a collaboration between Miriama Young of the Sydney Conservatorium of Music, and Oliver Bown, Martin Tomitsch and Luke Hespanhol of the Design Lab.

An interactive installation on the concourse outside the famous AMP building at Circular Quay in Sydney drew people's outlines on the floor, guiding a responsive musical composition. Once a minute, the shapes made by people on the floor would float off, up the building, metamorphose into birds and fly away.
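
Purely as an illustration of the kind of mapping involved (the Blob type and the rules below are invented, not the installation's actual code), tracked floor outlines might steer the composition like this:

    import java.util.List;

    // Hypothetical sketch of a Solstice LAMP-style mapping: tracked
    // outlines on the floor steer parameters of a responsive score.
    public class FloorMapping {

        record Blob(double x, double y, double area) {}

        // More people and larger outlines -> denser, louder music;
        // average position pans the sound across the concourse.
        static double[] musicParams(List<Blob> blobs) {
            double density = Math.min(1.0, blobs.size() / 20.0);
            double totalArea = blobs.stream().mapToDouble(Blob::area).sum();
            double loudness = Math.min(1.0, totalArea / 50.0);
            double pan = blobs.stream().mapToDouble(Blob::x).average().orElse(0.5);
            return new double[] { density, loudness, pan };
        }

        public static void main(String[] args) {
            List<Blob> people = List.of(new Blob(0.3, 0.7, 2.0), new Blob(0.8, 0.2, 1.5));
            System.out.println(java.util.Arrays.toString(musicParams(people)));
        }
    }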

Solstice LAMP Video.

Fake Fish Distribution

An album in 1,000 variations, by Icarus

In 2012 Icarus (Ollie Bown and Sam Britton), with support from STEIM Amsterdam, the PRS Foundation and Ableton, created a parametrically distributed album "in 1,000 variations". They created their own custom add-ons to Ableton's Live music production suite, allowing them to turn a dial and flick through 1,000 different versions of the tracks whilst working on them. The 1,000 versions of each track were automatically rendered and compiled into digital albums, which have been sold online one at a time. As well as examining interesting new parametric production techniques for music, the work explores ownership and uniqueness in the digital age.
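
The general mechanics can be pictured roughly as follows; the parameter names and the simple linear blend here are invented for illustration, the actual add-ons inside Live being considerably more sophisticated:

    // Rough illustration of parametric album rendering: a version index
    // from 0-999 is mapped to a blend between two composed extremes of
    // each track parameter.
    public class VersionRenderer {

        // one parameter with two composed extremes
        record Param(String name, double atVersion0, double atVersion999) {
            double valueFor(int version) {
                double t = version / 999.0;
                return atVersion0 + t * (atVersion999 - atVersion0);
            }
        }

        public static void main(String[] args) {
            Param[] params = {
                new Param("filterCutoff", 200.0, 8000.0),
                new Param("swingAmount", 0.0, 0.6),
            };
            for (int version : new int[] {0, 500, 999}) {
                System.out.print("version " + version + ": ");
                for (Param p : params) {
                    System.out.printf("%s=%.2f ", p.name(), p.valueFor(version));
                }
                System.out.println();
            }
        }
    }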

Icarus - Fake Fish Distribution


ANIM.OS

Networked audio video system.

ANIM.OS is a collaboration between Ollie Bown and Canadian singer and media artist Erin Gee. A network of computers is turned into a distributed audio-visual instrument that can be controlled directly or allowed to exhibit emergent behaviour arising from the interaction between its elements.
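
In the spirit of that idea (and not ANIM.OS's actual code), a toy sketch of emergence through interaction between networked elements: each node nudges its oscillator phase towards its ring neighbours', and the ensemble drifts from independent behaviour into collective synchrony.

    // Toy illustration of emergent behaviour in a coupled network:
    // nodes start with random phases and pull towards their neighbours,
    // so a collective state emerges that no single node dictates.
    public class CoupledNodes {
        public static void main(String[] args) {
            int n = 8;
            double coupling = 0.1;
            double[] phase = new double[n];
            for (int i = 0; i < n; i++) phase[i] = Math.random() * 2 * Math.PI;

            for (int step = 0; step < 200; step++) {
                double[] next = phase.clone();
                for (int i = 0; i < n; i++) {
                    // pull toward the two ring neighbours
                    next[i] += coupling * Math.sin(phase[(i + 1) % n] - phase[i]);
                    next[i] += coupling * Math.sin(phase[(i + n - 1) % n] - phase[i]);
                }
                phase = next;
            }
            // after coupling, the phases cluster: the emergent collective state
            for (double p : phase) System.out.printf("%.2f%n", p % (2 * Math.PI));
        }
    }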