Interview: 5 Minutes Paul Wolinski of 65daysofstatic
Interview by Shannon Lawlor
65daysofstatic are a four-headed experimental electronic-rock band from Sheffield, UK. Formed in 2001, the band have released numerous singles, EPs, remixes, soundtracks and five full-length LPs, alongside other rare ephemera, across an extensive back catalogue. After exploding onto the underground post-rock scene with their highly acclaimed 2004 debut The Fall of Math, released via Monotreme Records, 65daysofstatic became widely recognized for their eclectic merging of genres and tones. 2010 marked a notable shift in direction, shaking off the inevitable “math-rock” tag the band had previously fallen under: on We Were Exploding Anyway, released via Hassle Records, 65 made extensive use of electronics, from programmed basslines to beats, further solidifying the band’s vision of dance-oriented music.
In 2014, 65daysofstatic were commissioned to compose an original soundtrack to the now-acclaimed video game No Man’s Sky, focusing on texture and atmosphere rather than their typically structured full-band writing sessions. Thus, in 2016, No Man’s Sky: Music for an Infinite Universe was born – met with high praise from critics and fans alike, and eventually spawning a world tour in support of the band’s dynamically atmospheric soundtrack.
The band have since been preparing an exclusive, exciting performance piece titled Decomposition Theory, premiering across three separate live shows at Algomech Festival in Sheffield, 9-10 November. The performance will see 65daysofstatic explore new depths in composition by experimenting with algorithmic music techniques, procedurally generated audio and live-coded noise – “Processes can be combined, rewritten, manipulated or ignored. Each performance will be a unique curation of algorithms, coded by 65 to generate live music.”
We caught up with Paul Wolinski of 65daysofstatic to speak on their upcoming performance piece Decomposition Theory:
Your special new live performance, titled Decomposition Theory or How I Learned to Stop Worrying and Demand The Future premieres at Algomech Festival in Sheffield, UK. How will this be unique compared to a typical 65daysofstatic show?
It’s shaping up to be very different from a regular 65 show. Decomposition Theory is designed to be ever-changing. There’s no back catalogue and the new material will never be the same. It’s a collaboration between us and an algorithmic musical framework we built that generates new streams of music every time we run it. Nevertheless, it will be structured, it’s not improvisation. Also, it’s not even really a performance. The four of us will be there, and we will be working on/with the music, and it will all be really happening live. There’s no backing tracks or anything like that. But if anything it will be more like us being in a workshop than on a stage. There’s no simple way to bend this idea to fit the usual expectations that come along with the concept of a ‘live performance’, so that’s something we are not even going to try. We’re just focusing on making it a great experience for people in the room.
With the music being generated by programs on the fly, what pieces of software will be essential to realizing Decomposition Theory?
It’s a mixture of approaches. We are trying to think of the shows in Sheffield as just one example of Decomposition Theory. There are definitely pieces of software we’ve built that won’t get used in these initial shows. It’s been a steep learning curve, but at the same time, succeeding at a particular problem throws open new possibilities, and the original idea we’d been trying to realize gets left behind as we follow up unexpected results we find interesting. Also, sometimes ideas, once realized, aren’t as good as less complicated approaches. We’re pretty merciless in that regard, as we don’t want to bend the music to fit the software, even if it’s taken us months to iron out.
More specifically, I suspect the show in Sheffield will be driven by a large collection of custom Max/MSP patches that include bespoke live-coding tools, running inside Max For Live. We’ve used TidalCycles, and FMOD in development too, as well as a host of different pieces of custom software written ourselves in Unity.
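To give a flavour of the pattern-based live coding that tools like TidalCycles enable, here is a loose, hypothetical sketch – not the band’s actual patches or code – of the core idea: a short text pattern is divided evenly across a repeating cycle and turned into timed events, with `~` marking a rest as in Tidal’s mini-notation.

```python
# Hypothetical sketch of the TidalCycles idea: a text pattern becomes
# a list of (start_time, duration, sound) events within one cycle.
# Illustration only, not 65daysofstatic's actual tooling.

def pattern_events(pattern, cycle=0):
    """Divide one cycle evenly among the steps of a space-separated pattern."""
    steps = pattern.split()
    dur = 1.0 / len(steps)
    # "~" is a rest, so it produces no event
    return [(cycle + i * dur, dur, s) for i, s in enumerate(steps) if s != "~"]

events = pattern_events("bd ~ sn hh")
# kick at the start of the cycle, snare halfway, hi-hat on the last quarter
```

In a live-coding set, the performer retypes the pattern string mid-performance and the scheduler picks up the new events on the next cycle – which is what makes the music editable in realtime.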
What are some of the risks of implementing procedurally generated audio into a live setting?
There’s the risk that everything crashes, but that’s happened to us about a billion times already. It’s not that we’re inured to it now, but at least it wouldn’t be that much of a surprise to the crowd. The bigger risk is that what gets procedurally generated isn’t very good. Editing and rejecting bad ideas is a crucial part of songwriting. I can’t speak for all musicians, but the vast majority of everything 65 creates is destroyed because it wasn’t good enough. And songs that make it on to our records go through countless iterations before they reach their final state. That’s not possible when you’re hearing the music at the same time as the audience. So the emphasis has to be different. It has to be about the process and realtime sculpting of the music rather than any notion of a finished song. And so another big risk is that this isn’t communicated to the audience, I guess, and so we don’t successfully re-contextualise what’s happening as far as the people listening are concerned.
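That “editing and rejecting” step can itself be pushed into code. A hypothetical sketch, not the band’s actual system: generate candidate phrases at random, then discard any that fail simple musical heuristics, so only curated material ever reaches the speakers. The scale, phrase length and heuristics below are all illustrative choices.

```python
import random

SCALE = [0, 2, 3, 5, 7, 8, 10]  # C natural minor, as semitone offsets

def candidate(rng, length=8):
    """Generate one random phrase of scale degrees."""
    return [rng.choice(SCALE) for _ in range(length)]

def acceptable(phrase):
    """Toy heuristics: end on the tonic, avoid leaps wider than a
    perfect fifth, and use at least four distinct pitches."""
    if phrase[-1] != 0:
        return False
    if any(abs(a - b) > 7 for a, b in zip(phrase, phrase[1:])):
        return False
    return len(set(phrase)) >= 4

def curated_phrase(seed=0):
    """Keep generating until a candidate survives the filter."""
    rng = random.Random(seed)
    while True:
        p = candidate(rng)
        if acceptable(p):
            return p
```

The point is that the rejection criteria encode taste in advance, since there is no time to audition and discard material once the audience is listening.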
65 have posted potential teaser material for Decomposition Theory online. How important do you feel imagery is to the live performance, and will these visual aids be vital to the music?
The experience of a live show from the perspective of the audience is something we think about a lot. Visual aids have strengths and weaknesses. They are no replacement for the physicality or virtuosity of a live performance, but when you’re thinking about algorithmic music, then there’s a good chance that a lot of that involves standing quite still behind a laptop. And that’s not particularly interesting to watch. So you either make it all about the music and play in the dark, or try and figure out other ways to embrace the site-specificity. You have to find a reason to make it worthwhile for people to be in the room with you, rather than just listening to it online/in headphones.
What kind of ‘algorithmic coding’ is going to fuel Decomposition Theory?
A very subjective kind. We’re not so bold as to think we are creating universal, neutral tools that can be used by anybody to make whatever kind of music they want. We are making biased, partisan algorithms that will make music according to our very specific demands. They’re going to make 65daysofstatic music.
The development process has swung back and forth between live-coding, in the sense that we’ll be typing patterns of code onstage and turning them into music in realtime, and then running algorithms through more complicated code we have pre-written, which are generating melodies and rhythms and song structures and so on much faster than we would be able to type. I suspect by the time the shows happen, it’s going to be leaning toward the latter, but that could still change.
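As a rough illustration of that second, pre-written approach – a sketch under assumed details, not the band’s real algorithms – a seeded generator can produce a melody and a rhythm far faster than anyone could type them live. The rhythm uses a simple accumulator variant of the Euclidean-rhythm idea, which spreads a number of onsets as evenly as possible across the steps of a bar; the melody is a clamped random walk over a scale.

```python
import random

def euclidean(pulses, steps):
    """Spread `pulses` onsets near-evenly over `steps` slots
    (accumulator variant of the Euclidean-rhythm idea)."""
    return [(i * pulses) % steps < pulses for i in range(steps)]

def melody(rng, scale, length):
    """Random walk over scale degrees, clamped to the scale's range."""
    idx, out = 0, []
    for _ in range(length):
        idx = max(0, min(len(scale) - 1, idx + rng.choice([-2, -1, 0, 1, 2])))
        out.append(scale[idx])
    return out

rng = random.Random(65)
hits = euclidean(3, 8)  # onsets on steps 0, 3 and 6: the familiar tresillo
notes = melody(rng, [60, 62, 63, 65, 67, 70], 8)  # MIDI notes from a C-minor scale
```

Re-seeding the generator produces a different but structurally similar stream every run, which matches the description above of material that is structured yet never the same twice.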
Feeling more like an art or sound-installation performance piece, could Decomposition Theory appeal to a wider, more club-oriented audience?
Yes – as mentioned earlier, this is more of a new approach to composition than a specific project. The premiere is at an art gallery, with a mechanical piano on hand, and an old-school rave sound system to power it all. So the show is going to make the most of those elements. There’s no reason we couldn’t use the same central idea to do a more club-oriented show, or similarly a noisier festival-style show. Hopefully it will be fairly flexible in this respect.
As a quartet, how does Decomposition Theory differ from a usual rehearsal or performance between four individual human beings? (I almost imagine your instruments or hardware telling YOU what to do!)
It’s all new to us. As usual we’re making it up as we go along. It’s certainly different to regular rehearsals because there is nothing to learn in terms of song arrangements or fixed melodies. The music keeps being different every time we play. It can get confusing.
Has the preparation process behind Decomposition Theory been a refreshing or cathartic experience compared with previous, familiar territory?
It’s just usual 65 territory really. Our eyes are always drawn to whatever is slightly beyond wherever we find ourselves.
Will Decomposition Theory ever see digital or physical imprisonment in some shape or form?
No idea. The idea of making not-easily-commodifiable music is one of Decomposition Theory’s not-explicit but central themes, probably. I’m sure it would make our record label breathe a sigh of relief if the answer turns out to be yes, though.
Be sure to catch 65daysofstatic performing Decomposition Theory at Algomech Festival, 9-10 November, Sheffield, United Kingdom.
For more information follow 65daysofstatic on Facebook