Ascendance is an abstract animation project in VR 360 video (monoscopic) format with spatialized audio. It is best viewed in a Head Mounted Display (e.g. Quest, Vive, Rift) rather than via YouTube in a browser window, but if you wear decent headphones the browser experience can at least represent the spatialized sound: when you use the mouse to navigate the 360 viewing space, the audio direction shifts according to where you are looking.

Oskar Fischinger still image

The intention behind the creation of this film was to re-imagine, using modern software tools, the ‘Visual Music’ or ‘Colour Music’ abstract animation forms that were conceived by the pioneering artists of this field in the early/mid 20th century. Visual practitioners such as Oskar Fischinger, Norman McLaren, Mary Ellen Bute, John Whitney, Jordan Belson, Len Lye et al. explored this territory using the analogue tools that were available at the time. Today, we can revisit the themes and questions that those artists investigated, creating visuals and sound using the sophisticated digital audio workstations, 3D software and compositing tools that are now available to us. We can generate immersive viewing and listening experiences in VR Head Mounted Displays, in our own homes. We wanted to see what was possible, and discover what it means to be a Visual Music practitioner today. 

Along the way, we encountered many creative challenges that we hadn’t expected. In particular, it was difficult to develop the right balance of movement, and speed of movement, among the various animated elements in the film in a way that worked on a practical level for viewing in an HMD. If an element moved too fast or travelled too far around the 360 space, it created a disorienting and uncomfortable experience for the viewer.

There were also difficult decisions to make about which musical elements to represent in the visuals. At first we attempted to represent all of the different instruments in the musical composition as independent objects moving separately through the space. Because we were working with spatialized audio, the audio that each instrument generated needed to be attached to the independent object that represented it, and the result was too busy: with sound coming from so many different directions, it wasn’t clear where any of it was really coming from. The resulting viewing and listening experience was a confusion of sound and music with no apparent direction or meaning, and it was exhausting to watch. So instead of trying to represent individual instruments at all times, we tried to identify the major themes of the music, the flourishes and underlying harmonies, which we could represent more cohesively and with more visual and aural texture through masses of small objects that weaved and flowed around each other like the waves of a sea.

The process of actually matching the locations of objects in the 360 VR/video space to the separate audio elements they were meant to represent was another big challenge, especially given that we were using 3D software (Maya) to create the visuals rather than a game engine, which has the capacity to attach an audio element to a moving visual element. Our composer and spatial audio creator, Mark Douglas Williams, had experience with the Unity game engine for spatialised audio, but now he had to familiarise himself with other tools that provided some visual mapping of where the audio elements were located in the 360 space (the Facebook 360 Spatial Workstation, Reaper, and other tools).
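To illustrate the kind of mapping involved, here is a minimal, hand-rolled sketch (not from the project’s actual pipeline) of converting a Maya-style Cartesian object position into the azimuth/elevation angles a spatial audio panner typically expects. The axis convention assumed here (Y up, −Z straight ahead, azimuth measured clockwise from ahead) is an assumption; a real workflow would need to match the conventions of the specific 3D package and audio workstation.

```python
import math

def position_to_azimuth_elevation(x, y, z):
    """Convert a Cartesian position (assumed Y-up, -Z forward) into
    (azimuth_deg, elevation_deg) for a hypothetical spatial audio panner.

    Azimuth: 0 = straight ahead, 90 = directly right (clockwise).
    Elevation: 0 = ear height, 90 = directly above.
    """
    azimuth = math.degrees(math.atan2(x, -z))
    horizontal_distance = math.hypot(x, z)
    elevation = math.degrees(math.atan2(y, horizontal_distance))
    return azimuth, elevation

# An object directly to the viewer's right, at ear height:
print(position_to_azimuth_elevation(1.0, 0.0, 0.0))  # → (90.0, 0.0)
```

In a game engine this conversion happens implicitly when an audio source is parented to a moving object; without one, keyframed positions have to be exported and re-entered as panner automation by hand or by script.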

Another issue (for me, as the animator) was how to minimise the sterile ‘CG-ness’ that naturally results from using 3D software and infuse a more organic feel into the visuals. The decision to use 3D software rather than a game engine was my own, because I was familiar with Maya and had used it with the Arnold renderer to create other VR projects. This time, I also decided to create a monoscopic rather than stereoscopic project because, in an effort to introduce a more organic feel, I wanted to composite Red Giant’s Trapcode particle effects into the scene, and it is not currently possible to utilise this tool fully or effectively in a stereoscopic workspace.

I will post further info on the creative process in later posts.

Louise Harvey, April 2021

Animation – Louise Harvey | Audio spatialization – Mark Douglas Williams | Producer – Peter Moyes
