Exhibit A
Using Skream’s electronic track Settled, Exhibit A tests how we perceive visuals in combination with clear leitmotifs. As the track progresses, the repeated chords resonate more strongly with each reiteration (much like the motifs within Jaws and Hereditary). I believe that combining them with timed visual cues boosts the stimulus by providing more information.
Visuals are also performed live. Much like the meticulous construction of a motion picture, live shows are choreographed and played in a specific order, which means the visuals can be tailored accordingly when, for example, a climactic moment arises within a performance. This takes many forms. The work of a/v practitioners helps to define the methods behind adding emphasis to music: whether or not an individual’s work takes influence from film, there is a correlation between the methods used by artists such as Weirdcore and filmic theory.
Exhibit A is designed around precisely timed visuals. Although the methods used to achieve this differ between my work and the visual work of artists like Bretschneider, the desired effect is the same and has links to cinematic theory: to assign geometric shapes to specific notes within the soundtrack while also creating a sense of anticipation within my hypothetical audience. The short clip contains the first of my projection-mapping experiments. To reiterate, I have used Skream’s Settled as my soundtrack because it represents a climactic build-up of leitmotifs. The viewer can see that each chord sounded coincides with a window on the model house lighting up.
Chion comments on the “ear’s temporal threshold”: the ear-brain system is only able to process auditory/visual information in sections. The information is a “gestalt”, meaning that although sound and image are essentially different entities, we as viewers perceive them together, which is how a practitioner gives these elements “added value” (the term Chion uses when one stimulus is added to another to boost cognitive engagement from an audience). Exhibit A represents this. The first leitmotif presents static sounds and images; as the scene evolves, the sound and image begin to pulse and change colour (the position of the lit windows moves up and down with the musical chords to help the audience connect the two). Each leitmotif essentially presents the audience with something new. The first iteration is processed as a section of a few seconds of sound and animation; the second changes slightly to give a sense of progression and anticipation. Therefore, this first test is a starting point for adding temporality to the music and demonstrates that “synchronous sound does impose a sense of succession”.
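To make the manual method behind Exhibit A concrete, here is a minimal sketch (in Python, with hypothetical timings and window names rather than my actual project file) of the kind of hand-built cue list that pairs chord timestamps with windows on the model house.

```python
# Illustrative sketch only: hand-timed cues pairing chord onsets (in seconds)
# with windows on the model house. Timings and window names are hypothetical.

CUES = [
    (0.0, "window_lower_left"),
    (1.8, "window_lower_right"),
    (3.6, "window_upper_left"),   # later chords move the lit window upward,
    (5.4, "window_upper_right"),  # mirroring the rising motif
]

def active_window(playback_time, fade=0.5):
    """Return the window that should be lit at a given playback time.

    Each cue stays lit for `fade` seconds after its chord sounds.
    """
    for start, window in reversed(CUES):
        if start <= playback_time < start + fade:
            return window
    return None

if __name__ == "__main__":
    for t in (0.2, 2.0, 4.0, 7.0):
        print(t, active_window(t))
```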
Despite my personal findings and first-hand realisation of this statement, audiovisual practitioners usually achieve such an effect through different means. Where I created these visuals by manually selecting where and when to incorporate them, there is programming that automates this according to the type of sound playing. The synthesiser-based musician Floating Points works with Hamill Industries to create audiovisual experiences. One of their latest pieces, Anasickmodular, represents the complex ways sound can be manipulated into image. In a commentary on this music video, Hamill Industries describe their methods: “The Floating Points show is based on the visualization of sound through the manipulation of analog technologies. It will combine light, cameras, and visuals activated by sound waves, which in turn are modulated, creating image and sound loops”. The comparison between Anasickmodular and Exhibit A lies essentially in what each is achieving (visuals synchronised with music). However, Hamill Industries’ approach devalues the manual route to this effect, because technology allows us to create it with the precision of a machine. Doing this by hand could therefore be seen as an obsolete practice, but it still shows how image temporalisation can be applied to film and musical performance.
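As a rough illustration of the automated route that practitioners like Hamill Industries gesture towards, the sketch below uses the librosa library to detect note onsets and treat them as visual triggers. This is an assumption-laden mock-up rather than their method: the audio file name and the trigger_visual() function are placeholders.

```python
# Sketch of an automated alternative to hand-timed cues: detect note onsets
# in the track and use them as visual triggers. Assumes librosa is installed;
# "settled.wav" and trigger_visual() are hypothetical placeholders.
import librosa

def build_cues_from_audio(path):
    """Return a list of onset times (in seconds) detected in the audio file."""
    y, sr = librosa.load(path)
    onset_times = librosa.onset.onset_detect(y=y, sr=sr, units="time")
    return list(onset_times)

def trigger_visual(onset_time, index):
    """Placeholder for whatever lights a window or fires a projection cue."""
    print(f"cue {index}: light window at {onset_time:.2f}s")

if __name__ == "__main__":
    for i, t in enumerate(build_cues_from_audio("settled.wav")):
        trigger_visual(t, i)
```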
Exhibit B
The following clips take image temporality into broader terms, to gauge whether filmic theory can be useful beyond what is discussed in Exhibit A. A key concept from Chion is that “sound vectorises or dramatises shots, orienting them toward a future, a goal, and creation of a feeling of imminence”, and that the shot is “going somewhere and it is oriented in time”. Skream’s song develops from the leitmotifs previously described into a crescendo, leading to the climax of the track. This section features less distinguishable notes and instead an ominous fading-together of electronic tones. The visuals in turn present a more abstract animation of the sound by also fading frames in and out slowly. Without the sound, the moving image would have little meaning, but when paired with the music the viewer can see how the image endows the sound with a further sense of anticipation. This represents an evolved idea of how Chion’s theories are relevant to musical practice. Although the visuals are not timed to specific points in the music, they do give the music this sense of “added value”, because the act of processing them simultaneously creates the desired feeling of the shot’s future through the crescendo of the music and the slow progression shown in Exhibit B. This relates to the aforementioned Square Cube exhibit at the Electronic exhibition: although I first observed the piece in conjunction with untimed, unrelated music, the sound still created a sense of progression, meaning that even visuals paired with separate auditory information can be united by the individual viewer acting as a processing centre.
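One hedged way to picture the slow fades in Exhibit B is as an opacity curve driven by the track’s loudness envelope, so the abstract frames brighten as the crescendo builds. The sketch below assumes librosa and numpy; the file name and the mapping of RMS energy to opacity are illustrative choices, not my actual workflow.

```python
# Sketch: derive a slow fade curve for abstract frames from the track's
# loudness (RMS) envelope, so opacity rises with the crescendo.
# The librosa/numpy calls are real; "settled.wav" and the mapping are illustrative.
import librosa
import numpy as np

def fade_curve(path, hop_length=512):
    """Return (times, opacities): normalised RMS energy per frame, in 0..1."""
    y, sr = librosa.load(path)
    rms = librosa.feature.rms(y=y, hop_length=hop_length)[0]
    times = librosa.frames_to_time(np.arange(len(rms)), sr=sr, hop_length=hop_length)
    opacities = (rms - rms.min()) / (rms.max() - rms.min() + 1e-9)
    return times, opacities

if __name__ == "__main__":
    times, opacities = fade_curve("settled.wav")
    # A renderer could sample this curve each frame to set the image's alpha.
    print(times[:5], opacities[:5])
```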
Exhibit C
The final test moves away from the ideals of a/v practice by using more organic forms to accompany the music rather than non-narrative visuals: “movement of characters or objects, movement of smoke or light”. Chion states that the use of such material gives the image itself the sense of succession that would usually require the assistance of music: “Here, sound’s temporality combines with the temporality already present in the image. The two may move in concert or slightly at odds with each other, in the same manner as two instruments playing simultaneously.” Exhibit C represents this, initially demonstrating the movement of ink through water, introduced at the climax of the tune. The simple act of the animation running on the same timeline as the music therefore creates temporality without the precise timing previously mentioned.
A practitioner similar to Hamill Industries is Weirdcore, who creates bespoke live sensory experiences for many performances. Perhaps his most prestigious collaboration is with Aphex Twin (another electronic artist whose work was exhibited at the Design Museum). One such collaboration is Collapse (2018), in which the audience is presented with an amalgamation of visual techniques, including a/v. In this instance, however, my focus is on the organic material used as the subject matter. We see a virtual camera zooming through an architectural environment based on pictures the artist took of Aphex Twin’s home town in Cornwall. Much like the ink movement in Exhibit C, the movement through space gives the scene forward momentum. Imagining the progression of the audiovisual gives a good impression of how Chion’s theory of temporalisation works. Similar to how the camera hurtles through space, showing different cobbled streets as the music plays, my demonstration shows the motion the ink takes through water as an accompaniment to the music. I applied other organic animation to demonstrate this further: a rotoscoped face with a moving mouth was overlaid on the ink animation with imprecise timing. Despite this, the mouth shapes move in time with the music, creating further links between the separate stimuli.
Aphex Twin, Crush