London-based designer/AD Dan Hoopert just released the second short film in his Audio Synthesis series exploring the relationship between motion and sound. Although technically driven, both films make for delightful viewing.
Dan Hoopert: “In Audio Synthesis: Bird Songs [above], motion data is collected from the source video and used to generate note values and MIDI information that is sent to various synthesizers and samplers, creating a generative soundtrack unique to each visual input.
“Graphic shapes are emitted from various data points throughout, resulting in an abstract visual language for the sounds that are generated. Created using Houdini and Ableton Live.
“The aim of this project was to see what is possible when thinking about designing sound using the data from the visual content itself.”
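The film's actual Houdini-to-Ableton pipeline isn't published, but the core idea Hoopert describes, turning motion data into MIDI note values, can be sketched in a few lines. Everything below is an illustrative assumption: the scale, value ranges, and function names are invented for the example, not taken from the project.

```python
# Hypothetical sketch: map per-frame motion magnitudes to MIDI note numbers.
# The pentatonic scale and normalization range are illustrative choices,
# not Hoopert's actual mapping.

C_MINOR_PENTATONIC = [0, 3, 5, 7, 10]  # semitone offsets within one octave

def motion_to_midi_note(magnitude: float, lo: float = 0.0, hi: float = 1.0,
                        base_note: int = 48, octaves: int = 3) -> int:
    """Scale a motion magnitude into a note, quantized to a pentatonic scale."""
    t = max(0.0, min(1.0, (magnitude - lo) / (hi - lo)))  # normalize to [0, 1]
    steps = octaves * len(C_MINOR_PENTATONIC)
    idx = min(int(t * steps), steps - 1)
    octave, degree = divmod(idx, len(C_MINOR_PENTATONIC))
    return base_note + 12 * octave + C_MINOR_PENTATONIC[degree]

# One note per frame of motion data; a real setup would send these on as
# MIDI note-on messages to a synth or sampler (e.g. via a library like mido).
frames = [0.05, 0.4, 0.82, 0.99]
notes = [motion_to_midi_note(m) for m in frames]
```

Quantizing to a scale is what keeps a generative soundtrack like this musical rather than random: any input value lands on a "safe" pitch, so each visual produces a different but always coherent melody.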
“For Audio Synthesis: What Does a Tree Sound Like? [below], a single beam of light scans from left to right across a tree, gathering data from the cross section of each branch. Using the area of this cross section, MIDI data is generated and fed into a sampler, triggering a mixture of synthesized sound and field recordings from the forest.
“Data from the cross section is also made visible by abstract shapes that follow the beam of light along the tree. The result is a rich audiovisual experience completely driven by the tree itself. Created in Houdini and rendered using Redshift.”
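The tree film uses a different input, cross-section area rather than motion, which could plausibly drive both pitch and loudness. Again, this is a hedged sketch under invented assumptions (the mapping directions, ranges, and names are mine, not the film's):

```python
# Hypothetical sketch: as a scan line sweeps across a tree, turn the branch
# cross-section area at each position into a MIDI (note, velocity) pair.
# Mapping ranges and directions are illustrative assumptions.

def area_to_event(area: float, max_area: float = 100.0) -> tuple[int, int]:
    """Larger cross sections -> lower notes and higher velocities."""
    t = max(0.0, min(1.0, area / max_area))  # normalize area to [0, 1]
    note = 84 - int(t * 48)                  # thick trunk -> low pitch
    velocity = 20 + int(t * 107)             # thick trunk -> louder hit
    return note, velocity

# Simulated sweep: cross-section areas sampled left to right across the tree.
sweep = [2.0, 15.0, 60.0, 95.0]
events = [area_to_event(a) for a in sweep]
```

Mapping thickness inversely to pitch mirrors a physical intuition (bigger resonating bodies sound lower), which is one reason a data-driven soundtrack like this can feel "right" even though the tree never made a sound.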
Director/animator: Dan Hoopert
Audio: Dan Hoopert