Monachopsis - Short Film
An experimental short film narrating a monotonous life out of place, through mood and space, where every day feels like yesterday, again and again. A storytelling experiment trying to tell a relatable character-driven story... without any characters.
A sound- and mood-driven story - headphones recommended for the highest chance of enjoyment.
About the Project
In my final year at The Animation Workshop, I got the rare opportunity to direct and produce my own short film over a semester. To learn more about the whole process of making a film, and to go outside my comfort zone, I wanted to explore how my usual role as a technically minded CG Generalist would influence my approach to learning directing and storytelling.
I was motivated by the idea of trying to tell a character-driven story, but without any characters.
An experiment in conveying an implied story through mood, sound, editing and repetition - to see if it was possible to make someone relate to the mood of a character they could neither see nor know anything about.
It was also an exciting challenge to find a narrative and visual structure I would be capable of making by myself, with the time and resources I had available. Finding a compromise between story and technical solutions led me to learn Python and build a fully custom, automated pipeline, allowing me to produce shots efficiently without tedious bottlenecks.
Hope you enjoy and can connect with the film!
Kim Strandli - Director, responsible for all aspects.
Rasmus Meyer - Music & Sound Design
Freja Printz Ringbæk - Sound Design
The film was made over the course of 28-ish weeks.
- 3 weeks of story exploration and development, starting with a scratch sound edit and placeholder stock photos.
- 13 weeks of pre-visualisation, research and development, and early asset builds. Learning Python, developing the automated shot pipeline and asset-publishing tools.
- 12 weeks of shot and asset production: cinematography, editing, sound design, look development, rendering and compositing.
Part of my goal in writing my own pipeline from the ground up was to eliminate tedious bottlenecks such as shot setup, directory traversal, render submission and output processing.
I integrated my shot manager directly into the school’s render-farm powered by AWS Deadline, so that I could submit multiple shots to the farm with a single click whenever I updated sets and props.
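As a rough sketch of how such a one-click multi-shot submission can work with Deadline's `deadlinecommand` CLI: each submission writes the two Key=Value info files Deadline expects and hands them to the command. The plugin choice, shot names, paths and helper functions here are hypothetical illustrations, not the film's actual pipeline code.

```python
import subprocess
import tempfile
from pathlib import Path

DEADLINE_CMD = "deadlinecommand"  # assumed to be on the machine's PATH

def build_job_info(shot: str, frames: str, output_dir: str) -> str:
    """Key=Value lines for Deadline's job info file (shot data is illustrative)."""
    return "\n".join([
        "Plugin=MayaBatch",               # assumed render plugin
        f"Name=monachopsis_{shot}",
        f"Frames={frames}",
        f"OutputDirectory0={output_dir}",
    ])

def submit_shot(shot: str, frames: str, scene: str, output_dir: str) -> None:
    """Write the job info and plugin info files, then call deadlinecommand."""
    tmp = Path(tempfile.mkdtemp())
    job_info = tmp / "job_info.job"
    plugin_info = tmp / "plugin_info.job"
    job_info.write_text(build_job_info(shot, frames, output_dir))
    plugin_info.write_text(f"SceneFile={scene}\n")
    subprocess.run([DEADLINE_CMD, str(job_info), str(plugin_info)], check=True)

# One click = loop over every shot that uses the updated set or prop, e.g.:
# for shot in ["sh010", "sh020"]:
#     submit_shot(shot, "1-96", f"/proj/scenes/{shot}.ma", f"/renders/{shot}")
```

The point of the pattern is that updating an asset only requires knowing which shots reference it; the per-shot job files are generated, never written by hand.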
To work more efficiently, I made a “submit testFrame to farm”-button, which would render a single production keyFrame into an output directory separate from the full production renders.
LightPass-AOVs were automatically generated at render-time on the farm, based on naming conventions.
RenderSamples could also be overridden based on inputs from the shotManager, allowing me to tweak the render quality of multiple shots quickly.
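Both ideas can be sketched in a few lines, assuming a hypothetical `_LGT` light-naming suffix and a simple per-shot override table from the shot manager (neither is the film's actual convention):

```python
def light_pass_aovs(light_names):
    """Derive one light-pass AOV per light whose name follows a
    '<name>_LGT' convention (the suffix is a made-up example)."""
    return [f"RGBA_{n[:-4]}" for n in light_names if n.endswith("_LGT")]

def resolve_samples(shot, default_samples, overrides):
    """Return the shot manager's per-shot sample override, else the default."""
    return overrides.get(shot, default_samples)

# light_pass_aovs(["key_LGT", "fill_LGT", "guide"])
# -> ["RGBA_key", "RGBA_fill"]; the non-conforming "guide" light gets no AOV.
```

Because the AOV list is derived at render-time rather than stored in the scene, adding or renaming a light never requires touching the shot files.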
My comp-template in Nuke toggled seamlessly and automatically between the two CG outputs, letting me comp the full shot with a single frame and re-render the comp once the full production render was ready and approved. The comp likewise wrote its output to two different directories depending on the render-job.
"5-frame full-quality"- and "lowQuality full-shot"-render submissions were also availble, for convenient render-quality control and diagnosing.
All renders were automatically processed by FFmpeg, compiling frame sequences into ProRes proxy-files for edit and review. Single-frame renders were formatted as PNG files with metadata and render-time burn-ins. These files were collected into a directory showing only the latest render of each shot, making it easy to track progress.
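The proxy step might look roughly like this, building one FFmpeg command per rendered sequence with the `prores_ks` encoder; the frame pattern, frame rate and profile choice are assumptions for illustration:

```python
def prores_proxy_cmd(seq_pattern: str, fps: int, out_mov: str) -> list[str]:
    """Build an ffmpeg command that compiles an image sequence into a
    ProRes proxy movie (prores_ks profile 0 = Proxy)."""
    return [
        "ffmpeg", "-y",
        "-framerate", str(fps),
        "-i", seq_pattern,         # e.g. "sh010.%04d.exr"
        "-c:v", "prores_ks",
        "-profile:v", "0",         # 0 selects the ProRes Proxy profile
        out_mov,
    ]

# subprocess.run(prores_proxy_cmd("sh010.%04d.exr", 24, "sh010_proxy.mov"))
```

Returning the argument list (rather than a shell string) keeps paths with spaces safe when the command is handed to `subprocess.run`.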