***This article originally appeared in the August ’20 issue of Animation Magazine (No. 302)***
Although the promise and potential of VR/AR are far from fully realized in 2020, consumers were expected to purchase more than 22 million VR and AR headsets and glasses this year, according to a report from CCS Insight. Heavy investment from the likes of Samsung, Google, Facebook and Apple continues to bode well for this complex area of art and entertainment. Of course, artists and animators have continued to push the creative limits of the technology, and festivals such as Tribeca, Cannes, Annecy and SIGGRAPH play an important role in spotlighting innovative projects from animated VR creators around the world.
One of the most prolific pioneers in the field is Northern California’s six-time-Emmy-winning Baobab Studios, which has continued to deliver innovative and visually arresting animated VR projects such as Invasion!, Asteroids!, Crow: The Legend, Bonfire and this year’s Baba Yaga. The latest offering is directed by Eric Darnell (Madagascar films, Crow: The Legend, Bonfire), co-directed by Mathias Chelebourg and inspired by the Eastern European legend. Featuring the voice of Daisy Ridley, the experience invites audiences to follow their 10-year-old sister, Magda, as they search a magical forest for a plant that can cure their mother’s illness. But the user and Magda must also watch out for the evil witch who lives nearby.
“Baba Yaga has been a project that we have been wanting to do for a long time, but we knew it was too ambitious when we first started Baobab Studios five years ago,” Darnell tells Animation Magazine. “I’ve always been interested in the story of Baba Yaga: a fairytale that has not gained the same level of popularity in America as other classic stories and is long overdue to be told in a new way. The witch named Baba Yaga appears with a variety of characterizations in the various fairy tales that include her. Sometimes she is a nasty witch that is a danger to children. Sometimes she is neither good nor bad. And sometimes she is a particularly good witch. We wanted to create a contemporary retelling of the story that leverages off of this history.”
Adventures in 2.5D
Baba Yaga is Baobab’s first project with human characters and the studio’s most stylized to date. “To get the theatrical two-and-a-half dimension look we wanted, we had to push the game engine to do things it doesn’t normally do,” explains Darnell. “We adapted many of the storytelling techniques we learned from Invasion! and Bonfire to Baba Yaga, such as how to approach interactivity and how to use motion, character actions, sound and lighting to direct the viewer’s attention.”
To produce the animation, the Baobab team uses a mixture of traditional animation tools along with proprietary technology and pipelines. “Like our past projects, we develop everything to run in a game engine so we can take advantage of a real-time toolset for creating interactive experiences,” says Darnell. “With our studio operating remotely during the pandemic, we also built out a toolset that allows us to quickly review and collaborate together remotely in VR.”
According to the veteran director, every project in the studio’s brief history has helped them learn more and experiment with what is possible in VR. “With Invasion! we learned about the power of immersive characters, but interactivity was limited at the time,” he explains.
“Asteroids! and Crow: The Legend confirmed for us the power of making the viewer an interactive character influencing the direction of the story. Crow: The Legend also helped us polish the use of theater and performance techniques to direct the user’s attention. We improved our capabilities of using theatrical lighting to design in our virtual environments. With Bonfire, we took all our past VR insights and added more complex AI to the characters to enhance the interaction and, for the first time, made the viewer the main character of the story.”
Adventures in Claymation
The National Film Board of Canada is another active player in the VR field, with titles such as Bear 71, Minotaur and Gymnasia receiving global attention in previous years. This year, the NFB presented director Frances Adair McKenzie’s five-minute project The Orchid and the Bee at Annecy. Produced by Jelena Popović, the short uses stop-motion animation with modelling clay puppets to offer a story that moves from a jellyfish changing its form to a bee that pollinates a flower.
Adair McKenzie began developing the concept for the short about four years ago. “The pre-production work of defining the narrative, technical development and designing the sets took about two years,” she tells us. “From day one of the animation process, to having all the elements spherically composited for VR in Nuke, took a little over a year. Getting a final sound design that we loved and the last color tweaks in online took us into 2020.”
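For readers wondering what “spherically composited for VR” means in practice: 360-degree frames are typically assembled in an equirectangular (lat-long) projection, where every viewing direction maps to a pixel by its longitude and latitude, and the headset re-wraps that flat frame around the viewer at playback. The short sketch below only illustrates that projection math; it is not taken from the NFB’s actual Nuke setup, and the frame size is an assumption.

```python
import math

def direction_to_latlong_pixel(x, y, z, width, height):
    """Map a 3D view direction to a pixel in an equirectangular (lat-long)
    frame, the projection commonly used for 360-degree VR compositing.
    Purely illustrative; not the production's actual compositing code."""
    # Longitude sweeps the full horizontal field of view (-pi..pi);
    # latitude runs from straight down to straight up (-pi/2..pi/2).
    lon = math.atan2(x, -z)
    lat = math.asin(y / math.sqrt(x * x + y * y + z * z))
    u = (lon / math.pi + 1.0) * 0.5   # 0..1 across the frame
    v = 0.5 - lat / math.pi           # 0..1 from top to bottom
    return u * (width - 1), v * (height - 1)

# Example: a direction slightly above the horizon, straight ahead,
# in a hypothetical 4096x2048 lat-long frame.
print(direction_to_latlong_pixel(0.0, 0.2, -1.0, 4096, 2048))
```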
The main inspiration for the project was natural science. “The piece started out on a very apocalyptic trajectory, and I eventually recognized that being in a dark room for years thinking about factory farming would be detrimental,” she says. “I began researching animal ethics, genetics and evolution. This led me to discover some very beautiful and complicated survival tactics present in the natural world. It was Donna Haraway, one of my favorite feminist theorists, who introduced me to the inter-species love affair of the orchid and the bee … I wanted to use this project as an opportunity to emphasize relationships, collaboration, disintegration and growth as a continual process in the natural world.”
The artist says she was excited about the possibility of working within VR to make a piece representing the beautiful and violent process of transformation to the viewer in a very intimate way. “I feel like stop-motion and clay animation is a form of metamorphosis,” she offers. “So, I wanted to pay homage to this and create a mutating narrative that could, through its materiality, echo the concepts of evolution and mutability that are at the core of its narratives.”
For Adair McKenzie and her team, the biggest challenge was the technical development and digital learning. “Almost everything in the project is real — the animation, the objects and the lighting effects — so it was an intimate and very experimental process,” she explains. “Each phase of the production had to be unravelled with the consideration that it would be integrated into a spherical environment. We built a motorized rotisserie set, which emulated the digital movement for a fixed camera but meant we were stop-motion animating over a rotating three-dimensional object. We adjusted the narrative arcs as we worked, because the sets were being discovered through the process.”
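The trick behind that rotisserie rig is a simple change of reference frame: turning the set by some angle in front of a locked-off camera reads on screen the same as orbiting the camera by the opposite angle around a static set. The sketch below works through that bookkeeping for a hypothetical move; the frame rate, shot length and orbit amount are assumptions, not details from the production.

```python
# A minimal sketch of the equivalence a motorized turntable exploits: turning
# the set by -theta in front of a fixed camera looks like orbiting the camera
# by +theta around a static set. All numbers below are assumed for illustration.

FPS = 12                  # exposures per second of screen time (assumed)
SHOT_SECONDS = 8          # length of the move (assumed)
CAMERA_ORBIT_DEG = 90.0   # virtual camera pan the viewer should perceive (assumed)

frames = FPS * SHOT_SECONDS
set_step_deg = -CAMERA_ORBIT_DEG / frames   # set turns the opposite way each frame

# Print the turntable position once per second of screen time; between steps,
# the animators would pose the puppets before the next exposure is taken.
for frame in range(0, frames + 1, FPS):
    print(f"frame {frame:3d}: turn set to {frame * set_step_deg:7.2f} deg")
```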
The director hopes her work will pique the audience’s curiosity. “I hope they experience it as a pure, magical sort of immersion,” she notes. “That they can then carry this strange mythical space around with them in the real world. I hope young adults come into contact with it and are totally weirded out and have to wonder at what it is and what it means. I think that encounters with strange works of art, especially at defining moments in our development, are very important, and that we less and less frequently experience media that is without an agenda.”
Post-Apocalyptic Vision
The work of sci-fi author Philip K. Dick (The Man in the High Castle, Do Androids Dream of Electric Sheep) was the inspiration for The Great C, a 37-minute work produced by Canada’s Secret Location in partnership with U.S.-based Electric Shepherd. Directed by Steve Miller and produced by Luke Van Osch, the striking piece was awarded the Cannes Film Festival’s Positron Visionary Award this year.
“Many of us at Secret Location are fans of the works of Philip K. Dick,” says Miller. “For a long time, we’d been looking for an opportunity to make a project based on his work. When The Great C became available, we took it. We didn’t know exactly what type of project we were going to make with it, we just knew the concept, setting and theme would work well in VR. We explored several different avenues. We made some prototypes that were more like a game, that had a lot more user-driven interactivity, but in the end we found that doing a cinematic-like experience in VR felt the most exciting and engaging.”
Miller and his team of about 20 worked intently on the project for eight months and launched it at the Venice Film Festival in late August 2018. The film was rendered in real time using Unreal Engine. “Game engines are a particularly good fit for volumetric VR movies, allowing viewers to immerse themselves in the story world while maintaining six-degrees-of-freedom movement,” says the helmer.
“We used a mix of 3D packages to create the art: Maya, Blender, C4D and Modo were used, as well as ZBrush and Substance Designer,” he adds. “As a relatively small, scrappy team, we each tended to work in the software where we were most comfortable, and used FBX and occasionally Alembic files to bring it all together in the engine. We also made use of the VR hardware itself, using Vive trackers to do quick and dirty mocap to work out most of the 3D blocking in the scene, which could then be passed off as reference to the animators. I am pleased to say the sultry walk of the story’s female antagonist, Grey, is mostly me!”
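The “quick and dirty mocap” workflow Miller describes boils down to sampling a tracker’s pose over time and handing the recording to animators as blocking reference. Here is a minimal sketch of that idea; read_tracker_pose() is a hypothetical stand-in for whatever runtime call (OpenVR, in the Vive’s case) actually supplies the pose, and the sample rate and capture length are assumptions rather than details of Secret Location’s pipeline.

```python
import csv
import time

def read_tracker_pose():
    # Hypothetical placeholder: a real implementation would query the VR
    # runtime here instead of returning a constant pose.
    return (0.0, 1.0, 0.0), (0.0, 0.0, 0.0)  # position (m), rotation (deg)

SAMPLE_RATE = 24       # samples per second, matching an assumed animation frame rate
DURATION_SECONDS = 5   # length of the capture (assumed)

# Record one pose per frame into a CSV that a DCC could import as reference.
with open("tracker_blocking.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frame", "tx", "ty", "tz", "rx", "ry", "rz"])
    for frame in range(SAMPLE_RATE * DURATION_SECONDS):
        (tx, ty, tz), (rx, ry, rz) = read_tracker_pose()
        writer.writerow([frame, tx, ty, tz, rx, ry, rz])
        time.sleep(1.0 / SAMPLE_RATE)
```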
Miller says he is pleased that he and his team have created something that blends the pacing and visual language of cinematic storytelling with the immersion of VR. “There were lots of unknowns for what an audience could tolerate in terms of duration, editing and camera movement,” he maintains. “While I think there’s plenty yet to learn and evolve when working in this new medium, I’m overall really happy that I can still sit down and go through it and lose myself in the story.”
Of course, nothing can be as gratifying as an audience that becomes completely absorbed in the experience. “When we first premiered in Venice, our screenings would run until 10 p.m., at which point a boat would ferry everyone off the island where the festival was held,” Miller remembers. “A gentleman had tried to sneak in a viewing at the end of the night, and begged them to hold the boats so he could see how the story ended! It’s truly humbling to get someone engaged with your story like that, and luckily we were able to get him in for a proper screening the following day!”
A Peek into the Future
The three directors we spoke with all seemed optimistic about the future of the technology and the art form. “VR is going to keep on growing and become more popular as headsets become easier to use and more affordable,” believes Darnell. “Unlike live-action entertainment, animation development works well even when working remotely, especially in our case because we already had a lot of experience working this way. I would expect more studios to embrace the use of animation for their upcoming projects. With powerful and affordable headsets like Oculus Quest ramping up production and with new functionality like hand-tracking rolling out, the level of immersion and interactivity will continue to increase in the months and years to come.”
“I’ve been particularly excited by projects like Wolves in the Walls and Vader Immortal, where the viewer gets to engage with the characters in a story,” says Miller. “Feeling like you are connecting with fictional characters is such a cool concept, and I’m excited to see those types of interactions get even richer and more involved as we go forward.”
Adair McKenzie says she rides her bike every day past an outdoor VR headset installation, and she can’t think of anything that feels more antiquated right now. “I think that VR is still in its youth,” she points out. “It needs time to be explored by artists and makers, as well as the tools, software and hardware that are accessible and facilitate creativity. I think this moment in time is a make-it-or-break-it sort of situation for VR, and I hope it survives, because it is an interesting tool for therapy and immersive experience, but I cannot predict the outcome!”