MPC recreates the look and the performance Arnold Schwarzenegger gave in the 1984 original for Terminator Genisys.
MPC of Montreal was tasked with the biggest VFX challenge in Terminator Genisys: recreating Arnold Schwarzenegger’s memorable T-800 cyborg from the original 1984 Terminator feature for a fight with the older version of himself, known as the Guardian, in an alternate timeline. But, as we know, making a CG version of a younger actor can take you into the Uncanny Valley (as it did with Jeff Bridges in Tron: Legacy).
“When I first heard about the project a year and a half ago, quite frankly, I thought it was madness to take this on,” says Sheldon Stopsack, MPC’s visual-effects supervisor. “And the more iconic a character is, the more difficult the challenge becomes. Nevertheless, when you think something is crazy, there is this fascination as well. We knew we had to step up the game in the same classical fashion as any other digital character that we’ve done in the past. We also had to reconfigure the whole process in our pipeline.”
The challenges didn’t stop there. “From a build perspective, how do we model it? Normally, we’d set up a texture and photography shoot as a good starting point,” says Stopsack. “But the Arnold Schwarzenegger of 1984 doesn’t exist anymore, so we didn’t have that luxury.”
So the first thing the team did was collect vintage images and footage, including Pumping Iron, which gave them a reference point for Schwarzenegger’s look back then. Interestingly enough, this became the theme throughout pretty much the entire movie: constant evaluation against the library of footage they had assembled. “It was something we utilized from the very beginning in setting up the model, and then always within the context of the shot,” says Stopsack.
MPC also checked out the work of other visual-effects facilities and research groups, including Paul Debevec’s Light Stage work at the USC Institute for Creative Technologies, and conducted its own research to push the approach to another level.
“An early decision that we made during the build-up process was evaluating the implementation of the full path-tracing methodology that came with RIS rendering in RenderMan 19,” Stopsack says. “This was interesting because it leaves you with a more physically plausible, fully ray-traced approach, which is always desirable if the goal is to do something physically accurate. On the back of that, Terminator was one of the first shows involved in the implementation and deployment of that technology, which meant we had input into what the requirements were in terms of shading and lighting. With RenderMan 19 and RIS, they introduced a new skin-shading model, which went hand in hand with the texturing work we had to do, which was oriented more toward a multilayered scattering approach. The early tests of the skin model we did shifted with the coming of RenderMan 19, so the subsurface scattering was a lot more naturalistic and true to a human being. This approach also provided much richer features and finer micro-level detail in the skin pores.”
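MPC’s production shader is proprietary, but the multilayered scattering idea Stopsack describes can be illustrated with a small sketch. The sum-of-Gaussians diffusion profile below is a common way to approximate how red light travels farther through skin than blue before re-emerging; the layer radii and weights are placeholder values for illustration, not MPC’s or Pixar’s.

```python
import math

# Minimal sketch of a multilayered skin-scattering profile, in the spirit of
# sum-of-Gaussians diffusion models. Layer radii (mm) and RGB weights are
# illustrative placeholders only.
SKIN_LAYERS = [
    (0.5, (0.23, 0.30, 0.40)),   # shallow scattering, slightly cool/neutral
    (1.5, (0.40, 0.35, 0.20)),   # mid-level epidermal scattering
    (4.0, (0.37, 0.15, 0.05)),   # deep dermal scattering, red-dominant
]

def gaussian(radius_mm: float, distance_mm: float) -> float:
    """Normalized 2D Gaussian falloff evaluated at a surface distance."""
    v = radius_mm * radius_mm
    return math.exp(-(distance_mm * distance_mm) / (2.0 * v)) / (2.0 * math.pi * v)

def diffusion_profile(distance_mm: float) -> tuple[float, float, float]:
    """How much red, green and blue light re-emerges at a given distance
    from the point where it entered the skin."""
    r = g = b = 0.0
    for radius, (wr, wg, wb) in SKIN_LAYERS:
        falloff = gaussian(radius, distance_mm)
        r += wr * falloff
        g += wg * falloff
        b += wb * falloff
    return (r, g, b)

if __name__ == "__main__":
    # Red scatters farther than blue, which gives skin its soft, warm falloff
    # around shadow edges and thin regions such as ears.
    for d in (0.0, 0.5, 1.0, 2.0, 4.0):
        print(f"{d:4.1f} mm -> {diffusion_profile(d)}")
```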
Interconnected Techniques
Shading, texturing and rigging were interconnected. A lot of the work that happened on the back of the rig development fed into the skin shading and dynamic muscular displacement, so data from the rig drove the shader. For instance, blood flow had an impact on the subsurface appearance of the skin.
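The article doesn’t detail how the rig data reached the shader, so the sketch below is only a hypothetical arrangement: per-vertex muscle-strain values exported from the rig are remapped into a blood-flow weight that tints the subsurface color toward a flushed tone. The names, ranges and colors are illustrative assumptions, not MPC’s pipeline.

```python
# Hypothetical illustration of rig data driving a shader input: per-vertex
# muscle strain (from the rig) is remapped into a "blood flow" weight that a
# skin shader could use to redden the subsurface color.

def strain_to_blood_flow(strain: float,
                         rest_strain: float = 0.0,
                         max_strain: float = 0.6) -> float:
    """Map a muscle-strain value to a 0..1 blood-flow weight."""
    t = (strain - rest_strain) / (max_strain - rest_strain)
    return min(max(t, 0.0), 1.0)

def tint_subsurface(base_color: tuple[float, float, float],
                    flush_color: tuple[float, float, float],
                    blood_flow: float) -> tuple[float, float, float]:
    """Blend the resting subsurface color toward a flushed (redder) color."""
    return tuple(b + (f - b) * blood_flow
                 for b, f in zip(base_color, flush_color))

# Example: one frame of per-vertex strain values exported from the rig.
per_vertex_strain = [0.05, 0.20, 0.45, 0.62]
resting = (0.85, 0.57, 0.45)   # placeholder resting subsurface albedo
flushed = (0.90, 0.42, 0.35)   # placeholder flushed albedo

per_vertex_sss = [tint_subsurface(resting, flushed, strain_to_blood_flow(s))
                  for s in per_vertex_strain]
print(per_vertex_sss)
```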
MPC took an inside-out approach to rigging, one that is more common in the industry today. “We needed to create the underlying skeletal structure, making sure the body proportions are anatomically accurate to what Arnold Schwarzenegger was and still is to some extent, and have the muscular system built out of what Arnold looked like in ’84,” says Stopsack. “But your digital model still ends up being an abstraction. So to ensure a 1:1 accuracy where we recreated shots from the original movie, there was also a level of correction that had to go on top of every single shot.”
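The per-shot correction Stopsack mentions amounts to a corrective layer applied on top of the anatomical rig’s output. Here is a minimal sketch, assuming the corrections are stored as sculpted per-vertex deltas with an animatable weight; the data layout is an assumption for illustration, not MPC’s actual setup.

```python
# Minimal sketch of a per-shot corrective layer: sculpted per-vertex deltas
# are added on top of whatever the anatomical rig produces, scaled by an
# animatable weight so the correction can be dialed in per shot.

Vec3 = tuple[float, float, float]

def apply_correctives(rig_positions: list[Vec3],
                      corrective_deltas: list[Vec3],
                      weight: float) -> list[Vec3]:
    """Add weighted sculpted deltas to the rig's deformed positions."""
    return [(px + dx * weight, py + dy * weight, pz + dz * weight)
            for (px, py, pz), (dx, dy, dz) in zip(rig_positions, corrective_deltas)]

# Example: the rig's output for three vertices, plus a sculpted fix that
# nudges the silhouette to better match a plate from the original film.
rig_out = [(0.0, 0.0, 0.0), (1.0, 0.2, 0.0), (1.5, 0.4, 0.1)]
fix     = [(0.0, 0.0, 0.0), (0.03, -0.01, 0.0), (0.05, 0.0, 0.02)]

print(apply_correctives(rig_out, fix, weight=0.75))
```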
The recreation of an iconic figure also requires a convincing performance, and it’s not as simple as animating a robot. The team had to analyze Schwarzenegger’s distinctive performance, making a shot list of facial expressions and body movements from the first movie that gave them a target for what the digital character should look like. Of course, creative license was needed in terms of how stern or angry he should be at any given moment.
Overcoming the Uncanny Valley was a further analytic exercise. The eyes were important, but so were the lips, the nostrils and the tip of the nose, Stopsack says. So it was a matter of constant shot evaluation and fine-tuning until it looked believable. “In the end, you deviate from the brute-force analytic approach,” Stopsack says. “We put a shot together and assessed what the weak points were, and then we tried to improve on that. And it gets back to the collaborative iteration that goes on between the animator and the rigger. It comes down to the talent seeing the same thing, where it looks like Arnold and it looks like the original.”