New CG spot for Smart Heimkehr
Check out the recent Smart car ad with ground-breaking postproduction and visual effects by nhb, and find out how the spot was created as VFX supervisor Nhat Quang Tran walks us through the production process
BBDO, Cobblestone and the German-based postproduction studio nhb worked on this new spot for Smart.
nhb worked on the whole postproduction of the film, including grading, editing, visual effects and sound design. The team had a core production time of four weeks. The prep time for tracking, character development, etc. took another four weeks before core production started.
Once the pipeline had been worked out, the shots were completed quite quickly: nhb could finish a shot in three days, on average.
“We could have progressed even faster, but given the relaxed timeframe, we used it to push the creative,” says Tran.
What’s the story?
A man and a woman are driving to the harbour. They’re alone – something feels mysterious. There’s something he has to tell her, very openly. Something’s not right – there is trouble in paradise. He tells her of his “mission” – to find intelligent life on Earth. Then he says that he has to go home, but in an awkward way that is completely ‘foreign’ – watch the film below to find out what happens next…
Watch the exclusive making-of video here
BBDO came up with the first treatment for Smart Heimkehr in September 2012, and Cobblestone joined as production company soon after.
“I already knew the creative director David Mously from a previous project,” says Tran. “I met the director Robert Nylund in person for the first time in October, at the PPM at BBDO Berlin. I felt all the creatives shared the same vision for the film and were heading in the same direction.”
The film was completed using a combination of Softimage, Maya, ZBrush, Nuke, V-Ray and Flame.
“We used Softimage as a main 3D environment for the character effects in Smart Heimkehr.”
“Mudbox was used for sculpting during production while ZBrush was used by our Art Director Carsten Kuhoff at the design stage of the project.”
“The car was done by our team in Düsseldorf using Maya. The rendering of the alien was achieved with Arnold. The CG car used V-Ray.”
The tracking was mainly done as a combination of NukeX and manual hand tracking by the company’s 3D team in Hamburg: “We simply used whatever our artists felt comfortable with and could make fast progress in. Final touches, conform and additional effects work were done in Flame.”
“I think the alien was the most interesting part of the production. It had the greatest degree of creative freedom, so we had to keep it more open production-wise. As it turned out, the production itself was very straightforward.”
“We did a dedicated design phase and locked down most of the creative. From there we worked in a straightforward way. We took the approved ZBrush design model and built a clean base mesh in Softimage, paying attention to a fluid edge flow so we could rig it properly.”
The alien head model and sketches of the initial alien design
“This model was refined in Mudbox to create displacement and texture maps. The decision for Mudbox was more a matter of habit than a technical one. The same result could have been achieved in ZBrush or any other sculpting tool. We were simply used to working in Mudbox and wanted to progress fast.”
“We used on-set HDR photography for the basic lighting and refined it by adding lights to the scenes. The rendering was done with Arnold.”
Creating believable alien skin
“I think one of the most interesting parts of the production was the skin look of the alien itself. We did a series of tests for it and communicated with the agency to steer it in the right direction.”
“We felt it should have a clearly visible subsurface translucency without looking wax-like. So we added underlying structure based on painted textures beneath the skin, such as veins and pigments. The first results turned out to look too procedural.
Alien skin shader tests: for the cinema, nhb had to take into consideration the overall look of the film’s colour correction. It had to look as if it were nighttime. Skin tones were naturally maintained, but the environment lends a dramatic mood to the film
“It took another day and a fresh pair of eyes from our art direction to balance the whole alien head correctly, so that it made sense.
“You can especially notice it if you look at the dotted pigments. We rendered out nine passes in total to get control over the look in NukeX – without taking it to extremes, so that comps and re-renders wouldn’t become too slow and eat up rendering time, allowing us to keep up a fast pace. In the end an average of six passes was needed.
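The idea of rendering separate passes and recombining them in the comp can be sketched in a few lines. This is a minimal illustration, not nhb's actual setup: the pass names and the tiny stand-in images are hypothetical, and a real pipeline would read the passes from multi-channel EXR files rather than generate them.

```python
import numpy as np

# Hypothetical AOV (render pass) names -- the article doesn't list
# nhb's actual passes, so these are illustrative stand-ins.
PASSES = ["diffuse", "specular", "sss", "emission"]

def rebuild_beauty(aovs, gains=None):
    """Additively recombine render passes into a beauty image,
    optionally scaling each pass -- the kind of per-pass control
    a compositor gets over the look in NukeX."""
    gains = gains or {}
    beauty = np.zeros_like(next(iter(aovs.values())))
    for name, img in aovs.items():
        beauty += img * gains.get(name, 1.0)
    return beauty

# Tiny 2x2 RGB float images standing in for full-HD EXR passes.
rng = np.random.default_rng(0)
aovs = {name: rng.random((2, 2, 3)).astype(np.float32) for name in PASSES}

plain = rebuild_beauty(aovs)
graded = rebuild_beauty(aovs, gains={"sss": 1.3})  # push the subsurface look
```

The benefit is exactly the one Tran describes: a look change like strengthening the subsurface pass is a cheap re-comp, not a re-render.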
“Interestingly, we also introduced a refinement session with the client, where we fine-tuned the look in our grading suite using a fully open NukeX composite. That was a real production booster. After that session, we all sat in the grading suite and said: ‘Yes, that’s it. That’s the alien we all wanted.’”
That NukeX comp was then used as the master comp, with only slight modifications to colour and rotos.
Arnold was an eye-opener
Except for Arnold as the renderer, nhb used tools that it was already familiar with.
The team is actually still figuring out which renderer to use as its main renderer in future: “We clearly saw the limits of Mental Ray today,” Tran elaborates. “Especially when it comes to displacement, Mental Ray rendering times become unforeseeable. The natural choice for the project would have been 3Delight, as a RenderMan-compliant renderer with fast displacement and native support for Softimage, but we had already bought some Arnold licenses and started playing around with them, simply because we had access to them.”
“The first tests were stunning, and render times remained quite controllable. I think we never went over 10 minutes per frame at full HD resolution with displacement and motion blur, even on the slower machines, which only had 4GB of RAM and Intel quad-core Xeons. Coming from Mental Ray, that’s really awesome.
“I wouldn’t say Arnold was vital, a ‘never-would-have-happened-without-it’ for the production, but it was my greatest learning curve and the most astonishing tool for me – especially when you consider that it isn’t even at 1.0 yet. The documentation is just a wiki and they don’t even have a website yet! It felt a bit like Shake by Nothing Real, before Apple bought it.
“Do you still remember [the line from] the Shake documentation: ‘If your supervisor thinks it’s a good idea to use the sky for keying, shoot him!’? It felt like that.”
Key pipeline moments
“I simply love the floating-point colourspace of NukeX,” says Tran. “Together with floating-point EXRs coming from 3D, you can achieve looks and adjustments very quickly and easily. Even though it’s not usually our style of working – we tend to deliver 3D elements lit and shaded as close to the final picture as possible – you can tweak a lot in compositing if needed.”
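A quick sketch shows why that floating-point headroom matters for the kind of tweaking Tran describes. The pixel values below are illustrative, not from the production: in linear float data, highlight values above 1.0 survive an exposure pull-down, whereas data clipped to a 0–1 range (as in an 8-bit pipeline) have already lost that detail.

```python
import numpy as np

# A hypothetical hot highlight in linear-light RGB, as it might arrive
# from a 3D render in a floating-point EXR. Values above 1.0 are legal.
hot_highlight = np.array([4.0, 1.2, 0.8], dtype=np.float32)

clipped = np.clip(hot_highlight, 0.0, 1.0)  # what a clipped 0-1 pipeline keeps
stop_down = 0.25                             # pull exposure down two stops

from_float = hot_highlight * stop_down    # highlight colour/detail preserved
from_clipped = clipped * stop_down        # highlight flattened toward grey
```

After the pull-down, the float version still shows the highlight's colour ratios, while the clipped version has collapsed the two brightest channels to the same value.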
nhb also managed to create an EXR-based workflow to get fast exchange to their Flame-based online systems – this was vital to get the work back and forth to client review sessions and change requests.
The project wasn’t without its challenges though, and Tran explains that these were more on the creative than the technical side.
The questions they asked were: how does the alien look? How should it move? How can the tentacles support the emotional expression of the character?
“We invested far more time in developing these aspects of the production than in the technical side of it,” comments Tran.
Of course, the alien presented some technical challenges. According to Tran, the most difficult sequence was the one showing the slime at the back of the alien’s head, which appears in the 66-second version of the film.
“He opens up the back of his head and we see a close-up shot of it. We wanted the moment to be a bit disgusting, so we wanted to add some strands of slime.
“We didn’t want to spend a lot of time on Softimage ICE simulations, so we took a Canon 5D, shot practical elements in our kitchen and comped them in Flame afterwards.
“We tried different materials to get the right look. Some egg-white elements made it into the shot where the alien shakes its head – they fly away from it as slimy strands. Seeing us in the kitchen with egg white, wrapped in black stage molton, was hilarious.”
A classic sci-fi ending?
nhb was also responsible for the UFO sequence. They didn’t want it to be an overly fancy sci-fi shot; instead, they wanted it to feel more like Close Encounters of the Third Kind (1977).
“We took a lot of references from it, and from Super 8, its modern homage to the movies of that era. The UFO should mainly consist of light, so we built a kind of UFO mesh kit in Softimage.
“It contained several more or less ‘classical’ UFO elements, which we then took into NukeX, where we stuck the different parts together and played around with different textures and glow effects.
“We thought it would be vital to see the 3D together with the 2D effects we would add to it to achieve the final look, so we kept everything except the modelling in NukeX, including lighting and rendering.”
“Later on, we added lens flares and atmospheric effects in Flame to match the mood of the film.”
A detailed look
Tran takes us through a few key moments from Smart Heimkehr:
“Let’s focus on the moment after the alien has removed his human face and is explaining his real story. We took this as a master shot for the production, to get the assets ready and the client happy with the overall design and animation.
“In the side-by-side video you can pretty much see how we adopted the facial expressions of our actor and projected them onto our 3D alien face.
“The facial animation system for the alien is based on the FACS system by Paul Ekman, who used it for psychological analysis of human facial expressions, splitting the human face into partial movements. We used it the other way around, adapting those movements to create the shapes for the alien head, as far as possible.
“The final design of the alien was led by our AD Carsten Kuhoff, who shaped all the details of the alien head so that it made sense from an anatomical and design perspective.
“The tracking-marker setup on the actor’s face is a result of my thesis, written back in 2003. I analysed the FACS system and worked out the stress points of skin movement in the face, which naturally make the best spots for tracking points in a FACS-based setup – as long as you can’t look under human skin with normal cameras.
“So, having them placed correctly makes it very easy to transfer the actor’s facial expressions onto the 3D model, because you always have a well-matching 3D equivalent to animate. And because the facial expression layout was already based on the results of muscle movements, there was no need for muscle simulations – the rig ended up as a standard rig using joints and FACS-based shape deformations.”
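The FACS-based shape deformation Tran describes boils down to blending per-Action-Unit mesh deltas with tracked weights. The sketch below illustrates that idea only; the AU names, the three-vertex "mesh" and the delta values are invented for the example and have nothing to do with nhb's actual rig data.

```python
import numpy as np

# Neutral pose of a tiny stand-in mesh: three vertices in 3D.
neutral = np.array([[0.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])

# Each FACS Action Unit stores a delta from the neutral mesh
# (hypothetical values for illustration).
au_deltas = {
    "AU12_lip_corner_puller": np.array([[0.1, 0.0, 0.0],
                                        [0.0, 0.0, 0.0],
                                        [0.0, 0.0, 0.0]]),
    "AU26_jaw_drop":          np.array([[0.0, -0.2, 0.0],
                                        [0.0, -0.2, 0.0],
                                        [0.0,  0.0, 0.0]]),
}

def pose(weights):
    """Blend the neutral mesh with weighted AU deltas (0 = off, 1 = full).
    The weights would come from the facial tracking markers."""
    out = neutral.copy()
    for au, w in weights.items():
        out += w * au_deltas[au]
    return out

# A half-open-jawed smile: full AU12, half AU26.
smile = pose({"AU12_lip_corner_puller": 1.0, "AU26_jaw_drop": 0.5})
```

Because each tracked marker sits on a FACS stress point, solving for these per-AU weights from marker displacements is a well-posed fit – which is why no muscle simulation was needed.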
If you’d like to know more and you’re heading to FMX this year, then check out Nhat Quang Tran’s presentation about the film.
on Friday, May 4th, 2012 at 2:52 pm under Commercial, Showcase.
Tags: CG spot, nhb, postproduction, Smart Heimkehr, VFX