Innovation Week: Face rigging on the iPad
When character TD Iker J de los Mozos saw some videos showing touch-based iPad interfaces for mixing music, he quickly saw the potential for a similar setup to help in his own work.
The demo system he’s created relies on 3ds Max’s ability to accept controller signals via the MIDI interface, widely used by musicians to connect synthesisers to PCs.
De los Mozos used hexler.net’s TouchOSC as the controller on his iPad, using its customisable interface to lay out a face rigging system. He told us more about the process when we interviewed him.
3D World: What’s your job role and company at the moment?
Iker J de los Mozos: At the moment I’m working as a freelancer back in Spain, after having worked in the UK for a year and a half in commercials (Nexus Productions, The Mill) as a character TD. Most of my time is focused on teaching rigging at Animum3D through their LiveOnLine courses, which allow me to follow my students’ work remotely. I’m planning to return to animated features soon, and I have my eye on the United States, so I hope to be able to cross the ocean in a few months to work there.
3D World: Can you briefly describe how this rigging system works? I know TouchOSC is a MIDI communication app: what does the iPad connect to? Is there a MIDI interface on your PC, and how does the information get through to 3ds Max?
IJDLM: The system itself is pretty basic. I came up with it after stumbling on some videos showing these kinds of interfaces driving music-mixing software and synthesisers. To set the whole thing up on a PC I had to use a couple of additional tools to get the MIDI signal from the iPad.
MIDI signals are sent from the iPad by TouchOSC, the app running the interface. TouchOSC Editor was used to design and build both the interface and the behaviour of each control (MIDI channel, signals, etc.). A small program called rtpMIDI, written by Tobias Erichsen, was the key to getting those MIDI signals onto my computer over my local wireless network. I also needed loopMIDI, also written by Tobias, in order for 3ds Max to read the MIDI data. However, after trying again recently I realised that this program was only needed because I had a MIDI piano attached to my computer at the time, so it is not strictly necessary. My knowledge of how MIDI works is very rough, but knowing what I wanted to achieve led me to try every combination I could find until, luckily, everything seemed to work.
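To make the signal path above concrete: each TouchOSC slider ends up as a standard three-byte MIDI Control Change message travelling over the wireless network. The following is my own illustrative sketch of how such a message is decoded (the function and names are hypothetical, not part of the project):

```python
def decode_cc(message: bytes):
    """Decode a 3-byte MIDI Control Change message into (channel, cc, value).

    A CC message is: status byte (0xBn, where n is the channel),
    controller number (0-127), and value (0-127).
    """
    status, cc_number, value = message
    if status & 0xF0 != 0xB0:  # high nibble 0xB marks a Control Change
        raise ValueError("not a Control Change message")
    channel = status & 0x0F  # low nibble: MIDI channel 0-15
    return channel, cc_number, value

# e.g. a slider pushed to full travel on channel 1, controller 20
channel, cc, value = decode_cc(bytes([0xB0, 20, 127]))
```

Tools such as rtpMIDI and loopMIDI simply forward these bytes between the network and virtual MIDI ports, so the host application never needs to know the data came from an iPad.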
3ds Max has a built-in animation controller that lets the user hook up MIDI devices, the keyboard or the mouse. So the final step is to tie the joystick and the sliders that drive the facial shapes to this controller. And then the magic happens!
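Under the hood, tying a slider to a facial shape amounts to remapping the slider’s 7-bit MIDI value (0-127) onto the shape’s weight range. A minimal sketch of that remapping, assuming a 0.0-1.0 morph-target weight (my own illustration, not the 3ds Max controller’s actual code):

```python
def cc_to_weight(value: int, lo: float = 0.0, hi: float = 1.0) -> float:
    """Remap a 7-bit MIDI controller value (0-127) onto a weight range."""
    if not 0 <= value <= 127:
        raise ValueError("MIDI CC values are 0-127")
    return lo + (value / 127.0) * (hi - lo)

# Slider at full travel -> morph target fully on
jaw_open = cc_to_weight(127)  # 1.0
```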
After I released the video demoing this system, I found that I could use OSC messages and Pure Data, a real-time graphical programming environment for audio, video and graphics processing, to get more advanced behaviours from the iPad sliders: for example, a button that resets the value of a single slider or the whole rack, which is essential for an animator, or one that loads predefined facial expressions for a character. I also started trying to send MIDI data to Maya and Softimage, but that needed a bit more time since I had to write some additional Python code, so I decided to put the project on hold for a while.
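The behaviours described above — resetting one slider or the whole rack, and loading predefined expressions — can be modelled as simple state operations on a named set of sliders. A toy sketch under those assumptions (all names here are hypothetical, not from the project):

```python
class SliderRack:
    """Toy model of the iPad slider rack: named sliders with reset and presets."""

    def __init__(self, names):
        self.values = {name: 0.0 for name in names}

    def set(self, name, value):
        # Clamp to the 0.0-1.0 range a morph-target weight typically expects
        self.values[name] = max(0.0, min(1.0, value))

    def reset(self, name=None):
        """Reset one slider, or the whole rack when no name is given."""
        targets = [name] if name else list(self.values)
        for n in targets:
            self.values[n] = 0.0

    def load_preset(self, preset):
        """Load a predefined facial expression (a dict of slider values)."""
        for n, v in preset.items():
            self.set(n, v)


rack = SliderRack(["jaw_open", "smile", "brow_up"])
rack.load_preset({"smile": 0.8, "brow_up": 0.3})  # e.g. a 'happy' pose
rack.reset("smile")  # reset a single slider
rack.reset()         # reset the whole rack
```

In the real setup this logic would live in a Pure Data patch sending OSC messages back to TouchOSC, but the state model is the same.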
3D World: Once you got it to work, how did you like using it? Did you find it easier to set subtle expressions?
IJDLM: I love puppetry. It’s not just a person bringing a character to life, but that human-object interaction. Take that concept, throw some technology at it, and you have the possibility of driving digital assets in real time with something you can touch with your fingers. When I built the interface I was thinking more about that kind of interaction than about a traditional way of animating a character. It could be an animatronic, but without having to deal with cables, servos and screws (which I’m sure is fun, but I have no skills in that area).
3D World: Has this been used in production, or does it remain a proof of concept? Are you looking to develop the idea so that it could be used in production?
IJDLM: I never had the opportunity to try this in production, mainly because it might need some development time and I’m not sure it would be that useful for an animator. I always thought of this as a toy. I like to take ideas from here and there (always giving credit to the original authors!) and try to combine them to make something new. Some of these ideas or tools have been used in production (fRigBuilder, MorphConnector, …), but most of them stay as small sketches that I use to keep learning.
You can find out more information by following Iker’s blog.
Posted on Thursday, July 19th, 2012 at 12:59 pm under Analysis, Features.