History of Computer Character Animation Motion Capture:
1980-1983: Simon Fraser University — Goniometers
Around the same time, biomechanics labs began to use computers to analyse human motion, and their techniques and apparatus started to influence the computer graphics community. Tom Calvert, a professor of kinesiology and computer science at Simon Fraser University, attached potentiometers to the body in the early 1980s and used the output to drive computer-animated figures for choreographic studies and for clinical assessment of movement abnormalities. To track knee flexion, for instance, they strapped a kind of exoskeleton to each leg and positioned a potentiometer next to each knee so that it would bend in tandem with the knee joint. The analogue output was then converted to digital form and fed to the computer animation system. Their animation system used the motion capture apparatus together with Labanotation and kinematic specifications to fully specify character motion.
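The idea behind such a goniometer rig can be sketched in a few lines: a potentiometer bending with the knee produces an analogue voltage, which is digitized and then mapped to a flexion angle. The function name, ADC resolution, and angular range below are hypothetical calibration values, not details from Calvert's system.

```python
def adc_to_joint_angle(adc_value, adc_max=1023, angle_range_deg=150.0):
    """Map a raw ADC reading from a joint-mounted potentiometer to a
    flexion angle in degrees (hypothetical linear calibration: the
    potentiometer's full electrical travel spans angle_range_deg)."""
    return (adc_value / adc_max) * angle_range_deg

# A mid-range reading maps to roughly half the angular range.
angle = adc_to_joint_angle(512)
```

A real system would add a per-joint calibration offset and filtering of the analogue signal, but the core mapping from digitized voltage to joint angle is this simple linear rescaling.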
1988: deGraf/Wahrman — Mike the Talking Head
In 1988, deGraf/Wahrman created “Mike the Talking Head” for Silicon Graphics to demonstrate the real-time capabilities of its 4D machines. Mike was driven by a specially built controller that allowed a single puppeteer to manipulate the lips, eyes, expression, and head position of the character’s face. The Silicon Graphics hardware provided real-time interpolation between the performer-controlled facial expressions and head geometry. Mike was performed live at that year’s SIGGRAPH film and video show, and the live performance demonstrated that the technology was ripe for use in production environments.
1989: Kleiser-Walczak — Dozo
In 1989, Kleiser-Walczak produced Dozo, a non-real-time computer animation of a woman singing and dancing in front of a microphone for a music video. To achieve realistic human movement, they decided to use motion capture techniques. On the basis of Kleiser’s motion capture experiments at Digital Productions and Omnibus (two now-defunct computer animation production companies), they chose an optically-based solution from Motion Analysis that used multiple cameras to triangulate the images of small pieces of reflective tape placed on the body. The output was the three-dimensional trajectory of each reflector in space. As discussed previously, one of the problems with this type of system is the difficulty of tracking markers that are occluded from the cameras; for Dozo, resolving these occlusions was a very time-consuming post-process. Fortunately, some modern systems now perform this in software, greatly accelerating motion capture.
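The triangulation step such optical systems perform can be illustrated with a minimal sketch: given two calibrated cameras that each see the same reflective marker, the marker's 3D position is recovered as the point closest to both viewing rays. This is a generic closest-point-of-two-rays formulation, not the specific algorithm used by Motion Analysis; the camera positions and marker location are made-up example values.

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Triangulate a marker seen by two cameras.
    o1, o2: camera centres; d1, d2: viewing rays toward the marker.
    Returns the midpoint of the shortest segment between the two rays,
    found by least-squares solution for the ray parameters t1, t2."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Minimize |(o1 + t1*d1) - (o2 + t2*d2)| over t1, t2.
    A = np.stack([d1, -d2], axis=1)               # 3x2 system matrix
    t1, t2 = np.linalg.lstsq(A, o2 - o1, rcond=None)[0]
    p1 = o1 + t1 * d1
    p2 = o2 + t2 * d2
    return 0.5 * (p1 + p2)

# Two cameras a little apart, both sighting a marker at (0, 0, 5).
marker = np.array([0.0, 0.0, 5.0])
o1 = np.array([-1.0, 0.0, 0.0])
o2 = np.array([1.0, 0.0, 0.0])
p = triangulate_midpoint(o1, marker - o1, o2, marker - o2)
```

Production systems use many cameras and solve an over-determined version of the same problem, which is what makes occlusion so costly: a marker hidden from too many cameras leaves nothing to intersect, and the gap must be filled in by hand or by software.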
1992: Brad deGraf — Alive!
After the success of deGraf/Wahrman’s Mike the Talking Head, Brad deGraf continued the work on his own, creating a real-time animation system known today as Alive! For one Alive! character, deGraf built a custom hand device with five plungers operated by the puppeteer’s fingers. The device was used to control the facial expressions of a friendly, computer-generated talking spaceship, which promoted its “parent” company at trade shows in a manner similar to Mario.
Today: Many players using commercial systems
In the past few years, Ascension, Polhemus, SuperFluo, and others have launched commercial motion tracking systems aimed at computer animation. In addition, animation software vendors such as SoftImage have incorporated these systems into their products, offering “off-the-shelf” performance animation systems. Although many problems in human motion capture remain to be solved, the technique is now firmly established as a viable option for computer animation production. As the technology develops, motion capture will undoubtedly become one of the animator’s basic tools.