In the early 1990s, I developed the FaceTracker, a method and device for non-contact tracking of facial expressions in real time. This process revolutionized the lip-synchronous animation of characters in cinema.

At its core, it is a special head-mounted camera that stays aimed at the face even when the head moves. Connected real-time image processing detects changes in facial expression, converts them into data, and makes them available live, detached from the actual appearance of the face, purely as motion.
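The pipeline described above, camera frames in, appearance-free motion data out, can be sketched roughly as follows. Everything here is an illustrative assumption, not the original FaceTracker implementation: the marker names, the neutral calibration values, and the simple delta-based encoding are all hypothetical.

```python
from typing import Dict, Tuple

Point = Tuple[float, float]

# Hypothetical neutral (resting) positions of a few tracked facial markers,
# in normalized camera coordinates. A real system would calibrate these
# per performer at the start of a session.
NEUTRAL: Dict[str, Point] = {
    "brow_left": (0.35, 0.30),
    "brow_right": (0.65, 0.30),
    "mouth_top": (0.50, 0.70),
    "mouth_bottom": (0.50, 0.78),
}

def encode_expression(markers: Dict[str, Point]) -> Dict[str, float]:
    """Turn raw marker positions into appearance-free motion parameters.

    Only the displacement relative to the neutral pose is kept, so the
    output carries movement, not the identity of the face.
    """
    params: Dict[str, float] = {}

    # Mouth openness: vertical lip gap compared to the neutral gap.
    neutral_gap = NEUTRAL["mouth_bottom"][1] - NEUTRAL["mouth_top"][1]
    gap = markers["mouth_bottom"][1] - markers["mouth_top"][1]
    params["mouth_open"] = max(0.0, (gap - neutral_gap) / neutral_gap)

    # Brow raise: average upward shift of both brow markers
    # (image y grows downward, so "up" means a smaller y).
    lift = sum(NEUTRAL[k][1] - markers[k][1]
               for k in ("brow_left", "brow_right")) / 2
    params["brow_raise"] = max(0.0, lift / 0.05)  # 0.05 ≈ full raise (assumed)

    return params
```

In use, each camera frame's marker positions would be fed through `encode_expression` and the resulting parameters streamed to the animation rig, which maps them onto the character's face.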

This makes it possible to drive the facial expressions of animated characters live and remotely, which saves effort and cost while being more accurate.

In its October 1995 issue, the American magazine WIRED reported that I had just set up a small studio in Los Angeles with a partner, including a shop (!) for FaceTracker service and sales. The WIRED report is astonishing because WIRED discovered me at a very early stage, when hardly anyone else understood what I was doing.

The FaceTracker was eventually adopted by almost all major film studios and deployed internationally by many film production companies, special effects houses, research institutes and universities, including applications in medicine and psychology.

In any case, the several hundred units sold helped revive computer animation in feature films and made animated characters come alive more convincingly at lower cost.