Hello, today I’m happy to take you inside the Cantoche R&D Department, headed by Laurent Ach, CTO. Laurent tells us more about a project called MyPresentingAvatar…
|Laurent Ach manages Cantoche’s technical team, which carries out research and development on the Living Actor™ software suite. After graduating from École Centrale de Lyon in 1990, he worked in the Simulation and Virtual Reality Group of Thales, then for Visiospace and Sagem Télécommunications. He has participated in several research projects and is the lead partner of the MyPresentingAvatar project.|
MyPresentingAvatar is a project whose goal is to automatically generate presentations delivered by an avatar, starting solely from a document. The aim is not to have the avatar simply read the written text aloud, but to gracefully create the form and content of the avatar’s intervention by selecting text and accompanying gestures that are appropriate for presenting the original document. The project is co-funded by the French Ministry of Finance through the “Web 2.0” call for projects. The partners include Lingway, Telecom ParisTech, CNRS (the French National Center for Scientific Research) and Cantoche.
Lingway’s semantic analysis of a text input source automatically produces structured data that make it possible to create the avatar’s speech, as well as indications about the avatar’s non-verbal behavior. The Greta software program, developed by Telecom ParisTech, is responsible for processing these indications and choosing avatar gestures and attitudes synchronized with the speech. Finally, Living Actor™ technology generates video sequences in which the avatar speaks with the specified behavior.
These three components (semantic analysis, behavior management and avatar animation) are integrated into an online application developed by Cantoche, and together form an automatic production line for avatar presentations.
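The three-stage production line described above can be sketched in a few lines of code. This is a minimal illustration only: the function names and the `Segment` structure are hypothetical stand-ins, not the actual interfaces of Lingway, Greta or Living Actor™.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Segment:
    """One unit of the presentation: a piece of speech plus a behavior cue."""
    text: str
    behavior: str = "neutral"

def semantic_analysis(document: str) -> List[Segment]:
    """Stand-in for the semantic-analysis stage: split the document into
    segments and attach a crude non-verbal behavior indication."""
    segments = []
    for sentence in document.split(". "):
        sentence = sentence.strip().rstrip(".")
        if not sentence:
            continue
        cue = "emphatic" if "!" in sentence else "neutral"
        segments.append(Segment(text=sentence, behavior=cue))
    return segments

def behavior_management(segments: List[Segment]) -> List[Segment]:
    """Stand-in for the behavior-management stage: map each behavior
    indication to a concrete gesture synchronized with its segment."""
    gesture_map = {"neutral": "idle_gesture", "emphatic": "raise_hand"}
    return [Segment(s.text, gesture_map[s.behavior]) for s in segments]

def render_video(segments: List[Segment]) -> List[str]:
    """Stand-in for the animation stage: produce one 'clip' per segment."""
    return [f"clip[{s.behavior}]: {s.text}" for s in segments]

def production_line(document: str) -> List[str]:
    """The full pipeline, chained end to end from document to video clips."""
    return render_video(behavior_management(semantic_analysis(document)))
```

The point of the sketch is the chaining itself: each stage consumes the previous stage’s structured output, which is what lets the final presentation remain editable at any step.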
Automation of the avatar’s speech and behavior must not prevent the user from fully controlling the final presentation. Cantoche therefore developed a user interface for editing the dynamically generated speech, reorganizing the sequences, and changing the expressive animations selected by the behavior management module.
The two-year project will be completed at the end of 2011. The first prototype of the MyPresentingAvatar application automatically generates presentations of potential candidates for job offers, starting from their resumes. The second version of the prototype will automatically generate a presentation of statistical results about the most discussed subjects on the Internet during particular periods of time. The first version is now finished, and a mockup of the second version can be seen online on Lingway’s blog.
The project’s outcomes are very promising for future commercial applications based on the use cases of these prototypes. In addition, some of the project’s results will be integrated into the Living Actor™ technology. For instance, the Living Actor Presenter online application will include user interface components based on the work done on the avatar animation timeline.
Self-generated personal presentations, analysis of web usage trends, automatic content generation… The uses of avatars keep multiplying, not only as graphic representations but also as real alter egos or assistants in many situations.
What are your thoughts?
What will be the major applications for avatars in the coming years?