DEMONSTRATION

Deep Motion Synthesis for Character Animation


About

«DEep MOtioN SynThesis foR character AnimaTION (DEMONSTRATION)» is a project that covers a wide range of multidisciplinary topics aligned with recent trends in computer graphics, character animation, and virtual reality. It investigates modern methods in machine learning (deep, convolutional, adversarial, and reinforcement learning), with the ultimate goal of overcoming the current limitations in character animation and laying the foundations for future improvements in a wide range of ambitious and challenging projects.

The main research objective of this project is to push beyond the current limitations in motion analysis and synthesis. Key aspects include methods for creating well-organized motion repositories, generating autonomous and interactive characters, and motion retargeting.

Research Impact:

  • Industrial impact - through our work, we will create a mocap database that can help small enterprises acquire data for training purposes. The outcome of the project is expected to advance autonomous character animation, provide the foundations for interactive applications, and support natural visualization in VR/AR.

  • Social and cultural impact - the dance motion capture database we aim to create will help preserve our cultural dance heritage.

  • Scholarly impact - through our research, we will discover and apply novel research methodologies in the fields of character animation, motion analysis, and synthesis.

Research Goals:

  • We aim to create an innovative mechanism for real-time synthesis of realistic, highly dynamic dance motions, with a long-term memory that respects the semantic context of the genre. The generated motion will be fully autonomous, controlled only by the music beat and/or the interaction with other characters.

  • We intend to build unconditional generative models that learn the internal distribution and statistics of a motion sequence, and can then generate high-quality, diverse motions that carry the same content as the original.

  • We plan to design fully convolutional GANs that learn the distribution of motion segments within a sequence, and generate new movements from the important and unique segments that best describe and summarize that sequence (see the sketch after this list).

  • Through our course of action, we will develop a skeletal retargeting algorithm that, driven by human-like motion, manipulates characters with fundamentally different body proportions, joint constraints, or moving styles. Retargeting is a non-trivial process that is of great interest to the film and game industries.

  • Furthermore, we aim to develop and train a network using physics-based reinforcement learning to animate highly varied creatures that do not exist in real life or are currently difficult to motion capture.
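To make the fully convolutional GAN idea above more concrete, the following is a minimal sketch of a discriminator that scores local motion segments rather than whole sequences. It is written in PyTorch with illustrative tensor shapes and layer sizes; none of the names or dimensions come from the project itself.

```python
import torch
import torch.nn as nn

class SegmentDiscriminator(nn.Module):
    """Fully convolutional 1D discriminator over pose sequences.

    Because it has no fully connected layers, it produces one realism
    score per temporal receptive field (i.e., per motion segment)
    instead of a single score for the whole sequence.
    """

    def __init__(self, pose_dim: int = 69, width: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(pose_dim, width, kernel_size=5, padding=2),
            nn.LeakyReLU(0.2),
            nn.Conv1d(width, width, kernel_size=5, padding=2),
            nn.LeakyReLU(0.2),
            nn.Conv1d(width, 1, kernel_size=5, padding=2),  # one score per segment
        )

    def forward(self, motion: torch.Tensor) -> torch.Tensor:
        # motion: (batch, frames, pose_dim) -> (batch, pose_dim, frames)
        return self.net(motion.transpose(1, 2))

# Example: score the segments of a hypothetical 120-frame clip.
clip = torch.randn(1, 120, 69)          # random stand-in for real mocap data
scores = SegmentDiscriminator()(clip)   # shape (1, 1, 120)
```

Training such a discriminator against an equally convolutional generator, per sequence, is one way to capture the internal segment statistics described above; the actual architecture will be determined during the project.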

Objectives

  • Investigate the current trends in deep, convolutional, adversarial, and reinforcement learning.

  • Enrich our Dance Motion Capture Database with high quality dances and generic data that can be used to train our deep neural networks.

  • Develop a network for music-driven motion synthesis that respects the representative culture for different dance genres.

  • Design a network that generates shorter or longer variations of a given input motion, while ensuring that the movement distribution of that motion remains the same.

  • Develop networks for motion retargeting between different skeleton configurations (e.g., biped to quadruped). We will use deep learning descriptors to learn the spatiotemporal correlations between the joints' transformations and movements of the two characters, and to automatically preserve the essence of the motion (a rough sketch follows this list).

  • Animate highly varied agnostic creatures by using deep reinforcement learning to generate physically based animations.
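As a rough illustration of the retargeting objective above, the sketch below maps per-frame joint features of a source skeleton to a target skeleton with a different number of joints through a temporal-convolutional descriptor. The names, joint counts, and feature sizes are assumptions made for this example, not the project's design.

```python
import torch
import torch.nn as nn

class RetargetingNet(nn.Module):
    """Encoder-decoder sketch for cross-skeleton motion retargeting.

    A temporal convolutional encoder extracts a motion descriptor from
    the source character; the decoder maps it to per-frame features of
    a target character with a different topology.
    """

    def __init__(self, src_dim: int = 22 * 6, tgt_dim: int = 28 * 6, latent: int = 256):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(src_dim, latent, kernel_size=15, padding=7),
            nn.ReLU(),
            nn.Conv1d(latent, latent, kernel_size=15, padding=7),
            nn.ReLU(),
        )
        self.decoder = nn.Conv1d(latent, tgt_dim, kernel_size=15, padding=7)

    def forward(self, src_motion: torch.Tensor) -> torch.Tensor:
        # src_motion: (batch, frames, src_dim) -> (batch, frames, tgt_dim)
        z = self.encoder(src_motion.transpose(1, 2))
        return self.decoder(z).transpose(1, 2)

# Example: map a 60-frame clip of a 22-joint biped to a 28-joint rig.
biped = torch.randn(4, 60, 22 * 6)
quadruped = RetargetingNet()(biped)     # (4, 60, 28 * 6)
```

In practice such a mapping would be trained with reconstruction, adversarial, or cycle-consistency losses so that the retargeted motion stays natural on the new skeleton; the specific losses are part of the research to be carried out.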

People

Andreas Aristidou

Project Manager

Christina-Georgia Serghides

Research Associate

Anastasios Yiannakidis

Alumni

Andreas Lazarou

Alumni

Publications

For more publications please visit the Graphics and Extended Reality Lab website.

Digitizing Traditional Dances Under Extreme Clothing: The Case Study of Eyo

Temi Ami-Williams, Christina-Georgia Serghides, Andreas Aristidou

Published at Journal of Cultural Heritage, Volume 67, pages 145–157, February 2024.

The video has been presented at the International Council for Traditional Music (ICTM) and the Cyprus Dance Film Festival (CDFF).

This work examines the challenges of capturing movements in traditional African masquerade garments, specifically the Eyo masquerade dance from Lagos, Nigeria. By employing a combination of motion capture technologies, the study addresses the limitations posed by "extreme clothing" and offers valuable insights into preserving cultural heritage dances. The findings lead to an efficient pipeline for digitizing and visualizing folk dances with intricate costumes, culminating in a visually captivating animation showcasing an Eyo masquerade dance performance.


Collaborative VR: Solving riddles in the concept of escape rooms

Afxentis Ioannou, Marilena Lemonari, Fotis Liarokapis, Andreas Aristidou

Presented at the International Conference on Interactive Media, Smart Systems and Emerging Technologies (IMET 2023).

This work explores alternative means of communication in collaborative virtual environments (CVEs) and their impact on users' engagement and performance. Through a case study of a collaborative VR escape room, we conduct a user study to evaluate the effects of nontraditional communication methods in computer-supported cooperative work (CSCW). Despite the absence of traditional interactions, our study reveals that users can effectively convey messages and complete tasks, akin to real-life scenarios.


Collaborative Museum Heist with Reinforcement Learning

Eleni Evripidou, Andreas Aristidou, Panayiotis Charalambous

Published at Computer Animation and Virtual Worlds, May 2023.

In this paper, we present our initial findings from applying Reinforcement Learning techniques to a museum heist game, where trained robbers with different skills learn to cooperate and maximize individual and team rewards while avoiding detection by scripted security guards and cameras. The results showcase the feasibility of training both sides concurrently in an adversarial game setting.


Pose Representations for Deep Skeletal Animation

Nefeli Andreou, Andreas Aristidou, Yiorgos Chrysanthou

Published at Computer Graphics Forum, Volume 41, Issue 8, September 2022.

In this work, we present an efficient method for training neural networks that is specifically designed for character animation. We use dual quaternions as the mathematical framework and take advantage of the skeletal hierarchy to avoid rotation discontinuities, a common problem when using Euler angle or exponential map parameterizations, as well as motion ambiguities, a common problem when using positional data. Our method does not require re-projection onto skeletal constraints to avoid bone-stretching violations and invalid configurations, while learning is propagated through the network using both rotational and positional information.
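For readers unfamiliar with the representation, the snippet below shows how a joint's rotation quaternion and translation combine into a single dual quaternion, the quantity such a network operates on per joint. It is a minimal NumPy sketch written for this page, not code from the paper.

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def to_dual_quaternion(rotation, translation):
    """Pack a unit rotation quaternion and a translation vector into a
    dual quaternion (q_real, q_dual), with q_dual = 0.5 * (0, t) * q_real."""
    q_real = rotation / np.linalg.norm(rotation)   # enforce unit norm
    t_quat = np.array([0.0, *translation])         # pure quaternion (0, t)
    q_dual = 0.5 * quat_mul(t_quat, q_real)
    return q_real, q_dual

# Example: identity rotation with a 10 cm offset along x.
q_r, q_d = to_dual_quaternion(np.array([1.0, 0.0, 0.0, 0.0]),
                              np.array([0.1, 0.0, 0.0]))
```

Encoding each joint this way keeps rotation and translation in one algebraic object, which is what allows the representation to avoid the discontinuities and ambiguities mentioned above.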


Virtual Library in the concept of digital twins

Nikolas Iakovides, Andreas Lazarou, Panayiotis Kyriakou, Andreas Aristidou

Presented at the 2nd International Conference on Interactive Media, Smart Systems and Emerging Technologies (IMET), October 2022.

In this work, we reconstruct the Limassol Municipal University Library as a digital twin. To do so, we conducted a perceptual survey to understand the current use of physical libraries, examine users' experience with VR, and identify potential use cases of VR libraries. Based on the outcome, we design five use-case scenarios that demonstrate the potential of virtual libraries.


Digitizing Wildlife: The case of reptiles 3D virtual museum

Savvas Zotos, Marilena Lemonari, Michael Konstantinou, Anastasios Yiannakidis, Georgios Pappas, Panayiotis Kyriakou, Ioannis N. Vogiatzakis, Andreas Aristidou

Published at IEEE Computer Graphics and Applications, May 2022

In this paper, we design and develop a 3D virtual museum with holistic metadata documentation and a variety of captured reptile behaviors and movements. Our main contribution lies in the procedure of rigging, capturing, and animating reptiles, as well as the development of a number of novel educational applications.


Let's All Dance: Enhancing Amateur Dance Motions

Qiu Zhou, Manyi Li, Qiong Zeng, Andreas Aristidou, Xiaojing Zhang, Lin Chen, Changhe Tu

Published at Computational Visual Media, May 2022

In this paper, we present a deep model that enhances the professionalism of amateur dance movements, improving movement quality in both the spatial and temporal domains. We illustrate the effectiveness of our method on real amateur and artificially generated dance movements. We also demonstrate that our method can synchronize 3D dance motions with any reference audio under non-uniform and irregular misalignment.


Rhythm is a Dancer: Music-Driven Motion Synthesis with Global Structure

Andreas Aristidou, Anastasios Yiannakidis, Kfir Aberman, Daniel Cohen-Or, Ariel Shamir, Yiorgos Chrysanthou

Published at IEEE Transactions on Visualization and Computer Graphics, March 2022.

In this work, we present a music-driven neural framework that generates realistic human motions which are rich, avoid repetitions, and jointly form a global structure that respects the culture of a specific dance genre. We illustrate examples from various dance genres, where we demonstrate choreography control and editing in a number of applications.


Virtual Dance Museums: the case of Greek/Cypriot folk dancing

Andreas Aristidou, Nefeli Andreou, Loukas Charalambous, Anastasios Yiannakidis, Yiorgos Chrysanthou

Presented at EUROGRAPHICS Workshop on Graphics and Cultural Heritage, GCH'21, Nov 2021

This paper presents a virtual dance museum developed to educate the wider public, and especially the younger generations, about the story, costumes, music, and history of our dances. The museum is publicly accessible and also enables motion data reusability, facilitating dance learning applications through gamification.


Activities & Media

Coming soon...

Acknowledgements


This project is financially supported by internal funds of the University of Cyprus.