Alibaba Intelligent Computing Research Institute Creates Animate Anyone: A Revolutionary Character Animation Technology

2023-12-05

Researchers at Alibaba Group's Intelligent Computing Research Institute have developed "Animate Anyone," an advanced character animation technology that transforms static images into dynamic character videos. The method builds on a diffusion model and targets the central challenges of image-to-video conversion: maintaining temporal consistency and preserving fine detail.

In their research paper, the team introduces a framework designed specifically for character animation. Its key component, ReferenceNet, merges detail features from the reference image via spatial attention, preserving intricate appearance characteristics and keeping the output visually consistent throughout the animation. An efficient pose guider directs the character's movements, and an effective temporal modeling method ensures smooth, controlled transitions between frames.

Although diffusion models are at the forefront of visual generation research, converting static images to video remains difficult, particularly for characters, where temporal consistency and detail are hard to maintain; Animate Anyone aims to address exactly these issues. Given a reference image, the framework can produce varied forms of animation, including 360-degree rotations, enabling versatile video creation. If publicly released, it could disrupt short-video content creation on platforms such as Instagram and TikTok.

The team reports receiving numerous inquiries about demonstrations and access to the source code on GitHub. They are preparing a public release, transforming the academic prototype into a user-friendly version, but no date has been announced yet.
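To make the reference-conditioning idea concrete, here is a minimal toy sketch of spatial attention over reference features. This is an illustrative assumption, not Animate Anyone's actual implementation (which uses a UNet-based ReferenceNet inside a diffusion pipeline); the function name and shapes are hypothetical. The core idea shown: each spatial position of the frame being generated attends over both its own features and the reference image's features, so appearance detail from the reference can flow into every frame.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def spatial_reference_attention(frame_feats, ref_feats):
    """Toy spatial attention (hypothetical API, for illustration only).

    frame_feats: (n, d) flattened spatial features of the frame being denoised
    ref_feats:   (m, d) flattened spatial features of the reference image

    Keys/values are the concatenation of frame and reference features, so
    each output position is a mixture that can borrow reference detail.
    """
    d = frame_feats.shape[1]
    kv = np.concatenate([frame_feats, ref_feats], axis=0)   # (n + m, d)
    scores = frame_feats @ kv.T / np.sqrt(d)                # (n, n + m)
    weights = softmax(scores, axis=-1)                      # rows sum to 1
    return weights @ kv                                     # (n, d)

rng = np.random.default_rng(0)
frame = rng.standard_normal((16, 8))  # 16 spatial positions, 8-dim features
ref = rng.standard_normal((16, 8))
out = spatial_reference_attention(frame, ref)
print(out.shape)  # (16, 8)
```

The output keeps the frame's spatial layout while blending in reference content; in the real system this happens at multiple UNet resolutions rather than on a single flattened grid.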