Kinetix, the AI startup bringing emotes to video games and virtual worlds, has announced major advances in the generative AI technology powering its platform, including an updated AI model for motion extraction from videos and an AI tool that applies a predefined motion to any animation in one click.
The new tools have been released as part of version 2.0 of the Kinetix platform. The first is a new generation of algorithms for motion extraction from videos, producing better results in posture, translation, and grounding. The second is style transfer filters, an AI tool that applies a predefined motion to any animation, enabling users to create more expressive emotes.
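The press release does not describe how the style transfer filters work internally. As a purely illustrative sketch, one simple way to picture "applying a predefined motion to an animation" is a per-joint blend between the base animation's rotation curves and those of a style clip; the function name, data layout, and linear-blend approach below are assumptions for illustration, not Kinetix's actual algorithm.

```python
# Hypothetical sketch of a "style transfer filter": blend per-joint
# rotation curves of a stock animation toward a predefined style motion.
# Names and the linear-blend approach are illustrative assumptions only.

def apply_style_filter(base, style, strength=0.5):
    """Blend two animations frame by frame.

    base, style: lists of frames; each frame is a list of per-joint
    rotation values (e.g. Euler angles in degrees).
    strength: 0.0 keeps the base motion, 1.0 fully adopts the style.
    """
    frames = []
    for base_frame, style_frame in zip(base, style):
        frames.append([
            (1.0 - strength) * b + strength * s
            for b, s in zip(base_frame, style_frame)
        ])
    return frames

# A neutral wave vs. an exaggerated version of the same move
# (two frames, two joints each, values in degrees).
neutral = [[0.0, 10.0], [0.0, 20.0]]
cartoon = [[8.0, 30.0], [8.0, 60.0]]

# At strength 0.5, each joint value lands halfway between the inputs.
blended = apply_style_filter(neutral, cartoon, strength=0.5)
# → [[4.0, 20.0], [4.0, 40.0]]
```

In a real pipeline the blend would operate on quaternions or rig-space transforms rather than raw angle lists, but the frame-by-frame interpolation idea is the same.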
Henri Mirande, CTO and co-founder at Kinetix, said: “With so much debate recently on generative AI’s potential to streamline and democratize creative processes, we’re proud to announce these advances in our custom AI model. They mean that we can now more accurately extract complex motions from video content – such as backflips, parkour, or sprinting up a flight of stairs. We have also found that a large number of our users enjoy creating animations from a pre-existing library rather than uploading their own videos. Our AI-powered style transfer filters can be used to enhance both custom-generated and stock animations, adding more fun and humor into the mix.”