How To Create AI Viral Videos With Nano Banana Pro and Kling 2.6 Motion Control: Step-by-Step Guide for Making AI-Generated Content
To create custom AI viral videos: Use Nano Banana Pro to generate a high-quality character image. Create or upload a motion reference video to define the action. Finally, use Kling 2.6 Motion Control to apply that movement to your character image, choosing "Match Video" for precise motion.
ElevenLabs has introduced a new integrated workflow designed to streamline the creation of custom AI-generated videos. By pairing the high-fidelity Nano Banana Pro image model with the motion control capabilities of Kling 2.6, the platform now allows users to place specific characters or themselves into dynamic scenes with precise movement synchronization. The update aims to reduce the technical barriers for creators looking to maintain character consistency across animated sequences.
Character Consistency Meets Motion Control
The core of the new system lies in how it bridges the gap between static image generation and fluid video motion. Users begin by generating a base image using the Nano Banana Pro model, which is optimized for high-resolution 2K outputs and detail retention. This image then serves as the visual anchor for video generation.
The integration uses Kling 2.6’s "Motion Control" feature, which offers two distinct processing modes: "Match Image" and "Match Video." According to technical demonstrations, Match Image mode prioritizes textures and facial features but is capped at 10 seconds. In contrast, Match Video mode supports up to 30 seconds of footage and prioritizes skeletal movement and camera trajectories.
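The trade-off between the two modes can be summarized in a few lines of logic. The sketch below is purely illustrative: the function name, mode strings, and duration caps simply encode the behavior described above and are not part of any real ElevenLabs or Kling API.

```python
# Illustrative sketch only: mode names and caps reflect the article's
# description of Kling 2.6 Motion Control, not a documented API.

MODE_LIMITS = {
    "match_video": 30,  # prioritizes skeletal movement and camera trajectories
    "match_image": 10,  # prioritizes textures and facial features
}

def choose_mode(clip_seconds: float, prioritize_likeness: bool) -> str:
    """Pick a Motion Control mode under the stated duration caps."""
    if prioritize_likeness and clip_seconds <= MODE_LIMITS["match_image"]:
        return "match_image"
    if clip_seconds <= MODE_LIMITS["match_video"]:
        return "match_video"
    raise ValueError("Clip exceeds the 30-second Match Video cap; trim the reference.")
```

For example, an 8-second clip where likeness matters most would fall under Match Image, while a 25-second clip forces Match Video regardless of preference.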
Generating Motion Without Source Footage
One of the most significant hurdles in AI video production is the requirement for "reference videos" to guide movement. ElevenLabs has addressed this by allowing users to create their own reference clips within the platform. By prompting simple movements on a generic mannequin or character, users can generate a "motion template" that is then applied to their primary custom character.
"The video reference dictates the action while the text prompts primarily handle background and lighting," a representative explained in the workflow demonstration. This separation allows creators to experiment with complex actions, such as dancing or walking through 3D space, without losing the likeness of the original character.
How To Create AI Viral Videos
Here is the step-by-step guide to using Nano Banana Pro and Kling 2.6 Motion Control within ElevenLabs as outlined in the video:
Step 1: Create Your Base Character/Scene
Before adding movement, you need to generate the high-quality image of the character or setting you want to feature.
- Navigate to Image Generation: In ElevenLabs, go to the Image tab.
- Upload Reference (Optional): You can upload a photo of yourself or a specific character to maintain likeness.
- Configure Settings: Select the Nano Banana Pro model for the highest quality. Set your resolution (e.g., 2K) and paste your descriptive prompt.
- Generate: Click generate and select your favorite output to use as your "Target Image."
Step 2: Create a Motion Reference Video (If needed)
If you don't already have a video of the movement you want to replicate, you can generate one using AI.
- Generate a Simple Subject: In a new tab, use Nano Banana Pro to create a simple shot of a character (or mannequin) in a neutral pose, such as "standing with arms by side."
- Convert to Video: Drag this simple image into the Video tab's "Start Frame" box.
- Prompt the Action: Select Kling 2.6 and describe the movement you want (e.g., "do a little tango dance" or "hopping on one leg").
- Generate: This creates your "Motion Reference Video."
Step 3: Combine Character and Motion
Now, you will apply the motion from Step 2 to the character you created in Step 1.
- Go to the Video Tab: Select the Kling 2.6 Motion Control model.
- Upload the Target Image: Drag your favorite character image (from Step 1) into the Image Reference box.
- Upload the Motion Video: Drag your reference video (from Step 2 or your own file) into the Motion Video box.
Step 4: Select Your Processing Mode
Choose how the AI should interpret the movement based on your goals:
- Match Video Mode: Best for exact skeletal movement and camera trajectories. Supports up to 30 seconds. Note that the character may stretch slightly to fit the reference geometry.
- Match Image Mode: Best for maintaining specific textures and facial features. Movement is secondary to character consistency and is capped at 10 seconds.
Step 5: Finalize and Generate
- Optional Prompts: You can leave the text prompt blank, as the motion video dictates the action and the image dictates the character. Adding prompts typically only affects background or lighting details.
- Generate: Click generate to produce your final custom viral video.
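For readers who think in code, the five steps above can be sketched as a single job description. Everything here is hypothetical: ElevenLabs drives this workflow through its web interface, and the function, field, and model-identifier names below are invented for illustration rather than taken from a documented API.

```python
# Hypothetical sketch of the workflow as data; no real endpoints are called.

def build_motion_control_job(target_image: str,
                             motion_video: str,
                             mode: str = "match_video",
                             prompt: str = "") -> dict:
    """Assemble a job mirroring Steps 1-5.

    target_image: character frame from Nano Banana Pro (Step 1)
    motion_video: reference clip defining the action (Step 2)
    mode: "match_video" (exact motion, up to 30s) or
          "match_image" (likeness first, up to 10s)   (Step 4)
    prompt: optional; per the workflow it mainly steers
            background and lighting                    (Step 5)
    """
    if mode not in ("match_video", "match_image"):
        raise ValueError(f"unknown mode: {mode}")
    return {
        "model": "kling-2.6-motion-control",   # invented identifier
        "image_reference": target_image,       # Step 3: Image Reference box
        "motion_video": motion_video,          # Step 3: Motion Video box
        "mode": mode,
        "prompt": prompt,                      # blank: motion video dictates action
    }

job = build_motion_control_job("character_2k.png", "tango_reference.mp4")
```

The point of the sketch is the division of labor: the image carries the character, the video carries the motion, and the text prompt is deliberately left nearly empty.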
Industry Context and Capabilities
The release comes amid a competitive surge in the AI video space, with companies racing to solve the issue of "character drift," where a subject's appearance changes between frames. By using Nano Banana Pro for the initial frame and Kling 2.6 for the physics-based movement, ElevenLabs is positioning itself as a comprehensive suite for both high-end image synthesis and functional video animation.
The system is currently accessible through the ElevenLabs interface, supporting a variety of resolutions and aspect ratios. The company noted that while text prompts for the video stage are optional, the primary control for creators remains the visual reference, allowing for a more intuitive "drag-and-drop" creative process.
(The above story first appeared on LatestLY on Jan 15, 2026 10:40 AM IST. For more news and updates on politics, world, sports, entertainment and lifestyle, log on to our website latestly.com).