
Luma AI Just Solved Video’s Biggest Problem with Ray 3 Modify


Luma has launched Ray 3 Modify, a new workflow built to address the single biggest headache in AI video: control.

For the past year, "AI video" has essentially meant typing a prompt and praying for the best. If you wanted a specific facial expression or a particular camera move, you rolled the dice.

Ray 3 Modify changes the game with a "Hybrid-AI" workflow: shoot real actors, then use their footage as the structural "skeleton" for generative scenes.

The “Hybrid” Acting Engine


The key philosophy here is "Performance Preservation." You can shoot a rough scene, even on your phone, in somebody's backyard at four o'clock in the afternoon, and Ray 3 Modify will repaint the reality around your actor while preserving their exact timing, emotion, and movement.

  • Character Reference: This lets you "skin" your actor. You can turn a person in a T-shirt into a cyberpunk soldier or an armored fantasy wizard, and the AI will hold that new identity across the entire clip, avoiding the dreaded "morphing" glitches earlier models were prone to.
  • Start & End Keyframes: These give you exact control over time. You specify precisely how a clip should begin and end, forcing the AI to interpolate smoothly between the two visual states. This is essential for professional editors who need shots to cut together seamlessly.
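To see what keyframe conditioning is up against, consider the naive baseline for moving between two visual states: a plain cross-dissolve. The sketch below (illustrative only, not Luma's method; the tiny arrays stand in for real frames) shows the linear blend a dumb transition would produce, where a generative model is instead expected to synthesize plausible in-between content.

```python
import numpy as np

# Toy stand-ins for a start and end keyframe (tiny 2x2 RGB "frames").
start = np.zeros((2, 2, 3), dtype=np.float32)  # all-black frame
end = np.ones((2, 2, 3), dtype=np.float32)     # all-white frame

def crossfade(a: np.ndarray, b: np.ndarray, t: float) -> np.ndarray:
    """Naive linear blend between two frames, t in [0, 1]."""
    return (1.0 - t) * a + t * b

# The midpoint of a cross-dissolve is just a flat 50% grey mix of the
# two endpoints; keyframe-conditioned generation has to do better than
# this by inventing coherent intermediate imagery.
mid = crossfade(start, end, 0.5)
```

The point of the baseline: any pixel-space blend smears the two states together, which is exactly the artifact a model that understands the scene is supposed to avoid.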

Reasoning Meets HDR


Underpinning all this is the Ray 3 base, a model that Luma first revealed in September but that is now fully unlocked. Luma says Ray 3 is the world’s first “reasoning” video model.

Unlike previous models, which hallucinate physics from frame to frame, Ray 3 makes a plan first and then renders. It grasps 3D space and cause and effect, so it doesn't fall apart on complex interactions like fluid simulations or crowds.


For now, it is also the only model to offer native 16-bit HDR output (exportable as an EXR sequence). That is a giant tell: Luma isn't just after TikTok creators; it is aiming at Netflix-grade post-production pipelines.
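Why 16-bit output matters for grading can be shown with a small sketch (not Luma's code; the pixel values are made up). In linear-light HDR, highlight values above 1.0 carry real detail; an 8-bit SDR export must clip them away, while the half-float pixels EXR sequences commonly store keep them intact for post.

```python
import numpy as np

# Hypothetical linear-light pixel values; anything above 1.0 is a
# highlight brighter than display white (e.g. a lamp or the sun).
hdr = np.array([0.18, 1.0, 4.0, 16.0], dtype=np.float32)

# 8-bit SDR export: values are clipped to [0, 1] before quantizing,
# so every highlight above 1.0 collapses to the same code value (255).
sdr_codes = np.round(np.clip(hdr, 0.0, 1.0) * 255).astype(np.uint8)

# 16-bit half-float (the pixel type EXR sequences typically use)
# keeps the over-range highlights distinct, so a colorist can still
# pull detail out of them in grading.
half = hdr.astype(np.float16)
```

Here `sdr_codes` ends in two identical 255s, while `half` still distinguishes 4.0 from 16.0; that preserved headroom is the practical difference between an 8-bit deliverable and an HDR master.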

The Reshoot Button

Industry insiders are referring to this as the “AI Reshoot Button.” Instead of retaking a scene because the lighting was off or the costume seemed shabby, directors will now simply “Modify” it.

With selectable backgrounds, lighting changes, or even complete wardrobe swaps, all while retaining the actor's performance, Ray 3 Modify may be the tool that finally brings generative AI into an authentic Hollywood pipeline.