When AI learns to dream in ink — my experiment with Veo 3.1

I spent the weekend experimenting with Veo 3.1, just to see how far AI video generation has come — and it honestly blew my mind a little.

The idea was simple:

“A drop of ink falls into water, turns into a fish, and then a cat walks out.”

That’s it. No complex prompt, no storyboard. Just curiosity.

But what came out of Veo 3.1 felt strangely alive.
The ink didn’t just become a fish — it transformed in a way that made sense visually. The textures swirled like real liquid, and the cat appeared with a sort of calm inevitability, as if it were always meant to be there.

What struck me most wasn’t just the realism, but the intentionality — the way transitions carried emotion. It reminded me of how human directors think about rhythm and symbolism, not just movement.

It made me realize something: AI tools like Veo 3.1 aren’t just about convenience anymore. They’re becoming co-creators — shaping tone, pacing, and even narrative subtext.

I enhanced a few frames afterward using HitPaw VikPea, just to test how much visual polish I could add. The combo worked beautifully — Veo builds the dream, and VikPea sharpens its light.

I’m curious: how are you all using AI for non-literal storytelling?
Are there prompts or experiments you’ve done that ended up feeling… emotional? surreal? symbolic?

Would love to hear your thoughts (or see your clips!).
Maybe the real magic isn’t in teaching AI to see — but to feel. :sparkles:
