My experiments with Remotion + Codex to create videos

If you’ve ever wanted to create videos using AI, try Remotion. It’s a powerful tool that lets you vibe code React to design, animate, and render videos, opening up a completely different way of thinking about content creation.

Unlike traditional video editing tools that rely on timelines and manual keyframes, Remotion treats videos as code. This means you can vibe code to build reusable components, generate dynamic content (like personalized videos or data-driven animations), and iterate much faster.
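To make the "videos as code" idea concrete, here's a tiny pure-TypeScript sketch (not Remotion's actual API, just the underlying model): a video is a function of the frame number, durations are just seconds × fps arithmetic, and "scenes" can be generated from data. The product names and scene timings are made up for illustration.

```typescript
// "Videos as code": properties are derived from frame math.
// Plain-TypeScript sketch of the idea, not Remotion's API.
const fps = 30;

// Durations become simple arithmetic: seconds -> frames.
const secondsToFrames = (seconds: number): number => Math.round(seconds * fps);

// A 30-second video at 30fps is 900 frames.
const totalFrames = secondsToFrames(30);

// Data-driven content: one scene per data item, 2 seconds each.
const products = ["Album A", "Album B", "Album C"];
const scenes = products.map((title, i) => ({
  title,
  from: i * secondsToFrames(2),        // scene start frame
  durationInFrames: secondsToFrames(2),
}));

console.log(totalFrames);    // 900
console.log(scenes[1].from); // 60
```

This is why iteration is fast: changing the data array regenerates every scene's timing, with no manual keyframe edits.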

In this post, we’ll walk through a simple, practical workflow to get started with Remotion on a Mac, including installation, setting up a project, generating a video using prompts, previewing it locally, and finally rendering it into a shareable file.

This is not a polished tutorial - think of it as a working notebook. It captures the exact steps, commands, and prompts used while exploring Remotion, so you can follow along, experiment, and adapt it to your own use case.

Disclaimer: This is a draft blogpost. I am dumping links, quotes and thoughts here until I'm actually ready to finalize the blogpost. 

Table of Contents

Installation

Install Node.js (which includes npm), then install the Codex CLI globally:

npm install -g @openai/codex

Project setup

Create a folder for the video, cd into the new folder, and run:

npx create-video@latest --blank ; npm i ; npm run dev

(Choose the defaults for all questions.)

The live video preview will now be running at http://localhost:3000/MyComp

Create Video

Open a new Terminal tab and cd to the project folder, e.g.

cd ~/Documents/my-video/

Run the Codex CLI in this folder:

codex

Run these prompts one at a time, previewing the video after each one:

Use Remotion best practices skill. Create a 30-second video at 30fps. Start with a hook: white text on black background, 'Where do you even buy music anymore?' — fade it in, hold for 2 seconds, fade out.

Preview video at http://localhost:3000/MyComp 
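Under the hood, Codex will typically express this fade with Remotion's frame-based interpolation. Here's a plain-TypeScript sketch of the timing math at 30fps; the `interpolate` helper below is my own hand-rolled stand-in (Remotion ships a more capable function of the same name), and the 15-frame fade length is an assumption.

```typescript
// Linear interpolation of `frame` from an input range to an output range,
// clamped at both ends — a stand-in for Remotion's interpolate().
const interpolate = (
  frame: number,
  [inStart, inEnd]: [number, number],
  [outStart, outEnd]: [number, number]
): number => {
  const t = Math.min(Math.max((frame - inStart) / (inEnd - inStart), 0), 1);
  return outStart + t * (outEnd - outStart);
};

// 15-frame fade in, hold at full opacity until frame 75 (~2s), 15-frame fade out.
const opacityAt = (frame: number): number => {
  if (frame < 15) return interpolate(frame, [0, 15], [0, 1]);
  if (frame < 75) return 1;
  return interpolate(frame, [75, 90], [1, 0]);
};

console.log(opacityAt(0));  // 0 (fully transparent)
console.log(opacityAt(40)); // 1 (holding)
console.log(opacityAt(90)); // 0 (faded out)
```

In a real Remotion component, `frame` would come from `useCurrentFrame()` and the result would be applied as a CSS opacity style.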

Next section: show our logo from public/discdepot-logo.png. Use a warm orange gradient background. Add a subtle tagline below:

"Music, delivered!"

Preview video at http://localhost:3000/MyComp 

In the next section, add a counter that counts up to 12,000 with a plus sign. Label it 'Happy customers'. Keep the same background style!

Preview video at http://localhost:3000/MyComp 
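The count-up is the same frame-math idea: the displayed number is just the animation progress mapped onto 0..12,000. A rough pure-TypeScript sketch (the 90-frame count duration is my assumption, not something from the prompt):

```typescript
// Map frame progress (0..1) onto a target count, rounded to a whole number.
const countAt = (frame: number, durationInFrames: number, target: number): number => {
  const progress = Math.min(Math.max(frame / durationInFrames, 0), 1);
  return Math.round(progress * target);
};

// Format like the on-screen counter: thousands separator plus a trailing "+".
const label = (frame: number): string =>
  `${countAt(frame, 90, 12000).toLocaleString("en-US")}+`;

console.log(label(0));  // "0+"
console.log(label(45)); // "6,000+"
console.log(label(90)); // "12,000+"
```

Easing the progress (instead of using it linearly) would make the counter decelerate near 12,000, which usually looks nicer.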

In the next section: Create 5 CD album covers as abstract gradient cards. Different color combinations. Animate them sliding in one by one, arranged in a row.

Add artist names and song titles.

Preview video at http://localhost:3000/MyComp 
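The "sliding in one by one" effect is a staggered animation: each card runs the same slide, but offset on the timeline by a fixed number of frames per card. A pure-TypeScript sketch of that offset math (the stagger, slide length, and 300px start offset are all my assumptions):

```typescript
// Horizontal offset (px) of card `index` at a given frame.
// Each card starts `stagger` frames after the previous one and
// eases from 300px offscreen to 0 over `slideFrames` frames.
const slideX = (frame: number, index: number, stagger = 10, slideFrames = 20): number => {
  const local = frame - index * stagger;           // this card's own clock
  const t = Math.min(Math.max(local / slideFrames, 0), 1);
  const eased = 1 - (1 - t) * (1 - t);             // ease-out quad
  return 300 * (1 - eased);                        // remaining distance from rest
};

console.log(slideX(0, 0));  // 300 (card 0 still offscreen at frame 0)
console.log(slideX(20, 0)); // 0   (card 0 settled)
console.log(slideX(0, 4));  // 300 (card 4 hasn't started yet)
```

In Remotion the same effect is often built with `spring()` per card, passing each card a delayed frame value just like `local` above.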

Let's add a final section:

End with a call to action: 'Find your sound.' and the URL 'discdepot.com'.
Fade to black

Preview video at http://localhost:3000/MyComp 

Render a video

To render the video to a file (by default it lands in the out/ folder), run:
npx remotion render

Links to get you started:

  • remotion.dev/docs
  • remotion.dev/prompts

Source for this blogpost: https://www.youtube.com/watch?v=5NRAOnKc3c8

 
