🎧 How a Hollywood Director Uses AI to Make Movies

Dave Clark shows us the future of AI filmmaking

Illustration: Every/IMDb.

Sponsored By: Gamma

Slides are a thing of the past. Gamma introduces a new way of presenting ideas—faster, more flexible, and powered by AI. With Gamma, you can create stunning content effortlessly, optimized for any device and platform, without wasting time on design or formatting.

TL;DR: Today we’re releasing a new episode of our podcast How Do You Use ChatGPT? I go in depth with Dave Clark, a film director and writer pioneering the use of AI tools to make movies. As we talk, we leverage these tools to make a short film in 60 minutes. Watch on X or YouTube, or listen on Spotify or Apple Podcasts.


You can break into Hollywood with a movie you made alone in your room without using a single camera. Dave Clark showed me how to do it live.

Dave is a film and commercial director who has worked with brands like HP and Intel and is now experimenting with cutting-edge AI technology. He recently produced a popular sci-fi short called Borrowing Time, which has over 110,000 views on X and was mentioned in Forbes. Dave made the film, complete with an intricate storyline and a rich cast of characters, using only AI tools: Midjourney, the text-to-video model Runway, and the generative voice platform ElevenLabs.

Dave told me that he couldn’t have made Borrowing Time without AI—it’s an expensive project that traditional Hollywood studios would never bankroll. But after Dave’s short went viral, major production houses approached him to make it a full-length movie. I think this is an excellent example of how AI is changing the art of filmmaking, and I came out of this interview convinced that we are on the brink of a new creative age.

We dive deep into the world of AI tools for image and video generation, discussing how aspiring filmmakers can use them to validate their ideas, and potentially even secure funding if they get traction. Dave walks me through how he has integrated AI into his movie-making process, and as we talk, we make a short film featuring Nicolas Cage using a haunted roulette ball to resurrect his dead movie career, live on the show.

Say goodbye to basic slide presentations. Gamma uses cutting-edge AI to revolutionize how we share ideas. It's not just a tool; it's a creative assistant, enabling you to create visually captivating content quickly and seamlessly. Whether you're presenting to a small team or a large audience, Gamma ensures your ideas shine across all devices and platforms. Experience the difference for free!

This episode is a must-watch for creative people interested in bringing their stories to life, movie buffs, and anyone curious about the future of creativity. Here’s a taste:

  • Expanding the horizon of who can make films. Making movies has traditionally been a risky, expensive business reserved for rich industry insiders. Dave believes AI is democratizing the process, empowering him to visualize his sci-fi short and “test the waters with these techniques.” “[E]veryone has a chance to create something incredible,” he says. 
  • Voice all your characters using AI. AI filmmaking tools are also empowering because they allow creators to dub full movies by themselves. Dave, who is Black, used ElevenLabs’s new speech-to-speech feature to voice all the characters in his sci-fi short. He explains that “the white judge in the movie was my voice...and the woman, the mother, was my voice.” 
  • Boost AI clips by altering frame rates. These video-generation tools are still limited by the maximum length of the clips you can generate. To counter this, Dave uses Topaz Labs to alter the “frame rate” (in simple terms, the frequency at which consecutive images are displayed in the video) of the clips he generates with Runway. “[W]hat I'm able to do is make clips last longer and then mix that with quicker cut clips, so it actually makes the…filmmaking more like something we're used to seeing on TV,” he says.
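Dave does all of this inside Topaz Labs’ interface, so there’s no code involved on the show. For readers who want to experiment, here is a minimal sketch of the same retime-then-interpolate idea using ffmpeg instead (a different tool); it assumes ffmpeg is installed locally, and the file names, slowdown factor, and frame rate are placeholders.

```python
# Sketch: stretch a short AI-generated clip and smooth it with frame interpolation.
# This approximates the "alter the frame rate" idea with ffmpeg, not Dave's actual
# Topaz Labs workflow. Assumes ffmpeg is installed and on PATH; names are placeholders.
import subprocess

def stretch_clip(src: str, dst: str, slowdown: float = 2.0, target_fps: int = 24) -> None:
    """Slow a clip down by `slowdown`x, then motion-interpolate back up to `target_fps`."""
    vf = (
        f"setpts={slowdown}*PTS,"                      # stretch timestamps: 2.0 doubles the duration
        f"minterpolate=fps={target_fps}:mi_mode=mci"   # synthesize in-between frames
    )
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-vf", vf, "-an", dst],
        check=True,
    )

if __name__ == "__main__":
    # e.g. turn a 4-second Runway generation into a smoother 8-second shot
    stretch_clip("runway_clip.mp4", "runway_clip_stretched.mp4")
```

Stretching the clip first and then interpolating keeps the motion smooth, and it is cheap enough to try on every generation before deciding which ones to mix with quicker cuts.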

For the last segment of the interview, we use a custom GPT that Dave has fine-tuned on his personal creative style to make a movie. The GPT, called BlasianGPT to represent Dave’s Black and Asian heritage, is a text-to-image generator.
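BlasianGPT lives inside ChatGPT, so Dave works with it conversationally rather than through code. For readers who want a rough programmatic equivalent, here is a minimal sketch using OpenAI’s Images API; the style preamble below is an invented stand-in for whatever instructions Dave has actually given his GPT.

```python
# Minimal sketch of a "personal style" text-to-image step via OpenAI's Images API,
# as an analogue to a custom GPT. The STYLE string is a made-up stand-in, not
# BlasianGPT's real instructions. Requires the `openai` package and OPENAI_API_KEY.
from openai import OpenAI

client = OpenAI()

STYLE = (
    "Cinematic sci-fi still, anamorphic framing, moody practical lighting, "
    "visible film grain."  # hypothetical house style
)

def generate_frame(idea: str) -> str:
    """Turn a rough, brass-tacks idea into a single generated frame; returns the image URL."""
    result = client.images.generate(
        model="dall-e-3",
        prompt=f"{STYLE} {idea}",
        size="1792x1024",  # wide, screen-like aspect ratio
        n=1,
    )
    return result.data[0].url

if __name__ == "__main__":
    print(generate_frame("An aging movie star finds a haunted roulette ball in a pawn shop."))
```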

  • Custom GPTs as a personalized source of inspiration. Dave uses his custom GPT, BlasianGPT, because it curates image styles based on his tastes, taking on the role of a “mentor.” “[W]hat you're able to do, which is incredible, is really fine tune it to the type of imagery you like to create. And you don't have to pigeonhole it…you can make it as vast as you want,” he explains.
  • Generating initial images from rough ideas. Dave’s first move is to type out our rough thoughts and ideas as a “brass-tacks, simple” text prompt for BlasianGPT, suggesting that we “refine [the image BlasianGPT generates] from there.” 
  • Leverage AI to brainstorm. The AI-generated image inspires us to explore a different path, and Dave prompts BlasianGPT to create the next scene. “I typically like to say, okay, so what's happened before this or what happens after this?” he says. We also ask GPT about the backstory of objects in these images to give us ideas.
  • Use short prompts to get rich images. We want a different camera angle for one of the images BlasianGPT generated, and Dave suggests using a short description for the prompt to get “the nice kind of frames.”
  • Tell compelling stories with GPT. As we prompt BlasianGPT to generate the next frame, we find ourselves constructing a story, reminding Dave of the advantage text-to-image GPTs have over Midjourney, which doesn’t have a chatbot: “[Y]ou're telling a story together and you're really detailed in your narration…It's almost like you're sitting in a campfire by this person,” he explains.
  • Generating Midjourney prompts with ChatGPT. After making a few frames with BlasianGPT, we want to see how Midjourney would compare. Dave recommends asking GPT to generate a prompt for Midjourney. “I would ask them, you know, give me a detailed prompt to use in Midjourney,” he says. (A rough programmatic sketch of this move follows the list.)
  • Customize images in Midjourney. After using BlasianGPT’s Midjourney prompt to generate “intriguing” images “right out of a movie,” we tweak them as Dave typically would, by upscaling the image to make it bigger and clearer, and choosing the “vary strong” option to create more noticeably different variations.  
  • Bring images to life with Runway. We input the images generated by BlasianGPT and Midjourney into Runway, and Dave uses Motion Brush, a new tool that allows users to “select a brush and color over areas that you want to move and then you're able to tell it what kind of movement you want to have,” to bring them to life. 
  • Maximizing Runway’s potential across varied imagery. For images that don’t have an obvious motion element, Dave recommends playing with the app’s “camera controls” and “motion control.” He explains that this is a more “traditional way to use Runway,” and you can also prompt it using a text description.
  • Generate five variations of each frame. Dave’s rule is to create at least five different variations of each frame. “I talked to a lot of people and they're like, ‘How does your stuff look like that? I go on Runway, my stuff looks like crap, it's all warpy.’ I was like, how many generations do you do? ‘Well, just one.’ You gotta do a bunch,” he says.
  • Refining AI-generated clips. After creating these variations, Dave usually selects the best elements of each while producing the final product. “I layer them and I take parts from each clip that I like better,” he explains. 
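As noted in the “Generating Midjourney prompts with ChatGPT” item above, the prompt hand-off happens inside the ChatGPT interface on the show. Here is a minimal sketch of the equivalent move with OpenAI’s Chat Completions API; the model name and system prompt are assumptions, not what Dave actually uses.

```python
# Sketch of asking a GPT to write a detailed Midjourney prompt from a rough scene idea.
# Uses OpenAI's Chat Completions API; model name and system prompt are assumptions.
from openai import OpenAI

client = OpenAI()

def midjourney_prompt(scene: str) -> str:
    """Rewrite a rough scene idea as a detailed Midjourney-style prompt."""
    response = client.chat.completions.create(
        model="gpt-4o",  # stand-in; any capable chat model works
        messages=[
            {
                "role": "system",
                "content": (
                    "You write detailed Midjourney prompts: subject, setting, lens, "
                    "lighting, mood, and aspect ratio flags like --ar 16:9."
                ),
            },
            {
                "role": "user",
                "content": f"Give me a detailed prompt to use in Midjourney for: {scene}",
            },
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(midjourney_prompt("Nicolas Cage discovers a haunted roulette ball backstage at an awards show."))
```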

You can check out the episode on X, Spotify, Apple Podcasts, or YouTube. Links and timestamps are below:

Timestamps:
  1. Introduction 01:33
  2. How AI is enabling everyone with a laptop to be a filmmaker 10:19
  3. The new tool set for making AI films 14:30
  4. How to make your AI-generated clips stand out 16:56
  5. The first prompt in Dave’s custom text-to-image GPT for our movie 25:00
  6. The big advantage text-to-image GPTs have over Midjourney 37:58
  7. The best way to generate Midjourney prompts with a GPT 44:13
  8. Animating the images for our movie in Runway 49:10
  9. First look at our movie! 53:42
  10. How Dave thinks about animating images without an obvious motion element 58:22
  11. Why you need to be persistent while working with generative AI 59:46

What do you use ChatGPT for? Have you found any interesting or surprising use cases? We want to hear from you—and we might even interview you. Reply here to talk to me!

Miss an episode? Catch up on my recent conversations with founder, author, and neuroscientist Anne-Laure Le Cunff, a16z podcast host Steph Smith, OpenAI developer advocate Logan Kilpatrick, clinical psychologist Dr. Gena Gorlin, economist Tyler Cowen, writer and entrepreneur David Perell, software researcher Geoffrey Litt, Waymark founder Nathan Labenz, Notion engineer Linus Lee, writer Nat Eliason, and Gumroad CEO Sahil Lavingia, and learn how they use ChatGPT.


The episode transcript is for paying subscribers.


Thanks to Rhea Purohit for editorial support.

Dan Shipper is the cofounder and CEO of Every, where he writes the Chain of Thought column and hosts the podcast How Do You Use ChatGPT? You can follow him on X at @danshipper and on LinkedIn, and Every on X at @every and on LinkedIn.


Thanks to our Sponsor: Gamma

Thanks again to our sponsor Gamma, the anti-PowerPoint.

Gamma is a breath of fresh air, powered by AI, that makes it possible for you to focus on your ideas and not on formatting. It's time to make every presentation count, without the extra effort.

