
The transcript of How Do You Use ChatGPT? with Dave Clark is below for paying subscribers.
This episode pairs well with my review of OpenAI’s new text-to-video model, Sora. In the piece, I cited Dave’s experience to argue that AI will change filmmaking by making movie concepts cheaper to test, making big-budget movies cost less, and leading to the emergence of a new art form. Here’s what I wrote:
“Dave Clark is a traditional filmmaker who has started to make AI-generated videos. He recently produced a sci-fi short called Borrowing Time that was inspired by his father’s experiences as a Black man in the 1960s. He produced it entirely using Midjourney and Runway to generate images and videos. He narrated the movie himself and used ElevenLabs to turn his voice acting into the voices of different characters.
Borrowing Time went viral, and Dave told me he wouldn’t have been able to make it without AI. It would’ve been impossible to get a sci-fi short like his funded by a traditional Hollywood studio. But now that it’s out, and popular, he says that he’s fielding interest from top-tier Hollywood studios who would like to make it into a full-length movie.
This anecdote speaks volumes about the way that AI models like Sora will change filmmaking in the future.”
Check out the full piece.
Timestamps:
- Introduction 01:33
- How AI is enabling everyone with a laptop to be a filmmaker 10:19
- The new tool set for making AI films 14:30
- How to make your AI-generated clips stand out 16:56
- The first prompt in Dave’s custom text-to-image GPT for our movie 25:00
- The big advantage text-to-image GPTs have over Midjourney 37:58
- The best way to generate Midjourney prompts with a GPT 44:13
- Animating the images for our movie in Runway 49:10
- First look at our movie! 53:42
- How Dave thinks about animating images without an obvious motion element 58:22
- Why you need to be persistent while working with generative AI 59:46
Transcript
Dan Shipper (00:00:00)
Just make a movie together.
Nicolas Cage's movie career was over a couple years ago. He just wasn't popular anymore. And now he's sort of back, like he's in a couple movies now. It seems like he went into the underground, he had this Book of the Dead, and he opened it, and he made a deal with the ghosts that came out, and that's how he resurrected his career.
This is amazing.
Dave Clark (00:00:21)
This will be fun to animate. We’re going to have to animate these ones.
Motion Brush is a new tool that was released by Runway that’s really cool. You're able to select a brush, and then you color over the areas that you want to move, and then you're able to tell it what kind of movement you want to have.
Ooh. Nic Cage kind of leans in too.
Dan Shipper (00:00:41)
That is so good!
Hey, I want to just take a quick break from this episode to tell you about a podcast you should know about. It’s hosted by Nathan Labenz, who was actually on this show and did one of the most popular episodes we’ve ever had about how to use AI as a copilot for the mind.
Nathan hosts a podcast called The Cognitive Revolution, where he interviews builders at the edge of AI and discusses the profound shifts that it will unlock over the next decade. He talks to the researchers and entrepreneurs who are shaping this moment, like Logan Kilpatrick of OpenAI, who I interviewed for this show as well, Eugenia Kuyda of Replika, and Suhail Doshi of Playground.
To listen to the show, search Cognitive Revolution in your podcast player. Alright, now let’s get back to How Do You Use ChatGPT?
Dan Shipper (00:01:30)
Dave, welcome to the show.
Dave Clark (00:01:32)
Thanks, Dan, for having me.
Dan Shipper (00:01:33)
Of course. So, for people who don't know, you are a film director and a commercial director. You work with a bunch of big brands like HP, Intel, and Google. You've been a film director for a long time in the traditional movie industry and you are also a prolific AI filmmaker.
You had a short film recently that came out called Borrowing Time, which went really viral. It's based on your father's life and I saw it and I was like, oh, my God, this is actually good. And I just wanted to have you on the show to talk about how you do that.
Dave Clark (00:02:07)
No, that's awesome. And yeah, again, thanks for having me. Yeah, we can dive into that short and how it came out of my father's upbringing—it came from a personal place. So it's cool to see something that's not like Harry Potter or Star Wars go viral.
Dan Shipper (00:02:24)
Yeah, it's really great. Okay. So tell me, tell me about Borrowing Time. Like, what is it about? How is it conceptualized? Let's start there.
Dave Clark (00:02:31)
Yeah. So it was actually a story that my father told me when I was growing up. I'm mixed—my mother's Korean, my father's Black. He's in his seventies now, so he grew up during the segregation era and Jim Crow laws and stuff like that. He was telling me the story about when he was, I think, 12 or 13, he had winked or whistled at a white woman inside of a convenience store. And he got in real severe trouble for it. He didn't go to prison, but I think the following year or something, someone was either murdered, or went to prison, or got life in prison for—what people believe falsely—having something to do with the disappearance of a white woman. So it just goes to show—wrong place, wrong time. That could have been my father that went to jail for the rest of his life. And then, because I'm such a huge sci-fi nerd, I was like, man, what if my father had ended up going to jail, and I was a lawyer in the present day who was able to go back in time and represent him in court? That's a cool story. I haven't ever seen that before. And the power of AI allows me to visualize it, because if you try to go pitch Hollywood, they're gonna say, eh, period piece, eh, no, been there, done that. And, plus, you're not Steven Spielberg, so, sorry, you're not going to make that movie. But because of AI, I'm able to visualize it in a really cool way—and that got on Forbes. That's a little story, a sci-fi period piece that some executives might yawn at in a room if you're trying to pitch it, but because you can visualize it, they might look at it differently.
Dan Shipper (00:04:16)
No, that is really interesting. I was going to ask you why you made this with AI, but the answer is pretty obvious. Do you think of this as a way to get those people or funders interested—is it a leg into making a traditional movie with this concept? Or do you think of it as a totally separate thing with a separate arc, and you're kind of just focused on making AI stuff?
Dave Clark (00:04:37)
I mean, because I'm a traditional filmmaker first, absolutely. Everything that I create is an IP, an extension of the creative ideas that I have in a notebook. And I will say that, yes, this—and the one that I did called Dismal Swamp, I don't know if you saw that one—has gotten the attention of pretty A-level producers and execs in Hollywood. So it works, right? The purpose of it, for Dismal Swamp, was to create a little one-minute sizzle or rip-o-matic, if you will, using AI-generated footage. Traditionally in Hollywood, a lot of directors will take pieces from other directors' movies and cut them together like a sizzle to pitch an idea. But with AI, I took my script, fed it into the prompts, and made it based on all the stuff that was in my head. I couldn't do that five years ago, three years ago. And, I mean, that's incredible.
Dan Shipper (00:05:29)
What you're making me think of is, this has actually happened before in writing, for example—but with tweets. People started tweeting, and that was a way to test out what articles you would write, or the articles that you write on your blog are tests for books. And they're super cheap. You can do them pretty quickly, much more quickly than you can write an entire book, which takes years. And there's no gatekeeper. You just throw it online, and if it works, then maybe you get the book deal or whatever, or maybe you just self-publish because you don't need the gatekeeper anymore. And it strikes me that that wasn't as available in film. It probably has been available to some degree in indie filmmaking, because you can make a lot of stuff with just a camera and yourself, but there's a lot of stuff you just can't do, because you don't have actors and special effects or whatever. But it strikes me that this new set of tools makes that kind of thing available for a wider range of short films that can act as precursors to larger, better-funded projects—in the same way that tweets or blog posts are precursors to books, these could be precursors to films, is kind of what I'm getting from you. And I never thought of that, and I think that's really amazing. And by film, I mean feature-length movies.
Dave Clark (00:06:53)
Yeah. I mean, that's a great point. A prime example: You might've seen a hybrid live-action AI film that I was working on called Another, which went viral on Christmas Day. That's an idea of what I think is the immediate next step with AI in Hollywood—live action, mixed with a Stable Diffusion model to create visual effects. And we're in the middle of it—I'm actually going to be releasing that at the Runway AI Film Festival at the end of the month, so I'm rushing to the finish line to get the effects done. And I'm not going to lie, it's not easy. It's been a nightmare trying to figure out how to make Stable Diffusion look like and work with footage that you shot on an Arri or a Sony Venice—it needs to match the fidelity. And I'm working with one of what I think are the top VFX supervisors in Hollywood—he did John Wick 3, he did The Conjuring—and he's helping me do the visual effects using the Stable Diffusion setup. So again, it's all a learning process and we're figuring it out as we go, but it's still very exciting. And exactly what you said: We're able to test the waters with these techniques. And horror is perfect for that type of thing. If you think about Blair Witch Project or Paranormal Activity, it's always a horror film that creates a new subgenre sometimes. So it’s interesting.
Dan Shipper (00:08:13)
That's really interesting. I didn't know that. What subgenre did The Blair Witch Project create?
Dave Clark (00:08:18)
I mean, it made the found footage genre huge, right? There was like maybe one or two films before it that no one ever saw. But when Blair Witch came out, it was like, oh, that was a thing. And then obviously that led to Paranormal Activity.
Dan Shipper (00:08:30)
And what is found footage—is it pre-existing footage from some other project or something else?
Dave Clark (00:08:33)
Yeah, it's kind of that handheld style. There's movies like V/H/S that came out where—yeah, it has a DV-tape look on it. When I think about found footage, it's always like a Blair Witch or a Paranormal Activity because these tapes were collected by authorities and they edited it together.
Dan Shipper (00:08:49)
I see. So found footage isn't necessarily pre-existing film from somewhere else—it just looks like it was collected by someone else and wasn't intended to be shown.
Dave Clark (00:09:02)
Exactly. Yeah. That's the style of it. Yeah. They try to keep the realism.
Dan Shipper (00:09:05)
That’s really interesting. Okay. I love that. And I guess I have to ask the obvious question, which is: There's so much backlash, I think, particularly in a community that you're a part of and really familiar with, which is traditional filmmaking—a lot of worry about AI tools and how they will change who gets to make films, or what films get made, or who gets paid. There's just a lot of questions. And I'm curious how you decided to jump into this, versus not.
Dave Clark (00:09:34)
Yeah, no, again, I think it was a little bit of necessity. It was a little bit of seeing a lane. Being a person of color—I'm not going to lie. I mean, of course, there's brilliant Black filmmakers and female filmmakers that are winning or getting nominated for Oscars—or should be getting nominated for Oscars. But there's still a small number of people, especially in the sci-fi horror space. When you're a Black person and you say, I want to be a horror director, it's: oh yeah, like Jordan Peele. That's what everyone always says. So that just tells you that there's not enough. And, for me, my argument to anyone is—and I have big-time film director friends and writers, and they're starting to come around, starting to see the positivity in it and how it could really enhance things and give storytellers opportunities they might not otherwise have. I think about it: If you're a white kid from Arkansas who loves making films in your backyard, but you don't have the connections, you have a chance. If you're a Black kid from the hood—that's how I grew up—you have a chance. Everyone has a chance to create something incredible. And that's what's really cool to me. And then, I was able to speak to SAG directly, which was an interesting moment. I was talking to the board of directors through my friend Rob—he was giving a webinar on the difference between CGI and AI, ‘cause there was a whole thing about digital doubles and, is AI going to replace me as an actor? And while I can't truly answer that and really know the answer, I do know that AI is going to be used kind of how CGI is—you'll use AI to create large stadiums of crowds and digital doubles of people, but it's all going to be with consent, and I believe people should get paid for their likeness and all that. So, to circle back on your bigger question: I think AI is the best opportunity for anyone who wants to become a creator or filmmaker or artist and create something that can be seen on a large scale—you have a great chance if you use some of these AI tools.
Dan Shipper (00:11:38)
I love that. I mean, it's so interesting ‘cause it's so in line with things that I've felt and seen just in different areas of the world. And I didn't realize how directly it applied to filmmaking, but it obviously does. One of the things I see a lot is AI changes who can make software. It dramatically expands the territory of people who are able to build stuff because it can program for you. And I think the same thing seems to be true here. The cost and the level of skill needed to get started—just making a very, very small video clip—just went down tremendously. And obviously it requires a lot of skill to build that into something that people wanna see. But it doesn't require that much money. And for me—we were talking about this in the production call before we started this show—I so wanted to make movies when I was a kid. I had this Lego set—this Lego Steven Spielberg Jurassic Park thing that came with a camera, and you could have the little actors, and it was almost a stop-motion animation thing. And I was so into it, but you couldn't make anything good. And so then I got into 3D modeling and I was trying to make a Pixar-type thing, but that didn't really work 'cause, at that time, you had to be a fricking genius to make an actual 3D animated movie in the mid-2000s. And so then I started making software, because software—especially business software—it's just easy to make forms that people fill out and pay money with a credit card, which I love. I mean, it's really an amazing part of my life. But honestly, I think if AI filmmaking tools had been available, I would probably be making movies. And I think it's so amazing that now anyone can just go make a movie and it doesn't require that much money. It's so cool.
Dave Clark (00:13:34)
Yeah, I mean, especially—I have young children and they're artists and they have great imaginations. I was just using ChatGPT the other day to come up with a bedtime story on the fly for my son. He's obviously big into Spider-Man and Venom. And so I created a story really quickly—like, help me create a story with my son, who's five, and he wants to work with Venom to fight this guy. And it did it really fast. And it was 9 p.m., so I was, on the fly, like, how can I get my kids to sleep? And he loved it, man. But it's that little thing right there—you couldn't do that before. You would just struggle to get your kid to bed, but now they can go to bed happy, and you can create any kind of story you want, even for small things like that, which is cool.
Dan Shipper (00:14:17)
That’s so cool. I guess I'm curious: What would you consider the new tool set for making AI films? And how does that differ from the traditional one? What tools and programs are part of the new workflow that you're working off of?
Dave Clark (00:14:34)
Yes. It's interesting because, honestly, from what I'm seeing personally, I'm still using a lot of the old tool set, but I'm now figuring out how to work it into this new pipeline—this new production workflow. Stable Diffusion has been out for years, but now I'm able to hone it and use it for a specific purpose—what I think would work for live action. Same thing with animation, right? You have AnimateDiff. You have stuff like Warp Fusion, which a lot of people are using to create some amazing animations. And I honestly think the first feature-length film will be an animated film, a Pixar-type film, completely done in AI. We might even get the trailer for it this year. That's how fast this stuff is moving, ‘cause I can totally see any of these diffusion models allowing for it. And the thing is, the guys who really know how to use it, they're not storytellers first. A lot of them are just tech wizards. But when they start getting together with the mes of the world and the other kind of people who have stories, it's going to be game over. I mean, you're going to have so many amazing things coming out, and I'm super excited for that.
Dan Shipper (00:15:40)
I love it. I love it. One of the things I'm really fascinated by is how the new powers and the new limitations of a new tool shape the form of the art or the work that can get made. And I think there's lots of examples like this through history. And I'm kind of curious about that for the AI film age. I'm noticing things like, for example, it seems like Runway, which is the main way to generate a clip—take an image and turn it into a video clip—only generates two or three seconds at most, something like that. And so I'm seeing a lot of these AI films, including yours, have these couple-second shots, and then it cuts to another one and it cuts to another one. And maybe it's also harder to keep the characters continuous through each different shot. So that pushes the kinds of stories you can tell in a specific storytelling direction, and it creates rules for what the medium is. And I'm sure it's still really early and these tools are going to get better, but I'm kind of curious: What are the rules for what you can make with AI right now?
Dave Clark (00:16:50)
No, that's a great question. Yeah, there are limitations, right? I think four seconds is Runway's limit. Currently, with Pika Labs, you can extend it up to 15 seconds, I believe. But that's uncharted territory, because you're going to lose some fidelity a lot of the time. I use a tool called Topaz Labs. And what I do is I bring in a Runway clip. I not only export it at 4K, or sometimes 8K, but I'll also change the frame rate. So if you get a 24-frames-per-second clip, I'll turn it into 60 frames per second, sometimes 120. And then you can extend the clip. So for my sci-fi—Borrowing Time is a great example, the scene where—Oppenheimer was a huge reference—just the way the time travel scenes were happening and the lights were kind of whipping around—I did a high-frame-rate generation, and then I slowed it down in post and then sped it up. So what I'm able to do is make clips last longer and then mix that with quicker-cut clips. So it actually makes the storytelling and filmmaking more like something we're used to seeing on TV. I always think about the Tony Scott-type films, like Man on Fire, where there were always these quick cuts, or Snyder with 300 and Dawn of the Dead. Those are some of my favorite films, but I also love how they did the editing, and editing is a huge piece of my type of filmmaking with AI—I'm able to tell a story and not have to use just three-second clip after three-second clip. Sometimes you'll get an eight-second clip because I slowed it down, then you'll get a one-second clip. And I use that cadence to help tell it.
Dan Shipper (00:18:28)
I see. So I think what you're saying is—I kind of missed the tool that you said at the beginning, maybe Topaz Labs. Is that the thing that is changing the frame rate so it makes the clip slower or faster? Yeah. So basically, I guess what you're saying is, yes, there are these specific limits, like Runway only exporting four-second clips, but you're doing different things to lengthen or shorten the clips so that it's not just the same length of cut. And there's actually a history that you're pulling from, like 300 and Man on Fire, where directors of traditional feature films are doing something similar—it's probably not exactly the same, but you're taking inspiration from that, and you can push the AI to do something that is inspired by it.
Dave Clark (00:19:13)
Absolutely. I think that maybe helps my stuff stand out a little bit more sometimes, because it's not just three-second clip after three-second clip after three-second clip. Another thing is there are tools like ElevenLabs where you can generate AI voices, and some of them are incredible, but a lot of people tend to use the ones that sound like AI, right? But now they have this whole speech-to-speech, which I used in Borrowing Time. Actually, I think it might have been in beta when I used it, because they just announced it, but I can talk and act out how I want—the white judge in the movie was my voice. It was just voice-to-voice using ElevenLabs, and the woman—the mother—was my voice. And then obviously the voiceover was just my plain voice, 'cause I just thought it sounded better and more natural to have the natural pauses and things like that. So it makes sense together.
Dan Shipper (00:20:04)
That’s really cool. Okay. So I didn't even realize that that was a thing. So basically, you acted the whole movie yourself. And you used ElevenLabs’ voice-to-voice model to change it into different characters. I didn't know that that was a thing. That's so cool.
Dave Clark (00:20:20)
Yeah, dude, it's awesome. I think about animated films—you could do all the weird characters. Just ‘cause sometimes—it's not perfect, but you kind of hear it in your head, and for the shorter films, it's like, why not, right?
Dan Shipper (00:20:32)
Yeah. That's amazing. Okay, cool. Well, a lot of this show is us seeing how people use ChatGPT and using it together. And what we planned to do is just make a movie together. It probably won't be a full movie, but we'll make a clip from a movie together and sort of see what this process is like. And I think what's really interesting is you're actually using ChatGPT—and particularly a custom GPT—to help you ideate. I know this is sort of a new part of the process, but it's something that you're doing. And so maybe you can start by telling us what this GPT is and how you use it. We can start there and then we can go into exploring making movies.
Dave Clark (00:21:18)
No, absolutely. And I'm still heavily in Midjourney, and sometimes Stable Diffusion, for a lot of my image outputs, but I just fell in love with this idea of creating a version of my own text-to-image generator. I'm obviously using the DALL-E platform and building on top of it. But what you're able to do, which is incredible, is really fine-tune it to the type of imagery you’d like to create. And you don't have to pigeonhole it to just live-action-type imagery—you can make it as vast as you want—but you're able to almost create a mentor out of it. So it's a combination of my opinions, a combination of maybe what Steven Spielberg might look for in imagery, what Ridley Scott might look for in imagery. You can use ChatGPT and tell it to be anything, but you can constantly tune it, and I'm still tuning it. It's still new, so it's not perfect, but it's good to me. It works for me because I do have a certain look to a lot of my stuff, and DALL-E traditionally hasn't given me the outputs that I need versus Midjourney. But because I'm starting to really fine-tune it now, I'm starting to get imagery that's like, wow, that's on par with what I'm getting from a Midjourney or Stable Diffusion. That's pretty cool. So that's where I'm at now. Still toying with it, but I mean, what an incredible tool. It’s crazy.
Dan Shipper (00:22:38)
Cool. Should we demo that and maybe start there and then we can—so you call this Blasian GPT, which is hilarious. How do you start with this? When are you jumping into it?
Dave Clark (00:23:02)
So, yeah, I mean, I jump into it right away because obviously it's a text-to-image generation tool, but I can also talk to it to get ideas, I can talk to it like it's a mentor. So, yeah, let's see. What would you like to prompt?
Dan Shipper (00:23:16)
Well, let's think about it. Okay, so what kind of movie do we want to make? Wow, I've got to think about that.
Dave Clark (00:23:24)
That's a good question.
Dan Shipper (00:23:25)
So in a previous episode of the show, we built this game with ChatGPT called The Allocator. And the game was basically this: You could play the president of any historical era. It starts with inauguration day, and you get to make decisions as the president—you basically set the budget for the U.S. government. And then the game plays out your decisions. So you play JFK during the ‘60s and you decide whether to fund going to the Moon or not—all that kind of stuff. And we could do something that's sort of based on that. One of the interesting things about that game is we had a lot of Nicolas Cage cameos, because ChatGPT generated this image for the GPT that just looked so much like National Treasure. It was giving us National Treasure vibes. And we were like, I guess there's going to be some Nic Cage cameos in this game. So I don't know if that gives you anything—any ideas—but maybe we can throw that in there and see what comes out.
Dave Clark (00:24:40)
Yeah, Nicolas Cage is always—there's always awesome, epic imagery whenever you prompt him. We could—I mean, yeah. Give me an idea. Let's see what happens. We'll just start there.
Dan Shipper (00:24:53)
Okay. I want to do Nicolas Cage getting sworn in as president, and ideally it's raining and it's sort of dark. It's gritty—noir vibes, maybe. And I don't know if that is your aesthetic, so you can tell me, and maybe the GPT will correct me if I'm wrong. I'm trying to think about in what situation Nicolas Cage would ever become president. And I guess, given who our previous president was, it's maybe not that surprising. But, interesting. So you're saying, “Create an image of Nicolas Cage getting sworn in as president. The image should be cinematic and gritty in the style of the David Fincher film Se7en.” Cool.
Dave Clark (00:26:00)
I think that could help, right?
Dan Shipper (00:26:02)
Alright. That could help. Do you think we should add anything else into the scene, aside from him getting sworn in as president, or would you just start with something like this? How would you do it?
Dave Clark (00:26:12)
I kind of start with just simple brass tacks, and then we refine it from there, or add the—you've seen those things on Instagram: add this in, now do this, now do that. Yeah. So let's see. And this is the first time Blasian GPT has ever been seen publicly—it's Blasian because I'm Black and Asian, by the way. That's what we call ourselves. Here we go. Never before seen. I don't know what it's going to do.
Dan Shipper (00:26:37)
We're doing it live folks.
Dave Clark (00:26:40)
It's called a top-tier image generator. So it better show up.
Dan Shipper (00:26:45)
One interesting thing is you're going right into generating an image. You're not having it sort of ideate with you necessarily and saying, well, what do we do with this? It's just right to the visual. How did you make that decision?
Dave Clark (00:26:58)
Just because I've already trained it on a lot of those conversations. I forgot to do 16:9.
Dan Shipper (00:27:08)
I'm getting Nicolas Cage in Harry Potter. But for some reason we're in the wizard court, like the Wizengamot. So it's not necessarily inauguration vibes, but I'm not mad about it either. It's kind of interesting.
Dave Clark (00:27:26)
It got the Se7en vibe down, right? Yeah. You got the film grain, you got the harsh perfume lighting, which is always funny, because I watched the behind-the-scenes of Se7en and that was what the DP said. He was like, Fincher told me to watch a bunch of perfume ads. We wanted to create that—just like you said, where the whites are really white and the blacks are really black. That's exactly what I did.
Dan Shipper (00:27:47)
Wait. Yeah. What is perfume lighting?
Dave Clark (00:27:52)
That’s related to perfume ads—back in the ‘80s and early ‘90s they were very dramatic, very high contrast, almost felt black and white, but there was always a splash, a wash of color, which is how Se7en looks. But interesting. Okay, so let's build on this. One thing I want to do is make it 16:9.
Dan Shipper (00:28:12)
What is 16:9?
Dave Clark (00:28:14)
That's widescreen. So that's what you're used to seeing—or 2:1. I usually do 2:1 or 16:9, but we'll do 16:9, so it's a little bigger.
Dan Shipper (00:28:18)
Okay. And do you know what the aspect ratio of this one is?
Dave Clark (00:28:22)
This is 1:1. So this is your Instagram square look, which is the default. I forgot to go in, but you can actually change it so that the first image is always 16:9—you don't have to prompt it. Okay. So what changes do we want to make besides that?
Dan Shipper (00:28:57)
Well, I'm curious. I mean, I can certainly give you some input, but I'm curious how you would think about it. Help guide me through how I would think about what changes I might want to make or what I should be thinking about or seeing.
Dave Clark (00:29:55)
Yeah. So I think, for what you want to say, it's missing some core pieces. You talked about it—it doesn't feel very inaugural, doesn't feel like it's in D.C. So I think—where would this take place, typically?
Dan Shipper (00:29:09)
Well, I mean, it's gotta be, kind of, on the Mall, in front of the Capitol, right? I'm pretty sure that's where it happens. I am sort of curious what you think. This is a mistake, obviously, but then also, I'm kind of intrigued. What did Nicolas Cage do to end up in this situation? He looks like he's in a court and—
Dave Clark (00:29:37)
Well, we could go down that rabbit hole.
Dan Shipper (00:29:40)
Yeah, we kinda could. I'm curious. Do you see anything here where you're like—
Dave Clark (00:29:43)
It almost feels like he's in a church. This is some culty underground—what did he get himself into? It kind of gives me John Wick vibes a little bit, too.
Dan Shipper (00:30:59)
Yeah, it's the cult of Cage, and he's got—what's he holding? Is that a Bible, or what's in his hand?
Dave Clark (00:31:06)
Yeah, I mean, we could ask Blasian GPT.
Dan Shipper (00:31:13)
Let's ask.
Dave Clark (00:31:14)
Before we change the aspect ratio—“What is the book Nicolas”—this is fun, I don't really do it this way—“Cage is holding in his hand?”
Dan Shipper (00:31:42)
Okay. Interesting. Okay, ChatGPT's playing it straight with us. It's just the Bible or the Constitution.
Dave Clark (00:31:50)
But what if the book was—it almost reminds me of the book from Evil Dead. Did you ever see that movie?
Dan Shipper (00:31:58)
No. What is it?
Dave Clark (00:31:59)
Evil Dead is that film where the book—that cursed book of death—when you read a thing from it, it causes all these evil spirits to show up.
Dan Shipper (00:31:25)
The transcript of How Do You Use ChatGPT? with Dave Clark is below for paying subscribers.
This episode pairs well with my review of OpenAI’s new text-to-video model, Sora. In the piece, I cited Dave’s experience to argue that AI will change filmmaking by making movie concepts cheaper to test and big budget movies cost less, and leading to the emergence of a new art form. Here’s what I wrote:
“Dave Clark is a traditional filmmaker who has started to make AI-generated videos. He recently produced a sci-fi short called Borrowing Time that was inspired by his father’s experiences as a Black man in the 1960s. He produced it entirely using Midjourney and Runway to generate images and videos. He narrated the movie himself and used ElevenLabs to turn his voice acting into the voices of different characters.
Borrowing Time went viral, and Dave told me he wouldn’t have been able to make it without AI. It would’ve been impossible to get a sci-fi short like his funded by a traditional Hollywood studio. But now that it’s out, and popular, he says that he’s fielding interest from top-tier Hollywood studios who would like to make it into a full-length movie.
This anecdote speaks volumes about the way that AI models like Sora will change filmmaking in the future.”
Check out the full piece.
Timestamps:
- Introduction 01:33
- How AI is enabling everyone with a laptop to be a filmmaker 10:19
- The new tool set for making AI films 14:30
- How to make your AI-generated clips stand out 16:56
- The first prompt in Dave’s custom text-to-image GPT for our movie 25:00
- The big advantage text-to-image GPTs have over Midjourney 37:58
- The best way to generate Midjourney prompts with a GPT 44:13
- Animating the images for our movie in Runway 49:10
- First look at our movie! 53:42
- How Dave thinks about animating images without an obvious motion element 58:22
- Why you need to be persistent while working with generative AI 59:46
Transcript
Dan Shipper (00:00:00)
Just make a movie together.
Nicolas Cage's movie career was over a couple years ago. He just wasn't popular anymore. And now he's sort of back, like he's in a couple movies now. It seems like he went into the underground, he had this Book of the Dead, and he opened it, and he made a deal with the ghosts that came out, and that's how he resurrected his career.
This is amazing.
Dave Clark (00:00:21)
This will be fun to animate. We’re going to have to animate these ones.
Motion Brush is a new tool that was released by Runway that’s really cool. You're able to select a brush and then, you color over areas that you want to move and then you're able to tell what kind of movement you want to have.
Ooh. Nic Cage kind of leans in too.
Dan Shipper (00:00:41)
That is so good!
Hey, I want to just take a quick break from this episode to tell you about a podcast you should know about. It’s hosted by Nathan Labenz, who was actually on this show and did one of the most popular episodes we’ve ever had about how to use AI as a copilot for the mind.
Nathan hosts a podcast called The Cognitive Revolution, where he interviews builders at the edge of AI and discusses the profound shifts that it will unlock over the next decade. He talks to the researchers and entrepreneurs who are shaping this moment, like Logan Kilpatrick of OpenAI, who I interviewed for this show as well, Eugenia Kuyda of Replika, and Suhail Doshi of Playground.
To listen to the show, search Cognitive Revolution in your podcast player. Alright, now let’s get back to How Do You Use ChatGPT?
Dan Shipper (00:01:30)
Dave, welcome to the show.
Dave Clark (00:01:32)
Thanks, Dan, for having me.
Dan Shipper (00:01:33)
Of course. So, for people who don't know, you are a film director and a commercial director. You work with a bunch of big brands like HP, Intel, and Google. You've been a film director for a long time in the traditional movie industry and you are also a prolific AI filmmaker.
You had a short film recently that came out called Borrowing Time, which went really viral. It's based on your father's life and I saw it and I was like, oh, my God, this is actually good. And I just wanted to have you on the show to talk about how you do that.
Dave Clark (00:02:07)
No, that's awesome. And yeah, again, thanks for having me. Yeah, we can kind of dive into that short and then how that was part of my father's upbringing, but yeah, it came from a personal place. So it's cool to see something that's not like Harry Potter or Star Wars go viral?
Dan Shipper (00:02:24)
Yeah, it's really great. Okay. So tell me, tell me about Borrowing Time. Like, what is it about? How is it conceptualized? Let's start there.
Dave Clark (00:02:31)
Yeah. So it was actually a story that my father told me when I was growing up. I'm mixed—my mother's Korean, my father's black. So obviously, he's in his seventies now, so he grew up kind of during the segregation era and Jim Crow laws and stuff like that. So he was telling me the story about when he was I think 12 or 13, he had winked or whistled at a white woman inside of a convenience store. And, he got in real severe trouble for it. He didn't go to prison, but I think that the following year or something, someone was either murdered or went to prison or got life in prison for, what people believe, falsely having something to do with the disappearance of a white woman. So it just goes to show—wrong place, wrong time. That could have been my father that went to jail for the rest of his life. And then the story, because I'm such a huge sci-fi nerd, I was like, man, if that, if my father ended up going to jail, and if I was a lawyer in the present tense, I was able to go back in time and like represent him in court. That's a cool story. I haven't ever seen that before. And, the power of AI allows me to visualize it because if you try to go pitch Hollywood, they're gonna say, eh, period piece, eh, no, been there, done that. And, plus, you're not Steven Spielberg, so, sorry, you're not going to make that movie. But I think because of AI, I'm able to visualize it in a really cool way—and that got on Forbes—that's a little story, a sci-fi period piece that some executives might yawn at it in a room if you're trying to pitch it, but because you can visualize it, they might look at it differently.
Dan Shipper (00:04:16)
No, that is really interesting. I was going to ask you why you made this with AI, but the answer is pretty obvious, but do you think of this as a way to get those people or funders interested? Is it a leg into making a traditional movie with this concept, or do you think of it as a totally separate thing with a separate arc and you're kind of just focused on making AI stuff?
Dave Clark (00:04:37)
I mean, because I'm a traditional filmmaker first, absolutely. Everything that I create, it's an IP, it's an extension of my creative ideas that I have in a notebook. And I will say that, yes, this, and I think the one that I did called Dismal Swamp—I don't know if you saw that one—that has gotten the attention of pretty A-level producers and execs in Hollywood. So it works, right? The purpose of it was to, for Dismal Swamp, it was to create a little one-minute sizzle or rip-o-matic, if you will, using AI-generated footage. If you think about traditionally in Hollywood, a lot of directors will take pieces from other directors' movies and cut them together like a sizzle to pitch an idea. But AI, I took my script, I fed my script into the prompts, and I made it based on all the stuff that was in my head. And couldn't do that five years ago, three years ago. And, I mean, that's incredible.
Dan Shipper (00:05:29)
What you're making me think of is, this has actually happened before in writing, for example. This is done in writing, but it's with tweets, like people started tweeting and then that was a way to test out what articles you would write or your. The articles that you write on your blog are tests for books. And they're super cheap. You can do them pretty quickly, much more quickly than you can write an entire book, which takes years. And there's no gatekeeper. You just throw it online and if it works, then maybe you get the book deal or whatever, or maybe you just self-publish because you don't need the gatekeeper anymore. And it sort of strikes me that that wasn't as available. It probably has been available to some degree in indie filmmaking because you can make a lot of stuff with just a camera and yourself, but there's a lot of stuff you just can't do, because you don't have actors and special effects or whatever. But it strikes me that this new set of tools makes that kind of thing available for a wider range of short films that can act as precursors to larger, better-funded projects in the same way that tweets or blog posts are precursors to books—these could be precursors to films is kind of what I'm getting from you. And, I never thought of that, and I think that's really amazing. And by film, I mean feature-length movies.
Dave Clark (00:06:53)
Yeah. I mean, that's a great point. A prime example is, you might've seen it was like a hybrid live-action AI film that I was working on called Another that went viral on Christmas Day. Okay, so, that's an idea of what I think is the immediate next step with AI in Hollywood is live-action and mixing that with a Stable Diffusion model to create visual effects. And we're in the middle of—I'm actually going to be releasing that, at the Runway AI Film Festival at the end of the month, so I'm rushing to the finish line to get the effects done. And I'm not going to lie, it's not easy. It's been a nightmare trying to figure out how to make Stable Diffusion look like and work with footage that you shot on like an Arri or on a Sony Venice—it needs to match the fidelity. And I'm working with one of what I think is one of the top VFX supervisors in Hollywood—he did John Wick 3, he did The Conjuring, and he's helping me do the visual effects using the Stable Diffusion setup. So again, it's all learning process and we're figuring it out as we go, but it's still very exciting. And exactly what you said: We're able to test the waters with these techniques. And the horror is perfect for that type of thing. If you think about Blair Witch Project or Paranormal Activity, it's always a horror film that creates a new subgenre sometimes. So it’s interesting.
Dan Shipper (00:08:13)
That's really interesting. I didn't know that. What subgenre did The Blair Witch Project create?
Dave Clark (00:08:18)
I mean, it made the found footage genre huge, right? There was like maybe one or two films before it that no one ever saw. But when Blair Witch came out, it was like, oh, that was a thing. And then obviously that led to Paranormal Activity.
Dan Shipper (00:08:30)
And what is found footage is pre-existing footage from some other project or something else.
Dave Clark (00:08:33)
Yeah, it's kind of that handheld style. There's movies like V/H/S that came out where—yeah, it has a DV-tape look on it. When I think about found footage, it's always like a Blair Witch or a Paranormal Activity because these tapes were collected by authorities and they edited it together.
Dan Shipper (00:08:49)
I see. So it's not necessarily prime footage isn't necessarily pre-existing film from somewhere else, but it just looks like it was—it looks like it was collected by someone else and it wasn't intended to be shown.
Dave Clark (00:09:02)
Exactly. Yeah. That's the style of it. Yeah. They try to keep the realism.
Dan Shipper (00:09:05)
That’s really interesting. Okay. I love that. And I guess like I have to ask the obvious question, which is, there's so much backlash, I think, particularly in a community that you're a part of and really familiar with, which is traditional filmmaking and a lot of worry about AI tools and how it will change who gets to make films or what films get made or who gets paid, or there's just a lot of questions. And I'm curious, like how you decided to sort of jump into this, versus not.
Dave Clark (00:09:34)
Yeah, no, again, I think it was a little bit of necessity. It was a little bit of seeing a lane. Being a person of color—I'm not going to lie. I mean, of course, there's brilliant Black filmmakers and female filmmakers that are winning or getting nominated for Oscars—or should be getting nominated for Oscars. But there's still a small amount of people, especially in the sci-fi horror space. If you say, oh yeah, I want to be a horror director. Oh yeah, like Jordan Peele. That's what everyone always says. When you're a Black person and you say, I want to be a horror director. So that just tells you that there's not enough. And I think, for me, my argument to anyone—and I have big time director films and big time film director friends and writers and they're starting to come around, they're starting to kind of see the positivity in it and how it could really enhance and give people opportunities and storytellers opportunities that might not have the opportunity. I think about if you're a white kid from Arkansas, who loves making films in your backyard, but you don't have the connections, you have a chance. If you're a Black kid from the hood—that's how I grew up—you have a chance, everyone has a chance to create something incredible. And that's what's really cool to me. And then, I was able to speak to SAG directly, which was an interesting moment. I was talking to the board of directors through my friend, Rob—he was giving a webinar on just the difference of CGI versus AI, ‘cause there was a whole thing about digital doubles and is AI going to replace me as an actor? And while I can't answer that truly and really know the answer, I do know that AI is going to be kind of used kind of how CGI is—you'll use AI to create large stadiums of crowds and digital doubles of people, but it's all going to be with consent and people, I believe people should get paid for their likeness and all that. So, yes, to kind of circle back on your bigger question is, I think AI is the best opportunity for anyone if they want to become a creator or filmmaker or an artist and create something that can be seen on a large scale, you have a great chance if you use some of these AI tools.
Dan Shipper (00:11:38)
I love that. I mean, it's so interesting ‘cause it's so in line with things that I've felt and seen just in different areas of the world. And I didn't realize how directly it applied to filmmaking, but it obviously does. One of the things I see a lot is AI changes who can make software. Iit dramatically expands the territory of people who are able to build stuff because it can program for you. And I think the same thing seems to be true here. The cost and the level of skill maybe to get started with building just making a very, very small video clip just went down tremendously. And obviously it requires a lot of skill to build that into something that people wanna see. But, it doesn't require that much money. And for me, we were talking about this in the production call before we started this show. And I loved—I so wanted to make movies when I was a kid. I had this Lego set—this Lego, Steven Spielberg, Jurassic Park thing that came with a camera and you could have the little actors and it was sort of almost a stop-motion animation thing. And I was so into it, but you couldn't make anything good. And so then I got into 3D modeling and I was trying to make a Pixar-type thing, but that didn't really work 'cause, at that time, you had to be a fricking genius to like make an actual 3D animated movie in like the mid-2000s. And so then I started making software because like software—especially business software—it's just easy to make forms that people fill out and pay money with a credit card, which I love. I mean, it's really an amazing part of my life. But honestly I think if AI filmmaking tools had been available, I would probably be making movies. And I think it's so amazing that now anyone can just go make a movie and it doesn't require that much money. It's so cool.
Dave Clark (00:13:34)
Yeah, I mean, especially—I have young children and they're artists and they have great imaginations. I was just using ChatGPT the other day to come up with a bedtime story on the fly for my son. He's obviously big into Spider-Man and Venom. And so I created a story really quickly—like, help me create a story with my son who's five and he wants to work with Venom to fight this guy. And it did it really fast. And it was 9 p.m. So I was, on the fly, how can I get my kids to sleep? And he loved it, man. But it's that little thing right there—you couldn't do that before, you would just struggle to get your kid to bed or now they can go to bed happy and you can create any kind of story you want, even for small things like that, which is cool.
Dan Shipper (00:14:17)
That’s so cool. I guess, I'm curious: What would you consider the new tool set for making AI films? And how does that differ from the traditional one? What tools and programs are like part of the new workflow that you're working off of.
Dave Clark (00:14:34)
Yes. It's interesting because I honestly think from what I'm seeing personally is that I'm still using a lot of the old tool set, but I'm actually now figuring out how to work them into this new pipeline—this new production workflow. Stable Diffusion has been out for years, but now I'm able to hone it in and use it for a specific purpose—what I think would work for live action. Same thing with animation, right? You have AnimateDiff. You have stuff like Warp Fusion, which a lot of people are creating some amazing animations. And, I honestly think the first feature-length film will be an animated film, a Pixar-type film, completely done in AI. Might even get the trailer for it this year. That's how fast this stuff is moving, ‘cause I can totally see any of these Diffusion models able to allow it because I mean, and the thing is these guys who really know how to use it, they're not storytellers first. A lot of them are just tech wizards. But when they started getting together with the mes and the other kind of people who have stories, it's going to be game over. I mean, you're going to have so many amazing things coming out and I'm super excited for that.
Dan Shipper (00:15:40)
I love it. I love it. One of the things I'm really fascinated by is how the new powers and the new limitations of a new tool shape the form of the art or the form of work that can get made. And I think there's lots of examples like this through history. And I'm kind of curious about that for the sort of AI film age. I'm noticing things like, for example, it seems like Runway, which is the main kind of way to generate a clip—take an image and turn it into a video clip. It only generates two or three seconds at most, something like that. And so I'm seeing a lot of these AI films, including yours, like you have these couple seconds shots and then it cuts to another one and it cuts to another one. And maybe there's also harder to make the characters as continuous through each different shot. And so the kinds of stories you can tell that pushes you in a specific kind of storytelling direction and it creates rules for what the medium is. And I'm sure it's still really early and these tools are going to get better, but I'm kind of curious what are the rules for what you can make with AI right now?
Dave Clark (00:16:50)
No, that's a great question. Yeah, there are limitations, right? I think four seconds is Runway. Currently, I know Pika Labs, you can extend it up to 15 seconds, I believe. But that's uncharted territory because you're kind of going to lose some fidelity a lot of the time. I use a tool called Topaz Labs. And what I do is I bring in a Runway clip. I not only export it at 4K or 8K sometimes, but I'll also change the frame rate. So if you get a 24 frames per second clip, I'll turn it into 60 frames per second, sometimes 120. And then you can extend the clip. So if it's for my sci-fi, like Borrowing Time is a great example of the scene where—Oppenheimer was a huge reference—but just like the way the time travel scenes were happening and the lights were kind of whipping around—I did a high-frame rate generation and then I slowed it down in post and then sped it up. So what I'm able to do is make clips last longer and then mix that with quicker-cut clips. So it actually makes the storytelling and filmmaking more like something we're used to seeing on TV. I always think about the Tony Scott-type films, like Man on Fire, where there was always these kind of quick cuts or Snyder with 300 and Dawn of the Dead. Those are some of my favorite films, but I also love how they did the editing and I think editing is a huge piece of my type of filmmaking with AI is I'm able to tell a story and not have to use just three-second clip after three-second clip. Sometimes you'll get an eight-second clip because I slowed it down, then you'll get a one-second clip. And I use that cadence to help tell it.
Dan Shipper (00:18:28)
I see. So I think what you're saying is—I kind of missed the tool that you said at the beginning, maybe Topaz Labs. Is that the thing that is changing the frame rate so it makes the clip slower or faster? Yeah. So basically, I guess what you're saying is, yes, there are these specific things like Runway is only exporting four-second clips, but you're doing different things to lengthen or shorten the clips so that it's not just the same length of cut and that there's actually a history that you're pulling from like 300 and Man on Fire where directors of traditional feature films are doing something similar—it's probably not exactly the same, but you're like taking inspiration from that and you can push the AI to do something that is inspired by it.
Dave Clark (00:19:13)
Absolutely. I think that maybe helps my stuff stand out sometimes a little bit more because it's not just three-second clip after three-second clip after three-second clip. Another thing also is there's tools like ElevenLabs where you can generate AI voices and some of them are incredible, but a lot of people tend to use the ones that sound like AI, right? But now they have this whole speech-to-speech, which I used in Borrowing Time. Actually, I think it might have been in beta when I used it because they just announced it, but I can talk and act out how I want—the white judge in the movie was my voice. It was just voice-to-voice, using ElevenLabs and the woman—the mother—was my voice. And then obviously the voiceover was just my plain voice. 'Cause I just thought it sounded better and more natural to have the natural pauses and the things like that. So it makes sense together.
Dan Shipper (00:20:04)
That’s really cool. Okay. So I didn't even realize that that was a thing. So basically, you acted the whole movie yourself. And you used ElevenLabs’ voice-to-voice model to change it into different characters. I didn't know that that was a thing. That's so cool.
Dave Clark (00:20:20)
Yeah, dude, it's awesome. I think about animated films—you could do like all the weird characters. Just ‘cause sometimes—it's not perfect, but you kind of hear it in your head and for the shorter films, it's like, why not edit right?
Dan Shipper (00:20:32)
Yeah. That's amazing. Okay, cool. Well, what I'd love to do is a lot of this show is us seeing how people use ChatGPT and us using it together. And what we planned to do is just make a movie together. And it probably won't be like a full movie, but like make a clip from a movie together and sort of see what this process is like. And I think what's really interesting is you're actually using ChatGPT as—and particularly like a custom GPT—to help you ideate. I know this is sort of a new part of the process, but it's something that you're doing. And so maybe you can start by telling us what this GPT is and how you use it. We can start there and then we can sort of go into exploring making movies.
Dave Clark (00:21:18)
No, absolutely. And I'm still heavily in Midjourney, and sometimes Stable Diffusion, for a lot of my image outputs, but I just fell in love with this idea of creating a version of my own text-to-image generator. I'm obviously using the DALL-E platform and building on top of it. But what you're able to do, which is incredible, is really fine-tune it to the type of imagery you'd like to create. And you don't have to pigeonhole it to just live-action-type imagery—you can make it as vast as you want—but you're able to almost create a mentor out of it. So it's a combination of my opinions, a combination of maybe what Steven Spielberg might look for in imagery, what Ridley Scott might look for in imagery. You can use ChatGPT and tell it to be anything, and you can constantly tune it. I'm still tuning it. It's still new, so it's not perfect, but it works for me, because I do have a certain look to a lot of my stuff, and DALL-E traditionally hasn't given me the outputs that I need versus Midjourney. But now, because I'm starting to really fine-tune it, I'm starting to get imagery that's like, wow, that's on par with what I'm getting from a Midjourney or Stable Diffusion. That's pretty cool. So that's where I'm at now. Still toying with it, but I mean, what an incredible tool. It's crazy.
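[Note: the "tuning" Dave describes lives in a custom GPT's instructions field rather than in code. Here is a hypothetical sketch of what such instructions might look like, based only on his description in this conversation; it is not the actual Blasian GPT prompt.]

```
You are a text-to-image generator and mentor for cinematic filmmaking.
Default to live-action imagery with film grain and high contrast.
Critique and refine ideas the way a director would, drawing on the
sensibilities of filmmakers like Steven Spielberg and Ridley Scott.
Render the first image of a conversation at a 16:9 aspect ratio
unless another ratio is requested.
```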
Dan Shipper (00:22:38)
Cool. Should we demo that and maybe start there and then we can—so you call this Blasian GPT, which is hilarious. How do you start with this? When are you jumping into it?
Dave Clark (00:23:02)
So, yeah, I mean, I jump into it right away because obviously it's a text-to-image generation tool, but I can also talk to it to get ideas, I can talk to it like it's a mentor. So, yeah, let's see. What would you like to prompt?
Dan Shipper (00:23:16)
Well, let's think about it. Okay, so what kind of movie do we want to make? Wow, I've got to think about that.
Dave Clark (00:23:24)
That's a good question.
Dan Shipper (00:23:25)
So in a previous episode of the show, we built this game with ChatGPT called The Allocator. And the game was basically this: You play the president of any historical era. It starts with inauguration day, and you get to make decisions as the president and set the budget for the U.S. government. And then the game plays out your decisions. So you play JFK during the '60s and you decide whether to fund going to the Moon or not—all that kind of stuff. And we could do something that's sort of based on that. One of the interesting things about that game is we had a lot of Nicolas Cage cameos, because ChatGPT generated this image for the GPT that just looked so much like National Treasure. It was giving us National Treasure vibes. And we were like, I guess there's going to be some Nic Cage cameos in this game. So I don't know if that gives you anything—any ideas to go on—but maybe we can throw that in there and see what comes out.
Dave Clark (00:24:40)
Yeah, Nicolas Cage is always—there's always awesome, epic imagery whenever you prompt him. We could—I mean, yeah. Give me an idea. Let's see what happens. We'll just start there.
Dan Shipper (00:24:53)
Okay. I want to do Nicolas Cage getting sworn in as president, and ideally it's raining and it's sort of dark. It's gritty, like noir vibes, maybe. And I don't know if that is your aesthetic, so you can tell me, and maybe the GPT will correct me if I'm wrong. I'm trying to think in what situation Nicolas Cage would ever become president. And I guess given who our previous president was, it's maybe not that surprising. But, interesting. So you're saying, "Create an image of Nicolas Cage getting sworn in as president. The image should be cinematic and gritty in the style of the David Fincher film Se7en." Cool.
Dave Clark (00:26:00)
I think that could help, right?
Dan Shipper (00:26:02)
Alright. That could help. Do you think we should add anything else into the scene, aside from him getting sworn in as president, or would you just start with something like this? How would you do it?
Dave Clark (00:26:12)
I kind of start with just simple brass tacks, and then we refine it from there, or add the—you've seen those things on Instagram: add this in, now do this, now do that. Yeah. So let's see. And this is the first time Blasian GPT has ever been seen publicly, which is Blasian because I'm Black and Asian, by the way. That's what we call ourselves. Here we go. Never before seen. I don't know what it's going to do.
Dan Shipper (00:26:37)
We're doing it live, folks.
Dave Clark (00:26:40)
It's called a top-tier image generator. So it better show up.
Dan Shipper (00:26:45)
One interesting thing is you're going right into generating an image. You're not having it sort of ideate with you necessarily and saying, well, what should we do? It's just right to the visual. How did you make that decision?
Dave Clark (00:26:58)
Just because I've already trained it on a lot of those conversations. I forgot to do 16:9.
Dan Shipper (00:27:08)
I'm getting Nicolas Cage in Harry Potter. For some reason we're in the wizard court, like the Wizengamot. So it's not necessarily inauguration vibes, but I'm not mad about it either. It's kind of interesting.
Dave Clark (00:27:26)
It got the Se7en vibe down right. Yeah. You got the film grain, you got the harsh perfume lighting, which is always funny, because I watched the behind-the-scenes of Se7en, and that was what the DP said. He was like, Fincher told me to watch a bunch of perfume ads. We wanted to create that noir look, just like you said, where the whites are really white and the blacks are really black. That's exactly what he did.
Dan Shipper (00:27:47)
Wait. Yeah. What is perfume lighting?
Dave Clark (00:27:52)
That relates to perfume ads. Back in the '80s and early '90s they were very dramatic, very high-contrast, almost felt black and white, but there was always a splash, a wash of color, which is how Se7en looks. But interesting. Okay, so let's build on this. One thing I want to do is make it 16:9.
Dan Shipper (00:28:12)
What is 16:9?
Dave Clark (00:28:14)
That's widescreen. That's what you're used to seeing—or 2:1. I usually do 2:1 or 16:9, but we'll do 16:9, so it's a little bigger.
Dan Shipper (00:28:18)
Okay. And do you know what the aspect ratio of this one is?
Dave Clark (00:28:22)
This is 1:1. So this is your Instagram square look, which is the default. I forgot to go in, but you can actually change it so the first image is always 16:9—you don't have to prompt it. Okay. So what changes do we want to make besides that?
Dan Shipper (00:28:57)
Well, I'm curious. I mean, I can certainly give you some input, but I'm curious how you would think about it. Help guide me through how I would think about what changes I might want to make or what I should be thinking about or seeing.
Dave Clark (00:29:55)
Yeah. So I think, for what you want to say, it's missing some core pieces. You talked about it: It doesn't feel very inaugural, doesn't feel like it's in D.C. So I think—where would this take place, typically?
Dan Shipper (00:29:09)
Well, I mean, it's gotta be on the Mall, in front of the Capitol, right? I'm pretty sure that's where it happens. I am sort of curious what you think. This is a mistake, obviously, but then also, I'm kind of intrigued: What did Nicolas Cage do to end up in this situation? He looks like he's in a court and—
Dave Clark (00:29:37)
Well, we could go down that rabbit hole.
Dan Shipper (00:29:40)
Yeah, we kinda could. I'm curious. Do you see anything here where you're like—
Dave Clark (00:29:43)
It almost feels like he's in a church. This is some culty underground—what did he get himself into? It kind of gives me John Wick vibes a little bit, too.
Dan Shipper (00:30:59)
Yeah, it's the cult of Cage, and he's got—what's he holding? Is that a Bible, or what's in his hand?
Dave Clark (00:31:06)
Yeah, I mean, we could ask Blasian GPT.
Dan Shipper (00:31:13)
Let's ask.
Dave Clark (00:31:14)
Before we change the aspect ratio—“What is the book Nicolas—” this is fun. I don't really do it this way, ”—Cage is holding in his hand.”
Dan Shipper (00:31:42)
Okay. Interesting. Okay, ChatGPT's playing it straight with us. It's just the Bible or the Constitution.
Dave Clark (00:31:50)
But what if the book was... It almost reminds me of the book from Evil Dead. Did you ever see that movie?
Dan Shipper (00:31:58)
No. What is it?
Dave Clark (00:31:59)
Evil Dead is that film with the book—that cursed book of death—where reading from it causes all these evil spirits to show up.
Dan Shipper (00:31:25)
I love that. It's cool. Yeah. He's holding the cursed Book of Death and he's going to open it. Okay, here's what it is. Here's what it is. Nicolas Cage's movie career was over a couple years ago. He just wasn't popular anymore. And now he's sort of back, he's in a couple movies now. It seems like—and I think that's how he did it. He went into the underground, he had this Book of the Dead, and he opened it, and he made a deal with the ghosts that came out, and that's how he resurrected his career. I feel like that could be it.
Dave Clark (00:31:45)
“Made a deal with the ghost and he resurrected—” This is gonna be absurd. “—resurrected his career. Show me the next scene.”
Dan Shipper (00:32:04)
And so, is that something that you would do—you find an image you like, and then you do the next scene?
Dave Clark (00:32:11)
Yeah. So, this is different here. If I was in Midjourney, you'd get the four variations, and you can go down each variation rabbit hole and do more variations. Because here you get one image, I typically like to say, okay, so what happened before this, or what happens after this? So I'm intrigued. Let's see what we get.
Dan Shipper (00:32:29)
Okay. Let's see what we get. And if you want to try it in Midjourney too with a similar prompt, I’d be kind of curious to see what the differences are.
Dave Clark (00:32:37)
When we get to the end of this story, we'll say, okay, now give us a few prompts to throw into Midjourney. And then we'll take it in there.
Dan Shipper (00:32:50)
Yeah. I'm kind of curious what drove Nicolas to want to do this deal with the devil, right? 'Cause, obviously, he's got enough money—I mean, I assume, right? So, I wonder what was going on in his life before this that created this situation, such that he decided to end up in this dark, cavernous place with the Book of the Dead in his hands. It's an interesting question.
Dave Clark (00:33:20)
Ooh. “The next scene was with a figure resembling Nicolas Cage in the midst of a pivotal supernatural moment, making a deal with the ethereal entities that surround them.” This will be fun to animate. We're gonna have to animate these ones.
Dan Shipper (00:33:40)
I love it. We have to. Look at that. I mean, this is National Treasure, but make it noir, make it dark.
Dave Clark (00:33:48)
This is pretty incredible. And the fact that it's happening live on Blasian GPT available worldwide, maybe one day, is making me look good. Okay. This is pretty cinematic.
Dan Shipper (00:33:58)
This is great. Okay, so now what? Before we go on, just one quick thing: Sometimes people listen to this instead of watching, so—you're already doing it, but the more you can deliver the prompts and say what's being pushed out, the easier it will be for people who are listening.
Dave Clark (00:34:29)
For sure. So the image that we're looking at is "Nicolas Cage in the middle of supernatural ghosts, illustrating atmospheric tension and the blend of the supernatural with the gritty cinematic style reminiscent of David Fincher's work. The room is ancient and shadow-filled, and it becomes the stage for this mythical pact, excuse me, mystical pact, where the boundaries between the realms of the living and the dead blur, marking a dramatic resurrection of his career." This must be what he did: If ChatGPT knows this, this has to be what happened.
Dan Shipper (00:34:56)
I mean, definitely, this is true to life, for sure. And I love your delivery. You're really making it—you got to narrate this for us. I love it.
Dave Clark (00:35:05)
No. Yeah, I have to—did you ever see Mandy, the one he did? It was an incredible film. Oh man, I think that film resurrected his career. But it was a supernatural-type film, so maybe some of this stuff happened.
Dan Shipper (00:35:17)
I mean, maybe Mandy came from the ghosts. Maybe they gave him the idea for it.
Dave Clark (00:35:25)
Okay, let's think of the next place to go. So after this, he's made the deal—what does he have to do next? I feel like he has to go do something crazy, like drive to Vegas and find some relic to finalize the spell.
Dan Shipper (00:35:44)
Interesting. That could be cool. Maybe as part of the deal, you don't just open the book and the ghosts do what you want. You have to complete a mission for the ghosts in order to get them to help you with your Hollywood career. So what would the ghosts want in Vegas?
Dave Clark (00:36:17)
The ghost demands that Nicolas Cage go to—what's a weird haunted kind of hotel in Vegas? The Flamingo Hotel.
Dan Shipper (00:36:16)
Ooh, the Flamingo. I was going to say, what's the pyramid one again? The Luxor. That's got some interesting vibes.
Dave Clark (00:36:24)
Okay, go to Luxor and steal a relic. A cursed relic.
Dan Shipper (00:36:31)
Ooh, a cursed relic.
Dave Clark (00:36:33)
That is hidden where? Underneath—
Dan Shipper (00:36:35)
I mean, I think it's gotta be a special roulette ball—and it's a roulette ball that's currently in use. But really the roulette ball came from—I don't know where it came from. I'm getting some images of the mob, like there's some mob murder, something to do with the roulette ball, but I'm coming up empty. We could also ask ChatGPT—even in another tab—what could possibly be the backstory of the roulette ball, or why the ghosts feel that it's important. But let's see: "Steal a cursed roulette ball that is currently in the hands of the top mob boss." The top mob boss who's—now I'm thinking of Ocean's Eleven, like Terry from Ocean's Eleven. What's his last name?
Dave Clark (00:37:42)
I forgot. Terry—
Dan Shipper (00:37:34)
Ocean's Eleven… Terry Benedict. Or Reuben. Either one. Now we're getting ridiculous.
Dave Clark (00:37:56)
We’ll just say, “Who was Terry Benedict?” Okay. Perfect. “Show this image in 16:9 ratio.”
Dan Shipper (00:38:07)
I kind of want to get a shot of the ball going around the roulette wheel, like the cursed roulette ball in motion.
Dave Clark (00:38:15)
We'll do that. That could be cool. What I love about ChatGPT, which you can't really do in Midjourney—I mean, you can, but it's not the same experience—is you're telling a story together and you're really detailed in your narration, right? It's almost like you're sitting at a campfire with this person and you're literally just making it up. It does remind you of being a kid, right? Picking up a little VHS camera and just saying, let's make something silly. It's really cool. Whoa. That's pretty, pretty incredible imagery.
Dan Shipper (00:38:51)
The banks of the casino. It almost looks like computer monitors, like it's a casino where they're sending him to the moon or something. But it's a casino.
Dave Clark (00:39:03)
Control panel, yeah.
Dan Shipper (00:39:11)
Yeah, that's interesting. I really love the faces, all the kind of pharaoh face imagery. I don't even know if that's in the Luxor or not, but it's pretty cool.
Dave Clark (00:39:20)
Okay, so ChatGPT has a mind of its own sometimes, but let's ask for—okay, let me actually put this in: "From now on, all images will be 16:9 ratio," so I don't have to keep adding that.
Dan Shipper (00:39:27)
Okay. So what's next? Do you want to keep going with this? Do we hop into Midjourney? What's your next step in your process? Now that you've got a couple frames of something that's looking like a story, do you want to keep going and finish the story, or how does it work?
Dave Clark (00:39:46)
We don't have to finish it, but let's do your last shot of the roulette ball. “Show an image of the cursed roulette ball spinning around the roulette wheel on a table in Vegas.”
Dan Shipper (00:40:03)
This is making me think of—there's an episode in the last season of The Sopranos where Tony goes to Vegas and does peyote, and they're playing roulette and he's watching the ball going around. And he's like, it's the same principle as the planets orbiting the sun and the universe or whatever. There's something about roulette that to him was deeply connected to the way the universe works. It just made me think of that for some reason.
Dave Clark (00:40:34)
Yeah, that was such a great show. Okay, so now we're looking at an image that "captures a tense and vibrant moment at the Luxor Casino, where the cursed roulette ball spins around the wheel, surrounded by a crowd of onlookers and gamblers. The scene is set under the casino's opulent Egyptian-themed decor, with dim lighting adding to the atmosphere of mystery." I mean, that's interesting. That's kind of crazy. It looks almost like the painting of Jesus eating—The Last Supper.
Dan Shipper (00:41:07)
Yeah, I kind of feel like we need a closeup of the cursed ball ‘cause I want to see what a cursed roulette ball even looks like.
Dave Clark (00:41:16)
Good idea. Let's get some camera angles in here. "Show us an extreme closeup angle of the—" sometimes if you go with shorter descriptions, it'll get you the nicer kind of frames. "—extreme closeup angle of the cursed roulette ball."
Dan Shipper (00:41:36)
Okay, so you're saying, “Show us an extreme close up angle of the cursed roulette ball.” Alright, I'll read for you as you type, so you don't have to read and type. Perfect.
Dave Clark (00:41:47)
I'm still a two-finger typer, even to this day. So I think maybe then we'll take a few of these prompts into Midjourney and see the difference. And then we'll pick what we like the best and animate a couple shots. This is fun. I gotta do this more often. I never really did this before. It's fun.
Dan Shipper (00:42:04)
It's cool to do it with someone else. 'Cause you're bouncing stuff off of each other and then you're bouncing it off ChatGPT and you just create this thing that you don't even—
Dave Clark (00:42:16)
Whoa! Dude! Good call, good call. Dude, that's epic.
Dan Shipper (00:42:20)
It's like a cursed Pokéball roulette ball.
Dave Clark (00:42:24)
Should you read this description to the audience?
Dan Shipper (00:42:25)
You can read it. You got it.
Dave Clark (00:42:27)
Okay. So, "The image presents an extreme closeup of the cursed roulette ball, showcasing its intricate details and the ethereal glow of ancient symbols and runes etched onto its surface. These mysterious markings suggest a supernatural origin, and the unique texture and sheen under the casino's lights further distinguish it from an ordinary game piece. The closeup captures the eerie beauty and craftsmanship of the ball." I mean, yeah, it really is an intricate design we got going on—Indiana Jones-level stuff.
Dan Shipper (00:42:54)
I love it. I wonder if it knows what is cursed about it. What makes it cursed?
Dave Clark (00:42:57)
Yeah. It's kind of scary to think that there's maybe a curse already written on it that we don't even know.
Dan Shipper (00:43:08)
Maybe ask it: Tell us in what way the roulette ball is cursed and how it relates to the ghosts, and have it give us a couple ideas. Just so we know. I want to know the lore behind this. So you're saying, "Tell us in which way the roulette ball is cursed and how it relates to the ghosts."
Dave Clark (00:43:26)
"The cursed roulette ball, imbued with ancient symbols and glowing runes, serves as a powerful conduit between the realms of the living and the dead. Its curse originates from an old Egyptian ritual designed to harness the energies of the underworld and bind them to the physical object. The intricate symbols etched into its surface are not merely decorative, but are specific sigils that have been used throughout history to summon and communicate with spirits." It's writing a whole novel about this thing. Here it is: "The deal that Nicolas Cage's character makes with these ghosts involves the ball directly. By agreeing to retrieve the ball for them from a place of power, the Luxor Casino, symbolically tied to ancient Egypt and its mythology—" Interesting that it went there. Duh, that makes sense. "—he unwittingly becomes a pawn in their game. The ghosts promise him the resurrection of his career, leveraging their ability to manipulate fortune and fate through the ball. However, this comes at a price, tying Cage's fate to the ball and the whims of these spirits, who have their own agendas and desires."
Dan Shipper (00:44:26)
Damn, that's a good one, man. I think we got a winner here.
Dave Clark (00:44:30)
I think so. Do we need to go into Midjourney, or should we just—
Dan Shipper (00:44:37)
I want to see how Midjourney does with the ball. But yeah, you tell me what's next. We've got some frames, we've got a story. What would you do? How would you get into Midjourney?
Dave Clark (00:44:47)
So from here—because you can't just go into Midjourney and say "show us an extreme closeup angle"; it wouldn't work the same way, because it doesn't have all the backlog. So what I would do is ask it, "Give me a detailed prompt to use in Midjourney that will get us an image similar to the closeup of the cursed roulette ball." I'd do something like that, and then it'll help really detail out what it probably used to create that.
Dan Shipper (00:45:17)
Okay. And so it's saying, "For a platform like Midjourney where detailed and creative prompts can lead to—" actually, let me read what you said first. So you said, "Give me a detailed prompt to use in Midjourney that will get us an image similar to the close up of the cursed roulette ball," and it's saying, "For a platform like Midjourney where detailed and creative prompts can lead to highly specific and intricate images, crafting a prompt that vividly describes the cursed roulette ball is key. Here's a detailed prompt you can use," and it just gave you a prompt, like, "Create an ultra detailed, hyper realistic image of a cursed roulette ball, central in the frame, with a focus on its ancient mystical symbols and glowing runes. The ball should exude a dark, ominous aura hinting at its supernatural origin." And it just sort of keeps going. Alright. Okay, so we're flipping over to Midjourney, and we're going to just test out this prompt and see if Midjourney does any better with the roulette wheel. So I'm gonna do slash imagine, which is the start for the prompt, right? And do you want me to just paste it in, or how would you do it?
Dave Clark (00:46:18)
Yeah, I would just—I'm just curious to see if we just took the long-winded prompt and pasted it in. Okay, cool.
Dan Shipper (00:46:23)
So we're just pasting in this long-winded prompt from ChatGPT: "Create an ultra detailed, hyper realistic image of a cursed roulette ball"—and nothing else? You're not changing the model or the style or anything?
Dave Clark (00:46:34)
I would change the—so just do dash dash AR, and then space, 16, colon, 9.
Dan Shipper (00:46:42)
Okay, cool. And that's just the aspect ratio. That's all we've changed. And here we go. I tested it out with a couple of Nic Cage images, which I think are pretty good.
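[Note: assembled in Discord, the full command looks roughly like the line below. The prompt text is abridged from the ChatGPT output read earlier, and --ar 16:9 is Midjourney's aspect-ratio parameter.]

```
/imagine prompt: Create an ultra detailed, hyper realistic image of a cursed roulette ball, central in the frame, with a focus on its ancient mystical symbols and glowing runes... --ar 16:9
```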
Dave Clark (00:46:55)
Those are pretty cool.
Dan Shipper (00:46:59)
Ooh, starting to take shape.
Dave Clark (00:46:19)
Looks pretty cool. Wow.
Dan Shipper (00:47:05)
What do you think?
Dave Clark (00:47:06)
Let's see. I'm really digging—they're all really cool. The top-right one is intriguing, and the bottom-right one. Those two are—
Dan Shipper (00:47:16)
I think I agree. I was going to say bottom-right. I like this one too. This is cool, 'cause the contrast is really nice.
Dan Shipper (00:47:22)
Okay. So what would you do? Would you upscale? Would you do different versions?
Dave Clark (00:47:25)
I would upscale. And then maybe we can try some different versions from the one.
Dan Shipper (00:47:30)
Okay. So we're going to do upscale two and four and then we would do different versions. So would you want me to vary it or how do you—
Dave Clark (00:47:40)
I'd do Vary (Strong).
Dan Shipper (00:47:41)
Okay, Vary (Strong). So we're going to hit Vary (Strong). I mean, I hate to say it, but I really liked the ChatGPT version—but these are more cinematic. There's really something good going on.
Dave Clark (00:47:51)
Seriously. Right out of a movie.
Dan Shipper (00:47:53)
Yeah, Vary (Strong) here too? Okay. And do you think it would do a better job? Because we still don't have that shot of it going around the wheel. Do you think that it would do a better job of that, or is that pushing it too far?
Dave Clark (00:48:08)
I would think so. I think it'd do a better job.
Dan Shipper (00:48:11)
And how do you get the same roulette ball on the wheel? Is that possible? Or do you have to kind of just fudge it a little bit?
Dave Clark (00:48:17)
Yeah, that's when you kind of got to take it into Photoshop and fudge it. Because, I mean, the detail is so intricate. It's not like a red shirt or a yellow hat or something, right? But yeah, 'cause it's a circular thing, that's an easy comp.
Dan Shipper (00:48:31)
Okay. So here's our variations of the first one. Wow.
Dave Clark (00:48:37)
Those are cool. It's like a time machine on the bottom-right.
Dan Shipper (00:48:41)
That's what I was thinking too. Yeah. Or those helmets that divers wear.
Dave Clark (00:48:47)
Yeah. I'm getting that too.
Dan Shipper (00:48:50)
And then this one: These feel a little bit less cursed to me somehow? There might be something good in there; it might not be bad.
Dave Clark (00:49:01)
Yeah, it looks like an Indiana Jones treasure. Yeah.
Dan Shipper (00:49:06)
Okay, what do we do?
Dave Clark (00:49:08)
So, I still think I like the original.
Dan Shipper (00:49:11)
Okay. So, let's do the original. So, we're gonna keep the original, and then, what's the next step? Do we jump into Runway, or yeah, what do you do?
Dave Clark (00:49:19)
Yeah, I think I'd bring these into Runway. Maybe we'll do this image, and then one of the initial Nicolas Cage images—I think the one where he's surrounded by the ghosts. We'll animate these two and see what we get.
Dan Shipper (00:49:30)
Let's do it. I'm psyched. Alright. So we've got some images from ChatGPT, we've got a Midjourney image, and now we're going to do some animating. It looks like we're in Runway. Give us a little bit of background for people who don't know about Runway, and tell us what we're going to do.
Dave Clark (00:49:52)
Yeah, so Runway is a great tool. You can actually go in here and do text-to-image-to-video right in one app, which is awesome if you're trying to create in a time crunch. What I like using it for is animating my more cinematic images, just because of the level of control you have. They have this tool called the Motion Brush, which we'll show you—how you can really show what part of the image you want to move and then how much you want it to move. So it's really cool. We'll actually start with this image first. It's Nicolas Cage surrounded by the spirits as he's looking at the Book of the Dead. This is an awesome image. I'm gonna send this to you so you can frame it.
Dan Shipper (00:50:30)
Please. Are you kidding me? I'm going to put it on YouTube. I'm going to get all the views.
Dave Clark (00:50:34)
Let's do it. Heck yeah, man. Alright. So I'm going to click on the Motion Brush. Motion Brush is a new tool that was released by Runway. It's really cool. You're able to select a brush, and then basically you color over areas that you want to move, and then you're able to tell it what kind of movement you want to have. So we'll start with Brush 1. We'll just kind of brush over Nic Cage. I think it'd be cool if he was just kind of standing, so I'll give him a little bit of ambient movement, like a 0.5. Brush 2.
Dan Shipper (00:51:08)
And ambient movement is sort of his clothes are rustling or something?
Dave Clark (00:51:11)
Yeah, maybe his hair is moving. Okay, so I'll do this first.
Dan Shipper (00:51:13)
And is it smart enough? Because you kind of nicked his head a little bit—heh, nicked Nic Cage's head. Is it smart enough to know that it's his head and you don't really want that, or do you really have to—
Dave Clark (00:51:32)
It is. The good news is I can go right back to the brush and fix it. It's not so delicate, just as long as you've covered enough. And then we'll maybe do a third brush on this—the light from God.
Dan Shipper (00:51:49)
Ooh, I like that. What about the candles too?
Dave Clark (00:51:53)
Yeah, we'll do it. We'll do—
Dan Shipper (00:51:54)
Gotta have the candles flickering.
Dave Clark (00:51:54)
Do a Brush 4 for the candles. So I just like to quickly paint everything, and then I go back and mess with the parameters. So, we kind of already did Nic Cage. For the ghosts, I want maybe some smoke to kind of go up in the air, so I use the—oops—I use the vertical brush. Maybe a little bit, and then—
Dan Shipper (00:52:19)
Interesting. Okay. So that's going to push the ghosts’ mist up.
Dave Clark (00:52:23)
Correct. Yeah. And I'll give it a 2 on ambient. Some of it is trial and error; I've been using it for a while, so I kind of know what I can get. So for the ghosts I'm just going to do some ambient noise, and for the light rays I'll bring up their proximity, so the rays kind of feel like they're coming towards us a little bit. And then for the candle lights, it's just such a simple movement, I'll just do a little bit of ambient noise there. And then that's it. You just click done and then you can generate. I like to always do five at a time, just 'cause you don't know what you're going to get.
Dan Shipper (00:53:03)
Interesting. And we'll just hit five? Like, you just hit generate five times?
Dave Clark (00:53:07)
Correct. Yeah. And I'm on the Unlimited Plan, and I'm also actually part of the Creative Partners Program at Runway. So I've got like a million credits that I still need to get through, but it's pretty awesome.
Dan Shipper (00:53:18)
Dude, I'm so excited to see what this looks like.
Dave Clark (00:53:23)
Me too. Me too. It's like a perfect image to use the Motion Brush on, because it has the smoke, it has light rays. It's built for the Motion Brush. So we should get something good. And we didn't even mess with camera movement. Actually, I'm going to do one just so we have an example. Let's do a zoom out, just to see what that gives us. That's cool. I'll switch to my credits, because on Unlimited you can only generate five at a time, so you've got to use your credits.
Dan Shipper (00:53:51)
Hmm, and how long does it normally take to generate like a single image?
Dave Clark (00:53:56)
I would say, on average, like a minute, something along those lines. Probably quicker if you're just doing one or two. But I threw so many generations in there at a time, it might take a little longer. It's fairly quick.
Dan Shipper (00:54:10)
And actually, I guess while we're waiting—one of the things that I'm thinking about or keying on is, there's this trope happening right now where—
Dave Clark (00:54:22)
Oh, oh, should we go to it first?
Dan Shipper (00:54:24)
Let's play it. I want to play it.
Dave Clark (00:54:29)
Ooh. Nic Cage kind of leans in too.
Dan Shipper (00:54:34)
That is so good.
Dave Clark (00:54:35)
And that was the first one right off the bat. I haven't even looked at the rest. What else we got? This is a little bit more subtle, and Nic Cage kind of does a little head tilt. But what people actually don't know is you can—what I actually do is I take multiple generations into Premiere and I layer them, and I take the parts from each clip that I like better. I might like the smoke here better, but I might like Nic Cage less than in the other one. So I'll be able to just kind of mask him in. That's really cool. Maybe it's not ready yet. I was going to say, let's go to the camera one. Let's check it.
Dan Shipper (00:55:14)
And so these are all kind of subtly different, but they're not—Oh, wow. Nic Cage is sort of changing.
Dave Clark (00:55:18)
Yeah. This one got kind of funky. It doesn't even look like himself anymore.
Dan Shipper (00:55:24)
Yeah. I mean, maybe they transformed him into another character in order to let him get to the Luxor without being seen—that's possible.
Dave Clark (00:55:38)
Yeah. Alright. Now we've got some camera movement ones. Let's see what we got. This is where I zoom out. Even the light dims down. That's pretty cool.
Dan Shipper (00:55:48)
That’s sick. Oh my god. I love it.
Dave Clark (00:55:56)
Yeah. The Motion Brush is a pretty sick tool.
Dan Shipper (00:55:57)
I would be curious to do a zoom in. It feels like he's opening the book and we're getting a tight shot on him. Great. While we're waiting: Are you satisfied with this? Is this the kind of thing where you're like, this is good enough, I'll pick one of these, or a couple of them, and start getting into Premiere and moving on to my next generation? Or are there other things that you tend to do? How long are you spending playing around with all the different parameters and all that kind of stuff?
Dave Clark (00:56:25)
Yeah, so I'd say, because I've had so much time to play with these tools, like I said, I kind of know how to dial them in, so I get to something I like quicker. Always do five at a time. Like you said, sometimes you get lucky—the first one was money. Most times that's not the case. Sometimes it takes 20. But then I bring it into an upres-er, like Topaz, and I'd upscale it to 4K, 'cause that helps with quality. And sometimes, if I really want Nicolas Cage's likeness to shine through, I'll actually go back to the original image, mask in his real face, and then use some of the movement from around his head. So his face actually stays high-quality.
Dan Shipper (00:57:01)
That's cool. That's what I was going to ask you about. I think there's this trope right now where you can do a viral AI movie or AI clip where you're using someone else or something else that's recognizable. And you're like, I want to do Star Wars, but in the style of Wes Anderson or whatever, which I feel like we're doing a little bit here, 'cause it's Nicolas Cage. Is that legal? Are you allowed to do that? What are the conventions around doing all that stuff?
Dave Clark (00:57:27)
Yeah, I think so. I mean, I partnered with Curious Refuge. I'm actually teaching an AI advertising course with them. And they're known for their viral Star Wars video—they're the ones who did the Wes Anderson Star Wars video that became a thing. And for me, it falls under parody law. You're kind of doing it as a parody. It's kind of an SNL skit, in a sense. You're not making money off of it directly, right? I'm not gonna go sell this and make a million dollars off of a Nic Cage generation. So, I think as long as you do it ethically and do it under a parody-type idea, or, like you said, in a sizzle reel if I'm trying to pitch a movie, I think all those fall under ethical parody laws.
Dan Shipper (00:58:05)
Got it. Okay, that's cool. Alright, so we got our close-up, our zoom-in. Let's see it. Interesting.
Dave Clark (00:58:14)
They messed up his face. See, this is an instance where I would mask in the real face.
Dan Shipper (00:58:20)
And I don't like that the light is kind of going out as we're going in. But, I think we've already got a shot that's—
Dave Clark (00:58:29)
His features are all gone. But the push-in doesn't look too bad.
Dan Shipper (00:58:34)
Yeah. I like the motion for sure. I think the original one with the camera movement zooming out was probably the best.
Amazing. So should we do the roulette ball?
Dave Clark (00:58:55)
Yeah, let's do the roulette ball.
Dan Shipper (00:58:56)
Well, this is an interesting question. So, with the roulette ball or with the Nic Cage thing, there was an obvious motion element, but with the roulette ball, it's a little bit unclear what the motion is going to be. Is it going to be the camera? Is it going to be the light inside the ball? How do you think about that?
Dave Clark (00:59:14)
So usually with an image like this, I don't even know if I would go into Motion Brush. I think I would just mess with the camera controls and the motion control. So let's try that for this one; it's more of a traditional way to use Runway. You can also give it a description, right? If you want the light in the center to glow—
Dan Shipper (00:59:37)
Oh, cool. I want the light in the center to glow a red color, so it goes from yellow to red, like it's threatening.
Dave Clark (00:59:46)
So I've actually never done that before—the color change thing. I'm interested to see what happens. So let's do that and let's give it like a—should it be a zoom out or a zoom in?
Dan Shipper (00:59:56)
Well, we did the zoom out on the last one, so maybe we could try a zoom in on this one.
Dave Clark (00:59:59)
Alright, we'll zoom in, so that's set. Let's actually take it down just a little bit. Maybe we give it a slight camera roll, so the camera's going to roll over. I always like that move. And it goes to show: a traditional filmmaking background helps you understand a lot of these more technical-type questions. But yeah, let's go ahead and generate five. My rule of five. And we'll see what we get. Because I talk to a lot of people and they're like, dude, man, how does your stuff look like that? I go on Runway and my stuff looks like crap. It's all warp-y. And I ask, how many generations do you do? Well, just one. You gotta do a bunch! That's the trick.
Dan Shipper (01:00:38)
That's the thing that I've been trying to underscore for people. I think it's true for ChatGPT, it's true for all these tools: Most of this stuff is bad. Some of it is amazing. And I try to tell people, you can't just do one thing and then expect it to be amazing. You are a manager, a model manager, and your job is to get the best thing you possibly can out of the model, like a manager would get the best out of an employee. So if it's not working, your job is to figure out how to make it work. And there are definitely techniques, like what you're doing—do five generations—that are going to help you get to the stuff that's really amazing. And I think once people learn that, there's just going to be an explosion in people using this stuff, because right now they come into it with this expectation that it's going to be magic on the first try. And it is magic, but it takes a little bit more work than that.
Dave Clark (01:01:29)
Yeah, the old one-button press—press one button and you get a movie. That's never going to happen. There's always going to be some kind of human intervention. But I think of it like these are camera rolls: You yell cut, you do another take. It's kind of the same idea. You do multiple takes, and then you see what's the best. I see some red going on in this thumbnail. So we definitely got the red, right? Let's see, man. I don't know. That might've worked. Oh dude, it did it. First try again, man.
Dan Shipper (01:02:06)
I love it. The AI gods are smiling on us today.
Dave Clark (01:02:10)
Runway better send me a check, man. I'm part of the CPP, making this thing look like money. First try. Well, let's just see what else we got here. I see red on all of them, so maybe it's a different type of glow each time. This is interesting.
Dan Shipper (01:02:24)
Yeah, I like that one a little better. That's nice. The last one got a little bit too red, it went way huge, and this one's a little more subtle, which is sort of interesting.
Dave Clark (01:02:35)
Yeah, I like this little light leak going on back here. It's kind of cool—cinematic. Yeah, that's cool. Alright, let's check the third one. Whoa. It's crazy how it just goes these different directions. This one is super overpowered red. I like all of them differently, right? Maybe you can use pieces of each, right? That's the beauty of it. This one's great. Look at that depth of field that you're getting.
Dan Shipper (01:03:07)
Yeah, I like that one.
Dave Clark (01:03:08)
And we might have to make this Nic Cage movie happen.
Dan Shipper (01:03:14)
I’m in, man. Can I be the EP?
Dave Clark (01:03:16)
Yeah, these visuals are incredible. This one gave us a little white light at the top. That's weird. Look at that little—it's almost like a ghost orb. It's like haunted. There you have it, man. Your experiment.
Dan Shipper (01:03:31)
We got it. So in 60 minutes—90 minutes—we went from nothing to a movie about Nicolas Cage resurrecting his dead movie career by interacting with a bunch of ghosts who sent him to the Luxor to get a haunted roulette ball. And it's all on video. You can watch it. It's amazing. I love this.
Dave Clark (01:03:54)
Yeah, it was awesome. Great idea. Great idea.
Dan Shipper (01:03:57)
Great execution. I think it was a team effort. We did a good job. Yeah. I love this. I feel so happy right now.
Dave Clark (01:04:06)
I've never done that before, especially on camera. So that was pretty fun. And yeah—and Blasian GPT works.
Dan Shipper (01:04:14)
Well, if you make that public, we'll share a link, so let me know about that. And as you have other things coming out, please share them with me and share them with us. We'll make sure to link people to them. I think you're amazing. You're doing incredible work. You're super, super talented. Thank you for spending the time with me, with us, making our movie. Maybe we'll have to actually make it into a real movie and put it on YouTube. So if you have some time and you want to do it, let me know. And yeah, I appreciate you.
Dave Clark (01:04:44)
I’ll definitely make a movie poster in the interim just so you can have it, so you can—when you launch this episode, you'll see the Nic Cage movie poster. We got to come up with a title, maybe on Discord, we'll bounce some ideas.
Dan Shipper (01:04:56)
Okay. We can do that. We can do that.
Dave Clark (01:04:58)
Nic Cage at The Luxor sounds cool too. Haunted Luxor.
Dan Shipper (01:05:02)
I love that. Nic Cage and the Haunted Roulette Ball.
Cool. Well, good to meet you. Thank you. Thank you so much.
Dave Clark (01:05:12)
Thank you, Dan. Appreciate it, man.
Thanks to Scott Nover for editorial support.
Dan Shipper is the cofounder and CEO of Every, where he writes the Chain of Thought column and hosts the podcast How Do You Use ChatGPT? You can follow him on X at @danshipper and on LinkedIn, and Every on X at @every and on LinkedIn.