The transcript of How Do You Use ChatGPT? with Dave Clark is below for paying subscribers.
This episode pairs well with my review of OpenAI’s new text-to-video model, Sora. In the piece, I cited Dave’s experience to argue that AI will change filmmaking by making movie concepts cheaper to test, lowering the cost of big-budget movies, and giving rise to a new art form. Here’s what I wrote:
“Dave Clark is a traditional filmmaker who has started to make AI-generated videos. He recently produced a sci-fi short called Borrowing Time that was inspired by his father’s experiences as a Black man in the 1960s. He produced it entirely using Midjourney and Runway to generate images and videos. He narrated the movie himself and used ElevenLabs to turn his voice acting into the voices of different characters.
Borrowing Time went viral, and Dave told me he wouldn’t have been able to make it without AI. It would’ve been impossible to get a sci-fi short like his funded by a traditional Hollywood studio. But now that it’s out, and popular, he says that he’s fielding interest from top-tier Hollywood studios who would like to make it into a full-length movie.
This anecdote speaks volumes about the way that AI models like Sora will change filmmaking in the future.”
Check out the full piece.
Timestamps:
- Introduction 01:33
- How AI is enabling everyone with a laptop to be a filmmaker 10:19
- The new tool set for making AI films 14:30
- How to make your AI-generated clips stand out 16:56
- The first prompt in Dave’s custom text-to-image GPT for our movie 25:00
- The big advantage text-to-image GPTs have over Midjourney 37:58
- The best way to generate Midjourney prompts with a GPT 44:13
- Animating the images for our movie in Runway 49:10
- First look at our movie! 53:42
- How Dave thinks about animating images without an obvious motion element 58:22
- Why you need to be persistent while working with generative AI 59:46
Transcript
Dan Shipper (00:00:00)
Just make a movie together.
Nicolas Cage's movie career was over a couple years ago. He just wasn't popular anymore. And now he's sort of back, like he's in a couple movies now. It seems like he went into the underground, he had this Book of the Dead, and he opened it, and he made a deal with the ghosts that came out, and that's how he resurrected his career.
This is amazing.
Dave Clark (00:00:21)
This will be fun to animate. We’re going to have to animate these ones.
Motion Brush is a new tool that was released by Runway that’s really cool. You're able to select a brush, color over the areas that you want to move, and then tell it what kind of movement you want them to have.
Ooh. Nic Cage kind of leans in too.
Dan Shipper (00:00:41)
That is so good!
Hey, I want to just take a quick break from this episode to tell you about a podcast you should know about. It’s hosted by Nathan Labenz, who was actually on this show and did one of the most popular episodes we’ve ever had about how to use AI as a copilot for the mind.
Nathan hosts a podcast called The Cognitive Revolution, where he interviews builders at the edge of AI and discusses the profound shifts that it will unlock over the next decade. He talks to the researchers and entrepreneurs who are shaping this moment, like Logan Kilpatrick of OpenAI, who I interviewed for this show as well, Eugenia Kuyda of Replika, and Suhail Doshi of Playground.
To listen to the show, search Cognitive Revolution in your podcast player. Alright, now let’s get back to How Do You Use ChatGPT?
Dan Shipper (00:01:30)
Dave, welcome to the show.
Dave Clark (00:01:32)
Thanks, Dan, for having me.
Dan Shipper (00:01:33)
Of course. So, for people who don't know, you are a film director and a commercial director. You work with a bunch of big brands like HP, Intel, and Google. You've been a film director for a long time in the traditional movie industry and you are also a prolific AI filmmaker.
You had a short film recently that came out called Borrowing Time, which went really viral. It's based on your father's life and I saw it and I was like, oh, my God, this is actually good. And I just wanted to have you on the show to talk about how you do that.
Dave Clark (00:02:07)
No, that's awesome. And yeah, again, thanks for having me. We can dive into that short and how it came out of my father's upbringing—it came from a personal place. So it's cool to see something that's not like Harry Potter or Star Wars go viral.
Dan Shipper (00:02:24)
Yeah, it's really great. Okay. So tell me, tell me about Borrowing Time. Like, what is it about? How was it conceptualized? Let's start there.
Dave Clark (00:02:31)
Yeah. So it was actually a story that my father told me when I was growing up. I'm mixed—my mother's Korean, my father's Black. He's in his seventies now, so he grew up during the segregation era and Jim Crow laws and stuff like that. He was telling me the story about when he was, I think, 12 or 13: he had winked or whistled at a white woman inside of a convenience store, and he got in real severe trouble for it. He didn't go to prison, but I think the following year or so, someone else was either murdered or went to prison—got life in prison—for what people believe was falsely being tied to the disappearance of a white woman. So it just goes to show—wrong place, wrong time. That could have been my father who went to jail for the rest of his life. And then, because I'm such a huge sci-fi nerd, I thought, man, what if my father had ended up going to jail, and I was a lawyer in the present who was able to go back in time and represent him in court? That's a cool story. I haven't ever seen that before. And the power of AI allows me to visualize it, because if you try to go pitch Hollywood, they're gonna say, eh, period piece, no, been there, done that. And, plus, you're not Steven Spielberg, so, sorry, you're not going to make that movie. But because of AI, I'm able to visualize it in a really cool way—and that got on Forbes. It's a little story, a sci-fi period piece that some executives might yawn at in a room if you're trying to pitch it, but because you can visualize it, they might look at it differently.
Dan Shipper (00:04:16)
No, that is really interesting. I was going to ask you why you made this with AI, but the answer is pretty obvious. Do you think of this as a way to get those people or funders interested—a leg up into making a traditional movie with this concept? Or do you think of it as a totally separate thing with a separate arc, where you're just focused on making AI stuff?
Dave Clark (00:04:37)
I mean, because I'm a traditional filmmaker first, absolutely. Everything that I create is an IP, an extension of the creative ideas that I have in a notebook. And I will say that, yes, this one, and I think the one that I did called Dismal Swamp—I don't know if you saw that one—have gotten the attention of pretty A-level producers and execs in Hollywood. So it works, right? The purpose of Dismal Swamp was to create a little one-minute sizzle or ripomatic, if you will, using AI-generated footage. Traditionally in Hollywood, a lot of directors will take pieces from other directors' movies and cut them together like a sizzle to pitch an idea. But with AI, I took my script, fed it into the prompts, and made it based on all the stuff that was in my head. I couldn't do that five years ago, three years ago. And, I mean, that's incredible.
Dan Shipper (00:05:29)
What you're making me think of is that this has actually happened before in writing, for example—with tweets. People started tweeting as a way to test out which articles they would write, and the articles you write on your blog are tests for books. And they're super cheap. You can do them pretty quickly, much more quickly than you can write an entire book, which takes years. And there's no gatekeeper. You just throw it online, and if it works, then maybe you get the book deal or whatever, or maybe you just self-publish because you don't need the gatekeeper anymore. And it strikes me that that wasn't as available in film. It probably has been available to some degree in indie filmmaking, because you can make a lot of stuff with just a camera and yourself, but there's a lot you just can't do, because you don't have actors and special effects or whatever. But this new set of tools makes that kind of thing available for a much wider range of short films that can act as precursors to larger, better-funded projects, in the same way that tweets or blog posts are precursors to books—these could be precursors to films is kind of what I'm getting from you. And by films, I mean feature-length movies. I never thought of that, and I think that's really amazing.
Dave Clark (00:06:53)
Yeah. I mean, that's a great point. A prime example—you might've seen it—is a hybrid live-action AI film that I was working on called Another that went viral on Christmas Day. That's an example of what I think is the immediate next step with AI in Hollywood: live action, mixed with a Stable Diffusion model to create visual effects. And we're in the middle of it—I'm actually going to be releasing that at the Runway AI Film Festival at the end of the month, so I'm rushing to the finish line to get the effects done. And I'm not going to lie, it's not easy. It's been a nightmare trying to figure out how to make Stable Diffusion look like, and work with, footage that you shot on an Arri or a Sony Venice—it needs to match the fidelity. And I'm working with what I think is one of the top VFX supervisors in Hollywood—he did John Wick 3, he did The Conjuring—and he's helping me do the visual effects using the Stable Diffusion setup. So again, it's all a learning process and we're figuring it out as we go, but it's still very exciting. And it's exactly what you said: We're able to test the waters with these techniques. And horror is perfect for that type of thing. If you think about The Blair Witch Project or Paranormal Activity, it's often a horror film that creates a new subgenre. So it's interesting.
Dan Shipper (00:08:13)
That's really interesting. I didn't know that. What subgenre did The Blair Witch Project create?
Dave Clark (00:08:18)
I mean, it made the found footage genre huge, right? There was like maybe one or two films before it that no one ever saw. But when Blair Witch came out, it was like, oh, that was a thing. And then obviously that led to Paranormal Activity.
Dan Shipper (00:08:30)
And what is found footage? Is it pre-existing footage from some other project or something else?
Dave Clark (00:08:33)
Yeah, it's kind of that handheld style. There are movies like V/H/S that came out with that DV-tape look on them. When I think about found footage, it's always a Blair Witch or a Paranormal Activity, because these tapes were collected by authorities and edited together.
Dan Shipper (00:08:49)
I see. So found footage isn't necessarily pre-existing film from somewhere else—it just looks like it was collected by someone else and wasn't intended to be shown.
Dave Clark (00:09:02)
Exactly. Yeah. That's the style of it. Yeah. They try to keep the realism.
Dan Shipper (00:09:05)
That’s really interesting. Okay. I love that. And I guess I have to ask the obvious question, which is: there's so much backlash, I think, particularly in a community that you're a part of and really familiar with—traditional filmmaking—and a lot of worry about AI tools and how they will change who gets to make films, what films get made, or who gets paid. There are just a lot of questions. And I'm curious how you decided to jump into this, versus not.
Dave Clark (00:09:34)
Yeah, no, again, I think it was a little bit of necessity and a little bit of seeing a lane. Being a person of color—I'm not going to lie. Of course, there are brilliant Black filmmakers and female filmmakers who are winning or getting nominated for Oscars—or should be getting nominated for Oscars. But there's still a small number of them, especially in the sci-fi and horror space. When you're a Black person and you say, I want to be a horror director, everyone always says, oh yeah, like Jordan Peele. That just tells you there aren't enough of us. And I have big-time film director friends and writers, and they're starting to come around—they're starting to see the positivity in it and how it could really give opportunities to storytellers who might not otherwise have them. If you're a white kid from Arkansas who loves making films in your backyard but doesn't have the connections, you have a chance. If you're a Black kid from the hood—that's how I grew up—you have a chance. Everyone has a chance to create something incredible. And that's what's really cool to me. And then I was able to speak to SAG directly, which was an interesting moment. I was talking to the board of directors through my friend Rob—he was giving a webinar on the difference between CGI and AI, ‘cause there was a whole thing about digital doubles and is AI going to replace me as an actor? And while I can't truly answer that and really know, I do know that AI is going to be used kind of how CGI is—you'll use AI to create large stadium crowds and digital doubles of people, but it's all going to be with consent, and I believe people should get paid for their likeness and all that. So, to circle back on your bigger question: I think AI is the best opportunity for anyone who wants to become a creator or filmmaker or artist and create something that can be seen on a large scale. You have a great chance if you use some of these AI tools.
Dan Shipper (00:11:38)
I love that. I mean, it's so interesting ‘cause it's so in line with things that I've felt and seen in different areas of the world. I didn't realize how directly it applied to filmmaking, but it obviously does. One of the things I see a lot is that AI changes who can make software. It dramatically expands the territory of people who are able to build stuff, because it can program for you. And the same thing seems to be true here. The cost—and maybe the level of skill—to get started with making even a very, very small video clip just went down tremendously. Obviously it requires a lot of skill to build that into something people want to see. But it doesn't require that much money. We were talking about this in the production call before we started this show: I so wanted to make movies when I was a kid. I had this Lego set—this Lego Steven Spielberg Jurassic Park thing that came with a camera, and you could have the little actors, and it was almost a stop-motion animation thing. And I was so into it, but you couldn't make anything good. So then I got into 3D modeling and tried to make a Pixar-type thing, but that didn't really work 'cause, at that time, you had to be a fricking genius to make an actual 3D animated movie in the mid-2000s. So then I started making software, because software—especially business software—is just easy: forms that people fill out and pay money with a credit card, which I love. I mean, it's really an amazing part of my life. But honestly, I think if AI filmmaking tools had been available, I would probably be making movies. And I think it's so amazing that now anyone can just go make a movie, and it doesn't require that much money. It's so cool.
Dave Clark (00:13:34)
Yeah, especially—I have young children, and they're artists with great imaginations. I was just using ChatGPT the other day to come up with a bedtime story on the fly for my son. He's big into Spider-Man and Venom. So I created a story really quickly—like, help me create a story for my son who's five, and he wants to work with Venom to fight this guy. And it did it really fast. And it was 9 p.m., so I was thinking, on the fly, how can I get my kids to sleep? And he loved it, man. But it's that little thing right there—you couldn't do that before. You would just struggle to get your kid to bed; now they can go to bed happy, and you can create any kind of story you want, even for small things like that, which is cool.
Dan Shipper (00:14:17)
That’s so cool. I guess I'm curious: What would you consider the new tool set for making AI films? And how does that differ from the traditional one? What tools and programs are part of the new workflow that you're working off of?
Dave Clark (00:14:34)
Yes. It's interesting, because from what I'm seeing personally, I'm still using a lot of the old tool set, but I'm now figuring out how to work those tools into this new pipeline—this new production workflow. Stable Diffusion has been out for years, but now I'm able to hone it in and use it for a specific purpose—what I think would work for live action. Same thing with animation, right? You have AnimateDiff. You have stuff like Warp Fusion, which a lot of people are using to create some amazing animations. And I honestly think the first feature-length AI film will be an animated film, a Pixar-type film, completely done in AI. We might even get the trailer for it this year—that's how fast this stuff is moving—'cause I can totally see any of these diffusion models allowing it. And the thing is, the guys who really know how to use them aren't storytellers first. A lot of them are just tech wizards. But when they start getting together with the mes of the world and the other people who have stories, it's going to be game over. You're going to have so many amazing things coming out, and I'm super excited for that.
Dan Shipper (00:15:40)
I love it. I love it. One of the things I'm really fascinated by is how the new powers and the new limitations of a tool shape the form of the art or the work that can get made. There are lots of examples of this through history, and I'm curious about it for the AI film age. I'm noticing things like, for example, Runway, which is the main way to generate a clip—take an image and turn it into a video clip—only generates two or three seconds at most, something like that. And so I'm seeing a lot of these AI films, including yours, with these couple-second shots that cut to another one, and then another one. And maybe it's also harder to keep the characters continuous through each different shot. So that pushes the kinds of stories you can tell in a specific storytelling direction, and it creates rules for what the medium is. I'm sure it's still really early and these tools are going to get better, but I'm curious: What are the rules for what you can make with AI right now?
Dave Clark (00:16:50)
No, that's a great question. Yeah, there are limitations, right? I think four seconds is Runway's limit. Currently, with Pika Labs, you can extend it up to 15 seconds, I believe. But that's uncharted territory, because you're going to lose some fidelity a lot of the time. I use a tool from Topaz Labs. What I do is bring in a Runway clip and not only export it at 4K, or sometimes 8K, but also change the frame rate. So if I get a 24-frames-per-second clip, I'll turn it into 60 frames per second, sometimes 120. And then you can extend the clip. Borrowing Time is a great example—for the scenes where the time travel was happening and the lights were kind of whipping around, Oppenheimer was a huge reference—I did a high-frame-rate generation, slowed it down in post, and then sped it up. So what I'm able to do is make clips last longer and then mix those with quicker-cut clips. That actually makes the storytelling and filmmaking more like something we're used to seeing on TV. I always think about the Tony Scott-type films, like Man on Fire, where there were always these quick cuts, or Snyder with 300 and Dawn of the Dead. Those are some of my favorite films, and I love how they did the editing. Editing is a huge piece of my type of filmmaking with AI: I'm able to tell a story and not have to use just three-second clip after three-second clip. Sometimes you'll get an eight-second clip because I slowed it down, then you'll get a one-second clip. And I use that cadence to help tell the story.
Dan Shipper (00:18:28)
I see. So I think what you're saying is—I kind of missed the tool that you said at the beginning, maybe Topaz Labs. Is that the thing that changes the frame rate so it makes the clip slower or faster? Yeah. So basically, I guess what you're saying is, yes, there are these specific constraints, like Runway only exporting four-second clips, but you're doing different things to lengthen or shorten the clips so that it's not just the same length of cut every time. And there's actually a history you're pulling from, like 300 and Man on Fire, where directors of traditional feature films are doing something similar—it's probably not exactly the same, but you're taking inspiration from that, and you can push the AI to do something that's inspired by it.
Dave Clark (00:19:13)
Absolutely. I think that maybe helps my stuff stand out a little bit more sometimes, because it's not just three-second clip after three-second clip after three-second clip. Another thing is there are tools like ElevenLabs where you can generate AI voices, and some of them are incredible, but a lot of people tend to use the ones that sound like AI, right? But now they have this whole speech-to-speech thing, which I used in Borrowing Time. Actually, I think it might have been in beta when I used it, because they just announced it, but I can talk and act out how I want—the white judge in the movie was my voice, just voice-to-voice using ElevenLabs, and the woman—the mother—was my voice. And then obviously the voiceover was just my plain voice, ‘cause I thought it sounded better and more natural to have the natural pauses and things like that. So it makes sense together.
Dan Shipper (00:20:04)
That’s really cool. Okay. So I didn't even realize that was a thing. Basically, you acted the whole movie yourself, and you used ElevenLabs’ voice-to-voice model to change it into different characters. I didn't know that was a thing. That's so cool.
Dave Clark (00:20:20)
Yeah, dude, it's awesome. I think about animated films—you could do all the weird characters. It's not perfect, but you kind of hear it in your head, and for the shorter films, it's like, why not, right?
Dan Shipper (00:20:32)
Yeah. That's amazing. Okay, cool. Well, a lot of this show is us seeing how people use ChatGPT and using it together, and what we planned to do is just make a movie together. It probably won't be a full movie, but we'll make a clip from a movie and see what this process is like. And I think what's really interesting is that you're actually using ChatGPT—and particularly a custom GPT—to help you ideate. I know this is a new part of your process, but it's something that you're doing. So maybe you can start by telling us what this GPT is and how you use it. We can start there and then go into making movies.
Dave Clark (00:21:18)
No, absolutely. And I'm still heavily in Midjourney, and sometimes Stable Diffusion, for a lot of my image outputs, but I just fell in love with this idea of creating a version of my own text-to-image generator. I'm obviously using the DALL-E platform and building on top of it. But what you're able to do, which is incredible, is really fine-tune it to the type of imagery you’d like to create. You don't have to pigeonhole it to just live-action-type imagery—you can make it as vast as you want—but you're able to almost create a mentor out of it. So it's a combination of my opinions and a combination of maybe what Steven Spielberg or Ridley Scott might look for in imagery. You can use ChatGPT and tell it to be anything, but you can constantly tune it, and I'm still tuning it. It's still new, so it's not perfect, but it's good to me. It works for me because I do have a certain look to a lot of my stuff, and DALL-E traditionally hasn't given me the outputs that I need versus Midjourney. But now that I'm starting to really fine-tune it, I'm getting imagery that's like, wow, that's on par with what I'm getting from a Midjourney or Stable Diffusion. That's pretty cool. So that's where I'm at now. Still toying with it, but, I mean, what an incredible tool. It's crazy.
Dan Shipper (00:22:38)
Cool. Should we demo that and maybe start there? So you call this Blasian GPT, which is hilarious. How do you start with this? When are you jumping into it?
Dave Clark (00:23:02)
So, yeah, I mean, I jump into it right away because obviously it's a text-to-image generation tool, but I can also talk to it to get ideas, I can talk to it like it's a mentor. So, yeah, let's see. What would you like to prompt?
Dan Shipper (00:23:16)
Well, let's think about it. Okay, so what kind of movie do we want to make? Wow, I've got to think about that.
Dave Clark (00:23:24)
That's a good question.
Dan Shipper (00:23:25)
So in a previous episode of the show, we built this game with ChatGPT called The Allocator. The game was basically this: You could play the president in any historical era. It starts with inauguration day, you get to make decisions as the president, and you basically set the budget for the U.S. government. Then the game plays out your decisions—so you play JFK during the ‘60s and you decide whether to fund going to the Moon or not, all that kind of stuff. And we could do something that's sort of based on that. One of the interesting things about that game is we had a lot of Nicolas Cage cameos, because ChatGPT generated an image for the GPT that just looked so much like National Treasure. It was giving us National Treasure vibes, and we were like, I guess there are going to be some Nic Cage cameos in this game. So I don't know if that gives you anything—any ideas—but maybe we can throw that in there and see what comes out.
Dave Clark (00:24:40)
Yeah, Nicolas Cage is always—there's always awesome, epic imagery whenever you prompt him. We could—I mean, yeah. Give me an idea. Let's see what happens. We'll just start there.
Dan Shipper (00:24:53)
Okay. I want to do Nicolas Cage getting sworn in as president, and ideally it's raining and it's sort of dark. It's gritty—noir vibes, maybe. And I don't know if that's your aesthetic, so you can tell me, and maybe the GPT will correct me if I'm wrong. I'm trying to think of a situation in which Nicolas Cage would ever become president. And I guess, given who our previous president was, it's maybe not that surprising. But, interesting. So you're saying, “Create an image of Nicolas Cage getting sworn in as president. The image should be cinematic and gritty in the style of the David Fincher film Se7en.” Cool.
Dave Clark (00:26:00)
I think that could help, right?
Dan Shipper (00:26:02)
Alright, that could help. Do you think we should add anything else into the scene, aside from him getting sworn in as president, or would you just start with something like this? How would you do it?
Dave Clark (00:26:12)
I kind of start with just the simple brass tacks, and then we refine it from there, or add things—you've seen those things on Instagram: add this in, now do this, now do that. So let's see. And this is the first time Blasian GPT has ever been seen publicly—it's Blasian because I'm Black and Asian, by the way. That's what we call ourselves. Here we go. Never before seen. I don't know what it's going to do.
Dan Shipper (00:26:37)
We're doing it live, folks.
Dave Clark (00:26:40)
It's called a top-tier image generator, so it better show up.
Dan Shipper (00:26:45)
One interesting thing is that you're going right into generating an image. You're not having it ideate with you first and saying, well, how do we do this? It goes right to the visual. How did you make that decision?
Dave Clark (00:26:58)
Just because I've already trained it on a lot of those conversations. I forgot to do 16:9.
Dan Shipper (00:27:08)
I'm getting Nicolas Cage in Harry Potter. For some reason we're in the wizard court, like the Wizengamot. So it's not necessarily inauguration vibes, but I'm not mad about it either. It's kind of interesting.
Dave Clark (00:27:26)
It got the Se7en vibe down right. Yeah, you got the film grain, you got the harsh perfume lighting, which is always funny, because I watched the behind-the-scenes of Se7en and that's what the DP said. He was like, Fincher told me to watch a bunch of perfume ads—we wanted to create that look, just like you said, where the whites are really white and the blacks are really black. That's exactly what I did.
Dan Shipper (00:27:47)
Wait—what is perfume lighting?
Dave Clark (00:27:52)
That relates to perfume ads—back in the ‘80s and early ‘90s they were very dramatic, very high-contrast, almost black and white, but there was always a splash, a wash of color, which is how Se7en looks. But interesting. Okay, so let's build on this. One thing I want to do is make it 16:9.
Dan Shipper (00:28:12)
What is 16:9?
Dave Clark (00:28:14)
That's widescreen. It's what you're used to seeing—or 2:1. I usually do 2:1 or 16:9, but we'll do 16:9, so it's a little bigger.
Dan Shipper (00:28:18)
Okay. And do you know what the aspect ratio of this one is?
Dave Clark (00:28:22)
This is 1:1, your Instagram square look, which is the default. I forgot to go in, but you can actually change it so the first image is always 16:9—you don't have to prompt it. Okay. So what changes do we want to make besides that?
Dan Shipper (00:28:57)
Well, I'm curious. I mean, I can certainly give you some input, but I'm curious how you would think about it. Help guide me through how I would think about what changes I might want to make or what I should be thinking about or seeing.
Dave Clark (00:29:55)
Yeah. So I think, for what you want to say, it's missing some core pieces. Like you said, it doesn't feel very inaugural, doesn't feel like it's in D.C. So I think—where would this take place, typically?
Dan Shipper (00:29:09)
Well, I mean, it's gotta be on the Mall, in front of the Capitol, right? I'm pretty sure that's where it happens. I am sort of curious what you think. This is a mistake, obviously, but I'm also kind of intrigued. What did Nicolas Cage do to end up in this situation? He looks like he's in a court and—
Dave Clark (00:29:37)
Well, we could go down that rabbit hole.
Dan Shipper (00:29:40)
Yeah, we kinda could. I'm curious. Do you see anything here where you're like—
Dave Clark (00:29:43)
It almost feels like he's in a church. This is some culty underground thing—what did he get himself into? It kind of gives me John Wick vibes a little bit, too.
Dan Shipper (00:30:59)
Yeah, it's the cult of Cage, and he's got—what's he holding? Is that a Bible, or what's in his hand?
Dave Clark (00:31:06)
Yeah—I mean, we could ask Blasian GPT.
Dan Shipper (00:31:13)
Let's ask.
Dave Clark (00:31:14)
Before we change the aspect ratio—“What is the book Nicolas—” this is fun, I don't really do it this way, “—Cage is holding in his hand.”
Dan Shipper (00:31:42)
Okay. Interesting. Okay, ChatGPT's playing it straight with us. It's just the Bible or the Constitution.
Dave Clark (00:31:50)
But what if the book was—it almost reminds me of the book from Evil Dead. Did you ever see that movie?
Dan Shipper (00:31:58)
No. What is it?
Dave Clark (00:31:59)
Evil Dead is that film with the book—that cursed book of death—where reading a thing from it causes all these evil spirits to show up.