Why Influencers Need a Watchdog
A mind-expanding conversation with Kat Tenbarge
#32 - Kat Tenbarge on Why Influencers Need Watchdog Journalism
Li Jin: [00:00:00] Before we start, just a quick content warning. Today's episode contains discussions of sexual misconduct and abuse of power.
Kat Tenbarge: [00:00:09] I think part of the issue is that YouTube doesn't really view itself as, like, a massive employer, but I do [laughs].
Li Jin: [00:00:21] Hey, everyone. I'm Li Jin here along with my co-host Nathan Baschez.
And, this is Means of Creation, a weekly conversation where we deep dive into the passion economy and the future of work.
Nathan Baschez: [00:00:30] The show is made by Every, a writer collective focused on business. You can find us at Every.to, and every week Li and I publish an original essay on what's happening in the creator economy.
This week on the show, a conversation about how powerful creators and the platforms that enable them can be held accountable for misdeeds and what we can do to create a safer ecosystem for all involved.
As the industry matures and some creators gain massive wealth and power, it's inevitable that some will abuse that power. But, when that happens, what can be done? On the internet there are no gatekeepers, which is the reason there is so much original, amusing content that we all love. But, there's also a downside to that. Too often there's no one there providing accountability for harmful and abusive behavior. This is the problem that today's guest, Kat Tenbarge, wants to solve.
She's an investigative journalist at Insider who earlier this year published an explosive report containing a rape allegation against a member of David Dobrik's Vlog Squad during a filming event. The fallout from the story was bigger than anything we've seen in our careers so far. Dobrik lost sponsors, had to sell his ownership stake in the new L.A. soccer club Angel City F.C., and had to step down from Dispo, the photo-sharing startup he co-founded, and VCs ended up severing ties with the company.
And, while this may be the biggest story Kat's reported on to date, it's hardly the only one. Her reporting over the past year has solidified into a new beat that she calls influencer watchdog.
In this conversation, we talk about what it's like to go up against some of the world's most famous and beloved creators, what outcomes she wants to see from her reporting, how creators are responding by trying to become uncancelable, and the unique ways that these misdeeds happen in the creator economy versus in the traditional media industry.
So, I'm really excited to have her here and just want to say welcome and thank you for being here today.
Kat Tenbarge: Thank you so much for having me.
Li Jin: So, the first thing I want to know is, what did it feel like to publish the Vlog Squad exposé?
Kat Tenbarge: [00:02:18] It was sort of this experience that I've never had before with publishing a story. Where you expect the biggest drop to be the day that, obviously, the article is actually published. And, in the past, when I've had stories do really well, that's always been the case. It's been, like, the story comes out, there's a big buzz around it, and then over the next few days, like, it starts to fade and people talk about it less. And, with the David Dobrik story, it was the exact opposite. And, it-
... really became more of, like, a David Dobrik story just because Dom, kind of, went underground in those few weeks that it, like, initially played out. And, David was the focus because everyone knows who David is and not very many people, particularly in, like, the venture capital community or people who don't really watch the vlogs, like, a lot of people didn't know who Dom was, but everyone knew David and everyone knew the Vlog Squad. And, I think when the Spark Capital announcement came out, like, venture capital isn't my forte, however, the word that I kept hearing and the word that people kept saying to me was just, like, this is unprecedented. This is unprecedented. So-
... that meant a lot-
... because I had never expected something like that to happen because of the story.
Li Jin: [00:03:26] Was it the outcome that you were hoping for? Or, was there an outcome you were hoping for?
Kat Tenbarge: [00:03:29] When I first started interviewing Hannah, I think one of the first questions I asked her in, like, one of our initial interviews was sort of, like, what do you think accountability would look like in this situation? And, I was really interested in that because in the past, with stories about sexual assault, particularly ones that happened, like, years ago, obviously, like, the route to justice isn't going to be the actual justice system. Sometimes the statute of limitations-
... passes or, in this case, like, she had never wanted to go to the police. That was never, like, something she wanted to do. And, at the time I remember she said, “I don't really know. I think that's a really good question, but I think that, like, the number one thing I would want to happen is have the Vlog Squad members and Dom recognize what happened so that, hopefully, it doesn't happen again.”
And, so, that was kind of my thought process as well. Was just, you know, let's make the biggest splash possible with this story so that people really have to reckon with the type of content that has become so normalized on YouTube and with this, like, incentivization system of doing these sort of really, like, shocking on-screen, like, comedy moments and, like, how difficult that can actually be for-
... the people filming them. So, I definitely didn't have an outcome in mind of, like, David would have to step down from Dispo, but, when that happened, it didn't feel like a bad thing. It felt like people were finally taking this sort of stuff seriously.
Nathan Baschez: [00:04:52] Right. And, I think, ultimately, it's, like, it sounds like the key thing is just, if you're a person in a David Dobrik type situation, like you've got a big audience, you've got a group of people that are really interested in being close to you, being in your content, they don't want to insult you, they don't want to get on your bad side 'cause then they'll miss out on a lot of opportunities, to really think about the power dynamic and to think about how you're using that rather than just do what in the moment feels like it's gonna be funny or, like, it's gonna get a lot of views or whatever.
And, if there's no accountability, then why would you have anything to fear? 'Cause you've never seen anything happen that's, like, bad to anyone and you don't even realize the impact you're having on other people. Or, maybe you do and you just don't [laughs] care. But, like, now you have to. If you're not gonna care, like, organically-
... you should at least be worried that there will be consequences for you, you know.
Kat Tenbarge: [00:05:34] Definitely. And, I feel like this whole genre on YouTube of these, like, fast-paced vlogs with, like, ensemble comedy skits, there haven't been consequences and there are very clearly, even just from the viewer's perspective, elements that are problematic, elements that appear to be exploitative and, yet, it went unchecked for so long.
Li Jin: [00:05:55] I remember that there had been various controversial things that the Vlog Squad had done over the years and David Dobrik, in particular, one episode springs to mind, which was, like, the Japanese snack review video where he was, I think, pretending to speak with, like, a pan-Asian accent-
... and people found that offensive and nothing happened. He continued on in his career. So, I agree with you. Like, the kind of, like, pushing people's buttons and doing things that are kind of edgy and, like, on the border of what people would find acceptable behavior-
... um, that seems to be pervasive online and, usually, these folks are just so popular that they can't be held accountable until a line-
... gets crossed, I think. And, do you think that in this case it was because it was a criminal line that got crossed that something finally happened and there was finally accountability? Or, where do you think that line is, if there even is one universally?
Kat Tenbarge: [00:06:54] I feel like there probably isn't a straight, universal line except for, and obviously this wasn't the case here, but I think if someone were to die [laughs] because of a YouTube video-
... that would be a universal line of, like, outrage-
... and you can't come back from that. But, really, the reason I say that is partially just because so many people have gotten injured, sometimes in really grievous ways, because of YouTube stunts and-
... that doesn't always seem to be a line. I think with the David Dobrik story, it was the perfect storm that made it so impactful. I think one thing I've really noticed when covering creators is the impact of a story will depend on how it meshes with that person's reputation. So, like, for example, I've done stories on people like Jeffree Star who is very controversial-
... already and really widely hated by a lot of people, similar to, like, the Paul brothers. Um, he has that kind of a reputation already. So, when I say, oh, Jeffree Star allegedly did this really horrible thing, the reaction tends to be either from his fans, “Well, no he didn't,” or, from the people who already dislike him, “Well, we're not surprised.” With David, you had someone who for so long was, like, the boy next door of YouTube. Like, even if you didn't-
... love David's content, your opinion of him, like, the reputation he had was really good natured. Like, he was a really good guy and so I think the juxtaposition of something as shocking as the allegations with David's character created that line that was then able to be crossed.
Li Jin: [00:08:29] Interesting. So, if your brand is already one that is super controversial and hated, then it might even be harder for them to ever cross a line, for fans to turn against them or for advertisers to turn against them.
Nathan Baschez: [00:08:43] Right. It almost reminds me of, like, Trump. Like, there's so many things that other politicians would, like, never be able to get away with-
... but, like, if you have no shame and the people who love you, like, just don't care-
... like the thing he said about being able to shoot someone and not lose supporters, or whatever, is, like, kinda right. And, so, like, it's interesting how if you're, like, you know, Jake Paul or whatever and it's, like, everybody knows, like, just obviously that's your persona, is like you're kind of a douche or whatever [laughs]-
... that's, like, his sort of personal brand, right, and, like, he's trying to be funny with it or, like, whatever, but, like, it's like no one's surprised really.
Kind of like, you read the Taylor Lorenz story that came out today, I think, or maybe it was yesterday, and, um, you're like, “Yeah. Sounds like Jake Paul [laughs].” You know what I mean? Like, it doesn't hit the same way as David Dobrik where it's, like, there's some degree of, like, is that fair almost? Like, just because everyone knows someone's sort of probably a jerk, like, should they be punished less because of it? Or, like-
... do they create an incentive for themselves to, like, project that persona and, like, get away with stuff or... I don't know.
Li Jin: [00:09:35] It's...
I'm sure this is not the intended effect of your writing, but part of it raises the question of, like, if you're a creator, what can you do to cultivate this aura of uncancelability?
What do you do? What do you say? How do you, like, create content consistently in a way that makes you more defensible against cancellation?
Kat Tenbarge: [00:09:54] Yes. I think that perhaps it's not spoken, but I do think that that is the mindset that a ton of popular creators have. And, you see that play out in big ways and little ways because there are so many creators, I think especially right now where we're at in terms of YouTube culture, and, obviously this plays out on other platforms, but you have a lot of people who rose to fame on YouTube because of good reasons. Like, Nikita Dragun is the one who I always-
... use as an example, because-
Nikita became really famous because she was transitioning. And, her transition was really empowering. It was representative of something that doesn't get shared a lot and, you know, I think she was widely appreciated for sharing that with her audience. Um, particularly in the beauty space. And, then, after Nikita transitioned and, I want to say, maybe a year, two years, a couple years after that point, her views started to really go down and her content did not have the impact that it had had previously and her star started to fade a little bit, which we see with every generation of YouTubers. Like, they're popular, but it doesn't last forever.
When Nikita's star started to fade-
... she switched up her style so much and now she is in this role of constantly creating controversy, but nothing so big that it would shatter her entire image.
Li Jin: [00:11:17] Hmm.
Yeah. This is like a guiding framework that they have in the back of their minds. Like, how to stay relevant and toe the line, but not cross it, and cultivate an uncancelability.
Nathan Baschez: [00:11:26] Do you think this kind of stuff has uniquely bad and harmful consequences because of the sort of almost, like, decentralized nature of the way that media works on a platform like YouTube? Versus, like, obviously, you know, the Me Too movement was centered around bad behavior within traditional industries that, like, technically had HR departments or whatever and, like, obviously failed to- the function of HR is really to protect the company. It's, like, [laughs] controlled by the board of directors really. So, like, you know, it's not gonna do a ton for you. Like, do you feel like this is worse or just kinda the same thing, where it's like, anytime there's power and fame and attention that people need to harness in order to continue their power and fame and money like that, they're just gonna do bad things? Or, like, is there something uniquely harmful about the way that it works in this sort of new open platform world?
Kat Tenbarge: [00:12:25] I think about this a lot. Like, I think this is one of those types of questions that guides a lot of my reporting philosophy. And, my take on it is, generally, I don't think that YouTubers or influencers are necessarily more likely to be exploitative people. But, I do think that there's a type of personality where you're drawn to fame that is different than maybe other industries that are less in the limelight. But, beyond that, I think that the real issue lies in the total lack of regulation and the total lack of, like, any sort of accountability framework in the influencer industry. And, you see that in comparison to the-
... Hollywood, the entertainment industry because, like, in Hollywood, child stars, I mean this is the perfect example, child stars can only work a certain number of hours a day. All of the money that they make, a percentage of it has to go into, like, a safe account so that their parents can't spend all their money. And, that is so stark compared to family vlogging on YouTube, which, personally, and I think this isn't a hot take, but family vlogging is a really damaging [laughs], I think, phenomenon in a lot of cases because you have kids growing up on screen-
... everything they do, their whole lives, are in pursuit of getting the most attention online and that is just a really unhealthy dynamic to mature in and to become a fully fledged human being in. And there are no laws. There are no regulations. There's no one saying-
... objectively, is this in the child's best interest?
Li Jin: [00:13:40] I think it raises really interesting questions about labor laws and the nature of work.
Because, I mean, plenty of folks make home videos and document their children's lives from a young age and as long as they do that for free without making any money, it's, like, normal. It's just, like, part of being a parent. But the moment that you put it online and start using it to accrue fame and an audience, I think that, then it becomes-
And, like, what does it mean for a child, from when they're a baby, to consent to work and to consent to be monetized?
I don't know.
I mean, same with animals-
... and all of these pet influencers, too. Like, they don't know what's going on. They're not consenting to [laughs]-
... to do work and someone else is monetizing that work. It is really interesting and I think that's just, like, a reflection of how work and labor is shifting, too.
Nathan Baschez: [00:14:30] Mm-hmm [affirmative].
Totally. And, even beyond just, like, work versus just capturing an authentic moment or whatever, like, in terms of, like, the same act of, like, videoing or whatever. When you're doing it for views, you have to put your kid in situations that people are gonna want to watch. So, [laughs] like, maybe it's-
... harmless a lot of times, but, like, you know, especially when people are feeling kinda like threatened, like, “Oh, we're not getting as many views as we used to.” Then, they start to do crazier things like we talked about with some previous examples. And, that can be really dangerous for kids [laughs], obviously.
Um, dangerous for anyone. I mean, we just saw today, or whatever, the, like, Jeff Wittek video-
... like, getting his face smashed in while David Dobrik was operating a crane or something-
... where he was [laughs] like dangling from it. It's just... If you were to do that on a movie set, like, you'd have an ambulance next door and there'd be a trained, like, stuntman and someone would be operating... It's, like, we don't need for these things to be, like, actually dangerous. There's ways to get the [laughs] entertainment value without literally risking people's lives. Like, it's crazy.
So, a quick break. We wanted to tell you, before we get back to this really fun conversation, about the writing that we do every week. Li, we write an essay every week. How does that work? [laughs]
Li Jin: [00:15:36] Well, it's pretty impressive, if I do say so myself, that we [laughs]-
... publish an original essay every single week covering the creator economy.
Nathan Baschez: [00:15:44] Yeah. Basically, every Monday, we sort of get together and Yosh, who's amazing, works with us to, like, put together a bunch of research and we just talk through a bunch of ideas and kind of come up with this concept and then we work it into an essay every week. And, usually, it's something that's pretty evergreen, but kind of centered on something that's really timely that's happening in the creator economy.
So, examples are, like, Facebook getting into audio, you know, Clubhouse's new valuation but, like, plunging downloads. How creators are sort of coping with different things that platforms are doing. Just, like, anything that's interesting or new, we want to be talking about it every week.
So, if you want to sign up, you should go to Every.to. Every is a writer collective focused on business. We're the writers that are focused on business that have collected together.
Li Jin: [00:16:22] Not only are we covering the creator economy, we're also participants in the creator economy, so we-
... really appreciate your support by subscribing to our newsletter at Every.to.
Nathan Baschez: [00:16:33] So, enjoy. I hope you love the writing. I hope you love this conversation. And, now, back to the conversation.
Li Jin: [00:16:39] I want to switch gears a little bit and talk about your role, or how you're described on your profile. You're oftentimes described as an influencer watchdog.
Um, that's what it says in your bio on your Insider page. So, you're an influencer watchdog. You have this history of doing investigative pieces and uncovering accusations against prominent creators like David Dobrik or Jeffree Star. Can you talk a little bit more about what it means to be an influencer watchdog and why that role is necessary today?
Kat Tenbarge: [00:17:07] Definitely. I love that term and I only recently settled on it. Uh, like, really recently. Because for a while, as I was trying to navigate my beat and figure out what my specialty was, I noticed that I was drawn to the same types of stories and the same types of narratives, but I didn't have a word for it. And, when I was in college I studied journalism and I was really fascinated and did a lot of political reporting. Like, I did a bunch of political internships and the reporters there were considered watchdog reporters because they kept up with politicians and really had this sort of role of built-in accountability for our political system. And, when people say the words “watchdog reporter,” they are usually talking about, like, politics or maybe business, but I really liked the term influencer watchdog because that speaks to the nature of what I'm trying to do in that I feel like we have all of these people in our society who are extremely influential through their money, through their audience, through their fame, through their relationships with their fans, and there aren't very many journalists or people, in general, holding those people accountable.
There are definitely some. Like, outside of journalism, the drama and commentary community on YouTube originated as a sort of self-policing system, but I feel like there's a strong need for a more, like, traditional journalistic role within the influencer industry, which I think is really exemplified by what we saw with the Me Too movement and that reporting in the Hollywood sphere. And, that was really inspirational to me as I was studying the field and I just saw this potential for there to be so many stories of those types of abuses where you just don't have reporters looking into them because we don't think of them with the same seriousness as we think of politics and politicians. So, that was kind of where that came from for me.
Li Jin: [00:18:58] What are some of the challenges associated with being an influencer watchdog? Like, do you get scared going up against creators who have these armies of fan bases all around the world? I'm just curious about the dynamics there.
Kat Tenbarge: [00:19:12] So much, and I feel like I encounter such new, complex dynamics almost every day that I'm at work because there are lots of... It's a big balancing act. One of the things that I definitely have encountered are those stan armies. And, beyond that, it's become pretty routine for me to have a certain subset of fans be angry about a story or be angry about what they perceive as my bias or my opinion toward their favorite creator. And, that makes sense to me. I feel like that's pretty standard. There's also this sort of online reactionary movement to cancel culture that I find to be really interesting.
And, you see them. They're like, some of them have small YouTube channels. Some of them are more, like, they lean toward trolls. Some of them are more like snarkers, like, snarker communities-
... but that has been a really interesting, unexpected thing, is that I have all of these little accounts that are really trying to prove that, like, I'm out to get someone. That I'm, like, part of this big plot to, like, destabilize certain YouTubers. So, there's, like, a conspiracy underbelly, I think, to influencer reporting-
... which doesn't even really surprise me because, like, in high school and college, I used to be on those guru gossip forums a lot and I think that was kinda what fostered some of my interests in the first place, was the idea of online gossip and online snarking. So, that's one thing.
And, then, what I'm still feeling out a lot in my position is my relationship with some of those creators, because I think if you look at how access-driven entertainment journalism has always worked, there are really favorable relationships between journalists and the celebrities that they cover, and that's how you get interviews with people and that's how you get exclusive information. In the creator world, it's a little bit more adversarial [laughs]. But it depends on the person.
Some people really want to be friends. Some people really want, like, a shoulder to cry on or what they view as like an objective outside source to help them, like, process information online. And, then, with other people, it becomes, like, a legal battle for stories to get published. I went through that with the David Dobrik stuff. I went through that with the Jeffree Star story. But in those cases, I, like, feel really glad that I ended up at Insider 'cause we have an amazing legal support team. But, it's definitely very-
... clear to me that the type of work that we're doing would not be possible without an organization that has, like, legal resources.
Nathan Baschez: [00:21:31] I'd love to hear about some of the differences in, like, the methodology. Not just in terms of defending against legal, you know, threats or whatever, but also just the basic reporting methodology that... Like, there are some people in tech who think, like, oh, the future is gonna be citizen journalism and everyone's just gonna be able to, like, have a Twitter account with a voice and that's how the truth will get out over time. And, like, I kind of feel like they don't really understand what all goes on and, like, the resources that need to support someone to do the kind of work you do. And, I'm just curious, like, if you could tell us a little bit more about what all goes into what you do and how it's really different from, like, you know, like you alluded to earlier, a gossip channel or something that doesn't have that kind of structure or tradition of journalistic practices?
Kat Tenbarge: [00:22:10] Yeah. I think that's one thing that I didn't expect is sort of almost taking on a little bit of an educational role sometimes because you do get people who don't really know what journalism is and I think that's completely fair. Like, it's a very murky field that has changed a lot and continues to change. But people really do not always recognize the work that goes into an investigation at a journalistic publication versus, like, a one-hour commentary video that is a deep dive. Like, people see those as equivalent, but there are so many moving parts when it comes to an investigation like this. Not only do you often have multiple reporters working on different elements, when I first started the reporting I was initially doing it as, like, part of a duo with one of my coworkers and we were both interviewing various people in the Vlog Squad sphere, and then eventually I got a tip about what would then become Hannah's story. And, so I handled that part solo, just in terms of doing those interviews and writing the story itself.
However, that's only step one because then it goes to, like, six or seven different editors. Obviously, like, my primary editor goes back and forth with me to get it into as good of a shape as it can get, but then it has to go through our investigations team. And, our investigations team will have all of these, like, extra things to do in terms of fact checking that I wouldn't even necessarily have heard of before. Like, I remember in the process of doing this story, our investigations editor had me send him all of the screenshots and photos and videos that the women provided to me to show that they were there and, like, oh, I took this picture of David while I was there, et cetera.
Our investigations editor, like-
... stripped all of them to get all the metadata to prove that, like, this photo was actually taken at this time on this date in this location. So, it's, like, that type of work goes into it. And, then, I think the most strenuous part of the process is often that legal back and forth, which people don't see. And, of course, you can only share so much of it with your audience, but, I mean, that's a lot of work 'cause you'll have them come back and be like, “Nothing in this story is true.” And, then you have to kind of, like, prove to them, before you can publish, that you have done your due diligence ultimately.
Li Jin: [00:24:23] Right. It sounds like the kind of work and process and resource intensiveness that would be really challenging for one person to take on by themselves.
Kat Tenbarge: [00:24:31] Absolutely. I know that, like, personally, I don't think I could ever do this work without an organization behind me.
Li Jin: [00:24:36] I want to play devil's advocate for a moment on the role of an influencer watchdog.
Kat Tenbarge: I'm cool with that.
Li Jin: So, and I'd love to hear your thoughts on this. So, I can definitely see the need for a watchdog reporter to cover politics, to cover tech companies that are pervasive in our daily lives, um, other, like, influential industries and people. For influencers, I think the counterargument that could be made is these are people who, like, maybe didn't even want to be famous. They just accidentally became so. They were doing this content creation as a labor of love, as a hobby, and then amassed an audience and now they're famous. They've sort of been elected by the people into this position without necessarily seeking it out themselves.
And, I think, secondly, the other counterargument is, well, you know, if your politician is doing something that's not great, like, you can't just up and leave and move to another country, another state as easily, or you can't, you know, stop using your Apple phone, stop using the App Store, Facebook, all of these products that are critical now, essential to our daily lives. But for an influencer, if they're doing something that you don't like and abusing their influence in some way, you can just elect to stop watching their content. You could unsubscribe, block them, mute them, whatever. The power is decentralized and people can just elect to stop following them. And, so, those would be, I think, the counterarguments for why influencers don't deserve the same level of scrutiny as some of the other watchdog reporting disciplines, and I want to hear your thoughts on that.
Kat Tenbarge: [00:26:07] Definitely. And, I think that those are really good, like, trains of thought to pursue. And, I think, in terms of the first one. The way that I view investigative journalism, and I think the way that it traditionally is viewed, is you look to who has the power. And, so, in the influencer sphere, I think people still don't necessarily even grasp how much power these people have. Because I-
... think we traditionally view power as accumulating a lot of money, accumulating, you know, power politically, through the political process. Um, or by having lots of, like, real estate, for example. We think of, like, oh, well that person owns every building on this block. They have so much power here. Um, hopefully somebody is making sure that they're paying their bills and not exploiting people.
With influencers, and I think this even feeds into the second train of thought a little bit, the power is something that is a little more intangible. The power can have something to do with money, it can have something to do with assets, but a lot of times, it's the power that the creator has over the fan. A lot of people refer to that as, like, a parasocial relationship. And, those relationships-
... come into play so much with the field and these types of abuses. Like, with the David Dobrik story, for example, people have described that as, like, a classic date rape scenario, which I don't entirely disagree with, but I think the main difference is that the only reason that the women were in the apartment with them that night was because of the allure of viral fame and the idea that they could trust those guys because they knew them, because they had watched so much of their content and, like, thought that they had this idea of who these guys really were. And, so, there was, like, a level of trust there, and that was the thing that drew them in to that environment that ended up being really, really unsafe. So, I feel like that is one way that that parasocial relationship can be manipulated and I feel like we're only beginning to grasp that those types of relationships exist.
I know I was having a conversation the other day, uh, with someone in the commentary YouTube space and he was telling me, from his perspective, 10 years ago, uh, when Minecraft, the video game, was booming on YouTube and you had all of these Minecraft YouTubers building these massive fan bases of young kids, he was like, you know, at the time we didn't call those parasocial relationships. At this point in time, not only do we recognize them for what they were and are, but we're also just beginning to realize all of the consequences of that. Because you have entire, like, generations of fans who evolved the way that they think and the way that they act because of what these creators were serving to them and then you also have all of these fans that were abused in some way.
Uh, there's so much grooming that goes on in these, like, online spaces. And, I feel like that's one thing that drives me, is that I fear that, in terms of media literacy, parents of children who watch and consume this stuff all the time, I don't think everybody is aware of how much power these creators have over their kids, but then, also, the accessibility where your kid could be, like, messaging someone and you wouldn't even necessarily know. So, I feel like there's almost, like, a public safety element to it and with the current James Charles scandal that's going on, like, I think that's really exemplified because it's, like, you have this super, super powerful creator abusing his power with possibly dozens of young child fans in ways that could really harm them.
Li JIn: [00:29:32] Yeah. Those are all really great points. I want to get your thoughts on, like, what needs to happen in the future to serve as a check and balance system on this class of increasingly powerful individuals who don't really have that much oversight.
So, obviously, there's influencer watchdogs like yourself who are covering the industry for media companies. I've also noticed that platforms are rolling out different creator codes and creator guidelines as a way to sort of stipulate the rules of engagement and essentially hold the creators on their platforms to a set of guidelines with regards to the content that they're posting and their behavior. But all of those seem to be kind of, like, wishy washy and-
... and there's not really any real enforcement or consequences-
... outside of the platform itself. Like, the worst that can happen, I think, is just, like, your content gets suspended or your account gets taken down or something like that, but nothing in the real world really happens as a consequence.
So, what do you think needs to happen in order to push all of that forward and to hold these creators accountable for the power that they do have? Like, does regulation need to evolve? What would you like to see happen?
Kat Tenbarge: [00:30:39] I feel like you almost need a grab bag of different things implemented to start down that path of how do we make these online landscapes more healthy. And, I think part of the issue is that YouTube, if we just are looking at YouTube, YouTube doesn't really view itself as, like, a massive employer, but I do. [laughs]
Not only do you have all of these, like, celebrity YouTubers, but there's this massive YouTube middle and lower class, um, of creators who make at least some of their income from YouTube. But, YouTube doesn't have, like, an HR structure. You don't have, like, a boss at YouTube. And, one thing that I've covered a lot over the past few years is when a smaller creator gets an action taken against them by YouTube's automated systems, like their whole channel can get, you know, demonetized or they can have a video removed, and they didn't actually break whatever rule or guideline the bot is claiming they did, there's no one for them to talk to. There's no recourse. There's no manager-
... to, like, help you if you are a YouTuber with 50,000 subscribers. You're just kind of on your own.
Li JIn: [00:31:51] There's no, like, judicial system for them to go to-
... and be like, this law was applied to me and it should not have applied.
Kat Tenbarge: [00:31:56] Yeah. So, I think that YouTube needs to really invest, and other tech companies as well that do kind of serve as almost [inaudible 00:32:04] to employers. I think there needs to be more of an investment in, like, a human structure, um, within these industries. But, then, beyond that, I think that we really just need, like, a politician to take on this idea of online exploitation as policy, because there's so much that I think could be done. Not only with children, I think children are the most pressing issue, but, beyond that, there really aren't, like, rights for creators that fit the-
... digital age we're in. We don't really have any sort of legislative-
... framework for the internet that is effective.
Li JIn: [00:32:35] Yeah. I could talk about this all day.
Um, I am very interested in this topic and I just wrote a post that got published yesterday about UCI, Universal Creative Income-
...kind of the online creator's equivalent to UBI, Universal Basic Income, and I think there's a lot of parallels in the ideas here around how these companies are effectively employers. They're effectively creating entire economies on their platforms. Economies in which there are suppliers and consumers, there's public spaces, private spaces, people who are earning a lot, people who are earning very little, and they're, like, nascent little worlds comparable to, like, nation-states, sometimes, in their level of power, the resources available to them, and the richness of their ecosystems. But, there's not really any sort of support structures or infrastructure to help the bottom of the economy in these platform ecosystems the same way that exists in the offline world.
And, so, this piece yesterday was an exploration of how do we nurture the next generation of talent and the long tail of creative people who are doing work in these ecosystems, but not getting compensated for it. And, I think this conversation extends beyond just compensation and money and income to, like, what are the protections and processes that need to be in place for people, for consumers, to keep in check these digital, they're almost like digital politicians, like-
... people who are really influential, like large businesses and corporations in these online ecosystems, in their level of power.
Nathan Baschez: [00:34:04] It's interesting. Like, one analogy that comes to mind for me, for the kind of incentive structure that's set up, most maturely on YouTube, but it's coming to a lot of other places on the internet now with, like, creators making money, is it would almost be like if DoorDash paid you exponentially more the shorter your delivery time was.
So, if you, like, shaved off a minute and you got, like a hundred extra dollars or you shaved off, like, two minutes and you got, like, a thousand extra dollars, people would be going crazy on the streets, everywhere, and a lot of people who don't want to do that would opt out, but the people who kinda don't mind it or maybe even like it would, like, be the ones who really succeed at it and then there'd be, like, a ton of car crashes and just terrible stuff that happens.
And, with YouTube, it's kinda like that, in a sense, because they're paying people a lot of money for generating a lot of views. The more views you generate, the more money you make. And, so, it's like, well, what can you do to create views? And YouTube's kind of like, not our problem, basically. But it's like this externality where it's like, oh, I know the reason why prediction markets are illegal. It's because, let's say, if you could bet on the timing that someone would die and you, like, put a million dollars on it and then you go kill that person, it's like you don't want to have a market way to, like, create incentives for people to do things that cause damage to other people [laughs] or death or whatever, like, terrible situations for other people.
And, it's, like, YouTube has this. It has an externality and, like, there's shared blame, personally. It's, like, there's blame for the creators, but there's also blame for the platforms that just say, "Well, if people do something bad then, like, whatever. I guess it's not us." 'Cause they're dealing in a dynamic system where people will select in. It's totally open. There's no filter and they're not enforcing any sort of standard.
Kat Tenbarge: [00:35:28] I think that's a really good metaphor. 'Cause I always think, in, like, a philosophical sense, you look at what we are capitalizing on on these platforms, but then I also just question the type of human behavior that's being incentivized on these platforms and in these cultures.
I was talking to a young woman just a couple days ago who had this TikTok blow up after the David Dobrik story came out, where she was describing how her mother, who has unfortunately passed away, was in a David Dobrik video. Her mom experienced homelessness, particularly in the later years of her life, and, during those bouts of homelessness, she was addicted to drugs, so a lot of times she would be undergoing, like, some sort of psychosis. And, in the video, she's in the middle of the street and, like, David pulls up to her and they have, like, a conversation with her and she, like, runs away and they kind of laugh at her. They make her into the butt of the joke. And, that is so disturbing to me on so many levels [laughs] 'cause, like, this video came out and millions of people saw it-
... and no one questioned that. And, it just really makes me think, like, what types of behavior are we modeling for the younger generation that consumes this content? And-
... also, what's the incentive there? Like, this is someone we should be feeling bad for, not laughing at. This is, like, someone that we-
... should be thinking, like, how can we help this person? Not, like, oh [laughs]. Like, people aren't objects, but the way that it's framed kind of-
... turned her into an object and so that's something that I think, even though it's not YouTube's fault that, like, that happened, it's not any one individual at YouTube's fault that this sort of perverse incentive has played out. That doesn't mean it's any less consequential. And, so I think we kind of have to reckon with that.
Li JIn: [00:37:06] I agree with that. Going back to what you were describing before, each company is essentially employing this large labor force who's working on the platform, but it doesn't have an HR system or any sort of processes for managing this large workforce, and they should implement things like that. I almost feel like there are downside risks to each company doing that independently on their own-
... and, having no consistency across the entire industry. Like, I think what will arise from that is kind of a fragmented landscape where different platforms have different guidelines for what's okay, what's not okay, content that will be promoted versus unpromoted.
[laughs]. Whatever that word is I can't think of right now. It kind of reminds me of the way that moderation-
... is done today by these tech platforms where everyone is trying to, like, navigate where the lines are drawn and I remember that week with the insurrection-
... at the U.S. Capitol. Like, it felt like every social platform was having emergency meetings internally to decide, like, do we de-platform Trump? Like, what's going on? What is everyone else doing? And, trying to figure out the rules on their own leading to, like, a very disjointed ecosystem of, like, what's okay to say on one platform versus not okay to say on another platform? And, I think it would be beneficial, potentially, to all of the platform companies, as well as to creators, to just have one standardized set-
... of, like, rules and a code of conduct, so that they don't screw it up and they don't spend the time navigating, exerting those cycles to figure out, like, what's okay on one platform versus not okay on another. I mean, I feel like what I'm describing might have consequences of its own, but leaving every company to implement their own, like, "HR system for creators," that feels messy and challenging for creators to navigate in this, like, multi-platform, fragmented world.
Kat Tenbarge: [00:38:58] Definitely. And, I think we almost have an opportunity here, in that the internet has been heralded as something that can bring us all together. You can access the same YouTuber regardless of where you live, and it's this idea of, like, open borders in a lot of ways.
And, I think, like, obviously, there would be a set of challenges when coming up with a standardized code of conduct, but it would also, to me, be an opportunity to say, this is a version of our world that we can shape to be better [laughs]. And, these are, like-
... you know, like, the digital currency, the digital, like, politicians. All of those things, we have so much of an opportunity to evolve them and I think that can be terrifying, but, if you're optimistic about it, it could also be a good thing.
Nathan Baschez: [00:39:42] Yeah. Totally. It's interesting what you said about, like, oh, like, the internet can bring everyone together or whatever 'cause I think it brings up the point of, like, it feels like it's tribalized us a lot more and you brought up earlier the people who are, like, kind of reactionary to the cancel culture idea, like, you know, being kind of... It doesn't even matter if you're reporting on someone who they care about at all. It's not like they're a fan. They're just against, almost, just the idea of accountability.
Because they see it as cancellation. They see it as unfair or f- for whatever reason and, um, it reminded me of... I was reading this thing, oh, just [laughs], like, the Wikipedia article for the printing press, and it was like, the printing press led to blah, blah, and all the stuff that we know, like the Protestant Reformation or whatever, and, like, changed society and all this stuff. But, the thing that I didn't know was that it led to, like, the solidification of different languages in Europe, 'cause it used to just be like everyone who was literate spoke Latin, or not spoke, but, like, wrote things in Latin, and it was kinda like the lingua franca. But, then, all the different national languages, like, you know, like, Spanish, Portuguese, French, made the transition into writing to a greater extent than they had previously.
Like, literacy spread and national identity spread, and there's this book that talks about it that I bought and haven't read yet, but it's interesting how it's fragmented us further, which kind of, you know, leads to your point of can there just be, like, one standard. I think I would love to see that, you know. It's like maybe we can treat it how the FDA treats the food and drink-
... and drugs or whatever that are, like, allowed to be sold in this country. But, like, [laughs] you know, we need similar standards for this, too, but it's, like, it's so hard to create anything like that anymore because it seems like there's been this kind of, like, you know, curdling of culture where it's like there's just these different little clumps and-
... there's no, like, space between them to communicate anyway.
Li JIn: [00:41:13] I think people would decry it as censorship, though, if there were one standard set of rules, because these platforms collectively have such large market share that, effectively, it would be one set of rules for what you can and cannot say.
Kat Tenbarge: [00:41:27] And, I think that's a really valid criticism that people could have. It's such a difficult thing to even imagine with where we're at because online platforms are so messy and divided right now that it's so difficult to-
... envision this sort of coming together to do something that would apply to everyone equally. Like, that is so difficult to even idealize, but I love the idea of comparing, in a lot of ways, the internet to the printing press, and I am very fascinated by the creation of identity as it related to the printing press and the creation of identity as it relates to the internet.
Nathan Baschez: [00:42:01] It's interesting, too, because, like, I think the people who are kind of most reactionary, like anti-cancel culture, tend to identify as, like, maybe Libertarian-ish, but, like, in a lot of ways, the most Libertarian solution to this problem is what you're doing. It's like, if you think the solution to speech is, like, more speech, then, like, all you're doing is speech. You're just saying, like, here I have validated some facts. This is what happened. You can decide what to do with it. You're not censoring anyone, you know [laughs]? Like, it's... You're just providing information out to the commons that people will do with what they may individually.
It's really interesting how that sort of contradiction doesn't seem to occur to people [laughs].
Kat Tenbarge: [00:42:34] Yeah. And, I feel like a lot of times, when people are frustrated with just human nature in general, they want to find, like, some sort of scarecrow that can be like, oh, it's her fault. And, I feel like with some of the criticisms I get from the anti-cancel culture crowd, it's like, what you're really mad at is just the mob and how people perceive information in large groups. 'Cause I know, like, one repeated thing that kept coming up with this story was, oh, you're attacking David. You're focusing on David. Dom is the perpetrator. And, it's, like, that's so interesting to me because if you actually read the article, it's 90 percent about Dom. We really only talk about David, like, in the beginning and then throughout whenever he is relevant.
But what they're actually complaining about is the fact that David's the more famous one so, of course, he's going to get the most attention [laughs]. Like, they're actually just complaining about-
... you know, the cultural attitudes and how people process information.
Li JIn: [00:43:25] On that note, I want to end our discussion with a topic I've been thinking a lot about and, I'm sure, is a topic that you spend a lot of time thinking about as well, which is, like, how can we move to an internet with a healthier content ecosystem? It seems like, in a lot of the issues that you've uncovered, people who've amassed fandoms online are engaging in extreme behavior that pushes various boundaries in a ploy to get attention and increased views and clicks and just fandom. And, that's the way the internet works when ad dollars and reach are correlated with each other and the predominant way to monetize is through advertising. So, what needs to happen for us to incentivize a content ecosystem that has content that is enriching and nourishing to us, versus just trying to get as many views and clicks as possible?
Kat Tenbarge: [00:44:23] I feel like that is such a fascinating discussion and I feel like you see platforms trying to approach that. Like, you have Instagram thinking about, like, removing likes.
Like, little measures like that, in terms of just how the platforms actually operate, would drastically change user experience. I love those types of conversations around, like, what would these implementations do? How would they affect the way that we interact with platforms? And, I think that would really be a good starting point because I don't think, at this point, we can just, like, scrap the platforms altogether and, like, start with a whole new fresh idealized version of the-
... social media platforms. I think it would be so cool to experiment with the idea of, for Twitter, for example, like, you can only tweet a certain number of times a day. I think, personally [laughs], if I could only tweet-
... like, five times a day, you'd have to put so much more thought into those five tweets. And, I feel like things like that, even though they sound a little silly... Like, I spend a lot of time thinking about how do I make my own experience better, for myself, since, like, so much of my life and identity is shaped by the platforms that I use. And, so it's, like, with Twitter, I know that the way to have a better experience with it is to just use it less [laughs].
Like, I'm taking the app off my phone. Like, I can only use it on my computer. But, not every app is that way. Like, with TikTok, I don't feel like I need to use it less to have a better experience. So, then, I'm kind of like, well, what does TikTok have that Twitter doesn't have? Or, like, what is the difference between those two platforms? And, then, I feel like so much of it comes back to the design. Like, the design of the platforms and how that affects the way users use them. So, I think platform design is, like, a really big chunk of that, but then I also think, hopefully, as discourse around these types of issues continues to develop, you hope that, like, cultural attitudes will shift in the right direction.
And, I think that we already are seeing that, particularly with just how women and consent are viewed on the internet. And, not just women, but I think, like, consent in general and how it's viewed on the internet. In the early 2000s, I think you had this string of young girls who became big internet targets in a way that you don't really see anymore. Like, the MySpace era was really toxic toward young women. And-
... we're even still just starting to understand some of the consequences of that era of the internet. Like, today's era of the internet is, obviously, not perfect. It's not better in a lot of ways, it's worse in other ways, but you do see that, like, with shifting cultural attitudes toward minorities and women and gay people, and, like, that comes into play-
... in terms of how we regard each other online and in terms of how we create content online. Like, what is acceptable and what is not. So, my ultimate hope is just that with, like, accountability journalism, of course, but then, also, just general social and cultural progress, that the internet will become a healthier place-
... as well.
Li JIn: [00:47:11] I agree with that. I think it can't just be the platforms unilaterally changing their algorithms to favor healthier content because there's this entire other side, which is the consumer side-
... and I think both have to go hand-in-hand. Consumers need to educate themselves and change their own behavior and what they elect to consume, and the platforms need to also build to enable that. And, together, hopefully, we can have a more socially and morally responsible internet.
Kat Tenbarge: [00:47:42] Agreed.
That's a pretty beautiful-
Nathan Baschez: [00:47:43] [laughs]
Kat Tenbarge: [00:47:45] ... place to end.
Yeah. It's really good.
Li JIn: [00:47:47] I agree. Mic drop. [laughs]
Well, amazing. So, thank you so much, Kat, for being here with us today. I really enjoyed this conversation, learned a lot, um, and it gave us a lot more topics to think about. Honestly, I'm leaving with a lot more questions. Just one of those really awesome, mind-expanding conversations that we love to have. So, thank you, again, for being here.
Kat Tenbarge: [00:48:09] Thanks for having me. I loved it. I feel like these types of philosophical internet conversations are so fun. So, thank you so much. [laughs]
Nathan Baschez: [00:48:17] Well, thanks for being on with us and, uh, we'll have to regroup again in, like, you know, maybe six months or a year and see if we've made any of that progress there.
Hoping for a culturally, technologically, or in whatever way-
Let's do it.
... uh, you know, we can get, [laughs] I guess.
Kat Tenbarge: [00:48:30] Cool.