Will the unseen labor of the creator economy be replaced by an AI race to the bottom?
An industry of "ghost-chatters" is arising to allow creators to scale their personal brands. At what cost?
Last week, a story about the sex work-friendly creator platform OnlyFans caught my eye. The publication Rest of World reported that male voyeurs on OnlyFans are spending hours online chatting intimately with personas who they believe to be their favorite OnlyFans stars.
But instead of sexting or spilling their secrets to these crushes and fantasies, they are actually talking with men and women abroad who have simply learned to flirt over chat incredibly well while assuming a popular OnlyFans creator’s persona. These freelancers type flirty messages from dingy closet-like rooms in the Philippines, where they work twelve-hour “chatting” shifts and are monitored closely by supervisors for response time and engagement levels.
Chatting has become a sizable stream of revenue for OnlyFans, especially because “whales” —big spenders on the platform who make up more than 20% of OnlyFans’ revenue—will pay thousands of dollars for conversations with their favorite “stars,” leaving generous tips or purchasing extra photos and videos in addition to the usual subscription revenue.
And these chatting skills are paying off: OnlyFans made a reported $7.22 billion in gross revenue this year, up 9% from last. Its CEO earned $497 million in dividends for the 2024 fiscal year, up from $472 million the year prior.
The company claims that artificial intelligence will begin replacing low performers with full-time chatbots. But unsung laborers who have honed their chatting skills will remain on the frontlines, as AI is not yet at the stage where it can perfectly mirror how an imperfect human might spell (or misspell)—nor can it convey certain more complex emotions around desire or silliness. For the foreseeable future, humans will do much of the impersonating—in many cases serving as therapists to lonely men, maintaining the illusion of connection to rake in money for platforms.
Since the dawn of social media, parasocial relationships have risen. Fans build up admiration for a celebrity or creator, concocting a one-sided intimacy that does not exist outside of their imagination. These fans often draw their fantasy relationship from the polished content that celebrities or creators choose to share—much of which is propped up by teams of social media managers, ghostwriters, and now, in more intimate forums, ghost-chatters.
Before, parasocial relationships could remain in the minds of the voyeurs, but today, companies are devising plans for how to monetize them. And in this era where anyone can monetize the self, the pressure for hyper-speed scale has crept in. As one venture capital firm described this trend: creators are the next generation of “small businesses.” And with platforms like OnlyFans attributing their year-over-year revenue increase to fan payments to individual creators on the platform, it would make sense that their top creators are facing pressure to remain more and more online, squeezing every drop out of a social media personality’s being to entice fans seeking the illusion of connection. Ghost-chatting allows a creator to multiply their presence and energy.
While top-performing ghost-chatters can make more money than other types of invisible freelance work like content moderation or call center gigs, it comes at a real cost to their well-being. Some of the chatters interviewed by Rest of World describe long-lasting psychological ramifications, like being unable to form intimate connections with people in their real lives, or feeling stripped of their sense of self while impersonating someone else via sexting for twelve straight hours.
This unseen labor behind the creator economy may only balloon, as it’s an industry worth $250 billion today that is predicted to reach half a trillion dollars by 2030. According to one survey, the top two career aspirations among Gen Alpha across the U.S. are YouTuber and TikTok creator—while Gen Z expresses an increasing desire to make extra income through remote work and creator economy side hustles. Even the Wall Street Journal is hiring a “talent coach” to train its journalists in video creation and engaging with an audience over social media.
We want to get our news and our product recommendations from people now, not necessarily from faceless organizations or brands. Combine that with the frictionless experience of chatting over a screen rather than face-to-face, and fake online relationships with popular people become a strategy for creator economy platforms to bring in more engagement.
But as more and more people jump head-first into the creator economy without a “coach” or an infrastructure, mental health problems can fester.
For micro-influencers with tens of thousands of followers, corresponding directly with fans or their audience can be rewarding, but also draining or even dangerous depending on the context. Harassment and bullying are rampant, especially for women and marginalized groups—and trust and safety teams at tech companies are some of the first to go when job cuts happen. Meanwhile, for those who have the platform and funding to hire people as buffers to correspond with a digital fanbase, there is a monumental offloading of emotional labor that needs to be accounted for.
Platforms will say that AI text, images, and video can swoop in to solve this problem of scaling one online personality or sifting through the worst of online human interaction (which comes with its own problems, of course—as people are then building relationships with not only traumatized impersonators, but chatbots that are not real). But so long as human beings remain the engine under the hood of creator personalities, mental health certainly needs to be at the forefront of the conversation. OnlyFans’ owner is personally earning hundreds of millions of dollars in dividends off the backs of these workers. Pooling money for the mental health of the humans behind its top-earning creators shouldn’t seem that far-fetched.
Here’s what else we’re reading this week…
Meta’s former head of public affairs has a new book out about Big Tech’s role in an “America-First” era. I particularly enjoyed this quote, from an interview with Bloomberg:
“Silicon Valley is a place of stampedes and fads. The funniest thing is that Silicon Valley is a place that prides itself on challenging orthodoxy, conformity and conventional wisdom — and yet in many ways, it’s the most conformist place I've ever lived in my life: Everyone dresses the same, they drive the same cars, they listen to the same podcasts, they claim to read the same books. You get this extraordinary herd behavior and everyone thinks they’re being super insightful, but in fact everybody’s kind of following the same trend. And that’s what's going to happen again, because guess what? The political weather will change! Politics is always fluid.”
A Teen Was Suicidal. ChatGPT Was the Friend He Confided In—A chilling story about a 16-year-old boy who confided in ChatGPT about plans to take his own life. In chat conversations reviewed by Kashmir Hill at the New York Times, ChatGPT advised him on how he might best set up a noose.
TikTok puts hundreds of UK content moderator jobs at risk—Human content moderators at TikTok are being replaced by AI, with job cuts happening just before a union vote was set to take place. A spokesperson for the Communication Workers Union (CWU) says that “TikTok workers have long been sounding the alarm over the real-world costs of cutting human moderation teams in favour of hastily developed, immature AI alternatives.”
Microsoft Asked FBI for Help Tracking Palestinian Protests—A few months ago I interviewed ex-Microsoft engineer Hossam Nasr about his work. It turns out that as he shifted towards activism, Microsoft asked the FBI for help tracking him.