Authored by Autumn Spredemann via The Epoch Times,
Across the artificial intelligence (AI) supply chain, insiders describe a precarious, high-turnover workforce with limited support and stability.
This “invisible” human labor that labels data, evaluates outputs, and filters harmful material has become a revolving door of talent that navigates high-pressure gigs and burnout. Moreover, workers and industry experts say this talent churn can degrade the very AI models workers are paid to improve.
Across the board, workers who are hired to support, evaluate, or operationalize AI systems face similar challenges: high-stress environments that often involve complex tasks, unrealistic timelines, job instability, and low wages.
It’s no secret that the tech industry has long suffered from high turnover rates. Numbers vary, but many studies put the average rate of talent churn in the tech sector at between 13 percent and 18 percent.
The stakes become clear when considering the cost of replacing tech talent, which can reach 150 percent of a worker’s salary once recruitment expenses, onboarding time, productivity losses, and impacts on customer relationships are counted.
Some believe that the loss of institutional knowledge alone makes worker retention critical.
“People love to talk about the ‘magic’ of AI, but the work culture behind it is a meat grinder. I’ve seen talent turnover in model evaluation hit record highs because the work is repetitive and psychologically draining,” Barry Kunst, vice president of marketing at Solix Technologies, told The Epoch Times.
“When you lose a lead researcher to churn, you don’t just lose a body; you lose the ‘why’ behind the model’s safety guardrails,” Kunst said.
This is why he’s adamant about AI workforce stability, which he said correlates directly with model reliability: “If you’re rotating contractors every six months to keep labor costs low, your data governance will fail, period.”
Sovic Chakrabarti, the director of digital marketing agency Icy Tales, said, “Team turnover is more common than people expect.
“In some groups, especially those tied to model training, evaluation, or data labeling pipelines, churn can happen every few months. Short contracts, project-based funding, and constant reorganization mean people cycle in and out quickly,” he told The Epoch Times.

A technician works at an Amazon Web Services AI data center in New Carlisle, Ind., on Oct. 2, 2025. Noah Berger for AWS/Reuters
Chakrabarti has worked on the development and support side of AI systems long enough to see patterns that, as he put it, “rarely make it into public discussions.”
“That [workforce] churn absolutely leads to lost knowledge,” he said. “Important context about why a dataset was filtered a certain way, why a safety rule exists, or why a model behaved oddly in testing often lives in someone’s head.”
When that person leaves, documentation rarely captures the full story, according to Chakrabarti. “New hires inherit systems without understanding the original tradeoffs, which can quietly introduce risks,” he said.
The Human Cost
Burnout rates among information technology (IT) workers are high. LeadDev’s Engineering Leadership Report 2025 found that 22 percent of the 617 polled engineering leaders and developers felt critically burned out at work.
An additional 24 percent of respondents reported feeling “moderately” burned out, while 33 percent reported low levels of burnout.
Some of this is driven by job-security fears after two years of layoffs at big tech companies, but pay for many of the workers fueling the AI revolution remains low.
The Alphabet Workers Union (AWU), Communications Workers of America (CWA), and TechEquity led a study on the working conditions of U.S.-based data workers and found conditions similar to those of tech contractors in developing countries.
In a survey of 160 U.S. data workers, 86 percent worried about being able to pay their bills, and 25 percent relied on public assistance to get by. The same group reported a median hourly wage of $15, with a median annual salary of $22,620.
Eighty-five percent of the study group said they’re expected to be “on call” for work, but only 30 percent reported being paid for this time. More than a quarter of respondents reported spending more than eight hours per week on call.
“If there’s anything I wanted the general public to know, it is that there are low paid people [in the United States] who are not even treated as humans—just little more than employee ID numbers —out there making the 1 billion dollar, trillion dollar AI systems that are supposed to lead our entire society and civilization into the future,” Kirn Gill II, a search quality rater working on Google products at Telus, told the CWA.
Chakrabarti said the work culture behind AI fuels these challenges.
“There is real pressure to keep labor costs low. I have seen unrealistic timelines, understaffed teams, and expectations to ‘do more with less’ while the stakes keep rising. That tension creates stress, especially when the systems affect millions of users,” he said.

The ChatGPT app icon is seen on a smartphone screen in Chicago on Aug. 4, 2025. AP Photo/Kiichiro Sato, File
He added that being part of the shadow workforce behind AI can also be psychologically demanding.
“You carry responsibility without always having authority or time to do things properly. … As tools evolve, roles shift fast, and many people feel replaceable even while being essential,” Chakrabarti said.
Nicky Zhu, an AI interaction product manager at Dymesty, agreed that the cost-containment pressure on data workers is “unrealistic” and is fueling the burnout phenomenon.
“Companies employ contractors instead of using permanent staff, mandate 60-hour crunch weeks, and expect rapid learning of intricate systems. I have witnessed multiple capable engineers exit the field of AI completely because of the high levels of instability and the unmanageable workload,” Zhu told The Epoch Times.
Zhu said the mental strain associated with data work is often unacknowledged.
“Staff are regularly exposed to disturbing material during safety testing, including assessing harmful content. Knowing that your work impacts millions of users increases the stress. The combination of rapid AI development, job uncertainty, and high turnover is mentally overwhelming,” she said.
In the data worker conditions analysis, respondents reported limited or no access to mental health benefits, despite being what the study authors called a “first line of defense, protecting millions of people from harmful content and imperfect AI systems.”
Only 23 percent of data workers surveyed reported having employer-provided health benefits.
The International Labor Organization noted that large language models such as ChatGPT and Claude still require “invisible workers” who fine-tune AI responses, mitigate biases, and eliminate toxic or disturbing content behind the scenes.
“As a result, workers are routinely exposed to graphic violence, hate speech, child exploitation, and other objectionable material. Such constant exposure can take a toll on their mental health and trigger post-traumatic stress disorder, depression, and reduced ability to feel empathy,” the International Labor Organization stated.
Revolving Door Risks
A knock-on effect of AI’s constant labor churn is an increase in cybersecurity risk.
“Labor turnover literally impacts the quality, safety, and reliability of models,” Janero Washington, education director at ACSMI Cybersecurity Certification, told The Epoch Times.
“Large turnover interferes with domain knowledge, delays in the iteration process, and the probability of missing key details in the development.”
Washington said this could have a “direct influence on the accuracy and strength of [AI] models, particularly during deployment phases.”
He added that low labor costs are the primary pressure point in AI projects, which tend to prioritize cost-efficiency over balanced investment in skilled labor.
“It may result in corners being cut, including overworking teams, unrealistic deadlines, or having to use less experienced hires to keep budgets,” he said.
Zhu has seen firsthand how workforce churn affects the efficiency of AI tools: “Knowledge is lost faster than it is documented. Important information about model edge cases, limitations, safety procedures, and related details is lost when contractors leave after six or 12 months.”
When she started her current position, Zhu found that three teams had attempted to resolve the same set of problems using an AI feature that had already been built.
“Still, no one had documented the rationale for the different design decisions. Ultimately, we had to remake previously developed design solutions for problems that had already been solved. This is an all-too-common reality for the industry,” she said.
The data security platform Cyberhaven observed that 24 hours before a layoff or employee resignation, organizations can experience a 720 percent surge in data exfiltration. This includes everything from downloading sensitive files to forwarding emails or copying customer lists, all of which can have significant consequences.
Washington said that critical knowledge or details can be easily lost when a data team is reliant on a short-term contract or experiencing a high talent turnover.
“This affects continuity of knowledge of datasets, edge cases, or versioning issues, causing inefficiencies and possibly a rework of the same issue,” he said.
Chakrabarti agreed. “When teams are stretched thin or constantly rebuilding, issues get patched instead of deeply solved,” he said.