AI Trained by Lawyers and Scientists to Replace Them

Summary
– Katya, an unemployed professional, reluctantly took a job with Mercor to create AI training data, a role that involved writing prompts and grading chatbot responses but offered no job security.
– AI companies are paying billions to hire highly educated professionals from diverse fields to produce specialized training data, creating a massive new gig economy for human expertise.
– Workers at companies like Mercor face precarious conditions, including constant surveillance, abrupt project cancellations, decreasing pay, and pressure from inexperienced young managers.
– The work is structured to keep laborers in the dark, with strict confidentiality, unpredictable hours, and the inherent irony of training AI systems that may replace their own professions.
– This system creates extreme worker precarity, reminiscent of other platform labor, and is leading to legal challenges over worker misclassification and fears of a race to the bottom for professional wages.
The rapid advancement of artificial intelligence is creating a paradoxical new job market, where professionals are hired to train the very systems that may one day replace them. This emerging field of AI data work is drawing in lawyers, scientists, writers, and countless other experts, offering a financial lifeline while simultaneously fueling anxieties about the future of their professions. For many highly educated workers facing a tough job market, these gigs represent a surreal and precarious new reality.
One worker, Katya, found herself in this exact situation. After losing her job, she was recruited by a company called Mercor to help train an AI model. The process involved creating detailed prompts and ideal responses for a chatbot, work she initially found engaging and well-compensated. However, the project was abruptly canceled without warning, leaving her financially stranded. This pattern of unstable, on-demand work is common in the industry. Companies like Mercor, Scale AI, and Surge AI act as intermediaries, hiring legions of professionals to generate the specialized data that teaches AI systems to perform complex tasks.
The demand for high-quality training data is insatiable, leading these firms to recruit experts from virtually every field imaginable. Job listings seek everything from Supreme Court litigators and management consultants to experts in “North American early to mid-teen humor.” The goal is to create exhaustive criteria, or “rubrics,” that define a perfect output for an AI, whether it’s writing legal briefs, generating financial analysis, or crafting advertising copy. This process has been described as the largest harvesting of human expertise ever attempted.
For the workers, the experience is often defined by instability and intense pressure. Projects can start and stop without notice, dictated by the shifting needs of the AI labs that are the ultimate clients. Pay rates frequently drop as projects progress, while the time allotted per task shrinks. Workers are monitored by tracking software that measures their productivity to the second and penalizes any unproductive time. The constant fear of being “offboarded,” or removed from a project, leads many to work off the clock or even use AI tools to complete tasks faster, despite strict rules against doing so.
The management structure at many of these startups adds to the frustration. Workers frequently report being managed by people in their early twenties with little professional experience, who issue contradictory instructions and enforce demanding schedules. The culture often prioritizes relentless availability, with messages and tasks appearing at all hours. For mid-career professionals, taking direction from managers who may not understand their field is a particular point of contention.
This system creates a profound power imbalance. Strict confidentiality agreements prevent workers from disclosing who they work for or the specifics of their projects, making it impossible to leverage their experience or organize collectively. They exist in a state of perpetual competition, easily replaced by a global pool of other qualified candidates. When a project ends, they are often invited to apply for new ones, throwing their application into a pool with thousands of others.
The transient nature of the work is built into the AI development cycle itself. A lab identifies a weakness in its model, say, a lack of chemistry knowledge, and orders a batch of data from chemists. Once that data is delivered and tested, the job pauses. It might resume with new requirements, be canceled entirely, or shift to an entirely different domain like biology or architecture. Data companies maintain a large, on-call workforce to meet these unpredictable demands, bearing no cost for overhiring since workers are classified as independent contractors.
This arrangement allows employers to treat labor as a utility, turning it on and off like a tap. While some workers can earn significant money if they move fast and have in-demand skills, there is an inescapable awareness that they are training their own replacements. As models improve, the specific expertise they are selling becomes obsolete. A linguist who spent a year creating challenging tasks for AI found the work drying up as the models mastered every obscure theory he could devise.
Legal challenges are beginning to emerge, with lawsuits alleging that companies misclassify workers as independent contractors given the extreme control they exert. However, the global nature of the work makes collective action difficult. Unlike ride-share drivers who can organize locally, data workers can be recruited from anywhere in the world, creating a race to the bottom on wages.
For Katya, the psychological toll and lack of stability eventually became too much. After witnessing pay cuts and chaotic project management, she decided to apply for a job at a local coffee shop. She hoped for the basic security of known hours and human interaction, a stark contrast to the isolated, precarious work of training AI. Yet, as she contemplated this shift, her phone chimed with a notification: another project was starting. The cycle, for her and thousands like her, continues.
(Source: The Verge)