Your AI-Generated Work Is Annoying Your Colleagues

Summary
– Workers are creating low-quality “workslop” by over-relying on AI tools like ChatGPT and Gemini for tasks such as writing and coding.
– This AI-generated content appears adequate but lacks substance, forcing peers or managers to spend nearly two hours correcting it.
– Workslop disproportionately affects professional services and technology industries, shifting the cognitive workload from creator to receiver.
– Employees who produce workslop are viewed more negatively, with half of respondents seeing them as less creative, reliable, and capable.
– Despite promises of productivity gains, AI’s return on investment remains unclear, with only 5% of companies reporting measurable benefits.
The promise of artificial intelligence in the workplace often centers on a dramatic boost in efficiency. However, a troubling side effect is emerging: a flood of low-quality, AI-generated content that colleagues are dubbing “workslop.” This phenomenon isn’t just an annoyance; it’s creating significant extra work and damaging professional reputations. New research indicates that this trend is forcing managers and peers to pick up the slack, ultimately harming careers and casting doubt on the immediate return on investment for AI tools.
The term “workslop” describes output that appears competent on the surface but lacks the substance needed to genuinely move a project forward. According to a collaborative study, a startling 40% of employees reported receiving such subpar work from a colleague within the last month. While this low-effort content is most commonly exchanged between peers, it is also being passed up the chain to managers by their direct reports.
While cutting corners is not a new workplace behavior, the tools enabling it are. Applications like ChatGPT and various coding assistants are now being used to draft reports, fix software bugs, and create presentations. The core problem arises when employees delegate too much cognitive labor to these systems. The individual does less of the actual thinking, leading to results that are often superficial or incorrect. This then forces someone else, a coworker or a supervisor, to invest time in interpreting, correcting, or completely redoing the assignment.
This issue is pervasive but seems to hit professional services and technology sectors particularly hard. The researchers point out that the real damage of workslop is how it redistributes effort. It effectively transfers the burden of work from the original creator to the receiver, creating a hidden tax on productivity. This isn’t merely about over-reliance on technology; it’s about using machines to offload cognitive work onto another human being.
The consequences for those who produce workslop are tangible. The survey data reveals that half of all respondents form a more negative opinion of colleagues who submit such work, viewing them as less creative, reliable, and capable. This perception can stall career advancement and erode trust within teams.
Furthermore, workslop highlights a glaring contradiction in the narrative surrounding AI. Despite bold claims from tech companies about supercharging productivity, the return on investment remains unclear for most organizations. One report suggests that a mere 5% of companies have actually realized a financial return from their AI investments. The research on workslop quantifies the problem, finding that receiving a piece of AI-generated content adds nearly two hours of extra work for the person who has to deal with it.
One survey participant captured the frustration perfectly, explaining, “I had to waste more time following up on the information and checking it with my own research. I then had to waste even more time setting up meetings with other supervisors to address the issue. Then I continued to waste my own time having to redo the work myself.” This sentiment underscores that the promised efficiency gains of AI can be entirely negated when the output simply creates more work for everyone else.
(Source: ZDNET)