‘In the end, you feel empty’: India’s female workers watch hours of abusive content to train AI

On the verandah of her family home, her laptop resting on a clay slab built into the wall, Monsumi Murmu works from one of the few spots with a mobile signal. The familiar sounds of domestic life drift from inside the house: the clink of pots, footsteps, voices.

A very different scene plays out on her screen: a woman is pinned down by a group of men; the camera shakes; there are sounds of screaming and breathing. The video is so disturbing that Murmu sped it up, but her work required her to watch it to the end.

Murmu, 26, is a content moderator for a global technology company, logging in from her village in the Indian state of Jharkhand. Her job is to classify images, videos and text flagged by automated systems as potentially violating the platform’s rules.

On average, she views 800 videos and photos a day and makes decisions that train algorithms to recognize violence, abuse and harm.

Monsumi Murmu in the forest near her house. Photo: Anuj Bahl

This work sits at the core of the recent successes of machine learning, which depend on the fact that AI is only as good as the data it is trained on. In India, the labour is increasingly performed by women, part of a workforce often described as “ghost workers”.

“For the first few months, I couldn’t sleep,” she says. “I would close my eyes and still see the screen loading.” Images haunted her in her dreams: of fatal accidents, of losing family members, of sexual violence she could not stop or escape. On those nights, she says, her mother would wake up and sit with her.

Now, she says, the images no longer shock her the way they once did. “In the end, you don’t feel upset – you feel empty.” There are still some nights, she says, when the dreams return. “That’s when you realise the job has done something to you.”

Researchers say this emotional numbing – followed by delayed psychological consequences – is a defining characteristic of content moderation work. “There may be moderators who avoid psychological harm, but I have not yet seen evidence of that,” says Milagros Miceli, a sociologist at the Data Workers’ Inquiry, a project examining the role of workers in AI.

“In terms of risk,” she says, “content moderation belongs in the category of hazardous work, alongside any other dangerous industry.”

Studies indicate that content moderation triggers lasting cognitive and emotional stress, often resulting in behavioural changes such as heightened vigilance. Workers report intrusive thoughts, anxiety and sleep disturbances.

A study of content moderators published last December, which included workers in India, identified traumatic stress as the most prominent psychological risk. It found that even where workplace interventions and support systems were in place, significant levels of secondary trauma persisted.

A slab carved from the mud wall of her house serves as Murmu’s table. She uses a secondhand laptop for her content moderation work. Photo: Anuj Bahl

As of early 2021, an estimated 70,000 people were working in data annotation in India, a market valued that year at around $250m (£180m), according to the country’s IT industry body, Nasscom. About 60% of the revenue came from the US; only 10% came from India.

About 80% of data-annotation and content-moderation workers come from rural, semi-rural or otherwise marginalised backgrounds. Companies deliberately operate from smaller cities and towns, where rents and labour costs are lower and a growing pool of first-generation graduates is looking for work.

Improvements in internet connectivity have made it possible to plug these locations directly into global AI supply chains without moving workers to cities.

Half or more of this workforce is women. For companies, women are seen as reliable, detail-oriented, and more likely to accept home-based or contract work that may be seen as “safe” or “respectable”. These jobs provide rare access to income without migration.

A large number of workers in these centres come from Dalit and tribal communities. For many of them, any kind of digital work represents a step up: cleaner, more regular and better paid than agricultural labour or mining.

A data annotation office in Ranchi, Jharkhand. Tech companies often set up offices in smaller cities. Photo: Anuj Bahl

But working from or close to home may also reinforce women’s marginal position, according to Priyam Vadalia, a researcher working on AI and data labour, who was previously at the Bengaluru-based Apti Institute.

“The respectability of the work, and the fact that it arrives at the door as a rare source of paid employment, often creates an expectation of gratitude,” she says. “That expectation can discourage workers from questioning the psychological harm it causes.”

Raina Singh was 24 when she started data-annotation work. Recently graduated, she planned to become a teacher, but needed the certainty of a monthly income before pursuing it.

She returned to her hometown Bareilly in Uttar Pradesh and logged in from her bedroom every morning to work through a third-party firm that contracted for global technology platforms. The salary – around £330 a month – seemed reasonable. The job description was vague, but the work seemed manageable.

Her early work involved text-based tasks: screening short messages, flagging spam, identifying scam-like language. “It didn’t feel worrisome,” she says. “Just boring. But there was also something exciting. I felt like I was working behind an AI. To my friends, AI was just ChatGPT. I was seeing what it did.”

But about six months in, the workload changed. Without notice, Singh was transferred to a new project involving an adult entertainment platform. Her job was to flag and remove material related to child sexual abuse.

“I never thought this would be part of the job,” she says. The material was graphic and relentless. When she raised concerns with her manager, she remembers being told: “It’s God’s job – you’re keeping the kids safe.”

Raina Singh working on her laptop: ‘It didn’t feel worrisome, just boring. But there was something exciting too.’ Photo: Anuj Bahl

Soon after, the work shifted again. Singh and six others on her team were instructed to classify obscene material. “I can’t even count how much porn I was exposed to,” she says. “It was constant, hour after hour.”

The work affected her personal life. “I became disgusted by the idea of sex,” she says. She withdrew from intimacy and felt increasingly isolated from her partner.

When Singh complained, the response was blunt: “Your contract says data annotation – this is data annotation.” She quit the job, but a year on, she says, thinking about sex can still trigger nausea or distress. “Sometimes, when I’m with my partner, I feel like a stranger in my own body. I want closeness, but my mind keeps pulling away.”

Vadalia says job listings rarely reveal what the work actually involves. “People are hired under vague labels, but only after the contract is signed and training begins do they realize what the real work is.”

Remote and part-time roles are aggressively promoted online as “easy money” or “zero-investment” opportunities. YouTube videos, LinkedIn posts, Telegram channels and influencer-led tutorials present the work as flexible, low-skilled and safe.

Hyderabad is home to India’s AI industry – a far cry from the scattered rural locations where data is actually labeled. Photo: Anuj Bahl

The Guardian spoke to eight data-annotation and content-moderation companies in India. Only two said they provided psychological support to workers; the others argued that the work was not demanding enough to require mental health care.

Vadalia says that where support exists, workers must seek it out themselves, shifting the burden of care on to them. “This ignores the reality that many data workers, especially those from remote or marginalised backgrounds, don’t even have the language to describe what they’re experiencing,” she says.

The absence of legal recognition of psychological harm in India’s labour laws, she says, also leaves workers without meaningful protections.

Monsumi Murmu takes a walk in the forest to help deal with work stress. ‘I sit under the open sky and try to notice the peace around me.’ Photo: Anuj Bahl

The psychological impact of the isolation is intense. Content moderators and data workers are bound by strict non-disclosure agreements (NDAs), which prevent them from talking about their work, even with family and friends. Violating an NDA can result in dismissal or legal action.

Murmu feared that if her family understood her work, she, like many other girls in her village, would be forced to leave a paid job and get married.

With only four months left on her contract, which pays around £260 a month, the prospect of unemployment worries her more than her mental health does. “I’m more worried about finding another job than about the work,” she says.

In the meantime, she has found ways to live with it. “I go for long walks in the woods. I sit under the open sky and try to notice the peace around me.”

On some days, she collects mineral stones from the land near her house or draws traditional geometric patterns on its walls. “I don’t know if it really cures anything,” says Murmu. “But I feel a little better.”
