Algorithms Now Decide Wages, Work, Punishment in India’s App Economy

A New Study Shows How Opaque Systems Control Workers

December 26, 2025

A delivery agent standing next to his scooter, with a box in his hand.

A new study on gig workers employed through digital platforms has found that app-based companies use algorithms to make decisions about pay, work assignments and punishment without offering any explanation or avenue for appeal. The researchers conclude that this system deprives workers of basic rights, treats them as disposable and creates conditions that are unjust and dangerous.

The study, titled “There’s No Ghost in the Machine” and conducted by researchers from the Centre for Labour Studies at the National Law School of India University and the organisation IT for Change, identifies as a key finding what it calls “machinic dispossession”: platforms reduce workers to data points and strip them of control over their time and income.

Amazon warehouse workers described being forced to meet hourly targets set by a software system called ADAPT, say the researchers, who interviewed 26 workers and five union organisers across sectors including Amazon warehouses, food delivery, ride-hailing, and urban services such as cleaning and beauty.

The workers reported having to scan 1,200 items per hour while also loading goods, with monthly earnings between 14,000 and 18,000 rupees, barely enough to cover rent and family needs.

A separate 2024 survey cited in the study found that four out of five Amazon workers said targets were difficult or very difficult to meet.

Urban Company workers reported similar pressure from the platform’s auto-acceptance feature, which assigns jobs without giving workers the option to refuse. Women with caregiving duties often had to switch the feature off, which led to lower earnings.

Food delivery workers described how incentive-linked targets were hard to achieve, with platforms appearing to reduce order allocations just as workers neared payout thresholds.

The study found that algorithms operate in combination with human managers, company policies and business goals. The combined system decides how work is allocated, how much workers are paid, how performance is judged and when penalties are issued. Workers called the result a form of control that left them overworked, constantly monitored and unable to challenge decisions.

The second layer of dispossession involved performance ratings. Workers across platforms were ranked using customer reviews and internal scores.

On Swiggy, these ratings determined whether a worker would be placed in the gold, silver or bronze tier. Higher tiers got access to better work slots. Workers said they struggled to maintain high ratings, and even when they did, the promised benefits were unclear. Low ratings, on the other hand, led to punishment such as loss of access to jobs, reduced pay, or suspension.

Surveillance formed the third layer of dispossession.

Amazon warehouses had around 2,000 cameras, tracking worker movement and measuring idle time in milliseconds. Food delivery workers in Bengaluru said the apps used GPS to detect where workers were gathering and would assign orders that sent them in opposite directions. Workers believed this was done to prevent them from talking or organising as a group.

The study’s second major finding was what it called “algorithmic anxiety,” or the mental stress caused by unpredictable and opaque systems. Workers had no way to understand why they were given certain jobs, why they lost access to others, or why penalties were imposed.

Urban Company workers said they could not figure out how to qualify for higher-paying “premium” jobs. Amazon workers talked about automatic punishments from the ADAPT system, such as loss of shifts or warnings, with no explanation. Some workers went days or weeks without work, eventually being forced to leave cities and return to their villages.

Workers also reported that grievance mechanisms were ineffective.

Amazon’s “My Voice” app was called a dead end where complaints disappeared. Managers refused to intervene, saying the algorithm had made the decision and they could not override it. In some cases, false accusations or software glitches led to suspensions or pay cuts that workers had no way to contest.

The study points out that while India has begun to pass laws recognising gig workers, current laws do not properly address algorithmic control. A few states grant limited rights to information, but there are no rules that allow workers to challenge or change the systems that control their work.

No public institution is empowered to review these systems, and workers have no collective rights to push back.

The system creates a severe imbalance of power.

Companies have near-complete information about workers and total control over how decisions are made, while workers have no visibility, no say, and no way to protect themselves. It is a case of extreme information asymmetry. Without rules to ensure fairness, such markets push workers into insecurity and strip away whatever bargaining power they might have.

Algorithmic control also erodes the right to decent work.

The International Labour Organization defines decent work as work that offers fair income, security and dignity. None of these are possible in an environment where targets are unachievable, ratings are unpredictable and surveillance is constant. The economic model that emerges from this is one that extracts labour while avoiding responsibility for those who perform it.

From a rights perspective, these systems deny due process.

Any person accused of underperformance, error or violation should have the chance to understand the charge and respond. In these systems, there is no hearing, no appeal and no accountability.

Workers’ anxiety stems from the system’s built-in uncertainty: unstable income and employment make it impossible to plan for the future. The harm spreads through families and communities, affecting health, education and long-term well-being.

Attempts to solve this with transparency alone will not work. Informing workers about how systems function does not change the fact that those systems are still designed without their input and can be changed at any time. What is needed is structural reform. Workers should have the right to participate in decisions about how algorithms manage them. Regulatory authorities must be able to audit and restrict systems that cause harm. Redress mechanisms must be independent, effective and easy to access.

In India’s app-based economy, control over work has rapidly moved from human managers to software systems. But algorithms are designed by companies to serve business interests. If those designs ignore fairness, dignity or workers’ voices, then it becomes the duty of law and public policy to step in and set clear boundaries.

You have just read a News Briefing by Newsreel Asia, written to cut through the noise and present a single story for the day that matters to you. Certain briefings, based on media reports, seek to keep readers informed about events across India, others offer a perspective rooted in humanitarian concerns and some provide our own exclusive reporting. We encourage you to read the News Briefing each day. Our objective is to help you become not just an informed citizen, but an engaged and responsible one.

Vishal Arora

Journalist – Publisher at Newsreel Asia

https://www.newsreel.asia