Transparency and explainability for algorithmic decisions at work

Any employer or platform that uses AI or other algorithmic tools to manage their workers must be clear and open about how those tools work. Workers need to be able to understand decisions that affect them, especially when it’s about important things like their working hours and pay. Otherwise, companies can exert unfair control over workers by withholding important information from them.

That’s why workers must be given the right level and amount of information at the right time, including clear reasons for why decisions are made. This information should enable workers to understand which parameters are the most important in an automated decision-making process and what changes they can make to get the outcomes they prefer. They must also know how they can ask a human to review algorithmic decisions. Without parity in information, workers are left playing a game that they don’t know the rules for.

Ultimately, this is about respect for the rights of workers who are subject to algorithmic management, in particular their right to privacy and dignity. To provide the foundation for that respect, we’re calling on all employers and platforms to do the following when using AI and algorithms in the workplace.

Dear [insert names or companies here],

We, the undersigned, believe that companies should respect their workers. We believe that you should respect your workers.

Each one of you is a market leader. And each of you claims to care, variously promising that you ‘believe in doing business responsibly and having a positive impact’, that you will ‘put the voice of the rider at the heart of everything’ or that you ‘will ensure that we treat our customers, our colleagues, and our cities with respect; and […] will run our business with passion, humility, and integrity’.

But this has yet to be borne out in your business practices, which have led to millions of euros in fines for obstructing drivers’ attempts to enforce their rights and for systemically inappropriate data processing, and to accounts being de-activated by automated systems over minor overpayments.

Instead, you are automating exploitation - leveraging black box algorithms to make decisions about de-activation, work allocation, and pay without sufficient explanation, and stripping workers of the ability to understand and challenge those decisions.

We believe the foundation of respect is transparency. Yet current systems withhold vital information from workers, creating precarity, stress, and misery.

We believe a responsible employer should: