Time to Deliver Answers
We're calling on Deliveroo, Uber, and Just Eat Takeaway to take serious steps to significantly improve the transparency and explainability around the algorithms they use to manage their workforce.
Algorithmic management of workers has become the norm for gig-economy platforms, with workers obligated to give up an immense amount of personal data just to go to work. Decisions made by these algorithms can determine how much individuals are paid and even whether their employment or accounts are suspended or terminated. Yet, workers are often not provided with satisfactory explanations as to how these decisions are made. This lack of transparency means that decisions made through the “black-box” of an algorithm are seemingly impossible to challenge.
That's why we're calling on food delivery platforms to step up and change this status quo. By implementing three clear recommendations, these platforms have an opportunity to lead the way and demonstrate that they respect their workforce and its rights, autonomy, and dignity.
What's the problem?
Food delivery platforms rely heavily on algorithms to manage many aspects of their workers' employment, from account creation to account suspension to how much workers get paid. Yet it's almost impossible for workers to know how those decisions are made, and even harder to challenge them. Increasing the transparency and explainability of these algorithms can change that, giving workers the autonomy and dignity they deserve.
What are our recommendations?
This is key to allowing workers to understand which algorithms they may be subjected to, and begins to address the information asymmetry that prevents workers from knowing what they're signing up for.
This should allow workers to challenge decisions that may be unfair or flawed, help them better understand the decisions affecting them, and help them meet the expectations of the companies they work for.
Algorithms can be complex, and that complexity can produce results that are difficult to understand and can entrench systemic discrimination. Allowing people to test these systems will build a deeper understanding of their impact on those affected, including the potential harms they may cause.