Maintain a public register of the algorithms used to manage workers

The register must include all algorithms that make management decisions that affect rights at work.


The public register is key to addressing the information imbalance of algorithmic management, by allowing workers (and candidates) and their representatives to understand what algorithms are being used and how they work. To achieve this, the register must be written in accessible, non-technical language and kept up to date. A variety of ways of communicating this information is likely to be valuable: for example, flow charts or short-form videos accompanying the textual explanations may help everyone to understand how an algorithm operates. The register must list all algorithms that affect workers' treatment while at work, and for each listed algorithm the following information must be included:

  • The purpose and design of the algorithm

A short (two or three sentence) description explaining what purpose(s) the company uses the algorithm for and why it has been preferred over other options.
An overview of the algorithm's design should also be given, including: what sort of management decisions the algorithm makes (and whether they are advisory or decisive); whether it relies on neural networks, machine learning, probabilistic functions and so on; what training data was used; and under what circumstances the algorithm is not deployed or has a failsafe.
:::success
Efficiency ranking. This algorithm measures how productive you have been during the day. It helps us to understand what amount of work is achievable and where there is room for improvement. This allows us to ensure productivity while allowing you the flexibility to work from home.
:::
:::success
Dynamic pricing. This algorithm sets a price for each trip. Depending on the situation, prices may vary by up to 20%. All trips accepted or rejected by the driver are recorded and used to further fine-tune the algorithm.
:::
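
Purely as an illustration of how these purpose-and-design fields could be held in a structured, machine-readable form alongside the plain-language register, here is a minimal sketch. Every field name and value is invented for the example, not drawn from any real register.

```python
from dataclasses import dataclass

@dataclass
class RegisterEntry:
    """One entry in a public register of workplace algorithms (illustrative)."""
    name: str                 # e.g. "Efficiency ranking"
    purpose: str              # two-or-three-sentence plain-language description
    decision_types: list      # management decisions the algorithm makes
    advisory_only: bool       # advisory recommendation vs. decisive action
    techniques: list          # e.g. ["machine learning"], ["rules-based"]
    training_data: str        # plain-language description of training data
    failsafe_conditions: str  # when the algorithm is not deployed

# A hypothetical entry modelled on the "Efficiency ranking" example above.
entry = RegisterEntry(
    name="Efficiency ranking",
    purpose="Measures daily productivity to understand achievable workloads.",
    decision_types=["productivity scoring"],
    advisory_only=True,
    techniques=["machine learning"],
    training_data="Anonymised task-completion records",
    failsafe_conditions="Not applied on days with reported system outages",
)
print(entry.advisory_only)
```

A structured record like this could sit behind the accessible text, making it easier to keep the register complete and up to date.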

  • The relative importance of the algorithm's inputs and parameters

The register must explain, in an accessible and non-technical way, what data and ratings the algorithm uses to reach its decisions. This means providing an easy way to understand how important different inputs and parameters are to different decisions. This could be done in various ways: from a simple 'high/medium/low importance' rating, to more specific and granular detail of the weighting each input or parameter carries.
As well as explaining how important different parameters and inputs are, the register should also explain their source (the app, customers, the web, inference, data brokers; how long ago they were collected; whether they were collected while at work; and so on). In particular, the register should confirm that the algorithm uses only data that is strictly necessary for its purposes, and does not use sensitive personal data, emotion recognition, data collected while not at work, and the like.

It is possible that AI algorithms will use parameters that are hard to describe in real-world human terms. In such cases, the company must thoroughly explain how the tool has been built, and how it monitors and audits the tool's outputs to ensure they do not result in bias or discrimination. Examples (or statistics) comparing different, but similar, inputs with differing outputs may also be needed to show that 'these sorts of inputs tend to lead to these sorts of outputs'.


:::success
Our facial recognition algorithm assesses whether or not the face presented to it matches the information held on our databases. It has been trained on a dataset of [something] and its decisions will always be the same given the same inputs. There are no underlying profiles or scores behind the algorithm, and the key parameters for its decisions are the distances between facial features such as the eyes and mouth. The algorithm is not used when it cannot get enough information from the face presented to it.
:::
:::success
"As a priority, Worker Info Exchange will be helping drivers demand access to their Uber worker performance profile and full information from Uber on how it has been assessed." https://www.workerinfoexchange.org/post/uber-surveils-one-and-the-tesla-tease-what-we-learned-from-uber-s-latest-financial-results
:::
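
The 'high/medium/low importance' disclosure described above can be sketched very simply. This is only an illustration: all of the input names, weights and thresholds below are invented, and a real register would publish whatever ratings or weightings the company actually uses.

```python
# Hypothetical published weightings for a gig-work rating algorithm,
# with the source of each input disclosed alongside its weight.
INPUT_IMPORTANCE = {
    "customer_rating":   {"weight": 0.45, "source": "customers, in-app"},
    "acceptance_rate":   {"weight": 0.35, "source": "app, while at work"},
    "time_since_signup": {"weight": 0.20, "source": "app, account data"},
}

def rating(weight):
    """Translate a numeric weight into a high/medium/low label
    (thresholds chosen arbitrarily for this sketch)."""
    if weight >= 0.4:
        return "high"
    if weight >= 0.25:
        return "medium"
    return "low"

for name, info in INPUT_IMPORTANCE.items():
    print(f"{name}: {rating(info['weight'])} importance (source: {info['source']})")
```

Publishing both the simple label and the underlying weight lets workers choose the level of detail they want, without forcing everyone through the technical version.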

  • Human intervention
    Where algorithms are used to make decisions in the workplace, there should always be a human checking, or able to review, those decisions. The register must specify who has what oversight over the algorithm's outputs, what level of decision-making authority they have, and how they can be contacted (contact details need not be public, but must be accessible to workers).

The register should also provide some operational information: how much staff capacity (in FTE) is dedicated to human review, and how long a review is expected to take. If a review exceeds this expected period, the decision's effect should be paused until the review is complete.
:::success
Delivery rate algorithm. If you think the algorithm has failed to take into account an event beyond your control which affected your delivery rate, please contact your area supervisor. Their contact details can be found under 'management functions' in the app.
:::
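
The pause-pending-review rule could be expressed as a simple check: an algorithmic decision stays in effect only while its review is either complete or still within the expected review period. The 48-hour deadline and all names below are assumptions for the sketch, not figures from the text.

```python
from datetime import datetime, timedelta

# Illustrative expected review time; a real register would publish the actual figure.
REVIEW_DEADLINE = timedelta(hours=48)

def decision_in_effect(decision_time, review_complete, now):
    """A decision under human review loses effect once the expected
    review period has elapsed without the review being completed."""
    if review_complete:
        return True
    return now - decision_time <= REVIEW_DEADLINE

t0 = datetime(2024, 1, 1, 9, 0)
# Review still pending 72 hours later: the decision's effect is paused.
print(decision_in_effect(t0, False, t0 + timedelta(hours=72)))
```

The point of the rule is that delay costs the company, not the worker: an overdue review suspends the algorithm's decision rather than leaving it in force.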

  • Development history and updates
    The company should also state where responsibilities for the development and updating of the algorithm lie, especially where an external supplier has been involved. This does not require identifying individuals, but rather naming the relevant teams, departments or organisations and the nature of their different responsibilities. A log of updates should also be published.
