Content type: Advocacy
Dejusticia, Fundación Karisma, and Privacy International submitted a joint stakeholder report on Colombia to the 44th session of the Universal Periodic Review at the UN Human Rights Council. Our submission raised concerns regarding the protection of the rights to freedom of expression and opinion, to privacy, and to personal data protection; the shutdown of civil society spaces; protection of the right to protest; and protection of the rights of the Venezuelan migrant and refugee population.…
Content type: Examples
Behind every powerful AI system are huge numbers of people labelling and clarifying data to train it, contracted by companies like Remotasks, a subsidiary of Silicon Valley-based data vendor Scale AI, whose customers include the US military and OpenAI. Often the workers, who are assigned tasks they don't understand for a purpose they don't know, are sworn to secrecy. Yet labelling is crucial; it can make the difference between a car stopping to spare the person walking a bike across the road or…
Content type: Examples
More than 150 workers employed by third-party outsourcing companies to provide content moderation for the AI tools on which Facebook, TikTok, and ChatGPT depend have pledged to create the African Content Moderators Union. The move to create such a union began in 2019, when the outsourcing company Sama fired Facebook content moderator Daniel Motaung for trying to form a union.
https://time.com/6275995/chatgpt-facebook-african-workers-union/
Publication: Time
Publication date: 2023-05-01
Writer: Billy…
Content type: Examples
Four people in Kenya have filed a petition calling on the government to investigate conditions for contractors reviewing the content used to train large language models such as OpenAI's ChatGPT. They allege that these conditions are exploitative and have left some former contractors traumatized. The petition relates to a contract between OpenAI and data annotation services company Sama. Content moderation is necessary because LLM algorithms must be trained to recognise prompts that would generate harmful…