Freedom House survey finds 30 countries manipulate elections via armies of "opinion shapers"
A 2017 Freedom House survey of 65 countries found that 30 of them were using armies of "opinion shapers" to manipulate elections, advance anti-democratic agendas, and repress their citizens. Although most of these countries direct such efforts at domestic opinion, the report finds that manipulation and disinformation played a role in elections in at least 17 other countries, including the US presidential election and the UK's EU referendum. The number of countries using these techniques - which include Venezuela, Turkey, and the Philippines - has grown every year since the first such report was issued in 2009, and the techniques in use have become increasingly sophisticated. Today, bots, propagandists, and fake news outlets exploit social media and search algorithms, as well as integration with trusted content, to ensure high visibility and wide spread.
Writer: Alex Hern
Publication date: 2017-11-14
Personalisation, persuasion, decisions and manipulation
Data observed, derived or predicted from our behaviour are increasingly used to automatically rank, score, and evaluate people, and - through ever more advanced processing techniques - to make consequential decisions about them. If this trend continues, people will be scored in all aspects of their lives, societies will be managed invisibly, and human behaviour will come under the control of a powerful few.
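To make the idea of automated scoring concrete, here is a minimal sketch. All signals, weights and thresholds are hypothetical, invented for illustration; they do not describe any real scoring system.

```python
# Illustrative only: an opaque behavioural score gating a decision
# with no human in the loop. Every feature and weight is hypothetical.
def behaviour_score(late_night_logins, payment_delays, address_changes):
    """Combine behavioural signals into a single score in the range 0-100."""
    score = 100 - 5 * late_night_logins - 10 * payment_delays - 8 * address_changes
    return max(0, min(100, score))

def automated_decision(score, threshold=60):
    """Approve or reject purely on the score, with no human review."""
    return "approved" if score >= threshold else "rejected"

s = behaviour_score(late_night_logins=2, payment_delays=3, address_changes=1)
outcome = automated_decision(s)  # the person never sees how s was computed
```

The point of the sketch is not the arithmetic but the opacity: the person being scored has no visibility into which behaviours were counted or how they were weighted.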
Profiling makes it possible for highly sensitive details to be inferred or predicted from seemingly uninteresting data. As a result, it is possible to gain insight into someone’s presumed interests, identities, attributes or qualities without their knowledge or participation.
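A toy sketch of how innocuous records can yield an inferred profile. The topics and interest segments below are hypothetical, and real profiling systems use far richer data and statistical models; the mechanism shown - mapping mundane behaviour to presumed attributes the person never disclosed - is the point.

```python
from collections import Counter

# Hypothetical mapping from innocuous page topics to inferred segments.
TOPIC_TO_SEGMENT = {
    "running-shoes": "fitness",
    "heart-rate-monitors": "fitness",
    "baby-strollers": "new-parent",
    "loan-calculators": "credit-seeking",
}

def infer_segments(visited_topics):
    """Derive a ranked interest profile from a list of visited page topics."""
    counts = Counter(
        TOPIC_TO_SEGMENT[t] for t in visited_topics if t in TOPIC_TO_SEGMENT
    )
    return counts.most_common()

# The user never stated any interest, yet a profile emerges from behaviour alone.
profile = infer_segments(
    ["running-shoes", "heart-rate-monitors", "baby-strollers", "running-shoes"]
)
```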
Such detailed and comprehensive profiles may or may not be accurate or fair. Increasingly, however, they are used to make or inform consequential decisions, from finance to policing to the news users are exposed to or the advertisements they see. These decisions can be taken with varying degrees of human intervention and automation.
In increasingly connected spaces, our presumed interests and identities also shape the world around us. Real-time personalisation gears information towards an individual’s presumed interests. Such automated decisions can even be based on someone’s predicted vulnerability to persuasion or their inferred purchasing power.
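The kind of real-time personalisation described above can be sketched as a simple re-ranking step. The items, interest weights and scoring rule here are invented for illustration and do not reflect any platform's actual algorithm.

```python
# Hypothetical sketch: reorder content by a user's presumed (inferred,
# not declared) interests, so two people see different worlds.
def personalise(items, presumed_interests):
    """Rank items so those matching inferred interest weights surface first."""
    def score(item):
        return presumed_interests.get(item["topic"], 0.0)
    return sorted(items, key=score, reverse=True)

items = [
    {"id": 1, "topic": "politics"},
    {"id": 2, "topic": "fitness"},
    {"id": 3, "topic": "finance"},
]
# Weights inferred from prior behaviour, never stated by the user.
ranked = personalise(items, {"fitness": 0.9, "finance": 0.4})
```

The same mechanism could just as easily weight items by a person's predicted vulnerability to persuasion, which is what makes such ranking consequential rather than merely convenient.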
Automated decisions about individuals, or about the environment they are exposed to, offer unprecedented capabilities to nudge, modify or manipulate behaviour. They also run the risk of creating novel forms of discrimination or unfairness. Since these systems are often highly complex, proprietary and opaque, it can be difficult for people to know where they stand or how to seek redress.