Big data intelligence company builds advertising audiences from mobile tracking at Iowa caucuses

Caucuses, which are used in some US states as a method of voting in presidential primaries, rely on voters indicating their support for a particular candidate by travelling to the caucus location. In a 2016 Marketplace radio interview, Tom Phillips, the CEO of Dstillery, a big data intelligence company, said that his company had collected mobile device IDs at the caucus locations of both political parties during the 2016 Iowa caucuses. Dstillery paired caucus-goers with their online footprints via ad networks, which bid in real time for the right to show individuals ads based on the data their phones send to the sites and apps they use. This data includes location (when that permission is granted) and, often, an identifying code the network has used to build a profile. On caucus night, Dstillery flagged more than 16,000 of these ad auctions taking place at caucus locations, captured the mobile device IDs involved, looked up the associated user characteristics, and combined them with the caucus results to draw conclusions about the kinds of people who voted for each candidate. Approximately 350,000 Iowans caucused in 2016. Phillips calls the profiles Dstillery builds "crafted audiences", and says the data is anonymous and not personally identifiable; the caucus effort was an experiment, but building audiences of this kind is the company's main business.
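To make the mechanism concrete, the sketch below shows a simplified bid request of the kind an ad exchange broadcasts to bidders during a real-time auction, carrying a device's advertising ID and reported location. The field names follow the public OpenRTB 2.x specification commonly used for such auctions; the values, coordinates, and the helper function are invented for illustration and are not taken from the article or from Dstillery's systems.

```python
# Illustrative sketch only: a simplified, OpenRTB-2.x-style bid request of the
# kind an ad exchange sends to bidders. Field names follow the public OpenRTB
# spec; all values below are hypothetical.
import json

bid_request = {
    "id": "auction-0001",                      # auction ID assigned by the exchange
    "app": {"bundle": "com.example.newsapp"},  # app the user is currently using (hypothetical)
    "device": {
        "ifa": "3f2a9c1e-0000-0000-0000-000000000000",  # resettable mobile advertising ID
        "geo": {"lat": 41.5868, "lon": -93.6250},       # device location, if permission granted
    },
}

def is_near_caucus_site(geo, sites, radius_deg=0.01):
    """Hypothetical helper: crude bounding-box check for whether the reported
    location falls near a known caucus site."""
    return any(abs(geo["lat"] - lat) < radius_deg and abs(geo["lon"] - lon) < radius_deg
               for lat, lon in sites)

caucus_sites = [(41.5868, -93.6250)]  # invented coordinates for a single precinct
geo = bid_request["device"]["geo"]
if is_near_caucus_site(geo, caucus_sites):
    # A bidder could record the advertising ID seen at the site and later join
    # it with the behavioural profile already keyed to that ID, as described above.
    print(json.dumps({"flagged_device": bid_request["device"]["ifa"]}))
```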

https://splinternews.com/how-this-company-tracked-16-000-iowa-caucus-goers-via-t-1793854687

Writers: Kai Ryssdal, Kashmir Hill
Publications: Marketplace, Splinternews
Publication dates: 2016-02-10, 2016-02-12

What is Privacy International calling for?

People must know

People must be able to know what data is being generated by the devices, networks and platforms they use, and by the infrastructure within which those devices become embedded. People should be able to know and ultimately determine the manner of processing.

Data should be protected

Data should be protected from access by persons who are not the user.

Limit data analysis by design

As nearly every human interaction now generates some form of data, systems should be designed to limit the invasiveness of data analysis by all parties in the transaction and networking.

Identities under our control

Individuals must be able to selectively disclose their identity, generate new identities and pseudonyms, and/or remain anonymous.

We should know all our data and profiles

Individuals need to have full insight into their profiles. This includes full access to derived, inferred and predicted data about them.