Samaritans test online app to identify those in need of help

In 2014, the UK suicide-prevention charity the Samaritans launched Radar, a Twitter-based service intended to leverage the social graph to identify people showing signs of suicidal intent on social media and alert their friends so they could reach out and offer help. The app was quickly taken offline after widespread criticism and an online petition calling for it to be withdrawn. Among the complaints were the high error rate, the app's intrusiveness, and the Samaritans' response, which was to suggest that people could opt out by taking their tweets private or joining the organisation's whitelist. In 2015, the charity followed up with a workshop to consider what, if any, action it could take to help those who needed it. The general recommendation that emerged from the discussion was that the charity should help indirectly, by ensuring that those wishing to support distressed friends and family knew where to look for support and expertise.

Read more: http://www.pelicancrossing.net/netwars/2014/11/private_fears_in_public_places.html
http://www.pelicancrossing.net/netwars/2015/06/indirect_line.html

What is Privacy International calling for?

People must know

People must be able to know what data is being generated by their devices, by the networks and platforms they use, and by the infrastructure within which those devices become embedded. They should also be able to know, and ultimately determine, how that data is processed.

Limit data analysis by design

As nearly every human interaction now generates some form of data, systems should be designed to limit the invasiveness of data analysis by all parties to the transaction and to the underlying networking.

Control over intelligence

Individuals should have control over the data generated about their activities, conduct, devices, and interactions, and should be able to determine who is gaining this intelligence and how it is used.

Identities under our control

Individuals must be able to selectively disclose their identity, generate new identities and pseudonyms, and/or remain anonymous.

We should know all our data and profiles

Individuals need full insight into the profiles held about them, including full access to derived, inferred, and predicted data about them.

We may challenge consequential decisions

Individuals should be able to know about, understand, question, and challenge consequential decisions that are made about them and their environment. This means that data controllers, too, must have insight into and control over this processing; otherwise they cannot explain or account for the decisions it produces.