
Tackling the ethical challenges of big data

An authority on social data, Susan Etlinger argues we need to apply critical thinking and exercise caution as we enter the age of “data ubiquity”


The coming tech disruptions and revolutions have at times been predicted to fix all manner of societal and environmental ills, so at first glance Susan Etlinger’s warning to exercise caution and restraint can seem odd. But while data may be propelling advances in many fields, caution and critical engagement with the information we collect are essential if we are to avoid error and protect our privacy, Etlinger argues.

“At this point in our history, as we’ve heard many times over, we can process exabytes of data at lightning speed, and we have the potential to make bad decisions far more quickly, efficiently, and with far greater impact than we did in the past,” she said in a 2014 TED talk.

Applying these ideas to the world of business, Etlinger advises companies on using big data and is a member of the Big Boulder Initiative, an industry organisation which promotes the successful and ethical use of social data.

We asked her about how cities and companies can safely and effectively handle information as we enter the age of data ubiquity.


Why is critical thinking so important to handling big data?
Here we are with more data than we know what to do with. And human beings have this tendency to give a lot of respect to technology. It’s funny: if you look at charts and graphs, if you look at studies that come out, people tend to trust them quite a bit. What’s interesting is that underneath that chart or graph might be terrible data, and it actually might be showing something that’s untrue. Or it might not account for something important.


As we near the age of data ubiquity, is our critical thinking keeping pace with the information we collect and process?
It depends on the kind of data. Take something like weather prediction, which has got so good: the set of data you need and the possible outcomes are more or less constrained.

Then when you get to things we call human data - so human expression, text, speech, audio, any of that - interpreting meaning, and then even translating, and then interpreting meaning again, you get into some real challenges in terms of understanding what people actually mean.

For example, on Twitter you could see something like “Oh great, I dropped my phone and broke it,” and most natural language processing technologies will classify that as a positive statement. Things like sarcasm, veiled language used by groups that are politically active under administrations that frown upon that, or even simple things like the language teenagers use, which changes all the time, can all be missed.
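To make the point concrete, here is a deliberately naive lexicon-based sentiment scorer (a hypothetical toy for illustration, not any real NLP system) showing how a pure word-counting approach reads that sarcastic tweet as praise:

# A toy lexicon-based sentiment scorer, for illustration only. Real NLP
# systems are far more sophisticated, but purely lexical approaches share
# this failure mode: no model of sarcasm or context.

POSITIVE = {"great", "love", "happy", "fantastic"}
NEGATIVE = {"terrible", "hate", "awful", "sad"}

def naive_sentiment(text):
    """Classify text by counting lexicon hits; punctuation is stripped."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(naive_sentiment("Oh great, I dropped my phone and broke it."))
# Prints "positive": "great" matches the lexicon, while "dropped" and "broke"
# do not, and nothing in the model captures the sarcastic tone.

Even if “broke” were added to the negative list, the score would only fall to neutral, still missing the complaint; catching the sarcasm requires context beyond individual words.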


Could you summarise some of the applications and ethical concerns regarding image recognition (the ability of computers to “read” and interpret images accurately) and emotion detection technology (which aims to analyse and interpret a person’s mood through speech patterns, facial movements or other cues)?
Some of the potential uses are really interesting. You could imagine that if you had a photographic record of a city over time, you could understand a little bit about population patterns: where people live at different times, where people work, what commute patterns look like. You could understand sentiment: whether people seem happier or sadder or more worried than they were before. You could look at things like interests, sports or purchasing patterns, what people eat, anything. The question is: given that you can do that, should you do that?

I think there are ways in which this technology can help us understand our history better, and ways in which it can help us understand others better.

[Regarding emotion detection technology] there are tremendous applications for social good (...) Delivering food and medicine to people who are difficult to reach, all the way to stuff for the elderly, stuff for the disabled. But at the same time those same technologies can be used for mass surveillance, for other political purposes. They can be used for scary reasons too.

It’s a lot of power we potentially have at our fingertips now. So I’m arguing that we need to take a breath. Not stop it, because innovation will happen no matter what we do, but really think about the ways we incorporate it into our businesses and into our society too.


You’ve discussed many cases of well-intentioned big data initiatives leading to unforeseen breaches of privacy, such as the charity Samaritans’ controversial social media monitoring “suicide watch” app. How can we guard against these events?
Well, this is difficult. When you think about Artificial Intelligence, one of the hallmarks of AI is that it’s really difficult to understand how an algorithm works. An algorithm is sort of like a recipe in the sense that it tells you what ingredients to use and in what proportions to get your outcome, and the outcome could be a fantastic cake or something completely inedible depending on what you do. And people don’t want to share [their algorithms] because of competitive advantage.

One thing I really like about computer vision, about image recognition in particular, is that you see the images come back. The beautiful thing about data science is that you can then go back and work on your algorithm, on your data model, to better reflect the world we want to live in rather than the world we actually live in.

I like looking at images because they show you right away. They show you right away when you google the phrase “three black teenagers” versus the phrase “three white teenagers”, and you see images of three white teenagers having fun at picnics and images of three black teenagers being booked into police departments.

The data that we have encodes the biases we have. And I really think that, as painful as it is, that gives us an opportunity to stop and be better.


By Sam Edwards

13 December 2016

Photo credits: Markus Spiske
