Can we combat disadvantage by steering personal data flow?

In an increasingly complex society, a holistic approach to future data flows is a necessity.


Written by: Chris Mesiku
12 Oct 2022



Global data creation is projected to surpass 180 zettabytes by 2025, up from 64 zettabytes in 2020. In an increasingly complex society, a holistic approach to future data flows, one that considers people, technology and environment, is a necessity.

Back in 2016, as my screen hung while I was filling out the Census online, I felt a sense of unease, a feeling that was cemented when news first broke of hacking attacks. What would happen to my data? Who had access to it now? And what could I do to mitigate potential harm?

Concern is warranted

Throughout history we’ve seen how data can be exploited in disturbing ways. Learning how punch card technology was used in the horrors of the Holocaust – in camp operation and record keeping – led me down a path of questioning how we avoid technology systems being used for unintended purposes. A path I’m still on today. I wanted to know if there were efforts to support the persecuted and vulnerable through data collection.

The first poverty maps of London, created by Charles Booth, are an example of this. Booth invested 17 years of his life in the project, which stemmed from a concern for the plight of the urban poor. Through his work, Booth gave us an insight into the minutiae of these people's lives; online archives show each street colour-coded to represent the income and social class of its inhabitants.

Forty years later, Booth's maps determined where the London Metropolitan Police would place police phone boxes, providing a direct line to headquarters for both the public and patrolling police to report a crime or fire, or to summon an ambulance. Booth likely didn't consider that his maps would be used this way. I wonder what the placement of these phone boxes signalled to the local population. Did it make them feel safer, or more at risk?

A recent example of the unintended use of sensitive personal data is the UK Home Office's program to deport homeless EU nationals. Information that charities had collected to help identify and protect rough sleepers was instead used against the homeless. The program was found to be illegal, but 18 months later, in 2019, the Home Office launched an eerily similar program.

In Australia in 2021, the government ruled that American facial recognition company Clearview AI had breached Australians' privacy. The company was found to have collected sensitive information without consent and without notifying individuals, among many other breaches. This is a reminder of how vulnerable we are.

Concerns about how much of our personal data is collected are widespread and warranted.

Restricting data flow

Many countries, including Australia, have historically resorted to narrow, territorial approaches to protect personal data by restricting its flow. But is this effective?

Removing the holistic view inhibits the collective tracking of rich datasets that could reveal patterns across jurisdictions and complex systems; patterns that could benefit the people who need them most.

Steps in the right direction are happening in some places. Australian government agencies are increasing the availability of complex datasets for researchers and social good initiatives, via programs such as the Multi-Agency Data Integration Project (MADIP), a partnership across health, education, and social services departments, to name a few.

The hope is to control and protect data while allowing government programs and the research sector to successfully use these datasets.

This flowing data approach enables new kinds of research, policy proposals and targeted interventions. But we need more government agencies taking part, we need more researchers to be aware of the potential, we need more people to understand how this can help them.

If we are to achieve this, we need data infrastructure – not just accessible data, but everything that supports it, from governance, policies and procedures to education and evaluation – that serves all those in the flow, from the people whose data is collected all the way through to those who use the insights generated to make better policies and programs. We need everyone involved to understand that this is not just about data; it is about systems that are social, environmental and technological. In our work we are exploring what this will entail, and how we can support safe, sustainable and responsible use of data.

Using data for good

An example of a purposefully designed platform is Solid – an open web platform created by the inventor of the World Wide Web, Sir Tim Berners-Lee. It provides each user with a Personal Online Data Store (Pod), so individuals retain full sovereignty over their personal data, granting and revoking consent for it to flow through the government, business, education and services sectors. Key to this is the ability for individuals to see all the systems using their data at any one time.

These approaches allow for the visibility of data relationships within systems and between systems. When people have visibility of their information and understand how it is being used, they are also more likely to engage in data collection activities.
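To make the idea concrete, here is a minimal sketch in Python of what such a consent-centred data store might look like. This is not Solid's actual API; the class and method names below are hypothetical, illustrating only the pattern described above: the owner grants and revokes consent, systems can read only what they currently have consent for, and the owner can always see which systems are using their data.

```python
from datetime import datetime, timezone


class PersonalDataPod:
    """Illustrative model of a personal online data store (hypothetical, not Solid's API)."""

    def __init__(self, owner: str):
        self.owner = owner
        self.data: dict[str, dict] = {}          # data category -> records
        self.consents: dict[str, set[str]] = {}  # system -> categories it may read
        self.audit_log: list[tuple[str, str, str, str]] = []

    def grant(self, system: str, category: str) -> None:
        """Owner allows a named system to read one category of their data."""
        self.consents.setdefault(system, set()).add(category)
        self._log("grant", system, category)

    def revoke(self, system: str, category: str) -> None:
        """Owner withdraws consent; the system loses access immediately."""
        self.consents.get(system, set()).discard(category)
        self._log("revoke", system, category)

    def read(self, system: str, category: str) -> dict:
        """A system may read only categories it currently holds consent for."""
        if category not in self.consents.get(system, set()):
            raise PermissionError(f"{system} has no consent for '{category}'")
        self._log("read", system, category)
        return self.data.get(category, {})

    def systems_using_my_data(self) -> dict[str, set[str]]:
        """The owner can see, at any time, every system holding consent and what it can access."""
        return {s: cats for s, cats in self.consents.items() if cats}

    def _log(self, action: str, system: str, category: str) -> None:
        self.audit_log.append(
            (datetime.now(timezone.utc).isoformat(), action, system, category)
        )


# Example: grant a (hypothetical) health service access to an address, then revoke it.
pod = PersonalDataPod(owner="chris")
pod.data["address"] = {"street": "Example St", "city": "Canberra"}
pod.grant("health-service", "address")
print(pod.systems_using_my_data())   # {'health-service': {'address'}}
pod.revoke("health-service", "address")
print(pod.systems_using_my_data())   # {}
```

The point of the sketch is the shape of the relationship, not the implementation: consent is held and revoked by the person, access is checked at every read, and the record of who can see what is visible to the data's owner rather than buried inside the systems consuming it.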

Solid Pods arose from telling better stories about our preferred data futures. Without imagining such stories, we will continue to experience inconsistent attitudes about privacy and the economic value attached to data.

The culprit is not data but how we manage it, the approach we take, and the questions we ask along the way. A cybernetics approach is akin to donning our night-vision goggles and although the view may not be perfect, it allows us to find our way, leading us in the right direction.

An alternative approach

The ANU School of Cybernetics is looking at how cybernetic approaches might help data users, collectors and custodians anticipate potentially harmful future data flows within dynamic contexts, and develop strategies to maximise beneficial uses.

With its origins in the Greek word for 'steering' or 'navigating', cybernetics is concerned with the design and ongoing maintenance of systems. Just as a ship's captain considers many factors when steering towards a destination, so too does cybernetics. A set-and-forget approach to data governance and use won't cut it.

I will be speaking about cybernetics and why data privacy continues to be such a polarising topic at a Spark Festival event in Sydney on 25 October, hosted at the Paul Ramsay Foundation’s Yirranma Place. Intended for startup founders and managers, the workshop explores why being able to design and use data products that anticipate and respond to change at any stage of deployment is an essential part of being a reflexive, responsible startup founder and manager.

