Straight-shooting Australian enforcer wants to put privacy on the map

Australia’s new privacy commissioner talks to MLex

In an exclusive interview, Carly Kind discusses AI ethics, her goal to raise awareness of privacy issues with the public, and leading the Office of the Australian Information Commissioner towards more assertive enforcement of still-evolving privacy laws.


26 September 2024
By Ryan Cropp

On stage at a recent event, Australia’s new privacy commissioner, Carly Kind, fielded a cheeky question from the crowd. If she could have anything she wanted, the audience member asked, what would it be?

“An unlimited enforcement budget,” Kind shot back — half joking, half not.

The response set tongues wagging among Australian privacy lawyers, most of whom aren’t used to hearing regulators talk with such frankness.

But then, perhaps Kind isn’t your typical regulator.

Speaking with MLex from her office in Sydney this week, Australia’s top privacy official said a major part of her role, as she sees it, is to “put privacy on the agenda.”

“[I want] to reassure the Australian community that the privacy regulator is on the job, and that there’s somebody here fighting in their corner,” she says.

Six months into a five-year term as a member of an all-new leadership team at the Office of the Australian Information Commissioner, or OAIC, Kind is showing herself to be an able fighter — cultivating a strong media presence, talking tough on breaches and pushing hard for changes to Australia’s aging privacy laws.

Shifting the country’s somewhat ambivalent attitude towards privacy, though, remains the key challenge — and one that largely eluded her predecessors.

At certain moments, such as in the immediate aftermath of major data breaches at Singtel Optus and Medibank Private in late 2022, Australians have appeared to put a high price on the protection of their personal information. In those moments, too, lawmakers have often rushed to push through new rules and tougher penalties.

But the rest of the time, it seems that Australians couldn’t really care less. The government’s recent decision to abandon the vast majority of its proposed changes to the country’s 1988 Privacy Act is a case in point. It saw little political advantage in pushing them through, and many potential headaches.

In Kind’s view, though, this might be the wrong way of framing it.

“It's about agency,” she says. “People don't [check] rubbish tick boxes or agree to terms and conditions or accept cookies because they don't care. I think they do it because they don't feel like they have any agency and that their single or collective opinion doesn't matter.”

“I try not to fall into the trap of saying that people's actions are reflective of how they feel about an issue,” she says.

It’s an optimistic view, and one informed by Kind’s extensive experience working in human rights law. Among the 27 current and former jobs listed on her LinkedIn profile are roles touching on refugee advocacy, indigenous issues, artificial intelligence ethics and state surveillance.

But with a vast chunk of that expansive CV having been compiled outside Australia’s borders, to many locals she remains a largely unknown quantity.

So, who is Carly Kind? And, more importantly, how does she feel about privacy regulation?

Kill ‘em with Kindness

When news of Kind’s appointment landed in Australian inboxes back in February, the main feedback from lawyers and privacy wonks was how out-of-the-box it was.

While her predecessors in the role have mostly spent their careers working inside the Australian public service, with Kind the government has broken the mold, opting for a young and outspoken human rights advocate — and one who seems to particularly understand the Internet and the various digital technologies it has spawned.

“Carly’s not someone who has in any way hidden her passion for privacy,” says Anna Johnston, the founder of local consultancy firm Salinger Privacy. “She brings a global perspective and a particular grounding in technology.”

“When I heard about her appointment, I was both surprised and thrilled,” Johnston says.

Kind arrived at the OAIC from the Ada Lovelace Institute, a London-based think tank that she helped establish in 2018 as a means of influencing policy discussions relating to data and artificial-intelligence technologies.

Those who’ve worked alongside her in her various roles say she takes an evidence-based, tech-forward approach to issues and has an uncanny ability to get to the heart of the matter.

“Carly’s not a Luddite,” says Andrew Strait, an associate director at Ada Lovelace. “The approach that she brought to Ada — which is still very much part of our DNA — is this question of: ‘How can we make technology work for people in society?’ It is capable of wonderful things, but it requires understanding.”

“She's someone who's so good at asking the right questions. She can cut through the maze of marketing material around technology and get at the core questions,” Strait told MLex.

For an example, look no further than Kind’s response to the suggestion — recently floated publicly by the government’s Treasury Department — that strengthening Australia’s privacy protections ran the risk of stifling the development of pro-competitive and innovative technologies in data-intensive industries.

“We can have our cake and eat it too,” she says. “We can get the cool tech that we want, and also have our privacy respected.”

“It's clear now that not only is holding personal data a risk — including if you're subject to a data breach — but also that there are technological ways we can achieve the same end without having to use that data.”

Having said that, Kind proposes one caveat: artificial intelligence, a field in which she thinks the vast gulf between hype and reality can encourage some to be blasé about genuine privacy risks.

“We're moving wholesale to the idea of AI being a kind of social good [that is] necessary to advance innovation and growth,” she says. “[But] all the people who are talking about the technological feasibility of AI are the same ones who have a vested interest in it being supported.”

“I think the objective should be interesting technology that helps us do good things as humanity, and that also protects rights.”

‘The power to change’

Given Kind’s family background, her ability to get to the heart of complex matters shouldn’t come as a surprise.

Born Carly Nyst, she was raised on Queensland’s Gold Coast, where her father was a criminal defense lawyer with a list of colorful clients. Kind says she grew up with an instinctive understanding of the law and its connection to social concerns.

Part of that instinct, she says, was passed down by her grandfather, a Dutchman who fought in the French resistance as a teenager before fleeing to Australia, where he too eventually became a lawyer, appearing in the country’s top court.

“I very much grew up in a legal world and, in particular, one that really saw the intersections [between] the legal system and social-justice issues,” she says.

After completing her own law degree at the University of Queensland in the early 2000s, Kind landed a job at a small firm run by activist lawyer Andrew Boe, where she worked on a number of indigenous matters — including a high-profile coronial inquest into the death in custody of an Aboriginal man on Queensland’s Palm Island.

From there, her next move was to Switzerland to work as a lawyer at the United Nations, a decision she says was motivated by the limitations of Australia’s legal system.

“I really wanted to work in human rights law, right from the outset, in part due to that experience working for Andrew [Boe]. I felt that the Australian legal system was quite restrictive … there was no human rights act,” she says.

But how did she end up in privacy? Kind says it’s hard to discount the impact that the fallout from the 9/11 World Trade Center attacks had on human rights and, by extension, individual privacy.

“All of my intellectual adulthood at those really formative moments was against the backdrop of this expanding state-power regime, and particularly the use of surveillance to pursue the war on terror,” she says. “I became interested in privacy through that lens.”

In 2014, Kind took a job at Privacy International, a London-based advocacy organization focused on digital human-rights issues. While there, she was involved in bringing the first case against the UK intelligence services for a contravention of the country’s surveillance laws — a case that ultimately led to an overhaul of the entire UK surveillance framework.

Kind realized that it was this type of “strategic litigation,” as she describes it, that could really move the needle on privacy issues. “It really showed me the power of law to change the way rights are protected in that context.”

From there she moved into consulting, expanding her CV with a variety of short-term privacy-related roles. One involved working with Google on a project dealing with state-sponsored online trolling; another was a gig at the European Commission promoting EU-style privacy laws in places like India and Brazil.

According to one European official who spoke to MLex, Kind remains very well regarded in EU policy circles, not least because she’s able to bring a global perspective to privacy issues. “It’s very good to have someone [at the OAIC] with international experience and with a global view,” the official said.

Similarly, Kind’s work at Ada Lovelace, which has covered issues such as algorithmic risk and AI policy, has had quite an impact in UK policy debates — so much so that in June 2023, the British government awarded her the title of Member of the Order of the British Empire for “services to data and artificial-intelligence ethics.”

Kind says that like many Australians, she has “complicated feelings about the British empire,” but ultimately accepted the title as an acknowledgement of Ada Lovelace’s work.

Pushing the boundaries

Given her experience on this larger stage, why would Kind want to move her young family halfway round the world to be a regulator?

“Working in the AI space, it became clear to me that regulation is a key lever that you pull to effect change,” she says. “I'd been on the side of asking for more regulation, for better regulation, for better enforcement of existing regulation, but I hadn't been on the side of trying to make that happen.”

However, when Kind began her stint as the OAIC’s privacy commissioner in February, Australian businesses were in a kind of privacy-law limbo, with the government notionally committed to a root-and-branch overhaul of the country’s cobwebbed privacy laws, but coy on both the details and the timing.

Earlier this month, the so-called first tranche of that update finally appeared, and while in the scheme of things it looked underwhelming, it did include some crucial changes to the watchdog’s enforcement powers.

Among them is a new tiered-penalty system, effectively creating two additional types of penalties for mid-level breaches that currently don’t meet the threshold for civil proceedings. Kind says it will be a “significant” change.

“It enables us to give a relatively significant slap on the wrist to entities who aren't meeting the most basic of Privacy Act requirements, without having to elevate that to lengthy proceedings,” she says.

While advocating for law changes in recent months, Kind has spoken of her desire for the OAIC to become a more enforcement-focused regulator, akin to its antitrust counterpart, the Australian Competition & Consumer Commission.

“Our interest is in pursuing enforcement action where we can change market practices, where our intervention is actually going to reshape products, reshape services, reshape platforms,” Kind says.

“We will put our money where our mouth is, and if there is non-compliance we’ll be prepared to take enforcement action … in the most important of cases,” she says.

That, at least, is the goal. The perennial problem for the OAIC, Kind readily admits, is that funding for such enforcement litigation has not historically been easy to come by. And it’s this, she says, that she was referring to when she joked to lawyers about unlimited enforcement budgets.

“The way you advance the application of the Privacy Act, particularly in the digital realm, is through test cases,” Kind says. “It’s through litigating and seeing what the courts say and then going back again and taking different routes in and choosing novel legal arguments and trying to apply them.”

“Where we are hampered is in our discretionary ability to initiate investigations and take a strategic approach to shaping privacy through the tools that we have, including through investigation and litigation,” she says.

“That is an expensive process and is hard to justify in constrained-resource environments — to take cases that you know have reasonable grounds of succeeding but may not be a sure thing,” she says.

The new oil

Kind’s interest in enforcement, and the limitations of the OAIC’s firepower, reveal what she acknowledges is a long-held preoccupation with power in all its forms.

“I talk about power all the time,” Kind jokes. Yet the message to policymakers is a serious one: neither Kind nor the OAIC can take on the digital giants on their own. “The power at play here is so big that we can't put it on individual shoulders,” she says.

Indeed, a lifetime in human rights activism has continually reaffirmed the obvious point that those with an interest in opposing regulation have both the motivation and the resources to take on governments — a fact not limited to the digital sphere.

“I think that the privacy and, more broadly, the digital-rights movement is akin to the climate rights movement in a lot of different ways,” Kind says. “It may not be as existential an issue, but it's as systemic an issue as climate.”

And that’s the nub of it: Asking individuals to deal with such monumental problems on their own misunderstands the scale of the problem at hand, Kind argues. We’re constantly told that data is the new oil, yet no one seriously pretends that one person limiting their car usage will, on its own, solve the planet’s environmental problems.

The solutions, she suggests, require more wholesale change — and that starts at the top.

“We can put it on an individual not to use plastic straws, and that will, in some tiny way, improve the situation,” she says.

“But until we go after the fossil fuel producers of the digital economy, then we're not going to actually change the entire system.”

Additional reporting by Matthew Newman in Brussels

