For those of us with an interest in spy and thriller movies where people are tracked down in near real time using facial recognition technology (FRT), Australia’s Privacy Commissioner brought that issue ‘home’ in a landmark determination against the much-loved Aussie home and hardware store, Bunnings Group. According to the Office of the Australian Information Commissioner (OAIC), Bunnings had not obtained meaningful consent to use FRT to compare hundreds of thousands of customers’ facial images against images of individuals who had been observed stealing or who had been violent or threatening towards staff.
Why was this problematic? Facial images are biometric information and are considered ‘sensitive information’ under the Privacy Act 1988 (Cth). The collection, handling and use of sensitive information generally requires individuals’ express consent, subject to certain exemptions.
Following the Privacy Commissioner’s decision, Bunnings published its rationale for using FRT in a video statement (link here), reiterating that the intent was only ever to keep staff and customers safe. Plus, virtually all States and Territories, as well as the Commonwealth, have initiated public awareness campaigns relating to customer aggression, outlining employers’ workplace health and safety obligations to protect staff. On face value, Bunnings was doing good, and for the benefit of many.
However, the Privacy Commissioner shared some important takeaways from the determination for other retailers using FRT (link here), stating: “In the Bunnings matter, hundreds of thousands of people likely had their personal information collected, without their knowledge. This kind of covert and indiscriminate surveillance undermines individuals’ control over their personal information and can have larger societal impacts.”
Going beyond ‘face’ value, what exactly are those larger societal impacts? Let’s try to unpack a few.
Potential for false positives, bias and discrimination
FRT is a fraught technology, and there have been many instances of it returning false positives. Not only can biometric data be inaccurate, but biases are also built into these systems, and those biases can produce inaccurate matches. The ‘real world’ consequences were on full display in the American city of Detroit, where a man, Robert Williams, was wrongly identified and arrested after police used FRT. What happened? The FRT algorithm simply got it wrong, and FRT tends to ‘get it wrong’ more often with people from ‘minority’ (non-Caucasian) backgrounds. Robert Williams is African American.
Let’s go back to the Australian retail scenario. Where Bunnings’ FRT ‘matched’ someone to a previously identified thief or violent customer, a few outcomes could follow: the police might be called, or the individual could be escorted out of the store without having displayed any violent or threatening conduct at that point in time. That’s on top of the potential stigmatisation of being wrongly ‘matched’ in the first place.
Downstream use of sensitive personal data
While the issues in the Bunnings case are complex, my understanding is that many of them would be solved by displaying a clear sign outlining the use of FRT and how it safeguards staff from customer aggression and violence. In that context, most customers would likely accept the use of FRT for that specific purpose. However, that does not address the issue of ‘downstream’ uses of FRT data. When I was travelling in Japan, I was offered the option of taking part in a ‘sleep study’ that recorded me as I slept in my hotel room. The data was presented back to me in a neat report with diagrams, which arguably benefits me and other hotel guests. It can provide helpful key indicators for early detection of sleep apnoea and avoids the need to pay an exorbitant amount for an equivalent test in a hospital.
However, delving into the terms and conditions of the hotel’s ‘sleep study’, it was clear that the researchers intended to sell my data to big pharmaceutical companies. Drawing from the Bunnings scenario, those terms and conditions are the equivalent of the OAIC's recommended ‘clear sign’ at the store entrance informing customers of the use of FRT. In my own case, I knew what the sleep researchers planned to do with my personal data. I accepted their ‘T&Cs’ and I consented to the various uses to which my de-identified data would be put. At the same time, I also had some trepidation. Would the data really be de-identified? Would some clever ICT expert be able to crack the encryption? Or would the researchers just go and sell the data, hoping that I’d never find out?
By no means am I suggesting that Bunnings used FRT-derived data for any purpose other than ensuring the safety of customers and staff. However well-intentioned a business is, the use of high-privacy-risk technologies cannot be justified simply on the basis that they are available, convenient and helpful – at least not until customers are given assurances that their sensitive, biometric data will only be used for clearly agreed purposes. Whether sleeping in a hotel room or entering a local retail shop, the infringement on personal autonomy and the increase in surveillance stand as key privacy concerns.
Conclusion
The Bunnings decision emphasises the importance of seeking consent from individuals, particularly when dealing with sensitive information that could be used for harmful purposes. Like what? We can’t know every harm in advance, and that’s precisely the point; but my starting point would be identity theft and the misuse of biometric information to profile, or even arrest, the wrong person. The reason I find the Bunnings case so interesting is that it raises a host of questions that go far beyond privacy law. How do businesses go a step beyond legal compliance, and how do they make careful, ethical decisions about technology that build trust with consumers and the broader community?
Synergy Law provides expert advice covering technical as well as regulatory issues in the Privacy space, including undertaking privacy impact assessments, managing and responding to data breaches and ensuring compliance with the Australian Privacy Principles. We help our clients go beyond compliance and consider both the legal and ethical requirements.