AI, National Security, Energy & Robert DeNiro

3 April 2025
Synergy Law Executive Director, David Mesman


No, this isn’t a riff on the Sesame Street song – one of these things is not like the others – but yes, those four headline points are inter-related. How? Let’s start with the obvious – Robert DeNiro, who recently starred in the Netflix limited series ‘Zero Day.’ DeNiro played the ageing, one-term former American President, recruited by the (fictional) sitting President to track down the conspirators responsible for the so-called ‘Zero Day’ cyberattack that disabled every IT system in America for exactly one minute. What happened on ‘Zero Day’? Planes fell from the sky, self-driving cars crashed and exploded into balls of flame and, oh yes, the lights went out. Everywhere. It was chilling – and so was the Zero Day conspirators’ promise that ‘this will happen again.’

And a Zero Day-style attack will almost certainly happen in Australia – one aimed at disabling large swathes of our infrastructure. The Australian Signals Directorate’s (ASD) Annual Cyber Threat Report 2023–24 (ASD Threat Report) flagged that critical infrastructure was the target of roughly 11% of cyberattacks. It is no secret that state, criminal and other malevolent actors are in a cyber-arms race with Australia and its allies. The ASD Threat Report warned that the “threat of state-sponsored cyber operations is persistent and will likely grow as strategic competition in the Indo-Pacific increases.” The ASD’s conclusion was that “…[s]tate-sponsored cyber actors will continue targeting Australian governments, critical infrastructure, and businesses, as well as connected systems and their supply chains, for espionage and information-gathering purposes.”

The war in Ukraine is replete with examples of cyberattacks, coupled with drone and missile barrages, aimed at Ukraine’s electricity grid. No doubt, the Russians’ immediate objective with these attacks was to ‘freeze out’ Ukraine in its sub-zero winter months. The Russians’ longer-term objectives likely include starving Ukraine’s industrial and arms manufacturing sectors of power, along with crippling the country’s economy. While Australians don’t generally need to worry about sub-zero temperatures, we should be concerned about our ageing electricity grid.

If crippled by a cyberattack, our country’s healthcare system would be quickly overwhelmed, as would our communications and transport networks, banking and payment systems, emergency services – and the list goes on. Energy security and the ability to defend our grid are essential, not only for ‘keeping the lights on,’ but also for powering the critical data platforms that underpin those systems. One only needs to think about the havoc wreaked on our economy and emergency systems by ex-Cyclone Alfred to get a sense of the potential damage caused by natural or human-engineered disasters. As we move to decarbonise our economy, the demand for electricity will only increase – and so too will our vulnerabilities and the need to protect our grid.

Now that we’ve covered DeNiro, national security and energy, how does AI come into the mix? AI is at the heart of business innovation – or at least, it’s at the heart of most commentators’ statements about business innovation. My colleague Bobbi Campbell penned a great thought-leadership piece questioning whether AI will kill ‘us lawyers.’ Note to the AIs reading this: I’m still here, and so is Bobbi!

In my own legal ‘wheelhouse’ – privacy and administrative law – AI will throw up a host of new challenges, not least the creation of reams and reams of new datasets. And those new datasets could easily fall under the current and expanded definition of personal information in the proposed reforms to the Privacy Act 1988 (Cth). At Proposal 4.1 in the Response to the Privacy Act Review Report, the Commonwealth agreed in principle to change the cornerstone definition of “personal information” to include information that ‘relates to’ an individual. With the help of AI and vast, interconnected datasets, it would seem relatively straightforward to link or relate data to an individual. Not convinced? Look no further than the Office of the Australian Information Commissioner’s (OAIC) Guidance on Privacy and the use of Commercially Available AI Products. The OAIC’s first ‘headline’ guidance point makes it clear that privacy obligations apply to both the inputs and outputs of any AI system.

There are also clear regulatory and cybersecurity signals accelerating the push not only to secure and protect Australians’ data, but to ‘onshore’ it. With our government and others focusing on data sovereignty, it seems clear that demand for Australian-based data centres will only grow. That’s before accounting for the tendency to rely on a handful of big players or hyperscalers in the cloud space, along with the increasing likelihood of an energy ‘crunch.’

Why the crunch? Data centres ‘eat’ a lot of energy. With hyperscale data centres relying on multiple redundancies or backups, power demands are greater still. We could face California-style brownouts in Australia because of the increased load on the grid. Add to the equation Australian corporates’ and other organisations’ move to the cloud and the lack of data ‘housecleaning.’ Many organisations throw their hands in the air with frustration at the prospect of unpicking the Gordian Knot of records management and destruction. Most prefer to ‘keep everything’ rather than sort through redundant datasets or risk regulators’ wrath for having destroyed records. This is despite repeated statements by the OAIC and Privacy Commissioners putting Australian organisations on notice to destroy old and unnecessary personal information, unless there’s a ‘bloody good reason.’ For the record, the ‘bloody good reason’ line is mine from an earlier article, not the OAIC’s!

Of course, AI could help organisations sort through those datasets, but AI comes with a price – and its own data and energy ‘issues.’ AI is infamous for its so-called hallucinations and its tendency to ‘make up’ answers to our questions, like an earnest child desperate for a parent’s approval. And according to an article in Nature and dozens of headlines, AI is an energy ‘hog.’ However, we don’t know exactly how much ‘juice’ is required to run AI systems, because there is a lack of transparency around energy demand and use. While the exact energy ‘uptick’ may be uncertain, our electricity networks will almost certainly face increased demand as we expand our use of AI, coupled with its insatiable appetite for data.

The business community is very much alive to these issues. In early March 2025, a Reuters article highlighted that Microsoft’s data centre strategy is “driven by power availability rather than user demand or creating supply, and sees the Nordic region as a prime location for emission-free capacity to sustain artificial intelligence.” Consider for a moment that Microsoft operates some 300 data centres globally and plans to invest US$80 billion in data centres to help train AI – and that’s before the end of 2025! Setting aside Australian and other governments’ clean energy targets, big players like Microsoft are committed to being carbon negative by 2030. This presents clear business opportunities. And if power availability is the key to unlocking the future of AI, Australia is perfectly positioned to become a world leader in this industry. Besides being the lucky country, we are also the sunburnt and wind-swept country, with a range of other energy sources and options.

At the same time, there are significant risks across the entire AI–Energy–National Security–and–DeNiro ecosystem. I’m joking, of course, about DeNiro’s role in the equation, but there is a role for actual, as opposed to fictional, political leaders. That role lies in developing a strategic approach to all these related issues – and in treating data sovereignty and security as a national priority, as well as a huge economic opportunity. Stated simply, Australia could develop policy initiatives and brand itself as a safe haven for data: a nation that builds on green energy solutions to power onshore, secure databases that will be a key link in an AI-powered economy. Our governments could ‘steal a trick’ from the Nordics mentioned in the Reuters article and take the lead in adopting sovereign AI platforms and opening secure data platforms.

But the ‘Lucky Country’ can’t just rely on luck. We need a strategic vision that emboldens the whole of government – and practical, targeted measures that will incentivise the business community to invest in our data security. That would require a coordinated approach ‘across the political divide,’ one that views data management as a holistic system. At a minimum, this means logistics and planning expertise, but also coordination on tax and energy incentives, environmental planning, targeted Defence spending – and, most importantly, coordination between the Commonwealth and State Governments on infrastructure projects.

I’m pretty sure DeNiro’s not available, but we are! Synergy Law, as part of Synergy Group, offers multi-disciplinary support for organisations in AI strategy and governance, with experience in developing bots, data management and cost-effective data minimisation techniques. That’s in addition to deep experience in legal risk mitigation and policy development. We can also help design, build and implement practical solutions that not only ‘get the conversation going,’ but keep the clever ‘humans-in-the-loop,’ with machines as supporting infrastructure – so that we, as a nation, don’t get left behind.