Kinetic cyberattacks damage equipment: What network security misses puts us at risk
On Aug. 11, 2025, I had an opportunity to brief congressional staffers of the U.S. House Committee on Homeland Security on issues raised by the committee's July 22 hearing, "Fully Operational Stuxnet 15 Years Later & the Evolution of Cyber Threats to Critical Infrastructure."
The issues I discussed concerned kinetic cyberattacks (cyberattacks that damage equipment). It was evident from those discussions, and from other public discussions, that control system cybersecurity continues to be widely misunderstood. When cybersecurity is discussed, the focus has been on data cybersecurity, particularly ransomware attacks, to the exclusion of control system cyber incidents. Other than the well-known but often misunderstood Stuxnet attack, cyberattacks with kinetic effects are rarely addressed. For critical infrastructures, network security is necessary but not sufficient.
The first premise, often missed, is that equipment damage occurs when physics is compromised, not when networks are compromised. For critical infrastructures, there are two aspects of cybersecurity: data cybersecurity, involving IT and OT networks, and functional cybersecurity, involving hardware and processes. Most discussions about cybersecurity focus on data security. This was true at the July 2025 House hearing on Stuxnet, which did not address the engineering aspects of Stuxnet. Kinetic cyberattacks have yet to be explicitly addressed in any sector's cybersecurity guidance, including electric, oil/gas, maritime, food and agriculture, etc.
Consider two known kinetic cyberattacks: Aurora and Stuxnet. Both attack physics and hardware, not networks, and attacks of this kind cannot be detected by network security. Both remain threats to many critical infrastructures; they should be understood as attack techniques, not isolated incidents. What Aurora, Stuxnet and other kinetic cyberattacks such as Triton have in common is that they are designed specifically to cause equipment damage, not to exploit cyber vulnerabilities.
Consequently, kinetic cyberattacks are issues that require detailed engineering understanding, not issues that can be addressed by familiar network cybersecurity measures such as network threat hunting, OT network monitoring, multi-factor authentication, etc. The July hearing clearly demonstrated that appropriate workforce development for control system cybersecurity is needed.
The Aurora attack: Why network monitoring didn't work
What is Aurora? Aurora is the reclosing of protective relays out-of-phase with the grid, so the voltage sine waves on the equipment side and the grid side are not synchronized. The lack of synchronization creates damaging mechanical and electrical forces on the alternating current (AC) equipment connected to the relay. The out-of-phase condition can be induced either manually or remotely (cyber). There is no malware involved. Aurora uses the protection of the electric grid, arguably the most critical of all infrastructures, as its attack vector. That is, Aurora is a gap in the protection of the electric grid.
The Aurora attack was conceived in 2006. The Idaho National Laboratory (INL) wanted to demonstrate to industry that cyberattacks could cause equipment damage equivalent to physical attacks. INL was interested in doing so because existing cyberattack demonstrations were not getting decision-makers’ attention. Inducing an out-of-phase condition in AC electrical equipment was identified as a possibility for demonstrating this type of attack because out-of-phase incidents were known to cause significant damage.
Accidental out-of-phase incidents had attracted attention before INL undertook its demonstration. In the 1970s, the IEEE Rotating Machinery Subcommittee formed a working group that prepared a report on generator out-of-phase protection. Out-of-phase protection attracted further attention with the 2003 Northeast Blackout.
In 2010, the North American Electric Reliability Corporation (NERC) System Protection and Control Subcommittee produced a technical reference document, "Power Plant and Transmission System Protection Coordination," which provided guidance on setting the out-of-phase relay. The IEEE efforts were based on out-of-phase conditions being unintentional events; the IEEE subcommittee was unaware of the 2007 Aurora test that maliciously caused out-of-phase conditions. Interest in the out-of-phase condition increased when a whitepaper on the vulnerability was presented in 2006 at a conference partially sponsored by the Chinese.
The Aurora vulnerability used remote access to reclose protective relays out-of-phase with the grid, thereby causing AC equipment to operate in unstable conditions. The unstable out-of-phase conditions generated large torques, current spikes and harmonics that increased equipment heating. The large torques can damage AC induction motors and generators, the current spikes can damage transformers and the increased heat can cause fires in lithium-ion battery energy storage systems. The hardware damage can make the grid and AC equipment in other industries and facilities unavailable for nine to 18 months or longer, because of the sheer difficulty of repairing the ensuing hardware damage and the long lead times for obtaining replacement equipment. Equipment damage can occur with any AC equipment connected to the affected protective relays, whether that equipment belongs to the utilities or to the utilities' customers. The greater the phase-angle difference between the equipment and the system, the greater the damage.
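To make the phase-angle relationship concrete, the minimal Python sketch below (illustrative per-unit values, not data from the INL test) computes the voltage that appears across a breaker at the instant of an out-of-phase reclose. For equal voltage magnitudes on both sides, that voltage is 2V·|sin(δ/2)|, so a 120-degree reclose subjects the connected equipment to roughly 1.7 times nominal voltage, and a 180-degree reclose to twice nominal.

```python
import math

V_NOMINAL_PU = 1.0  # per-unit system voltage; illustrative value

def reclose_voltage_pu(delta_deg: float) -> float:
    """Per-unit voltage across a breaker reclosed with a phase-angle
    difference of delta_deg between the equipment and the grid.
    For equal magnitudes: |dV| = 2 * V * |sin(delta / 2)|."""
    return 2.0 * V_NOMINAL_PU * abs(math.sin(math.radians(delta_deg) / 2.0))

# An in-phase reclose (0 degrees) produces no transient; the farther
# out of phase, the larger the torque- and current-driving voltage.
for delta in (0, 30, 60, 90, 120, 180):
    print(f"{delta:3d} degrees out of phase -> {reclose_voltage_pu(delta):.2f} pu")
```

Note that nothing in this mechanism involves malware or malformed network traffic, which is why the attack is invisible to network monitoring.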
In 2006, INL validated the physics of Aurora when it succeeded in damaging a small AC motor by reclosing the motor out-of-phase. The March 2007 INL test was a full-scale test that successfully demonstrated that out-of-phase conditions could damage a large diesel generator without any malware!
The test met the intent of the Aurora demonstration program: destruction of equipment by bits and bytes, not by dynamite. Specifically, 14 of 16 engine cylinders were damaged and the engine-to-generator coupling was destroyed.
Aurora threats are generally not addressed by operational technology (OT) security because Aurora is a physics/hardware problem, not a network issue. The effects of an out-of-phase condition are widely known, however, including to Russia, China and Iran. What's more, in 2015 the Department of Homeland Security (DHS) declassified more than 800 pages on the INL Aurora program.
Aurora is not hypothetical. There have been several domestic and international Aurora events that have damaged critical equipment. Examples include damage to chiller motors in a U.S. data center caused by relay reclosing by the local utility (the lack of forensics precludes knowing whether the reclosing was malicious or unintentional, or even whether it was done by the utility). Another Aurora incident involved the destruction of an overseas power plant turbine due to a coupling failure similar to the failure of the INL generator; in this case, the power plant was in Iran. I published this case in December 2020 because of the rarity of a catastrophic coupling failure. And, finally, the December 2016 Russian cyberattack on the Ukrainian power grid attempted to reclose breakers to cause equipment damage: a deliberate attempt to induce an Aurora condition to cause a long-term outage.
Network monitoring would not have identified the Aurora attack or the operational status of the equipment following an Aurora attack.
The Stuxnet attack: Slipping under the radar
What is Stuxnet? Stuxnet was a series of cyber-physical attacks that caused physical damage to targeted nuclear centrifuge systems at selected intervals without being identified as cyber-related. Stuxnet required detailed knowledge of the equipment and processes. Network issues were secondary.
As described in Ralph Langner's "To Kill a Centrifuge," there were two different Stuxnet attacks. In each case, Stuxnet changed controller logic to cause increasing long-term, non-catastrophic damage in a way that wouldn't be identified as cyber-related. One attack changed the centrifuge rotation speeds to damage the centrifuge rotors. The other attack used spoofed process sensor input to compromise the pressure controllers and over-pressure the centrifuge tubes while disabling overpressure protection, yet without causing catastrophic damage. The spoofing of process sensor data was critical to the success of the attacks. Process sensor monitoring at the physics layer would have detected the compromised sensor data being provided to the controllers and the operator displays. Network sensor monitoring would not have identified either the attack or the status of the equipment.
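To illustrate why the spoofing defeated network monitoring, here is a minimal, hypothetical Python sketch of the record-and-replay concept Langner describes: the controller and operator displays receive previously recorded "normal" readings while the real process deviates. This is a conceptual illustration, not Stuxnet's actual code; the class and values are invented for the example.

```python
from collections import deque

class ReplaySpoofedSensor:
    """Conceptual sketch of Stuxnet-style sensor spoofing: record a window
    of normal readings, then replay them to the controller and operator
    displays while the real process deviates. Hypothetical illustration."""

    def __init__(self, window: int = 64):
        self.recorded = deque(maxlen=window)
        self.replaying = False
        self._idx = 0

    def read(self, true_value: float) -> float:
        if not self.replaying:
            # Recording phase: pass real data through while memorizing it.
            self.recorded.append(true_value)
            return true_value
        # Replay phase: the displayed value no longer tracks the process.
        value = self.recorded[self._idx % len(self.recorded)]
        self._idx += 1
        return value

sensor = ReplaySpoofedSensor()
for pressure in (1.00, 1.01, 0.99, 1.00):   # normal operation is recorded
    sensor.read(pressure)
sensor.replaying = True
for pressure in (2.5, 3.0, 3.5):            # real pressure climbs toward damage
    print(f"true={pressure:.2f}  displayed={sensor.read(pressure):.2f}")
```

Every replayed value is a well-formed, plausible reading, so the network packets look perfectly normal; only a physics-layer comparison against an independent measurement of the actual process would reveal the discrepancy.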
Elements used in Stuxnet are not unique to Siemens or centrifuges. Altering controller logic without the operator being aware was a generic vulnerability affecting multiple controller suppliers. The vulnerability, called "Boreas," dates to 2008.
INL gave a presentation at the 2008 Siemens International User Group meeting describing how the control system logic of Siemens PLCs could be altered without the operator being aware of the changes. The INL presentation applied to any industrial or manufacturing application (there was no mention of centrifuges in the presentation).
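One engineering-level countermeasure concept (my illustration, not something from the INL presentation) is to periodically compare the logic actually running in a controller against a trusted baseline captured at commissioning. A minimal sketch, assuming the logic image can be retrieved out-of-band:

```python
import hashlib

def logic_fingerprint(logic_image: bytes) -> str:
    """SHA-256 fingerprint of a controller logic image (e.g., the program
    blocks uploaded from the PLC)."""
    return hashlib.sha256(logic_image).hexdigest()

# Hypothetical illustration: a baseline is captured at commissioning and
# re-checked on a schedule, independent of the operator HMI.
baseline = logic_fingerprint(b"ORIGINAL_CONTROL_LOGIC")
running = logic_fingerprint(b"ALTERED_CONTROL_LOGIC")

if running != baseline:
    print("ALERT: running controller logic differs from the commissioned baseline")
```

The caveat is significant: Stuxnet hid its modified blocks from the engineering workstation, so such a check is only meaningful if it bypasses the potentially compromised toolchain.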
Prior to Stuxnet, it was assumed that cyberattacks would have clearly different characteristics from unintentional incidents, simple mistakes and accidents. However, Stuxnet demonstrated that cyberattacks could be made to look like equipment malfunctions, which is how it was able to compromise equipment for more than a year before being identified as a cyberattack. Thus, control system incidents may not be expeditiously identified as cyber-related, if at all. That lack of identification keeps cyber defenders out of investigations of control system incidents that haven't been recognized as cyber-related.
Boreas (Stuxnet-like) attacks, which change control system logic to cause physical impacts and then change the logic back to avoid detection, have not been limited to nation-states or military operations. Compromised control system logic can be used against industrial or manufacturing facilities equipped by a range of control system suppliers. In 2009, this was done for fraud. Volkswagen (VW) started selling turbocharged direct injection (TDI) diesel engines in 2009. In 2013, the International Council on Clean Transportation (ICCT) commissioned the West Virginia University Center for Alternative Fuels Engines and Emissions (WVU CAFEE) to test on-road emissions of diesel cars sold in the U.S. Researchers at WVU CAFEE, who conducted live road tests in California using a Japanese on-board emission testing system, detected excess NOx emissions from two of the three tested vehicles, both made by VW.
In May 2014, ICCT published WVU CAFEE's findings and reported them to the California Air Resources Board (CARB) and the Environmental Protection Agency (EPA). The VW emissions scandal became public in September 2015, when the EPA issued a notice of violation of the Clean Air Act to VW Group. The agency found that VW (and other diesel car and truck manufacturers) had contracted Bosch to develop specialized software to activate their emissions controls only during laboratory emissions testing, which caused the vehicles' NOx output to meet U.S. standards during regulatory testing. However, the vehicles emitted up to 40 times more NOx in real-world driving when the control systems were returned to normal operation. VW deployed this software in millions of cars worldwide in model years 2009 through 2015. (It is unclear whether Bosch attended the 2008 Siemens International User Group meeting.) Other diesel car and truck manufacturers did the same and were also fined. Like Stuxnet, the compromised systems were not identified until years later.
Specifically, the VW attack (a conceptual sketch in code follows the list):
- checked the operating conditions to see if the attack conditions were met, i.e., was the vehicle under test (similar to Stuxnet checking for appropriate process conditions before initiating attacks);
- changed the emissions controls on each individual vehicle to pass the emissions test (similar to Stuxnet changing controller logic for the attacks); and
- changed the emissions controls back to normal design conditions following completion of testing (similar to Stuxnet changing controller logic back to normal operating conditions).
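The three steps above can be captured in a few lines of hypothetical Python. Steering-wheel angle was reportedly one of several inputs the actual defeat device used to recognize a dynamometer test; the function names and threshold here are invented for illustration.

```python
def dyno_test_detected(steering_angle_deg: float, speed_kph: float) -> bool:
    """Hypothetical test-detection heuristic: on a dynamometer the drive
    wheels turn but the steering wheel barely moves. (Steering input was
    reportedly one of several signals the real defeat device used.)"""
    return speed_kph > 0 and abs(steering_angle_deg) < 1.0

def select_emissions_mode(steering_angle_deg: float, speed_kph: float) -> str:
    # Step 1: check whether the 'attack conditions' are met (vehicle under test).
    if dyno_test_detected(steering_angle_deg, speed_kph):
        # Step 2: switch to the compliant calibration to pass the test.
        return "full_emissions_controls"
    # Step 3: outside testing, revert to the normal (non-compliant) calibration.
    return "normal_operation"

print(select_emissions_mode(steering_angle_deg=0.2, speed_kph=50))   # under test
print(select_emissions_mode(steering_angle_deg=15.0, speed_kph=50))  # on the road
```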
There were no internet protocol (IP) networks involved, so network security approaches would not be relevant.
Process sensor issues in cybersecurity
At the 2014 S4 Conference, a presentation described previous work at INL to compromise process sensor data and hide the attack from experienced operators and automated alarms. The work was originally done to simulate an attack on a chemical plant system; centrifuge systems are chemical plant systems. There were no IP networks involved, as engineering was used to spoof the sensor data. The presenter also bemoaned the fact that little work was (and is) being done to address the integrity of the process measurements.
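As a hedged illustration of what addressing process measurement integrity could look like, the sketch below applies a simple mass-balance invariant to two correlated process measurements. The sensor names, values and tolerance are hypothetical; real implementations would use plant-specific physics.

```python
def consistent(flow_in_kg_s: float, flow_out_kg_s: float, tol: float = 0.05) -> bool:
    """Mass-balance check: in steady state, flow into and out of a vessel
    should agree within tolerance. A spoofed sensor that no longer tracks
    the physics will violate such invariants even when its network packets
    are perfectly well-formed."""
    return abs(flow_in_kg_s - flow_out_kg_s) <= tol * max(abs(flow_in_kg_s), 1e-9)

# Hypothetical readings: the inlet sensor is replaying 'normal' data
# while the real outlet flow has changed.
print(consistent(10.0, 10.2))  # True: within 5 percent, physically plausible
print(consistent(10.0, 14.0))  # False: violates mass balance, flag for engineers
```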
At the 2016 ICS Cybersecurity Conference, a Russian researcher from Moscow demonstrated the capability of hacking process sensor data. Russia knows.
Other critical cybersecurity issues
In November 2023, Iran cyberattacked Unitronics controllers in multiple U.S. critical infrastructures and may have compromised PLC logic, as Stuxnet did.
In 2024, I was part of a team performing a Department of Energy (DOE) Phase 1 Small Business Innovative Research (SBIR) project on mathematically predicting where a hacker would go next once inside the SCADA network. The literature search showed that the most recent work on this subject came from Iran.
The attack vectors as well as controller design features exploited by Stuxnet still exist.
Triton: A failed kinetic cyberattack
In 2017, Russian intelligence services attempted a kinetic cyberattack, subsequently called "Triton," against the Petro Rabigh petrochemical facility in Saudi Arabia. The intent was to blow up the plant. Triton was a two-part attack: hacking the control systems to cause the plant to operate in an unsafe condition, and then initiating the Triton malware to prevent the Triconex safety systems from safely shutting the plant down.
The safety system attack was unsuccessful because Triconex is a triple-redundant safety system whose fail-safe option is an automatic plant shutdown. The complexity of the software caused the Triton malware to shut the plant down twice before the malware was identified. The first shutdown was identified only as a malfunction; as with Stuxnet, network monitoring provided no cyber indications. As a result, the plant restarted with the Triton malware still in the engineer's Triconex workstation, until the plant shut down again two months later and the malware was finally identified. The lack of malware detection allowed the attackers to remain unimpeded in the system for those two additional months.
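Why the fail-safe design defeated the attack can be seen in a generic two-out-of-three (2oo3) voting sketch. This is a conceptual illustration of triple-redundant safety voting, not the Triconex implementation.

```python
def two_out_of_three(votes: list[bool]) -> str:
    """Generic 2oo3 safety vote: each of three redundant channels votes
    'trip' (True) or 'run' (False). A majority of trip votes drives the
    plant to its safe state, a shutdown. Conceptual sketch only."""
    assert len(votes) == 3
    return "SHUTDOWN" if sum(votes) >= 2 else "RUN"

print(two_out_of_three([False, False, False]))  # healthy: keep running
print(two_out_of_three([True, True, False]))    # majority trip: fail safe
```

Because the fail-safe state is a shutdown, an implant that destabilizes the safety controllers tends to reveal itself by tripping the plant, which is exactly what happened twice at Petro Rabigh.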
Latent kinetic cyberattacks
The Chinese have installed hardware backdoors in large Chinese-made electric transformers to take control of the transformers, resulting in the issuance of Presidential Executive Order 13920, "Securing the United States Bulk-Power System." There are almost 600 high-voltage Chinese-made transformers in the U.S. electric grid. When one of the Chinese-made transformers was sent to Sandia National Laboratories for extensive examination, the report of what was found was classified top secret. Physical damage could be done to the transformers and other equipment when voltages are maliciously changed or protective relays are compromised.
The Chinese have also installed inverters in battery energy storage systems with remote communications that could enable China to remotely access the protective relays to cause Aurora incidents or to overheat lithium-ion batteries.
This amounts to cyber battlespace preparation.
Gaps in government guidance on kinetic cyberattacks
Much of the guidance issued by the U.S. government misses the full import of kinetic cyberattacks. On Aug. 13, 2025, CISA and its partners issued "Foundations for OT Cybersecurity: Asset Inventory Guidance for Owners and Operators." The report provides guidance on identifying and prioritizing OT asset inventories.
However, the document does not address OT safety topics. Given that safety is a prime consideration for OT and has been specifically exploited by kinetic cyberattacks, this omission is puzzling. The OT devices Aurora and Stuxnet exploited were part of the identified inventory. According to the report, "a structured OT taxonomy enables better data analytics by providing a clear framework for organizing and analyzing data. This leads to valuable insights that can drive continuous improvement and innovation." However, that assumes the devices are uncompromised, authenticated and include cyber forensic capabilities, which is not the case.
There was no guidance in the CISA report to address how OT control system field devices could be exploited or protected, as these are not network devices. The issues exploited by Aurora and Stuxnet cannot be addressed by simply having an asset inventory and taxonomy. For kinetic cyberattacks, you don't know whether the sensor readings going to the transformer or turbine are coming from the OT devices or from Beijing.
You may have an inventory of OT devices, but you don’t know if the OT devices have been compromised.
Summary
Data cybersecurity is a known threat to both IT and critical infrastructure applications. However, the major threats to critical infrastructures are kinetic cyberattacks that can cause extensive long-term equipment damage, and we are not ready.
Kinetic cyberattacks have yet to be explicitly addressed in any sector's cybersecurity guidance, including electric, oil/gas, maritime, food and agriculture, etc. Network security organizations do not have the technical capabilities to address kinetic cyberattacks, which are engineering-based and don't compromise the integrity of the data packets, just the data in the packets. Without engineering participation, kinetic cyberattacks cannot be detected or mitigated. The July hearing clearly demonstrated that appropriate workforce development for control system cybersecurity is needed. The Aug. 13 CISA OT asset inventory guidance document doesn't address the issues exploited by Aurora, Stuxnet and the Chinese. That is, you may have an inventory of the OT devices, but you don't know if the OT devices have been compromised. Moreover, consider how much more widespread and extensive the damage could be if artificial intelligence were incorporated into kinetic cyberattacks.