Opentrons debuts simulation and visualization for AI-generated lab workflows
Opentrons Labworks Inc. on Mar. 16 launched Protocol Visualization for Opentrons Flex, a new simulation and visualization capability in its software. The feature lets users simulate and visually inspect robotic protocols in a dynamic, virtual environment before running them on a Flex system. Within the interface, users can observe pipette movements, liquid handling actions, labware positions, and module status throughout an automated workflow. For researchers developing automated experiments, previewing protocol execution may help identify potential issues before reagents and instrument time are committed.
The new simulation and visualization environment introduces an inspection layer between AI-generated experimental plans and robotic execution by allowing users to walk through each action of a protocol before it runs on a robot. Researchers can navigate workflows step-by-step, or move quickly across an entire timeline to examine how robotic actions will unfold. For organizations deploying AI-assisted experimentation, the ability to review robotic execution pathways is expected to improve oversight of automated experimental designs.
The new capability will be available through Opentrons App version 9.0, scheduled for release in April 2026. The feature is designed for use with the Opentrons Flex robotic platform and supports protocols authored with Opentrons software. For laboratories adopting AI-assisted experimentation, the update provides a new tool for reviewing automated workflows before they're executed on physical laboratory systems.
"AI can now design experiments and generate robotic protocols, but scientists still need to understand how those experiments will execute in the physical world," says James Atwood, CEO of Opentrons. "This capability gives researchers a dynamic way to simulate and inspect robotic execution before an experiment begins, creating a clearer bridge between computational design and physical laboratory workflows."
The new capability is supported for protocols authored across the Opentrons software ecosystem, including OpentronsAI, the Python Protocol API, and the Protocol Designer application. The visualization environment tracks pipette positioning, liquid volumes, tip usage, and labware interactions, while maintaining a continuous view of the Flex deck configuration. Users can inspect workflows containing thousands of actions, and observe changes in liquid levels at microliter scale. The system also includes a Slot Spotlight view that provides additional details for individual deck locations, allowing users to monitor well volumes and module conditions throughout a run. For laboratories developing complex automation workflows, this level of inspection may support faster debugging and protocol refinement.
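To give a sense of the kind of input such a visualizer consumes, here is a minimal sketch of a Flex protocol written against the Python Protocol API. The specific labware names, deck slots, pipette model, and API level below are illustrative assumptions, not details from the announcement; this is a protocol-file fragment, not runnable outside the Opentrons software stack.

```python
# Illustrative Opentrons Flex protocol sketch. Labware names, deck slots,
# pipette model, and API level are assumptions for the example.
from opentrons import protocol_api

metadata = {"protocolName": "Example transfer", "author": "Example Lab"}
requirements = {"robotType": "Flex", "apiLevel": "2.19"}

def run(protocol: protocol_api.ProtocolContext):
    # Place tips, a source reservoir, and a destination plate on the deck.
    tips = protocol.load_labware("opentrons_flex_96_tiprack_1000ul", "D1")
    reservoir = protocol.load_labware("nest_12_reservoir_15ml", "D2")
    plate = protocol.load_labware("corning_96_wellplate_360ul_flat", "D3")
    pipette = protocol.load_instrument("flex_1channel_1000", "left",
                                       tip_racks=[tips])

    # A single liquid-handling step the visualizer could step through,
    # showing tip pickup, aspiration, dispense, and tip drop.
    pipette.transfer(100, reservoir["A1"], plate["A1"])
```

A file along these lines is what a researcher would open in the app to review pipette paths and liquid movements step by step before sending it to a robot.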
By design, the visualization function operates directly in the Opentrons App and needs only a protocol file to run. Users can review workflows offline without connecting to a robot, enabling protocol development and troubleshooting while automation systems are running other experiments. This capability lets users iterate on experimental workflows without interrupting active laboratory operations. For teams managing shared robotic infrastructure, offline inspection may support faster development of automated protocols.
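To make concrete why an offline dry run can catch problems before any reagent is committed, consider this small, self-contained sketch. It is not Opentrons code; the step format and names are invented for illustration. It replays a list of transfer steps against tracked well volumes and flags an over-aspiration before anything would reach hardware:

```python
def check_protocol(volumes, steps):
    """Dry-run (source, dest, microliters) transfer steps against starting
    well volumes; collect issues instead of executing on a robot."""
    volumes = dict(volumes)  # copy, so the dry run has no side effects
    issues = []
    for i, (src, dst, ul) in enumerate(steps, start=1):
        if volumes.get(src, 0) < ul:
            issues.append(f"step {i}: {src} holds {volumes.get(src, 0)} uL, "
                          f"cannot aspirate {ul} uL")
            continue  # skip the impossible step, keep checking the rest
        volumes[src] -= ul
        volumes[dst] = volumes.get(dst, 0) + ul
    return volumes, issues

# The second transfer over-draws the reservoir; the dry run reports it.
start = {"reservoir/A1": 150.0, "plate/A1": 0.0, "plate/B1": 0.0}
steps = [("reservoir/A1", "plate/A1", 100.0),
         ("reservoir/A1", "plate/B1", 100.0)]
final, issues = check_protocol(start, steps)
print(issues)  # one issue: the reservoir runs dry before step 2
```

A real visualizer tracks far more state (tips, labware positions, module status), but the principle is the same: simulate the bookkeeping first, and surface errors while they are still cheap.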
"Our focus is building the execution layer that connects AI-generated experimental plans to real laboratory experiments," adds Atwood. "As AI systems propose more experiments, researchers need infrastructure that makes those experiments understandable, inspectable, and repeatable before they reach the bench."


