Upon further research I found OpenHTF (https://github.com/google/openhtf), which is open source. And if you don't want to spin up your own dashboard, there's TofuPilot (https://www.tofupilot.com/), but that part is paid and not affiliated with Google. Note that TofuPilot maintains the documentation site for OpenHTF (https://www.openhtf.org/)
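For anyone evaluating it, a minimal OpenHTF test looks roughly like the sketch below (from memory, so double-check against the current docs; the instrument reading is stubbed with a constant here):

    import openhtf as htf
    from openhtf.util import units

    # One phase = one test step; the decorator declares a validated measurement.
    @htf.measures(
        htf.Measurement('rail_voltage').in_range(4.75, 5.25).with_units(units.VOLT))
    def measure_5v_rail(test):
        # Stubbed reading; in practice this would come from your DMM driver.
        test.measurements.rail_voltage = 5.02

    if __name__ == '__main__':
        # test_start returns the DUT identifier; swap in a barcode-scan prompt as needed.
        htf.Test(measure_5v_rail).execute(test_start=lambda: 'DUT-001')

Each phase passes or fails based on its declared measurements, and you can attach output callbacks to dump results to JSON or a dashboard.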
The best off-the-shelf option for this is OpenTAP (https://opentap.io/). It may take a while to wrap your head around it, but it handles a lot of the backend plumbing for you.
From what I understand, the industry generally uses NI TestStand, ATEasy, or some in-house software probably written in VB6 20+ years ago. There aren't a ton of great software options out there for this, unfortunately.
I was afraid this was going to be the answer :( We are currently using Python scripts, and I thought there had to be a better way, but I couldn't find one.
Hi - Julien here, co-founder of TofuPilot. Thanks for the mention.
You’re right that the open-source hardware test ecosystem is still pretty sparse. Today, OpenHTF and OpenTAP are still the most commonly cited Python-centric frameworks, but both show their age when it comes to orchestration, concurrency, and modern DX.
We recently released TofuPilot Framework (https://github.com/tofupilot/framework) as an open-source (MIT), hardware-agnostic test orchestration framework that’s explicitly designed for hardware testing in lab and production environments.
TofuPilot Framework is a Rust-based test orchestrator that executes Python test steps (with more languages planned based on feedback). It’s meant as a “spiritual successor” to OpenHTF, focused on orchestration, control-flow, and observability rather than locking you into a vendor ecosystem.
What it covers relative to your requirements:
1. Parallel execution & precise control flow: setup/teardown phases, conditional steps, retries, concurrent steps, and multi-device orchestration (PSUs, DMMs, DAQs...). Basically, you bring your own Python drivers and the framework manages their object lifecycle separately from your test execution; see the generic sketch after this list.
2. Hardware-agnostic connectors: driver layers are explicitly separated so instruments can be swapped without rewriting orchestration logic
3. Run history & data: local run data management with automatic sync to the TofuPilot dashboard for logs, metrics, trends, and querying historical runs
4. Slack & integrations: we’re about to ship TofuPilot Workflows, a dashboard module that lets you define event-driven flows reacting to test results (Slack/Discord notifications, MES/ERP sync, etc.)
We ship it as desktop apps:
TofuPilot Studio: developer environment for writing and debugging tests (UI editor, step debugging, dry runs, plugin dev)
TofuPilot Station: locked-down production app for lab/production PCs; stations auto-update from a connected test repo via the dashboard
On the business side: we're an independent team based in Switzerland, coming from robotics and industrial test backgrounds. The framework itself is free and MIT-licensed. Our revenue comes from the hosted dashboard, which is free up to a certain data volume and then roughly $50 per user or station per month. Most current revenue is from self-hosted enterprise deployments.
Genuinely curious: for teams like yours, is pricing a blocker for adoption? We’d really like to make TofuPilot attractive for smaller teams and are very open to feedback.
Happy to answer any technical questions or go deeper on architecture trade-offs.