Your test bench has a memory now.
The same memory layer for engineers driving the bench, engineers reading from it, and engineers shipping software around it. Every run, revision, calibration event, and note — embedded, indexed, and queryable.
tribal → structured → shared · live · autonomous · always-on
Wired into your team’s data, whichever shape it takes.
The platform is built for three flavors of engineering team. The indexer doesn’t care which one you are.
R&D and T&M teams running sequences on real instruments through Galois. Every step, register write, and bitstream load lands on the memory bus the moment it happens.
Manufacturing, QA, reliability, and data teams who consume the bench output without driving the instruments themselves. Galois ingests the runs, audits, and drift series and answers questions across them.
Platform, firmware, and tools engineers whose data lives in CI, services, tickets, and docs. The same indexer, the same context bundles — the memory layer doesn't care where the row came from.
The questions you used to walk down the hall to ask.
Every answer is cited to the actual run, note, schematic, or commit. No hallucinated tribal knowledge.
Grounded answer with a citation back to the design-review thread that called for it.
Audit-trail RAG across cal certs, run logs, and the instrument profile — not a folder of PDFs.
Autonomous compliance check across the run log; the answer is a yes/no with the timestamps.
Semantic search plus statistical synthesis in one query — drift bands and outliers cited inline.
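To make "drift bands and outliers" concrete: this is not the Galois API, just an illustrative sketch of the statistic behind a drift band — a mean ± k·sigma envelope computed over a historical run window, with any reading outside the envelope flagged as an outlier. All names here are hypothetical.

```typescript
// Illustrative only — not Galois internals. A drift band is a mean ± k·sigma
// envelope over a window of historical measurements.
type Reading = { run: string; value: number };
type Band = { lo: number; hi: number };

function driftBand(history: number[], k = 3): Band {
  const mean = history.reduce((s, v) => s + v, 0) / history.length;
  const variance =
    history.reduce((s, v) => s + (v - mean) ** 2, 0) / history.length;
  const sigma = Math.sqrt(variance);
  return { lo: mean - k * sigma, hi: mean + k * sigma };
}

// Flag readings that fall outside the band.
function outliers(readings: Reading[], band: Band): Reading[] {
  return readings.filter((r) => r.value < band.lo || r.value > band.hi);
}
```

A query like the one above would pair this kind of synthesis with semantic retrieval, citing the flagged runs inline.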
The closer
When the engineer who wrote the VI leaves, Galois still knows what they did.
No one writes up the handoff. No one updates the wiki. The memory layer was already running while they worked.
What’s running while you work.
Eight services ship today. Each one watches a slice of bench activity and writes its findings back into the memory the answers are drawn from.
| Watches | Produces | Cadence |
| --- | --- | --- |
| SCPI command logs, instrument register writes, FPGA bitstream loads | Classified events on the memory bus | Continuous, streaming |
| Annotations, snapshot pins, sequence locks, run completions, document uploads | Vector index across 14 memory collections | Sub-second after every write |
| Vector index entries, recent activity windows | Distilled summaries embedded back into memory | On-schedule, per-team quota |
| Recompile triggers from annotations, pins, sequence locks, run completions | Versioned, regime-aware context bundles | On-event, 60-second fingerprint dedupe |
| All memory writes, project regime setting | Pool labels on rows, audit-log entries, run-page flags | Continuous, per write |
| Calibration records, sequence definitions, run results, regulatory profiles | Alerts on the run page, items in the engineer inbox | On-event plus daily sweep |
| Test-run results, historical run windows | Drift alerts, trend annotations on the run page | On-event, every new run |
| User chat message, latest project context bundle | Cited answer streamed to the chat panel | On-query |
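One of the cadences above is "on-event, 60-second fingerprint dedupe." As an illustration only (not the shipped implementation), a fingerprint dedupe can be as small as a map from content fingerprint to last-seen timestamp:

```typescript
// Illustrative sketch, not Galois internals: suppress events whose
// fingerprint was already seen inside a 60-second window.
const WINDOW_MS = 60_000;
const lastSeen = new Map<string, number>();

function shouldProcess(fingerprint: string, now: number): boolean {
  const prev = lastSeen.get(fingerprint);
  // Duplicate inside the window: skip, and keep the original timestamp
  // so a steady stream of duplicates can't extend the window forever.
  if (prev !== undefined && now - prev < WINDOW_MS) return false;
  lastSeen.set(fingerprint, now);
  return true;
}
```

The effect is that a burst of identical recompile triggers collapses into a single context-bundle rebuild per minute.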
The eight services aren’t the whole story.
You know your flows better than we do. The workplace intelligence SDK opens the same memory bus the built-in services run on — so your team can plug in custom ingest, custom processing, and custom emit without forking the platform.
```typescript
import { defineSource } from "@galois/wi";

defineSource({
  id: "linear_tickets",
  watch: linear.events("issue.updated"),
  produce: (issue) => ({
    text: `${issue.title}\n${issue.body}`,
    meta: { regime: "rd" },
  }),
});
```

Index every Linear ticket as it moves through review.
```typescript
import { defineService } from "@galois/wi";

defineService({
  id: "fmea_synthesizer",
  subscribesTo: ["run.completed", "annotation.created"],
  async handle(events, ctx) {
    const row = await ctx.llm.synthesize(events);
    ctx.memory.write("fmea", row);
  },
});
```

Auto-build an FMEA row whenever a run + note pair lands.
```typescript
import { defineSurface } from "@galois/wi";

defineSurface({
  id: "drift_to_pagerduty",
  subscribesTo: ["drift_analyzer.alert"],
  async handle(alert) {
    pagerduty.trigger({
      summary: `${alert.dut} drift on ${alert.rail}`,
    });
  },
});
```

Page the on-call when drift_analyzer raises a flag.
The SDK is in design with our first paying customers. If you have a flow you want plugged in, we want to hear about it before we ship it.
See it on your bench.
Connect Galois to your instruments and the eight services start indexing before the first sequence finishes. No buttons to press, no documents to upload.