AI cannot run safely if your context is scattered.
Policies in drives. Owners in chat. Support paths in people's heads. OpenData turns that context into AI operating memory: the shared layer Forge reads from, Factory builds with, and Pulse writes back to every day.
Automation without context is a fancy way to make the wrong thing fast.
Every customer comes to us with context scattered across drives, wikis, ticket comments, and people's heads. Policies are stale. Owners are implicit. Support paths live in chat history. AI tools amplify everything, including the confusion.
So we open every deployment with a knowledge inventory. We don't replace your existing systems; we rationalize them, link them, and make them queryable by the same layer that sets the course, builds safely, and runs the workflows your teams create with AI.
One AI operating memory. Four audiences served.
Each layer is tuned to a different reader. The same policy, owner, or support fact gets presented at the altitude each role needs.
Same fact, four readers. Marketing for the buyer · Customer for the user · Product for the team · Architecture for the engineer.
What happens in week one.
We crawl your existing documentation systems (Notion, Confluence, Google Docs, GitHub, internal wikis) and inventory what exists, where it lives, and how stale it is.
Each fact is mapped to owners, policies, systems, support paths, and the audience layer that needs it.
Operating context gets wired into Forge, Factory, and Pulse. When a workflow needs a policy, owner, or support path, it pulls from the right layer.
From here on, launched workflows update the memory. Stale entries get flagged. Documentation stops being a project and becomes part of the operating layer.
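The layering described above can be pictured as a simple lookup: one fact, one owner, four presentations. This is an illustrative sketch only; the fact, field names, and `lookup` function are hypothetical, not the OpenData API.

```python
from dataclasses import dataclass

@dataclass
class Fact:
    key: str      # e.g. "refund-policy"
    owner: str    # the accountable person or team
    layers: dict  # audience layer -> how that audience sees the same fact

# One shared fact, presented at four altitudes (all values invented for illustration).
memory = {
    "refund-policy": Fact(
        key="refund-policy",
        owner="billing-team",
        layers={
            "marketing":    "30-day money-back guarantee.",
            "customer":     "Request a refund within 30 days from Settings > Billing.",
            "product":      "Refund window is 30 days; exceptions route to billing-team.",
            "architecture": "Refund service validates order age <= 30 days before enqueueing.",
        },
    ),
}

def lookup(key: str, audience: str) -> str:
    """Return the fact at the altitude the requesting role needs."""
    return memory[key].layers[audience]

print(lookup("refund-policy", "customer"))
```

The point of the sketch: a workflow never stores its own copy of the policy; it asks the shared layer, so an update in one place reaches all four readers.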
Forge keeps the memory current.
Policy updates, support notes, owner changes, and runbook edits are drafted in context. The AI suggests; the human approves. Every artifact lands in the right layer with proper provenance.
when there's an alert, page the on-call
Page the on-call engineer via PagerDuty (not Slack, alerts shouldn't compete with chat traffic). Include: alert title, runbook URL, severity, and the most recent metric snapshot.
if they don't ack in 10 min, escalate
Escalate to the secondary on-call after 10 minutes without acknowledgement. If no ack at 20 minutes, escalate to the engineering manager and open an incident channel #inc-YYYYMMDD.
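The escalation ladder drafted above reduces to a small piece of logic. The 10- and 20-minute thresholds and the #inc-YYYYMMDD channel format come from the example; the function names and return values are hypothetical.

```python
from datetime import date

def escalation_target(minutes_without_ack: int) -> str:
    """Who should be paged, given minutes elapsed without acknowledgement."""
    if minutes_without_ack < 10:
        return "primary-oncall"
    if minutes_without_ack < 20:
        return "secondary-oncall"
    # Past 20 minutes: engineering manager, plus an incident channel.
    return "engineering-manager"

def incident_channel(today: date) -> str:
    """Incident channel name in the #inc-YYYYMMDD format from the runbook."""
    return f"#inc-{today.strftime('%Y%m%d')}"
```

Encoding the ladder this way is what lets Pulse flag drift later: if the runbook text and the running workflow disagree on a threshold, one of them is stale.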
For the engineers building on top.
Pulse data is queryable via REST on Professional and Enterprise. Forge definitions and Factory handoffs are webhook-driven and Model Context Protocol-compatible.
REST API
Read access to outputs, owners, policies, support paths, telemetry, and value records. JSON, JWT auth, fair rate limits.
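A read call looks like any JWT-authenticated JSON request. A minimal sketch, assuming a base URL and resource paths that mirror the list above; the actual endpoints are whatever your account documentation specifies, not these.

```python
import urllib.request

BASE_URL = "https://api.example.com/v1"  # placeholder, not the real host

def build_request(resource: str, token: str) -> urllib.request.Request:
    """Build an authenticated read request for a resource such as
    'owners' or 'policies' (paths assumed for illustration)."""
    return urllib.request.Request(
        f"{BASE_URL}/{resource}",
        headers={
            "Authorization": f"Bearer {token}",  # JWT bearer auth
            "Accept": "application/json",        # responses are JSON
        },
    )

req = build_request("owners", "your-jwt-here")
```

On a 429 response, back off and retry; read access is rate-limited per plan.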
Webhooks
Push notifications on output changes, detected anomalies, and connector health changes. Verified signatures, retry semantics, replay.
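Verifying a webhook signature typically means an HMAC over the raw body, compared in constant time. This sketch assumes HMAC-SHA256 with a hex digest; the page only promises verified signatures, so treat the exact scheme and header name as assumptions to confirm against your webhook settings.

```python
import hashlib
import hmac

def verify_signature(secret: bytes, body: bytes, signature_hex: str) -> bool:
    """Return True if signature_hex matches HMAC-SHA256(secret, body).

    compare_digest avoids timing side channels; never use ==
    for signature comparison.
    """
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_hex)
```

Because deliveries are retried and replayable, handlers should also be idempotent: record a delivery ID and skip duplicates.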
Custom Connectors
Build via Model Context Protocol, same standard our AI-native integrations use. Templates, harness, validation included.
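At its core, a Model Context Protocol connector answers JSON-RPC requests such as tools/list, advertising what it can do. A bare-bones sketch of that handler shape, with the field names following the public MCP specification; the `lookup_owner` tool and its schema are invented for illustration, and a real connector would use the templates and harness mentioned above.

```python
def handle_tools_list(request: dict) -> dict:
    """Answer an MCP 'tools/list' JSON-RPC request with one example tool.

    The tool (lookup_owner) is hypothetical; a real connector would
    advertise whatever capabilities it actually implements.
    """
    return {
        "jsonrpc": "2.0",
        "id": request["id"],
        "result": {
            "tools": [
                {
                    "name": "lookup_owner",
                    "description": "Return the accountable owner for a named system.",
                    "inputSchema": {
                        "type": "object",
                        "properties": {"system": {"type": "string"}},
                        "required": ["system"],
                    },
                }
            ]
        },
    }
```

Because Forge, Factory, and Pulse speak the same protocol as the built-in integrations, a custom connector plugs in with no special casing.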
Want an AI operating-memory inventory?
We'll inventory your existing documentation systems and come back with a layered map: what to keep, what to connect, what to retire, and which AI workflows need it first. Free for qualified pilots.