Our Ontology Manager is where the most critical step of your analytics journey begins. Before you can build a single trusted dashboard or ML model, you must first create a clean, reliable map of your business. This is that tool.
Here, you will transform raw, physical data tables into a meaningful "digital twin" of your enterprise. You will create the reusable Objects (e.g., `Customer`, `Product`) and define the Relationships between them that become the single source of truth for the entire Arkham platform.
The central part of the screen is the Ontology Graph, a canvas that visually represents your Objects and the relationships between them.
The graph's nodes are your Objects (e.g., `Pilot`, `Aircraft`, `Flight`), the core entities of your business, and its edges are the Relationships between them (e.g., an `Aircraft` can have many `Flights`). You can click and drag to pan the canvas and use the mouse wheel to zoom in and out, making it easy to navigate even very large and complex ontologies.
When you select an Object in the graph (like the `Aircraft` Object in the example), the Details Panel appears at the bottom. This is where you can inspect and manage the selected Object.
The `Details` tab provides a comprehensive overview of the Object's configuration, including its properties (e.g., `Year Built`) and a description. These become the trusted, reusable dimensions and features for downstream analytics and ML models.

The `Preview` tab allows you to see the live data for the first 100 instances of the selected Object, directly from the Lakehouse. This is invaluable for quickly validating that your Object is mapped correctly and that the underlying data is what you expect, before it gets used in a critical report.
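Conceptually, the Preview tab behaves like a row-limited query against the Object's backing table. The sketch below uses an in-memory SQLite table as a stand-in for the Lakehouse (the table name and columns are invented for illustration):

```python
import sqlite3

# Stand-in for the Object's backing dataset in the Lakehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE aircraft (tail_number TEXT, year_built INTEGER)")
conn.executemany(
    "INSERT INTO aircraft VALUES (?, ?)",
    [(f"N{i:03d}", 1990 + i % 30) for i in range(250)],  # 250 fake rows
)

# A preview only ever pulls the first 100 instances, however large the table.
preview = conn.execute("SELECT * FROM aircraft LIMIT 100").fetchall()
print(len(preview))
# 100
```

Capping the preview keeps validation fast and cheap even when the underlying table holds millions of rows.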
On the right-hand side, the Dependents Panel provides critical, automated data lineage: it shows you all the downstream resources in the Arkham platform that consume the selected Object. This is your primary tool for impact analysis.
This panel removes the guesswork from maintenance, making your semantic layer safe to evolve and improve over time.
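The impact analysis this panel automates amounts to walking the dependency graph outward from the selected Object. A minimal sketch, with hypothetical resource names and a plain dict standing in for the lineage store:

```python
from collections import deque

# Hypothetical lineage data: each key's value lists its direct consumers.
dependents = {
    "Aircraft": ["Fleet Dashboard", "Maintenance Model"],
    "Maintenance Model": ["Ops Report"],
}

def impacted(obj: str) -> list:
    """Breadth-first walk to collect every downstream resource of obj."""
    seen, queue = set(), deque([obj])
    while queue:
        for consumer in dependents.get(queue.popleft(), []):
            if consumer not in seen:
                seen.add(consumer)
                queue.append(consumer)
    return sorted(seen)

print(impacted("Aircraft"))
# ['Fleet Dashboard', 'Maintenance Model', 'Ops Report']
```

Note that the walk is transitive: `Ops Report` is flagged even though it only consumes the `Maintenance Model`, not the `Aircraft` Object directly.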
The `+ Add Object` button in the top-right corner opens a wizard that allows you to create a new Object by mapping it to a production-grade dataset in your Lakehouse.