Observability for Vision AI models in production

Monitor CV pipelines for drift and outliers, and debug model behavior with XAI explanations.

The problem

Vision models are hard to keep reliable.

Silent Performance Degradation

Input shifts happen (lighting, devices, domains) and performance degrades silently

Unexpected Outliers

Outliers appear and cause unexpected behavior

Slow Debugging

Debugging is slow without data visibility and explanations

The solution

ObzAI gives you visibility into your vision pipeline

Data Inspection

Thoroughly profile your reference data and detect outliers in production.

Explanations (XAI)

Seamlessly integrate XAI methods with your CV pipeline.

Monitoring Dashboard

Get visual insights into drift, outliers, XAI heatmaps and pipeline behavior over time.

Supports: classification, segmentation, and image-to-image translation (pix2pix).

Detection planned next.

Data Inspection

Profile your reference dataset, then detect drift and outliers in production before they affect performance.

See docs

[Dashboard preview: Detection Trends & Outliers, a time-series view of statistical measures for each ML feature]
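To illustrate the idea behind reference profiling and outlier detection (a generic sketch, not ObzAI's implementation, which fits a GMM over extracted features), one can profile a reference feature set and flag production samples whose Mahalanobis distance from that profile is unusually large:

```python
import numpy as np

rng = np.random.default_rng(0)

# Reference features: e.g. per-image statistics extracted from your reference set
reference = rng.normal(loc=0.0, scale=1.0, size=(500, 3))

# Profile the reference data: mean vector and covariance matrix
mu = reference.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

def mahalanobis_sq(x):
    """Squared Mahalanobis distance of each row of x from the reference profile."""
    d = x - mu
    return np.einsum("ij,jk,ik->i", d, cov_inv, d)

# Threshold: the 99th percentile of reference distances defines the "normal" range
threshold = np.quantile(mahalanobis_sq(reference), 0.99)

# Production batch: mostly in-distribution, plus one obvious outlier
batch = rng.normal(size=(4, 3))
batch[0] = [8.0, -8.0, 8.0]

flags = mahalanobis_sq(batch) > threshold
```

The same pattern generalizes: fit any density model on reference features, score incoming batches, and alert when scores fall outside the reference range.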

Explanations (XAI)

Log heatmaps alongside predictions to see what the model relies on and debug failure cases faster.

See docs

[Example: an original image of a golf ball alongside its XAI attention heatmap]

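To show the principle behind such heatmaps (a minimal occlusion-saliency sketch in plain numpy, not ObzAI's SaliencyTool), occlude each patch of the input, measure how much the model's score drops, and the drops form a map of what the model relies on:

```python
import numpy as np

# Toy "model": scores an image by the brightness of its top-left quadrant,
# standing in for a real classifier's class logit.
def model_score(img):
    return img[:8, :8].mean()

img = np.zeros((16, 16))
img[:8, :8] = 1.0  # the "object" the model actually relies on

# Occlusion saliency: zero out each patch and record the score drop.
patch = 4
base = model_score(img)
heatmap = np.zeros((16 // patch, 16 // patch))
for i in range(heatmap.shape[0]):
    for j in range(heatmap.shape[1]):
        occluded = img.copy()
        occluded[i * patch:(i + 1) * patch, j * patch:(j + 1) * patch] = 0.0
        heatmap[i, j] = base - model_score(occluded)

# The heatmap is high exactly over the region the model's evidence lives in.
```

Gradient-based saliency methods compute the same kind of attribution analytically, which is what makes logging heatmaps next to predictions useful for debugging failure cases.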
How it works

Setup. Log. Monitor.

1

Setup your pipeline with ObzAI SDK

2

Log predictions, stats, embeddings and explanations

3

Monitor in ObzAI Cloud

How to get started

Start in minutes

1

Install ObzAI and fit the Data Inspector

# Install ObzAI SDK
pip install obzai

# Import Feature Extractor and Data Inspector
from obzai.data_inspection.extractors import FirstOrderExtractor
from obzai.data_inspection.inspectors import GMMInspector

# Initialize classes
feat_extractor = FirstOrderExtractor()
gmm_inspector = GMMInspector(extractors=feat_extractor, n_components=3)

# Fit Data Inspector on your reference data
gmm_inspector.fit(reference_dataloader)

# Save Data Inspector checkpoint for later
gmm_inspector.save_checkpoint()
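For intuition, first-order features are statistics of an image's pixel-intensity distribution, computed without any spatial context. A minimal numpy sketch of such features (an illustration of the concept, not ObzAI's FirstOrderExtractor implementation):

```python
import numpy as np

def first_order_features(img):
    """First-order statistics of an image's intensity distribution:
    computed from pixel values alone, ignoring spatial structure."""
    flat = np.asarray(img, dtype=float).ravel()
    hist, _ = np.histogram(flat, bins=32)
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -(p * np.log2(p)).sum()
    return np.array([flat.mean(), flat.std(), flat.min(), flat.max(), entropy])

rng = np.random.default_rng(0)
batch = rng.random((4, 16, 16))  # a batch of 4 grayscale "images"
features = np.stack([first_order_features(im) for im in batch])

# One feature vector per image; a GMM fitted on these vectors over the
# reference set yields a density model for flagging out-of-distribution inputs.
```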
2

Set up ObzAI SDK Client

# Imports
from obzai import ObzClient
from obzai.data_inspection.inspectors import GMMInspector
# NOTE: import path for SaliencyTool assumed; check the obzai docs for your version
from obzai.xai.tools import SaliencyTool

# Load the Data Inspector from a checkpoint
gmm_inspector = GMMInspector.load_checkpoint(path="your_ckpt_path")

# Set up XAI tool(s)
saliency_tool = SaliencyTool(model=MODEL)

# Set up the ObzAI Client
client = ObzClient(
    data_inspectors=[gmm_inspector],
    xai_tools=[saliency_tool]
)

# Validate API Key
client.login(api_key=API_KEY)

# Set up project
from obzai.client import MLTask
client.setup_project(
    project_name="Your Project",
    ml_task=MLTask.CLASSIFICATION
)
3

Log data

# Run model inference and get predictions
probabilities, max_logits = inference_pipeline(image_batch)

# Run the Data Inspector on a batch of images
client.run_inspectors(input_images=image_batch)

# Run the XAI tool on a batch of images
client.run_explainers(images=image_batch, target_idxs=max_logits)

# Log data to the ObzAI Dashboard
client.log(input_images=image_batch, predictions=probabilities)

Join Early Access

We're building ObzAI with early users. If you run vision models in production, we'd love your feedback.

Free access during beta

Direct influence on roadmap

Help with onboarding

Open-source core

Committed to Open Community

We believe in transparency, collaboration, and community-driven innovation. Our core SDK is open-source because we want you to understand, trust, and contribute to the tools that power your vision AI systems.

By open-sourcing ObzAI, we're building a community of practitioners who can share knowledge, improve the platform together, and ensure that AI observability remains accessible to everyone, from startups to enterprises.

Your contributions, feedback, and ideas shape the future of ObzAI. Join us in making vision AI more explainable, reliable, and trustworthy for everyone.

Community-Driven

Built with and for the community

Fully Transparent

Source code open for inspection

Collaborative

Contributions welcome and valued