Databricks, MLflow and Credo AI Integration

Ensure proper AI governance without requiring any additional work from your model development teams. Meet your technical stakeholders where they are by integrating your MLflow model store with Credo AI. Any relevant model metadata stored in MLflow will be made available for governance in Credo AI, reducing the burden on your technical teams and streamlining your governance workflows.

This integration can be enabled for cloud and on-prem customers. Reach out to your Credo AI team to learn more.


Register your models in Managed MLflow.

Register your ML models in MLflow managed in the cloud by Databricks (managed MLflow) so that model version data and metadata are automatically pulled into Credo AI's Model Registry.
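
As a rough illustration, registering a model version from a training run might look like the sketch below. It assumes a Databricks workspace is already configured as the MLflow tracking and registry backend; the model name and training code are placeholders.

```python
import mlflow
from mlflow.models import infer_signature
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# Point MLflow at the Databricks-managed tracking server and model registry.
# Assumes Databricks credentials are already configured (CLI profile or env vars).
mlflow.set_tracking_uri("databricks")
mlflow.set_registry_uri("databricks")

# Placeholder training step.
X, y = load_iris(return_X_y=True)
model = RandomForestClassifier().fit(X, y)

with mlflow.start_run():
    # Passing registered_model_name registers the logged model in the
    # MLflow Model Registry, creating a new version each time it is logged.
    mlflow.sklearn.log_model(
        sk_model=model,
        artifact_path="model",
        signature=infer_signature(X, model.predict(X)),
        registered_model_name="credit_risk_classifier",  # hypothetical model name
    )
```

Each new version registered this way becomes visible to the integration, so there is no separate export or upload step for your data science team.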

See models and metadata in the Credo AI Model Registry.

Your models and their metadata will be available in the Credo AI Model Registry, where you can add those models to AI Use Cases to ensure they meet compliance and risk mitigation requirements. Select which model version to associate with a Use Case for governance, and see when new versions that need to be governed become available in your Model Registry.
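
If you want additional context attached to a model version on the MLflow side, you can set a description and tags there using the standard MLflow client. The snippet below is an illustrative sketch with hypothetical names and values; which fields surface in Credo AI, and how, depends on your integration configuration.

```python
from mlflow.tracking import MlflowClient

# Uses the same Databricks tracking/registry URIs configured earlier.
client = MlflowClient()

# Add a human-readable description to a specific model version.
client.update_model_version(
    name="credit_risk_classifier",   # hypothetical model name
    version="3",                     # hypothetical version number
    description="Gradient-boosted classifier retrained on Q2 data.",
)

# Tag the version with governance-relevant metadata.
client.set_model_version_tag(
    name="credit_risk_classifier",
    version="3",
    key="intended_use",
    value="internal credit risk scoring",
)
```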