MLRun is Iguazio's open-source MLOps orchestration framework. MLRun lets you run the same code locally on your PC for testing or at scale on a Kubernetes cluster with minimal changes, and it tracks all experiments along with their parameters, inputs, outputs, and labels.
Automated and Scalable MLOps Orchestration
Run multiple experiments in parallel, each using a different combination of algorithm functions and parameter sets (hyperparameters), and automatically select the best result.
Describe and track the code, metadata, inputs, and outputs of machine-learning tasks (executions), and reuse the results through a generic, easy-to-use mechanism.
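The two ideas above can be sketched in plain Python (illustrative only, not MLRun's actual API): run a grid of parameter sets in parallel, record each execution's parameters and outputs, and pick the best run by a metric. The `train` function and its accuracy formula are stand-ins invented for the example.

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def train(params):
    # Stand-in for a real training job; the score is a made-up
    # function of the hyperparameters so the example is runnable.
    accuracy = 0.5 + params["lr"] + 0.01 * params["depth"]
    # Each run is tracked as a record of its inputs and outputs.
    return {"params": params, "outputs": {"accuracy": accuracy}}

# The hyperparameter grid: every combination of lr and depth.
grid = [{"lr": lr, "depth": d} for lr, d in product([0.01, 0.1], [2, 4])]

# Run the experiments in parallel and keep every tracked record.
with ThreadPoolExecutor() as pool:
    runs = list(pool.map(train, grid))

# Select the best result by the tracked metric.
best = max(runs, key=lambda r: r["outputs"]["accuracy"])
print(best["params"])
```

In MLRun itself, the framework handles the parallel execution, the tracking database, and the best-run selection for you; this sketch only shows the shape of the workflow.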
Online and Offline Feature Store
Maintain the same set of features in both the training and real-time inference stages with MLRun's unified feature store.
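The value of a unified feature definition can be sketched as follows (illustrative Python, not the MLRun feature-store API): one shared transformation is applied both to an offline training batch and to a single online request, so the two paths cannot drift apart. The feature names and raw fields are invented for the example.

```python
def make_features(raw):
    # One shared feature definition used in both training and serving,
    # so offline and online feature values stay consistent.
    return {
        "amount_bucket": min(int(raw["amount"]) // 100, 9),
        "is_weekend": int(raw["day_of_week"] in (5, 6)),
    }

# Offline: build training features over a batch of historical records.
batch = [{"amount": 250, "day_of_week": 5}, {"amount": 40, "day_of_week": 2}]
training_rows = [make_features(r) for r in batch]

# Online: the exact same function serves a real-time request.
request = {"amount": 250, "day_of_week": 5}
online_row = make_features(request)

# Training/serving consistency: identical input, identical features.
assert online_row == training_rows[0]
```

A feature store generalizes this idea by storing the definitions centrally and materializing them into both an offline store (for training) and an online store (for low-latency serving).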
Natively integrate with Kubeflow Pipelines to compose, deploy, and manage end-to-end machine learning workflows, with a UI and a set of supporting services.