AI/ML projects can run up big compute bills. With Spark Operator, you can take advantage of spot instances and dynamic executor allocation, which can deliver significant savings. Here's how to set it up in MLRun in a few simple steps.
AutoMLOps means automating engineering tasks so that your code is automatically ready for production. Here we outline the challenges and share open-source tools.
In this article, we walk you through the steps to run a Jenkins server in Docker and deploy an MLRun project using a Jenkins pipeline.
Here's how to build simple AI applications that leverage pre-built ML models and let you interact with a UI to visualize the results.
Here's how to turn your existing model training code into an MLRun job and get built-in experiment tracking, plus much more.
A step-by-step tutorial covering the challenges of experiment tracking and how to solve them with MLRun, an open-source framework that streamlines the management of machine learning operations.