AI/ML projects can run up big compute bills. With Spark Operator, you can take advantage of spot instances and dynamic executor allocation, which can deliver significant savings. Here's how to set it up in MLRun in a few simple steps.
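As a rough, hedged sketch of what that setup can look like, the example below uses MLRun's Spark Operator runtime (kind="spark"). The project name, function name, script and image are placeholders, and the with_dynamic_allocation / with_preemption_mode calls assume a recent MLRun release; exact arguments may differ by version.

```python
import mlrun

# Placeholder project, for illustration only.
project = mlrun.get_or_create_project("spark-demo", context="./")

# Spark Operator runtime running a PySpark script (spark_job.py is a placeholder).
spark_fn = mlrun.new_function(
    name="spark-func", kind="spark", command="spark_job.py", image="mlrun/mlrun"
)

# Request modest resources for the driver and each executor.
spark_fn.with_driver_requests(cpu=1, mem="1G")
spark_fn.with_executor_requests(cpu=1, mem="1G")

# Dynamic executor allocation: scale executors with the workload
# instead of paying for a fixed-size cluster.
spark_fn.with_dynamic_allocation(min_executors=1, max_executors=5, initial_executors=1)

# Allow the pods to land on preemptible (spot) nodes for additional savings.
spark_fn.with_preemption_mode("allow")

spark_fn.run()
```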
When it comes to scaling your ML operations, a high-quality, reliable and effective MLOps platform is essential for growth.
Here's how to use the Iguazio feature store to build, store and share features from your Snowflake data.
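As a hedged illustration of the feature-store side (not the article's exact steps), the sketch below ingests a DataFrame, which you might have pulled from Snowflake with your connector of choice, into an MLRun feature set. The feature-set name, entity and columns are made up, and fstore.ingest assumes an active MLRun project context.

```python
import pandas as pd
import mlrun.feature_store as fstore

# Placeholder data standing in for a query result fetched from Snowflake.
transactions_df = pd.DataFrame(
    {"customer_id": [1, 2], "amount": [120.5, 40.0], "tx_count": [3, 1]}
)

# Define a feature set keyed by customer and ingest it into the feature store,
# where the features can be stored and shared across projects.
tx_set = fstore.FeatureSet("transactions", entities=[fstore.Entity("customer_id")])
fstore.ingest(tx_set, transactions_df)
```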
Open and free financial datasets and economic datasets are an essential starting point for data scientists and engineers who are developing and training ML models for finance. Here are 13 excellent ones.
Iguazio users can now run their ML workloads on AWS EC2 Spot instances. When running ML functions, you might want to control whether they run on Spot nodes or on On-Demand compute instances. When deploying the Iguazio MLOps platform on AWS and running a job (e.g. model training) or deploying a serving function, users can now choose to deploy it...
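For example, with MLRun's Python SDK that choice can be expressed per function via its preemption mode, assuming the with_preemption_mode API available on Kubernetes-based runtimes; the function name and training script here are hypothetical.

```python
import mlrun

# Hypothetical training function, for illustration only.
trainer = mlrun.code_to_function(
    name="trainer", filename="train.py", kind="job", image="mlrun/mlrun", handler="train"
)

# Prefer cheaper Spot capacity for fault-tolerant training...
trainer.with_preemption_mode("allow")      # pods may schedule on Spot nodes

# ...or force On-Demand nodes for jobs that must not be interrupted.
# trainer.with_preemption_mode("prevent")

trainer.run(params={"epochs": 5})
```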
AutoMLOps means automating engineering tasks so that your code is automatically ready for production. Here we outline the challenges and share open-source tools.
In this article, we walk you through the steps to run a Jenkins server in Docker and deploy an MLRun project using a Jenkins pipeline.
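As a small, hedged sketch, a Jenkins pipeline stage can simply shell out to a Python step like the one below, which loads the MLRun project from Git and runs its workflow; the repository URL, workflow name and arguments are placeholders.

```python
import mlrun

# Load the MLRun project from source control (URL and context are placeholders).
project = mlrun.load_project(
    context="./project", url="git://github.com/example/mlrun-project.git"
)

# Run the project's workflow; watch=True blocks until it finishes,
# so the Jenkins stage fails if the workflow fails.
project.run("main", arguments={"model_name": "demo"}, watch=True)
```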