
How do I use MLRun for batch sizing?

A batch is a group of instances from the dataset. Batches are fed into the different phases of the MLOps pipeline, such as the training phase, for processing. Batch processing is used for use cases that do not require real-time or online data, which makes it more resource-efficient and simpler to run.

You can practice running batch inference with MLRun by following this tutorial.
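
As a rough illustration of what the tutorial covers, the sketch below imports a batch-inference function from the MLRun function hub and runs it over a file of records. The project name, file paths, model URI, and the exact hub function parameters (`dataset`, `model_path`) are assumptions for illustration; check the tutorial for the current signature.

```python
import mlrun

# Create (or load) a project to hold the batch-inference run.
# "batch-demo" and the local context directory are example values.
project = mlrun.get_or_create_project("batch-demo", context="./")

# Import a batch-inference function from the MLRun function hub.
# The hub entry name and its parameters are assumptions here --
# verify them against the tutorial and the hub documentation.
batch_inference = mlrun.import_function("hub://batch_inference_v2")

# Run the function over a whole batch of records (a Parquet file in
# this example) instead of serving single real-time requests.
batch_run = project.run_function(
    batch_inference,
    inputs={"dataset": "./data/prediction_set.parquet"},          # example batch file
    params={"model_path": "store://models/batch-demo/my-model"},  # example model URI
)

# Inspect the outputs (e.g. the prediction artifact) logged by the run.
print(batch_run.outputs)
```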
