
Accelerate and Automate your AI Pipeline with Iguazio and Google

GCP Integration

Continuously and rapidly deliver AI applications to production with the Iguazio MLOps Platform and feature store, deployed on Google Cloud and fully integrated with a wide range of Google Cloud services. With Iguazio on GCP, data scientists, data engineers and DevOps teams can:

  • Deploy AI Models Faster: Build models on Vertex AI and deploy them to production as scalable serverless functions running on an elastic inference layer (see the serving sketch after this list).
  • Monitor Models, Detect Drift and Automate Retraining: Leverage Iguazio’s built-in monitoring to identify drift in models developed on Vertex AI. When drift is detected, Iguazio automatically triggers a training job in Vertex AI to retrain and optimize the model (see the retraining sketch after this list).
  • Leverage a Feature Store: Use Iguazio’s feature store with GCP to shorten the development cycle and reduce the overhead of feature engineering. Data scientists write online and offline features once and reuse them for both training and online inference (see the feature store sketch after this list).
  • Optimize Feature Management: Create and manage features in the Vertex AI environment with the Iguazio feature store. Features can be used for both online inference and offline training.
  • Support Real-time Use Cases: Ingest streaming data from any source, use the Iguazio real-time feature store as a data transformation service to generate fresh features, and build scalable, low-latency applications for use cases such as predictive maintenance, real-time recommendation engines and fraud prediction.
  • Deploy Anywhere: A single seamless experience to deploy your AI on Google Cloud or Anthos.
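
As a rough illustration of the deployment flow above, the serving sketch below wraps a model exported from Vertex AI training in an MLRun serving function (MLRun is the open-source framework underlying the Iguazio platform). The project name, model path and serving class are hypothetical placeholders, and exact API details may vary across MLRun versions.

```python
import mlrun

# Hypothetical project name and context directory -- replace with your own.
project = mlrun.get_or_create_project("fraud-demo", context="./")

# Create a serving function: a scalable, auto-scaling inference endpoint
# (backed by serverless functions on the Iguazio platform).
serving_fn = mlrun.new_function("fraud-serving", kind="serving", image="mlrun/mlrun")

# Register the model artifact produced by Vertex AI training.
# The GCS path and serving class below are placeholders for illustration.
serving_fn.add_model(
    "fraud-model",
    model_path="gs://my-bucket/models/fraud/",
    class_name="FraudModelServer",  # hypothetical class derived from mlrun.serving.V2ModelServer
)

# Turn on built-in model monitoring so drift can be detected after deployment.
serving_fn.set_tracking()

# Deploy the function as a real-time HTTP endpoint on the cluster.
serving_fn.deploy()
```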
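
The retraining sketch below shows one way the drift-to-retraining loop could be wired up: a handler, invoked when Iguazio's model monitoring flags drift, launches a Vertex AI custom training job through the google-cloud-aiplatform SDK. The event payload shape, GCP project, container image and training script are assumptions for illustration only.

```python
from google.cloud import aiplatform


def retrain_on_drift(event: dict):
    """Hypothetical handler invoked when Iguazio model monitoring reports drift."""
    # The payload field below is an assumption; adapt it to the actual drift event.
    if event.get("drift_status") != "DRIFT_DETECTED":
        return

    # Placeholder GCP project and region -- replace with your own.
    aiplatform.init(project="my-gcp-project", location="us-central1")

    # Launch a Vertex AI custom training job to refresh the model.
    job = aiplatform.CustomTrainingJob(
        display_name="fraud-model-retrain",
        script_path="train.py",  # assumed local training script
        container_uri="us-docker.pkg.dev/vertex-ai/training/sklearn-cpu.1-0:latest",  # placeholder image
    )
    job.run(replica_count=1, machine_type="n1-standard-4", sync=False)
```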
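
The feature store sketch below illustrates the write-once pattern: a feature set is defined and ingested once, then the same features are read as an offline dataset for training and through an online service for low-latency inference. Feature set, entity and vector names are hypothetical, and API names may differ between MLRun versions.

```python
import pandas as pd
import mlrun.feature_store as fstore
from mlrun.feature_store import Entity, FeatureSet

# Define a feature set keyed by customer_id (names are illustrative).
transactions = FeatureSet("transactions", entities=[Entity("customer_id")])

# Ingest a batch (or attach a streaming source) once; the same definition
# populates both the offline store and the online (real-time) store.
df = pd.DataFrame({"customer_id": ["c1", "c2"], "amount": [120.0, 35.5]})
fstore.ingest(transactions, df)

# Build a feature vector and read it offline for training...
vector = fstore.FeatureVector("fraud-features", ["transactions.*"])
train_df = fstore.get_offline_features(vector).to_dataframe()

# ...and online for low-latency inference, using the same definitions.
svc = fstore.get_online_feature_service(vector)
features = svc.get([{"customer_id": "c1"}])
svc.close()
```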

Seamlessly integrates with:

  • Google Kubernetes Engine 
  • Anthos 
  • Google Cloud Storage 
  • BigQuery 
  • Vertex AI

Simplify and Accelerate MLOps with Iguazio & Google

Explore More of the Google – Iguazio Partnership

For inquiries on the joint Google-Iguazio solution, please contact us: