
Is there a privacy issue or data leakage risk with custom models that use proprietary or public data?

Yes. There have been documented cases where chat information was leaked, which makes it all the more important to implement guardrails and involve the legal team in order to minimize the risk of a privacy breach. As a general rule of thumb, it's recommended not to share private data with LLMs, since it can accidentally leak or be exfiltrated. This is especially important when running a POC against a public LLM; when the business hosts an internal LLM, the considerations may change.
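To illustrate what a basic guardrail can look like, here is a minimal, hypothetical sketch of a pre-prompt PII filter that redacts common patterns (emails, phone numbers, SSNs) before a prompt leaves the organization's boundary. The `redact_pii` helper and its regex patterns are assumptions for illustration only; production guardrails typically combine dedicated PII-detection tooling, policy enforcement, and legal review rather than a handful of regular expressions.

```python
import re

# Hypothetical, minimal guardrail: redact common PII patterns before a prompt
# is sent to an external LLM. Real deployments would rely on dedicated
# PII-detection tooling and policies reviewed with the legal team.
PII_PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "PHONE": re.compile(r"\b(?:\+?\d{1,3}[ -]?)?(?:\(\d{3}\)|\d{3})[ -]?\d{3}[ -]?\d{4}\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact_pii(prompt: str) -> str:
    """Replace anything that looks like PII with a typed placeholder."""
    redacted = prompt
    for label, pattern in PII_PATTERNS.items():
        redacted = pattern.sub(f"[REDACTED_{label}]", redacted)
    return redacted

if __name__ == "__main__":
    raw = "Contact Jane at jane.doe@example.com or 555-123-4567 about invoice 42."
    print(redact_pii(raw))
    # -> Contact Jane at [REDACTED_EMAIL] or [REDACTED_PHONE] about invoice 42.
```

A filter like this would sit in front of any outbound LLM call made during a POC; for an internally hosted LLM, the same hook can instead log or audit what data the model sees.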

Interested in learning more?

Check out this 9-minute demo that covers MLOps best practices for generative AI applications.

View this webinar with QuantumBlack, AI by McKinsey, which covers the challenges of deploying and managing LLMs in live user-facing business applications.

Check out this demo and repo that demonstrate how to fine-tune an LLM and build an application around it.

Need help?

Contact our team of experts or ask a question in the community.

Have a question?

Submit your questions on machine learning and data science to get answers from our team of data scientists, ML engineers, and IT leaders.