There have been cases where chat information was leaked, which makes it all the more important to implement guardrails and involve the legal team to minimize the risk of a privacy breach. As a rule of thumb, avoid sharing private data with LLMs, since it might accidentally leak or be exfiltrated. This is especially important when running a proof of concept (POC) with a public LLM; when the business hosts an internal LLM, the considerations may change.
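One way to put such a guardrail into practice is to redact obvious private data before a prompt ever reaches a public LLM. The sketch below is a minimal, illustrative example using a few assumed regular-expression patterns (email, US SSN, phone number); a production guardrail would need far more robust detection, for example a dedicated PII-detection service.

```python
import re

# Illustrative PII patterns; these are assumptions for the sketch,
# not an exhaustive or production-ready filter.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def redact(prompt: str) -> str:
    """Replace detected PII with typed placeholders before the prompt
    is sent to an external LLM."""
    for label, pattern in PII_PATTERNS.items():
        prompt = pattern.sub(f"[{label}]", prompt)
    return prompt

print(redact("Contact jane.doe@example.com or 555-867-5309, SSN 123-45-6789."))
# → Contact [EMAIL] or [PHONE], SSN [SSN].
```

A guardrail like this sits between the application and the LLM API, so the same check applies regardless of which model is called; logging which placeholders were substituted also gives the legal team an audit trail.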
Interested in learning more?
Check out this 9-minute demo that covers MLOps best practices for generative AI applications.
View this webinar with QuantumBlack, AI by McKinsey, which covers the challenges of deploying and managing LLMs in live, user-facing business applications.
Check out this demo and repo that demonstrate how to fine-tune an LLM and build an application.