From the course: GenAIOps Foundations


Hallucinations

- [Instructor] Hallucinations are a known limitation of generative models. Let's discuss what hallucinations are and how to control them. GenAIOps plays a key role in automating the detection of hallucinations and ensuring trusted outcomes for users. What is a hallucination in a generative AI context? A hallucination happens when the output of a GenAI model is not supported by facts. The model may report data that is not true or events that never happened; the output is disconnected from reality. Using such output in business processes can lead to incorrect results and a loss of trust. Hallucinations happen because of how generative models are designed. A model can interpret independent facts as related and produce incorrect answers. One cause is a lack of contextual understanding, where the prompt does not provide specific context for the question being asked. The training data also contributes to hallucinations if it misrepresents facts. How do we manage hallucinations? It begins…
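To make the automated-detection idea concrete, here is a minimal sketch of a groundedness check, the kind of guardrail a GenAIOps pipeline might run on every model response. The function names, the word-overlap heuristic, and the 0.6 threshold are all illustrative assumptions, not a reference implementation; production systems typically rely on retrieval-based fact checking or a separate evaluator model instead.

```python
# Minimal sketch: flag output sentences that are poorly supported by a
# known source context. Heuristic and threshold are illustrative only.
import re


def support_score(claim: str, context: str) -> float:
    """Fraction of the claim's words that also appear in the context."""
    words = lambda text: set(re.findall(r"[a-z0-9']+", text.lower()))
    claim_words = words(claim)
    if not claim_words:
        return 1.0
    return len(claim_words & words(context)) / len(claim_words)


def flag_unsupported(output: str, context: str,
                     threshold: float = 0.6) -> list[str]:
    """Return output sentences whose support score falls below threshold."""
    sentences = re.split(r"(?<=[.!?])\s+", output.strip())
    return [s for s in sentences if support_score(s, context) < threshold]


if __name__ == "__main__":
    # Hypothetical example data for illustration.
    context = "Acme Corp reported revenue of 4.2 million dollars in Q3 2024."
    output = ("Acme Corp reported revenue of 4.2 million dollars in Q3 2024. "
              "The CEO also announced a merger with Globex.")
    for sentence in flag_unsupported(output, context):
        print("Possible hallucination:", sentence)
```

Running this flags the second sentence, which has no support in the source context. The design point is that the check is cheap and automatic, so it can sit in the serving path and route low-scoring responses to review rather than straight to the user.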
