Leaderboard Comparing LLM Performance at Producing Hallucinations when Summarizing Short Documents
Updated Aug 29, 2025 · Python
✨✨Woodpecker: Hallucination Correction for Multimodal Large Language Models
Official implementation for the paper "DoLa: Decoding by Contrasting Layers Improves Factuality in Large Language Models"
A curated list of trustworthy deep learning papers, updated daily.
Protect your AI agents and GenAI apps with confidence
Hallucinations (Confabulations) Document-Based Benchmark for RAG. Includes human-verified questions and answers.
[ACL 2024] User-friendly evaluation framework: Eval Suite & Benchmarks: UHGEval, HaluEval, HalluQA, etc.
An attack that induces hallucinations in LLMs.
Code for ACL 2024 paper "TruthX: Alleviating Hallucinations by Editing Large Language Models in Truthful Space"
Framework for testing vulnerabilities of large language models (LLMs).
Dataset and evaluation script for "Evaluating Hallucinations in Chinese Large Language Models"
Code for the EMNLP 2024 paper "Detecting and Mitigating Contextual Hallucinations in Large Language Models Using Only Attention Maps"
Initiative to evaluate and rank the most popular LLMs across common task types based on their propensity to hallucinate.
mPLUG-HalOwl: Multimodal Hallucination Evaluation and Mitigating
[ICML 2024] Official implementation for "HALC: Object Hallucination Reduction via Adaptive Focal-Contrast Decoding"
An Easy-to-use Hallucination Detection Framework for LLMs.
Repository for the paper "Cognitive Mirage: A Review of Hallucinations in Large Language Models"
[CVPR 2025] Devils in Middle Layers of Large Vision-Language Models: Interpreting, Detecting and Mitigating Object Hallucinations via Attention Lens
Official repo for SAC3: Reliable Hallucination Detection in Black-Box Language Models via Semantic-aware Cross-check Consistency