---
license: cc-by-4.0
title: >-
  SECA: Semantically Equivalent and Coherent Attacks for Eliciting LLM
  Hallucinations
sdk: static
emoji: π
colorFrom: gray
colorTo: gray
pinned: true
short_description: Eliciting LLM Hallucinations
thumbnail: >-
  https://cdn-uploads.huggingface.co/production/uploads/653839549d7dd331b28dac39/hnHFsOe4Qj0tBYoeVfPnf.jpeg
---

# SECA project website

This Space is a landing page for our NeurIPS 2025 paper.

- Paper (arXiv): https://arxiv.org/abs/2510.04398
- NeurIPS page: https://neurips.cc/virtual/2025/poster/119640
- Code: https://github.com/Buyun-Liang/SECA
- Project website: https://buyunliang.org/project_website/seca/