Similar Items:
- LLM hallucinations in the wild: Large-scale evidence from non-existent citations
- HalluCiteChecker: A Lightweight Toolkit for Hallucinated Citation Detection and Verification in the Era of AI Scientists
- Generating Synthetic Citation Networks with Communities
- CiteRadar: A Citation Intelligence Platform for Researcher Profiling and Geographic Visualization
- LLM-ReSum: A Framework for LLM Reflective Summarization through Self-Evaluation
- Impact of large language models on peer review opinions from a fine-grained perspective: Evidence from top conference proceedings in AI
- Automating Categorization of Scientific Texts with In-Context Learning and Prompt-Chaining in Large Language Models