Similar Items:
- A Causal Language Modeling Detour Improves Encoder Continued Pretraining
- Pretraining Exposure Explains Popularity Judgments in Large Language Models
- Beyond Decodability: Reconstructing Language Model Representations with an Encoding Probe
- EMO: Pretraining Mixture of Experts for Emergent Modularity
- Fuzzy Fingerprinting Encoder Pre-trained Language Models for Emotion Recognition in Conversations: Human Assessment and Validity Study
- LASE: Language-Adversarial Speaker Encoding for Indic Cross-Script Identity Preservation
- Shadow-Loom: Causal Reasoning over Graphical World Model of Narratives