LLM Research
Is Multilingual LLM Watermarking Truly Multilingual? A Simple Back-Translation Solution (arXiv:2510.18019)
PORTool: Tool-Use LLM Training with Rewarded Tree (arXiv:2510.26020)
POWSM: A Phonetic Open Whisper-Style Speech Foundation Model (arXiv:2510.24992)
Ming-Flash-Omni: A Sparse, Unified Architecture for Multimodal Perception and Generation (arXiv:2510.24821)
Generalization or Memorization: Dynamic Decoding for Mode Steering (arXiv:2510.22099)
Omni-Reward: Towards Generalist Omni-Modal Reward Modeling with Free-Form Preferences (arXiv:2510.23451)
ARC-Encoder: learning compressed text representations for large language models (arXiv:2510.20535)
Continuous Autoregressive Language Models (arXiv:2510.27688)
Can Visual Input Be Compressed? A Visual Token Compression Benchmark for Large Multimodal Models (arXiv:2511.02650)
RADLADS: Rapid Attention Distillation to Linear Attention Decoders at Scale (arXiv:2505.03005)
Titans: Learning to Memorize at Test Time (arXiv:2501.00663)