Evaluating distillation methods for data-efficient syntax learning.

Published in Findings of the Association for Computational Linguistics (ACL), 2025.