CERTAIN board member Simon Ostermann co-authored a paper on Advancing Cross-Lingual NLP with Smarter Transfer Methods

Cross-lingual knowledge transfer, particularly from high- to low-resource languages, remains one of NLP's most persistent challenges. At NAACL 2025, CERTAIN researcher Simon Ostermann co-authored a paper exploring how parameter-efficient fine-tuning methods can improve multilingual AI systems without massive computational overhead, directly supporting CERTAIN's mission to make AI inclusive, efficient, and trustworthy across all languages.

The paper presents soft language prompts, the first use of soft prompts for language transfer. It systematically compares language-specific adapters, task-specific adapters, and soft prompts across 16 languages, with a focus on 10 mid- and low-resource languages. Contrary to earlier claims, the study shows that the best results often come from combining a soft language prompt with a task adapter rather than stacking language and task adapters. The takeaway: smarter, lighter fine-tuning strategies can significantly boost performance in low-resource settings, opening the door to more inclusive and sustainable NLP and supporting the development of Trusted AI in multilingual contexts.
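For readers unfamiliar with the technique, the sketch below illustrates the general idea behind a soft prompt: a small set of trainable embedding vectors prepended to a frozen model's input embeddings, so only a few thousand parameters are updated per language. This is a minimal, generic illustration, not the paper's implementation; the class and parameter names (SoftPrompt, num_virtual_tokens) are placeholders, and in the paper's best-performing setup such a language prompt would be trained alongside a separate task adapter.

```python
import torch
import torch.nn as nn

class SoftPrompt(nn.Module):
    """Trainable prompt vectors prepended to frozen input embeddings (illustrative)."""

    def __init__(self, num_virtual_tokens: int, embed_dim: int):
        super().__init__()
        # Only these vectors are trained; the backbone model stays frozen.
        self.prompt = nn.Parameter(torch.randn(num_virtual_tokens, embed_dim) * 0.02)

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # input_embeds: (batch, seq_len, embed_dim)
        batch_size = input_embeds.size(0)
        prompt = self.prompt.unsqueeze(0).expand(batch_size, -1, -1)
        # Prepend the soft prompt to every sequence in the batch.
        return torch.cat([prompt, input_embeds], dim=1)

# Hypothetical usage with stand-in token embeddings from a frozen encoder:
embed_dim, num_tokens = 768, 20
soft_prompt = SoftPrompt(num_tokens, embed_dim)
dummy_embeds = torch.randn(2, 16, embed_dim)
extended = soft_prompt(dummy_embeds)  # shape: (2, 36, 768)
```

Because only the prompt vectors (and, where used, a lightweight adapter) are updated, this kind of setup keeps the computational and storage cost per language far below full fine-tuning, which is what makes it attractive for low-resource languages.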

[Link to the paper]