Most past studies of in-context methods such as in-context learning (ICL), cross-lingual ICL (X-ICL), and in-context alignment (ICA) were conducted on older, unaligned large language models (LLMs). Modern human-aligned LLMs differ: they use chat-style prompt templates, are extensively human-aligned, and cover many more languages. We re-examine these in-context techniques on two recent, human-aligned multilingual LLMs. Our study covers 20 languages from seven language families, spanning high-, mid-, and low-resource levels, and evaluates generalization on two tasks: topic classification (SIB-200) and machine reading comprehension (Belebele). We find that using prompt templates significantly improves the performance of both ICL and X-ICL. ICA proves particularly effective for mid- and low-resource languages, boosting their F1 score by up to 6.1%. For X-ICL, choosing a source language linguistically similar to the target language, rather than defaulting to English, can yield substantial gains of up to 21.98%. Semantically similar ICL examples remain highly beneficial for human-aligned LLMs, providing up to a 31.42% advantage over static examples; however, this gain shrinks when a machine translation model is used to translate the query from the target language. Collectively, these results suggest that while modern human-aligned LLMs clearly benefit from in-context information, the size of the gains depends heavily on careful prompt design, the language's resource level, the language pairing, and the overall complexity of the task.
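The abstract's two central ingredients, selecting semantically similar ICL examples and wrapping them in a chat-style prompt template, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy bag-of-words `embed` function stands in for a real multilingual sentence encoder, and the pool, labels, and function names are hypothetical.

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding" for illustration only; the paper's setting
    # would use a multilingual sentence encoder instead.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def select_examples(query, pool, k=2):
    # Rank the candidate pool by similarity to the query and keep the top k,
    # i.e. "semantically similar" rather than static ICL examples.
    q = embed(query)
    ranked = sorted(pool, key=lambda ex: cosine(q, embed(ex["text"])), reverse=True)
    return ranked[:k]

def build_chat_prompt(query, examples):
    # Chat-style template: each demonstration becomes a user/assistant turn,
    # consistent with the finding that prompt templates matter for aligned LLMs.
    msgs = []
    for ex in examples:
        msgs.append({"role": "user", "content": ex["text"]})
        msgs.append({"role": "assistant", "content": ex["label"]})
    msgs.append({"role": "user", "content": query})
    return msgs

# Hypothetical topic-classification pool in the spirit of SIB-200.
pool = [
    {"text": "the match ended in a draw", "label": "sports"},
    {"text": "parliament passed the bill", "label": "politics"},
    {"text": "the team won the cup final", "label": "sports"},
]
demos = select_examples("the team lost the final match", pool, k=1)
prompt = build_chat_prompt("the team lost the final match", demos)
```

For X-ICL, the same construction applies with demonstrations drawn from a linguistically similar source language while the final query stays in the target language.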
Ubaidillah Ariq Prathama (Institut Teknologi Bandung), Ayu Purwarianti (Institut Teknologi Bandung), Samuel Cahyawijaya (Cohere, United Kingdom). DOI: https://doi.org/10.12962/j24068535.v23i2.a1323