Abstract: The Mixture of Experts (MoE) model is a promising approach for handling code-switching speech recognition (CS-ASR) tasks. However, the existing CS-ASR work on MoE has yet to leverage the ...
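As context for the MoE approach mentioned above, here is a minimal sketch of a generic mixture-of-experts layer in PyTorch. It only illustrates the gating-plus-experts idea; the layer sizes, number of experts, and placement in the network are assumptions for illustration and do not reflect the specific architecture of the cited work.

```python
# Minimal generic MoE layer: a gating network mixes the outputs of several
# feed-forward experts frame by frame. Purely illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model: int, d_hidden: int, num_experts: int = 2):
        super().__init__()
        # One feed-forward expert per (hypothetical) language or domain.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )
        # Gating network: per-frame mixing weights over the experts.
        self.gate = nn.Linear(d_model, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, d_model) encoder/acoustic features.
        weights = F.softmax(self.gate(x), dim=-1)                        # (B, T, E)
        expert_out = torch.stack([e(x) for e in self.experts], dim=-1)   # (B, T, D, E)
        return torch.einsum("bte,btde->btd", weights, expert_out)

# Example: two experts (e.g., Mandarin- and English-oriented) mixed per frame.
feats = torch.randn(4, 100, 256)
print(MoELayer(256, 1024, num_experts=2)(feats).shape)  # torch.Size([4, 100, 256])
```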
Abstract: This paper reports SOTA results achieved by adapting OpenAI's Whisper model with different adaptation corpus sizes on two established code-switching Mandarin/English corpora, namely ...
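The adaptation described in the second abstract amounts to supervised fine-tuning of Whisper on code-switched speech. The sketch below shows one plausible setup using the Hugging Face Transformers API; the checkpoint name, learning rate, and training loop are assumptions for illustration, not the paper's actual configuration.

```python
# Illustrative Whisper adaptation step with Hugging Face Transformers.
# Checkpoint, optimizer settings, and data handling are assumed, not from the paper.
import torch
from transformers import WhisperProcessor, WhisperForConditionalGeneration

processor = WhisperProcessor.from_pretrained("openai/whisper-small")
model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small")
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

def adaptation_step(audio_array, sampling_rate, transcript):
    # Convert raw audio to log-Mel input features and the transcript to label ids.
    inputs = processor(audio_array, sampling_rate=sampling_rate, return_tensors="pt")
    labels = processor.tokenizer(transcript, return_tensors="pt").input_ids
    # Standard supervised fine-tuning: cross-entropy loss over the target tokens.
    loss = model(input_features=inputs.input_features, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()
```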