<h1>Qwen3-Next-80B-A3B, Qwen3-ASR, Hunyuan-MT, MMBERT</h1>

<p>Published 2025-09-15.</p>

<p>Last week's biggest news was probably the release of <a href="https://huggingface.co/Qwen/Qwen3-Next-80B-A3B-Instruct">Qwen/Qwen3-Next-80B-A3B-Instruct · Hugging Face</a>, a high-performance model with an extremely sparse configuration. As with DeepSeek and others, highly sparse MoE architectures are now the trend. Qwen has also released <a href="https://qwen.ai/blog?id=41e4c0f6175f9b004a03a07e42343eaaf48329e7&amp;from=research.latest-advancements-list">Qwen3-ASR</a>, a multilingual speech recognition model; the impression is that they are building out the surrounding areas thoroughly as well.</p>

<p>Hunyuan-MT is a machine translation model built on Hunyuan. There is also <a href="https://tech.preferred.jp/ja/blog/plamo-translate/">the specialized large language model "PLaMo Translate" &#8211; Preferred Networks Research &amp; Development</a>, and LLM-based translation models in general are very powerful.</p>

<p>Finally, MMBERT, a multilingual encoder-only model, was also announced. Decoder-only LLMs dominate these days, but for practical tasks such as classification, encoder-only models remain an important approach.</p>

<ul class="wp-block-list">
<li><strong>Hunyuan-MT Technical Report</strong> [20.9]<br>Hunyuan-MT-7B supports bidirectional translation across 33 major languages. Hunyuan-MT-Chimera-7B is a translation model inspired by the slow-thinking mode.<br><a href="http://arxiv.org/abs/2509.05209v1">Paper</a> <a href="https://fugumt.com/fugumt/paper_check/2509.05209v1">Reference translation (metadata)</a> (Fri, 05 Sep 2025 16:11:05 GMT)</li>
<li>The report states: "The development of our models follows a holistic training process specifically engineered for multilingual translation, which begins with general and MT-oriented pre-training to build foundational capabilities, proceeds to Supervised Fine-Tuning (SFT) for task-specific adaptation, and culminates in advanced alignment through Reinforcement Learning (RL) and weak-to-strong RL." Each stage of the pipeline is very well engineered.</li>
<li>Repository: <a href="https://huggingface.co/tencent/Hunyuan-MT-7B">tencent/Hunyuan-MT-7B · Hugging Face</a></li>
</ul>

<ul class="wp-block-list">
<li><strong>mmBERT: A Modern Multilingual Encoder with Annealed Language Learning</strong> [57.6]<br>mmBERT is an encoder-only language model pre-trained on 3T tokens of multilingual text. More than 1,700 low-resource languages are included in the training data. mmBERT outperforms previous models on classification and retrieval tasks.<br><a href="http://arxiv.org/abs/2509.06888v1">Paper</a> <a href="https://fugumt.com/fugumt/paper_check/2509.06888v1">Reference translation (metadata)</a> (Mon, 08 Sep 2025 17:08:42 GMT)</li>
<li>A multilingual BERT: "We do this by pre-training our new model suite, MMBERT, on 3T tokens of multilingual text using an architecture inspired from ModernBERT (Warner et al., 2024)."</li>
<li>Repository: <a href="https://github.com/jhu-clsp/mmBERT">GitHub &#8211; JHU-CLSP/mmBERT: A massively multilingual modern encoder language model</a></li>
</ul>
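<p>The sparsity behind a name like "80B-A3B" (many total parameters, few activated per token) can be made concrete with a toy top-k router. This is a minimal plain-Python sketch; the expert counts and parameter sizes below are hypothetical illustrations, not the actual Qwen3-Next architecture.</p>

```python
# Toy illustration of sparse MoE activation: a router picks the top-k
# experts out of many, so most expert weights stay idle for any one token.
# All layer sizes here are made-up toy numbers.

def topk_router(scores, k):
    """Return the indices of the k highest-scoring experts."""
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]

def active_fraction(num_experts, k, expert_params, shared_params):
    """Fraction of the layer's parameters actually used for one token."""
    total = shared_params + num_experts * expert_params
    active = shared_params + k * expert_params
    return active / total

# Hypothetical layer: 512 experts, 10 routed per token.
frac = active_fraction(num_experts=512, k=10,
                       expert_params=1_000_000, shared_params=5_000_000)
print(f"active fraction: {frac:.3%}")  # only a few percent of the weights fire

# Routing one token: fake gating scores, pick the top 2 of 8 experts.
scores = [0.1, 0.9, 0.3, 0.8, 0.2, 0.05, 0.4, 0.7]
print(topk_router(scores, k=2))  # experts 1 and 3 score highest
```

<p>The ratio is why such models can be served far more cheaply than a dense model of the same total size: compute per token scales with the active parameters, not the total.</p>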
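<p>The encoder-only vs. decoder-only distinction mentioned for mmBERT comes down to the attention mask: an encoder attends bidirectionally (every token sees the whole input), while a decoder uses a causal mask. A toy illustration in plain Python, not mmBERT's actual implementation:</p>

```python
# Encoder-style vs decoder-style visibility masks over n token positions.
# mask[i][j] == 1 means position i may attend to position j.

def bidirectional_mask(n):
    """Encoder-only models (BERT/mmBERT style): all positions visible."""
    return [[1] * n for _ in range(n)]

def causal_mask(n):
    """Decoder-only LLMs: position i sees only positions 0..i."""
    return [[1 if j <= i else 0 for j in range(n)] for i in range(n)]

for row in causal_mask(4):
    print(row)
# Each row reveals one more position; bidirectional_mask(4) is all ones.
```

<p>Full bidirectional context is one reason encoder-only models remain strong for classification and retrieval: every token's representation is conditioned on the entire input.</p>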