{"id":7909,"date":"2025-12-16T05:21:00","date_gmt":"2025-12-15T20:21:00","guid":{"rendered":"https:\/\/devneko.jp\/wordpress\/?p=7909"},"modified":"2025-12-13T07:28:12","modified_gmt":"2025-12-12T22:28:12","slug":"k2-v2-a-360-open-reasoning-enhanced-llm","status":"publish","type":"post","link":"https:\/\/devneko.jp\/wordpress\/?p=7909","title":{"rendered":"K2-V2: A 360-Open, Reasoning-Enhanced LLM\u00a0"},"content":{"rendered":"\n<ul class=\"wp-block-list\">\n<li><strong>K2-V2: A 360-Open, Reasoning-Enhanced LLM\u00a0<\/strong>[89.7]<br>K2-V2 is a 360-degree open LLM built from scratch that serves as an excellent foundation for reasoning adaptation. It surpasses Qwen2.5-72B and approaches the performance of Qwen3-235B.<br><a href=\"http:\/\/arxiv.org\/abs\/2512.06201v1\">Paper<\/a>\u00a0\u00a0<a href=\"https:\/\/fugumt.com\/fugumt\/paper_check\/2512.06201v1\">Reference translation (metadata)<\/a>\u00a0 \u00a0(Fri, 05 Dec 2025 22:53:45 GMT)<\/li>\n\n\n\n<li>\u300cWe introduce K2, the best fully open-source pretrained large language model (LLM) to date, which ranks competitively against the best open-weight models of its class. As the latest base model in the LLM360 family (Liu et al., 2023; Tao et al., 2024; Liu et al., 2025c; Cheng et al., 2025a), K2 goes beyond standard competencies like knowledge and conversation to provide advanced capabilities, including long context consistency, deep mathematical knowledge, and reasoning behaviors. 
These serve as foundational building blocks that enable sophisticated downstream use cases, such as solving complex math problems and executing agentic workflows.\u300d An LLM whose authors claim both openness and strong performance.<\/li>\n\n\n\n<li>As <a href=\"https:\/\/github.com\/llm360\/k2v2_train\">GitHub &#8211; LLM360\/k2v2_train: Pre-training codebase for K2-V2<\/a> and <a href=\"https:\/\/huggingface.co\/LLM360\/K2-V2\">LLM360\/K2-V2 \u00b7 Hugging Face<\/a> show, not only the code and model weights but also the data appear to have been released.<\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[223,293],"class_list":["post-7909","post","type-post","status-publish","format-standard","hentry","category-arxiv","tag-llm","tag-oss"],"_links":{"self":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/7909","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=7909"}],"version-history":[{"count":1,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/7909\/revisions"}],"predecessor-version":[{"id":7910,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/7909\/revisions\/7910"}],"wp:attachment":[{"href":"https:\/\/devneko.jp\/word
press\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=7909"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=7909"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=7909"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}