{"id":7941,"date":"2025-12-22T06:10:00","date_gmt":"2025-12-21T21:10:00","guid":{"rendered":"https:\/\/devneko.jp\/wordpress\/?p=7941"},"modified":"2025-12-21T18:05:11","modified_gmt":"2025-12-21T09:05:11","slug":"openai-gpt-image-1-5-gemini-3-0-flash-nemotron-3-xiaomi-mimo-v2-flash-olmo-3%e8%ab%96%e6%96%87%ef%bc%89-bolmo-step-gui","status":"publish","type":"post","link":"https:\/\/devneko.jp\/wordpress\/?p=7941","title":{"rendered":"OpenAI GPT Image-1.5, Gemini 3.0 Flash, Nemotron 3, Xiaomi MiMo-V2-Flash, Olmo 3 (paper), Bolmo, LLaDA2.0, Step-GUI, Seedance 1.5 pro, Kling-Omni"},"content":{"rendered":"\n<p>Competition between OpenAI and Google is fierce. OpenAI announced GPT Image-1.5, which looks set to compete with NanoBanana (<a href=\"https:\/\/platform.openai.com\/docs\/models\/gpt-image-1.5\">GPT Image 1.5 Model | OpenAI API<\/a>), while Google released Gemini 3.0 Flash (<a href=\"https:\/\/blog.google\/products\/gemini\/gemini-3-flash\/\">Introducing Gemini 3 Flash: Benchmarks, global availability<\/a>), a model with excellent cost-performance: despite its very low cost, it appears to surpass Pro on some benchmarks.<\/p>\n\n\n\n<p>Among open models there were also notable releases: Nemotron 3 (<a href=\"https:\/\/research.nvidia.com\/labs\/nemotron\/Nemotron-3\/\">NVIDIA Nemotron 3 Family of Models &#8211; NVIDIA Nemotron<\/a>), Xiaomi MiMo-V2-Flash (<a href=\"https:\/\/mimo.xiaomi.com\/blog\/mimo-v2-flash\">Xiaomi MiMo<\/a>, <a href=\"https:\/\/x.com\/xiaomimimo\/status\/2000929154670157939\">XiaomiMiMo on X: \u300c\u26a1 Faster than Fast. 
Designed for Agentic AI. Introducing Xiaomi MiMo-V2-Flash \u2014 our new open-source MoE model: 309B total params, 15B active. Blazing speed meets frontier performance. \ud83d\udd25 Highlights: \ud83c\udfd7\ufe0f Hybrid Attention: 5:1 interleaved 128-window SWA + Global | 256K context \ud83d\udcc8 https:\/\/t.co\/yCqP4L8bU4\u300d \/ X<\/a>), and Step-GUI (<a href=\"https:\/\/opengelab.github.io\/\">GELab-Zero &#8211; GUI Agent for Mobile Devices<\/a>). Ai2 published a paper on Olmo 3 and also announced Bolmo, an interesting byte-level language model. Among new types of model, LLaDA2.0, a large-scale Diffusion Language Model, is also worth attention.<\/p>\n\n\n\n<p>On the video side, technical reports for Seedance 1.5 pro and Kling-Omni have also been published.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Step-GUI Technical Report&nbsp;<\/strong>[83.9]<br>This paper proposes a self-evolving training pipeline that leverages the Calibrated Step Reward System. It also introduces Step-GUI, a family of models achieving state-of-the-art GUI performance, and presents AndroidDaily for evaluating whether agents are usable in everyday scenarios.<br><a 
href=\"http:\/\/arxiv.org\/abs\/2512.15431v1\">Paper<\/a>&nbsp;&nbsp;<a href=\"https:\/\/fugumt.com\/fugumt\/paper_check\/2512.15431v1\">Reference translation (metadata)<\/a>&nbsp; &nbsp;(Wed, 17 Dec 2025 13:26:30 GMT)<\/li>\n\n\n\n<li>An elaborate pipeline: &#8220;we introduce a self-evolving training pipeline centered on the Calibrated Step Reward System (CSRS).&#8221; &#8220;The system consists of a Calibration Layer that performs trajectory-level validation (success\/failure) and a Data Extraction module powered by thinking models that generates seven categories of structured training data. Model-generated trajectories flow through CSRS in an iterative loop: rollout generates trajectories, CSRS processes them into high-quality training data, and training produces stronger models for the next iteration.&#8221; The seven categories of data refer to &#8220;(1) progress tracking, (2) state summary, (3) effect prediction, (4) self-reflection, (5) state verification, (6) intent execution, and (7) action prediction&#8221;.<\/li>\n\n\n\n<li>The repository is at <a href=\"https:\/\/github.com\/stepfun-ai\/gelab-zero\">GitHub &#8211; stepfun-ai\/gelab-zero: GELab: GUI Exploration Lab. 
One of the best GUI agent solutions in the galaxy, built by the StepFun-GELab team and powered by Step\u2019s research capabilities.<\/a><\/li>\n<\/ul>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Olmo 3&nbsp;<\/strong>[195.4]<br>Olmo 3 is a family of fully open, state-of-the-art language models at the 7B and 32B parameter scales. Our flagship model, Olmo 3 Think 32B, is the strongest fully open thinking model released to date.<br><a href=\"http:\/\/arxiv.org\/abs\/2512.13961v1\">Paper<\/a>&nbsp;&nbsp;<a href=\"https:\/\/fugumt.com\/fugumt\/paper_check\/2512.13961v1\">Reference translation (metadata)<\/a>&nbsp; &nbsp;(Mon, 15 Dec 2025 23:41:48 GMT)<\/li>\n\n\n\n<li>The Olmo 3 paper. Progress is so fast that the paper could barely keep up with the release...<\/li>\n\n\n\n<li>Not only the models but also many other artifacts, such as the data and training logs, have been released.<\/li>\n<\/ul>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Bolmo: Byteifying the Next Generation of Language Models&nbsp;<\/strong>[115.3]<br>We introduce Bolmo, the first family of fully open, competitive byte-level language models (LMs). 
Byteification overcomes the limitations of subword tokenization. We show that Bolmo can achieve inference speeds competitive with subword-level LMs.<br><a href=\"http:\/\/arxiv.org\/abs\/2512.15586v1\">Paper<\/a>&nbsp;&nbsp;<a href=\"https:\/\/fugumt.com\/fugumt\/paper_check\/2512.15586v1\">Reference translation (metadata)<\/a>&nbsp; &nbsp;(Wed, 17 Dec 2025 16:46:11 GMT)<\/li>\n\n\n\n<li>A byte-level language model. Validating the approach at a substantial scale is impressive.<\/li>\n\n\n\n<li>The repository is at <a href=\"https:\/\/github.com\/allenai\/bolmo-core\">GitHub &#8211; allenai\/bolmo-core: Code for Bolmo: Byteifying the Next Generation of Language Models<\/a><\/li>\n<\/ul>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>LLaDA2.0: Scaling Up Diffusion Language Models to 100B\u00a0<\/strong>[96.8]<br>LLaDA2.0 scales discrete diffusion large language models (dLLMs) up to 100B total parameters. LLaDA2.0 follows design principles that emphasize knowledge inheritance, progressive adaptation, and efficiency. Two instruction-tuned Mixture-of-Experts (MoE) models, LLaDA2.0-mini (16B) and LLaDA2.0-flash (100B), are optimized for practical deployment.<br><a href=\"http:\/\/arxiv.org\/abs\/2512.15745v1\">Paper<\/a>\u00a0\u00a0<a 
href=\"https:\/\/fugumt.com\/fugumt\/paper_check\/2512.15745v1\">Reference translation (metadata)<\/a>\u00a0 \u00a0(Wed, 10 Dec 2025 09:26:18 GMT)<\/li>\n\n\n\n<li>A Diffusion Language Model built via an approach that progressively converts from an AR model. The paper reports its effectiveness and advantages: &#8220;Through extensive evaluations, it validates the feasibility of the training paradigm. The LLaDA2.0-mini and LLaDA2.0-flash models achieve performances that are competitive with their AR counterparts. Slightly surprisingly, LLaDA2.0-flash seems to have demonstrated advantages in complex, structured domains such as code generation, mathematical reasoning, and agentic tool use. These may have opened a new door to future work in the agentic LLM era while solidifying a gaugeable potential of dLLM for test-time scaling.&#8221;<\/li>\n\n\n\n<li>The repository is at <a href=\"https:\/\/huggingface.co\/collections\/inclusionAI\/llada-20\">LLaDA 2.0 &#8211; a inclusionAI Collection<\/a><\/li>\n<\/ul>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Seedance 1.5 pro: A Native Audio-Visual Joint Generation Foundation Model&nbsp;<\/strong>[144.6]<br>Seedance 1.5 Pro is a foundation model designed specifically for native joint audio-video generation. It distinguishes itself through accurate multilingual and dialect lip-sync, dynamic cinematic camera control, and improved narrative coherence.<br><a 
href=\"http:\/\/arxiv.org\/abs\/2512.13507v1\">Paper<\/a>&nbsp;&nbsp;<a href=\"https:\/\/fugumt.com\/fugumt\/paper_check\/2512.13507v1\">Reference translation (metadata)<\/a>&nbsp; &nbsp;(Mon, 15 Dec 2025 16:36:52 GMT)<\/li>\n\n\n\n<li>&#8220;we present Seedance 1.5 pro, a foundational model engineered specifically for native, joint audio-video generation.&#8221;<\/li>\n\n\n\n<li>The project page is <a href=\"https:\/\/seed.bytedance.com\/en\/seedance1_5_pro\">Seedance 1.5 pro<\/a><\/li>\n<\/ul>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Kling-Omni Technical Report&nbsp;<\/strong>[80.6]<br>Kling-Omni is a generative framework for synthesizing high-fidelity video directly from multimodal vision-language inputs. It bridges the functional separation between diverse video generation, editing, and intelligent reasoning tasks. It supports diverse user inputs, including text instructions, reference images, and video context, and processes them into a unified multimodal representation.<br><a href=\"http:\/\/arxiv.org\/abs\/2512.16776v1\">Paper<\/a>&nbsp;&nbsp;<a href=\"https:\/\/fugumt.com\/fugumt\/paper_check\/2512.16776v1\">Reference translation (metadata)<\/a>&nbsp; &nbsp;(Thu, 18 Dec 2025 17:08:12 GMT)<\/li>\n\n\n\n<li><a href=\"https:\/\/app.klingai.com\/global\/omni\/new\">Kling AI: Next-Gen AI Video &amp; AI Image 
Generator<\/a><\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>Competition between OpenAI and Google is fierce. OpenAI announced GPT Image-1.5, which looks set to compete with NanoBanana (GPT Image 1.5 Model | OpenAI API). Googl &hellip; <a href=\"https:\/\/devneko.jp\/wordpress\/?p=7941\" class=\"more-link\"><span class=\"screen-reader-text\">&#8220;OpenAI GPT Image-1.5, Gemini 3.0 Flash, Nemotron 3, Xiaomi MiMo-V2-Flash, Olmo 3 (paper), Bolmo, LLaDA2.0, Step-GUI, Seedance 1.5 pro, Kling-Omni&#8221; <\/span>Continue reading<\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[42,114,181,223],"class_list":["post-7941","post","type-post","status-publish","format-standard","hentry","category-arxiv","tag-autonomous-agent","tag-diffusion-model","tag-gui-agent","tag-llm"],"_links":{"self":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/7941","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=7941"}],"version-history":[{"count":3,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/7941\/revisions"}],"predecessor-version":[{"id":7952,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/7941\/revisions\/7952"}],"wp:attachment":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2F
wp%2Fv2%2Fmedia&parent=7941"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=7941"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=7941"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}