<h1>ChatGPT Atlas, Ring-1T, DeepSeek OCR, olmOCR 2</h1>

<p>2025-10-27</p>

<p>Last week the big topic was <a href="https://chatgpt.com/ja-JP/atlas">ChatGPT Atlas</a>. I have high hopes for agents that operate a UI the way a human does, in the style of GUI agents (more precisely, browser agents).</p>

<p>Ring-1T is an LRM from Ant Group: a 1T-parameter MoE model with strong performance.</p>

<p>DeepSeek OCR has also been generating buzz. What is interesting is less its raw OCR performance than the effectiveness of using image data as context. On the OCR side, v2 of olmOCR is also out, and OSS activity remains lively.</p>

<ul class="wp-block-list">
<li><strong>Every Step Evolves: Scaling Reinforcement Learning for Trillion-Scale Thinking Model </strong>[100.9]<br>Ring-1T is the first open-source, state-of-the-art thinking model with trillion-scale parameters. It has 1 trillion total parameters and activates roughly 50 billion per token.<br><a href="http://arxiv.org/abs/2510.18855v1">Paper</a>  <a href="https://fugumt.com/fugumt/paper_check/2510.18855v1">Reference translation (metadata)</a>  (Tue, 21 Oct 2025 17:46:14 GMT)</li>

<li>A large-scale LRM. Part of the story is its sheer size, but it also claims performance exceeding existing open models such as DeepSeek V3.1.</li>

<li>The repository is <a href="https://github.com/inclusionAI/Ring-V2">GitHub &#8211; inclusionAI/Ring-V2: Ring-V2 is a reasoning MoE LLM provided and open-sourced by InclusionAI.</a> The model is at <a href="https://huggingface.co/inclusionAI/Ring-1T">inclusionAI/Ring-1T · Hugging Face</a></li>
</ul>

<ul class="wp-block-list">
<li><strong>DeepSeek-OCR: Contexts Optical Compression </strong>[15.6]<br>We present DeepSeek-OCR as an initial investigation into the feasibility of compressing long contexts via optical 2D mapping. DeepSeek-OCR consists of two components: DeepEncoder and DeepSeek3B-MoE-A570M. Experiments show that when the number of text tokens is within 10 times the number of vision tokens, the model can achieve 97% decoding (OCR) precision.<br><a href="http://arxiv.org/abs/2510.18234v1">Paper</a>  <a href="https://fugumt.com/fugumt/paper_check/2510.18234v1">Reference translation (metadata)</a>  (Tue, 21 Oct 2025 02:41:44 GMT)</li>

<li>An LLM setup that treats document images as context. Per the report, "In this technical report, we propose DeepSeek-OCR and preliminarily validate the feasibility of contexts optical compression through this model, demonstrating that the model can effectively decode text tokens exceeding 10 times the quantity from a small number of vision tokens. We believe this finding will facilitate the development of VLMs and LLMs in the future." It appears to be an efficient approach.</li>

<li>The repository is <a href="https://github.com/deepseek-ai/DeepSeek-OCR">GitHub &#8211; deepseek-ai/DeepSeek-OCR: Contexts Optical Compression</a></li>
</ul>

<ul class="wp-block-list">
<li><strong>olmOCR 2: Unit Test Rewards for Document OCR </strong>[29.5]<br>olmOCR 2 is the latest in our family of powerful OCR systems for converting digitized print documents, such as PDFs, into clean, naturally ordered plain text. olmOCR 2 is powered by olmOCR-2-7B-1025, a 7B vision-language model (VLM) trained with reinforcement learning. We show that RL training against these test cases leads to state-of-the-art performance on olmOCR-Bench, our English OCR benchmark.<br><a href="http://arxiv.org/abs/2510.19817v1">Paper</a>  <a href="https://fugumt.com/fugumt/paper_check/2510.19817v1">Reference translation (metadata)</a>  (Wed, 22 Oct 2025 17:53:02 GMT)</li>

<li>This one is OCR: version 2 of olmOCR. An approach that leverages synthetic data: "To scale unit test creation, we develop a pipeline for generating synthetic documents with diverse and challenging layouts, known ground-truth HTML source code, and extracted test cases."</li>

<li>The repository is <a href="https://github.com/allenai/olmocr">GitHub &#8211; allenai/olmocr: Toolkit for linearizing PDFs for LLM datasets/training</a></li>
</ul>
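As a rough illustration of the compression claim in the DeepSeek-OCR abstract (97% decoding precision while text tokens stay within 10x the vision tokens), the arithmetic can be sketched as follows. This is a minimal back-of-the-envelope sketch; the function name and the 2,000-token page size are illustrative assumptions, not figures from the paper.

```python
import math

def vision_tokens_needed(text_tokens: int, compression_ratio: float = 10.0) -> int:
    """Minimum vision tokens that keep the text:vision token ratio
    at or below the given compression ratio."""
    return math.ceil(text_tokens / compression_ratio)

# A hypothetical ~2,000-text-token page could, at the reported ~10x ratio,
# be represented by about 200 vision tokens in the context window.
print(vision_tokens_needed(2000))  # 200
```

The appeal for long-context use is that the same window budget then holds roughly ten times as much document text when pages enter as images rather than as text tokens.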