{"id":8233,"date":"2026-02-20T03:18:00","date_gmt":"2026-02-19T18:18:00","guid":{"rendered":"https:\/\/devneko.jp\/wordpress\/?p=8233"},"modified":"2026-02-15T15:20:15","modified_gmt":"2026-02-15T06:20:15","slug":"paddleocr-vl-1-5-towards-a-multi-task-0-9b-vlm-for-robust-in-the-wild-document-parsing","status":"publish","type":"post","link":"https:\/\/devneko.jp\/wordpress\/?p=8233","title":{"rendered":"PaddleOCR-VL-1.5: Towards a Multi-Task 0.9B VLM for Robust In-the-Wild Document Parsing\u00a0"},"content":{"rendered":"\n<ul class=\"wp-block-list\">\n<li><strong>PaddleOCR-VL-1.5: Towards a Multi-Task 0.9B VLM for Robust In-the-Wild Document Parsing\u00a0<\/strong>[16.3]<br>We present PaddleOCR-VL-1.5, an upgraded model that achieves a new state-of-the-art (SOTA) accuracy of 94.5% on OmniDocBench v1.5. We extend the model's capabilities by incorporating seal recognition and text spotting tasks, while keeping the ultra-compact 0.9B VLM highly efficient.<br><a href=\"http:\/\/arxiv.org\/abs\/2601.21957v1\">Paper<\/a>\u00a0\u00a0<a href=\"https:\/\/fugumt.com\/fugumt\/paper_check\/2601.21957v1\">Reference translation (metadata)<\/a>\u00a0 \u00a0(Thu, 29 Jan 2026 16:35:04 GMT)<\/li>\n\n\n\n<li>OCR is a field in which Chinese models have recently been competing fiercely, and Baidu's Paddle team has also released a small, highly efficient model.<\/li>\n\n\n\n<li>The repository is at <a href=\"https:\/\/github.com\/PaddlePaddle\/PaddleOCR\">GitHub &#8211; PaddlePaddle\/PaddleOCR: Turn any PDF or image 
document into structured data for your AI. A powerful, lightweight OCR toolkit that bridges the gap between images\/PDFs and LLMs. Supports 100+ languages.<\/a><\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[285],"class_list":["post-8233","post","type-post","status-publish","format-standard","hentry","category-arxiv","tag-ocr"],"_links":{"self":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/8233","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=8233"}],"version-history":[{"count":1,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/8233\/revisions"}],"predecessor-version":[{"id":8234,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/8233\/revisions\/8234"}],"wp:attachment":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=8233"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=8233"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=8233"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}