{"id":8316,"date":"2026-03-13T05:00:00","date_gmt":"2026-03-12T20:00:00","guid":{"rendered":"https:\/\/devneko.jp\/wordpress\/?p=8316"},"modified":"2026-03-07T17:02:21","modified_gmt":"2026-03-07T08:02:21","slug":"retrievit-in-context-retrieval-capabilities-of-transformers-state-space-models-and-hybrid-architectures","status":"publish","type":"post","link":"https:\/\/devneko.jp\/wordpress\/?p=8316","title":{"rendered":"Retrievit: In-context Retrieval Capabilities of Transformers, State Space Models, and Hybrid Architectures\u00a0"},"content":{"rendered":"\n<ul class=\"wp-block-list\">\n<li><strong>Retrievit: In-context Retrieval Capabilities of Transformers, State Space Models, and Hybrid Architectures\u00a0<\/strong>[47.3]<br>This study examines whether hybrid architectures that combine Transformers and state space models can achieve the best of both worlds on two synthetic in-context retrieval tasks. Hybrid models outperform SSMs, and surpass Transformers in data efficiency and in extrapolation for information-dense in-context retrieval.<br><a href=\"http:\/\/arxiv.org\/abs\/2603.02874v1\">Paper<\/a>\u00a0\u00a0<a href=\"https:\/\/fugumt.com\/fugumt\/paper_check\/2603.02874v1\">Reference translation (metadata)<\/a>\u00a0 \u00a0(Tue, 03 Mar 2026 11:28:33 GMT)<\/li>\n\n\n\n<li>A paper that validates the effectiveness of the transformer + state space 
model combination, which is also widely adopted in publicly released models. It reports that \u201cHybrid models outperform both pure Transformers and SSMs on n-gram retrieval in terms of data efficiency, length generalization, and robustness to duplicate queries.\u201d<\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[376,415],"class_list":["post-8316","post","type-post","status-publish","format-standard","hentry","category-arxiv","tag-ssm","tag-transformer"],"_links":{"self":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/8316","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=8316"}],"version-history":[{"count":1,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/8316\/revisions"}],"predecessor-version":[{"id":8317,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/8316\/revisions\/8317"}],"wp:attachment":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=8316"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=8316"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=8316"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}