{"id":5762,"date":"2024-11-19T05:08:00","date_gmt":"2024-11-18T20:08:00","guid":{"rendered":"https:\/\/devneko.jp\/wordpress\/?p=5762"},"modified":"2024-11-19T05:08:00","modified_gmt":"2024-11-18T20:08:00","slug":"on-the-surprising-effectiveness-of-attention-transfer-for-vision-transformers","status":"publish","type":"post","link":"https:\/\/devneko.jp\/wordpress\/?p=5762","title":{"rendered":"On the Surprising Effectiveness of Attention Transfer for Vision Transformers"},"content":{"rendered":"\n<ul class=\"wp-block-list\">\n<li><strong>On the Surprising Effectiveness of Attention Transfer for Vision Transformers\u00a0<\/strong>[118.8]<br>Conventional wisdom holds that pre-trained Vision Transformers (ViTs) improve downstream performance by learning useful representations; this paper finds that the features and representations learned during pre-training are not essential.<br><a href=\"http:\/\/arxiv.org\/abs\/2411.09702v1\">Paper<\/a>\u00a0\u00a0<a href=\"https:\/\/fugumt.com\/fugumt\/paper_check\/2411.09702v1\">Reference translation (metadata)<\/a>\u00a0 \u00a0(Thu, 14 Nov 2024 18:59:40 GMT)<\/li>\n\n\n\n<li>A report that sounds almost too good to be true: \u201cSurprisingly, using only the attention patterns from pre-training (i.e., guiding how information flows between tokens) is sufficient for models to learn high quality features from scratch and achieve comparable downstream performance.\u201d Interesting, given the stated result that \u201cOur key finding is that the attention patterns (inter-token operations) are the key factor behind much of the effectiveness of pre-training \u2013 our Attention Distillation method completely matches fine-tuning on ImageNet-1K.\u201d<\/li>\n\n\n\n<li>The repository is <a href=\"https:\/\/github.com\/alexlioralexli\/attention-transfer\">alexlioralexli\/attention-transfer \u00b7 GitHub (no code has been uploaded there yet)<\/a><\/li>\n<\/ul>\n\n\n\n<p><\/p>\n","protected":false},"excerpt":{"rendered":"","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[438],"class_list":["post-5762","post","type-post","status-publish","format-standard","hentry","category-arxiv","tag-vit"],"_links":{"self":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/5762","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=5762"}],"version-history":[{"count":0,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/5762\/revisions"}],"wp:attachment":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=5762"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=5762"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=5762"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}