{"id":5733,"date":"2024-11-13T05:04:00","date_gmt":"2024-11-12T20:04:00","guid":{"rendered":"https:\/\/devneko.jp\/wordpress\/?p=5733"},"modified":"2024-11-13T05:04:00","modified_gmt":"2024-11-12T20:04:00","slug":"mixture-of-transformers","status":"publish","type":"post","link":"https:\/\/devneko.jp\/wordpress\/?p=5733","title":{"rendered":"Mixture-of-Transformers"},"content":{"rendered":"\n<ul class=\"wp-block-list\">\n<li><strong>Mixture-of-Transformers: A Sparse and Scalable Architecture for Multi-Modal Foundation Models\u00a0<\/strong>[112.0]<br>Mixture-of-Transformer (MoT) \u306f\u30b9\u30d1\u30fc\u30b9\u30de\u30eb\u30c1\u30e2\u30fc\u30c0\u30eb\u30c8\u30e9\u30f3\u30b9\u30a2\u30fc\u30ad\u30c6\u30af\u30c1\u30e3\u3067\u3042\u308b\u3002 MoT\u306f\u30e2\u30c7\u30eb\u306e\u975e\u57cb\u3081\u8fbc\u307f\u30d1\u30e9\u30e1\u30fc\u30bf\u3092\u30e2\u30c0\u30ea\u30c6\u30a3\u3067\u5206\u96e2\u3059\u308b\u3002 \u8907\u6570\u306e\u8a2d\u5b9a\u3068\u30e2\u30c7\u30eb\u30b9\u30b1\u30fc\u30eb\u3067MoT\u3092\u8a55\u4fa1\u3059\u308b\u3002<br><a href=\"http:\/\/arxiv.org\/abs\/2411.04996v1\">\u8ad6\u6587<\/a>\u00a0\u00a0<a href=\"https:\/\/fugumt.com\/fugumt\/paper_check\/2411.04996v1\">\u53c2\u8003\u8a33\uff08\u30e1\u30bf\u30c7\u30fc\u30bf\uff09<\/a>\u00a0 \u00a0(Thu, 07 Nov 2024 18:59:06 GMT)<\/li>\n\n\n\n<li>\u6027\u80fd\u304c\u30eb\u30fc\u30bf\u306b\u4f9d\u5b58\u3059\u308bMixture of Experts\u306b\u5bfe\u3057\u3066\u3001\u300cMoT extends the standard transformer architecture by incorporating modality-specific weights for all non-embedding model parameters, including feed-forward networks, attention matrices, and layer normalization.\u300d\u3068\u3044\u3046\u30a2\u30d7\u30ed\u30fc\u30c1\u306eMixture of Transformer\u306e\u63d0\u6848\u3002\u300cIn the Chameleon 7B setting (autoregressive text-and-image generation), MoT matches the dense baseline\u2019s performance using only 55.8% of the FLOPs.\u300d\u3068\u6709\u52b9\u6027\u3092\u4e3b\u5f35\u3002<\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[415,524],"class_list":["post-5733","post","type-post","status-publish","format-standard","hentry","category-arxiv","tag-transformer","tag-524"],"_links":{"self":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/5733","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=5733"}],"version-history":[{"count":0,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/5733\/revisions"}],"wp:attachment":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=5733"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=5733"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=5733"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}