{"id":3004,"date":"2023-02-27T06:36:00","date_gmt":"2023-02-26T21:36:00","guid":{"rendered":"https:\/\/devneko.jp\/wordpress\/?p=3004"},"modified":"2023-02-27T06:36:00","modified_gmt":"2023-02-26T21:36:00","slug":"llama","status":"publish","type":"post","link":"https:\/\/devneko.jp\/wordpress\/?p=3004","title":{"rendered":"LLaMA"},"content":{"rendered":"\n<p><strong>Introducing LLaMA: A foundational, 65-billion-parameter large language model<\/strong><br>LLaMA is a foundational large language model designed to help researchers advance their work in this subfield of AI. Foundation models train on large amounts of unlabeled data, which makes them ideal for fine-tuning on a variety of tasks.<\/p>\n\n\n\n<p>A freely available large language model: with 65B parameters it reportedly outperforms GPT-3 (175B) and is competitive with PaLM (540B). The model is open, but apparently for non-commercial use only.<\/p>\n\n\n\n<p>The paper includes a GPU-hour comparison: LLaMA (7B) took 82,432 GPU-hours and LLaMA (65B) took 1,022,362. The on-demand price of a p4d.24xlarge (8 GPUs per instance-hour) is 32.77 
USD, around 4,500 yen, so if the 7B model is enough, it could be trained for roughly 50 million yen (nobody would actually train on-demand, so the real cost is presumably much lower...)<\/p>\n\n\n\n<p>The main data is English CommonCrawl [67%], and the Wikipedia and Books data cover bg, ca, cs, da, de, en, es, fr, hr, hu, it, nl, pl, pt, ro, ru, sl, sr, sv, uk, so Japanese performance does not look promising. (Judging from other examples it might still be somewhat usable, though...)<\/p>\n\n\n\n<p><a href=\"https:\/\/research.facebook.com\/publications\/llama-open-and-efficient-foundation-language-models\/\">LLaMA: Open and Efficient Foundation Language Models &#8211; Meta Research (facebook.com)<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/github.com\/facebookresearch\/llama\">GitHub &#8211; facebookresearch\/llama: Inference code for LLaMA models<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Introducing LLaMA: A foundational, 65-billion-parameter large language model. LLaMA is a foundational large language model designed to help researchers &hellip; <a href=\"https:\/\/devneko.jp\/wordpress\/?p=3004\" class=\"more-link\"><span class=\"screen-reader-text\">&#8220;LLaMA&#8221; 
<\/span>Continue reading<\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[170,223],"class_list":["post-3004","post","type-post","status-publish","format-standard","hentry","category-arxiv","tag-gigantic-language-model","tag-llm"],"_links":{"self":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/3004","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=3004"}],"version-history":[{"count":0,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/3004\/revisions"}],"wp:attachment":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=3004"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=3004"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=3004"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}