{"id":7801,"date":"2025-11-24T06:01:00","date_gmt":"2025-11-23T21:01:00","guid":{"rendered":"https:\/\/devneko.jp\/wordpress\/?p=7801"},"modified":"2025-11-23T08:06:11","modified_gmt":"2025-11-22T23:06:11","slug":"sam-3d-3dfy-anything-in-images","status":"publish","type":"post","link":"https:\/\/devneko.jp\/wordpress\/?p=7801","title":{"rendered":"SAM 3D: 3Dfy Anything in Images\u00a0"},"content":{"rendered":"\n<ul class=\"wp-block-list\">\n<li><strong>SAM 3D: 3Dfy Anything in Images\u00a0<\/strong>[99.1]<br>The paper proposes SAM 3D, a generative model for visually grounded 3D object reconstruction that predicts shape, texture, and layout from an image. This is achieved with a human- and model-in-the-loop pipeline for annotating object shape, texture, and pose. The authors release code and model weights, an online demo, and a new challenging benchmark for in-the-wild 3D object reconstruction.<br><a href=\"http:\/\/arxiv.org\/abs\/2511.16624v1\">Paper<\/a>\u00a0\u00a0<a href=\"https:\/\/fugumt.com\/fugumt\/paper_check\/2511.16624v1\">Reference translation (metadata)<\/a>\u00a0 \u00a0(Thu, 20 Nov 2025 18:31:46 GMT)<\/li>\n\n\n\n<li>\u300c\u00a0SAM 3D, a generative model for visually grounded 3D object reconstruction, predicting geometry, texture, and layout from a single 
image\u300d. It is a 3D reconstruction model, and the results look very high quality. The authors report building it with an LLM-like approach:\n<ul class=\"wp-block-list\">\n<li>\u300cAs in recent works, we first train on a large collection of rendered synthetic objects. This is supervised pretraining: our model learns a rich vocabulary for object shape and texture, preparing it for real-world reconstruction. Next is mid-training with semi-synthetic data produced by pasting rendered models into natural images.  Finally, post-training adapts the model to real images, using both a novel model-in-the-loop (MITL) pipeline and human 3D artists, and aligns it to human preference. We find that synthetic pretraining generalizes, given adequate post-training on natural images.\u300d<\/li>\n<\/ul>\n<\/li>\n\n\n\n<li>The repository is at <a href=\"https:\/\/github.com\/facebookresearch\/sam-3d-objects\">GitHub &#8211; facebookresearch\/sam-3d-objects: SAM 3D Objects<\/a>, and the project site is <a href=\"https:\/\/ai.meta.com\/sam3d\/\">SAM 3D<\/a><\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[690,328],"class_list":["post-7801","post","type-post","status-publish","format-standard","hentry","category-arxiv","tag-3d","tag-reconstructing-3d-shapes"],"_links":{"self":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/7801","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=7801"}],"version-history":[{"count":1,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/7801\/revisions"}],"predecessor-version":[{"id":7802,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/7801\/revisions\/7802"}],"wp:attachment":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=7801"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=7801"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=7801"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}