{"id":8280,"date":"2026-03-04T04:51:00","date_gmt":"2026-03-03T19:51:00","guid":{"rendered":"https:\/\/devneko.jp\/wordpress\/?p=8280"},"modified":"2026-02-28T14:59:02","modified_gmt":"2026-02-28T05:59:02","slug":"test-time-computing-for-referring-multimodal-large-language-models","status":"publish","type":"post","link":"https:\/\/devneko.jp\/wordpress\/?p=8280","title":{"rendered":"Test-Time Computing for Referring Multimodal Large Language Models"},"content":{"rendered":"\n<ul class=\"wp-block-list\">\n<li><strong>Test-Time Computing for Referring Multimodal Large Language Models\u00a0<\/strong>[143.5]<br>We propose ControlMLLM++, a novel test-time adaptation framework that injects learnable visual prompts into a frozen multimodal large language model.<br><a href=\"http:\/\/arxiv.org\/abs\/2602.19505v1\">Paper<\/a>\u00a0\u00a0<a href=\"https:\/\/fugumt.com\/fugumt\/paper_check\/2602.19505v1\">Reference translation (metadata)<\/a>\u00a0 \u00a0(Mon, 23 Feb 2026 04:42:10 GMT)<\/li>\n\n\n\n<li>The paper states: &#8220;We introduce ControlMLLM++, a novel test-time latent variable optimization framework that injects explicit visual prompts into frozen pre-trained MLLMs to enable referring capabilities without additional training.&#8221; The approach: &#8220;ControlMLLM++ falls into this category, performing test-time optimization of latent perturbations to visual tokens to steer attention maps towards the referred region r.&#8221;<\/li>\n\n\n\n<li>The repository is at <a href=\"https:\/\/github.com\/mrwu-mac\/ControlMLLM\">GitHub &#8211; mrwu-mac\/ControlMLLM: [NeurIPS2024] Repo for the paper &#8216;ControlMLLM: Training-Free Visual Prompt Learning for Multimodal Large Language Models&#8217;<\/a><\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[251],"class_list":["post-8280","post","type-post","status-publish","format-standard","hentry","category-arxiv","tag-mllm"],"_links":{"self":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/8280","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=8280"}],"version-history":[{"count":1,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/8280\/revisions"}],"predecessor-version":[{"id":8281,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/8280\/revisions\/8281"}],"wp:attachment":[{"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=8280"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=8280"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devneko.jp\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=8280"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}