AWL's Research Paper "PITR-Select" Accepted for Presentation at VCIP2025 – Breakthrough in Efficient Edge-Based Video Understanding

Published: 2025-10-02 | Last updated: 2025-11-14
https://awl.co.jp/en/news/20251002/

AWL Inc.'s research paper, "PITR-Select: Partial Image Token Reduction via Temporal Selection for Efficient Video Understanding on Edge," has been accepted to VCIP2025 (Visual Communications and Image Processing), an international conference technically sponsored by IEEE CAS.

The paper introduces PITR-Select, a novel algorithm designed to optimize the performance of Vision Language Models (VLMs) in edge computing environments. This technology forms the foundation of AWL's upcoming General AWL engine, which aims to enhance real-time video analysis while significantly reducing power consumption.

AWL will continue to accelerate the real-world implementation of this research to help solve pressing societal challenges.

*This result is based on outcomes of a project, JPNP23019, subsidized by the New Energy and Industrial Technology Development Organization (NEDO).
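The announcement does not describe PITR-Select's internals, but the title suggests the general family of techniques: reducing the number of image tokens a VLM must process by selecting tokens across time. As a rough, hypothetical illustration of that general idea (not AWL's actual algorithm, whose details are in the paper), one could keep only the patch tokens that changed noticeably since the previous frame:

```python
# Toy illustration of "image token reduction via temporal selection":
# keep only the patch tokens that changed noticeably since the previous
# frame, so the language model sees far fewer visual tokens per frame.
# This is a hypothetical sketch, NOT the PITR-Select method itself.

def select_changed_tokens(prev_tokens, curr_tokens, threshold=0.1):
    """Return indices of tokens in curr_tokens whose L2 distance to the
    corresponding token in prev_tokens exceeds the threshold."""
    kept = []
    for i, (p, c) in enumerate(zip(prev_tokens, curr_tokens)):
        dist = sum((a - b) ** 2 for a, b in zip(p, c)) ** 0.5
        if dist > threshold:
            kept.append(i)
    return kept

# Two frames with three 2-D "tokens" each; only token 2 moved noticeably,
# so only that token would be re-processed for the new frame.
prev = [[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]]
curr = [[0.0, 0.0], [1.0, 1.05], [3.0, 2.0]]
print(select_changed_tokens(prev, curr, threshold=0.1))  # -> [2]
```

Dropping near-static tokens in this fashion shrinks the VLM's input roughly in proportion to how little the scene changes, which is why such schemes suit fixed-camera edge deployments.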