#NePu

Urs Waldmann

Our latest work "Neural Texture Puppeteer" is published at https://openaccess.thecvf.com/content/WACV2024W/CV4Smalls/html/Waldmann_Neural_Texture_Puppeteer_A_Framework_for_Neural_Geometry_and_Texture_WACVW_2024_paper.html

As a base we build on "Neural Puppeteer", an efficient and flexible neural rendering pipeline: https://openaccess.thecvf.com/content/ACCV2022/html/Giebenhain_Neural_Puppeteer_Keypoint-Based_Neural_Rendering_of_Dynamic_Shapes_ACCV_2022_paper.html

Our key idea is to disentangle texture and geometry.

We show with twelve distinct synthetic cow textures that the new pipeline can be used in a downstream task to identify individuals.

#NeTePu #NePu #WACV #WACV24 #computervision @unikonstanz #CBehav #NeuralRendering #ReIdentification

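A minimal sketch of the re-identification idea described above, under the assumption that a trained encoder maps each animal's appearance to a texture code vector: a query code is compared against a gallery of known codes and the closest match is returned. The function names, the 64-dimensional codes, and the cow identities are illustrative placeholders, not the paper's actual interface.

```python
# Hypothetical sketch: identify an individual by comparing its learned texture
# code against a gallery of known codes via cosine similarity.
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two texture code vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def reidentify(query_code, gallery):
    """Return the identity whose stored texture code is closest to the query."""
    return max(gallery, key=lambda name: cosine_similarity(query_code, gallery[name]))

# Toy usage with random 64-d codes standing in for twelve synthetic cow identities.
rng = np.random.default_rng(0)
gallery = {f"cow_{i:02d}": rng.normal(size=64) for i in range(12)}
query = gallery["cow_07"] + 0.05 * rng.normal(size=64)  # slightly perturbed code
print(reidentify(query, gallery))  # -> "cow_07"
```
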
Urs Waldmann

Our paper "Neural Texture Puppeteer" was accepted for oral presentation (https://cv4smalls.sites.northeastern.edu/schedule-deadlines/) at the #WACV2024 workshop "CV4Smalls": https://cv4smalls.sites.northeastern.edu/

Stay tuned for more info.

#NeTePu #NePu #WACV #WACV24 #computervision @unikonstanz #CBehav #NeuralRendering #ReIdentification

Urs Waldmann

Our paper "Neural Puppeteer" (https://urs-waldmann.github.io/NePu/) was accepted in the Nectar Track at #GCPR23: https://www.dagm-gcpr.de/year/2023

See you all in #Heidelberg.

#NePu #GCPR #GCPR2023 #computervision #UniKonstanz #CBehav #NeuralRendering #PoseEstimation #3dpose

Urs Waldmann

Our latest work "Neural Puppeteer" is published at https://link.springer.com/chapter/10.1007/978-3-031-26316-3_15

We estimate 3D keypoints from multi-view silhouettes only, using our inverse neural rendering pipeline. This makes our 3D keypoint estimation robust to transformations that leave silhouettes unchanged, such as changes in texture and lighting.

#NePu #NeuralRendering #PoseEstimation #3dpose #computervision #CBehav #UniKonstanz

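A minimal sketch of the inverse-rendering idea, assuming a pretrained differentiable silhouette renderer (the `neural_renderer` argument below is a placeholder for such a decoder, not NePu's actual API): the 3D keypoints are optimised by gradient descent so that the silhouettes rendered from them match the observed multi-view silhouettes.

```python
# Illustrative sketch: fit 3D keypoints by inverse neural rendering against
# observed multi-view silhouettes. Gradients flow through the (assumed
# differentiable) renderer into the keypoint coordinates.
import torch

def fit_keypoints(neural_renderer, observed_sil, cameras,
                  n_keypoints=24, steps=500, lr=1e-2):
    """observed_sil: (V, H, W) binary silhouettes from V calibrated views."""
    keypoints = torch.zeros(n_keypoints, 3, requires_grad=True)  # init at origin
    opt = torch.optim.Adam([keypoints], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        rendered = neural_renderer(keypoints, cameras)   # (V, H, W), values in [0, 1]
        loss = torch.nn.functional.binary_cross_entropy(rendered, observed_sil)
        loss.backward()
        opt.step()
    return keypoints.detach()
```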