Assistant Professor Dan Szafir to co-chair the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2021)
Dec. 2, 2020

HRI 2021, which will be held virtually March 8-11, is the 16th annual conference for basic and applied human-robot interaction research. Researchers from around the world present their best work at HRI to exchange ideas about the theory, technology, data and science advancing the state of the art in human-robot interaction.
The student design competition is open to everyone; the deadline is December 10.


Pufferfish-inspired robot could improve drone safety
Oct. 20, 2020

PufferBot is an aerial robot with an expandable protective structure that deploys to encircle the drone and prevent its rotors from coming into contact with obstacles or people.
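The protective cage is described as deploying when the drone comes close to solid objects. As a rough, hypothetical sketch of that proximity-triggered behavior (not the actual PufferBot firmware), the logic can be as simple as expanding when a range reading drops below a threshold and retracting with some hysteresis; the sensor interface, actuator interface and distances below are assumptions for illustration:

```typescript
// Minimal sketch of proximity-triggered cage deployment.
// RangeSensor, CageActuator and the thresholds are hypothetical.
interface RangeSensor { read(): number; }            // distance to nearest obstacle, meters
interface CageActuator { expand(): void; retract(): void; }

const DEPLOY_DISTANCE_M = 1.0;   // assumed distance at which the cage expands
const RETRACT_DISTANCE_M = 2.0;  // hysteresis so the cage doesn't oscillate near the threshold

function updateCage(sensor: RangeSensor, cage: CageActuator, deployed: boolean): boolean {
  const d = sensor.read();
  if (!deployed && d < DEPLOY_DISTANCE_M) {
    cage.expand();               // obstacle close: surround the rotors
    return true;
  }
  if (deployed && d > RETRACT_DISTANCE_M) {
    cage.retract();              // clear of obstacles: shrink again
    return false;
  }
  return deployed;
}
```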
PufferBot: A flying robot with an expandable body
Aug. 24, 2020

Research about PufferBot, a pufferfish-inspired aerial robot developed in ATLAS Institute's IRON and THING labs, has received worldwide attention, with coverage of the project translated into Dutch, Bosnian, Chinese, Hebrew, Italian, Swedish, Russian and many more languages. In a paper to be presented at the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), the researchers, led by PhD student Hooman Hedayati, detail the project's aim of improving drone safety with a plastic shield that expands at a moment's notice, forming a robotic airbag that could prevent dangerous collisions between people and machines.

The research was covered by TechXplore and Tech Explorist, and the project was featured on several podcasts: "Nobody Wants a Failed Prototype: System-Level Simulation with Clarity 3D Transient Solver," Amelia's Weekly Fish Fry, EE Journal, Oct. 30, 2020; and "351 Drones for First Responders," The UAV Digest, Oct. 29, 2020, a weekly audio podcast covering unmanned aerial vehicles and systems.

TechXplore writes about PufferBot, an actuated, expandable structure that can be used to fabricate shape-changing aerial robots.


ATLAS research helps define the future of human-computer interaction
May 1, 2020
href="/atlas/taxonomy/term/773" hreflang="en">research</a> <a href="/atlas/taxonomy/term/915" hreflang="en">striegl</a> <a href="/atlas/taxonomy/term/747" hreflang="en">suzuki</a> <a href="/atlas/taxonomy/term/340" hreflang="en">szafir</a> <a href="/atlas/taxonomy/term/376" hreflang="en">unstable</a> <a href="/atlas/taxonomy/term/713" hreflang="en">wu</a> <a href="/atlas/taxonomy/term/641" hreflang="en">zheng</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div><div>&nbsp; <p> </p><div class="imageMediaStyle medium_750px_50_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/medium_750px_50_display_size_/public/article-image/human-computer-interaction-hci-uhd-4k-wallpaper.jpg?itok=VsZkbnG2" width="750" height="422" alt="Drawing of human head with equations and numbers inside and outside."> </div> &nbsp;</div><p><span>Helping robots behave tactfully in group situations, pinpointing ways social media can avoid reminding the bereaved of their losses, blending modern technology with ancient weaving practices to improve&nbsp;smart textiles, </span>encouraging&nbsp;visually impaired children and sighted family members to learn Braille together through tangible blocks and computer games<span>—these are some of the topics covered in the nine&nbsp;papers and two workshops by researchers at Ҵýƽ ATLAS Institute that were accepted to CHI 2020, the world’s preeminent conference for the field of human-computer interaction.&nbsp;</span></p><p><span>Like so many other events, CHI 2020,&nbsp;</span>also known as ACM’s Conference on Human Factors in Computing Systems,<span> isn’t taking place this year, but the proceedings are published and faculty and students remain tremendously proud of their contributions. Commenting on their work, </span>ATLAS Director <a href="/atlas/mark-d-gross" rel="nofollow">Mark Gross</a> said, “The interactions we all have with hardware and software range from the absurd to the sublime. The field of human-computer interaction has more impact today than ever before, and ATLAS students and faculty are contributing at the highest levels. I’m immensely proud of this work.”</p><p><span>Researchers in the <a href="/atlas/unstable-design-lab" rel="nofollow">Unstable Design Lab</a> authored a remarkable four of the nine&nbsp;papers admitted to the conference, </span>two of which earned honorable mention, an accolade reserved for the top 5 percent of accepted conference papers. The <a href="/atlas/labscenters/thing-lab" rel="nofollow">THING</a>, Superhuman Computing, <a href="/atlas/labscenters/living-matter-lab" rel="nofollow">Living Matter</a>, <a href="/atlas/labscenters/acme-lab" rel="nofollow">ACME</a> and IRON labs also had papers accepted to the conference.&nbsp;</p><p>"Each of these papers is unique and forward-thinking," said&nbsp;<a href="/atlas/laura-devendorf" rel="nofollow">Laura Devendorf</a>, director of the Unstable Design Lab, of the researchers' papers.&nbsp;"They show&nbsp;new ways of both designing, engaging, but also recycling wearable tech devices. They not only present interesting design work, but present it in a way that ties in theories and practices from inside and outside our research community: from design for disassembly to ASMR channels on&nbsp;YouTube."</p><p><span>CHI 2020 was scheduled to take place April 25 – 30, in Hawaii. 
“I’m particularly disappointed for our </span>students. It’s a big opportunity for them and their careers to get that kind of exposure,” said&nbsp;<span>Devendorf.</span></p><p><span>In all, CHI 2020 received 3,126 submissions&nbsp;and accepted 760. In 2019, CHI accepted five ATLAS papers, including three from the Unstable Design Lab and two from the Superhuman Computing Lab.</span><br> &nbsp;</p><h2><span>CHI 2020 p</span><span>apers, position papers and workshops by ATLAS faculty and students</span> <div class="align-right image_style-medium_750px_50_display_size_"> <div class="imageMediaStyle medium_750px_50_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/medium_750px_50_display_size_/public/article-image/chi-logo-eps-white-background-2000px.jpg?itok=uHhLB3nf" width="750" height="390" alt="2020 Conference on Human Factors in Computing Systems logo"> </div> </div> </h2><h3><br><span>Unstable Design Lab</span></h3><p><strong><a href="http://unstable.design/wp-content/uploads/2020/04/chi20c-sub9178-cam-i16-2.pdf" rel="nofollow">Craftspeople as Technical Collaborators: Lessons Learned through an Experimental Weaving Residency</a> [Honorable Mention Award]</strong><br><em><a href="/atlas/laura-devendorf" rel="nofollow">Laura Devendorf </a>(ATLAS/INFO Faculty), Katya Arquilla (Aerospace PhD Student), Sandra Wirtanen,&nbsp; Allison Anderson (Aerospace Faculty), Steven Frost (Media Studies Faculty)&nbsp;</em><br><span>By broadening the idea of who and what is considered “technical,” this paper examines the ways HCI practitioners, engineers and craftspeople can productively collaborate.&nbsp;</span></p><p><strong><span><a href="http://unstable.design/wp-content/uploads/2020/04/CHI2020_designMemoirs.pdf" rel="nofollow">Making Design Memoirs: Understanding and Honoring Difficult Experiences</a></span> [Honorable Mention Award]</strong><br><span><em><a href="/atlas/laura-devendorf" rel="nofollow">Laura Devendorf </a>(ATLAS/INFO) Faculty), Kristina Andersen, Aisling Kelliher</em><br> How can we design for difficult emotional experiences without reducing a person’s experience? 
In this paper three researchers design objects that illustrate their personal experiences as mothers to gain a deeper understanding of their individual struggles.</span></p><p><strong><span><a href="http://unstable.design/wp-content/uploads/2020/04/chi20c-sub2165-cam-i16.pdf" rel="nofollow">Unfabricate: Designing Smart Fabrics for Disassembly</a></span>&nbsp;</strong>&nbsp;<br><span><em><a href="/atlas/shanel-wu" rel="nofollow">Shanel Wu</a> (ATLAS), <a href="/atlas/laura-devendorf" rel="nofollow">Laura Devendorf </a>(ATLAS/INFO)</em><br> Being mindful of the massive waste streams for digital electronics and textiles, HCI researchers address sustainability and waste in smart textiles development through designing smart textile garments with reuse in mind.</span></p><p><strong><span><a href="http://unstable.design/wp-content/uploads/2020/04/chi20c-sub8313-cam-i16.pdf" rel="nofollow">What HCI Can Learn from ASMR: Becoming Enchanted with the Mundane</a></span>&nbsp;&nbsp;</strong><br><span><em><a href="/atlas/jolie-klefeker" rel="nofollow">Josephine Klefeker</a> (ATLAS, TAM undergraduate), <a href="/atlas/libi-striegl" rel="nofollow">Libi Striegl</a> (Intermedia Art, Writing and Performance), <a href="/atlas/laura-devendorf" rel="nofollow">Laura Devendorf</a> (ATLAS/INFO)</em><br> Researchers introduced the online subculture of </span>autonomous sensory meridian response (ASMR) videos, showing people slowly interacting with objects and whispering into microphones and triggering a tingling bodily sensation in viewers and listeners, as a source of inspiration for wearables and experiences of enchantment, to cultivate deeper connections with our mundane and everyday environments.</p><h3><br><span>IRON Lab</span></h3><p><strong><span><a href="https://www.researchgate.net/publication/340849591_Comparing_F-Formations_Between_Humans_and_On-Screen_Agents" rel="nofollow">Comparing F-Formation</a></span><span><a href="https://www.researchgate.net/publication/340849591_Comparing_F-Formations_Between_Humans_and_On-Screen_Agents" rel="nofollow">s between Humans and On-Screen Agents </a></span>&nbsp;</strong><br><em><a href="/atlas/hooman-hedayati" rel="nofollow">Hooman Hedayati </a>(PhD student, Computer Science), James Kennedy, <a href="/atlas/dan-szafir" rel="nofollow">Daniel Szafir</a></em><br> While humans most often learn to interpret social situations and adjust their behavior accordingly, robots must be programmed to do so. This paper explores ways for robots to detect and predict the position of individuals in human conversational groups in order to more fluidly interact and participate in a conversation with them. <a href="/atlas/f-formations" rel="nofollow">More information</a></p><h3><span>THING Lab &amp; ACME Lab</span></h3><p><strong><a href="https://ryosuzuki.org/publications/chi-2020-roomshift.pdf" rel="nofollow">RoomShift: Room-scale Dynamic Haptics for VR with Furniture-moving Swarm Robots</a></strong><br><em><a href="/atlas/ryo-suzuki" rel="nofollow">Ryo Suzuki</a>, <a href="/atlas/hooman-hedayati" rel="nofollow">Hooman Hedayati</a>, (both PhD student, CS), <a href="/atlas/clement-zheng" rel="nofollow">Clement Zheng</a> (ATLAS PhD candidate), James Bohn (undergraduate, CS), <a href="/atlas/dan-szafir" rel="nofollow">Daniel Szafir</a>, <a href="/atlas/ellen-yi-luen-do" rel="nofollow">Ellen Yi-Luen Do</a>, <a href="/atlas/mark-d-gross" rel="nofollow">Mark D. 
Gross</a>, <a href="/atlas/daniel-leithinger" rel="nofollow">Daniel Leithinger</a> (all ATLAS faculty)</em><br> With applications in virtual tours and architectural design, this project dynamically synchronizes virtual reality with the physical environments by rearranging objects using a small swarm of robots able to elevate and relocate tables, chairs and other objects. When users can sit on, lean against, touch and otherwise interact with objects in a virtual scene, it provides more a fuller immersion in the virtual world than purely visual VR. <a href="/atlas/roomshift" rel="nofollow">More information</a></p><h3><span>Living Matter Lab&nbsp;</span></h3><p><strong><a href="https://dl.acm.org/doi/abs/10.1145/3334480.3381817" rel="nofollow">Semina Aeternitatis: Using Bacteria for Tangible Interaction with Data</a></strong><br><em><a href="/atlas/mirela-alistar" rel="nofollow">Mirela Alistar</a> (ATLAS), Margherita Pevere</em><br> An exploration of the potential of DNA molecules to enable new ways for humans to interact with their stories and memories via a physical interface. The project involved encoding an elderly woman's written memories into precisely sequenced DNA and then splicing the code into the genome of a microorganism. The transformed bacteria then replicated, creating billions of facsimiles of the woman's memories. The resulting biofilm was presented in an exhibition as a sculpture. (CHI '20: Extended Abstracts)</p><h3><span>Superhuman Computing Lab&nbsp;</span></h3><p><strong>BrailleBlocks: Computational Braille Toys for Collaborative Learning</strong><br><em><a href="/atlas/vinitha-gadiraju" rel="nofollow">Vinitha Gadiraju</a>, <a href="/atlas/annika-muehlbradt" rel="nofollow">Annika Muehlbradt</a>, and Shaun K. Kane (ATLAS/CS)</em><br> BrailleBlocks tactile gaming system encourages visually impaired children and their sighted family members to learn Braille together through tangible blocks and pegs and an iPad application with interactive educational games. <a href="/atlas/brailleblocks" rel="nofollow">More information.</a></p><h3><span>ATLAS PhD Student&nbsp;in External Labs</span></h3><p><strong><a href="https://cmci.colorado.edu/idlab/assets/bibliography/pdf/Gach2020TSC.pdf" rel="nofollow">Experiences of Trust in Postmortem Profile Management</a></strong><br><em><a href="/atlas/catherine-gach" rel="nofollow">Katie Z. Gach</a> (ATLAS PhD Student), Jed Brubaker (INFO Faculty)</em><br> Managing Facebook pages for loved ones after their death is fraught with difficulty, according to this paper. While Facebook has created the ability for users to appoint post-mortem managers, called legacy contacts, Facebook gives them limited authority over the content, making them feel distrusted by the social network (Published in Transactions on Social Computing, invited for presentation at CHI 2020)</p><h3><span>Workshops Organized</span></h3><p><strong><a href="https://hci-uncertainty.github.io/" rel="nofollow">Embracing Uncertainty in HCI</a></strong><br><em>Robert Soden (ATLAS alumnus), <a href="/atlas/laura-devendorf" rel="nofollow">Laura Devendorf</a> (ATLAS/INFO&nbsp;faculty), Richmond Y. Wong, Lydia B. Chilton, Ann Light, Yoko Akama</em><br> This workshop explores the many ways uncertainty appears in research&nbsp;and the different types of responses that HCI has to offer. 
Outcomes of the workshop include exercises designed to evoke uncertainty in participants, concept mappings and a collection of essays developed by participants.</p><p><strong><a href="https://asian-chi.github.io/2020/" rel="nofollow">Asian CHI Symposium: HCI Research from Asia and on Asian Contexts and Cultures</a>&nbsp;</strong><br><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><em>Ellen Yi-Luen Do</em></a><em>(ATLAS faculty) among many others listed <a href="https://programs.sigchi.org/chi/2020/program/content/32366" rel="nofollow">here</a></em><br> This symposium showcases the latest HCI work from Asia and those focusing on incorporating Asian sociocultural factors in their design and implementation. In addition to circulating ideas and envisioning future research in human-computer interaction, this symposium aims to foster social networks among researchers and practitioners and grow the Asian research community.</p><h3>Workshop Papers</h3><p><strong><span><a href="https://cmci.colorado.edu/visualab/" rel="nofollow">Toward Effective Multimodal Interaction in Augmented Reality</a></span></strong><br><em>Matt Whitlock (CS student), <a href="/atlas/daniel-leithinger" rel="nofollow">Daniel Leithinger</a> (ATLAS faculty), <a href="/atlas/danielle-szafir" rel="nofollow">Danielle Albers Szafir</a> (ATLAS faculty/INFO affiliate faculty)</em><br> This paper on envisioning future productivity for immersive analytics was accepted to the Immersive Analytics workshop at CHI 2020.</p><p><strong>Virtual and Augmented Reality for Public Safety</strong><br><em><a href="/atlas/cassandra-goodby" rel="nofollow">Cassandra Goodby</a> (CTD student)</em><br> This paper explores potential applications of AR and VR technologies, haptics and voice recognition for first-responders. 
It&nbsp;was accepted to the Everyday Proxy Objects for Virtual Reality workshop at CHI 2020.</p><p><strong>Mental Health Survey and Synthesis</strong><a href="http://https://tmilab.colorado.edu/" rel="nofollow"><strong>​</strong></a><br><em><a href="/atlas/cassandra-goodby" rel="nofollow">Cassandra Goodby</a> (CTD student)</em><br> This paper&nbsp;on&nbsp;tools and technologies available through mental health applications was accepted to the Technology Ecosystems: Rethinking Resources for Mental Health workshop at CHI 2020.</p><p>&nbsp;</p></div> </div> </div> </div> </div> <div>At a time when the field of human-computer interaction is becoming more important than ever, ATLAS researchers are making substantial contributions, contributing nine papers and two workshops to CHI '20.</div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Fri, 01 May 2020 06:00:00 +0000 Anonymous 2529 at /atlas Globalive Media's "Beyond Innovation:" IRON Lab research featured on globally broadcast program /atlas/2019/05/22/globalive-medias-beyond-innovation-iron-lab-research-featured-globally-broadcast-program <span>Globalive Media's "Beyond Innovation:" IRON Lab research featured on globally broadcast program</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2019-05-22T14:08:25-06:00" title="Wednesday, May 22, 2019 - 14:08">Wed, 05/22/2019 - 14:08</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/article-thumbnail/iron_lab_video.png?h=f980c6f4&amp;itok=MSuhI0XB" width="1200" height="800" alt="Research subject wearing augmented reality headset working with drone."> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/306" hreflang="en">IRON</a> <a href="/atlas/taxonomy/term/422" hreflang="en">hedayati</a> <a href="/atlas/taxonomy/term/34" hreflang="en">news</a> <a href="/atlas/taxonomy/term/340" hreflang="en">szafir</a> <a href="/atlas/taxonomy/term/520" hreflang="en">walker</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-row-subrow row"> <div class="ucb-article-text col-lg d-flex align-items-center" itemprop="articleBody"> </div> <div class="ucb-article-content-media ucb-article-content-media-right col-lg"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> </div> </div> </div> </div> <div>Research from ATLAS Institute's IRON Lab&nbsp;involving utilizing&nbsp;augmented reality&nbsp;to gain information about a robot's intended path of motion was featured on the globally broadcast program,&nbsp;"Beyond Innovation." 
The program features the latest business and technology trends.


A robotic helping hand
Nov. 14, 2018

Research that helps robots understand gestures and the often vague nature of language will pave the way to mechanical beings taking on human tasks, from assembling children's toy castles on Christmas morning to caring for elderly relatives, says Dan Szafir, assistant professor at the ATLAS Institute and director of the IRON Lab.


Hooman Hedayati lands prestigious Microsoft Research internship
Sept. 11, 2018

Hooman Hedayati, a computer science PhD student based in the ATLAS Institute's Interactive Robotics and Novel Technologies (IRON) Lab, participated in a prestigious summer internship at Microsoft Research in Redmond, Washington, where he worked on teaching robots social skills in group conversations. His research focused on helping robots detect F-formations, the group arrangements that arise when "two or more people sustain a spatial and orientational relationship in which the space between them is one to which they have equal, direct and exclusive access."

"For us, detecting F-formations is easy, and we don't think about it," says Hedayati. "You know how many are in your conversational group, and you know how to position yourself with respect to others. But this task is not easy for robots."

During the internship, Hedayati and his Microsoft Research mentor, Sean Andrist, worked on developing an algorithm to help robots detect the people in the same conversational group as the robot. The two plan to publish a paper about their findings.

"It was a great feeling to be surrounded by top scientists and legends in my field," Hedayati says.
IRON Lab researcher wins Outstanding Research Excellence award
May 18, 2018

Hooman Hedayati, a computer science PhD student based in the ATLAS Institute's Interactive Robotics and Novel Technologies (IRON) Lab, was recently awarded the Outstanding Research Award by CU Boulder's Department of Computer Science for his accomplishments in the 2017-18 academic year.

"Hooman has demonstrated outstanding research and is making remarkable progress toward becoming a world-class scholar," says ATLAS-based Assistant Professor Dan Szafir, who directs the IRON Lab and nominated Hedayati for the award.

Hedayati, who studies human-computer interaction and is advised by Szafir, submitted three papers to top international venues in the span of one year, two of which won top awards.

In fall 2017, both papers Hedayati submitted to the ACM/IEEE International Conference on Human-Robot Interaction (HRI) were accepted and nominated for best paper, "which is essentially unheard of," Szafir says.

Hedayati was lead author of "Communicating Robot Motion Intent with Augmented Reality," which was named HRI 2018 Best Paper, and second author of another IRON Lab submission, which won Runner-Up Best Paper. HRI is the world's top venue for work in the field of human-robot interaction, says Szafir.

Hedayati followed this work with another "very strong submission" to Robotics: Science and Systems (RSS) this spring, which is still under review, and he is preparing to submit another paper to the Association for the Advancement of Artificial Intelligence (AAAI) Symposium in June. "In the span of a year," says Szafir, "he will have submitted four full papers to top international venues, with at least two accepted and winning awards."

Now spending the summer in Seattle after landing a prestigious internship with Microsoft Research, Hedayati says, "I'm truly honored to receive the Outstanding Research Award. My fellow PhD students and friends are extremely talented, hardworking and dedicated. I am motivated to keep working hard."

The IRON Lab is an interdisciplinary research group with a mission to advance knowledge around the design of new sensing, interface and robotic technologies that improve user experience, productivity and enjoyment. The lab specializes in aerial and free-flying robots, human-robot collaboration, space robotics, augmented and virtual reality, and the synthesis of art and technology.


Augmented reality enhances robot collaboration
March 15, 2018

Thousands of exciting and novel applications of augmented reality and robotic technologies have emerged in recent years, but the potential for networking the two technologies and using them in conjunction has gone relatively unexplored. Two papers published last week by the ATLAS IRON Lab for the ACM/IEEE International Conference on Human-Robot Interaction in Chicago open the door to this promising area of research, paving the way for more seamless integration of robots into modern life.

Recognizing the value of this innovative work, conference organizers awarded the IRON Lab teams best paper and runner-up best paper in the design category. Assistant Professor Dan Szafir, who directs the IRON Lab, explains that both papers examine the potential for transmitting real-time visual information from drones to people through AR. In the first study, research participants completing an assembly task while sharing a workspace with a drone were more efficient when informed of the drone's flightpath through AR than when tracking its path without assistance. In the second study, drone photography proved safer and more accurate when the drone camera's field of view was streamed to operators' AR displays rather than to tablet screens, as is the norm today.

To conduct the first study, researchers set up an environment similar to a small warehouse, where participants were assigned the task of stringing beads in a specific color order, requiring them to move among six assembly stations while remaining at a safe distance from the drone at all times. Their goal was to assemble as many beaded strings as possible in eight minutes. When the drone approached, participants had to stop work and move to a different workstation.

Results showed that when the drone's imminent flightpath was communicated with AR, participants were more efficient. The study also evaluated tradeoffs among a variety of graphical approaches to communicating the drone's flightpath, which may help guide the design of future AR interfaces.

The second study found that AR helped drone operators take photos more safely and more accurately. Using a drone-mounted camera, research subjects were asked to photograph framed targets on a wall as quickly and precisely as possible. The drone camera's field of view was shown to operators on a handheld tablet and in AR, in a variety of graphical configurations. Results were judged by how fast subjects completed the task, the accuracy of their photos and the number of times the drones crashed. Once again, AR significantly improved performance, increasing accuracy and reducing the number of crashes, with some AR graphical approaches proving more effective than others.

"As the world moves towards integrating humans and robots in the workplace, effective collaboration depends on the ability of team members to rapidly understand and predict a robot's behavior, something that human workers do through facial expressions, gestures and speech," says Szafir. "Human workers want to know explicitly when and where their robot coworker intends to move next, and they perform best when they can anticipate those movements. We are excited to be exploring how to leverage augmented reality to communicate this information in new and more effective ways."

Taking place over a 12-month period, the two studies Szafir supervised were conducted by PhD students Michael Walker and Hooman Hedayati, along with master's student Jennifer Lee. The IRON Lab launched in January 2016, and Szafir made the Forbes "30 Under 30: Science" list in January 2017; the lab's latest commendations from the world's preeminent HRI conference set expectations high for this ambitious and growing group of researchers.

Communicating Robot Motion Intent with Augmented Reality, by Michael Walker, Hooman Hedayati, Jennifer Lee and Daniel Szafir (Best Paper in Design, ACM/IEEE International Conference on Human-Robot Interaction, 2018)

Improving Collocated Robot Teleoperation with Augmented Reality, by Hooman Hedayati, Michael Walker and Daniel Szafir (Runner-Up Best Paper in Design, ACM/IEEE International Conference on Human-Robot Interaction, 2018)

ATLAS IRON Lab researchers were awarded best paper and runner-up best paper at the ACM/IEEE International Conference on Human-Robot Interaction for developing technologies that use augmented reality to enhance drone operation.
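In the first study, the benefit of AR came from letting workers see the drone's upcoming flightpath and anticipate when to yield a workstation. As a hypothetical illustration of that idea (not the study's actual system), the check below flags a worker who stands within a clearance distance of any segment of a planned path; the point types, waypoint list and clearance radius are assumptions for the sketch:

```typescript
// Minimal sketch: flag when a worker is too close to a drone's planned flightpath.
interface Point { x: number; y: number; }

const CLEARANCE_M = 2.0; // assumed safe distance from the planned path

// Distance from point p to the line segment a-b.
function distToSegment(p: Point, a: Point, b: Point): number {
  const dx = b.x - a.x, dy = b.y - a.y;
  const lenSq = dx * dx + dy * dy;
  const t = lenSq === 0 ? 0
    : Math.max(0, Math.min(1, ((p.x - a.x) * dx + (p.y - a.y) * dy) / lenSq));
  const cx = a.x + t * dx, cy = a.y + t * dy;
  return Math.hypot(p.x - cx, p.y - cy);
}

// True if the worker should move before the drone flies this path.
function pathConflictsWithWorker(path: Point[], worker: Point): boolean {
  for (let i = 0; i + 1 < path.length; i++) {
    if (distToSegment(worker, path[i], path[i + 1]) < CLEARANCE_M) return true;
  }
  return false;
}
```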
Browser extension helps the visually impaired interpret online images
Jan. 30, 2018

Imagine internet browsing without the ability to make sense of images. It's a problem that visually impaired computer users face every day. While screen-reading technology gives users audible access to written content, it needs written descriptions to interpret images, and often none are provided.

Some website developers include descriptions of images in their code (called "alt text") because it improves their websites' search engine rankings. However, there is no mechanism for determining whether these descriptions are accurate or informative. As a result, developers often enter one-word descriptions such as "image" or "photo," leaving the visually impaired with no useful information about the image.

To help address this problem, an ATLAS Institute researcher developed a system that collects captions and alt text associated with other instances of the same photo elsewhere online, associating human-authored descriptions with every website where the photo appears. Called Caption Crawler, the image-captioning system compiles descriptions in a database: if a photo has never been queried, it will offer alt text in about 20 seconds; if the photo has previously been processed, alt text is available almost immediately.

The technology was developed by Darren Guinness, a PhD student in the ATLAS Interactive Robotics and Novel Technologies (IRON) Lab and the Superhuman Computing Lab, working with Microsoft Research's Edward Cutrell and Meredith Ringel Morris. The research, which merges the benefits of a fully automated system with the quality of human-authored content, will be presented at the Association for Computing Machinery's 2018 Conference on Human Factors in Computing Systems (CHI) in Montreal in April.

Users who want Caption Crawler to replace poor-quality alt text press a keyboard shortcut to request a replacement. The screen reader automatically speaks the new caption, which is the longest caption found for that photo; a different shortcut reads any additional captions that were found.

Caption Crawler only works with images used on multiple websites, but the approach is effective because about half of website administrators provide informative photo descriptions, Guinness says.

"Although this approach cannot caption unique images that only appear in a single place online, it can increase the accessibility of many online images," he says. "Caption Crawler is a low-latency, incredibly low-cost solution to a big problem. It produces human-quality captioning without incurring additional costs in human labeling time."

Caption Crawler combines a Google Chrome browser extension with a Node.js cloud server. The browser extension searches the Document Object Model (DOM) of the active webpage for image tags and background images, which are then sent to the server for caption retrieval. When Caption Crawler finds a caption for an image, the caption is streamed back to the browser extension, which associates the caption with the image.

Research shows that humans produce higher-quality captions than automated computer vision and machine learning approaches, Guinness says. Caption Crawler is a hybrid system that captures both, prioritizing human captions over machine-generated ones. If no human-authored caption can be found, a computer-generated caption from Microsoft's CaptionBot is used to describe the image. When text from CaptionBot is read aloud, the screen reader first speaks the word "CaptionBot," so the user knows the caption is not human-authored.

"Hybrid systems that meld both human-quality text and machine learning approaches hold a lot of promise for improving access to online media," Guinness says.

Full CHI 2018 paper: https://www-cs.stanford.edu/~merrie/papers/captioncrawler.pdf
Video: https://vimeo.com/249025146

ATLAS researcher Darren Guinness developed technology that conveys online photo content to the visually impaired.
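As a rough sketch of how the browser-extension side of such a system might look, the snippet below collects page images with missing or minimal alt text, asks a captioning endpoint for descriptions and prefers a human-authored caption over a machine-generated one, prefixing the latter with "CaptionBot" as the article describes. The endpoint URL, response fields and the alt-text heuristic are assumptions for illustration, not Caption Crawler's actual implementation:

```typescript
// Sketch of a content script that gathers page images and requests captions.
// CAPTION_SERVER and the response fields (humanCaption, machineCaption) are hypothetical.
interface CaptionResponse { humanCaption?: string; machineCaption?: string; }

const CAPTION_SERVER = "https://example.invalid/captions"; // placeholder endpoint

async function requestCaption(src: string): Promise<string | undefined> {
  const res = await fetch(`${CAPTION_SERVER}?src=${encodeURIComponent(src)}`);
  if (!res.ok) return undefined;
  const data: CaptionResponse = await res.json();
  // Prefer human-authored text; fall back to a clearly labeled machine caption.
  if (data.humanCaption) return data.humanCaption;
  if (data.machineCaption) return `CaptionBot: ${data.machineCaption}`;
  return undefined;
}

async function fillMissingAltText(): Promise<void> {
  const images = Array.from(document.querySelectorAll<HTMLImageElement>("img"));
  for (const img of images) {
    const alt = img.getAttribute("alt")?.trim() ?? "";
    if (alt.length > 6) continue;                  // crude heuristic: skip images with usable alt text
    const caption = await requestCaption(img.src);
    if (caption) img.setAttribute("alt", caption); // screen readers will speak the new text
  }
}

fillMissingAltText();
```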