Sean Humbert News /program/robotics/ en Building next generation autonomous robots to serve humanity /program/robotics/2023/11/17/building-next-generation-autonomous-robots-serve-humanity <span>Building next generation autonomous robots to serve humanity </span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2023-11-17T16:19:59-07:00" title="Friday, November 17, 2023 - 16:19">Fri, 11/17/2023 - 16:19</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/focal_image_wide/public/article-thumbnail/Edgar_Mines_Lab_2023_094.JPG?h=c48d9d91&amp;itok=ekILKiys" width="1200" height="800" alt="A SPOT robot navigating autonomously."> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/program/robotics/taxonomy/term/1"> Research News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/program/robotics/taxonomy/term/6" hreflang="en">Christoffer Heckman News</a> <a href="/program/robotics/taxonomy/term/66" hreflang="en">Eric Frew News</a> <a href="/program/robotics/taxonomy/term/12" hreflang="en">Sean Humbert News</a> </div> <a href="/program/robotics/jeff-zehnder">Jeff Zehnder</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-row-subrow row"> <div class="ucb-article-text col-lg d-flex align-items-center" itemprop="articleBody"> <div><p>One thousand feet underground, a four-legged creature scavenges through tunnels in pitch darkness. 
With vision that cuts through the blackness, it explores a spider web of paths, remembering its every step and navigating with precision. The sound of its movements echoes eerily off the walls, but it is not to be feared – this is no wild animal; it is an autonomous rescue robot.</p><p>The robot was initially designed to find survivors in collapsed mines, caves, and damaged buildings, but that is only part of what it can do.</p><p>Created by a team of CU 鶹ӰԺ researchers and students, the robots placed third as the top US entry and <a href="/today/2021/09/24/engineers-take-home-500000-international-underground-robotics-competition" rel="nofollow">earned $500,000 in prize money</a> at the Defense Advanced Research Projects Agency (DARPA) Subterranean Challenge in 2021.</p><h2>Going Further</h2><p>Two years later, they are pushing the technology even further, earning new research grants to expand the technology and create new applications in the rapidly growing world of autonomous systems.</p><p>“Ideally you don’t want to put humans in harm’s way in disaster situations like mines or buildings after earthquakes; the walls or ceilings could collapse and maybe some already have,” said <a href="/mechanical/j-sean-humbert" rel="nofollow">Sean Humbert,</a> a professor of mechanical engineering and director of the <a href="/program/robotics/2023/09/20/cu-boulder-offers-new-graduate-program-robotics" rel="nofollow">Robotics Program at CU 鶹ӰԺ.</a> “These robots can be disposable while still providing situational awareness.”</p><p>The team developed an advanced system of sensors and algorithms that allows the robots to function on their own – once given an assignment, they decide autonomously how best to complete it.</p><h2>Advanced Communication</h2><p>A major goal is to get the robots from engineers directly into the hands of first responders. 
Success requires simplifying the way the robots transmit data into something approximating plain English, according to Kyle Harlow, a computer science PhD student.</p><p>“The robots communicate in pure math. We do a lot of work on top of that to interpret the data right now, but a firefighter doesn’t have that kind of time,” Harlow said.</p><p>To make that happen, Humbert is collaborating with <a href="/cs/christoffer-heckman" rel="nofollow">Chris Heckman,</a> an associate professor of computer science, to change both how the robots communicate and how they represent the world. The robots’ eyes – a LiDAR sensor – build highly detailed 3D maps of an environment, 15 cm at a time. That becomes a problem when the robots try to relay information – the sheer volume of data clogs up the network.</p><p>“Humans don’t interpret the environment in 15 cm blocks,” Humbert said. “We’re now working on what’s called semantic mapping, which is a way to combine contextual and spatial information. This is closer to how the human brain represents the world and is much less memory intensive.”</p><h2>High-Tech Mapping</h2><p>The team is also integrating new sensors to make the robots more effective in challenging environments. The robots excel in clear conditions but struggle with visual obstacles like dust, fog, and snow. 
Harlow is leading an effort to incorporate millimeter-wave radar to change that.</p><p>“We have all these sensors that work well in the lab and in clean environments, but we need to be able to go out in places such as Colorado where it snows sometimes,” Harlow said.</p><p>Where some researchers are forced to suspend work when a grant ends, members of the subterranean robotics team keep finding new partners to push the technology further.</p><h2>Autonomous Flight</h2><p><a href="/aerospace/eric-frew" rel="nofollow">Eric Frew,</a> a professor of aerospace engineering at CU 鶹ӰԺ, is applying the technology in a new National Institute of Standards and Technology competition to develop aerial robots – drones – rather than ground robots, to autonomously map disaster areas indoors and out.</p><p>“Our entry is based directly on the Subterranean Challenge experience and the systems developed there,” Frew said.</p><p>Some teams in the competition will rely on drones navigated by human operators, but Frew said CU 鶹ӰԺ’s project is aiming for an autonomous solution that frees humans to focus on more critical tasks.</p><p>Although numerous universities and private businesses are advancing autonomous robotic systems, Humbert said other organizations often focus on individual aspects of the technology. The students and faculty at CU 鶹ӰԺ are working on every aspect of these systems, for use in environments that present extreme challenges.</p><p>“We’ve built world-class platforms that incorporate mapping, localization, planning, coordination – all the high level stuff, the autonomy, that’s all us,” Humbert said. “There are only a handful of teams across the world that can do that. 
It’s a huge advantage that CU 鶹ӰԺ has.”</p></div> </div> <div class="ucb-article-content-media ucb-article-content-media-right col-lg"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> </div> </div> </div> </div> <script> window.location.href = `/engineering/2023/11/17/building-next-generation-autonomous-robots-serve-humanity`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Fri, 17 Nov 2023 23:19:59 +0000 Anonymous 107 at /program/robotics CU 鶹ӰԺ offers new graduate program in robotics /program/robotics/2023/09/20/cu-boulder-offers-new-graduate-program-robotics <span>CU 鶹ӰԺ offers new graduate program in robotics</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2023-09-20T10:28:02-06:00" title="Wednesday, September 20, 2023 - 10:28">Wed, 09/20/2023 - 10:28</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/focal_image_wide/public/article-thumbnail/MARBLE_robots.jpg?h=f87dcc5c&amp;itok=qXNmacRW" width="1200" height="800" alt="Two underground robots in a cave."> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/program/robotics/taxonomy/term/4"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/program/robotics/taxonomy/term/7" hreflang="en">Alessandro Roncone News</a> <a href="/program/robotics/taxonomy/term/6" hreflang="en">Christoffer 
Heckman News</a> <a href="/program/robotics/taxonomy/term/12" hreflang="en">Sean Humbert News</a> </div> <a href="/program/robotics/jeff-zehnder">Jeff Zehnder</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-content-media ucb-article-content-media-above"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> <div class="ucb-article-text d-flex align-items-center" itemprop="articleBody"> <div><div class="ucb-box ucb-box-title-hidden ucb-box-alignment-right ucb-box-style-fill ucb-box-theme-black"><div class="ucb-box-inner"><div class="ucb-box-title">&nbsp;</div><div class="ucb-box-content"><h2><strong>Robotics Degree Programs</strong></h2><ul><li><a href="/program/robotics/academics/doctor-philosophy" rel="nofollow">Doctor of Philosophy</a></li><li><a href="/program/robotics/academics/master-science-thesis" rel="nofollow">Master of Science (Thesis)</a></li><li><a href="/program/robotics/academics/master-science-non-thesis" rel="nofollow">Master of Science (Non-Thesis)</a></li></ul><h3><strong>Program Requirements</strong></h3><ul><li>30 credit hours</li><li>1 required course - Introduction to Robotics</li><li>43 course options</li><li>30 dissertation hours (PhD)</li><li>4-6 dissertation hours (Thesis Master’s)</li></ul></div></div></div><p>The University of Colorado 鶹ӰԺ has started a graduate engineering program in robotics to fill a growing need in an in-demand field.</p><p>The CU Regents have approved new Master of Science and PhD degree options in robotics that will provide students with a flexible education merging hardware and software engineering, mathematics and artificial intelligence into a single program.</p><p>“Demand is so high for degrees like this across the country; it’s something students and employers really want,” said <a href="/program/robotics/node/29" rel="nofollow">Sean 
Humbert,</a> director of the Robotics Program and a professor in the Paul M. Rady Department of Mechanical Engineering.</p><p>The program brings together a wide array of faculty, research and <a href="/program/robotics/academics/courses" rel="nofollow">class options</a> from the College of Engineering and Applied Science, according to <a href="/program/robotics/node/30" rel="nofollow">Chris Heckman,</a> an associate professor of computer science in the robotics program.</p><p>“The workforce in robotics is often siloed, with people only being specialists in certain elements. We want students to be able to work across the field in computer science, mechanical, electrical, aerospace, wherever they need to be,” Heckman said.</p><p>Students enrolled in the program can choose from more than 40 courses taught by leading researchers with strong expertise in key areas, including field robotics, reasoning and assurance, smart materials, human-centered robotics and biomedical robotics.</p><p>“CU 鶹ӰԺ is really strong in robotics, and now we’re bringing together all that expertise,” Humbert said. “This field is so interdisciplinary, and we have strong connections and teams both within the university and in industry and the public sector.”</p><p>鶹ӰԺ and Colorado’s Front Range are home to many businesses active in robotics, providing educational partnerships and career options for students and graduates, according to <a href="/program/robotics/node/46" rel="nofollow">Alessandro Roncone,</a> associate director of the Robotics Program and an assistant professor of computer science.</p><p>“This program positions students at the nexus of innovative research and real-world application. Not only will they be taught by leading experts in the field, but they'll also have the opportunity to become leaders in robotics and AI. 
We are committed to fostering creativity and innovation, and our strong tech ecosystem locally provides an unparalleled environment for growth and discovery,” Roncone said.</p><p>In addition to a research-focused PhD, students enrolled in the master’s program can choose from thesis and non-thesis options, providing graduates with opportunities in academia and technical leadership positions in large industry, startups, emergency services and government.</p><p>The program officially launched for the fall 2023 semester, with students transferring into the program from other CU Engineering graduate programs. Prospective students from outside the university will be welcomed starting in fall 2024. That application window is now open.</p></div> </div> </div> </div> </div> <div>The University of Colorado 鶹ӰԺ has started a graduate engineering program in robotics to fill a growing need in an in-demand field. The CU Regents have approved new Master of Science and PhD degree options in robotics that will...</div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/large_image_style/public/feature-title-image/MARBLE_robots_0.jpg?itok=qHf_TBO5" width="1500" height="1000" alt> </div> </div> <div>On</div> <div>White</div> Wed, 20 Sep 2023 16:28:02 +0000 Anonymous 103 at /program/robotics 3D display could soon bring touch to the digital world /program/robotics/2023/07/31/3d-display-could-soon-bring-touch-digital-world <span>3D display could soon bring touch to the digital world</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2023-07-31T14:31:00-06:00" title="Monday, July 31, 2023 - 14:31">Mon, 07/31/2023 - 14:31</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" 
src="/program/robotics/sites/default/files/styles/focal_image_wide/public/article-thumbnail/ncomms-23-05446b_featuredimage_0_png%281%29.jpg?h=196151e3&amp;itok=AEPK56au" width="1200" height="800" alt="A new, shape-shifting display can sense and respond to human touch."> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/program/robotics/taxonomy/term/1"> Research News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/program/robotics/taxonomy/term/28" hreflang="en">Mark Rentschler News</a> <a href="/program/robotics/taxonomy/term/12" hreflang="en">Sean Humbert News</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-row-subrow row"> <div class="ucb-article-text col-lg d-flex align-items-center" itemprop="articleBody"> <div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Imagine an iPad that’s more than just an iPad—with a surface that can morph and deform, allowing you to draw 3D designs, create haiku that jump out from the screen and even hold your partner’s hand from an ocean away. &nbsp;</p><p>That’s the vision of a team of engineers from CU&nbsp;鶹ӰԺ. In a new study, they’ve created a one-of-a-kind shape-shifting display that fits on a card table. The device is made from a 10-by-10 grid of soft robotic “muscles” that can sense outside pressure and pop up to create patterns. 
It’s precise enough to generate scrolling text and fast enough to shake a chemistry beaker filled with fluid.</p><p>It may also deliver something even rarer: the sense of touch in a digital age.</p><p>“As technology has progressed, we started with sending text over long distances, then audio and now video,” said Brian Johnson, one of two lead authors of the new study, who earned his doctorate in mechanical engineering at CU 鶹ӰԺ in 2022. “But we’re still missing touch.”</p><p>Johnson and his colleagues described their shape display July 31 <a href="https://www.nature.com/articles/s41467-023-39842-2" rel="nofollow">in the journal "Nature Communications."</a></p><p>The group’s innovation builds off a class of soft robots pioneered by a team led by Christoph Keplinger, formerly an assistant professor of mechanical engineering at CU 鶹ӰԺ. They’re called <a href="/today/2023/04/20/grad-student-helps-design-artificial-muscles-you-can-toss-compost-bin" rel="nofollow">Hydraulically Amplified Self-Healing ELectrostatic</a> (HASEL) actuators. The prototype display isn’t ready for the market yet. But the researchers envision that, one day, similar technologies could lead to sensory gloves for virtual gaming or a smart conveyor belt that can undulate to sort apples from bananas.</p><p>“You could imagine arranging these sensing and actuating cells into any number of different shapes and combinations,” said Mantas Naris, co-lead author of the paper and a doctoral student in the <a href="/mechanical" rel="nofollow">Paul M. Rady Department of Mechanical Engineering</a>. 
“There’s really no limit to what these technologies could, ultimately, lead to.”</p><h2>Playing the accordion</h2><p>The project has its origins in the search for a different kind of technology: synthetic organs.</p><p>In 2017, researchers led by Mark Rentschler, professor of mechanical engineering and biomedical engineering, secured funding from the National Science Foundation to develop what they call sTISSUE—squishy organs that behave and feel like real human body parts but are made entirely out of silicone-like materials. Co-investigators on the grant include Keplinger, now a director at the <a href="https://is.mpg.de/" rel="nofollow">Max Planck Institute for Intelligent Systems</a> in Germany; Nikolaus Correll, associate professor in the <a href="/cs" rel="nofollow">Department of Computer Science</a> at CU 鶹ӰԺ; and Sean Humbert, professor of mechanical engineering.</p><p>“You could use these artificial organs to help develop medical devices or surgical robotic tools for much less cost than using real animal tissue,” said Rentschler, a co-author of the new study.</p><p>In developing that technology, however, the team landed on the idea of a tabletop display. The research is part of the <a href="/mse/" rel="nofollow">Materials Science &amp; Engineering Program</a>.</p><p>The group’s design is about the size of a Scrabble game board and, like one of those boards, is composed of small squares arranged in a grid. In this case, each one of the 100 squares is an individual HASEL actuator. The actuators are made of plastic pouches shaped like tiny accordions. If you pass an electric current through them, fluid shifts around inside the pouches, causing the accordion to expand and jump up.&nbsp;</p><p>The actuators also include soft, magnetic sensors that can detect when you poke them. 
That allows for some fun activities, said Johnson, now a postdoctoral researcher at the Max Planck Institute for Intelligent Systems.</p><p>“Because the sensors are magnet-based, we can use a magnetic wand to draw on the surface of the display,” he said.</p><h2>Hear that?</h2><p>Other research teams have developed similar smart tablets, but the CU 鶹ӰԺ display is softer, takes up a lot less room and is much faster. Each of its robotic muscles can activate up to 50 times per second.&nbsp;</p><p>The researchers are focusing now on shrinking the actuators to increase the resolution of the display—almost like adding more pixels to a computer screen.</p><p>“Imagine if you could load an article onto your phone, and it renders as Braille on your screen,” Naris said.</p><p>The group is also working to flip the display inside out. That way, engineers could design a glove that pokes your fingertips, allowing you to “feel” objects in virtual reality.&nbsp;</p><p>And, Rentschler said, the display can bring something else: a little peace and quiet.&nbsp;</p><p>“Our system is, essentially, silent. 
The actuators make almost no noise.”&nbsp;</p><hr><p><em>Other CU 鶹ӰԺ co-authors of the new study include mechanical engineering graduate students Vani Sundaram, Angella Volchko and Khoi Ly; and alumni Shane Mitchell, Eric Acome and Nick Kellaris.</em></p></div></div></div><div class="element-max-width1"><div class="article-meta-wrapper"><div class="article-meta"><div class="article-meta-section article-meta-categories"><span>Categories:</span><div class="item-list"><ul><li><a href="/today/science-technology" rel="nofollow">Science &amp; Technology</a></li><li><a href="/today/news-headlines-articles" rel="nofollow">News Headlines</a></li></ul></div></div></div></div></div></div> </div> <div class="ucb-article-content-media ucb-article-content-media-right col-lg"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> </div> </div> </div> </div> <script> window.location.href = `/today/2023/07/31/3d-display-could-soon-bring-touch-digital-world`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Mon, 31 Jul 2023 20:31:00 +0000 Anonymous 73 at /program/robotics CU 鶹ӰԺ team takes home $500,000 in international underground robotics competition /program/robotics/2023/07/12/cu-boulder-team-takes-home-500000-international-underground-robotics-competition <span>CU 鶹ӰԺ team takes home $500,000 in international underground robotics competition </span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2023-07-12T12:39:39-06:00" title="Wednesday, July 12, 2023 - 12:39">Wed, 07/12/2023 - 12:39</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/focal_image_wide/public/article-thumbnail/louisville1_0_jpg.jpg?h=561546e2&amp;itok=NE3EVJnx" width="1200" height="800" 
alt="Students observing two robots underground."> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/program/robotics/taxonomy/term/4"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/program/robotics/taxonomy/term/12" hreflang="en">Sean Humbert News</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div><div class="advanced-article-content col-lg-8 col-md-8 col-sm-6 col-xs-12"><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>A CU 鶹ӰԺ team has taken home third place and $500,000 in prize money in an international competition that sends <a href="/today/2020/02/05/drones-go-underground-high-stakes-competition" rel="nofollow">teams of robots deep underground</a> to conduct search-and-rescue operations.</p><p>The CU 鶹ӰԺ group, made up of engineers from across the university, took part in the final event of the Defense Advanced Research Projects Agency (DARPA) <a href="https://www.subtchallenge.com/" rel="nofollow">Subterranean Challenge</a> from Sept. 21-24, 2021, in Louisville, Kentucky. 
The competition, now in its third year, has pushed the bounds of what autonomous vehicles are capable of: Over three tense challenges, teams from around the world launched fleets of robots into underground caves, mines, subway tunnels and more to complete a high-tech game of hide-and-seek—searching for lost “artifacts” amid hazards like dust, mud and steep drops.</p><p>The competition strives to develop new technologies that could one day safely locate the human survivors of disasters like mine and cave collapses.&nbsp;</p><p>“I couldn’t be prouder of all of our brilliant and talented graduate and undergraduate students that earned this award and international recognition,” said Sean Humbert, a professor in the Paul M. Rady Department of Mechanical Engineering who leads the CU 鶹ӰԺ team.&nbsp;</p><p>DARPA recognized the CU 鶹ӰԺ group, named Multi-agent Autonomy with Radar-Based Localization for Exploration (MARBLE), Friday, Sept. 24, at a prize ceremony in Kentucky.&nbsp;</p><p>MARBLE includes researchers from CU Denver, the University of California, Santa Cruz and the Massachusetts-based Scientific Systems Company, Inc. Researchers from CU 鶹ӰԺ also hailed from the Department of Computer Science&nbsp;and the Ann and H.J. Smead Department of Aerospace Engineering Sciences. 
In 2018, the group received $4.5 million to join in the challenge and competed in a cohort of six&nbsp;funded teams.&nbsp;</p><p>&nbsp;</p><div class="feature-layout-callout feature-layout-callout-small feature-layout-callout-float-right clearfix"><div class="feature-layout-callout-inner element-max-width-padding"><p class="hero"><a href="https://www.youtube.com/watch?v=SyjeIGCHnrU" rel="nofollow"><strong>&nbsp;Watch live coverage of the final event from DARPA</strong></a></p><p class="hero">&nbsp;</p></div></div><p>Stefanie Tompkins, director of DARPA, addressed the competitors at the awards ceremony: “I have heard some of you say off to the side and some of you say directly to my face that when they first heard about this, they were absolutely positive it was impossible,” she said. “So thank you for ignoring your gut feelings and diving into this competition and proving to all of us that it’s not impossible.”</p><p>For the final event, eight teams, including several unfunded groups, traveled to Kentucky’s famed Louisville Mega Cavern. Spanning nearly 100 acres, this former mine ranks as the biggest building in the state. It includes warehouses, tunnels and natural caverns—complete with dangers like stairs, rough pathways and DARPA-installed “dynamic” obstacles like falling debris.&nbsp;</p><p>To nab third place, MARBLE sent a group of four robots into this perilous environment, including two rolling vehicles and two dog-like robots manufactured by the company Boston Dynamics. The robots worked on their own to earn points by finding a series of targets, such as backpacks, cellphones, gas leaks and lost helmets.</p><p>And the group was able to adjust its strategy as it went.</p><p>“Early on at the final event, our team’s perspective began to shift to thinking about our system less as an autonomy experiment and more as a tool to interrogate subterranean environments,” Humbert said. 
“We started to incorporate additional human-robot interaction elements that resulted in an additional five to seven points in the final round, and ultimately a third-place finish.”</p><p>MARBLE earned 18 points, placing behind second-place finisher CSIRO Data61, led by the Commonwealth Scientific and Industrial Research Organization in Australia, with 23 points. First place and $2 million went to CERBERUS, led by ETH Zurich. MARBLE also earned a special recognition for finding an artifact, a red power drill, before any other team—just 68 seconds into the competition.&nbsp;</p></div></div></div></div><div class="advanced-article-advanced-content col-lg-4 col-md-4 col-sm-6 col-xs-12"><div class="article-adv-content-field advanced-article-field_adv_article_gallery"><div class="field field-name-field-adv-article-gallery field-type-entityreference field-label-hidden"><div class="field-items"><div class="field-item even"><div class="gallery-view-mode-embed node-view-mode-embed clearfix"><div class="gallery-view-mode-embed-content node-view-mode-embed-content"><h3><a href="/today/scenes-subterranean-challenge-finals" rel="nofollow">Scenes from the Subterranean Challenge finals</a></h3><div class="node-view-mode-embed-summary gallery-view-mode-embed-summary"><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even">From Sept. 
21-24, 2021, CU 鶹ӰԺ engineers took part in a high-stakes challenge that sent fleets of robots deep underground to search for lost "artifacts." <a href="/today/scenes-subterranean-challenge-finals" rel="nofollow">View all photos »</a></div></div></div></div><div class="view view-photo-gallery view-id-photo_gallery view-display-id-entity_view_2 clearfix view-dom-id-118c8addbe727948544a492a2710c622"><div class="view-content"><div class="views-row views-row-1 views-row-odd views-row-first"><div class="views-field views-field-field-photo"><div class="field-content"><a href="/today/sites/default/files/styles/large/public/gallery/louisville7.jpg?itok=0uUs75nB" rel="nofollow"></a></div></div></div><div class="views-row views-row-2 views-row-even"><div class="views-field views-field-field-photo"><div class="field-content"><a href="/today/sites/default/files/styles/large/public/gallery/louisville6.jpg?itok=L_DwLtAC" rel="nofollow"></a></div></div></div><div class="views-row views-row-3 views-row-odd"><div class="views-field views-field-field-photo"><div class="field-content"><a href="/today/sites/default/files/styles/large/public/gallery/louisville8.jpg?itok=ZuCPOFxu" rel="nofollow"></a></div></div></div><div class="views-row views-row-4 views-row-even"><div class="views-field views-field-field-photo"><div class="field-content"><a href="/today/sites/default/files/styles/large/public/gallery/louisville9.jpg?itok=poVS_sZy" rel="nofollow"></a></div></div></div><div class="views-row views-row-5 views-row-odd"><div class="views-field views-field-field-photo"><div class="field-content"><a href="/today/sites/default/files/styles/large/public/gallery/louisville4.jpg?itok=q0O5ZU4c" rel="nofollow"></a></div></div></div><div class="views-row views-row-6 views-row-even views-row-last"><div class="views-field views-field-field-photo"><div class="field-content"><a href="/today/sites/default/files/styles/large/public/gallery/louisville10.jpg?itok=YbAFHV1e" 
rel="nofollow"></a></div></div></div></div></div><a href="/today/scenes-subterranean-challenge-finals" rel="nofollow">View all photos</a></div></div></div></div></div></div></div></div> </div> </div> </div> </div> <script> window.location.href = `/today/2021/09/24/engineers-take-home-500000-international-underground-robotics-competition`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Wed, 12 Jul 2023 18:39:39 +0000 Anonymous 19 at /program/robotics