{"id":4157,"date":"2019-06-05T11:10:11","date_gmt":"2019-06-05T16:10:11","guid":{"rendered":"http:\/\/blog.dankohn.info\/?p=4157"},"modified":"2020-08-17T17:34:00","modified_gmt":"2020-08-17T17:34:00","slug":"ford-self-driving-vans-will-use-legged-robots-to-make-deliveries","status":"publish","type":"post","link":"https:\/\/blog.dankohn.info\/index.php\/2019\/06\/05\/ford-self-driving-vans-will-use-legged-robots-to-make-deliveries\/","title":{"rendered":"Ford Self-Driving Vans Will Use Legged Robots to Make Deliveries"},"content":{"rendered":"\n<p>from <a href=\"https:\/\/spectrum.ieee.org\/automaton\/robotics\/humanoids\/ford-self-driving-vans-will-use-legged-robots-to-make-deliveries?utm_campaign=roboticsnews-06-04-19&amp;utm_medium=email&amp;utm_source=roboticsnews&amp;mkt_tok=eyJpIjoiWlRBeU1XRXlObU5rT1RZMiIsInQiOiJuVE1handHNVErZzR6XC92ODdPYzBRbFNVc0s0MnhDVGV0WFwvNHlhQnp0OFB2cG9cL0NFY0hiaVdMUFwvSXRUTmtHeTlMZ3duVnpVbHoyeFFLN3BkbmZSbHNQUFlZNGwxQUpnM3U4R1lnZUtKTXBMdWc0SkNpZFRjWktYUGZWb3ExVWgifQ%3D%3D\">IEEE Spectrum<\/a> 22 May 2019<\/p>\n\n\n\n<p>Ford is adding legs to its robocars\u2014sort of.<\/p>\n\n\n\n<p>The automaker is announcing today that its fleet of autonomous \ndelivery vans will carry more than just packages: Riding along with the \nboxes in the back there will be a two-legged robot.<\/p>\n\n\n\n<p><a href=\"https:\/\/spectrum.ieee.org\/robotics\/humanoids\/building-robots-that-can-go-where-we-go\">Digit, Agility Robotics\u2019 humanoid unveiled earlier this year<\/a> on the cover of <em>IEEE Spectrum<\/em>,\n is designed to move in a more dynamic fashion than regular robots do, \nand it\u2019s able to walk over uneven terrain, climb stairs, and carry \n20-kilogram packages.<\/p>\n\n\n\n<p>Ford says in a <a href=\"https:\/\/medium.com\/@ford\/meet-digit-self-driving-delivery-last-mile-solution-418d9995bb97\">post on Medium<\/a>\n that Digit will bring boxes from the curb all the way to your doorstep,\n covering those last few meters that self-driving 
cars are unable to. \nThe company plans to launch a self-driving vehicle service in 2021.<\/p>\n\n\n\n<figure class=\"wp-block-embed-youtube wp-block-embed is-type-video is-provider-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\nhttps:\/\/youtu.be\/WHWciIxNK2c\n<\/div><\/figure>\n\n\n\n<p><a href=\"https:\/\/robots.ieee.org\/robots\/digit\/?utm_source=spectrum\">Digit<\/a>\n performs flawlessly in the video, although it wasn\u2019t operating fully \nautonomously. It&nbsp;was being teleoperated at a high level via commands \nlike \u201cwalk to this location,\u201d \u201cclimb the stairs,\u201d and \u201cput down the \nbox.\u201d We\u2019re told that Digit didn\u2019t fall over even once during filming, \nbut certainly a bigger challenge for the robot will be to perform this \nwell across the wide variety of homes that it may eventually have to \nhandle, with obstacles like inclined surfaces, different types of \nstairs, overgrown yards, gates, and wayward pets and\/or children.<\/p>\n\n\n\n<p>Having a vehicle serve as a base station provides a variety of \nadvantages for Digit. For example, Digit can get away with a much \nsmaller battery than&nbsp;most large humanoids, because it only really needs \nto operate for a few minutes at a time before returning to the vehicle \nto recharge as it drives to the next delivery stop. And while Digit \ncarries several stereo cameras and a lidar, it will have help from its \ncompanion robovan to do much of the mapping and path planning required \nto carry out a delivery. 
That\u2019s an advantage, Ford says, because its \nautonomous vehicles are equipped with much more powerful sensors and \ncomputers than Digit could carry alone.<\/p>\n\n\n\n<p>From the Medium post:<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\"><p><em>Digit itself will have just enough sensory power to travel \nthrough basic situations. If it comes across an unexpected obstacle, it \ncan send an image back to the vehicle and have the car figure out a \nsolution. The car could even send that information into the cloud and \nask for other systems to help Digit navigate its environment, providing \nmultiple levels of added assistance while keeping the robot light and \nnimble.<\/em><\/p><\/blockquote>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/spectrum.ieee.org\/image\/MzMwNjk5Mg.gif\" alt=\"Digit robot coming out of self-driving van\"\/><figcaption>\n     Image: Ford \n   <\/figcaption><\/figure>\n\n\n\n<p>This is a very interesting concept, and to learn more about it (and\n about how Digit will handle all the rest of this operation), we spoke \nwith&nbsp;Agility Robotics CEO&nbsp;<a href=\"https:\/\/www.linkedin.com\/in\/damion-shelton-1b5b90a\/\">Damion Shelton<\/a>.<\/p>\n\n\n\n<p><strong><em>IEEE Spectrum:<\/em> Offloading the sensing and \ncomputing required for autonomous navigation is a very interesting \nidea\u2014can you break down what will be done on the robot and what will be \ndone on the vehicle?&nbsp;<\/strong><\/p>\n\n\n\n<p><em>Damion Shelton: The exact split is still to be determined, but \nthe basic idea is to run things that require real-time (or close to it) \nprocessing on the robot, and push other tasks off-board. Examples of the\n former are things like footstep placement, low-level postural control, \nexecution of previously trained RL behaviors, and path-planning out to 3\n to 5 steps. 
Tasks that could be pushed to the vehicle include storage \nand retrieval of maps, training of RL behaviors, and initialization of \nthe robot\u2019s global pose during deployment. The initialization of global \npose is actually one of the most important things the vehicle can be \nused for, in our view. Absent that, Digit would need to build a local \nworld model from ground zero&nbsp;every time it gets out of the vehicle.<\/em><\/p>\n\n\n\n<p><strong>Having bipedal robots that are mechanically capable of \ntraversing semi-structured terrain is often very far from having bipedal\n robots that are actually able to reliably operate in semi-structured \nterrain without human supervision. How will you develop the confidence \nto deploy Digit in real-world use, and what are the biggest challenges \nyou\u2019ll need to solve?<\/strong><\/p>\n\n\n\n<p><em>We don\u2019t anticipate operating without human supervision for \nquite a while. The form that this takes will relax over time; initially,\n we would expect a human to be present in the immediate vicinity of the \nrobot during operation. After we\u2019re confident that the performance in a \nparticular geofenced area is reliable, direct monitoring could be \nreplaced with \u201ccall center\u201d style central monitoring, but that\u2019s a \nminimum of several years out. 
From the perspective of data gathering and\n continued refinement of both hardware and software, the fact that \nmonitoring is required in the immediate future isn\u2019t really a detriment.\n Particularly in collaborative applications\u2014say, where the robot is a \nlabor assistant to a delivery driver\u2014the additional cost to have a human\n partially in the loop is close to zero&nbsp;(since the driver is already \ndoing the work now).<\/em><\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/spectrum.ieee.org\/image\/MzMwNzExMw.jpeg\" alt=\"Digit delivering package from Ford autonomous vehicle\"\/><figcaption>\n     Photo: Ford and Agility Robotics \n   <\/figcaption><\/figure>\n\n\n\n<p><strong><a href=\"http:\/\/robots.ieee.org\/robots\/digit\/?utm_source=spectrum\">Digit<\/a>\n will likely have to interact with a variety of non-deterministic, \ndynamic obstacles, like other people or pets. How much of a concern is \nhaving reliable autonomy when there\u2019s potential for all kinds of \nunpredictable edge cases?<\/strong><\/p>\n\n\n\n<p><em>From a test deployment standpoint (tens to hundreds of robots \nin scale) our plan is to avoid edge cases that we\u2019re not able to handle \nand allow just enough uncertainty into the mix to keep our R&amp;D \nmoving forward. For the first 12 to 18 months of testing\u2014starting in \nearly 2020\u2014we anticipate pre-mapping and qualifying all of the \nenvironments we operate in. This is what the majority of autonomous \nvehicle&nbsp;companies have done: Geofence an area you understand, and get \ncomfortable there before expanding. It\u2019s certainly true that we won\u2019t be\n able to deal with a majority of the \u201chard problems\u201d in the world early \non, but we don\u2019t see that as a barrier to deployment. 
We don\u2019t need to \naddress the most difficult&nbsp;cases, since even the easiest 1\/10th of a \npercent of the market is enormous relative to any plausible sustained growth\n rate.<\/em><\/p>\n\n\n\n<p><em>But&nbsp;that\u2019s&nbsp;not to minimize the difficulty of the edge cases. \nYou\u2019re exactly correct that reliability in the real world is \nchallenging\u2014we hope that by getting Digits out in the world as soon as \npossible, we start to collect data on the hard problems even if we don\u2019t\n (yet) have a deployable solution.<\/em><\/p>\n\n\n\n<p><strong>Will Digit be able to interact with humans directly? What would those interactions look like?<\/strong><\/p>\n\n\n\n<p><em>We\u2019re not super focused on human-robot interaction problems, \nother than as they relate to mobility. In a perfect world, Digit blends \ninto the background and interactions are primarily non-verbal. You know \nthat other pedestrians aren\u2019t going to run into you on a sidewalk by \nhaving a mental model of posture, gait dynamics, and so on. We think a \nlot about those kinds of dynamic cues, but don\u2019t have plans to turn \nDigit into a witty conversationalist. That being said, the production \nversion of Digit is going to have a speaker on it, and a light display, \nboth of which can be used to provide minimalist feedback to the outside \nworld.<\/em><\/p>\n\n\n\n<p><strong>Is this the application you had in mind when you designed Digit? What other kinds of things would you like to see Digit doing?<\/strong><\/p>\n\n\n\n<p><em>Yes, at least in the sense that we believed from the beginning \nthat the best early market for Digit would be in logistics. It\u2019s a \nmarket that requires the mobility of legs (at least in the areas we\u2019re \nfocusing on) while not requiring super advanced AI (in \u201ceasy\u201d \nenvironments), FDA certification (e.g. in-home assistive robotics for \nthe elderly), or harsh environment operations (e.g. firefighting). 
\nBasically, if you can move through the world and carry a box, you\u2019ve \naddressed the absolute minimalist use case for logistics.<\/em><\/p>\n\n\n\n<p><em>Delivery services are a large and rapidly growing industry, \nwhich also gives us the ability to focus on a profitable use case from \nday one. Many of the \u201cdull, dirty, dangerous\u201d jobs that robots are usually\n targeted at are both quite challenging and relatively low volume. \nLegs have been talked about for years as a tool for disaster recovery, \nsearch and rescue, and so on, but these are enormously challenging \nenvironments to move through and the business case is hard to \nrationalize out of the gate. Conversely, if we have a fleet of Digits \nthat learn to move through the world with the large training set of \nlast-mile environments, and then simultaneously have the cost pressure \nand economy of scale of a commercial deployment, the odds of us then \nbeing able to offer a competitive product in more specialized markets \ngo up dramatically.<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>from IEEE Spectrum 22 May 2019 Ford is adding legs to its robocars\u2014sort of. 
The automaker is announcing today that its fleet of autonomous delivery vans will carry more than just packages: Riding along with the boxes in the back &hellip; <a href=\"https:\/\/blog.dankohn.info\/index.php\/2019\/06\/05\/ford-self-driving-vans-will-use-legged-robots-to-make-deliveries\/\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3,5],"tags":[],"class_list":["post-4157","post","type-post","status-publish","format-standard","hentry","category-ieee","category-robot-news"],"_links":{"self":[{"href":"https:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/posts\/4157","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/comments?post=4157"}],"version-history":[{"count":1,"href":"https:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/posts\/4157\/revisions"}],"predecessor-version":[{"id":4696,"href":"https:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/posts\/4157\/revisions\/4696"}],"wp:attachment":[{"href":"https:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/media?parent=4157"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/categories?post=4157"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/tags?post=4157"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}