{"id":3313,"date":"2016-01-20T23:38:25","date_gmt":"2016-01-21T04:38:25","guid":{"rendered":"http:\/\/blog.dankohn.info\/?p=3313"},"modified":"2020-08-17T17:38:54","modified_gmt":"2020-08-17T17:38:54","slug":"lets-bring-rosie-home","status":"publish","type":"post","link":"http:\/\/blog.dankohn.info\/index.php\/2016\/01\/20\/lets-bring-rosie-home\/","title":{"rendered":"5 Challenges We Need to Solve for Home Robots"},"content":{"rendered":"<p>Let\u2019s Bring Rosie Home: 5 Challenges We Need to Solve for Home Robots<br \/>\nFrom <a href=\"http:\/\/spectrum.ieee.org\/automaton\/robotics\/home-robots\/lets-bring-rosie-home-5-challenges-we-need-to-solve-for-home-robots\/?utm_source=computerwise&#038;utm_medium=email&#038;utm_campaign=011916\">IEEE Spectrum<\/a><br \/>\nBy Shahin Farshchi<br \/>\nPosted 13 Jan 2016 | 16:25 GMT<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/blog.dan-kohn.us\/wp-content\/uploads\/2016\/01\/RosieAlamyENPA92-1452542705197-1452704447500-300x225.jpg\" alt=\"RosieAlamyENPA92-1452542705197-1452704447500\" width=\"300\" height=\"225\" class=\"aligncenter size-medium wp-image-3314\" srcset=\"http:\/\/blog.dankohn.info\/wp-content\/uploads\/2016\/01\/RosieAlamyENPA92-1452542705197-1452704447500-300x225.jpg 300w, http:\/\/blog.dankohn.info\/wp-content\/uploads\/2016\/01\/RosieAlamyENPA92-1452542705197-1452704447500.jpg 620w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/p>\n<p>Science fiction authors love the robot sidekick. R2-D2, Commander Data, and KITT\u2014just to name a few\u2014defined \u201cStar Wars,\u201d \u201cStar Trek,\u201d and \u201cKnight Rider,\u201d respectively, just as much as their human co-stars. While science has brought us many of the inventions dreamed of in sci-fi shows, one major human activity has remained low tech and a huge source of frustration: household chores. Why can\u2019t we have more robots helping us with our domestic tasks? 
That\u2019s a question that many roboticists and investors (myself included) have long been asking ourselves. Recently, we\u2019ve seen some promising developments in the home robotics space, including Jibo\u2019s successful financing and SoftBank\u2019s introduction of Pepper. Still, a capable, affordable robotic helper\u2014like Rosie, the robot maid from \u201cThe Jetsons\u201d\u2014remains a big technical and commercial challenge. Should robot makers focus on designs that are extensions of our smartphones (as Jibo seems to be doing), or do we need a clean-sheet approach towards building these elusive bots?<\/p>\n<p>Take a look at the machines in your home. If you remove the bells and whistles, home automation hasn\u2019t dramatically changed since the post\u2013World War II era. Appliances, such as washing machines, dishwashers, and air conditioners, seemed magical after WWII. Composed primarily of pumps, motors, and plumbing, they were simply extensions of innovations that emerged during the industrial revolution. It probably doesn\u2019t come as a surprise that industrial behemoths such as GE, Westinghouse, and AEG (now Electrolux) shepherded miniature versions of the machines used in factories into suburban homes. At the time, putting dirty clothes and dishes into a box from which they emerged clean was rather remarkable. To this day, the fundamental experience remains the same, with improvements revolving around reliability and efficiency. Features enabled by Internet-of-Things technologies are marginal at best, e.g., being able to log into your refrigerator or thermostat through your phone.<\/p>\n<p>But before wondering when we\u2019ll have home robots, it might be fair to ask: Do we even need them? Consider what you can already do just by tapping on your phone, thanks to a host of on-demand service startups. 
Instacart brings home the groceries; Handy and Super send professionals to fix or clean your home; Pager brings primary care, while HomeTeam does elderly care. (Disclosure: my company, Lux Capital, is an investor in Super, Pager, and HomeTeam.) So, again, why do we need robots to perform these services when humans seem to be doing them just fine? I don\u2019t think anyone has a compelling answer to that question today, and home robots will probably evolve and transform themselves over and over until they find their way into our homes. Indeed, automobiles had been around for decades before the Model T was born. The Apple IIs and PC clones of the early 1980s had difficulty justifying their lofty price tags to anyone who wasn\u2019t wealthy or a programmer. We should expect the same from our first home bots.<\/p>\n<p>So it might be helpful to examine what problems engineers need to crack before they can attempt to build something like Rosie the robot. Below I discuss five areas that I believe need significant advances if we want to move the whole home robot field forward.<\/p>\n<p>1. We Need Machine-Human Interfaces<\/p>\n<p>Siri and Amazon\u2019s Alexa demonstrate how far speech recognition and natural language processing have come. Unfortunately, they are no more than a human-machine interface, designed to displace the keyboard and mouse. What we need is a machine-human interface. Where is the distinction? It starts with understanding people, rather than aggregating data and using statistical patterns to make inferences. It would understand our moods and emotional contexts, as an artificial intelligence would. Humans do not interact with one another through a series of commands (well, maybe some do); they establish a connection, and once a computer can take on that role, then we have a true machine-human interface. 
Scientists are starting to tackle this by applying concepts used in programming toward establishing rules for robot-human conversations, but we\u2019ll need much more if we want to have engaging AI assistants like the one in the movie \u201cHer.\u201d<\/p>\n<p>2. Cheap Sensors Need to Get Cheaper<\/p>\n<p>Driverless cars will generate hard cash for their operators, so forking over thousands for an array of lidar, radar, ultrasound, and cameras is a no-brainer. Home robots, however, may need to fit the ever-discretionary consumer budget. The array of sensors a robot would need to properly perceive its environment could render it cost prohibitive unless those sensors cost pennies, as they do in mobile phones. MEMS technology dramatically lowered the cost of inertial sensors, which previously cost thousands of dollars and were relegated to aircraft and spacecraft. Can computer vision applied to an array of cheap cameras and infrared sensors provide adequate sensing capability? And can we expect lidar to come down in price, or do we need a whole new sensing technology? A startup called Dual Aperture has added a second aperture for infrared, creating the ability to infer short distances. Meanwhile, DARPA is funding research on chip-based lidar, and Quanergy expects to launch a solid-state optical phased array, thereby eliminating the mechanical components that raise the cost of lidar. We expect engineers to find creative ways to reduce the cost of existing sensing technologies, obviate others altogether, and hopefully make sensors as cheap as those in our phones today.<\/p>\n<p>3. Manipulators Need to Get a Grip<\/p>\n<p>Loose objects find their way into our homes because they are easy to manipulate with our hands. If we expect a robot to be able to clean and organize these objects as efficiently as humans do, it needs manipulators that are at least as effective as the human hand. 
Companies such as Robotiq, Right Hand Robotics, and Soft Robotics, among others, have designed efficient and reliable manipulators. Though air-powered inflatable grippers have the advantage of being soft and lightweight, they do require a pump, which is not very practical for a mobile robot. Efforts funded by DARPA at iRobot, SRI, and other labs and companies seem to be taking us in the right direction, helping robots get a grip.<\/p>\n<p>4. Robots Need to Handle Arbitrary Objects<\/p>\n<p>Opening doors, flipping switches, and cleaning up scattered toys are simple tasks for us humans, but compute-intensive for machines today. A robot like the Roomba performs just two tasks: running a suction motor and generating a path along what\u2019s expected to be a flat surface with rigid obstacles. How about washing dishes or folding laundry? These tasks require a suite of capabilities: recognizing objects, identifying grasping points, understanding how an object will interact with other objects, and even predicting the consequences of being wrong. DARPA, NSF, NASA, and European Union science funding agencies are sponsoring much-needed research in this area, but \u201csolving manipulation\u201d will probably require leveraging a number of different technologies, including cloud robotics and deep learning.<\/p>\n<p>5. Navigating Unstructured Environments Needs to Become Routine<\/p>\n<p>Anyone who saw this year\u2019s DARPA Robotics Challenge would appreciate how difficult a problem it is to navigate and manipulate an unstructured, unknown environment. Those robots were slow. Though driverless cars pose a formidable challenge, that problem has proven to be more tractable. Deep learning techniques can help robots distinguish soft objects from hard obstructions, and human assistants may be able to \u201cteach\u201d robots until the algorithms take over. 
Like the manipulation problem, real-time navigation requires robots to quickly sense, perceive, and execute\u2014probably several orders of magnitude faster than the DRC-winning team does today. Full autonomy won\u2019t happen overnight, but that isn\u2019t a problem: humans can help robots get out of a bind. Willow Garage was a pioneer with its Heaphy Project, which crowdsourced assistance to robots among remote operators. More robotics and industrial automation companies are embracing the notion of humans overseeing robots, with the expectation of going from the (superfluous) 1:1 human-robot ratio to a single operator being able to oversee\/assist many robots.<\/p>\n<p>Shahin Farshchi is a partner at Lux Capital, where he invests in hardware and robotics companies. Follow him on Twitter: @farshchi<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Let\u2019s Bring Rosie Home: 5 Challenges We Need to Solve for Home Robots From IEEE Spectrum By Shahin Farshchi Posted 13 Jan 2016 | 16:25 GMT Science fiction authors love the robot sidekick. 
R2-D2, Commander Data, and KITT\u2014just to name &hellip; <a href=\"http:\/\/blog.dankohn.info\/index.php\/2016\/01\/20\/lets-bring-rosie-home\/\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3,5],"tags":[],"class_list":["post-3313","post","type-post","status-publish","format-standard","hentry","category-ieee","category-robot-news"],"_links":{"self":[{"href":"http:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/posts\/3313","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/comments?post=3313"}],"version-history":[{"count":1,"href":"http:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/posts\/3313\/revisions"}],"predecessor-version":[{"id":4987,"href":"http:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/posts\/3313\/revisions\/4987"}],"wp:attachment":[{"href":"http:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/media?parent=3313"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/categories?post=3313"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/blog.dankohn.info\/index.php\/wp-json\/wp\/v2\/tags?post=3313"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}