Uncertainty-Aware Learning from Demonstrations in Multiple Contexts using Bayesian Neural Networks
| Authors | |
|---|---|
| Publication date | 2019 |
| Book title | 2019 International Conference on Robotics and Automation (ICRA) |
| Book subtitle | Montreal, Quebec, Canada, 20-24 May 2019 |
| ISBN | |
| ISBN (electronic) | |
| Event | 2019 IEEE International Conference on Robotics and Automation |
| Volume | |
| Issue number | 1 |
| Pages (from-to) | 768-774 |
| Publisher | [Piscataway, NJ]: IEEE |
| Organisations | |
| Abstract | Diversity of environments is a key challenge that causes learned robotic controllers to fail due to discrepancies between training and evaluation conditions. Training from demonstrations in varied conditions can mitigate, but not completely prevent, such failures. Learned controllers such as neural networks typically lack a notion of uncertainty that would allow one to diagnose an offset between training and testing conditions and potentially intervene. In this work, we propose to use Bayesian Neural Networks, which have such a notion of uncertainty. We show that uncertainty can be leveraged to consistently detect situations in high-dimensional simulated and real robotic domains in which the performance of the learned controller would be subpar. We also show that such an uncertainty-based solution allows making an informed decision about when to invoke a fallback strategy. One fallback strategy is to request more data. We empirically show that providing data only when requested results in increased data efficiency. |
| Document type | Conference contribution |
| Language | English |
| Published at | https://doi.org/10.1109/ICRA.2019.8794328 |
| Other links | http://www.proceedings.com/49859.html |
| Downloads | 08794328 (Final published version) |
| Permalink to this page | |
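The core idea in the abstract can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: it stands in for a Bayesian Neural Network with a toy ensemble of posterior weight samples for a linear model, uses the Monte Carlo spread of predictions as the uncertainty estimate, and triggers a fallback (e.g. requesting a demonstration) when that uncertainty exceeds a threshold. All names and values here (`THRESHOLD`, the weight distribution) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a BNN weight posterior: samples drawn around a learned mean.
# A real BNN would sample full network weights; here it is a 1-D linear model.
N_SAMPLES = 50
weights = rng.normal(2.0, 0.1, size=N_SAMPLES)   # posterior samples of the slope
biases = rng.normal(0.0, 0.05, size=N_SAMPLES)   # posterior samples of the bias

def predict_with_uncertainty(x):
    """Monte Carlo predictive mean and standard deviation over weight samples."""
    preds = weights * x + biases   # one prediction per posterior sample
    return preds.mean(), preds.std()

THRESHOLD = 1.0  # illustrative uncertainty threshold for invoking the fallback

def act_or_fallback(x):
    """Execute the controller's action only when predictive uncertainty is low."""
    mean, std = predict_with_uncertainty(x)
    if std > THRESHOLD:
        return "fallback: request demonstration"
    return f"execute action {mean:.2f}"
```

Because predictive spread grows with distance from the training regime, an in-distribution input (e.g. `x = 1.0`) yields low uncertainty and an action, while a far out-of-distribution input (e.g. `x = 100.0`) yields high uncertainty and the fallback, mirroring the paper's "request more data only when needed" strategy.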
