iSAM2: Incremental smoothing and mapping using the Bayes tree

dc.contributor Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
dc.contributor Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
dc.contributor Massachusetts Institute of Technology. Department of Mechanical Engineering
dc.contributor Kaess, Michael
dc.contributor Johannsson, Hordur
dc.contributor Leonard, John Joseph
dc.creator Kaess, Michael
dc.creator Johannsson, Hordur
dc.creator Roberts, Richard
dc.creator Ila, Viorela
dc.creator Leonard, John Joseph
dc.creator Dellaert, Frank
dc.date 2013-05-14T20:17:26Z
dc.date 2011-12
dc.date.accessioned 2023-03-01T18:09:26Z
dc.date.available 2023-03-01T18:09:26Z
dc.identifier 0278-3649
dc.identifier 1741-3176
dc.identifier http://hdl.handle.net/1721.1/78894
dc.identifier Kaess, M. et al. “iSAM2: Incremental Smoothing and Mapping Using the Bayes Tree.” The International Journal of Robotics Research 31.2 (2011): 216–235.
dc.identifier https://orcid.org/0000-0002-8863-6550
dc.identifier.uri http://localhost:8080/xmlui/handle/CUHPOERS/278965
dc.description We present a novel data structure, the Bayes tree, that provides an algorithmic foundation enabling a better understanding of existing graphical model inference algorithms and their connection to sparse matrix factorization methods. Similar to a clique tree, a Bayes tree encodes a factored probability density, but unlike the clique tree it is directed and maps more naturally to the square root information matrix of the simultaneous localization and mapping (SLAM) problem. In this paper, we highlight three insights provided by our new data structure. First, the Bayes tree provides a better understanding of the matrix factorization in terms of probability densities. Second, we show how the fairly abstract updates to a matrix factorization translate to a simple editing of the Bayes tree and its conditional densities. Third, we apply the Bayes tree to obtain a completely novel algorithm for sparse nonlinear incremental optimization, named iSAM2, which achieves improvements in efficiency through incremental variable re-ordering and fluid relinearization, eliminating the need for periodic batch steps. We analyze various properties of iSAM2 in detail, and show on a range of real and simulated datasets that our algorithm compares favorably with other recent mapping algorithms in both quality and efficiency.
dc.description United States. Office of Naval Research (Grant N00014-06-1-0043)
dc.description United States. Office of Naval Research (Grant N00014-10-1-0936)
dc.format application/pdf
dc.language en_US
dc.publisher Sage Publications
dc.relation http://dx.doi.org/10.1177/0278364911430419
dc.relation International Journal of Robotics Research
dc.rights Creative Commons Attribution-Noncommercial-Share Alike 3.0
dc.rights http://creativecommons.org/licenses/by-nc-sa/3.0/
dc.source MIT web domain
dc.title iSAM2: Incremental smoothing and mapping using the Bayes tree
dc.type Article
dc.type http://purl.org/eprint/type/JournalArticle
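
The abstract above describes incremental nonlinear optimization through local edits of the Bayes tree, with incremental variable re-ordering and fluid relinearization replacing periodic batch steps. As a purely illustrative sketch, the Python snippet below uses the ISAM2 class from the open-source GTSAM library, which implements this algorithm; the pose values, odometry measurements, noise sigmas, and integer keys are assumptions made up for this example, not values from the paper.

# Minimal sketch of incremental smoothing with GTSAM's ISAM2 class.
# All poses, odometry measurements, and noise sigmas below are illustrative
# assumptions, not values taken from the paper.
import numpy as np
import gtsam

isam = gtsam.ISAM2()

# Step 0: a prior factor on the first pose anchors the trajectory.
graph = gtsam.NonlinearFactorGraph()
initial = gtsam.Values()
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.1, 0.1, 0.05]))
graph.add(gtsam.PriorFactorPose2(0, gtsam.Pose2(0.0, 0.0, 0.0), prior_noise))
initial.insert(0, gtsam.Pose2(0.0, 0.0, 0.0))
isam.update(graph, initial)

# Each subsequent step passes only the new factor and the new initial guess;
# ISAM2 updates the Bayes tree incrementally rather than re-solving in batch.
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.2, 0.2, 0.1]))
for k in range(1, 5):
    graph = gtsam.NonlinearFactorGraph()
    initial = gtsam.Values()
    graph.add(gtsam.BetweenFactorPose2(k - 1, k, gtsam.Pose2(1.0, 0.0, 0.0), odom_noise))
    initial.insert(k, gtsam.Pose2(float(k), 0.0, 0.0))
    isam.update(graph, initial)

# Recover the current estimate for all poses after the incremental updates.
estimate = isam.calculateEstimate()
print(estimate)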


Files in this item

Leonard_iSAM2.pdf (1.693 MB, application/pdf)
