Please use this identifier to cite or link to this item: http://hdl.handle.net/20.500.12188/19807
DC Field                  Value                                      Language
dc.contributor.author     Mirchev, Miroslav                          en_US
dc.contributor.author     Petrovski, Kristijan                       en_US
dc.contributor.author     Jovanovski, Stole                          en_US
dc.contributor.author     Basnarkov, Lasko                           en_US
dc.date.accessioned       2022-06-28T09:48:23Z
dc.date.available         2022-06-28T09:48:23Z
dc.date.issued            2016-09-05
dc.identifier.uri         http://hdl.handle.net/20.500.12188/19807
dc.description.abstract   In this work we analyze robot motion data from the UTIAS Multi-Robot Dataset. The dataset contains recordings of robots wandering in a confined environment with randomly spaced static landmarks. After some preprocessing of the data, an algorithm based on the Extended Kalman Filter is developed to determine the positions of the robots at every instant of time using the positions of the landmarks. The algorithm takes the asynchronous time steps and the sparse measurement data into account when forming its estimates. These estimates are then compared with the ground-truth data provided in the same dataset. Furthermore, several methods of noise estimation are tested, which reduce the estimation error for some robots. (en_US)
dc.publisher              Springer, Cham                             en_US
dc.subject                robot localization · Extended Kalman Filter · noise estimation · real-world data   en_US
dc.title                  On the Kalman filter approach for localization of mobile robots   en_US
dc.type                   Proceeding article                         en_US
dc.relation.conference    International Conference on ICT Innovations   en_US
item.fulltext             With Fulltext
item.grantfulltext        open
crisitem.author.dept      Faculty of Computer Science and Engineering
crisitem.author.dept      Faculty of Computer Science and Engineering
Appears in Collections: Faculty of Computer Science and Engineering: Conference papers

Files in This Item:
File                 Description   Size      Format
trud_newest14.pdf                  1.17 MB   Adobe PDF

Page view(s): 46 (checked on Aug 9, 2024)
Download(s): 8 (checked on Aug 9, 2024)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
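The abstract above describes landmark-based localization with an Extended Kalman Filter. The paper's own implementation is not part of this record, so the following is only a minimal sketch of the standard EKF prediction/update cycle for a planar robot with known landmark positions. The unicycle motion model, the range-bearing measurement model, the function names, and the noise matrices Q and R are all illustrative assumptions, not taken from the paper; the variable time step `dt` is the one concession to the dataset's asynchronous measurements mentioned in the abstract.

```python
# Hedged sketch of landmark-based EKF localization (not the paper's code).
# State: (x, y, theta); landmarks are at known positions; dt may vary per step.
import numpy as np

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def ekf_predict(mu, Sigma, v, w, dt, Q):
    """Propagate the pose with a unicycle motion model over a step of
    (possibly varying) length dt; Q is an assumed process-noise covariance."""
    x, y, th = mu
    mu_new = np.array([x + v * dt * np.cos(th),
                       y + v * dt * np.sin(th),
                       wrap(th + w * dt)])
    # Jacobian of the motion model with respect to the state
    G = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    return mu_new, G @ Sigma @ G.T + Q

def ekf_update(mu, Sigma, z, landmark, R):
    """Correct the pose with one range-bearing measurement z = (r, phi)
    to a landmark at a known 2-D position; R is a measurement-noise guess."""
    dx, dy = landmark[0] - mu[0], landmark[1] - mu[1]
    q = dx * dx + dy * dy
    z_hat = np.array([np.sqrt(q), wrap(np.arctan2(dy, dx) - mu[2])])
    # Jacobian of the measurement model with respect to the state
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q), 0.0],
                  [ dy / q,          -dx / q,         -1.0]])
    S = H @ Sigma @ H.T + R          # innovation covariance
    K = Sigma @ H.T @ np.linalg.inv(S)  # Kalman gain
    innov = z - z_hat
    innov[1] = wrap(innov[1])        # keep the bearing residual in range
    mu_new = mu + K @ innov
    mu_new[2] = wrap(mu_new[2])
    return mu_new, (np.eye(3) - K @ H) @ Sigma
```

A sparse-measurement regime like the one in the abstract simply means calling `ekf_predict` on every odometry step and `ekf_update` only when a landmark observation arrives; between observations the covariance grows, and each update shrinks it again.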