Summary
Simultaneous localization and mapping (SLAM) is the computational problem of constructing or updating a map of an unknown environment while simultaneously keeping track of an agent's location within it. Although this initially appears to be a chicken-and-egg problem, several algorithms are known to solve it, at least approximately, in tractable time for certain environments. Popular approximate solution methods include the particle filter, the extended Kalman filter, covariance intersection, and GraphSLAM. SLAM algorithms are based on concepts in computational geometry and computer vision, and are used in robot navigation, robotic mapping, and odometry for virtual or augmented reality. SLAM algorithms are tailored to the available resources and are therefore aimed not at perfection but at operational compliance. Published approaches are employed in self-driving cars, unmanned aerial vehicles, autonomous underwater vehicles, planetary rovers, newer domestic robots, and even inside the human body.

Given a series of controls u_{1:t} and sensor observations o_{1:t} over discrete time steps t, the SLAM problem is to compute an estimate of the agent's state x_t and a map of the environment m_t. All quantities are usually probabilistic, so the objective is to compute the joint posterior

P(m_t, x_t | o_{1:t}, u_{1:t}).

Applying Bayes' rule gives a framework for sequentially updating the location posterior, given a map m and a transition function P(x_t | x_{t-1}, u_t):

P(x_t | o_{1:t}, u_{1:t}, m) \propto P(o_t | x_t, m) \sum_{x_{t-1}} P(x_t | x_{t-1}, u_t) P(x_{t-1} | o_{1:t-1}, u_{1:t-1}, m).

Similarly, the map can be updated sequentially, given the estimated trajectory, by

P(m | x_{1:t}, o_{1:t}, u_{1:t}) \propto P(o_t | x_t, m) P(m | x_{1:t-1}, o_{1:t-1}, u_{1:t-1}).

As in many inference problems, a solution for the two variables together can be found, at least to a local optimum, by alternating updates of the two beliefs in a form of expectation–maximization algorithm. Statistical techniques used to approximate the above equations include Kalman filters and particle filters; they provide an estimate of the posterior probability distribution for the pose of the robot and for the parameters of the map. Methods that conservatively approximate the above model using covariance intersection can avoid reliance on statistical independence assumptions, reducing algorithmic complexity for large-scale applications.
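To make the sequential location update concrete, the following is a minimal particle-filter sketch (not from the source). It holds the map m fixed, as in the alternating scheme above, and assumes hypothetical, application-specific callables motion_model and measurement_likelihood standing in for the transition model P(x_t | x_{t-1}, u_t) and the observation model P(o_t | x_t, m).

```python
import numpy as np

def particle_filter_step(particles, weights, control, observation, m,
                         motion_model, measurement_likelihood):
    """One sequential Bayes update of the pose belief P(x_t | o_{1:t}, u_{1:t}, m),
    represented by weighted particles.

    motion_model(particles, control): samples from P(x_t | x_{t-1}, u_t)  (placeholder)
    measurement_likelihood(particles, observation, m): per-particle P(o_t | x_t, m)  (placeholder)
    """
    # Prediction: propagate each particle through the transition model.
    particles = motion_model(particles, control)

    # Correction: reweight by the observation likelihood and normalize (the 1/Z factor).
    weights = weights * measurement_likelihood(particles, observation, m)
    weights = weights / np.sum(weights)

    # Resample when the effective sample size collapses.
    n = len(weights)
    if 1.0 / np.sum(weights ** 2) < n / 2:
        idx = np.random.choice(n, size=n, p=weights)
        particles, weights = particles[idx], np.full(n, 1.0 / n)

    return particles, weights
```

In a full particle-filter SLAM system the map estimate would be refreshed between such pose updates, which is where the alternation between the two beliefs comes in.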
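Covariance intersection fuses two estimates whose cross-correlation is unknown by taking a convex combination of their information (inverse-covariance) matrices. The sketch below (not from the source) shows the standard fusion rule for two Gaussian estimates; the function name covariance_intersection and the use of scipy to pick the weight are illustrative choices.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def covariance_intersection(xa, Pa, xb, Pb):
    """Fuse two consistent estimates (xa, Pa) and (xb, Pb) with unknown cross-correlation:
        P^-1     = w * Pa^-1 + (1 - w) * Pb^-1
        P^-1 @ x = w * Pa^-1 @ xa + (1 - w) * Pb^-1 @ xb
    The weight w in [0, 1] is chosen here to minimize the trace of the fused covariance.
    """
    Pa_inv, Pb_inv = np.linalg.inv(Pa), np.linalg.inv(Pb)

    def fused_trace(w):
        return np.trace(np.linalg.inv(w * Pa_inv + (1 - w) * Pb_inv))

    w = minimize_scalar(fused_trace, bounds=(0.0, 1.0), method="bounded").x
    P = np.linalg.inv(w * Pa_inv + (1 - w) * Pb_inv)
    x = P @ (w * Pa_inv @ xa + (1 - w) * Pb_inv @ xb)
    return x, P
```

The fused covariance is consistent for any weight in [0, 1] as long as both inputs are consistent; minimizing the trace is one common criterion, minimizing the determinant is another.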