# Multimodal incremental Smoothing and Mapping Algorithm

Work in progress: placeholder for details on how the approximate sum-product inference algorithm (mmiSAM) works. Until then, see the related literature for more details.
The algorithm combats the so-called curse of dimensionality on the basis of eight principles outlined in the thesis "Multimodal and Inertial Sensor Solutions to Navigation-type Factor Graphs".
## Joint Probability

### General Factor Graph – i.e. non-Gaussian and multi-modal
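As a brief, generic sketch of the notation (the standard factor graph factorization, not tied to a particular implementation), the joint posterior over all variables given all measurements factorizes as a product of factor potentials, each of which may be non-Gaussian and multi-modal:

```math
p(\Theta \mid Z) \propto \prod_{i} \phi_i\left(\Theta_i ; Z_i\right),
```

where each factor depends only on the subset of variables connected to it in the graph.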
## Inference on Bayes/Junction/Elimination Tree
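As a rough, self-contained illustration of where the tree comes from (a toy sketch, not the library's implementation; the `eliminate` function and the example graph are hypothetical), symbolic variable elimination on a factor graph produces frontal-variable/separator pairs, which are the cliques assembled into the Bayes (junction) tree:

```julia
# Toy symbolic elimination on a tiny factor graph: a three-pose chain with one
# loop closure.  Each factor is represented only by the set of variables it touches.
factors = [Set([:x1]),           # prior on x1
           Set([:x1, :x2]),      # odometry between x1 and x2
           Set([:x2, :x3]),      # odometry between x2 and x3
           Set([:x1, :x3])]      # loop closure between x1 and x3

# Eliminate the variables in `order`; each step records (frontal, separator),
# i.e. the clique structure of the resulting Bayes/junction tree.
function eliminate(factors, order)
    cliques = Tuple{Symbol,Set{Symbol}}[]
    work = copy(factors)
    for v in order
        touching = [f for f in work if v in f]
        isempty(touching) && continue
        sep = setdiff(union(touching...), Set([v]))   # remaining variables form the separator
        push!(cliques, (v, sep))
        work = [f for f in work if !(v in f)]         # drop the eliminated factors ...
        isempty(sep) || push!(work, copy(sep))        # ... and add the induced factor on the separator
    end
    return cliques
end

for (frontal, sep) in eliminate(factors, [:x1, :x2, :x3])
    println("frontal variable ", frontal, " with separator ", sort!(collect(sep)))
end
# frontal variable x1 with separator [:x2, :x3]
# frontal variable x2 with separator [:x3]
# frontal variable x3 with separator Symbol[]
```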

### Focussing Computation on Tree

#### Incremental Updates

Recycling of computations from earlier solutions.

#### Fixed-Lag operation

Also supports mixed-priority solving.

#### Federated Tree Solution (Multi session/agent)

Tentatively, see the multisession page.
## Chapman-Kolmogorov (Belief Propagation / Sum-product)

The main computational effort is to focus compute cycles on the dominant modes exhibited by the data, dropping low-likelihood modes (although not indefinitely) without sacrificing the accuracy of individual major features.
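For reference, in generic sum-product/Chapman-Kolmogorov notation (standard belief propagation, not a statement of the exact implementation), the updated belief over a variable is the product of the convolutions of its neighboring beliefs through the connecting factors:

```math
\overline{p}(x_j) \propto \prod_{i \in \mathcal{N}(j)} \int p\left(x_j \mid x_i, z_{ij}\right) \, p(x_i) \, \mathrm{d} x_i ,
```

and the functional products and convolution integrals in this expression are the quantities that the methods below approximate.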
### Mixture Parametric Method

Work in progress – deferred in favor of progress on the full functional methods, but a Gaussian legacy algorithm with mixture-model expansion is likely to be added in the near future.
### Sequential Nested Gibbs Method

Current default inference method.
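The sketch below is a heavily simplified one-dimensional illustration of the underlying idea (Gibbs sampling over kernel labels to draw from a product of sample-based beliefs, in the spirit of nonparametric belief propagation). It is not the package's implementation; the names `KernelBelief`, `gibbs_label!`, and `sample_product`, the equal kernel weights, the shared bandwidths, and the toy beliefs are all hypothetical.

```julia
using Statistics

# Simplified 1-D sketch: draw samples from the product of several kernel-density
# beliefs by Gibbs sampling over which kernel of each belief is "active".
# Assumes at least two beliefs, equal kernel weights, and Gaussian kernels.

struct KernelBelief
    means::Vector{Float64}   # kernel centers (the belief's samples)
    bw::Float64              # common kernel standard deviation (bandwidth)
end

# Resample the active-kernel label of belief `d`, conditioned on the kernels
# currently selected in all other beliefs.
function gibbs_label!(labels, beliefs, d)
    others = [(b.means[labels[k]], b.bw^2) for (k, b) in enumerate(beliefs) if k != d]
    # the product of the other selected Gaussian kernels is itself Gaussian
    w = sum(1 / vk for (_, vk) in others)
    m = sum(μk / vk for (μk, vk) in others) / w
    v = 1 / w
    b = beliefs[d]
    # weight each candidate kernel by its compatibility with that Gaussian product
    logw = [-(μ - m)^2 / (2 * (v + b.bw^2)) for μ in b.means]
    p = exp.(logw .- maximum(logw))
    p ./= sum(p)
    labels[d] = min(searchsortedfirst(cumsum(p), rand()), length(p))
end

# Run a few Gibbs sweeps over the labels, then draw one sample from the
# Gaussian product of the finally selected kernels.
function sample_product(beliefs; sweeps = 10)
    labels = [rand(1:length(b.means)) for b in beliefs]
    for _ in 1:sweeps, d in eachindex(beliefs)
        gibbs_label!(labels, beliefs, d)
    end
    w = sum(1 / b.bw^2 for b in beliefs)
    m = sum(b.means[labels[k]] / b.bw^2 for (k, b) in enumerate(beliefs)) / w
    return m + randn() / sqrt(w)
end

# Toy usage: a bimodal belief (modes near -3 and +3) times a unimodal belief near +3.
bA = KernelBelief(vcat(randn(50) .- 3, randn(50) .+ 3), 0.3)
bB = KernelBelief(randn(50) .+ 3, 0.3)
samples = [sample_product([bA, bB]) for _ in 1:200]
println("product concentrates near +3:  mean ≈ ", round(mean(samples), digits = 2))
```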
#### Deterministic Convolution Approximation

Proposal distributions are computed with trust-region Newton methods applied to analytical or numerical factor definitions.
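As a minimal one-dimensional sketch of the convolution step only: a plain Newton iteration with a numerical derivative stands in for the trust-region Newton solver mentioned above, and the scalar "offset" factor, `residual`, and `solve_sample` are purely hypothetical.

```julia
# For each sample of the incoming belief and each measurement draw, numerically
# solve the factor residual for the target variable; the solutions form the
# proposal (convolution) sample set.

residual(xj, xi, z) = (xj - xi) - z     # hypothetical analytical factor definition

function solve_sample(xi, z; x0 = 0.0, iters = 20, h = 1e-6)
    xj = x0
    for _ in 1:iters
        r = residual(xj, xi, z)
        abs(r) < 1e-10 && break
        dr = (residual(xj + h, xi, z) - r) / h   # numerical derivative of the residual
        xj -= r / dr                             # Newton step (no trust region here)
    end
    return xj
end

xi_samples = randn(100) .+ 2.0              # incoming belief over x_i, centered near 2
z_samples  = 1.0 .+ 0.1 .* randn(100)       # noisy "offset" measurements, z ≈ 1
xj_proposal = [solve_sample(xi, z) for (xi, z) in zip(xi_samples, z_samples)]
# xj_proposal approximates the convolved proposal over x_j, centered near 3
```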
#### Stochastic Product Approx of Infinite Functionals

See the mixed-manifold products presented in the literature section.

Writing in progress.
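As a brief motivating note (standard mixture algebra rather than anything specific to this implementation): the exact product of N kernel-density beliefs, each with M Gaussian kernels,

```math
\prod_{i=1}^{N} \left[ \frac{1}{M} \sum_{m=1}^{M} \mathcal{N}\left(x ; \mu_{i,m}, \sigma_i^2\right) \right] ,
```

contains M^N Gaussian components, which is why products of such functional belief approximations are themselves approximated stochastically, e.g. by the nested Gibbs sampling above.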
## Full Deterministic Chapman-Kolmogorov Super Product Method

Work in progress.