The fusion of information from sensors with different physical characteristics, such as sight, touch, and sound, enhances our understanding of the surroundings and provides the basis for planning, decision-making, and control of autonomous and intelligent machines.
The minimal representation approach to multisensor fusion uses an information measure as a universal yardstick for fusion. Given models of sensor uncertainty, the representation size guides the integration of widely varying types of data and maximizes the information contributed to a consistent...
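The idea of representation size as a yardstick can be illustrated with a minimal, MDL-style sketch (an assumption for illustration, not the approach's actual formulation): among candidate interpretations of the same sensor data, prefer the one whose total encoding cost, model parameters plus residuals under a Gaussian sensor-uncertainty model, is smallest. All names and numbers below are hypothetical.

```python
import math

def description_length(residuals, sigma, model_params, bits_per_param=32):
    """Total representation size in bits: the cost of encoding the model
    parameters plus the cost of encoding the sensor residuals under a
    Gaussian uncertainty model with known standard deviation sigma."""
    model_bits = model_params * bits_per_param
    # Negative log-likelihood of each residual, in bits (Gaussian):
    # -log2 p(r) = 0.5*log2(2*pi*sigma^2) + r^2 / (2*sigma^2*ln 2)
    data_bits = sum(
        0.5 * math.log2(2 * math.pi * sigma**2)
        + (r**2) / (2 * sigma**2 * math.log(2))
        for r in residuals
    )
    return model_bits + data_bits

def select_interpretation(candidates):
    """Pick the candidate interpretation with the smallest total
    representation size."""
    return min(candidates, key=lambda c: description_length(**c))

# Two hypothetical interpretations of the same observations: a simple
# model with larger residuals, and a complex model that fits tightly.
fit_simple = {"residuals": [0.9, -1.1, 1.0, -0.8], "sigma": 1.0, "model_params": 2}
fit_complex = {"residuals": [0.05, -0.04, 0.06, -0.05], "sigma": 1.0, "model_params": 6}

best = select_interpretation([fit_simple, fit_complex])
```

Here the extra parameters of the complex model cost more bits than they save in residual encoding, so the simpler interpretation wins; with noisier sensors or more data points the balance can tip the other way.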