Sensor Fusion



Pramod K. Varshney
Department of Electrical Engineering and Computer Science, Syracuse University, Syracuse, NY, USA

Synonyms

Multisensor data fusion

Related Concepts

 Data Fusion
 Information Fusion

Definition

Sensor fusion refers to systems, techniques, theory, and tools that exploit the synergy in the information acquired from multiple sensors to enhance system performance.

Background

Conventional systems used single sensors to monitor a phenomenon of interest and make inferences regarding it. Due to significant advances in sensing, networking, and computing technologies, multiple sensors are increasingly being used. This provides improved system performance, resulting in a better understanding of the phenomenon being monitored. In addition, distributed sensing improves robustness and extends spatial and temporal coverage while resulting in shorter response time [1–3]. In order to optimally fuse information acquired from different distributed sensing architectures, advances in theory and algorithm design are required.

© Springer Nature Switzerland AG 2020 K. Ikeuchi (ed.), Computer Vision, https://doi.org/10.1007/978-3-030-03243-2_301-1

Theory

Data from multiple sensors can be combined at three possible levels. In data-level fusion, raw sensor data are combined. This requires that data acquired from different sensors be commensurate, and the data must be transported to a fusion center for centralized processing. This approach has the potential of achieving the best possible performance, at the expense of large communication requirements. For noncommensurate data, either feature-level fusion or decision-level fusion is employed. In feature-level fusion, features are extracted from the data and then fused. In decision-level fusion, higher-level decisions such as detections and estimates are obtained based on data from individual sensors. These decisions are then fused at the fusion center. In feature-level fusion and decision-level fusion, data transmission requirements are lower, but the quality of the fused result degrades due to the data compression
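The contrast between data-level and decision-level fusion can be sketched on a toy detection task. All sensor readings, thresholds, and the majority-vote rule below are illustrative assumptions, not taken from the text:

```python
# Minimal sketch contrasting data-level and decision-level fusion
# for a toy "is a target present?" task. Readings and the threshold
# of 1.0 are hypothetical values chosen for illustration.

def data_level_fusion(readings, threshold=1.0):
    """Fuse raw (commensurate) samples first, then decide once centrally."""
    fused = sum(readings) / len(readings)  # combine raw data at the fusion center
    return fused > threshold

def decision_level_fusion(readings, threshold=1.0):
    """Each sensor decides locally; only 1-bit decisions reach the center."""
    decisions = [r > threshold for r in readings]  # local hard decisions
    return sum(decisions) > len(decisions) / 2     # majority vote at the center

readings = [1.2, 0.8, 1.5]  # hypothetical raw measurements from three sensors
print(data_level_fusion(readings))      # mean of raw data compared to threshold
print(decision_level_fusion(readings))  # majority vote over local decisions
```

Note that the two levels can disagree: a single very strong reading (e.g. `[2.9, 0.2, 0.2]`) pushes the raw-data average over the threshold, while the majority of local 1-bit decisions still says "absent" because the compression to one bit per sensor has discarded the reading's magnitude.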


involved in the feature extraction and decision-making processes. Several topologies for sensor fusion, such as parallel, serial, tree, and network topologies, can be used. The choice of topology is often application dependent, but the parallel topology is used quite commonly. Sensor fusion is employed to solve a number of generic problems that result in improved situational awareness for the phenomenon under observation. Object and event detection using multisensor data is carried out based on distributed detection theory and decision fusion [4–6]. For conditionally independent observations, a likelihood ratio-based quantizer is employed at the sensors, and the fusion rule is based on a weighted sum of the incoming quantized data. For parameter estimation or tracking problems, quantized data are fused at the fusion center [7, 8]. For tracking, a number of distributed filtering and track fusion algorithms are employed [9, 10]. When tracking multipl
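The weighted-sum fusion rule for conditionally independent 1-bit sensor decisions can be sketched as follows. Each decision is weighted by its log-likelihood ratio, computed from that sensor's false-alarm and miss probabilities; the probability values below are assumed for illustration:

```python
import math

# Sketch of a weighted-sum fusion rule for conditionally independent
# binary sensor decisions. Per-sensor false-alarm (pf) and miss (pm)
# probabilities are assumed values, not from the text.

def fuse_decisions(u, pf, pm, threshold=0.0):
    """Fuse 1-bit local decisions u_i in {0, 1} via a weighted sum.

    Each decision contributes its log-likelihood ratio, so more reliable
    sensors (low pf and pm) carry more weight in the global decision.
    """
    s = 0.0
    for ui, pfi, pmi in zip(u, pf, pm):
        if ui == 1:
            s += math.log((1.0 - pmi) / pfi)  # sensor reports "target present"
        else:
            s += math.log(pmi / (1.0 - pfi))  # sensor reports "target absent"
    return s > threshold  # declare a detection if the evidence is strong enough

# Three sensors of differing quality; the two more reliable ones detect.
u = [1, 1, 0]
pf = [0.05, 0.10, 0.30]  # false-alarm probabilities (assumed)
pm = [0.10, 0.20, 0.40]  # miss probabilities (assumed)
print(fuse_decisions(u, pf, pm))
```

With a zero threshold the rule declares a detection here, since the two reliable sensors' positive weights outweigh the noisy sensor's negative vote; a single positive report from the noisiest sensor alone would not suffice.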