In our previous blog, Tips to Power Through Sensor Development Challenges, we saw how sensors provide approximate measurements of real-world phenomena. Sensors can exhibit inaccuracies stemming from their design, environmental conditions, and other factors. We also touched upon solutions for overcoming these challenges, such as sensor fusion: the practice of gathering and processing data from multiple sensors.
In this blog we’ll dive deeper into sensor fusion to look at some use cases and explore implementation options.
How Can We Get More from Our Sensors?
As developers, our goal is to improve upon the often noisy and inaccurate data returned from sensors. Sensor fusion is a common solution that feeds multiple, often different, types of data into algorithms to produce better data. It turns out that sensor fusion can also do a lot more.
With the right algorithm, sensor fusion can aid in making predictions, generate inferences from incomplete data, introduce redundancy and fault tolerance, extrapolate human-like and contextual information, and set the stage for more sophisticated data analysis.
Applications for sensor fusion span multiple areas of technology including business, robotics, IoT, gaming, and transportation. For example, a game can fuse data from sensors measuring a player’s heartbeat and movements to infer their current mood. An app can approximate a GPS location when GPS coverage is lost by using other sensors and historical movement data. And the spatial coverage of a fighter jet’s radar can be extended by fusing radar data from other nearby jets flying in formation. The characteristics of sensor fusion can be categorized in different ways.
Types of Fusion
Data sources for fusion can be direct or indirect.
- Direct fusion combines data acquired directly from multiple sensors which may be similar (i.e., they all measure the same phenomenon), or different (e.g., different sensor types, virtual sensors, historical data, etc.).
- Indirect fusion utilizes historical data and/or known properties of the environment and human inputs to produce refined data.
Fusion processes can be categorized as low level, intermediate, and high level.
- Low-level fusion combines data from multiple sources to produce refined data.
- Intermediate-level fusion identifies features from data.
- High-level fusion combines refined data from previous data processes (e.g., chaining fusion algorithms together).
A sensor configuration can also be categorized as complementary or competitive.
- Complementary: this configuration involves independent sensors, which may measure the same or different phenomena, whose data is combined to provide a more complete measurement. For example, accelerometer and gyroscope data may be fused to identify the steps a user takes as they walk.
- Competitive: this configuration involves multiple sensors that measure the same phenomenon. For example, aircraft often have multiple speed sensors to provide redundancy and fault tolerance.
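As a rough sketch of how a competitive configuration can be fused, redundant readings of the same phenomenon can be combined with an inverse-variance weighted average so that noisier sensors contribute less. The sensor values and noise figures below are made up for illustration and are not from any real aircraft system:

```python
# Competitive fusion sketch: combine redundant sensors measuring the same
# phenomenon using inverse-variance weighting (illustrative values only).

def fuse_competitive(readings):
    """readings: list of (measurement, variance) tuples from sensors that
    all measure the same quantity. Returns (fused_value, fused_variance)."""
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    fused = sum(w * m for (m, _), w in zip(readings, weights)) / total
    # The fused variance is smaller than any single sensor's variance,
    # which is the redundancy benefit of a competitive configuration.
    return fused, 1.0 / total

# Three hypothetical airspeed sensors; the third is much noisier,
# so it is weighted down in the fused result.
speed, variance = fuse_competitive([(250.0, 1.0), (252.0, 1.0), (260.0, 25.0)])
```

Note how the fused variance (about 0.49 here) is lower than the best individual sensor's variance of 1.0, so the combined estimate is more trustworthy than any single input.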
Overview of Fusion Algorithms
A fusion algorithm is the key component in sensor fusion because it takes the sensor data as input, then outputs refined data that is often more informative, accurate, and useful than the data from any individual input. Algorithms can vary by the number of inputs and outputs and can be chained together such that each successive algorithm further refines the data.
An algorithm may involve one or more of the following phases:
- Smoothing: multiple measurements are used to estimate the value of a process variable (e.g., a GPS position), either in real time or offline.
- Filtering: the state of a process entity (e.g., current speed) is estimated in real time from current and past measurements.
- Prediction (state estimation): historical measurements (e.g., speed and direction) are analyzed in real time to predict a state (e.g., a future GPS position).
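As a minimal sketch of the smoothing phase, an exponential moving average damps noise across a stream of measurements. The smoothing factor and sample data below are illustrative assumptions, not values from any particular sensor:

```python
# Smoothing sketch: exponential moving average over noisy samples.
# alpha is a hypothetical tuning value; lower alpha = heavier smoothing.

def ema_smooth(samples, alpha=0.3):
    """Return an exponentially smoothed copy of samples."""
    smoothed = []
    estimate = samples[0]  # seed with the first measurement
    for s in samples:
        # Blend the new sample with the running estimate.
        estimate = alpha * s + (1.0 - alpha) * estimate
        smoothed.append(estimate)
    return smoothed

noisy = [10.0, 12.5, 9.0, 11.5, 10.2]   # illustrative raw readings
smooth = ema_smooth(noisy)
```

The smoothed series varies over a narrower range than the raw readings, which is exactly the noise reduction a smoothing phase is meant to provide before later fusion stages consume the data.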
Fusion algorithms are sometimes based on formal fusion models, such as the JDL fusion model by the US Joint Directors of Laboratories, and can be implemented on platforms like the Qualcomm® Snapdragon™ 845 and its Qualcomm® Kryo™ CPU or Qualcomm® Hexagon™ DSP. Implementation on the Qualcomm® Adreno™ GPU through the Adreno GPU SDK is also a good option since fusion often involves matrix math.
The Kalman Filter
A common sensor fusion algorithm is the Kalman Filter. It uses data measurements from multiple sources (e.g., sensors) acquired over time that are often noisy and inaccurate, and estimates values for variables (e.g., future GPS locations) that are more accurate than would be possible using any single measurement. It does this in two steps:
- Prediction: estimates current state variables (e.g., speed, direction) and uncertainties (e.g., environment factors affecting measurements).
- Update: upon acquisition of the next set of measurements, the filter updates its estimated states, weighting the estimates based on calculated certainties.
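The predict/update cycle above can be sketched for a one-dimensional state as follows. All noise values and measurements here are made-up illustrations; a production filter (e.g., one tracking GPS position) would use vector states and matrix math rather than scalars:

```python
# Minimal 1-D Kalman filter sketch (illustrative values, scalar state).

class Kalman1D:
    def __init__(self, initial_estimate, initial_uncertainty,
                 process_noise, measurement_noise):
        self.x = initial_estimate      # current state estimate
        self.p = initial_uncertainty   # estimate uncertainty (variance)
        self.q = process_noise         # how much the true state drifts per step
        self.r = measurement_noise     # sensor noise variance

    def predict(self, control=0.0):
        # Prediction step: project the state forward and grow the uncertainty.
        self.x += control
        self.p += self.q
        return self.x

    def update(self, measurement):
        # Update step: blend prediction and measurement, weighted by
        # the Kalman gain (higher gain = trust the measurement more).
        k = self.p / (self.p + self.r)
        self.x += k * (measurement - self.x)
        self.p *= (1.0 - k)
        return self.x

# Fuse noisy readings of a quantity whose true value is 10.0.
kf = Kalman1D(initial_estimate=0.0, initial_uncertainty=100.0,
              process_noise=1e-4, measurement_noise=0.5)
for z in [9.8, 10.3, 9.9, 10.1, 10.0]:
    kf.predict()
    kf.update(z)
```

After a handful of measurements the estimate converges near the true value, and the filter's own uncertainty shrinks — only the latest measurement, the last estimate, and the noise parameters are needed at each step, which is what makes the algorithm so cheap to run.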
Developers use the Kalman Filter to extract relatively accurate information when there is uncertainty (e.g., in scenarios where things are constantly changing), and also to reduce noise, bias, and accumulation errors. For example, a Kalman Filter can be used to estimate the position of an object over time when the GPS signal is lost using other sources such as the accelerometer, gyroscope, and compass sensors, along with historical data.
Since the algorithm runs recursively and requires only the current measurements, the last estimated state, and known uncertainties, it's well suited for implementation as a run-time process on platforms like Snapdragon 845, and can be further optimized for power efficiency using the Snapdragon Power Optimization SDK. The Kalman Filter can also play a role in machine learning, as described in this whitepaper, and can be implemented using our Qualcomm® Snapdragon™ Math Libraries. Machine learning and associated algorithms can also be accelerated on Snapdragon-based edge devices using the Qualcomm® Neural Processing SDK for AI.
Overcome Imperfect Data with Sensor Fusion
Since data collected from sensors approximates real-world phenomena and is often imperfect, it must be fused with other data to be meaningful in digital systems. Sensor fusion allows us to combine data from multiple sources to generate refined data. With algorithms such as a Kalman Filter running on platforms like the Snapdragon 845, developers can overcome imperfect data and even provide accurate predictions on IoT and mobile devices.
In which areas do you think sensor fusion could help your development? Be sure to let us know!