The internet of things (IoT) is revolutionizing the way we live and work. By connecting devices and sensors to the internet (and/or our private networks), we are able to collect and analyze incident and environmental data like never before. This data can be used to cue and orchestrate various actions and processes, which in turn enhances operational efficiency while positively impacting life safety.

IoT V1 was largely the use of single sensor types to enhance awareness and early warning. For the most part, these were single dots on a map that blinked when a sensor or endpoint went out of acceptable range. Gunshot sensing is a great example: sensors could determine the origination of a shot, enabling public safety to apprehend the offender more quickly, but the damage was already done. IoT V2, leveraging sensor and AI fusion, will provide opportunities for actual mitigation BEFORE an event occurs, as well as a dramatic decrease in false positives. Sensor fusion occurs when we combine data from multiple sensors, augmented with analytics and AI algorithms, to get a more accurate picture of what is happening in the environment. Sensor fusion is also a critical component of IoT cueing and orchestration: by collecting data from multiple sensors, we can more easily identify patterns and trigger various actions, but more on that in our next blog post.


Sensor fusion, sometimes referred to as “multi-sensor data fusion”, is a straightforward concept: two or more sensors are better than one, and combining them ultimately yields a massive reduction in false positives. Based on the desired operational outcome, merge the relevant data from these sensors and you have sensor fusion. That all sounds easy, but in reality it is the software and recognitional algorithms behind the data that make sensor fusion possible, and that is the hard part. This is the core promise of the Blueforce platform: destroy IoT silos by delivering interoperability between dissimilar and disparate sensor types (even from different manufacturers) and enable connectivity paths and recognition between these different sensor types at the edge of the network.

Simply put: sensor fusion aims to overcome the limitations of individual sensors by gathering and fusing data from multiple sensors to produce more reliable outcomes with less uncertainty. This correlated information can then be used to make decisions or take action, whether manual or autonomous. The resulting, more comprehensive understanding of a process or situation offers deeper insights, enabling public safety and national security decision makers to make more intelligent, more accurate decisions and responses.

The Blueforce IoT and AI software platform leverages a decentralized “hybrid” fusion model at the point of IoT data origination, while also delivering speed in the processing and dissemination of actionable intelligence to the edge. Specifically, Blueforce fuses heterogeneous or homogeneous sensors while simultaneously applying location, environmental, and human markers, including movement, position, and physiological data. In the operational space (i.e., battlefields, city streets), IoT data can become stale and lose its value within minutes, or at best within an hour. This requires that the high-speed processing of actionable intelligence derived from fusion, and the resultant “sense-making”, occur on edge compute devices in the mission space: on the responder, on the sensor head, and even on autonomous delivery platforms.
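As a minimal illustration of why data freshness matters before fusion, the sketch below filters out readings older than an assumed five-minute usefulness window before they reach any sense-making step. The `MAX_AGE_S` constant and the dictionary shape are illustrative assumptions, not Blueforce’s actual data model:

```python
import time

MAX_AGE_S = 300  # assumed 5-minute usefulness window; mission-dependent

def fresh_readings(readings, now=None):
    """Drop sensor readings older than the usefulness window,
    so stale data never feeds the downstream fusion step."""
    now = time.time() if now is None else now
    return [r for r in readings if now - r["timestamp"] <= MAX_AGE_S]
```

In practice the window would vary by sensor type and mission tempo; the point is that the filter runs on the edge device, before any intelligence is disseminated.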


A variety of IoT and AI fusion models are supported by the Blueforce software platform, enabled by the inter-process messaging interfaces inherent to Blueforce “plugins”. Plugins are the “mosaic” pieces in Blueforce that extend the platform, allowing connectivity to sensors, algorithms, and even disparate systems. The Blueforce IPM interface allows plugins to talk to other plugins for data correlation, and also to instruct other plugins to orchestrate action. These models include:
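To make the plugin-to-plugin idea concrete, here is a hypothetical publish/subscribe sketch. This is an illustration of the pattern only; the actual Blueforce IPM interface is proprietary and its API differs from this toy `MessageBus`:

```python
from collections import defaultdict

class MessageBus:
    """Toy publish/subscribe bus: one plugin publishes sensor data on a
    topic; other plugins subscribe to that topic for correlation or to
    be instructed to take action."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callback to receive every payload on a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        """Deliver a payload to every handler subscribed to the topic."""
        for handler in self._subscribers[topic]:
            handler(payload)
```

A seismic-sensor plugin might publish detections on a `"seismic"` topic while a correlation plugin subscribes to both `"seismic"` and `"magnetometer"`, fusing the two streams before raising an alert.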

Complementary Fusion
This is the use case where IoT sensors do not directly depend on each other, but combining the data from each provides a more complete view of what is going on.

USE CASE: In border counter-terrorism operations, a seismic sensor firing can indicate human traffic, but in many cases it is detecting wildlife activity. This often results in QRF or law enforcement units scrambling only to discover a jackrabbit or other animal native to the environment. When we add magnetometer sensors to the equation, false positives are reduced by 98% because wildlife doesn’t wear metal or drive vehicles, whereas humans do.
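A minimal sketch of this complementary-fusion rule might look like the following. The 30-second correlation window and event shapes are assumptions for illustration, not field-tuned values:

```python
from dataclasses import dataclass

@dataclass
class SeismicEvent:
    """A ground-vibration detection from an unattended seismic sensor."""
    timestamp: float   # epoch seconds
    magnitude: float   # vibration units (illustrative)

@dataclass
class MagnetometerEvent:
    """A ferrous-metal detection from a co-located magnetometer."""
    timestamp: float
    field_delta: float  # change in field strength (illustrative)

CORRELATION_WINDOW_S = 30.0  # assumed pairing window between the sensors

def complementary_alert(seismic_events, magnetometer_events):
    """Alert only when a seismic hit is corroborated by a magnetometer
    hit within the window -- wildlife triggers the seismic sensor but
    carries no metal, so uncorroborated hits are suppressed."""
    for s in seismic_events:
        for m in magnetometer_events:
            if abs(s.timestamp - m.timestamp) <= CORRELATION_WINDOW_S:
                return True
    return False
```

A seismic hit with no magnetometer corroboration returns `False` (the jackrabbit case); a hit corroborated within the window returns `True` and is worth scrambling a unit for.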

Competitive or Redundant Fusion
This is the use case where each sensor delivers independent measurements of the same property, which is extremely useful for validation, detection accuracy, and error correction.

USE CASE: For businesses that wish to monitor visitor traffic and behaviors in a store or building, dissimilar sensors may be used to count traffic, each with varying degrees of precision. Signal sensors might be used to detect unique device signatures, but many of us wear and/or carry more than one signal-emitting device, which can skew counts: that Apple Watch combined with an Apple iPhone can look like two visitors when in fact it is a single visitor. Leveraging a second sensor in the form of a camera running face recognition can add precision to visitor data by looking for unique faces. Neither is 100% accurate, but leveraging both with fuzzy logic can add precision to the probabilities of actual counts.
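One simple way to fuse two competitive estimates of the same quantity is a weighted average; the weights below are illustrative assumptions (the device-signal count over-counts, so it gets the lower weight), not a production fuzzy-logic model:

```python
def fused_visitor_count(signal_count, face_count,
                        signal_weight=0.4, face_weight=0.6):
    """Fuse two independent estimates of the same property (visitor
    count). The signal-based estimate tends to over-count because one
    person can carry several devices, so it receives the lower weight
    in this illustrative scheme."""
    return round(signal_weight * signal_count + face_weight * face_count)
```

With a signal count of 120 and a face-recognition count of 90, the fused estimate lands between the two, pulled toward the more trusted sensor. Real deployments would tune the weights from ground-truth data rather than assume them.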

Cooperative Fusion
This is the use case where information from independent sensors is used to derive information and meaning that would not be available from single sensors.

USE CASE: Cooperative fusion of chemical sensors with emerging wearables that monitor human physiology offers enhanced recognitional support for those working in hazmat and/or CBRNE threat environments. While chemical and multi-gas sensors detect specific agents in an environment, and can also detect rapidly decreasing O2 levels, pulse oximeter technology in wearables can detect changes in blood gases should agents be present that gas/chemical sensors cannot detect.
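A sketch of this cooperative-fusion logic is below. The thresholds, state names, and the "two or more responders trending low" heuristic are all illustrative assumptions, not operational guidance:

```python
def cbrne_assessment(gas_alarm, ambient_o2_pct, team_spo2):
    """Cooperative fusion: combine environmental sensing (multi-gas
    alarm, ambient O2) with physiological sensing (team pulse-oximeter
    readings) to derive a state neither sensor reports on its own.

    gas_alarm      -- True if any chemical/multi-gas sensor has fired
    ambient_o2_pct -- ambient oxygen concentration, percent
    team_spo2      -- blood-oxygen saturation readings across the team
    """
    # Illustrative thresholds: 19.5% O2 (a common deficiency floor)
    # and SpO2 below 94% on two or more responders.
    low_spo2_count = sum(1 for s in team_spo2 if s < 94.0)
    if gas_alarm or ambient_o2_pct < 19.5:
        return "KNOWN_AGENT_OR_O2_DEFICIT"
    if low_spo2_count >= 2:
        # Physiology flags an exposure the gas sensors did not detect.
        return "POSSIBLE_UNDETECTED_AGENT"
    return "NOMINAL"
```

The interesting branch is the second one: environmental sensors read clean, yet multiple responders show falling blood oxygen, a conclusion available only by fusing the two sensor classes.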

BlueforceTACTICAL and BlueforceEDGE provide comprehensive and extensible support for body-worn and proximate sensors, as well as unattended/autonomous sensors. Blueforce’s “mosaic” architecture allows rapid extensibility as new algorithms emerge and as new best-in-class IoT sensors come to market. Blueforce plugins allow agencies to move from concept to capability in mere days, without the need for an army of developers. Furthermore, plugins enable connectivity and sense-making among dissimilar sensors for more precise recognition, with the data shared among responders, decision-makers, and information systems. The real value proposition and key differentiator is our ability to fuse multiple sensors for enhanced recognitional support and to provide autonomous mitigation through event orchestration; more on “orchestration” in our next installment.

To learn more, contact us at 866-960-0204 or send an email to