
Blueforce Sidecar


Sensor-to-Shooter Artificial Intelligence (S2S-AI)

By John Black, Vice President—Federal, Blueforce Development Corp.

An Eye-opening Moment and Genesis of S2S-AI Rapid Prototyping

Next week, at the Wilcox Industries Range Day at SHOT Show outside Las Vegas, Blueforce will provide a live-fire demonstration of Sensor-to-Shooter Artificial Intelligence (S2S-AI), a proof of concept of IoT data fusion applied to a sniper scenario, one year after its inception. On a sunny day last January, on the range at the Front Sight Firearms Training Institute, I picked up a 7.62 NATO short-action rifle and, with one shot, hit a target silhouette at 1,200 meters…with a 10-knot crosswind. An ex-Delta sniper exclaimed, “You hit the target at center mass. That’s about a 17-foot bullet drop!” I’ve taught basic rifle marksmanship, but I’m no sniper, and these days I’m lucky if I can get on a live-fire range more than twice a year. What enabled me to perform such a remarkable feat of arms, well beyond my skill level, was the Wilcox RAPTAR-S laser rangefinder and ballistic solver, which captures target range, elevation, altitude, and weather, processes them with onboard ammunition load data to calculate an aiming correction, and delivers that correction to the disturbed reticle of a digital optic. All I had to do was move my point of aim so that the corrected crosshairs were on the target and squeeze the trigger.
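To give a feel for what a solver of this kind computes, here is a deliberately simplified first-order sketch in Python. The RAPTAR-S's actual ballistic engine is proprietary and far more sophisticated (drag models, atmosphere, spin drift); this toy uses only gravity drop from time of flight and the classic "lag time" crosswind rule, and all load numbers are illustrative assumptions, not the article's data.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2


def holdover(range_m: float, muzzle_v: float, avg_v: float,
             crosswind_ms: float = 0.0) -> tuple[float, float]:
    """Toy first-order aiming correction.

    drop: vertical fall from time of flight under gravity alone.
    drift: lateral wind drift via the lag-time approximation
           (crosswind speed times the delay relative to a
           vacuum-speed bullet). Real solvers model much more.
    """
    tof = range_m / avg_v                  # time of flight, s
    drop = 0.5 * G * tof ** 2              # gravity drop, m
    lag = tof - range_m / muzzle_v         # lag time, s
    drift = crosswind_ms * lag             # wind drift, m
    return drop, drift


# Illustrative 7.62-class numbers (assumed, not actual load data);
# 10 knots is roughly 5.1 m/s of crosswind.
drop, drift = holdover(1200.0, muzzle_v=840.0, avg_v=560.0,
                       crosswind_ms=5.1)
print(f"drop ~ {drop:.1f} m, drift ~ {drift:.2f} m")
```

The point is not the numbers but the pipeline: range, velocity, and wind go in; a vertical and lateral correction comes out, ready to be pushed to a disturbed reticle.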

At that eye-opening moment, Blueforce Development Founder and CEO Michael Helfrich and I mused about the possibilities that could be unlocked by connecting the RAPTAR-S, an IoT-ready device, to a BlueforceTACTICAL (BTAC) mobile IoT hub capable of interfacing with multiple proximate devices, both wirelessly and via cable connections. If we added a precision location device and an external weather meter, we could compute the location of the target, associate it with a captured image and metadata, and share it with teammates and command posts to generate new tactical options.
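The geometry behind that idea is straightforward: given the shooter's own fix plus the lased slant range, bearing, and elevation angle, the target's position falls out by projection. The sketch below uses a flat-earth, metres-per-degree approximation that is only reasonable over rifle distances; a fielded system would use a proper WGS-84 geodesic solution, and the function name and signature here are my own illustration, not a Blueforce API.

```python
import math

M_PER_DEG_LAT = 111_320.0  # approximate metres per degree of latitude


def target_location(lat: float, lon: float, alt: float,
                    range_m: float, bearing_deg: float,
                    elev_deg: float) -> tuple[float, float, float]:
    """Project a lased target from the shooter's position.

    Decomposes slant range into a ground distance and a height
    difference, then into north/east offsets by bearing. Flat-earth
    small-distance approximation only.
    """
    horiz = range_m * math.cos(math.radians(elev_deg))   # ground distance, m
    dz = range_m * math.sin(math.radians(elev_deg))      # height difference, m
    dn = horiz * math.cos(math.radians(bearing_deg))     # north offset, m
    de = horiz * math.sin(math.radians(bearing_deg))     # east offset, m
    lat_t = lat + dn / M_PER_DEG_LAT
    lon_t = lon + de / (M_PER_DEG_LAT * math.cos(math.radians(lat)))
    return lat_t, lon_t, alt + dz


# Hypothetical shooter fix near Las Vegas, target 1,200 m due east:
print(target_location(36.0, -115.0, 900.0, 1200.0, 90.0, 0.0))
```

Once that position exists as data rather than a voice report, it can be attached to an image and pushed to every connected teammate automatically.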

To be honest, our hunt for such a solution began many years before, while I was conducting screening operations on the Iraqi border and Michael was showing the first early concept prototypes of handheld sensors, networked through mobile devices, to dropped jaws. Back then, laser rangefinders with integrated compass and elevation were in service, but there was no way to share data and distribute computation among devices, or to get data from the growing variety of ground-based and airborne sensors. Even today’s soldiers, equipped with state-of-the-art, fully digital day/night optics loaded with sensors, have lacked a way to create ad hoc networks with other units and devices without prior coordination, share information horizontally, and quickly adapt to new capabilities that arrive on the battlefield every day. That was before Blueforce.

Extending Combat-Proven Devices with IoT Data Fusion to Create New Tactical Options

Sensor-to-Shooter Artificial Intelligence (S2S-AI) extends the proven (TRL 9) RAPTAR-S laser rangefinder and ballistic solver with IoT data fusion to enhance operational efficiency. A Government-assessed proof of concept (POC), based on sniper mission scenarios, simultaneously provides ballistic solutions to the disturbed reticle of the shooter’s digital optic while computing and sharing target locations and metadata with other operators’ visual augmentation systems (VAS).

Blueforce Development is working with Wilcox and other partners to continue development of a fieldable prototype that interfaces with autonomous agents for line-of-sight analysis, sensor discovery, hyperlocal alerts, and target recognition, orchestrated by mission plugins designed to provide ambient intelligence that hyper-enables the operator with tactical options for target engagement, redirection, surveillance, and handoff. The ability to share the locations of designated targets, with associated images and metadata, with other operators and command centers without keying the hand-mic eliminates errors, while autonomous agents assist with classifying and identifying targets, discovering additional available sensors and shooters, and generating tactical options. The goal is an ambient, embedded intelligence that works in the background without requiring operators to take their eyes off their targets or their hands off their weapons.

And We Are Just Getting Started!

The forward sensor fusion with edge-based processing in the Blueforce IoT data fusion platform can be applied to a wide range of mission scenarios, far beyond snipers and small arms. S2S-AI enhances existing capabilities while generating new tactical options. To learn more about the Blueforce technology, S2S-AI, and development of the field prototype, view our releasable presentation online. Contact Blueforce Development to learn how forward sensor fusion with edge-based processing can transform your capabilities and accelerate recognitional decision-making at the edge of your network. If you hit us up right now, you may get to join us on the range at Wilcox Industries Range Day at SHOT Show 2019!

For more information, call us at 866-960-0204 or send an email to info@blueforcedev.com.
