Sensors: powerful tools for special missions
A range of sensors and accompanying software are becoming a more viable option for organizations of all sizes, making them a strong asset in the arsenal of special missions. Jon Adams finds out how they are being used
More powerful and robust onboard technology is transforming the way that pilots and crew view and interpret the world around them. Reductions in the size, weight, power consumption and cost (SWaP-C) of the equipment available to operators, combined with increases in resolution, fidelity and processing power, mean that more special missions are being significantly enhanced. The benefits of better sensors range from finding and locating individuals in distress to aiding firefighting and law enforcement. Locating points of interest from farther away and covering more ground, as well as seeing through smoke, dust or cloud, opens up missions that were limited before, and also makes operations safer as situational awareness of hazards increases. And each type of sensor complements the others, operating on a different wavelength to passively or actively generate an image of the ground below, for different situations and in different environments.
The versatility of sensors for special missions is highlighted by Matthew Harvey, Line Captain with the UK’s National Police Air Service (NPAS), who uses the technology on every mission: “NPAS currently uses a mix of WESCAM MX-10/15 and Teledyne FLIR Star SAFIRE cameras on its rotary fleet but, in my relatively short time here (18 months), I’ve only ever flown in SAFIRE-equipped aircraft. This is a gyro-stabilized system with both infrared and optical sensors that are used in almost all flying operations.”
Jeff Sherwood, Director of Business Development for Skytrac, emphasized how important this equipment is for special missions by providing a point of view not available in any other way: “These systems are critical for obtaining video and other types of data from an airborne vantage point. Aerial footage and sensor data can provide a unique perspective that can improve situational awareness, response time, communication, and coordination regarding various emergency or special mission situations.”
Using sensors on special missions flights makes a “fundamental” difference, according to Scott Richardson, VP of Product for CarteNav. Richardson explained: “Not everything is visible to the naked eye. So when you have sensors like infrared, which is able to see into the dark, you have different spectra that can see in different environments, far better than an unaided observer, and then you have sensors like radar that can also see through weather.”
The processing power needed to interpret, record, compress and transmit this data in real time is also growing to match the development of the sensors themselves, again with lower SWaP-C, making the equipment feasible to install on board and affordable to purchase for the whole fleet. “Low SWaP-C is an important aspect of avionics design as it enables the systems to be installed on various aircraft types. The lower the SWaP, the more aircraft that can be equipped with these systems,” added Sherwood.
High-bandwidth satellite downlinks enable base operations to monitor, assess, interpret and record the data to coordinate assets from the information gathered in real time, allowing for safer aerial and ground operations, but also for a more efficient focus on the tasking at hand.
Technology enhancing missions
Electro-optical sensors
Electro-optical (EO) sensors are passive devices that rely on reflections of the visible spectrum of light to capture images, much like the digital camera on one’s phone. Their strength is that they match what the unaided eye can see, but from a much greater distance, and operators can zoom in, sharpen, add contrast and apply other digital manipulation to get the most from the data being recorded. Furthermore, the interpretation of the images is largely intuitive. However, they suffer in degraded visual environments (DVEs): anything that would limit one’s own vision, such as cloud cover, mist, haze, vegetation and leafy canopies, dust, rain or snow, also prevents the cameras from seeing through the atmospheric or geographic disturbance. Low-light and no-light conditions are a further limitation, preventing the cameras from operating well on night missions.
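As a rough illustration of the kind of digital manipulation involved, the sketch below applies a simple percentile contrast stretch to a washed-out frame. It is a minimal example using NumPy, not any particular camera vendor’s processing pipeline:

```python
import numpy as np

def contrast_stretch(frame: np.ndarray, lo_pct: float = 2, hi_pct: float = 98) -> np.ndarray:
    """Remap pixel intensities so the chosen percentiles span the full 0-255 range."""
    lo, hi = np.percentile(frame, [lo_pct, hi_pct])
    stretched = (frame.astype(float) - lo) / max(hi - lo, 1e-6)
    return (np.clip(stretched, 0.0, 1.0) * 255).astype(np.uint8)

# A simulated 8-bit EO frame washed out by haze: all values bunched together
hazy = np.random.randint(90, 140, size=(480, 640)).astype(np.uint8)
enhanced = contrast_stretch(hazy)
print(hazy.std(), enhanced.std())  # the enhanced frame uses far more of the range
```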
Infrared sensors
This leads on to the infrared (IR) sensors. Often housed in the same turret as the EO sensors, the IR systems are extremely complementary. By capturing – also passively – a different part of the electromagnetic spectrum (the heat radiated by objects), the IR sensors record what is warmer (or colder) than the surrounding area, and where. This is particularly useful for night operations, or in cold or maritime environments when looking for persons who will be warmer than the snow or water they are in. However, IR suffers largely from the same penetration problems as EO sensors, where troublesome environments can get between the sensor and the target of interest. Heavy cloud or rain can prove difficult by absorbing the radiation, as Jamie Ross, Search and Rescue Commander of the UKSAR AW189 Resilience Team at Bristow, explained: “Forward-looking infrared (FLIR) is most badly affected by atmospheric conditions: the more moisture laden the atmosphere, the worse the camera performs.”
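Stripped of real-world complications such as calibration and clutter rejection, the underlying idea is to flag pixels that stand out from the thermal background. A minimal sketch, assuming a calibrated temperature frame:

```python
import numpy as np

def flag_thermal_anomalies(frame_c: np.ndarray, sigma: float = 6.0) -> np.ndarray:
    """Return a boolean mask of pixels much warmer than the rest of the scene.

    frame_c: 2D array of apparent temperatures in degrees Celsius.
    sigma:   how many standard deviations above the scene mean counts as hot.
    """
    threshold = frame_c.mean() + sigma * frame_c.std()
    return frame_c > threshold

# Cold sea at ~8 deg C, with one pixel at ~25 deg C where a person is floating
scene = np.random.normal(8.0, 0.5, size=(240, 320))
scene[120, 160] = 25.0
print(np.argwhere(flag_thermal_anomalies(scene)))  # -> [[120 160]]
```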
Synthetic aperture radar
Synthetic aperture radar (SAR) has qualities that add further to the abilities of the EO and IR sensors already mentioned. SAR differs from the other sensors in that it is an active sensor; that is, it sends out radiation (in the radio and microwave wavebands: X-band to P-band) and then captures the reflections/echoes of the pulses generated by the system. The synthetic aperture refers to the way the device gets around the need for an unfeasibly large antenna: the large aperture required for fine resolution is synthesized by sending pulses at a target from different points along the flight path and combining the echoes. The area in which SAR excels over the other sensors is penetration: clouds, smoke, tree cover or the darkness of night can be bypassed, allowing the pilot or crew to monitor the ground when other sensors would struggle. However, radar can get disrupted and produce confusing outputs over rough and rolling seas, explained Ross: “Radar is affected by sea state primarily; in heavy seas, the radar clutter can be very challenging.”
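The textbook strip-map relations give a feel for why the aperture is synthesized; the numbers below are illustrative, not those of any specific radar:

```python
# Textbook strip-map SAR relations (illustrative numbers only):
#   real aperture:      resolution ~ wavelength * range / antenna_length
#   synthetic aperture: resolution ~ wavelength * range / (2 * synthetic_length)
wavelength = 0.03        # metres: X-band, ~10 GHz
slant_range = 20_000.0   # metres to the scene
antenna = 0.3            # a 30 cm antenna that actually fits on an aircraft

print(wavelength * slant_range / antenna)  # 2000.0 m: far too coarse to be useful

# Flying 300 m while illuminating the same spot synthesizes a 300 m aperture
synthetic_length = 300.0
print(wavelength * slant_range / (2 * synthetic_length))  # 1.0 m resolution
```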
Light detection and ranging
Topographic light detection and ranging (lidar) works in much the same way as SAR, in that it is an active remote sensing system, but it generally uses laser pulses in the infrared and near-infrared part of the spectrum, measuring the reflections to create a detailed and accurate 3D mapping output. As well as creating maps, lidar’s accuracy is used for obstacle warning systems, alerting the pilot if they are too close to an object, for instance. Also, because lidar uses very narrow laser beams, it can find narrow gaps and penetrate forest canopies. Like SAR, lidar can operate at night and in some DVEs. Topographical scanning is particularly useful for charting unstable land and alerting ground crew to safe routes after a landslide, flood or earthquake.
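At its core, each lidar return is a time-of-flight measurement turned into a 3D point. A toy version of that geometry, in a flat local frame with no scanner calibration or attitude correction:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def lidar_return_to_point(t_round_trip_s, az_rad, el_rad, sensor_xyz):
    """Convert one lidar echo into a 3D point (toy model, flat local frame).

    Range is half the round-trip time times the speed of light; the beam
    direction comes from the scanner's azimuth/elevation angles.
    """
    rng = C * t_round_trip_s / 2.0
    direction = np.array([
        np.cos(el_rad) * np.cos(az_rad),
        np.cos(el_rad) * np.sin(az_rad),
        np.sin(el_rad),
    ])
    return np.asarray(sensor_xyz, dtype=float) + rng * direction

# Echo after 2 microseconds, beam pointing 30 deg below the horizon, from 300 m up
point = lidar_return_to_point(2e-6, az_rad=0.0, el_rad=np.radians(-30), sensor_xyz=[0, 0, 300])
print(point)  # ~300 m slant range -> roughly [259.6, 0.0, 150.1]
```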
Mobile/cell phone detectors
There is also a class of sensor that detects and can communicate with mobile phones by acting as a temporary cell network and tower. As long as the phone is switched on and has battery power, it acts as a beacon: the sensor can pick it up, communicate with it and pinpoint its location. As mobile/cell phones are almost ubiquitous, this type of sensor is becoming more widely used in emergency services roles.
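Vendors’ actual location techniques are proprietary, but the principle can be illustrated with classic multilateration: range estimates taken from several points along the flight path pin down the handset. A hypothetical 2D sketch:

```python
import numpy as np

def trilaterate_2d(anchors, ranges):
    """Estimate a 2D transmitter position from ranges to known points.

    Classic linearization: subtracting the first range equation from the
    others gives a linear system A x = b, solvable by least squares.
    """
    anchors = np.asarray(anchors, dtype=float)
    r = np.asarray(ranges, dtype=float)
    A = 2 * (anchors[1:] - anchors[0])
    b = (r[0]**2 - r[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(anchors[0]**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Three measurement points along a flight path, ranges to a phone at (400, 250)
path = [(0, 0), (500, 0), (1000, 100)]
truth = np.array([400.0, 250.0])
dists = [np.linalg.norm(truth - p) for p in path]
print(trilaterate_2d(path, dists))  # -> [400. 250.]
```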
Situational awareness
This is all to say that, with this wealth of sensors, the execution and completion of tasks becomes not only easier but also safer. Special missions inherently tend toward the more extreme side of weather and geography – people need rescuing from situations created by location or meteorological phenomena, or crews have to fly through smoke during a firefighting sortie, for instance – so having the tools not only to locate the targets of interest but also to navigate there and back safely is vitally important.
Being able to see someone from far away, locate a heat source or see through mist has enormous benefits in an already stressful and demanding role. Furthermore, the computing power of modern systems can also interpret, adjust and overlay moving charts and maps, so that identifying a position and pinpointing a specific location is easier and more user-friendly than ever before.
Knowing exactly where something or someone is in relation to the surrounding environment helps reach that spot with much less likelihood of an incident than otherwise would have been the case. Having access to and taking advantage of the output of multiple sensors at once in one place adds to the improved effectiveness of the mission: “You’ve got one of two things that happen when you increase the efficiency: you have an ability to either double your coverage for your two-hour flight and get more information, or you could reduce your flight time – maybe you just have to cover a certain area; you can reduce that time,” said Richardson. “One thing that people forget about often is safety; the fact that the operator can do work efficiently means they’re more aware of their airspace and any consequences they might have of that just by being more effective with their sensor suite.”
When someone is in trouble, they may or may not be able to signal to a rescue service. Héctor Estévez, CEO of Centum, explained how the Lifeseeker phone location system adds value to the suite of sensors that are being used: “Lifeseeker takes advantage of the signals emitted by mobile phones to accurately determine their position. It has the capability to locate a phone both in areas with and without network coverage, and it does not require any interaction from the phone user to perform the location. This ensures that Lifeseeker can locate individuals even if they are unconscious or unable to operate their phone.” Knowing where someone is, even if they are unconscious or immobile, really improves the efficiency and safety of a mission. Estévez continued: “Pinpointing the precise location of a mobile phone enhances the efficiency and situational awareness of rescue operations, enabling teams to reach individuals faster and make more informed decisions. In addition to its location capabilities, Lifeseeker allows direct communication with the phone user via calls and SMS, further improving coordination and responsiveness during a mission.”
Mounted on gimbals that can rotate a sensor in a turret under the control of the pilot or crew, EO/IR sensors can be pointed in almost any direction. Similarly, SAR is often an active electronically scanned array, which can be directed at different areas without moving the antenna itself. Landing through rotor wash in a dusty area, flying through smoke, or hovering near the edge of a mountainside in snow are hazardous situations that may occur during special missions. Having a camera or sensor that can look below, behind, around and through the atmospheric disturbance enhances the ability of the pilot to fly safely without incident. Similarly, being able to speak to someone on the ground through the temporary phone network can give valuable information about the location that may not be apparent from a distance on board the aircraft.
The value of a range of sensors for search and rescue operations is high. “We use a weather/search radar and EO/FLIR, in addition to the traditional eyeballs, for the search elements of a search and rescue mission and also as a navigation aid,” said Ross. “This allows us to conduct a self letdown in poor weather to get visual meteorological conditions (VMC) beneath, in order to make contact either with land or a vessel, depending on the scenario.” He continued: “Radar is absolutely critical as, without it, we wouldn’t safely be able to get to a lot of the scenes/vessels that we’re tasked to when the weather is way below visual flight rules (VFR) minima, and in areas that have no instrument flight rules (IFR) procedure to letdown otherwise.
“Likewise FLIR is used during the final stages of a radar letdown to assist with collision avoidance in the low-level environment. But its primary use is as a search aid where, particularly at night, it is our primary search sensor. Without it at night, we’re really not much use during a search as, even with night vision imaging systems (NVIS) and aircraft illumination, it’s very difficult to spot people either in the water or in the mountains.”
Working around the challenging environments found in special missions is part of the design process for sensors, explained Estévez: “Lifeseeker is designed to perform reliably in a wide range of weather conditions and environments, including rural, mountainous, and maritime areas. Whether facing extreme temperatures, heavy rain, or rugged terrain, the system continues to function without interruption.”
Workload, efficiency and situational awareness can all be improved with a system like CarteNav’s AIMS suite, which compiles and overlays the outputs in one place, Richardson said: “All of these sensors are pretty powerful on their own, just raw. But what we do is give geospatial relevance to things. So it’s actually the layering of this data in a way that’s easy to understand and lets the operator be more efficient and more precise with what they’re doing – cross-cueing.” Richardson continued: “What we do is bring not just the ingesting of the data from these sensors, but an ability to control and cross-cue these sensors. For instance, I can instruct the radar to create a SAR image of an area; I can then look at that image, find points of interest and go look for what I’ve seen there with my EO/IR sensor. The big thing here is efficiency: how quickly can an operator get that job done? This might have been multiple people in the past: there might have been one person operating a radar, one person operating EO/IR, one person annotating things. We bring this all into one user interface that can be adaptable for the use case. You can actually accomplish all those things at one station or at any workstation.”
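Cross-cueing ultimately comes down to geometry: turning the coordinates of a contact from one sensor into pointing commands for another. A simplified flat-earth sketch of slewing a turret to a radar contact (a real mission system would also correct for aircraft attitude and terrain):

```python
import numpy as np

def slew_angles(ac_lat, ac_lon, ac_alt_m, tgt_lat, tgt_lon, tgt_alt_m=0.0):
    """Pan/tilt (deg) from the aircraft to a contact, flat-earth approximation."""
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = m_per_deg_lat * np.cos(np.radians(ac_lat))
    east = (tgt_lon - ac_lon) * m_per_deg_lon
    north = (tgt_lat - ac_lat) * m_per_deg_lat
    down = ac_alt_m - tgt_alt_m
    pan = np.degrees(np.arctan2(east, north))                 # bearing from north
    tilt = -np.degrees(np.arctan2(down, np.hypot(east, north)))
    return pan, tilt

# Radar contact ~2 km north-east of the aircraft, which is flying at 1,000 m
print(slew_angles(50.0, -1.0, 1000.0, 50.0127, -0.9802))  # ~(45.1, -26.6)
```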
Leveraging artificial intelligence (AI) and machine learning (ML) can greatly enhance the data output from the sensors. With overlays and tags, an item that appears to be a person, vehicle or hotspot can be enhanced, clarified and flagged for attention. Features like a moving target indicator can identify if there is a person moving and can track them, helping to distinguish humans from the surrounding environment. This greatly reduces the workload of the operator and gives them options about how best to approach any given scenario.
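A moving target indicator can be reduced, in principle, to change detection between co-registered frames; production systems layer stabilization, classification and track association on top. A bare-bones sketch:

```python
import numpy as np

def moving_target_mask(prev_frame, curr_frame, threshold=25):
    """Flag pixels that changed between consecutive co-registered frames."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    return diff > threshold

prev = np.zeros((120, 160), dtype=np.uint8)
curr = prev.copy()
curr[60:64, 80:84] = 200  # a small warm object appears against a static scene
centroid = np.argwhere(moving_target_mask(prev, curr)).mean(axis=0)
print(centroid)  # -> [61.5 81.5], the flagged object's centre
```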
The benefits of software-aided interpretation of data are “huge”, according to Richardson: “It’s something we’ve done a lot of steps towards. One of the things we’re moving towards is both in the way the software is configured and the underlying data. Then what’s made available for you are ways to have profiles that behave very differently in an instant, so I could be an EO/IR operator or a radar operator. That means the software interface needs to adjust to what my subject matter expertise requirement is. But in addition to that, associative filters or alerts are essential, especially in high-traffic areas. We’ve been working in the Netherlands, operational in low-altitude flights, and we were seeing upwards of 30,000 different kinds of tracks being generated. The importance for an operator to set their criteria to easily declutter that picture to get the information they’re looking for is absolutely essential.”
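Decluttering of the kind Richardson describes amounts to filtering thousands of tracks against operator-set criteria. A minimal sketch, with hypothetical track fields chosen purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class Track:
    track_id: int
    speed_kts: float
    kind: str           # e.g. 'vessel', 'aircraft', 'unknown'
    in_search_area: bool

def declutter(tracks, min_speed=2.0, kinds=('vessel', 'unknown')):
    """Operator-set criteria: hide slow clutter and out-of-area or
    irrelevant track types; keep everything else on the display."""
    return [t for t in tracks
            if t.in_search_area and t.speed_kts >= min_speed and t.kind in kinds]

picture = [Track(1, 0.1, 'vessel', True),    # drifting, buoy-like clutter
           Track(2, 12.0, 'vessel', True),   # kept
           Track(3, 9.0, 'aircraft', True),  # filtered by kind
           Track(4, 15.0, 'vessel', False)]  # outside the search area
print([t.track_id for t in declutter(picture)])  # -> [2]
```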
Knowing the difference between the sensor types and making the best use of each one’s benefits can vastly improve mission outcomes. The ability to quickly switch from one to another is also emphasized by Harvey: “The choice of sensor depends on time of day and environmental conditions, considering what is trying to be achieved. By night, while optical sensors are useful, IR is used more often due to its ability to pick up detail that would not be achievable with a visual sensor. For searches of open terrain, a wide area can be covered quickly and the operator has an ability to tweak automatic settings in order to emphasize any thermal differences visually. IR does not give the same clarity of detail as optical sensors, provided that they have enough light to work, meaning that by day optical sensors are more likely to be used. However, there may be situations, for example a missing person search over a large area of open land in the winter, where IR is likely to be a better option. The operator can cycle through both to compare immediate performance and use the most appropriate combination.”
The balance for getting the right amount of benefit against the added workload of learning and using the systems is all part of highly skilled teamwork and the training that crews complete. Ross described how the systems integrate within the mission process: “In terms of ease of use, crew responsibilities are generally allocated where the technical crew operate the FLIR from the cabin and the pilots operate the radar from the cockpit, in a crew resource management (CRM)-driven procedure that works extremely well.
“It can be workload intensive, though. The letdown procedure – called a radar FLIR approach – can be a very intensive procedure, and is conducted using a whole-crew concept to ensure a safe profile is always executed. For FLIR searches alone, these are executed in a VFR environment so, although it’s a high-workload task for the technical crew operator, the pilots are free to focus on flight path management, visual search etc., so it’s not so bad. All of this does add to the workload, though.”
Data link and coordination
Real-time collection of data, including data-heavy high-definition video and streaming imagery, means that transmitting it securely back to the main operations center can be a challenging business. Sherwood stated that the EO/IR sensors alone generate “a massive amount” of data, adding: “EO/IR systems can capture gigabytes of footage in a single flight.” However, the growth of networks of low-Earth-orbit (LEO) satellites means that high-bandwidth satellite communications (satcom) can relay this information from nearly anywhere in the world to nearly anywhere else. Sherwood explained why satcom is preferable for special missions, as it bypasses some of the limitations found with other data transfer methods: “These types of sensors and systems typically rely on line-of-sight communication systems to send data from the aircraft to the ground. This method is limited in effectiveness due to the range and geography of their flight operations. Skytrac’s connectivity avionics utilize the Iridium satellite and cellular infrastructure to extend the transmission of this data beyond the line of sight. The system can transmit this data from anywhere on earth when using the Iridium network as a communication channel.”
This, again, can ease workload and improve safety: the monitoring and analysis of the data does not all have to be done on board, where workload is already high, and the operations team can coordinate, direct and aid the airborne assets on the strength of the data collected, from anywhere in the world and not necessarily in the vicinity of the operation. The importance of this is explained by Richardson: “On the sensor end, you have subject matter experts whose task is to take all these different sensors, understand the state they’re in, listen to the radio chatter. All that information is gathered and then provided to decision-makers, such as size and density of the fire and what do you want to do: send in aerial assets to water-bomb the thing or let it burn off.”
It is not just the transmission of data that is vital; integrating the systems so that they are compatible in the first place is hugely important. Sherwood provided an example of one aspect of how Skytrac’s SDL-350 is designed to be compatible with the majority of systems: “Our SDL-350 primarily utilizes ethernet as an interface, which is widely supported by sensor systems. This streamlines integration and installation on special mission aircraft.”
Following on from the vast amount of data that is generated, the necessity to compress the data in real time is just as important. “Efficiently managing data and transmission over the various communication channels typically necessitates using compression software on board the aircraft. Skytrac has partnered with Videosoft to provide video compression software integrated with our connectivity avionics. This software-based solution provides real-time compression of high-definition images and video on board the aircraft prior to transmission over cellular and satellite channels. This helps maintain the high-quality imagery captured by cameras while reducing data usage costs and fitting within the bandwidth limitations of various communication channels,” explained Sherwood.
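Some back-of-envelope arithmetic shows why onboard compression is unavoidable; the figures below are illustrative rather than those of any specific channel or codec:

```python
# Rough feel for why compression matters on satcom links (illustrative numbers)
gigabytes = 4.0                      # footage captured on one flight
link_kbps = 704.0                    # e.g. a mid-tier LEO satcom data channel
bits = gigabytes * 8e9
hours_raw = bits / (link_kbps * 1e3) / 3600
print(f"{hours_raw:.1f} h to send uncompressed")  # ~12.6 h

compressed_bits = bits / 50          # assume ~50:1 video compression
hours_compressed = compressed_bits / (link_kbps * 1e3) / 3600
print(f"{hours_compressed:.2f} h compressed")     # ~0.25 h
```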
It’s not just real-time coordination, but all the data that is gathered from the host of sensors can also later aid with post-mission debriefing to learn from experiences during the missions that have occurred. “Post-mission data can be used for analysis, reviewing the concept of operations (ConOps) and then adjusting for the next time,” said Richardson. “We’re seeing more and more applications for this. The data has value even after an operation. Our focus is capturing that data as accurately as possible from as many sources as possible and making sure it’s available for whatever the customer wants to do with that.”
PAL Aerospace has a guide to ConOps on its website (palaerospace.com/conops).
Special equipment for special missions
Being able to clearly identify the precise location in relation to known and named landmarks helps with communication of position of targets and direction of movement; for police aviation, this is a great boon, explained Harvey: “Sensors link with mapping, leading to the ability to both send the sensor to a location and know the location of what the sensor is pointing at. Both are obviously useful in a police role.”
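The “know the location of what the sensor is pointing at” half of that link is, in its simplest form, the inverse of the slewing geometry sketched earlier: intersect the camera’s line of sight with the ground. A toy flat-earth version, ignoring aircraft attitude and terrain:

```python
import numpy as np

def ground_point(ac_lat, ac_lon, ac_alt_m, pan_deg, tilt_deg):
    """Where a downward-tilted camera ray meets flat ground (toy geolocation)."""
    if tilt_deg >= 0:
        raise ValueError("camera must point below the horizon")
    horiz = ac_alt_m / np.tan(np.radians(-tilt_deg))  # ground distance to the spot
    north = horiz * np.cos(np.radians(pan_deg))
    east = horiz * np.sin(np.radians(pan_deg))
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = m_per_deg_lat * np.cos(np.radians(ac_lat))
    return ac_lat + north / m_per_deg_lat, ac_lon + east / m_per_deg_lon

# Camera 26.6 deg below the horizon, pointing north-east, from 1,000 m altitude
print(ground_point(50.0, -1.0, 1000.0, 45.0, -26.6))  # ~(50.0127, -0.9803)
```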
The benefits of integration and compatible interfaces for sensors with other systems is explained by Estévez: “Lifeseeker integrates seamlessly with various onboard systems, providing real-time location data that can be displayed on a moving map. The map may include features such as terrain and environmental layers, which are part of the mapping system itself. While Lifeseeker does not directly command onboard elements like cameras or lights, the data it provides can be used by other systems to guide these tools, improving situational awareness and operational efficiency. Additionally, Lifeseeker is compatible with a wide range of mission management systems (MMS), including EuroNav, and can be adapted to meet the specific needs of different operational environments, ensuring versatility in mission planning and execution.”
The usefulness of airborne sensors in itself justifies police aviation, beyond the other roles it fills, and the operators using the sensors warmly anticipate further increases in utility: “In the police role, the sensors are the primary reason for getting airborne in the first place, so while they are items to learn and they do add to workload, that is to be expected. Our tactical flight officers feel that the operation of the system is intuitive and, once familiar, most of its utility can be quickly used. We are currently looking forward to the next evolution of NPAS role equipment as part of NPAS’ transformation,” said Harvey.
Different operators and different domains have different equipment and different ways of doing things. In order to ensure that everyone has access to the best tools, CarteNav is as platform agnostic as possible, with interoperability as a watchword: “We’ve built into the software an ability to understand how different platforms work,” said Richardson. “And we’ve been doing a lot of work looking at something called ATAK or TAK servers and those have become quite ubiquitous. They have a very, very standard way of exchanging data and they’re free. We’re seeing lots of different organizations adopting this. How can we efficiently feed information and standards that third parties can consume? We’re introducing a new tool called Mission Link, which is a way that third parties and the AIMS ecosystem can share things like events and tasks both to receive and to send them. That can then show on electronic flight bags, into back-end systems, and into some reporting and key performance indicator tools.”
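TAK systems exchange data as Cursor-on-Target (CoT) events, a small XML format. The sketch below builds a minimal event by hand; the uid, type code and field values are illustrative, and production code would use a proper XML library or the TAK ecosystem’s own helpers:

```python
from datetime import datetime, timedelta, timezone

def cot_event(uid, lat, lon, cot_type="a-f-A-M-H", minutes_valid=5):
    """Build a minimal Cursor-on-Target event string (illustrative values)."""
    now = datetime.now(timezone.utc)
    stale = now + timedelta(minutes=minutes_valid)
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    return (
        f'<event version="2.0" uid="{uid}" type="{cot_type}" '
        f'time="{now.strftime(fmt)}" start="{now.strftime(fmt)}" '
        f'stale="{stale.strftime(fmt)}" how="m-g">'
        f'<point lat="{lat}" lon="{lon}" hae="0" ce="25" le="10"/>'
        f'</event>'
    )

print(cot_event("rescue-hoist-1", 50.0127, -0.9803))
```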
New and evolving technology is making its way into special missions aircraft and expanding their capabilities. Not only is new technology revolutionizing the way operations are undertaken, and the time they take, but existing technology is being refined: lower SWaP-C is opening doors for public service and smaller operators, who can now afford to purchase these sensors and have room to install them across their fleets.
This development shows no sign of ceasing: more capabilities and uses, and better resolution and definition, are coming to the special missions sector, making flights safer and helping to save more people in the process.
Jon Adams
Jon is the Senior Editor of AirMed&Rescue. He was previously Editor for Clinical Medicine and Future Healthcare Journal at the Royal College of Physicians before coming to AirMed&Rescue in November 2022. His favorite helicopter is the Army Air Corps Lynx that he saw his father fly while growing up on Army bases.