Topic

Transports

Between 2018 and 2021, EVS worked on an innovative project aimed at creating a distributed, AI-based vision system for monitoring birds at airports. The team consisted of EVS, the customer, the patent holder, a research institute and a system integrator. EVS designed, developed and deployed the entire analytics and back-end software stack and contributed significantly to the conception, design and validation of the full system.

Problem

Bird strikes

“Bird strikes” have occurred ever since humans learned to fly in a space that was previously the realm of birds. Accidents occur mainly during take-off and landing and are becoming more frequent due to growing air traffic and the increasing presence of birds. Impacts with larger birds are the most dangerous, but large flocks, whatever the size of the birds, are also a serious threat.

The resulting damage and related costs are very high: aircraft maintenance, flight cancellations and delays with the resulting inconvenience to passengers, not to mention the risk of injury. The impact on birdlife itself should not be underestimated.

Airports have always attracted birds. Biologists studying the phenomenon know that the key to solving the problem is observation (data collection and analysis), knowledge of the habits of the different species, and changing the environment to make it inhospitable to birds. This is why airport management companies are required to take measures to monitor the phenomenon by installing equipment and deploying specialized teams.

Solution

Artificial vision system that uses a set of network cameras

EVS has developed an artificial vision system that uses a set of high-resolution PTZ network cameras, positioned so as to constantly monitor the entire airport grounds and the overlying and adjacent airspace, and to detect the presence of birds and wildlife in general. The video analysis software controls the cameras’ movements and processes the video streams with GPU-accelerated video processing, computer vision and machine learning algorithms. The system can detect and track any low-flying bird within the operating space; moreover, it can classify the bird species and estimate each bird’s location in 3D space. All this information is not only collected in a database for statistics and strategic planning purposes; it can also potentially be used for flight safety and to trigger automatic dissuasion systems, such as distress calls.
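As a rough illustration of how a single camera stream might be analyzed (hypothetical Python code, not the actual EVS pipeline), the sketch below picks out small moving objects with background subtraction and leaves tracking, species classification and 3D localization as downstream stages:

# Hypothetical per-camera analysis loop: motion-based candidate detection on
# one RTSP stream. The URL and thresholds are placeholders, not EVS settings.
import cv2

def analyze_stream(rtsp_url: str) -> None:
    cap = cv2.VideoCapture(rtsp_url)
    # Background subtraction highlights small moving objects such as birds.
    subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)
        # Extract candidate detections as contours of moving regions.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for contour in contours:
            if cv2.contourArea(contour) < 4:  # ignore single-pixel noise
                continue
            x, y, w, h = cv2.boundingRect(contour)
            # Downstream stages (tracking, classification, 3D localization)
            # would consume these candidate boxes.
            print(f"candidate at ({x}, {y}) size {w}x{h}")
    cap.release()

if __name__ == "__main__":
    analyze_stream("rtsp://camera-01.example/stream")  # placeholder URL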

The target applications mainly concern safety at airports, but the system can also be used at wind farms, as well as for monitoring agricultural areas, urban centers and wildlife oases.

Added value

Accurate and objective statistics to keep the number of birds under control

The system is designed to assist bird control units (BCUs) so that they only need to intervene in the field when strictly necessary. In this way, they can focus their efforts on analyzing the enormous amount of data that the system provides in order to plan effective strategies to make the airport an inhospitable environment for birds.

The value therefore lies in obtaining accurate and objective statistics to keep the number of birds under control in order to lower the risk indexes to within the limits imposed by the regulations.

Project partner

Aerospace sector

Project carried out in collaboration with a research institute on behalf of a private client in the aerospace sector.

Insights

The system is scalable to the operating space

The system is scalable to the operating space: processing takes place on an HPC server equipped with multicore processors and an array of GPUs. Computation is CUDA-accelerated, and each GPU can support up to four 4K cameras simultaneously.
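For illustration only, the snippet below shows one way a camera-to-GPU assignment could be expressed while respecting the four-streams-per-GPU limit stated above; it is not the scheduling logic of the deployed system.

# Illustrative sketch: round-robin assignment of 4K camera streams to GPUs,
# capped at four streams per GPU (the limit mentioned in the text).
from typing import Dict, List

MAX_STREAMS_PER_GPU = 4

def assign_cameras(camera_ids: List[str], num_gpus: int) -> Dict[int, List[str]]:
    """Round-robin camera-to-GPU assignment; raises if capacity is exceeded."""
    if len(camera_ids) > num_gpus * MAX_STREAMS_PER_GPU:
        raise ValueError("Not enough GPUs for the requested number of 4K streams")
    assignment: Dict[int, List[str]] = {gpu: [] for gpu in range(num_gpus)}
    for i, cam in enumerate(camera_ids):
        assignment[i % num_gpus].append(cam)
    return assignment

print(assign_cameras([f"cam-{n:02d}" for n in range(1, 9)], num_gpus=2))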

The amount of data to be processed is enormous and requires extremely thorough optimization of the algorithms. Since birds appear as very small moving objects, their behavior, both in flight and on the ground, needs to be modelled carefully in order to distinguish them effectively from other agents in the scene. Both computer vision algorithms and machine learning methods are used to detect and track the birds, analyze their trajectories, extract flight properties such as speed, direction, flock size and wing-beat frequency, geolocate them, and classify the species.
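As a simplified, hypothetical example of how two of these flight properties could be derived from a track (the production algorithms are more elaborate), speed and heading can be estimated from successive 3D positions, and wing-beat frequency from the periodic variation of the detected blob area:

# Simplified derivation of flight properties from a tracked detection.
# Sampling rates, units and signal choices are assumptions for illustration.
import numpy as np

def speed_and_heading(positions: np.ndarray, dt: float) -> tuple[float, float]:
    """positions: (N, 3) array of x, y, z in metres sampled every dt seconds."""
    velocities = np.diff(positions, axis=0) / dt
    mean_v = velocities.mean(axis=0)
    speed = float(np.linalg.norm(mean_v))
    heading_deg = float(np.degrees(np.arctan2(mean_v[1], mean_v[0])))
    return speed, heading_deg

def wing_beat_frequency(blob_area: np.ndarray, fps: float) -> float:
    """Dominant frequency (Hz) of the blob-area signal, excluding the DC term."""
    spectrum = np.abs(np.fft.rfft(blob_area - blob_area.mean()))
    freqs = np.fft.rfftfreq(len(blob_area), d=1.0 / fps)
    return float(freqs[1:][np.argmax(spectrum[1:])])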

Bird events are logged in a database that can be accessed from a remote front-end through a REST interface. The REST API makes the system easy to interface with workstations, video walls or even mobile platforms such as tablets and smartphones. It can be used to generate alerts and alarms, query historical data, verify which video clips generated which events, or even remotely take control of a single camera to get a closer look at the area of interest.
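By way of illustration, a front-end could query historical events over such a REST interface along the following lines; the host, endpoint path, parameters and field names are assumptions, not the actual API.

# Hedged illustration of a historical-data query against the REST interface.
import requests

BASE_URL = "https://birdmon.example/api/v1"  # placeholder host and path

def fetch_events(species: str, date_from: str, date_to: str) -> list[dict]:
    """Return logged bird events for one species within a date range."""
    response = requests.get(
        f"{BASE_URL}/events",
        params={"species": species, "from": date_from, "to": date_to},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # assumed response shape: list of event objects

for event in fetch_events("larus_michahellis", "2021-03-01", "2021-03-31"):
    print(event["timestamp"], event["camera_id"], event["position_3d"])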
The system can be interfaced with third-party video management systems for video recording and for tagging video with event metadata.
Additional tools have been developed for camera calibration, configuration, data annotation and model training as well as for testing and measuring performance.
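Calibration tooling of this kind is often built on standard chessboard calibration; the sketch below uses OpenCV with an assumed pattern size and image folder, and does not reproduce the actual EVS tools.

# Sketch of intrinsic camera calibration from chessboard images (OpenCV).
import glob
import cv2
import numpy as np

PATTERN = (9, 6)       # inner corners of the chessboard (assumed)
SQUARE_SIZE = 0.025    # square side in metres (assumed)

# 3D coordinates of the chessboard corners in the board's own reference frame.
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_points, img_points, image_size = [], [], None
for path in glob.glob("calib_images/*.png"):  # placeholder image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    image_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Estimate the intrinsic matrix and distortion coefficients.
rms, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None
)
print("RMS reprojection error:", rms)
print("Camera matrix:\n", camera_matrix)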
The system is now being tested at several airports in Europe and worldwide.