Safe automated collision avoidance is the key to out-of-sight flights

Skyline of Hamburg

Our skies are getting crowded. More and more drones are in use around the world, with new ones added every day. In Germany alone, the number of drones (UAVs) in use is expected to grow by more than 550% over the next 10 years: an estimated 126,000 commercial and 721,000 privately used UAVs will then be in operation. They will share the airspace with conventional helicopters and airplanes as well as new types of air taxis (VTOLs). Even though only a small proportion of these will be in the air at any one time, the question of how safe use and accident-free flight can be achieved is already being asked today. Not all missions can be flown within visual line of sight, and a drone pilot cannot always detect every obstacle or other aircraft in the airspace with the naked eye. Collision avoidance therefore becomes an essential safety, permitting and operational factor for the mass use of drones, both commercial and private.

Hamburg, as the second most populous German city, with the second largest port in Europe and the third largest civil aviation location worldwide, is a suitable place to advance the safe use of drones and VTOLs as a European model region for urban air mobility. So far, AI has not yet been used in aviation. In the KOJAK project, funded by IFB Hamburg, the AI software company Spleenlab (SPL) therefore wants to research, test and validate collision avoidance systems for VTOLs (vertical take-off and landing aircraft) beyond visual range through safe autonomous control with the help of artificial intelligence, together with the Technical University of Hamburg (TUHH). AI and machine learning, in particular concepts and methods of multi-sensor deep learning, have a high potential to detect and classify traffic events in the air and thus provide for higher safety.
The novel fusion of different sensors with artificial intelligence developed by SPL and TUHH will enable aircraft for the first time to autonomously recognize their environment in real time and thus safely avoid collisions with static and dynamic objects.

Hamburg, Urban Air Mobility model city and the third largest civil aviation location worldwide, is the perfect place for KOJAK (Source: hamburg-aviation.de) [Photo by Bernd Dittrich on Unsplash]

In fact, neither conventional helicopters, small aircraft nor drones today have an effective and safe system for avoiding collisions with uncooperative airspace participants and objects. Uncooperative airspace participants are those that do not broadcast their own position to other airspace participants of their own accord and must therefore be detected independently. While systems such as TCAS, ADS-B and FLARM are used for collision avoidance in classical aviation, new approaches are required for unmanned and/or autonomous systems.
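To make the distinction concrete, the contrast between cooperative and uncooperative traffic can be sketched as follows. This is a minimal, hypothetical Python model: the `Track` type and its fields are purely illustrative and not part of any actual avionics interface or of the KOJAK software.

```python
from dataclasses import dataclass


@dataclass
class Track:
    """One tracked airspace participant (illustrative simplified model)."""
    track_id: str
    position: tuple      # (lat, lon, altitude) estimate
    self_reported: bool  # True if the position was broadcast by the
                         # participant itself (e.g. via ADS-B or FLARM)


def traffic_class(track: Track) -> str:
    """Cooperative traffic announces its own position; uncooperative
    traffic must be perceived by the aircraft's own onboard sensors."""
    if track.self_reported:
        return "cooperative"    # position received via transponder/broadcast
    return "uncooperative"      # must be detected by camera, lidar, etc.
```

In this simplified view, everything that falls into the "uncooperative" branch is exactly the traffic that onboard perception, as pursued in KOJAK, has to handle.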

Currently, general aviation and UAVs still rely heavily on visual identification by the (drone) pilot, supported by an Unmanned Air Traffic Management (UTM/ATM) system under development. However, as the level of autonomy and the number of (partially) autonomous airspace participants increase, so do the requirements on the collision avoidance and navigation performance of these vehicles, including under GNSS (e.g. GPS) failure scenarios. In addition, in the low-altitude airspace in which drones and, in the future, manned autonomous air vehicles operate, a significantly higher number of uncooperative obstacles can be expected, which requires autonomous, automatable obstacle identification by the airspace participants themselves. Onboard perception of this kind also provides redundancy to the ATM system for cooperative airspace participants. Commonly used optical sensors such as cameras currently lack the additional functions and supporting software to perform this task in real time or autonomously. This is where the development of the KOJAK project comes in.

Environment analysis for collision avoidance using two sensors and sensor fusion

By fusing state-of-the-art sensor technology (camera, lidar) with artificial intelligence, standards for safe collision avoidance and for safe, GNSS (GPS)-independent autonomous navigation during take-off, landing and flight are being developed for the first time, especially for extreme situations (e.g. low-level flight with many static collision objects) under real conditions. The intelligent real-time decisions are achieved by the unprecedented fusion of the different sensor data: the intelligent networking of camera and lidar with the AI algorithms developed by SPL enables the UAV to independently recognize its environment in real time and to safely avoid obstacles, whether static or dynamic. The focus is in particular on machine learning methods based on artificial neural networks. All of this is being researched and developed further for widespread application as part of the funded KOJAK project.
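As an illustration of the kind of camera-lidar fusion described above, the following sketch assigns a range to a 2D obstacle detection (e.g. a bounding box from a neural-network detector) by projecting lidar points into the camera image with a standard pinhole model. This is a minimal geometric sketch under idealized assumptions (known intrinsics `K` and extrinsics `T_cam_lidar`, no lens distortion), not SPL's actual algorithms; all function names are hypothetical.

```python
import numpy as np


def project_to_image(points_lidar, K, T_cam_lidar):
    """Project 3D lidar points (N x 3) into the image plane using the
    camera intrinsics K (3x3) and lidar-to-camera extrinsics (4x4)."""
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]   # lidar frame -> camera frame
    in_front = pts_cam[:, 2] > 0                 # keep points in front of the camera
    uv = (K @ pts_cam[in_front].T).T
    uv = uv[:, :2] / uv[:, 2:3]                  # perspective division -> pixels
    return uv, pts_cam[in_front]


def obstacle_range(bbox, points_lidar, K, T_cam_lidar):
    """Fuse a 2D detection box (u_min, v_min, u_max, v_max) with lidar:
    return the closest lidar range among points projecting into the box,
    or None if no lidar return falls on the detection."""
    uv, pts_cam = project_to_image(points_lidar, K, T_cam_lidar)
    u_min, v_min, u_max, v_max = bbox
    inside = ((uv[:, 0] >= u_min) & (uv[:, 0] <= u_max) &
              (uv[:, 1] >= v_min) & (uv[:, 1] <= v_max))
    if not inside.any():
        return None
    return float(np.linalg.norm(pts_cam[inside], axis=1).min())
```

The camera branch answers "what and where in the image is the obstacle", the lidar branch answers "how far away is it"; combining both yields the metric 3D information an avoidance maneuver needs.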


News | 29.04.2021