Non-Contact System for Global Reporting Format (GRF) Automation: Contaminant Body Detection and Depth Estimation in the Case of Rainy Weather

Abstract

This paper presents a non-contact system for applying, on a runway, the Global Reporting Format (GRF) developed by the International Civil Aviation Organization (ICAO). The system films the surface (here, a runway) from the air, displays the contaminant (water) body and automatically measures the water depth during the inspection. While measuring, data are sent to a computer used as a receiver. The developed devices are automatic and were implemented especially for rainy weather, although they can also serve other cases. The aerial system uses a Raspberry Pi 4 Model B, a camera, a laser sensor, an ultrasonic module, Virtual Network Computing (VNC) and Python code developed by the authors. The results show that retrieving the Runway Condition Report (RCR) with these devices is very fast and requires no human presence on the runway. They also show that the method is a suitable answer to the GRF issues in rainy areas, where contaminant body detection and accurate depth measurement were previously poorly estimated for lack of an appropriate method.


1. Introduction

The GRF is a new ICAO methodology for assessing and reporting runway surface conditions, intended to reduce the safety risks related to runway excursions [1] [2] , the most common type of runway safety incident [3] . The Agency for the Safety of Air Navigation in Africa and Madagascar (ASECNA) zone has recently experienced a significant number of runway excursions caused by puddles on the runway [4] . Runway excursion is one of the ICAO high-risk categories of occurrences (HRCs) [5] .

The assessment is conducted by a trained observer (normally airport operations staff) who, using a globally recognized runway condition matrix, allocates a Runway Condition Code (RWYCC) to each third of the runway. This code is complemented by a description of the surface contaminant, including its type, depth and amount of coverage, again per third and using a globally recognized set of descriptors. The outcome of the evaluation is incorporated into a standard Runway Condition Report (RCR), which is forwarded to the air traffic control and aeronautical information services for transmission to pilots by SNOWTAM, by the Automatic Terminal Information Service (ATIS) and, if necessary, also by radio broadcast. By correlating the RCR with aircraft performance data provided by the manufacturer, the flight crew can then calculate their takeoff and landing performance [6] . This matching of standardized observation and reporting with standardized aircraft performance data is the key advantage of the GRF. Another important element of the GRF is the facility for flight crews to provide their own observations of runway surface conditions, used to confirm the RWYCC or to alert of changing conditions. It should also be noted that the GRF applies to all aerodromes, with or without winter conditions [1] [7] .

In the application of the GRF there is a practical gap: the lack of effective strategies for its automation. Although research on GRF automation exists in some laboratories in Europe and America, the current methods, while simple and proven, are still far from meeting the new international requirements for real-time monitoring of runway conditions. In addition, the significant growth in traffic increases runway occupation time, which human inspections must keep to a minimum. Following those laboratories [8] , and after our first paper on a related topic dealing with a GRF method based on ground-contact measurement [9] , we improve our approach by designing a non-contact system that completes the missing part of the work, runway contaminant body detection, and improves the existing solution, water depth measurement. The aim is to make the Global Reporting Format (GRF) application easier and fully automatic during rainy weather. As before, we address the cases of a WET surface, a surface with STANDING WATER and a DRY surface.

A Raspberry Pi 4 Model B, a Pi camera, a laser sensor, an ultrasonic module, Virtual Network Computing (VNC) and Python code are used to design a fully automatic system that determines the contaminant (water) body and depth on the runway, automatically generates a Runway Condition Report (RCR) and sends all data wirelessly to the office. Materials and methods are presented in Section 2. Section 3, Section 4 and Section 5 present the Results, Discussion and Conclusion, respectively.

2. Materials and Methods

2.1. Hardware Design and Software

The work in this paper is based on laboratory experiments using data from the Ouagadougou international airport and Kamboinsin runways to automatically report the contaminant body and depth on the runway and the Runway Condition Report (RCR). Various materials were used to design the measurement devices. On the one hand, there are a Raspberry Pi 4 Model B, a Pi camera, a laser sensor and an ultrasonic module for runway inspection and data transmission. On the other hand, a computer located at the office receives the data measured on the runway. The system runs software including Virtual Network Computing (VNC) and Python code for real-time monitoring of runway conditions.

The materials are presented in Figure 1 according to their uses:

To design the Automatic GRF System Device we have used the following materials and software:

1) Raspberry Pi 4 Model B & Pi camera,

- The Raspberry Pi is a series of small single-board computers (SBCs) developed in the United Kingdom and widely used in many areas, such as robotics and weather monitoring.

Figure 1. GRF automatic system illustration.

The Raspberry Pi hosts the pre-trained software for runway dryness and wetness classification (an illustrative training sketch is given after this materials list).

- The Pi Camera module can take pictures and record high-definition video. It attaches to the Raspberry Pi’s CSI port using a 15-pin ribbon cable. The camera is used to view the extent of water on the runway.

2) A laser sensor, a module that measures the exact height (in millimeters) of its position above the ground (assumed to be the runway).

3) An ultrasonic sensor, used to measure the exact height (in millimeters) of its position above the water surface.

4) An HP laptop (computer), used to write and run the code and to receive the data.

5) Python is a high-level, general-purpose, interpreted, object-oriented programming language. It is our main coding language.

6) Virtual Network Computing (VNC) is a graphical desktop-sharing system that uses the Remote Frame Buffer (RFB) protocol to remotely control another computer. We use it to monitor the inspection images provided by the Raspberry Pi.
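Item 1 above mentions a pre-trained model for dryness/wetness classification, but the training pipeline itself is not detailed in this paper. As a purely illustrative sketch (the framework, folder layout, image size and hyperparameters below are assumptions, not the authors' actual setup), such a binary classifier could be obtained by transfer learning with Keras:

```python
# Hypothetical sketch: training a wet/dry runway-image classifier by transfer
# learning. Folder layout, image size and hyperparameters are illustrative
# assumptions; the authors' real training pipeline is not described here.
import tensorflow as tf

IMG_SIZE = (224, 224)

# Expects runway_images/dry/*.jpg and runway_images/wet/*.jpg (assumed layout).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "runway_images", image_size=IMG_SIZE, batch_size=16)

base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False  # keep the pre-trained convolutional features frozen

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNetV2 expects [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),      # wet (1) vs dry (0)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=10)
model.save("wet_dry_classifier.keras")  # later loaded on the Raspberry Pi
```

The frozen backbone keeps the model small enough to run on a Raspberry Pi 4 while only the final dense layer is trained on the in situ runway images.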

2.2. Basic Principle of Use

The laser and ultrasonic sensors are leveled so that they indicate the same height when placed above dry ground. The Raspberry Pi is controlled via Virtual Network Computing (VNC). The whole system can be embedded on an inspection vehicle or an Unmanned Aerial Vehicle (UAV) that performs all the measurements along the runway. Here, however, owing to temporary difficulties in accessing the runway with a drone, we use pre-taken pictures and videos of the runway. When inspecting the runway, the Raspberry Pi performs the following tasks:

- Classifies the runway surface images according to whether the surface is dry or wet, using a pre-trained model obtained from in situ images;

- Colors the parts of the runway surface according to the detected condition (dry, wet or standing water): in our application we use red for dry and brown for wet. Standing water requires a water depth greater than 3 millimeters (Table 1). Since the depth cannot be measured from the images alone, a random function generates a water depth so that the standing-water case can be simulated; that function is not used for real measurements, where the laser and ultrasonic sensors are used;

- Calculates the percentage of each condition for each third of the runway;

- Calculates the width of the detected area, which is used in the case of standing water;

- Displays the data (real-time images and RCR) on the receiver laptop (computer) and saves them (a minimal sketch of this pipeline is given after Table 1).

Table 1. Runway Condition Assessment Matrix (RCAM) [10] , WET and DRY entries only (based on PANS-Aerodromes, Doc 9981).

Note. An RWYCC 5, 4, 3 or 2 cannot be upgraded.
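The task list above can be made concrete with a small sketch. The snippet below is a minimal, hypothetical illustration of the per-third processing (wet-pixel mask, colouring, coverage percentage and condition label); the function and variable names, the boolean mask and the simulated-depth helper are assumptions for illustration, not the authors' actual code.

```python
# Illustrative sketch of the per-third tasks listed above; names, the boolean
# wet-pixel mask and the simulated depth are assumptions, not the authors' code.
import random
import numpy as np

STANDING_WATER_MM = 3      # depth threshold for STANDING WATER (see Table 1)
DRY_COVERAGE_LIMIT = 25    # below 25% visible moisture the surface is DRY

def simulated_depth_mm():
    """Stand-in for the laser/ultrasonic reading when only images are available."""
    return random.uniform(0.0, 6.0)

def colour_condition(image_bgr, wet_mask):
    """Colour the surface: red for dry pixels, brown for wet pixels (as in the text)."""
    out = image_bgr.copy()
    out[~wet_mask] = (0, 0, 255)    # red in BGR for dry
    out[wet_mask] = (42, 42, 165)   # brown in BGR for wet / standing water
    return out

def third_report(wet_mask, depth_mm):
    """Coverage percentage and condition label for one runway third."""
    coverage = 100.0 * np.count_nonzero(wet_mask) / wet_mask.size
    if coverage < DRY_COVERAGE_LIMIT:
        condition = "DRY"
    elif depth_mm > STANDING_WATER_MM:
        condition = "STANDING WATER"
    else:
        condition = "WET"
    return round(coverage), condition
```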

The way the electronic modules work together to provide a useful Runway Condition Report (RCR) was explained in the earlier papers related to the Global Reporting Format. There, the water depth was measured by contact with the ground, a method that provides no information on the extent of the water on the runway or on the real proportion of each type of condition:

The runway surface is WET if it is covered by any visible dampness or water up to and including 3 mm in depth; there is STANDING WATER on the runway surface if the water depth exceeds 3 mm. When less than 25% of the runway surface shows visible moisture, it is considered a DRY surface (ASECNA) [9] .

2.3. Real-World Distance and Real-World Surface Area Equations

To obtain a real-world distance, we convert image pixel dimensions to real-world units (e.g., meters); to do so, we need to know the relationship (ratio) between pixel units and the desired units. This is typically achieved using a known reference distance. For example, if we know the width of a given object in both pixels and real-world units, we can set up a conversion factor (ratio), as shown in the following formula, and convert image pixels to real-world measurements:

Consider an object in the image with a measured width in pixels (pixel_width) and a measured height in pixels (pixel_height). Then:

real_world_distance = pixel_distance × (real_world_width / pixel_width) (1)

where real_world_width / pixel_width = calibration_scale, the calibration scale (ratio).

To calculate the real-world surface area (A) of a limited surface in square meters using the calibration scale, we can use a similar equation as it follows:

A = (pixel_width × pixel_height) × (calibration_scale)² (2)

Then the percentage of a limited (wet) part of the runway surface is calculated, knowing the area of the entire runway third, according to the following formula:

Percentage = (Limited_Part_Area / Runway_Third_Surface_Area) × 100 (3)
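As a worked illustration of Equations (1)-(3), the short sketch below converts a detected patch from pixels to square meters and then to a coverage percentage; the calibration value, patch size and runway dimensions are made-up example numbers, not measured data.

```python
# Worked illustration of Equations (1)-(3); all numbers are made-up examples.

def real_world_distance(pixel_distance, calibration_scale):
    """Equation (1): pixel distance to meters via the calibration scale (m/pixel)."""
    return pixel_distance * calibration_scale

def real_world_area(pixel_width, pixel_height, calibration_scale):
    """Equation (2): pixel rectangle to square meters."""
    return (pixel_width * pixel_height) * calibration_scale ** 2

def coverage_percentage(part_area_m2, third_area_m2):
    """Equation (3): share of one runway third covered by the detected part."""
    return 100.0 * part_area_m2 / third_area_m2

# Example: a reference object 3.0 m wide spans 600 pixels -> 0.005 m per pixel.
scale = 3.0 / 600
patch_area = real_world_area(800, 400, scale)   # 8.0 m^2 wet patch
third_area = 1000 * 45                          # one third of a 3000 m x 45 m runway
print(round(coverage_percentage(patch_area, third_area), 3))  # ~0.018 (%)
```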

2.4. Water Depth Equations

D: real distance between the Runway surface and the (laser and ultrasonic) sensors’ position;

d: real distance between the water surface and the (laser and ultrasonic) sensors’ position measured by the ultrasonic sensor;

Da: apparent distance between the runway surface and the (laser and ultrasonic) sensors’ position, measured by the laser sensor through the air and the water;

Depth: height of the water on the runway surface;

ΔD = Da − D (4)

n: refractive index of rain water, n ≈1.344 [11] ;

c: light speed in the air;

v: light speed in the water,

c = v × n, i.e. v = c / n [12] (5)

t: round trip duration of laser light in the air part,

t = (2 × d) / c (6)

τ: round trip duration of laser light in the water part,

τ = (2 × Depth) / v (7)

When the laser is on, the light travels from the laser position to the runway surface and back to the laser through the water, since water is transparent. The travel lasts (t + τ), but the laser’s built-in formula assumes propagation at the speed c throughout.

So,

Da = (1/2) × c × (t + τ) = (1/2) × c × t + (1/2) × c × τ (8)

and

D = (1/2) × c × t + (1/2) × v × τ (9)

Subtracting (9) from (8) gives:

Da − D = (1/2) × c × t + (1/2) × c × τ − (1/2) × c × t − (1/2) × v × τ

Da − D = (1/2) × c × τ − (1/2) × v × τ

Da − D = (1/2) × (c − v) × τ,

where, according to Equation (7), τ = (2 × Depth) / v.

So Da − D = (1/2) × (c − v) × (2 × Depth) / v, with v = c / n according to Equation (5).

Thus,

Da − D = (1/2) × (c − c/n) × (2 × Depth) / (c/n)

Da − D = (1/2) × c × (1 − 1/n) × (2 × n × Depth) / c

Da − D = (1 − 1/n) × n × Depth

Da − D = (n − 1) × Depth

Therefore,

Depth = (Da − D) / (n − 1), or Depth ≈ 2.906977 × (Da − D) (10)

If an ultrasonic sensor is available to retrieve the distance d, we can continue as follows:

Since D = d + Depth, Equation (10) becomes Depth = (Da − d − Depth) / (n − 1).

So n × Depth − Depth = Da − d − Depth, hence n × Depth = Da − d.

Finally,

Depth = (Da − d) / n, or Depth ≈ 0.744048 × (Da − d) (11)
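Equations (10) and (11) can be checked numerically with a short sketch; the sensor readings below are invented values used only to verify the algebra, and the refractive index follows reference [11].

```python
# Numerical check of Equations (10) and (11); the readings are invented values.
N_RAIN_WATER = 1.344  # refractive index of rain water [11]

def depth_from_laser_only(apparent_mm, true_mm, n=N_RAIN_WATER):
    """Equation (10): Depth = (Da - D) / (n - 1), laser plus known dry-ground height."""
    return (apparent_mm - true_mm) / (n - 1)

def depth_from_laser_and_ultrasonic(apparent_mm, ultrasonic_mm, n=N_RAIN_WATER):
    """Equation (11): Depth = (Da - d) / n, laser plus ultrasonic water-surface height."""
    return (apparent_mm - ultrasonic_mm) / n

# Example: sensors levelled at D = 1000 mm above the runway with 5 mm of water.
# The laser 'sees' Da = D + (n - 1) * Depth, the ultrasonic sees d = D - Depth.
Da, D, d = 1000 + (N_RAIN_WATER - 1) * 5, 1000, 1000 - 5
print(round(depth_from_laser_only(Da, D), 3))            # 5.0 mm
print(round(depth_from_laser_and_ultrasonic(Da, d), 3))  # 5.0 mm
```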

3. Results

We have provided our results in tables and figures:

When an accurate ultrasonic sensor is associated, varying the height of the device (laser + ultrasonic) does not affect the measured water depth value, in accordance with Formula (11). In fact, the laser and ultrasonic sensors are leveled just before the experiments start.

The reliability of the system’s RCR was checked by using it in the presence of Ouagadougou airport inspectors, for a WET surface, a surface with STANDING WATER and a DRY surface. The experiments were done in the laboratory using in situ data, with the airport ICAO location indicator (DFFD) [15] and the runway number (04) used as default data in the RCR message.

4. Discussions

By using laser and ultrasonic sensors and optical (light) properties (illustrated in Figure 2), we derived Equations (10) and (11). We then carried out experiments to check these equations. The procedure was to place the laser at different heights and find the relationship between ΔD, the difference (Da − D), and the water depth (Depth). Figure 3 shows experiments done at heights of 645 mm, 846 mm, 1059 mm and 1637 mm, which give almost the same equation as Equation (10). The comparison is made in Figure 4: the graphs tend towards the reference graph, the one representing Equation (10), and the graphs obtained for heights above 1000 mm are almost merged over the considered range of water depth (from 0 mm to 15 mm) [16] .

Differences between the computed results and the experiments can be explained on the one hand by measurement errors due to the sensors’ accuracy and on the other hand by reading errors.

In any case, these errors affect the retrieved runway conditions only very slightly, and the Runway Condition Report (RCR) remains the same. According to Equation (11) and the accuracy (±1 mm) of the laser and ultrasonic sensors, the accuracy of the depth measurements is ±0.744048 × (1 mm + 1 mm) ≈ ±1.488 mm [17] [18] .

Figure 2. Use of laser and ultrasonic sensors to retrieve water depth on a runway.

Figure 3. Use of the laser at different heights to retrieve water depth on the ground.

Figure 4. Comparison of the Figure 3 plots with the Equation (10) plot [13] .

Thus, the accuracy of the water depth measurements depends mainly on the accuracy of the sensors used.

Figure 5 shows that the accuracy of our system in detecting wetness and dryness is very good, although it can still be improved. Most importantly, this does not affect the Runway Condition Report (RCR), an example of which is given in Figure 6. In Figure 6, the acronym DFFD means that the measurements were taken at Ouagadougou airport [15] . The 8 digits that follow (10111557) indicate the day and time at which the measurements were made: here, October 11 at 03:57 p.m. (15:57). The next digits (04) give the recorded runway number. The remaining digits and letters in the message give the runway condition code (RWYCC = 5); all thirds of the runway are entirely (100) wet with no reported (NR) water depth (depth ≤ 3 mm) [14] . Our system can therefore be considered a proper application for an easier and faster Global Reporting Format (GRF), provided it is improved further.
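The RCR string decoded above follows a fixed field order: location indicator, date and time digits, runway number, then per-third RWYCC, coverage, depth and condition. The sketch below is a minimal, hypothetical formatter that reproduces the example of Figure 6; the helper names and the exact separators are assumptions, not the authors' implementation or the official SNOWTAM syntax.

```python
# Minimal sketch of assembling an RCR-style string from per-third results,
# mirroring the example decoded from Figure 6. Separators are assumptions.
from datetime import datetime

def format_rcr(location, runway, thirds, when=None):
    """thirds: list of 3 dicts with keys 'rwycc', 'coverage', 'depth', 'condition'."""
    when = when or datetime.utcnow()
    stamp = when.strftime("%m%d%H%M")  # e.g. 10111557 for October 11, 15:57
    join = lambda key: "/".join(str(t[key]) for t in thirds)
    return " ".join([location, stamp, runway,
                     join("rwycc"), join("coverage"), join("depth"), join("condition")])

wet_third = {"rwycc": 5, "coverage": 100, "depth": "NR", "condition": "WET"}
print(format_rcr("DFFD", "04", [wet_third] * 3,
                 when=datetime(2023, 10, 11, 15, 57)))
# DFFD 10111557 04 5/5/5 100/100/100 NR/NR/NR WET/WET/WET
```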

The inspection of a 3000-meter-long runway (as at Ouagadougou airport) lasts about 5 minutes.

Indeed, by using a drone flying at a speed of 10 meters per second [19] , the runway occupation time during measurements (T_measurements) is:

T_measurements = 3000 / 10 = 300 seconds = 5 minutes

The most interesting aspect of this system is that it immediately and wirelessly sends the data (images and computed aeronautical information) to the inspection office. Our system therefore spares the runway inspectors from going onto the runway in the rain to evaluate the RCR, and they no longer occupy a runway that is in heavy demand because of traffic growth [8] . The drone needs to stay on a 3000-meter runway for only about 5 minutes, against the roughly 20 minutes that runway inspectors currently spend on it [9] .

Figure 5. Detection of dryness and wetness on the Ouagadougou & Kamboinsin aerodrome runways.

Figure 6. Example of a Runway Condition Report (RCR), wetness case [14] .

In addition, the system is easy to use because it works autonomously. Inspectors will only have to run it.

Using this system also protects inspectors from the rain, since they no longer have to leave their office for the various measurements. As the system is automatic, there is no risk of mistakes when writing the RCR: after taking the measurements itself, it generates, stores and sends the RCR automatically and immediately.

5. Conclusion

This paper has presented an automatic system for Global Reporting Format inspections. The system we developed is an autonomous, automatic application implemented for rainy weather. It uses a Raspberry Pi 4 Model B, a camera, a laser sensor, an ultrasonic module, Virtual Network Computing (VNC) and Python code developed by the authors. The code implements a system built from sensors and from the Runway Condition Report (RCR) determination rules defined by ICAO. The results show that using this system to retrieve the Runway Condition Report (RCR) is fast and that no human presence on the runway is needed. Using the system does not alter the Runway Condition Report (RCR), and the inspection data are generated, stored and sent automatically and immediately to the supervision office. The system can therefore be considered for adoption, and it could be further improved with more accurate sensors or other enhancements.

Acknowledgements

I would like to express my appreciation for the additional support received throughout the course of this project, which may not have been captured in the author contribution or funding sections. I am grateful for the administrative support provided by the Higher Education Support Project (PAES) and the Delegation to National Aeronautical Activities (ASECNA/DAAN, Burkina Faso). Their instrumental assistance in various aspects, such as laboratory management, equipment supply and logistical coordination, was invaluable in ensuring the smooth progression of the research. Finally, I would like to thank the members of the GRVC Robotics, Vision and Control Lab of the University of Seville for fostering an environment of collaboration and knowledge exchange.

Author Contributions

All authors contributed to the study conception and design. Material preparation, data collection and analysis were performed by Dieudonné SAMA, Brahima BAYILLI, Oula Fayçal Denilson OUATTARA. The first draft of the manuscript was written by Dieudonné SAMA, supervised by Doua Allain GNABAHOU and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Data Availability

Not applicable.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

References

[1] ICAO (2021) Implementation of Global Reporting Format for Runway Surface Conditions (GRF) Guidance Based on Management of Change (MOC). 1.
https://www.icao.int/safety/SiteAssets/Pages/GRF/GRF%20Management%20of%20Change_1.0.pdf
[2] ICAO (2019) Cir 355, évaluation, mesure et communication de l’état des surfaces de pistes.
https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=&ved=2ahUKEwiKzM3QntWCAxUtcKQEHZrTDkQQFnoECAwQAQ&url=https%3A%2F%2Fwww.bazl.admin.ch%2Fdam%2Fbazl%2Ffr%2Fdokumente%2FFachleute%2FFlugplaetze%2FICAO%2Ficao_circular_355_assessment_measurement_and_reporting_of_runway_surface_conditions.pdf.download.pdf%2FICAO%2520Circular%2520355%2520Assessment%2C%2520Measurement%2520and%2520Reporting%2520of%2520Runway%2520Surface%2520Conditions.pdf&usg=AOvVaw28u_uT_f5PNC1WToSKrgS6&opi=89978449
[3] Manzi, N.M. (2020) Méthodologie de compte rendu mondial sur l’état de la surface des pistes (GRF).
https://www.icao.int/WACAF/Documents/Meetings/2020/GRF2019/1.%20G%C3%A9n%C3%A9ralit%C3%A9s.pdf
[4] ICAO (2013) Inclusion de l’observation d’état de la piste dans les messages METAR/SPECI et met report/spécial élaborés dans la région AFI. Présentée par l’ASECNA.
https://www.icao.int/WACAF/Documents/APIRG/SG/2013/apirg-met-sg11/WP11_fr.pdf
[5] ICAO (2022) Global Aviation Safety Plan (Doc 10004) 2023-2025.
https://www.icao.int/safety/GASP/Documents/10004_en.pdf
[6] Blanchard, G. and Rado, Z. (2019) Runway Weather Information Systems: State of the Art and Main Issues for Standardization. ICAO/ACI Symposium Implementation of the New Grf for Runway Surface Condition, Montreal, March 28th 2019.
https://www.icao.int/Meetings/grf2019/Documents/Presentations/GRF2019%20S8%20Guilhem%20Blanchard%20and%20Zoltan%20Rado%20-%20STAC%20and%20AST.pdf
[7] Agence pour la Sécurité de la Navigation Aérienne en Afrique et à Madagascar (ASECNA) (2021) Vingt-et-quatrième réunion du Groupe régional AFI de Planification et de Mise en œuvre (APIRG/24). Mise en œuvre du GRF à l’ASECNA et défis.
https://www.icao.int/WACAF/Documents/APIRG/APIRG%2024/IP06%20-%20APIRG24%20-%20Mise%20en%20oeuvre%20du%20GRF%20et%20d%C3%A9fis.pdf
[8] (2019) Ophélia, un outil de prévision des hauteurs d’eau sur les pistes pour l’information aéronautique. Lyon, France.
https://www.cerema.fr/fr/actualites/ophelia-outil-prevision-hauteurs-eau-pistes-information
[9] Sama, D., Gnabahou, D.A., Ouattara, F., Zidouemba, M., Diassibo, O. and Sandwidi, S.A. (2022) Global Reporting Format (GRF) Application Automation for Runway Surface Conditions in West Africa. Advances in Aerospace Science and Technology, 7, 135-145.
https://doi.org/10.4236/aast.2022.73009
[10] ICAO (2020) Doc 9981, Procédures pour les services de navigation aérienne Aérodromes. Canada, 1.
https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=&ved=2ahUKEwi3lq-1qdWCAxX3VaQEHTiuDCMQFnoECAkQAQ&url=https%3A%2F%2Fwww.bazl.admin.ch%2Fdam%2Fbazl%2Ffr%2Fdokumente%2FFachleute%2FFlugplaetze%2FICAO%2Ficao_doc_9981_pans-aerodromes.pdf.download.pdf%2FICAO%2520doc%25209981%2520PANS-Aerodromes.pdf&usg=AOvVaw3YvkMs0sW3cohdQtQBNn5K&opi=89978449
[11] Taras P. and Jens, R. (2017) Relation between Raman Backscattering from Droplets and Bulk Water: Effect of Refractive Index Dispersion. Journal of Quantitative Spectroscopy and Radiative Transfer, 208, 172-178.
https://www.sciencedirect.com/science/article/abs/pii/S0022407317309433
[12] Wikipedia-Refractive Index.
https://en.wikipedia.org/wiki/Refractive_index
[13] Matplotlib (2023) Matplotlib: Visualization with Python.
https://matplotlib.org/
[14] Manzi, N.M. (2019) Le rapport sur l’état des pistes (RCR). ICAO Dakar Uniting aviation, ICAO Dakar.
https://www.icao.int/WACAF/Documents/Meetings/2019/ICAO-ACI-Implemnetation%20runways/3.%20Rapport%20sur%20l%27%C3%A9tat%20des%20pistes.pdf
[15] ICAO, (2023) Location Indicators (Doc 7910/187). Only Manual Version.
https://store.icao.int/en/location-indicators-doc-7910
[16] Civil Aviation Authority of Sri Lanka (2021) Manual on Global Reporting Format (GRF) Implementation.
https://www.caa.lk/images/pdf/guidance_material/subject_specific/SLCAP_2500.pdf
[17] JRT (2023) JRT M703A 40 m Laser Distance Meter Module Sensor.
https://www.jrt-measure.com/laser-distance-meter-module/52419167.html
[18] DFROBOT (2020) SKU:SEN0358, URM14.
https://wiki.dfrobot.com/URM14_RS485_Precision_Ultrasonic_Sensor_200KHz_SKU_SEN0358
[19] How Fast Can a Drone Fly? Remoteflyer.
https://www.remoteflyer.com/how-fast-can-a-drone-fly-with-examples/

Copyright © 2024 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.