Applications of Commercial Truck Dash Cameras for Work Zone Inspection and Monitoring

Abstract

Transportation agencies manage hundreds of active work zones, often spread across large geographic regions. Traditionally, verifying compliance with maintenance of traffic (MOT) plans has required in-person inspections by staff or contractors, which are time-consuming, resource-intensive, and often require extensive driving with an image or video recording device. With limited staffing, it is only possible to inspect a small subset of work zones using field visits. The emergence of commercial truck dash cameras that can provide images at 1 second intervals from several trucks a day now enables an agency to review dash camera images to “virtually” drive their work zones. This not only drastically reduces travel time and costs, but also provides an opportunity to perform repetitive, weekly inspections of work zones to determine compliance with an agency’s practices. A series of case studies covering sign placement, lane configuration, temporary work zones, maintenance closures, and pavement marking applications is presented. The paper concludes with an example of two different artificial intelligence (AI) models that can be used to process these images to determine the presence of a work zone on interstate roads. Across a suite of 40 images from 8 states and 20 interstate routes, both models performed very well, demonstrating considerable opportunity to integrate commercial dash camera images with AI models to screen work zones at scale for further human review.

Share and Cite:

Overall, M., Mukai, J., Sakhare, R., Desai, J., Horton, D. and Bullock, D. (2026) Applications of Commercial Truck Dash Cameras for Work Zone Inspection and Monitoring. Journal of Transportation Technologies, 16, 95-114. doi: 10.4236/jtts.2026.161006.

1. Introduction

In 2023, there were over 100,000 work zone crashes across the United States [1]. Agencies devote considerable time and effort to carefully designing work zone geometries, markings, and signs to ensure high quality maintenance of traffic (MOT). However, work zones are extremely complex due to the large number of stakeholders, ranging from initial design engineers to prime contractors and sub-contractors. Work zone signs, markings, barrier walls, and barrels are items that often require considerable inspection effort to ensure the work zone MOT is consistent and compliant with agency standards. To illustrate the dynamic nature of work zones, images from roadside monitoring cameras, also called Intelligent Transportation System (ITS) cameras, are shown below.

Figure 1 shows an ITS camera image along interstate I-65 in Indiana just upstream of a temporary work zone that requires a reduced speed limit. In this case, the static speed sign is located in close proximity to a work zone speed limit trailer. The static speed limit sign shows a 65 mile per hour (MPH) speed limit, while the work zone speed limit trailer about 120 feet further downstream shows a 55 MPH speed limit when flashing. The Manual on Uniform Traffic Control Devices (MUTCD) states that existing permanent regulatory signs shall be removed or covered if a temporary traffic control zone requires regulatory measures that are different [2]. These types of situations occur quite frequently and are very difficult for agencies to monitor and provide timely feedback on. In this particular case, while a roadside camera was able to capture the scenario and could be used for subsequent feedback, the high cost of roadside cameras makes this type of monitoring virtually impossible to repeat at scale.

Figure 1. ITS camera image on I-65 in Indiana showing a static speed limit sign and a work zone speed limit trailer in close proximity.

Figure 2 shows two ITS camera images along I-65 in Indiana in an active work zone where 3 lanes of travel are being maintained and a concrete barrier wall is being installed directly adjacent to live traffic. Although this sequence was captured on a roadside ITS camera, this is another example of an active construction zone that is very difficult to monitor for compliance with standard practices. When one considers that this is a moving work zone activity, with barrier wall being placed at approximately 900 feet per hour, roadside cameras and/or inspectors are challenged to monitor these activities as the work progresses.

Figure 2. ITS camera images on I-65 in Indiana showing the installation of a concrete barrier wall during the day next to live traffic.

While inspectors can drive through a work zone for inspection purposes, this type of inspection is costly and labor intensive, often requiring multiple agency staff: one to drive and one to document the work zone. In some cases, it is done with one driver and a video recorder, but that requires further effort to review the video after the drive is complete. As a result, field inspections are often only performed for the most complex work zones, and then relatively infrequently. Agencies may delegate work zone inspection to MOT sub-contractors and require them to drive the work zone on a regular basis, typically weekly, to perform inspections. However, this simply shifts the work from the agency to the sub-contractor and remains costly and labor intensive.

Research Objective

The objective of this paper is to evaluate the use of emerging commercial truck dash camera imagery, collected at scale across large sections of the network at approximately 1 second intervals, to inspect work zones efficiently on a regular, perhaps weekly, basis. Such inspection provides an agency with images of work zone MOT signs, striping, and lane use patterns, as well as lane and shoulder conditions. In summary, agencies need a cost-effective mechanism to “virtually” drive their work zones on a regular basis and identify irregularities that warrant further study. This paper illustrates several work zone inspection use cases that can be performed effectively with commercial truck dash camera images and rapidly reviewed in the office. The paper concludes by illustrating how current machine learning/artificial intelligence technologies can further improve efficiency in work zone review.

2. Background

Some studies have examined using cameras to detect work zones for autonomous vehicles [3] and to detect changes in speed limits in work zones [4]. There have been recent developments in using unmanned aerial vehicles (UAVs) for traffic monitoring and management [5]-[11], estimation of traffic flow [12], identification of defective road markings [13], and pavement condition assessment [14]-[19]. Studies have used UAVs for work zone related monitoring [20]-[23], but these techniques are still in their infancy.

2.1. Agency Operated Camcorders and Dash Cameras

Perhaps the most common inspection technique for transportation agencies is to drive through a work zone to perform inspections. In earlier years, this was often done by two-person teams, with one person driving and one taking notes. More recently, many agencies have transitioned to using consumer electronic devices, such as GoPros, that can geotag latitude and longitude on a digital image every 0.5 to 1.0 seconds. This provides precise location information if further follow-up is required. However, this approach still requires significant travel to and from job sites and does not scale well.

2.2. Commercial Dash Camera Technology

Many private motorists and public agencies have begun including dash cameras in their vehicles for risk management and loss protection. The commercial trucking industry has been particularly aggressive in integrating dash cameras across its fleets, and there are now a number of dash camera providers for the commercial truck industry. Some of those providers have business models with consented user agreements that allow truck images to be shared with external stakeholders such as public agencies. While commercial truck dash cameras have been used to verify roadway conditions [24], the use of dash cameras for work zone related tasks is an emerging opportunity.

2.3. Coverage of Commercial Dash Cameras

Past studies have reported that some commercial truck dash camera providers have 30,000 - 60,000 trucks with images that may be downloaded at approximately 1 second intervals on demand [25]. That study reported that virtually all sections of Indiana interstates are covered by imagery at least a few times a day on each day of the week, with some modest decrease in coverage on weekends when there are fewer commercial trucks on the interstates. Some of the busier corridors, such as I-65 in Indiana, frequently have 4 or more trucks per hour that can provide dash camera imagery at 1 second intervals. Figure 3 shows example imagery from a truck on southbound I-69 on August 12, 2024. Using linear referencing techniques, each commercial truck dash camera image is associated with an interstate route, direction of travel, and a mile marker location along the route [25].
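
To make the linear referencing step concrete, the following is a minimal sketch (not the method used in [25]) that simply snaps an image's GPS fix to the nearest entry in a hypothetical pre-built mile marker table; the table contents, coordinates, and function names are illustrative only.

```python
import math

# Hypothetical mile marker table: (route, direction) -> list of (mile_marker, lat, lon)
MILE_MARKERS = {
    ("I-69", "S"): [
        (273.0, 40.9012, -85.4970),   # illustrative coordinates only
        (273.5, 40.8940, -85.4968),
        (274.0, 40.8868, -85.4966),
    ],
}

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    r = 3958.8  # mean Earth radius, miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_mile_marker(route, direction, lat, lon):
    """Snap a dash camera image's GPS fix to the closest stored mile marker."""
    candidates = MILE_MARKERS[(route, direction)]
    return min(candidates, key=lambda m: haversine_miles(lat, lon, m[1], m[2]))[0]

print(nearest_mile_marker("I-69", "S", 40.8945, -85.4969))  # -> 273.5
```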

Metadata associated with each image provide valuable context. At the top of the image shown in Figure 3(a), the following metadata attributes are visibly encoded on the image overlay:

  • Internal Purdue Identifier (PU929770);

  • State (IN);

  • Route (I-69S);

  • Mile Marker (273.3);

  • Local Date and Time (2024-08-12, 17:16:15);

  • Speed (22 MPH).

Figure 3. Dash camera images along I-69 in Indiana verifying MUTCD work zone design. (a) Arrow board at merge point; (b) LANE ENDS MERGE LEFT sign 1000 feet in advance of merge point; (c) RIGHT LANE CLOSED AHEAD sign 2500 feet in advance of merge point.

Comparing Figure 3(b) and Figure 3(c), the mile marker changes to 273.5 and 273.7, the time to 17:15:12 and 17:14:42, and the speed to 16 and 27 MPH, respectively. The visible side mirrors in the lower left and right corners also provide evidence that these images were captured from a commercial truck. The dash camera images in Figure 3 can be used to identify MUTCD traffic control devices such as signs and markings. A subject matter expert can then cross reference the images with standard agency practices. Figure 4 is adapted from the MUTCD [2] to illustrate the practices transportation agencies typically follow for sign placements near an interchange entrance ramp. In this simple example, one can see that the arrow board in Figure 3(a), the 2 orange W4-1 signs in Figure 3(b), and the 2 orange W20-5 signs in Figure 3(c) align with the configuration recommended in Figure 4.
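
As a minimal sketch of how the overlaid metadata might be organized once extracted, assuming a simple one-record-per-image structure with illustrative field names, the frames can be sorted by timestamp (or mile marker) so a reviewer can step through them as a frame-by-frame “virtual” drive:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DashCamImage:
    """One dash camera frame with the overlay metadata described above."""
    identifier: str      # e.g. internal identifier such as PU929770
    state: str           # e.g. "IN"
    route: str           # e.g. "I-69S"
    mile_marker: float   # e.g. 273.3
    timestamp: datetime  # local date and time
    speed_mph: int       # instantaneous truck speed
    path: str            # hypothetical location of the image file

# Example frames corresponding to Figure 3(a)-(c)
frames = [
    DashCamImage("PU929770", "IN", "I-69S", 273.3, datetime(2024, 8, 12, 17, 16, 15), 22, "a.jpg"),
    DashCamImage("PU929770", "IN", "I-69S", 273.5, datetime(2024, 8, 12, 17, 15, 12), 16, "b.jpg"),
    DashCamImage("PU929770", "IN", "I-69S", 273.7, datetime(2024, 8, 12, 17, 14, 42), 27, "c.jpg"),
]

# Order frames by time so a reviewer can "virtually drive" the approach to the merge point
virtual_drive = sorted(frames, key=lambda f: f.timestamp)
```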

2.4. Near Term Dash Camera Inspection Workflow for Agencies and Data Costs

This technology is at the point where imagery from relatively large segments of roadway can be downloaded on a weekly basis. Purdue's current download capacity is approximately 10,000 images per week from commercial dash camera providers, which covers approximately 400 miles of interstate. Early efforts to scale this indicate that monitoring 400 miles of interstate per week has a data service cost on the order of $3400/month. Depending on a specific agency's need for images per week, the data acquisition can be scaled up or down accordingly.
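
As a rough worked example using the approximate figures above (an illustration only, not a pricing model), the reported service level corresponds to roughly 25 images per interstate mile per week and about $8.50 per interstate mile per month:

```python
# Rough scaling arithmetic using the approximate figures reported above
images_per_week = 10_000
interstate_miles = 400
monthly_cost_usd = 3_400

images_per_mile_per_week = images_per_week / interstate_miles   # ~25 images/mile/week
cost_per_mile_per_month = monthly_cost_usd / interstate_miles   # ~$8.50/mile/month

print(f"{images_per_mile_per_week:.0f} images per mile per week")
print(f"${cost_per_mile_per_month:.2f} per interstate mile per month")
```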

Figure 4. MUTCD Typical Application 44: Work in the vicinity of an entrance ramp.

In the near term, this is a very cost-effective replacement for traditional “windshield” inspections where an agency or contractor drives the work zone. In Indiana, imagery for a work zone of interest can be automatically downloaded on one day, and a quick frame-by-frame “virtual” drive to examine key elements and determine whether they have changed can be performed the following day in approximately an hour of combined machine and manual effort. The following section discusses a series of use case studies. Several examples from outside of Indiana have been incorporated to demonstrate how well these commercial truck dash camera inspection techniques scale nationally.

3. Methodology

3.1. Use Case Studies

This section presents five selected use case studies highlighting how commercial truck dash camera images can be used to verify sign placement, work zone lane usage, temporary work zones, maintenance lane closures, and pavement markings and shoulder conditions. Proper placement of signage within work zones is important to ensure that motorists are able to safely and predictably navigate through work zones.

The MUTCD provides guidance, but by the nature of work zones, roadway conditions are regularly changing, and contractors and agencies must verify not only that work zone signage is positioned in the correct place but also that every work zone asset is compliant with the MOT plans. While the use case studies presented in this section all focus on interstate roadways, the techniques are scalable to any roadway where there is sufficient commercial dash camera coverage.

3.1.1. Sign Placement (Use Case Study 1)

Work zone warning signs and arrow boards, such as those shown in Figure 3, are examples of MOT devices that an agency may inspect to determine whether the correct design is in place and the correct message is being conveyed to motorists. Figure 5 presents an image from a truck traveling along southbound I-69 near mile marker 273 on August 12, 2024. On the right side of the image, an entrance ramp that will later merge with the mainline traffic downstream is visible. The right side of the entrance ramp has a static work zone speed limit sign of 55 MPH, while the left side of the entrance ramp has a work zone speed limit trailer showing a speed limit of 50 MPH when flashing. While these devices may be displaying different speed limits under different conditions, the speed limit trailer might be better positioned further away from the static speed limit sign to limit the possibility of confusion from the perspective of the driver. The ability for a transportation agency to verify signage on a frequent basis can allow the agency to be proactive and agile in confirming that the correct information is being conveyed to motorists.

Figure 5. Dash camera image along I-69 in Indiana showing a static work zone speed limit sign and a dynamic work zone speed limit trailer displaying different speed limits.

3.1.2. Work Zone Lane Usage (Use Case Study 2)

On large freeway work zones, there can be several MOT phases throughout the length of a project. The MOT pattern may change several times, which can result in different lane restrictions and lane usage depending on the phase. In Figure 6, several images are shown from a commercial truck along I-65 in Indiana approaching a bifurcation where the left lane crosses over to the opposite side of the freeway along the northbound lanes and the right lane continues on the southbound side of the freeway. The following callouts verify that the images are from the same location while the MOT setup changes:

  • Callout Y points to the yellow edge line, delineating the inside edge of the roadway;

  • Callout E points to the white edge line, delineating the outside edge of the roadway;

  • Callout t points to a tree west of I-65;

  • Callout d points to a drainage culvert.

Figure 6(a) and Figure 6(b) show that on August 8, 2024, of the three lanes configured when approaching the bifurcation, lanes 1 and 3 were open for vehicular travel. Conversely, Figure 6(c) shows that on October 9, 2024, lanes 1 and 2 were open and lane 3 was closed due to a shift in MOT for the specific phase. These images illustrate the ability to perform a detailed inspection to assess the status of construction phasing and, more importantly, to efficiently evaluate the lane configuration and signage from both a plan compliance and driver's perspective.

Figure 6. Dash camera images along I-65 in Indiana showing the difference in lane usage at the bifurcation over 2 months. L1, L2, and L3 represent lane 1, lane 2, and lane 3, respectively. (a) Dash camera image on August 9, 2024 showing lanes 1 and 3 open; (b) Dash camera image on August 9, 2024 showing lanes 1 and 3 open; (c) Dash camera image on October 9, 2024 showing lanes 1 and 2 open.

3.1.3. Temporary Work Zones (Use Case Study 3)

Work zones that are short-term and temporary in nature can be challenging to track from an agency perspective, as they can be completed relatively quickly and the MOT is often placed and taken down within the same day. Furthermore, while maintenance activities may only need to close a single lane for a few minutes or hours, they can be scheduled over a range of days, making it hard for an agency to track exactly when the work took place and, more specifically, what the MOT looked like. Figure 7 and Figure 8 are both examples of temporary work zones where crews are installing pavement markings with lane incursions from the work crew in place. In Figure 7, a work crew along I-70 in Illinois is installing skip line pavement markings between both travel lanes, and in Figure 8, a work crew along I-83 in Maryland is also installing skip line pavement markings. In both examples, vehicles are diverted onto the inside shoulder. From the metadata shown at the top of the camera images, the speeds of the vehicles in these two use cases are 15 and 26 MPH. Having the ability to visually record these types of work zones allows transportation agencies to better understand how construction related activities that require a lane incursion can affect driver behavior. Longer term, they can provide an important training tool for designers as well as roadway contractors on the impact these types of challenging construction activities have on traffic.

Figure 7. Dash camera image along I-70 in Illinois showing shoulder running of several vehicles due to lane incursions from work crew.

Figure 8. Dash camera image along I-83 in Maryland showing shoulder running due to lane incursions from work crew.

3.1.4. Maintenance Lane Closures (Use Case Study 4)

Transportation agencies may have maintenance activities occurring at any time of the day on any day of the year. These activities can take several minutes, several hours, or even several days depending on the specific maintenance taking place. In Figure 9, a truck is driving along eastbound I-70 in Utah on May 5, 2025. In Figure 9(a), the start of a lane closure can be seen, with several orange barrels on the shoulder at mile marker 8.8 and the start of the lane taper further downstream. Figure 9(b) shows that 0.3 miles downstream, at mile marker 9.1, a single right lane closure is in effect and a vehicle within the barrels is being used for a bridge inspection. Bridge inspections can often require temporary lane closures for the safety of those performing the inspection. With this type of remote inspection by dash cameras, an agency can perform many more inspections than by driving out to remote sites. With improved efficiency, more feedback can be provided, as well as generating a library of good practices for future training.

Figure 9. Dash camera images along I-70 in Utah showing a temporary work zone for a bridge inspection. (a) Start of taper for temporary lane closure; (b) An articulated boom lift truck inside barrels for a bridge inspection.

3.1.5. Pavement Markings and Shoulder Conditions (Use Case Study 5)

Work zones often require lane restrictions and lane shifts, which can require temporary pavement markings, temporary width constraints, and temporary shoulder conditions. In Figure 10, a truck traveling in the northbound direction of I-65 on June 24, 2025 is within a work zone where a temporary lane has been constructed using part of the original shoulder. The original pavement markings have been removed and temporary ones have been installed for the duration of this work zone phase. The ability of transportation agencies and contractors to monitor the condition of the pavement markings is important for evaluating the compliance of the work zone with the design. In addition, week by week monitoring can be performed with dash camera images to track changes in shoulders and identify guardrail or MOT devices that require maintenance.

Figure 10. Dash camera image along I-65 in Indiana showing pavement marking and shoulder inconsistencies.

4. Feasibility Study of Using AI Models to Inspect Dash Camera Images

While dash camera images give transportation agencies the ability to virtually drive any and all of their work zones as frequently as needed, the images still need to be inspected and verified manually, which can become labor intensive as the number of dash camera images increases. Several researchers have looked at the feasibility of using Artificial Intelligence (AI) for incident detection [26]-[28] and to predict traffic conditions [29]-[34]. Other researchers have used AI for the recognition of traffic signs [35]-[37], but agencies currently lack the ability to implement these models at scale.

As part of ongoing studies of work zones, Purdue monitors interstate speeds in 9 states in a pooled fund study. Those speeds are collected remotely from connected vehicles and occasionally exhibit anomalies such as queued traffic or, in some cases, total closure [38]. To investigate irregularities in the observed speeds or unexpected queueing, Purdue collects approximately 10,000 dash camera images weekly from commercial trucks across 400 interstate miles in those 9 states. Construction work zones are one of the most common anomalies detected in the connected vehicle data and require manual inspection of dash camera images to confirm. To evaluate the feasibility of automating the verification of a work zone, two AI models, ChatGPT o4-mini and Gemini 2.5 Flash, were chosen due to their wide accessibility and usability to process images from these locations. The following prompt was submitted to the AI models to analyze a diverse set of 40 images from 8 of the partner states:

“Can you analyze these photos one by one from a driver's perspective and tell me if the driver is in a construction work zone in their direction of travel? Please provide commentary on what you see in the image to support your answer and summarize the results in a table.”
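
A minimal sketch of how such a prompt and image might be submitted programmatically is shown below, assuming the OpenAI Python client, a base64-encoded JPEG, and an "o4-mini" model identifier; the exact call structure is an assumption, and the Gemini API would differ:

```python
import base64
from openai import OpenAI  # assumes the OpenAI Python SDK is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPT = (
    "Can you analyze these photos one by one from a driver's perspective and tell me "
    "if the driver is in a construction work zone in their direction of travel? "
    "Please provide commentary on what you see in the image to support your answer "
    "and summarize the results in a table."
)

def encode_image(path: str) -> str:
    """Base64-encode a dash camera image for inclusion in the request."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")

def classify_image(path: str) -> str:
    """Submit one image with the work zone prompt and return the model's text response."""
    response = client.chat.completions.create(
        model="o4-mini",  # model name as used in this study; assumed identifier for the API call
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": PROMPT},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{encode_image(path)}"}},
            ],
        }],
    )
    return response.choices[0].message.content
```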

Table 1 provides a summary distribution of the 40 dash camera images submitted to the AI models, which spanned 20 interstate routes in the 8 states. The authors selected 31 images that showed some type of evidence of a construction work zone and 9 images with no visible evidence of a work zone. Table 2 summarizes the results from the AI prompt. For this study, each AI model's response was sorted into one of three categories: Correct if the model correctly identified whether or not the vehicle was within a work zone, False Negative (FN) if the model did not identify the work zone when the vehicle was within one, and False Positive (FP) if the model incorrectly identified a work zone when the vehicle was not within one.
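
A short sketch of how these three categories could be tallied is shown below, assuming hypothetical boolean ground-truth labels from the authors and a yes/no interpretation of each model response:

```python
from collections import Counter

def score(ground_truth: list[bool], model_says_work_zone: list[bool]) -> Counter:
    """Tally Correct, FN (missed a work zone), and FP (falsely reported one)."""
    tally = Counter()
    for truth, predicted in zip(ground_truth, model_says_work_zone):
        if predicted == truth:
            tally["Correct"] += 1
        elif truth and not predicted:
            tally["FN"] += 1
        else:
            tally["FP"] += 1
    return tally

# Illustrative example with 5 images: 4 contain a work zone, 1 does not
print(score([True, True, True, True, False],
            [True, False, True, True, False]))  # Counter({'Correct': 4, 'FN': 1})
```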

Table 1. Summary of images submitted to ChatGPT o4-mini and Gemini 2.5 Flash.

State | Number of Images in Work Zone | Number of Images not in Work Zone | Number of Unique Interstate Routes
Delaware | 2 | 0 | 1
Illinois | 7 | 3 | 5
Maryland | 6 | 0 | 3
Michigan | 1 | 3 | 2
Pennsylvania | 4 | 0 | 2
Texas | 4 | 1 | 2
Utah | 2 | 1 | 2
Wisconsin | 5 | 1 | 3
Totals | 31 | 9 | 20

Table 2. Summary of results.

Model | Correct | FN | FP
ChatGPT | 36 | 4* | 0
Gemini | 39 | 1 | 0

Legend: FN: False Negative, did not identify work zone; FP: False Positive, incorrectly identified work zone; *: Two of the False Negatives noted the vehicle was not in a work zone, but approaching one.

Neither of the two models made any mistakes indicating the presence of a construction zone when no construction zone was present (FP). However, there were 5 cases across the two models of indicating that no construction zone was present when, in the authors' judgement, the image included a work zone. Figure 11 shows the responses from the models for the 5 images where there was disagreement between the authors and the AI model. A potential reason for Figure 11(a) and Figure 11(b) receiving “No” responses from ChatGPT could be the work zone signage indicating that work is ahead, leaving the possibility that work was not being conducted at the location of the images. Figure 11(d) and Figure 11(e) received an “approaching a work zone” response from ChatGPT, potentially for similar reasons. With the exception of Figure 11(c), each disagreeing response showed that the model recognized the presence of a work zone in the direction of travel but did not say the driver was in the work zone. For Figure 11(c), the model noted the presence of a work zone on the other side of the road, which also seemed to be a plausible response.

While the results presented use only commercially available AI models, the study demonstrates the feasibility of a work zone engineer inspecting several sets of dash camera images across a number of work zones, allowing them to rapidly triage and analyze these work zones. This compares favorably with the traditional method, in which a set of engineers would have to drive out to each work zone, record several drive throughs in each direction, note anomalies or anything of interest in the moment, and then drive back to their office to further analyze the videos.

Figure 11. Images and AI response from each model for the five images where the authors disagreed with the AI model. The response the authors disagreed with is shown as red text.

5. Conclusions

The use case studies presented in this paper suggest that commercial truck dash cameras are a viable and scalable data source for transportation agencies to inspect and monitor their work zones. This approach drastically reduces the need to physically drive through a work zone to inspect signs, striping, and infrastructure. The ability to virtually drive a work zone allows for robust inspection and monitoring of a much larger set of work zones on a more frequent basis. Additional benefits accrue from reducing the traffic exposure of transportation agency and construction staff when inspecting work zones. The use case studies presented here focus on interstate highway work zones, but the techniques easily scale to any roadway with sufficient dash camera coverage.

While human inspection of commercial dash camera images is scalable and can be more efficient than manually driving the work zones, the process of inspecting all these images can be labor intensive. To address the repetitive nature of inspecting thousands of images on a weekly basis, an evaluation was performed using two commercial AI models to scan a representative set of images from 8 states to determine if a work zone was present. Table 2 summarizes those results and Figure 11 illustrates the responses from the AI models where there was disagreement with the authors. In all cases, the 2 AI models gave reasonable responses to the prompt, with some subtle variations in classification that would also occur with human reviewers. These results are very promising and suggest there is an opportunity to further scale these techniques to systematically process images with perhaps even more complex questions for identifying damaged infrastructure, sign visibility issues, and ultimately compliance with agency standards and design plans.

However, one limitation of commercial truck dash camera images is that while coverage across interstates is usually strong within urban areas and trucking corridors, coverage in rural areas, overnight, and on weekends can be sparse. Furthermore, image quality and camera position can vary from truck to truck, which can become an issue if specific text needs to be read off road signs.

In summary, this paper has demonstrated the use of commercial dash cameras to virtually inspect work zones across several use cases to assist agencies in monitoring their work zones. Furthermore, the feasibility study of using AI models to rapidly analyze these images shows promising results. To accurately and efficiently utilize these processes to analyze several work zones within a given time period, agencies will need to train their staff not only in how to view the images and what to look for, but also in what AI prompts would be best for their given work zones.

Acknowledgements

Commercial vehicle dash camera images used by this study were provided by Vizzion. The feasibility study was performed using ChatGPT o4-mini and Google Gemini 2.5 Flash. This study is based upon work supported by the Joint Transportation Research Program administered by the Indiana Department of Transportation and Purdue University. The contents of this paper reflect the views of the authors, who are responsible for the facts and the accuracy of the data presented herein, and do not necessarily reflect the official views or policies of the sponsoring organizations. These contents do not constitute a standard, specification, or regulation.

The authors affirm that no AI or LLMs were used in any capacity in the drafting of this manuscript. However, Figure 11 quotes, with attribution, the exact responses from the two AI models used to analyze the five images.

Authors’ Contributions

The authors confirm contribution to the paper as follows: study conception and design: M.W.O., R.S.S., D.M.B.; data collection: M.W.O., J.M., R.S.S., D.H., and J.D.; analysis and interpretation of results: M.W.O., D.M.B.; draft manuscript preparation: M.W.O. and D.M.B. All authors reviewed the results and approved the final version of the manuscript.

Funding

The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This research was supported in part by the Joint Transportation Research Program, and in part by Purdue University and Indiana Department of Transportation under agreement A249-18-ON180087 and Agreement STIND 75458. This research was also supported by Transportation Pooled Fund study number TPF-5(514) with participation from FHWA and state agencies from DE, IL, IN, MD, MI, PA, TX, UT and WI.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

References

[1] (2025) Work Zone Traffic Crash Trends and Statistics.
https://workzonesafety.org/work-zone-data/work-zone-traffic-crash-trends-and-statistics/
[2] United States Federal Highway Administration (2009) Manual on Uniform Traffic Control Devices for Streets and Highways, 2009 Edition Including Revision 1 and Revision 2 Dated May 2012.
https://rosap.ntl.bts.gov/view/dot/37175
[3] Shi, W. and Rajkumar, R.R. (2021) Work Zone Detection for Autonomous Vehicles. 2021 IEEE International Intelligent Transportation Systems Conference (ITSC), Indianapolis, 19-22 September 2021, 1585-1591.
[4] Zhu, M., Sidhu, A. and Redmill, K.A. (2024) Enhancing Digital Speed Limit Detection in Work Zones: A Camera-Based Approach with Multi-Frame Processing. 2024 IEEE 27th International Conference on Intelligent Transportation Systems (ITSC), Edmonton, 24-27 September 2024, 4013-4019.
[5] Li, X., Tan, J., Liu, A., Vijayakumar, P., Kumar, N. and Alazab, M. (2021) A Novel UAV-Enabled Data Collection Scheme for Intelligent Transportation System through UAV Speed Control. IEEE Transactions on Intelligent Transportation Systems, 22, 2100-2110.
[6] Cheng, P., Zhou, G. and Zheng, Z. (2009) Detecting and Counting Vehicles from Small Low-Cost UAV Images. Proceedings of ASPRS 2009 Annual Conference, Baltimore, 9-13 March 2009, 9-13.
[7] Outay, F., Mengash, H.A. and Adnan, M. (2020) Applications of Unmanned Aerial Vehicle (UAV) in Road Safety, Traffic and Highway Infrastructure Management: Recent Advances and Challenges. Transportation Research Part A: Policy and Practice, 141, 116-129.
[8] Heintz, F., Rudol, P. and Doherty, P. (2007) From Images to Traffic Behavior—A UAV Tracking and Monitoring Application. 2007 10th International Conference on Information Fusion, Quebec, 9-12 July 2007, 1-8.
[9] Kanistras, K., Martins, G., Rutherford, M.J. and Valavanis, K.P. (2013) A Survey of Unmanned Aerial Vehicles (UAVs) for Traffic Monitoring. 2013 International Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, 28-31 May 2013, 221-234.
[10] Ro, K., Oh, J. and Dong, L. (2007) Lessons Learned: Application of Small UAV for Urban Highway Traffic Monitoring. 45th AIAA Aerospace Sciences Meeting and Exhibit, Reno, 8-11 January 2007, 596.
[11] Barmpounakis, E.N., Vlahogianni, E.I. and Golias, J.C. (2016) Unmanned Aerial Aircraft Systems for Transportation Engineering: Current Practice and Future Challenges. International Journal of Transportation Science and Technology, 5, 111-122.
[12] Ke, R., Li, Z., Tang, J., Pan, Z. and Wang, Y. (2019) Real-Time Traffic Flow Parameter Estimation from UAV Video Based on Ensemble Classifier and Optical Flow. IEEE Transactions on Intelligent Transportation Systems, 20, 54-64.
[13] Bu, T., Zhu, J. and Ma, T. (2022) A UAV Photography-Based Detection Method for Defective Road Marking. Journal of Performance of Constructed Facilities, 36.
[14] Zhu, J., Zhong, J., Ma, T., Huang, X., Zhang, W. and Zhou, Y. (2022) Pavement Distress Detection Using Convolutional Neural Networks with Images Captured via UAV. Automation in Construction, 133, Article ID: 103991.
[15] Silva, L.A., Sanchez San Blas, H., Peral García, D., Sales Mendes, A. and Villarubia González, G. (2020) An Architectural Multi-Agent System for a Pavement Monitoring System with Pothole Recognition in UAV Images. Sensors, 20, Article 6205.
[16] Tan, Y. and Li, Y. (2019) UAV Photogrammetry-Based 3D Road Distress Detection. ISPRS International Journal of Geo-Information, 8, Article 409.
[17] Qiu, Q. and Lau, D. (2023) Real-Time Detection of Cracks in Tiled Sidewalks Using YOLO-Based Method Applied to Unmanned Aerial Vehicle (UAV) Images. Automation in Construction, 147, Article ID: 104745.
[18] Zhang, Y., Zuo, Z., Xu, X., Wu, J., Zhu, J., Zhang, H., et al. (2022) Road Damage Detection Using UAV Images Based on Multi-Level Attention Mechanism. Automation in Construction, 144, Article ID: 104613.
[19] Saad, A.M. and Tahar, K.N. (2019) Identification of Rut and Pothole by Using Multirotor Unmanned Aerial Vehicle (UAV). Measurement, 137, 647-654.
[20] Martínez-Sánchez, J., Piñeiro-Monteagudo, H., Balado, J., Soilán, M. and Arias, P. (2023) Improving Safety in the Maintenance of Infrastructures: Design of a UAV-Based System for Work Zone Monitoring. Transportation Research Procedia, 72, 2518-2525.
[21] Ham, S., Noh, S., Seo, D.J., Kang, S. and David, D.S.K. (2020) Real-Time Work Zone Traffic Management via Unmanned Air Vehicles. Transportation Consortium of South-Central States (Tran-SET), Louisiana State University, Baton Rouge; United States Department of Transportation, University Transportation Centers (UTC) Program.
https://rosap.ntl.bts.gov/view/dot/58928
[22] Kim, K., Kim, S. and Shchur, D. (2021) A UAS-Based Work Zone Safety Monitoring System by Integrating Internal Traffic Control Plan (ITCP) and Automated Object Detection in Game Engine Environment. Automation in Construction, 128, Article ID: 103736.
[23] Messenger, R., Islam, M.Z., Whitlock, M., Spong, E., Morton, N., Claggett, L., et al. (2023) Real-Time Traffic End-of-Queue Detection and Tracking in UAV Video. International Journal of Intelligent Transportation Systems Research, 21, 493-505.
[24] Mathew, J.K., Desai, J., Sakhare, R.S., Hunter, J. and Bullock, D.M. (2025) Spatiotemporal Analysis of Pavement Roughness Using Connected Vehicle Data for Asset Management. Journal of Transportation Technologies, 15, 1-16.
[25] Sakhare, R.S., Desai, J., Mathew, J.K. and Bullock, D.M. (2024) Assessing the Interstate Coverage of Commercial Trucks Capable of Providing Roadway Imagery via On-Vehicle Dash Camera in the United States. IEEE Access, 12, 173517-173529.
[26] Dia, H. and Rose, G. (1997) Development and Evaluation of Neural Network Freeway Incident Detection Models Using Field Data. Transportation Research Part C: Emerging Technologies, 5, 313-331.
[27] Wang, R., Fan, S. and Work, D.B. (2016) Efficient Multiple Model Particle Filtering for Joint Traffic State Estimation and Incident Detection. Transportation Research Part C: Emerging Technologies, 71, 521-537.
[28] Dia, H. (2001) An Object-Oriented Neural Network Approach to Short-Term Traffic Forecasting. European Journal of Operational Research, 131, 253-261.
[29] Jiang, H., Zou, Y., Zhang, S., Tang, J. and Wang, Y. (2016) Short-Term Speed Prediction Using Remote Microwave Sensor Data: Machine Learning versus Statistical Model. Mathematical Problems in Engineering, 2016, Article ID: 9236156.
[30] Lv, Y., Duan, Y., Kang, W., Li, Z. and Wang, F. (2014) Traffic Flow Prediction with Big Data: A Deep Learning Approach. IEEE Transactions on Intelligent Transportation Systems, 16, 865-873.
[31] More, R., Mugal, A., Rajgure, S., Adhao, R.B. and Pachghare, V.K. (2016) Road Traffic Prediction and Congestion Control Using Artificial Neural Networks. 2016 International Conference on Computing, Analytics and Security Trends (CAST), Pune, 19-21 December 2016, 52-57.
[32] Wu, Y., Tan, H., Qin, L., Ran, B. and Jiang, Z. (2018) A Hybrid Deep Learning Based Traffic Flow Prediction Method and Its Understanding. Transportation Research Part C: Emerging Technologies, 90, 166-180.
[33] Theofilatos, A., Yannis, G., Kopelias, P. and Papadimitriou, F. (2016) Predicting Road Accidents: A Rare-Events Modeling Approach. Transportation Research Procedia, 14, 3399-3405.
[34] Król, A. (2016) The Application of the Artificial Intelligence Methods for Planning of the Development of the Transportation Network. Transportation Research Procedia, 14, 4532-4541.
[35] Li, Y. and Wang, W. (2011) Traffic-Signs Recognition System Based on FCM and Content-Based Image Retrieval. International Journal of Digital Library Systems, 2, 1-12.
[36] Dai, H., Zhang, X. and Yang, D. (2018) Road Traffic Sign Recognition Algorithm Based on Computer Vision. International Journal of Computational Vision and Robotics, 8, 85-93.
[37] Li, Y., Li, J. and Meng, P. (2022) Attention-YOLOV4: A Real-Time and High-Accurate Traffic Sign Detection Algorithm. Multimedia Tools and Applications, 82, 7567-7582.
[38] Sakhare, R.S., Desai, J., Li, H., Kachler, M.A. and Bullock, D.M. (2022) Methodology for Monitoring Work Zones Traffic Operations Using Connected Vehicle Data. Safety, 8, Article 41.

Copyright © 2025 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.