Eye on the Sky: A UAP Research and Field Study off New York’s Long Island Coast
1. Introduction
“Eye on the Sky” was a year-long investigative field research study conducted at Robert Moses State Park (RMSP), on a barrier island that forms part of the Fire Island National Seashore, off the south shore of Long Island, NY. It began in July 2022 and continued through July 2023. The location is known for its unique environmental conditions and its potential for observing unexplained aerial phenomena, based on the significant number of retrospective UAP cases reported off America’s coastline. Sightings of unidentified anomalous phenomena (UAP) by the military and civilians alike are becoming increasingly common. Encounters with unknown aerial objects detailed in such prominent military-related cases as Gimbal and Nimitz (Mizokami, 2019) [1] (U.S. Navy, 2020) [2] (Cooper, 2019) [3], among others, are now embedded in the UAP lexicon, with the growing number of reports from commercial airline pilots only adding to the volume. While these and other UAP reports inspired this paper, initiatives driven by the U.S. Congress and implemented by the Pentagon also spurred our motivation. The recent establishment of the All-domain Anomaly Resolution Office (AARO), charged with investigating hundreds of trans-medium UAP incidents, both legacy and new (Department of Defense, 2022) [4], lent added credibility and gravitas to the existence and exploration of mystery objects in our skies. Since much of the official data on unknown objects is kept classified, it is vital that the private sector also step up efforts of inquiry into these phenomena, which could pose a risk to national security, territorial integrity, and passenger aviation safety. Of equal importance is the realization of a potential gap in our understanding of nature and our reality, and questions about where humanity stands in relation to possible unknown aspects of something else displaying its shadow existence.
2. The Research
2.1. Inception of a Mobile Laboratory
The principal research team designed, developed, and adapted a recreational vehicle (RV) as a mobile field laboratory to collect data efficiently. This self-contained mobile research vehicle offered a centralized location and systems integration to survey the environment while providing mobility, safety, and 360-degree situational awareness. Designing and retrofitting a mobile laboratory was a logistical, engineering, technological, and scientific challenge. The design was initially conceptualized as an expeditionary vehicle with a fully functional field laboratory, intended to provide the means to conduct a wide range of environmental studies, such as electro-smog surveys against established European safety limits and the study of meteorological events, including hurricanes, ball lightning, and other exotic atmospheric events (dark lightning, super-potential lightning, transient light phenomena). Ultimately, it was deployed as a technological means to study the UAP phenomenon.
The Nightcrawler, a fully instrumented mobile sensor platform, was designed and retrofitted with the latest equipment. This networked system of surveillance cameras and detection devices, the cornerstone of our research, was specifically designed to observe, detect, track, record, and collect aerial object data. Its primary function was to differentiate prosaic objects from unknown objects that occupy our skies and coastlines. As engineers in this study, we deployed a prototype mobile laboratory vehicle and surveillance system that provided a centralized location and systems integration for surveying the environment while ensuring personal safety.
The vehicle was equipped with a variety of instrument technologies for unknown-target detection and acquisition, providing a sound methodology for scientific reconnaissance and surveillance observations. Cumulative sensor data collected over an extended temporal range established the data sets necessary for our field study.
The Nightcrawler’s construction incorporated specifications to minimize external EMI that could affect sensitive measurements, including a metal mesh within the walls, ceiling, and floor to form a Faraday cage. The Faraday cage attenuates externally radiated EMI, and the added construction materials offer additional protection against harsh, threatening environments.
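As a rough illustration of how such a mesh attenuates incoming radiation, the sketch below applies a common first-order rule of thumb for aperture leakage; the 6 mm mesh opening is an assumed value used only for illustration, not the Nightcrawler’s actual construction specification.

```python
# First-order estimate of mesh shielding effectiveness using the common
# rule of thumb SE(dB) ~= 20*log10(wavelength / (2 * aperture)), valid when
# the mesh opening is much smaller than the wavelength.
import math

C = 299_792_458.0  # speed of light, m/s

def mesh_shielding_db(freq_hz: float, aperture_m: float) -> float:
    wavelength = C / freq_hz
    return 20.0 * math.log10(wavelength / (2.0 * aperture_m))

# Assumed 6 mm mesh opening, evaluated at a few representative frequencies.
for f in (100e6, 1e9, 9.41e9):
    print(f"{f/1e9:5.2f} GHz -> ~{mesh_shielding_db(f, 0.006):4.1f} dB attenuation")
# Attenuation falls off at higher frequencies, so a mesh of this kind helps
# most against broadcast-band and lower-microwave EMI sources.
```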
2.2. Site Selection
A coastal Long Island study was chosen based in part on anecdotal historical records of observations, reflecting the larger number of reports near the coast. The ODNI UAP report (ODNI, 2021) [5] describes incursions encountered by U.S. Navy fleets and pilots off the Pacific and Atlantic seaboards of the United States, occurring as close as 10 nautical miles (NM) from the coast. This persistence of the phenomenology, as reported by trained observers who qualify as expert witnesses under Federal Rule 702, offered a promising setting for field research. Based on these premises, a ten-month study ensued. Long Island’s geographic location lies well within the eastern regional perimeter of these reports (Long Island extends roughly 120 miles).
The geographic regions chosen were Robert Moses State Park on the southern coast of the Fire Island National Seashore; Cedar Beach on Long Island’s North Shore, with access to the Long Island Sound; and the Great South Bay, Long Island’s largest inland body of water, which separates Fire Island from Long Island’s south shore. These locations are referenced in Figure 1.
Figure 1. The site location.
We established the field study’s geographic perimeter and baseline parameters in a manner similar to a standard forensic environmental scene assessment. This included careful site selection for secured and controlled conditions. We obtained a scientific field research permit from the state Parks Department to have unrestrained access and control over the area. Park security restricted human activity, and the site’s low light pollution and reduced human-caused air traffic during specific timeframes offered nearly ideal research conditions. The area was chosen based on retrospective historical data of high UAP reports, the 2014 USN F/A-18 Super Hornet pilot reports, and pilot admissions through personal conversations. Area dimensions were identified and documented through scene surveys, GPS mapping, and laser rangefinders. Primary and secondary surveys were accomplished using instrumented means, such as radar scans, multispectral electro-optical (ME-O) devices, and electromagnetic measurements. All baseline data was recorded, examined, processed, and preserved on the scene for internal control over the evidentiary data.
Figure 2. Aerial view of Robert Moses State Park on the Fire Island National Seashore.
Figure 2 is an aerial view of Robert Moses State Park, situated at the southernmost point of Long Island on a barrier island within the Fire Island National Seashore. The map depicts Long Island’s mainland in the left region of the image. Access requires crossing a bridge that spans the Great South Bay from the mainland to Robert Moses State Park. The park forms the southernmost point of Long Island, serving as the barrier to the Atlantic Ocean. This area was chosen for our field study because of its low light pollution, minimal air traffic after midnight, isolation from the rest of Long Island, and its excellent view of the Atlantic Ocean sky (Google, 2023) [6].
2.3. Early Discoveries
Since July of 2022, we observed a light phenomenon that most often presented itself as a spheroid when luminous; under decreased luminosity, it appeared morphologically polyhedral. Fewer sightings appeared as ovaloid and cylindroid. The spheroids were consistently present from July 7, 2022, to March 30, 2023. These objects displayed unusual flight characteristics, extreme velocities, and long periods of inactivity in which they appeared stationary. They also displayed unusual physical characteristics: fluctuating states of albedo, glinting, rotation, variations in spectral range, and changes of state from luminous to non-luminous (Teachoo, 2023) [7]. This light phenomenon at times demonstrated swarm-like behavior that did not fit a pattern of regular air traffic and could not be verified by Automatic Dependent Surveillance-Broadcast (ADS-B) data (FCC, 2023) [8].
An anomalous transient light phenomenon (ATLP) was discovered with the inclusion of night vision (NV) devices, which led the research team to expand its equipment inventory to include enhanced multispectral instrumentation. Naked-eye visual observations were unremarkable. Using an active radar system and multispectral electro-optical instrumentation enabled us to detect unusual amounts of air traffic (see the discussion of unknowns below) that could not be correlated with visual observations through aircraft location data such as ADS-B.
These sensor-based observations were followed up with acoustic and electromagnetic field surveys, including spectrum analysis of both the static and dynamic realms, performed on the collected data. The geographic location of Robert Moses State Park (RMSP) offered controlled conditions for conducting the field research.
The resultant ten-month field study, from July 2022 through May 2023, required us to update equipment as circumstances demanded; the accumulated technological and instrumentation enhancements eventually necessitated replacing Nightcrawler 1 with the more expansive and elaborately equipped Nightcrawler 2 mobile platform. We developed innovative designs for unidentified anomalous phenomenon (UAP) detection, including active radar in two scan geometries and an extended IR spectral range achieved by removing the infrared (IR) cut filter. This allowed us to capture an exotic light phenomenon in the form of spheroids, which displayed an ambiguous, elusive, and transitory nature. Multiple unidentified objects detected throughout the study are detailed below. A spectral range beyond that of visible light is characteristic of the observed phenomena; that is, the phenomena predominantly lie outside the spectral range of human EM perception.
During this intensive study, we observed and cataloged unknown air traffic (“unknowns”) moving through the lower atmosphere at our seaside location on the south shore of Long Island, NY.
The primary objective of this study was to determine whether aerial phenomena of an unknown nature exist over a coastal location and to characterize their properties and behaviors. To achieve this, a combination of primary (qualitative) and secondary (quantitative) field observation methods was utilized in this data-centric study.
The research involved ten months of meticulous data collection and analysis at the field site, 12-hour overnight observation shifts, and two months of research team meetings for careful review, analysis, and data collation. Forensic engineering principles and methodologies guided this data-centric study. The challenges set forward were object detection, observation typology, and characterization, where multispectral electro-optical devices and radar were employed due to limited visual acuity and the intermittent presentation of the phenomena. The primary means of detection was a 3 cm X-band radar operating in two scan geometries, the X- and Y-axes. Multispectral and hyperspectral electro-optical devices were used as a secondary detection and identification method. Further emphasis was placed on high-frequency (HF) and low-frequency (LF) detectors and spectrum analyzers incorporating electromagnetic (EM) field transducers (ultrasonic, magnetic, and RF) to record spectral data in these domains. Data collection concentrated on recording a broad bandwidth of the electromagnetic spectrum, including visible, near-infrared (NIR), short-wave infrared (SWIR), long-wave infrared (LWIR), ultraviolet (UVA, UVB, and UVC), and the higher-energy spectral range of ionizing radiation (alpha, beta, gamma, and X-ray) recorded by Geiger-Müller counters as well as special-purpose semiconductor diode sensors.
A complete telemetry monitoring system, with meteorological, atmospheric, and environmental displays, is housed inside the Nightcrawler. What makes this system unique is its use of detection systems such as active radar and hyperspectral optical systems. Deploying active radar on a land-based mobile vehicle for UAP research required us to obtain an experimental license from the FCC; to our knowledge, no other organization uses active radar for this purpose. Furthermore, the radar system is innovative and proprietary, incorporating two units mounted in two scan geometries (horizontal and vertical orientations). Scanning a target in this way helps us better understand its distance, speed, and size by painting a detailed cross-sectional profile of the target (Harley D, 1981) [9].
3. Principal Research Methodology
The principal research method utilized a forensic science model for its applied, multidisciplinary approach to the sciences. In this case, a forensic engineering model, with problem-solving applications for a diverse and wide range of situations requiring investigative and instrumented analytical principles, is ideally suited. We chose these scientific standards because they are routinely tried and tested in novel situations and governed by the federal rules of scientific admissibility for the courtroom, where people’s lives depend on decisions and outcomes.
3.1. Established Criteria for a Field Research Investigation
The established criteria for a field research investigation included:
Minimal light pollution
Open space parking and access to beachfront real estate
Limited human activity after core park hours
Human activity limited to researchers, stargazers, fishermen (permit required), and security patrols.
Public restricted after 7:00 PM (park closed after dusk)
Large parking areas to conduct research (liberty to establish equipment perimeter)
Minimal air traffic after 12 AM
Ability to make low-light observations, such as of satellites.
Geography:
Geographic position chosen based on pilot observations and U.S. naval strike group exercises off the eastern coast, where offshore encounters took place in 2014 within the area noted as Whiskey 72 (Warning Area 72), whose northern tip lies 271 miles from RMSP (Rogoway, 2020) [10].
Geographic location relative to Hudson Canyon: the deep-water trench lies approximately 75 miles away, well within the range where pilots have made their offshore observations. This proximity also became apparent in our own observations. Retrospective data also suggest a relationship between UAPs and water (USO reports).
The remote geographic location chosen is a unique opportunity to conduct a field research study under controlled conditions.
The Nightcrawler mobile laboratory and its researchers were granted a long-term research permit by the New York State Office of Parks, Recreation & Historic Preservation to conduct the “Eye on the Sky” coastal study of UAPs.
The FCC granted this research vehicle an experimental active radar use license. This is unprecedented to date, especially for a UAP research study.
3.2. Instrument Utilization
The applied methodology comprised both qualitative and quantitative methods. Qualitative methodology was the primary means of analyzing, detecting, and determining air traffic typology. We developed innovative designs for unidentified anomalous phenomenon (UAP) detection, including active radar in two scan geometries: utilization of marine active X-band radar on a modified platform covering the X- and Y-axes allowed better cross-sectional scans of distance and size. Advanced electro-optical devices brought clarity beyond the visible limitations of the surrounding environment by extending detection of spectral wavelengths beyond visible light (VIS, NIR, SWIR, and LWIR) (Lo YT, 2013) [11].
We modified our camera system to extend its IR spectral range by removing the infrared (IR) cut filter on our fixed array while incorporating additional portable hyperspectral devices. As noted above, this allowed us to capture the exotic, elusive, and transitory spheroid light phenomenon, whose spectral range lies predominantly outside that of human EM perception.
The quantitative phase was a secondary means of instrument-based analysis, using spectrum analyzers with acoustic transducers and accelerometers to measure infrasound to ultrasound (0.4 Hz - 1 MHz). Spectrum magnetometers were used for magnetic-field (H-field) and electric-field (E-field) surveys. RF spectrum analyzers (Broadcom staff, 2020) covered RF surveys from 10 MHz to 6 GHz, while our optical spectrum analyzers, such as the spectroscopy radiometer running Waves software, covered the optical spectrum from 350 nm to 1150 nm. Our binary detection instrumentation covered the full UV spectrum, and the IR detector covered the NIR to SWIR. Our high-resolution LWIR cameras were equipped with optical zoom capability. Accessories included equipment sensors, transducers, software, a Geiger-Müller counter, atmospheric gas sensors, and Lowrance GPS (Saltwater Sportsman Staff, 2007) [12], along with a weather station (Weather Bug staff, 2023) [13], the Stellarium application (Stellarium staff, 2020) [14], the MariAPP marine traffic app (MariAPP staff, 2023) [15], FlightRadar24 (Flightradar24 Live Traffic, 2023) [16], AirNav (AirNav staff, 2019) [17], ISS/Starlink tracking (Space.com staff, 2014) [18], NOAA buoy data (NOAA staff, 2023) [19], Google Earth (Google staff, 2023) [20], and NOTAMs (FAA staff, 2014) [21].
Integrating these tools into our assessment methodologies helped our team separate and identify known (prosaic) objects, leaving the unknown objects, or outliers, as possible anomalies (those that do not fit known prosaic explanations). Identification of any object can be challenging under certain conditions. While using radar and ADS-B applications is extremely helpful, we are cognizant that not all aircraft have ADS-B capability (for instance, general aviation and experimental aircraft). Also, Part 91 aircraft operating under VFR conditions are not required to file an FAA flight plan; although the FAA highly recommends a flight plan for every VFR flight, it is not a requirement. Commercial VFR and all IFR traffic must have operational ADS-B (FAA, 2023) [22]. Having a process in place and doing our due diligence to filter known objects carefully and concisely left us with the unknowns.
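The sketch below illustrates, in simplified form, the kind of correlation filter this process implies: sensor contacts whose time and position can be matched to an ADS-B track are set aside as known aircraft, and the remainder are carried forward as candidate unknowns for further review. The data structures, field names, and matching thresholds are hypothetical placeholders, not the team’s actual software.

```python
# Minimal sketch of an ADS-B correlation filter for sensor contacts.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Contact:
    timestamp: float   # seconds since epoch
    lat: float         # decimal degrees
    lon: float         # decimal degrees

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def filter_unknowns(sensor_contacts, adsb_tracks, max_km=2.0, max_dt_s=15.0):
    """Return the sensor contacts that no ADS-B track can plausibly explain."""
    unknowns = []
    for c in sensor_contacts:
        explained = any(
            abs(c.timestamp - t.timestamp) <= max_dt_s
            and haversine_km(c.lat, c.lon, t.lat, t.lon) <= max_km
            for t in adsb_tracks
        )
        if not explained:
            unknowns.append(c)
    return unknowns
```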
3.3. Multispectral Electro-Optical (ME-O) Application
Example 1
Utilization of 125X and 60X high-magnification telescopic cameras and infrared cameras (NIR, SWIR) provides surface and lighting detail, such as shape/geometry silhouettes, navigation lights, anti-collision lighting, landing lights, taxi lights, typology, and aspect ratios. A sample of this identification methodology, performed during our field investigation and study at RMSP, follows.
On one evening, a video was taken at night with multispectral electro-optical (ME-O) devices of the western sky in the direction of JFK Airport (38.2 miles away). This was one among many initial tests performed to determine the equipment’s efficacy and reliability. In this case, the objects in question are two star-like dots of light in the sky. See Figure 3 below.
Figure 3. Two unidentified objects.
Figure 3 is a still image from a video taken at RMSP in the direction of the western sky at night. The arrows identify two distant objects. The lower object displays motion, and the object at the higher elevation is stationary.
The lower object was identified as an airliner whose transponder signal appeared on ADS-B. The object was further resolved and photographed using a Nikon camera with an optical zoom of 60X (see Figure 4); the anti-collision, navigation, and landing lights, with their flash patterns and colors, are easily seen (Nikon, 2020) [23]. Notice the object above the airliner at the higher elevation. This object was stationary, appearing round with a steady luminosity. For unresolved and unidentified objects at a greater distance from the observer, we use a Nikon P1000 camera with 125X magnification. The object at the higher elevation appeared unremarkable with both zoom-mode cameras; its appearance was similar to other stars and planets in the sky (a white orb of light with glimmering edges and a slight blue hue around it). Using the Android app “Stellarium” to determine celestial bodies, this second object was identified as the planet Venus (Stellarium, 2015) [24].
Figure 4. Magnified view of airliner.
Figure 5. Star mapping software application.
Figure 4 shows a 60X enlarged view of an object identified as a commercial airliner. We clearly see the red anti-collision beacon lighting, small wing lights near the fuselage, and the forward white landing lights on the fuselage near the nose (the blue hue may be due to interior lighting shining out the windows). Figure 5 is Stellarium, a star-mapping software application, showing the relative position of Venus at the time of observation (the arrow in the image points to Venus). Stellarium is used to identify celestial objects, such as planets, stars, and satellites, based on sky coordinates (elevation and azimuth) corresponding to where the device running the software is positioned and pointed in the sky.
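Such identifications can also be cross-checked programmatically rather than through the app alone. The sketch below uses the open-source Skyfield library to compute the apparent altitude and azimuth of Venus from the approximate RMSP coordinates; the observation time shown is a hypothetical placeholder, since the exact timestamp of the figure is not reproduced here.

```python
# Cross-check of a celestial identification using the Skyfield library.
from skyfield.api import load, wgs84

ts = load.timescale()
t = ts.utc(2023, 2, 15, 23, 0)              # hypothetical observation time (UTC)
eph = load('de421.bsp')                      # JPL planetary ephemeris file

# Approximate Robert Moses State Park coordinates (observer location).
site = eph['earth'] + wgs84.latlon(40.63, -73.26)

alt, az, distance = site.at(t).observe(eph['venus']).apparent().altaz()
print(f"Venus: altitude {alt.degrees:.1f} deg, azimuth {az.degrees:.1f} deg")
# If the computed altitude/azimuth matches the camera pointing direction,
# the star-like object can be attributed to Venus rather than to an unknown.
```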
Multispectral electro-optics are used across a broad spectral range, from UV through the visible and into the far-reaching hyperspectral range, to find identifiable signatures. Military aircraft may use formation lights that occupy a portion of the IR spectrum to remain hidden during stealth operations; such IR lighting is a possible exception for some stealth military aircraft, but it is not a factor for regular commercial air traffic.
Figure 6 and Figure 7 show several of the planetary cameras, including SWIR cameras, used during the RMSP research study. Figure 8 shows the extremely low-lux and IR capability of the Sony IMX462; the QHY5III462 camera uses a Sony Starvis IMX462 sensor. Figure 7 shows a front view of the three cameras using the Sony IMX464 and IMX991 Starvis sensors. The SWIR camera has a spectral range of 400 nm to 1750 nm.
Figure 6. Starvis low light planetary cameras.
Figure 7. SWIR cameras.
Figure 8. The low light capability of the Sony IMX 462 sensor is due to NIR QE response.
A note about IR and the technology used to capture and record it: some IR sensors primarily respond to light, especially within the NIR range, from sources ranging from IR LEDs to heat sources producing fringing NIR energy; this is the case with CMOS technologies. InGaAs technologies, such as the camera imagers and InGaAs photodiodes used in binary detectors, are more sensitive to IR emissions from heat sources. IR measurements are therefore not the same across the board, and our optical spectral devices perform quite differently depending on the detector technology. Once a heat source cools to the point where its emission shifts to longer spectral wavelengths, it produces no additional light for these sensors; imaging such sources requires true thermal optics, which use silicon or germanium lenses and cover the MWIR and LWIR bands.
The Forward-Looking InfraRed (FLIR) unit is a high-resolution long-wave (LWIR) thermal imaging system incorporating a telescopic system with 40X magnification to acquire a unique and specific thermal signature of the object in question (see Figure 9). A thermal signature is based on two parameters, emissivity and reflectivity, which are inversely related (for an opaque surface, they sum to one). This imaging information is valuable for discerning heated engines/propulsion systems, surface details, geometry, silhouette, shape, and the composite material of a surface structure (such as a metal like aluminum) (Wei Huang, 2016) [25].
Figure 9. Thermal camera model OTS-4T Thermal Imaging Camera. 640 X 480 Sensor, 40X magnification.
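To illustrate the emissivity/reflectivity trade-off in a back-of-the-envelope way, the sketch below uses a broadband Stefan-Boltzmann approximation (real LWIR band radiance behaves somewhat differently); the surface and background temperatures are assumed values chosen only to show why a low-emissivity aluminum skin can read much colder than its true temperature against a cold sky.

```python
# Rough sketch: how emissivity trades off against reflectivity for an opaque
# surface, and how that shifts the apparent radiometric temperature.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def apparent_temperature_k(t_obj_k, t_background_k, emissivity):
    """Broadband brightness temperature of an opaque surface.

    Observed radiance is emissivity * sigma * T_obj^4 plus the reflected share
    (1 - emissivity) * sigma * T_background^4, since reflectivity = 1 - emissivity
    for an opaque surface.
    """
    radiance = emissivity * SIGMA * t_obj_k**4 + (1 - emissivity) * SIGMA * t_background_k**4
    return (radiance / SIGMA) ** 0.25

# Assumed example: skin at 0 C (273 K) against a cold clear sky (~ -50 C effective).
print(apparent_temperature_k(273.0, 223.0, 0.10))   # low-emissivity aluminum: ~230 K
print(apparent_temperature_k(273.0, 223.0, 0.95))   # high-emissivity paint:   ~271 K
```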
Figure 10 is a photograph of a Boeing 737 taken by a monocular ATNOTS 4T, HD FLIR thermal camera, at an approximate elevation of 1000 feet on a glide path into Long Island MacArthur Airport. Note the darker areas in a black hot palette, the rear engine cowling, landing lights, and exhaust streams. Note: The detail in the image, which results from the thermal capacitance, allows us to differentiate subtle temperature variations, as low as one-tenth of a degree, based on the thermal conductivity of materials and the location of heat sources.
Figure 10. Thermal image taken by a thermal camera.
The thermogram image, in the form of a silhouette, is a Boeing aircraft at an altitude of 28,000 feet. The zoom range is an optical-mechanical setting, not digital. There are two basic ways of zooming in photography: optical and digital. Optical zoom leverages a physical change in the lens to adjust the effective distance between the camera sensor and the subject, yielding a smaller FOV without sacrificing image resolution. In contrast, digital zoom uses magnification technology to enlarge an area of an image, thereby compromising the integrity of the picture by cutting down on the megapixels (Leibovitz, 2023) [26] (Figure 11).
Figure 11. HD FLIR thermal camera, black hot setting at a magnification of 25X.
Figure 12 was taken with an HD FLIR thermal camera on a black-hot setting, magnified 30X; this is a view of an airliner at 30,000 feet. Figure 13 is a 25X zoom image from the same HD FLIR thermal camera, black-hot setting, showing a magnified view of an airliner at 28,000 feet. In all the images above, the temperature differential between the high-altitude sky and the internal heating within the fuselage gives the airliner a black silhouette thermal signature. This is due to the extreme cold of the air at those altitudes, which can approach -40 degrees (the point at which the Fahrenheit and Celsius scales coincide) or colder. Because the scene spans such a wide temperature range relative to the camera’s display range, only a limited portion of the color palette appears.
Figure 12. Thermal image at 30X magnification.
Figure 13. Thermal at 25X magnification.
3.4. Using LW FLIR Thermal Imaging in the Determination of Possible Thermal Inversions, Which Could Invalidate Radar and RF Data
The HD monocular FLIR camera provides well-defined atmospheric temperature gradients in a rainbow color palette. This type of visual display gives us a unique way of identifying temperature inversions, which can produce radar reflection artifacts. The rainbow color palette was chosen because, while similar to an ironbow palette (with longer-wavelength colors representing the hottest parts of the image and shorter-wavelength colors the coldest), it adds more colors to contrast boundaries that may have only subtle temperature differences. This makes it a suitable means of finding the boundaries of subtle temperature variations and defining temperature layers in the atmosphere. We conduct this scan several times a night during the field investigation to account for atmospheric conditions that may negatively impact the radar scans.
Figure 14 shows the gradual temperature difference as layers move from a warmer surface temperature to cooler temperatures with altitude, rendered as color gradients in a rainbow palette. The image represents a typical temperature gradient (taken at RMSP with the handheld FLIR); the thermogram demonstrates the absence of a temperature inversion (see Figure 15).
Figure 14. Thermal image using a Rainbow palette to show temperature gradation.
Figure 15. Normal temperature gradation.
Figure 16. Atmospheric temperature inversion.
Figure 14 and Figure 15 illustrate a typical atmospheric temperature gradient, in contrast to the atmospheric temperature inversion in Figure 16, where the temperature gradient does not follow a linear progression of change.
3.5. Understanding Confounding Variables; Eliminating Prosaic Explanations
Whenever the skies are active with commercial air traffic, confounding variables emerge that can influence object observation and classification; understanding and discerning identifiable features is therefore necessary for identification, as follows:
1) Know where the airport locations are.
2) Define the circular/elliptical flight patterns within the airport flight corridor.
3) Observe aircraft in flight holding patterns during approach. This is a typical process by which ATC (air traffic control) delays an aircraft’s travel as a safety measure until cleared to proceed. Typically, the pilot will guide the aircraft in an elliptical flight pattern, making minor turn adjustments. To a ground observer, the airliner appears as an object displaying limited motion for long periods.
4) Observe glide slopes (low horizon approaches).
5) Understand that an aircraft’s Aspect ratios (structural length and shape) appear to change for the observer based on the aircraft’s position or orientation relative to the line of sight.
6) Understand Aircraft lighting systems (patterns: lamp locations, intensity, flash vs non-flash, color).
7) Know the Navigation lights on aircraft wing tips. Green-colored light is on the right side (starboard side), Red-colored light is on the left side (port side), and white lights are on the tail section (empennage) and nose. The differing colors allow pilots of nearby aircraft to use a visual method to distinguish another airliner’s orientation and direction.
8) Anti-collision lights are bright red flashing proximity beacons on the aircraft’s top and ventral (underside).
9) Landing lights are downward-tilted bright white lights mounted on the ventral side of the wings near the fuselage. They are an indicator system for a landing and usually remain on until an altitude of 10,000 feet is reached. Taxi lights can usually be seen on an aircraft’s nose wheel strut as it travels along the runway. Formation IR lights (military) are used for stealth operations.
10) Acoustic variability is affected by:
Altitude
Atmospheric/weather conditions (wind, humidity, temperature)
Propulsion systems (Prop. Turboprop, Turbine or Turbofan blades, rotor blades)
Environmental phenomena, such as temperature inversions, which can produce radar anomalies: false targets, distorted return size and depth, and spurious reflections. Properly adjust for rain or sea clutter (Harley D, 1981) [9].
For distance identification, high-magnification (60X - 125X) electro-optical camera equipment can be utilized to adequately visualize and discern artificial aircraft features, such as lighting effects, shape, and engine contrails. This can also be achieved at other wavelengths, such as LWIR, using high-resolution FLIR cameras with at least 30X zoom capability, which can discern the thermal signatures of manufactured objects from unknowns at great distances (for example, the thermal silhouettes of the fuselage and engines and the engine locations). Understand biomass motion, temperature variations, wing oscillations, and flight patterns (birds, bats, insects).
Use Active Radar in combination with ADS-B software applications to identify manufactured aircraft.
4. Computations
4.1. Radar Target Tracking and Acquisition (TTA)
The Nightcrawler mobile laboratory has two Furuno 4 kW onboard marine radar systems operating at the 3 cm X-band wavelength, transmitting a frequency of 9.410 GHz (±30 MHz). These are low-profile 4 kW radome antenna types. The radar configuration takes advantage of two scan geometries through the physical orientation of each antenna: one antenna scans in the horizontal (X) axis, while the second scans in the vertical (Y) direction. The X-Y array provides enhanced tracking ability in elevation, azimuth, and distance range.
Modern ocean vessels generally use two types of radar for navigation: S-band and X-band. The advantage of X-band radar over S-band is increased discrimination of smaller targets through a narrower transmitted pulse and a smaller waveguide: the narrower the pulse profile, the better the separation between targets, which could be missed by the wider 10 cm beam produced at the S-band wavelength. These radars were originally designed for ship traffic control and navigation. The images produced by marine radars show hard targets such as ships and coastlines as well as reflections from sea waves and sea spray known as “sea clutter.” Given a wind component that produces elevated sea waves, the line of sight to the radar horizon produces backscatter from the sea, which becomes visible in radar images. Such reflections from ocean waves are primarily due to resonance between the radar waves and features at the water surface (Bragg scatter) [15]. Because the radar wavelength is in the centimeter range, only very short water waves reflect the radar waves. This sea clutter contains valuable information on the actual sea state and provides a reference for aligning the antenna radome pitch. Sea clutter sensitivity, along with rain (sky) clutter, is adjusted for maximum detection of objects as small as 1 meter in diameter.
X-band radar systems have a short 3 cm wavelength, which makes them sensitive to moisture in the atmosphere that interferes with the short pulse returns; this makes X-band radar well suited to detecting coastal sea spray and rainfall, which is why the X-band frequency is commonly used in weather radars to monitor precipitation. X-band systems are also sensitive enough to track the migratory patterns of biomasses such as birds, bats, or insect swarms. The horizontal beamwidth is directly proportional to the transmitted wavelength and inversely proportional to the effective width of the antenna array; the wavelength itself is the speed of light divided by the operating frequency, roughly 3.2 cm for an X-band radar operating at 9.4 GHz, as illustrated in Figure 17.
Figure 17. Antenna wavelength calculation.
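As a hedged worked example of these relationships, the sketch below computes the wavelength from the operating frequency and applies a common rule-of-thumb beamwidth factor (approximately 70 degrees-wavelengths per unit aperture); the exact beamwidth depends on the antenna design, and the 40 cm array width is taken from the radome dimension quoted later in this section.

```python
# Wavelength and horizontal beamwidth relationships for the X-band radome.
C = 299_792_458.0  # speed of light, m/s

freq_hz = 9.41e9          # X-band operating frequency
antenna_width_m = 0.40    # approximate effective array width (assumed)

wavelength_m = C / freq_hz                               # ~0.0319 m, i.e. ~3.2 cm
beamwidth_deg = 70.0 * wavelength_m / antenna_width_m    # ~5.6 degrees horizontal

print(f"wavelength = {wavelength_m * 100:.2f} cm")
print(f"beamwidth  = {beamwidth_deg:.1f} degrees")
# This is consistent with the ~5.7 degree horizontal beamwidth quoted for a
# 40 cm radome later in this section.
```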
The second Furuno radome antenna, mounted to the top rear of the Nightcrawler, is oriented toward the zenith. This Y-axis orientation allows us to track an object’s movement and velocity as it crosses the path of the mobile laboratory. A cross-sectional scan conducted under two geometries gives us an additional data advantage: target cross-sections vary, and depending on size, a positive return may not be possible from an X-axis scan alone. Y-axis scans give precise results for cloud heights and ceilings, especially storm clouds and their intensities, and can offer additional information about an object that X-axis scans alone cannot. An object’s altitude, size, and shape can be determined more accurately when the object is overhead and within a 4-mile radius of the radome.
The Furuno radar’s key features, an 8.4-inch color LCD screen and a fast Target Tracking (TT) function, allow the user to track up to ten targets manually or automatically. After selecting a target, displaying its speed and course vector takes only seconds, allowing rapid assessment of approaching or moving targets. When using Echo Trail mode (Doppler), moving objects show a gradation of target trails in their wake, making it possible to gauge the movement of targets at a glance.
Once the Nightcrawler arrives at the place of deployment, and before every study, the radar system must be initialized, calibrated to the environment, and tested for functionality. Parameters such as range (nautical miles), rain and sea clutter sensitivity, gain (pulse strength), off-center offset (when forward viewing), echo stretch (time persistence of detection), and radome pitch must all be set. The radome pitch is critical for eliminating false returns from sea conditions, such as wave crests; buoy data can help determine the height of sea swells. The forward X-axis radome is pitch-controlled via a motorized articulated arm, and the pitch angle can be adjusted from a flat horizon of 0 degrees to a maximum elevation of 30 degrees. Pitching the antenna compensates for the horizon angle formed by the waveguide (see Figure 18). The horizontal beamwidth and the vertical beamwidth are two characteristics of a radar antenna: while the horizontal beamwidth depends partly on the width of the antenna, the vertical beamwidth is fixed, generally 20 - 30 degrees, to accommodate the pitching and rolling of a vessel in maritime use. This, of course, is not an issue for a land vehicle. Radomes with an antenna width of about 40 cm have an average horizontal beamwidth of 5.7˚.
Figure 18. Physical structure of the internal waveguide within the radome.
Some practical issues must be overcome before validating a marine-based radar on land. The waveguide’s vertical angle must be compensated for by pitching the antenna. The waveguide directs the horizontal beam off the center line of the radome by ±15 degrees, as shown in Figure 19 with the calculated angle; this compensates for the directed beam during the pitching and rolling of a vessel. Maritime vessels are best served by mounting the radar antenna at the highest point on the vessel for two reasons: to maximize the optical (visible) horizon and the radar horizon. The optical horizon is the limiting factor for determining maximum sea scanning distance, while the radar horizon is a function of antenna design. Sea clutter is generally minimized by this approach, and the design intent is to account for sea conditions, such as wave crests, in proximity to the vessel.
Figure 19. Waveguide beam angle and Line-of-Sight (LOS) calculation.
Commercially available marine radar systems have a nominal range of 24 to 48 miles. However, the Earth’s curvature limits this range according to how high the antenna is mounted. Due to RF refraction, the radar horizon is slightly farther than the optical line of sight. The general formula for calculating the radar horizon is shown in Figure 20.
Figure 20. Visible radar horizon based on LOS.
The examples shown in Figure 21 give the distance limitation for object detection when the contact is at an altitude no higher than the transmitter. In example 1, with both the transmitting antenna and the target at a height of 3.66 meters, detection is limited to 5.95 NM by the visible horizon. In example 2, an object at an altitude of 266 meters is not limited by the Earth’s curvature, although the scan altitude is still reduced. Another issue that needed to be addressed was the limited sky visibility of a marine radar. The radome waveguide gives us a 15-degree elevation to the sky, so objects within a 3-mile radius cannot be detected at an altitude greater than 4244 feet. By pitching the radome antenna by 30 degrees (45 degrees at the beam center, including the 15-degree waveguide angle), we gain approximately a threefold increase in altitude coverage. This sky coverage comes at a cost, reducing our field of view to approximately 60 percent of the sky in the direction the antenna is pitched. Because our research is directed off the coast and over the ocean, any impact on coverage over land is minimal to the study.
Figure 21. Examples of radar detection distance based on LOS heights between vessels or vehicles.
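The two geometric limits described in this section can be sketched as follows. The radar-horizon constant used here (4.12 for heights in meters and distances in kilometers) is the commonly used textbook approximation that includes standard refraction and may differ slightly from the constants behind Figure 20 and Figure 21; the altitude-ceiling calculation treats the 3-mile radius as statute miles, which reproduces the 4244 ft figure quoted above.

```python
# Geometric limits of the land-based marine radar installation.
import math

def radar_horizon_nm(antenna_height_m, target_height_m=0.0):
    """Approximate radar horizon in nautical miles, with standard refraction.

    Uses d_km ~= 4.12 * (sqrt(h_antenna) + sqrt(h_target)), heights in meters.
    """
    d_km = 4.12 * (math.sqrt(antenna_height_m) + math.sqrt(target_height_m))
    return d_km / 1.852

def altitude_ceiling_ft(ground_range_statute_miles, elevation_deg):
    """Maximum detectable altitude at a given range for a fixed elevation limit."""
    return ground_range_statute_miles * 5280.0 * math.tan(math.radians(elevation_deg))

# Waveguide limit of ~15 degrees: objects within a 3-mile radius cannot be seen
# above roughly 4,244 ft, consistent with the text.
print(altitude_ceiling_ft(3, 15))   # ~4244 ft
# Pitching the radome ~30 degrees (about 45 degrees at the beam center) raises
# the ceiling to roughly 15,840 ft at the same range, close to a fourfold gain.
print(altitude_ceiling_ft(3, 45))   # ~15840 ft
```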
4.2. Methods of Determining Object Size and Distance
Occasionally, we will get a radar reflection of a target or targets, providing a distance measurement along the line of sight of the radar antenna(s). For any degree of accuracy, certain conditions must be met. If the object is captured on the Nightcrawler’s vertical (Y-scan) radar, it must be within the 2-nautical-mile swath of the beam and at a north-south distance no farther than 16 miles from the vertically scanning radome. Angle and altitude can then be determined to give the most accurate indication of its position relative to the Nightcrawler. Once the target’s position has been determined, the approximate object size can be determined from camera image data, knowing the field of view and focal length. For a size-distance relationship, a vector position must be established through radar, or through a LIDAR device for distance and angle (line-of-sight, LOS) measurement.
Figure 22. Y-scan radar, sweep depth.
Figure 23. Lidar based rangefinders.
Figure 22 shows the Nightcrawler’s Y-scan radar, which gives us an object’s position and limited size information. Figure 23, on the right, shows LIDAR-based rangefinders from Vortex; two models are shown, a monocular and a binocular type. Both have long-range capabilities of 4000 and 5000 yards, just under 3 miles. The Vortex laser rangefinders use beam-directed IR energy at 840 nm, outside the visible part of the spectrum. When used with an image taken through an NV camera, object size can be determined through computation (see Figure 24 and Figure 25 below). Our team is exploring the use of LIDAR-based detection and reflective glare patterns to determine object presence and surface characteristics, such as reflectivity/emissivity and the effects of polarization. There seems to be an interesting phenomenon associated with these spheroids in which light energy in close proximity experiences a form of interference, a lensing effect; this might explain the color fringing, loss of apparent resolution, and overall distortion of what we see with our eyes or record, which may be characteristic of the phenomena. Reflected light appears to be disorganized at times.
Figure 24. Image height and distance calculations based on camera lens geometry.
Figure 25. Image size and distance ratios based on internal camera lens geometry.
The methodology used to determine the size and distance of an object is based on the geometric relationship of similar triangles and congruent angles. A camera image determines the object’s size or distance using the camera’s optical parameters and specifications: field of view (FOV), focal length, and image sensor size. The distance to the object can be determined by radar or LIDAR measurements. In either case, a vectored position is quickly established from the displayed radar polar measurement or the LIDAR (rangefinder) line-of-sight (LOS) data, distance, and viewing angle. This mathematical relationship is shown in Figure 24 and Figure 25 above.
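A minimal sketch of that similar-triangles computation follows; the camera parameters (sensor width, focal length, image width) and the lased range are illustrative assumptions rather than the actual Nightcrawler camera specifications.

```python
# Object size from apparent size in an image plus a measured range:
# object size = distance * (object size on sensor) / (focal length).
def object_size_m(distance_m, object_pixels, focal_length_mm, sensor_width_mm, image_width_px):
    """Estimate real-world object size from its pixel extent in an image.

    The object's extent on the sensor relates to its real extent the same way
    the focal length relates to the range (similar triangles / congruent angles).
    """
    size_on_sensor_mm = object_pixels * (sensor_width_mm / image_width_px)
    return distance_m * (size_on_sensor_mm / focal_length_mm)

# Example: a target lased at 1,500 m spanning 40 pixels of a 1920-pixel-wide
# frame, with an assumed 7.2 mm wide sensor behind a 200 mm lens -> ~1.1 m.
print(object_size_m(1500.0, 40, 200.0, 7.2, 1920))
```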
5. Discussion
The Nightcrawler “Eye on the Sky” field research study was the culmination of intensive sunset-to-sunrise observations extending over ten months on coastal Long Island. The regions included the south shore, at RMSP on the Fire Island National Seashore, and the north shore, in the Cedar Beach area; we included the Corey Beach/Great South Bay inland region as a comparative baseline. Our study’s findings, meticulously gathered and analyzed using two instrumented mobile sensor platforms, point to an exotic phenomenon that is not reducible to straightforward explanations and remains largely unexplained.
During this period, we observed odd displays of light that frequented the coastlines, appearing most abundantly on and off the south shore. Whether they are primarily a coastal phenomenon or occur in equal numbers inland remains to be seen, pending further studies; these findings open exciting possibilities for future research and discovery. We observed the more extraordinary displays over or near the coast of Long Island. One cautionary aspect of visual sightings by ground observers is perceptual error in judging precise location and position. For instance, we observed aircraft and unknown objects that appeared to be over land; when we attempted to correlate those locations with radar tracks and triangulation methods, the actual positions were near or off the coast rather than inland. Depth of field can be distorted, depending on unknown size variation, minimal detail of physical attributes, and foreground/background landmark contrasts; objects can appear closer or farther than they are. This is not to say that land incursions do not occur, but according to USGS data, Long Island’s landmass is only 23 miles at its maximum width, so what appears to hover over land may actually be a relatively short distance off the coast. Most alleged sightings may therefore reflect a coastal phenomenon with occasional incursions over land.
As we began noticing these unknown objects/luminosities through enhanced instrumented means, we performed our due diligence to eliminate the known quantities from the unknowns. We followed the scientific method through stringent field protocols to collect data and evaluate possible evidence. We speculated and hypothesized about what other natural explanations might make sense, and we investigated the possibility of maritime surveillance drones, drones used for dredging or coastal and sea-floor mapping operations, and DoD/military or foreign adversarial activities. These became the least likely explanations for us.
The unknown objects appear to be elusive, frequently remaining just outside the spectral range of human sensory perception. They are primarily seen within the infrared bandwidth and may occasionally be observed in the visible spectrum when their albedo/luminosity is sufficiently intense. We have observed light output from such objects ranging from barely perceivable low lux to high lux. On the rare occasions when these objects and luminosities were perceptible in the visible spectrum, they assumed variations in color, such as white, blue (occasionally iridescent), orange, and red. The unknown objects displayed quantifiable features, allowing us to assign tentative taxonomic and morphometric details for referencing the differing forms discussed in the body of the paper (such as four to five general variations in shape geometry and consistency in relative size, 1.5 meters to 3.0 meters in diameter). Their behaviors were at times transitory, random, indifferent, and purposeless, yet there were moments when these objects demonstrated an awareness of us and some level of organization, intelligence, and even interaction. There were instances where we observed more bizarre features, such as luminous objects suddenly appearing and rising from or out of the ground where none had been seen previously. Occasionally, we would see them sitting on the ocean surface or passing beneath it without disturbing that medium or producing any sound, which led us to wonder whether these objects have any solid, loosely dense, or non-physical mass. We have also noted some interesting outliers in spectrum analyses during presentations of this phenomenon. Measurements of EM field power flux densities (the product of the electric and magnetic fields) detected signals at 1.79 GHz and 4.066 GHz. We also found a correlational relationship between our EMF measurements and the unusual ultrasonic signals we detected multiple times during the coastal light phenomenon presentations: both the EMF and acoustic data displayed 40 kHz block separations between signals.
Was there a more likely natural explanation, such as an unusual atmospheric phenomenon in the form of uniquely stacked water/ice crystals and prism effects, or plasmas? Were these bizarre luminosities due to some unknown electrical or chemo-electrical dynamo, or to ionized luminescent gaseous states, as described under the umbrella of Transient Luminous Events (TLEs)? This is a relatively new area of atmospheric science, and TLEs most often occur above the troposphere (at least based on current observational evidence). Could previously unknown types of TLEs occur at lower altitudes, between the sea, land, and low cloud ceilings, and could their transits or drifting motions be influenced by the opposing electrical properties of environmental surroundings and geomagnetic field lines? We have observed multiple unexplained flashes in the sky during presentations of these luminosities. While we considered this idea one plausible explanation, it has not satisfied all the observational parameters we noted nor accounted for our experiences.
Learning more about Unidentified Anomalous Objects is vital for national security and aviation safety, but it also raises much bigger existential questions for scientists, philosophers, and theologians.
Let us consider, just for a moment, the potential implications for the integrity of our skies by turning to the tragic case of TWA Flight 800, an aviation disaster that occurred in the Atlantic Ocean just tens of miles from this project’s leading field study site at Robert Moses State Park. Questions remain despite the official explanation of a fuel tank explosion. The event was witnessed by many credible onlookers who claim they saw an object or a light traveling at high velocity toward the ill-fated plane that early summer evening in July 1996 (Ruppe, 2020) [27]. Since that time, the topic of UAPs has reverberated worldwide through sightings and media reportage, making it prudent to ponder whether an unexplained UAP might account for the alleged object sighted by witnesses that could have collided with TWA Flight 800 mere minutes after take-off. An unknown object striking the jetliner could account for the ensuing events, including the center fuel tank rupture. The resultant loss of the 230 people aboard this international flight demands that we seriously consider this possibility in order to understand and prevent additional near misses and potential incidents.
Reported UAPs may or may not be detected by radar. One NARCAP study over a decade ago reviewed serious aviation safety-related cases involving near collisions and simultaneous sightings of spherical UAPs (Roe, 2010) [28]. Since then, observations by professional aviators of unknown objects and lights in the skies posing risks have only accelerated (NBC, 2022) [29], and newly formed aviation safety organizations have joined the fray to address this burgeoning challenge (Graves, 2023) [30]. An additional point needs to be made on this topic: the testimony of trained observers like aircraft pilots is powerfully relevant to case studies of UAPs/UFOs. Pilots are well educated and well trained in the aeronautical sciences, and they have a vantage point and experiences most of us will never have. Our research studies apply and follow a forensic science standards model, and under Federal Rule 702, regarding the scientific admissibility of expert witness testimony in the courtroom, pilots qualify as expert witnesses. Their reports of unknown objects in their airspace are occasionally accompanied by multiple parameters, such as radar-visual-sensor observational data. This may be considered hard or soft data, depending on one’s perspective; in either case, it should be considered baseline data. In all fairness, we do not expect a pilot to define the properties of a bolide/meteor in our skies, but neither do we expect an astronomer to define the properties of an avionics system that appears in our skies.
During our research, our Team witnessed incursions of unknown luminous objects and light phenomena firsthand over the open North Atlantic Ocean and the Long Island Sound, which abut Long Island and New York’s southern and northern coastlines. If funded, we are committed to commencing Phase II and beyond to shed further light on this captivating phenomenon. In the interests of science and the advancement of human knowledge, information gleaned and collected on UAPs should not remain bottled up in secret behind closed doors.
Our field investigation and study have uncovered new clues to this phenomenon and raised important questions. For instance, the low-observability factor can explain why we have so little awareness of these things and why these phenomena have remained enigmatic. We often assign words such as UFOs, UAPs, ghost lights, phantoms, entities, and the paranormal because we do not know what they are; defining them has fallen to our belief structures. Our fieldwork shows that most anomalous objects we observed resided primarily in the infrared spectrum, especially deep within the SWIR bandwidth. No significant heat signatures were observed, and the objects are only occasionally seen within the visible bandwidth. The objects display a fluid variability in spectral range whose full scope has yet to be determined, and they appear to respond inconsistently to IR light or to be sensitive to it. The objects observed appear to be between 1.5 and 3.0 meters in size, and the intermittent radar reflections may be due to a small cross-section or to interference. The objects appear to display an affinity for large bodies of water and a curiosity toward shipping traffic.
These objects may not be confined to regions within the Earth’s atmosphere. A pioneering Swedish scientific research study by astrophysicist Dr. Beatriz Villarroel and the VASCO team, released in January 2023, postulated a theory based on anomalies discovered on astronomical photographic plates from the early 1950s. Dr. Villarroel and her VASCO colleagues found nine star-like objects on the plates that appeared and then vanished from view, in some cases within a half-hour period. These “transients” appeared in geosynchronous orbits around the Earth before the first artificial satellites were launched into space. What interested our team are the similarities in physical properties between VASCO’s GSO transient objects outside the atmosphere and the unknown objects we observed within it. Both displayed reflective glints of light that may indicate complex surface geometries, such as polyhedral shapes. There are also similarities in the types of formations, such as objects aligned in straight lines, binary sets, and triangular configurations. We cannot say with certainty that these are all the same objects, but the similarities suggest an intriguing possibility about the transmedium extent of their range (Villarroel, 2023) [31].
If no answers are found within the current paradigm, the bigger hypothetical question to ask is, is the 3-Dimensional construct of our senses and perception an accurate representation of reality? Are there extra or higher and more complex spatial dimensions that we cannot perceive but our mathematics tells us are possible? Could a shadow zone or shadow biosphere co-exist with us between these niches (Petkowski, 2016) [32]? In this scenario, an ultra-terrestrial or crypto-terrestrial hypothesis might not be so far-fetched, even if unlikely. We should not be afraid to ask such questions and be willing to explore the possibility that there may be gaps in our understanding of nature, no matter how absurd it may seem. Paradigm shifts in science and thinking have happened before. They are certain to happen again. The fact that human beings are explorers who have reached out to planets and distant stars beyond our own should open the door to the possibility that “Others” may have done the same. The ETH (Extraterrestrial Hypothesis) should also be explored as a possibility, even if considered remote.
6. Conclusion
The fieldwork of the Nightcrawler and its engineering team is an invaluable contribution to the burgeoning on-the-ground observational science of UAP. The work furthers the serious, evidence-based research that must be done to lay the foundation for real progress, understanding what UAP could be and how they may operate and behave as observations suggest. Without the careful, instrumented observations conducted throughout this investigation, there would be no foundation for determining the substantial, coherent, and consistent methodology for the observational science of UAP. The engineering team brings decades of engineering and practical, hands-on technical experience to bear on their research, yet another valuable contribution to a field that can benefit not only from academic scientists, who aim to secure confirmatory observations under stricter laboratory conditions but also from the more foundational field experience of those having to negotiate the complexities of observation of UAP in authentic contexts. Because UAP is observed in real-life situations rather than under laboratory conditions, the work of field scientists is key for progress in the scientific study of UAP. Indeed, because of this kind of work, we can have science in the first place.