Part III: Explaining the “Extra” Heat of Intergalactic Hydrogen Clouds with Probabilistic Spacetime
1. Introduction
This is the third of three articles elucidating how the Probabilistic Spacetime Theory (PST) explains previously unexpected experimental and observational research findings. Each article delineates the underlying mechanism by which the PST explains one such unexpected phenomenon. The purposes of these articles are: 1) to offer explanations of the unexplained and 2) to promote observational and experimental research concerning both the PST’s facets and predictions and its value relative to other models.
This article addresses the discrepancy discovered between observational findings and simulation outcomes concerning the thermal parameters of intergalactic hydrogen clouds when the redshift z < 2. As described in Section 2 below, observations regularly show more heat in such clouds than is predicted by the best of our simulations. Something is causing this “extra” heat, but that factor has not been determined. The latest hypothesis in that regard involves a type of dark matter called the dark photon. That hypothesis is described in Section 2.1.
Section 3 offers a different explication for the extra heat, this one based on the PST. First, to set the framework for understanding that explication, the relevant portions of the PST are delineated. That accounting is followed by the description of how the PST explains both the extra energy found in the intergalactic clouds and why that extra energy has been found only when z < 2.
The final section of this paper offers both an integrative summary of the three articles in this series and a comparison of the dark entity explanatory model to the PST. The conclusion from that comparison is that the PST’s probabilistic spacetime offers a simpler and far wider set of explications for cosmological phenomena than do dark entities.
2. Observations versus Simulations
Astronomic spectroscopy is the measurement of the spectrum of electromagnetic radiation emanating from stars and other celestial bodies. Spectroscopy can be used to probe numerous attributes of those bodies. The studies described below employed that methodology to investigate the temperature of intergalactic hydrogen clouds (IHCs). The specific technique used in these studies is the type of spectroscopy called the Lyman-alpha forest (Ly-α), which involves a series of absorption lines in the spectra of distant astronomic bodies. As light from a distant background source travels through multiple gas clouds at different redshifts, the Lyman-alpha electronic transition of neutral hydrogen in each cloud imprints its own absorption line, so that a series of lines is formed. Thermal properties of the absorbing gas can be ascertained from the width of those spectral lines: the broader the lines, the greater the thermal energy and hence the temperature of the gas.
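To make the relationship between line width and temperature concrete, the sketch below assumes purely thermal (Doppler) broadening of the hydrogen line, for which the Doppler parameter is b = sqrt(2·k_B·T/m_H). The numerical example is illustrative only and is not drawn from the cited studies, which must also separate turbulent and instrumental broadening from the thermal component.

# Minimal sketch: inferring a gas temperature from the thermal (Doppler) width
# of a Ly-alpha absorption line, assuming the measured width is purely thermal.

import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
M_H = 1.6735575e-27  # mass of a hydrogen atom, kg

def temperature_from_doppler_b(b_km_s: float) -> float:
    """Gas temperature implied by a thermal Doppler parameter b = sqrt(2*k_B*T/m_H)."""
    b = b_km_s * 1.0e3            # convert km/s to m/s
    return M_H * b * b / (2.0 * K_B)

# Example: a line width of b ~ 20 km/s corresponds to roughly 2.4e4 K.
print(f"{temperature_from_doppler_b(20.0):.3e} K")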
The Ly-α technique has long been considered reliable and has been employed for a wide range of applications. Investigations of various astronomic entities and characteristics using the Ly-α have been conducted for over 25 years (e.g., [1]-[6]). Of relevance to this paper, many studies have used the Ly-α to assess the temperature of IHCs [7] [8] [9] [10].
Recently, a consistent discrepancy has become clear between the IHC temperatures indicated by the Ly-α and those indicated by the best simulations. The problem of incongruity between the observed and simulated (or theorized) IHC temperature was first described in 2014 [8]. A nearly five-fold difference suggested that the observed temperature was much larger than predicted [7]. Since that time, further research has corrected for some of that disparity through improved simulations [11] [12], revised estimates of the ultraviolet background in empirical models [13] [14], and the addition of black hole feedback that affects the intergalactic medium [15]. The culmination of this work is that the difference between the observed and simulated values has shrunk to a factor of about two [9], and the remaining incongruity exists only where z ≤ 2.5 [9].
2.1. The Dark Photon Explanation
The most recent investigation of this incongruity offered a “dark photon” explanation for the remaining discrepancy (within the context of low-redshift Ly-α observations) [10]. Dark photons are hypothetical particles thought to be force carriers related to dark matter, analogous to how ordinary photons carry the electromagnetic force. The dark photon can be viewed as extending the gauge group of the Standard Model (SM) as a new spin-1 gauge boson. This attribute allows it to couple very weakly with charged particles through kinetic mixing with ordinary photons. Of high relevance to the cited investigation, at least some dark photons, under certain kinetic mixing conditions, are thought to convert spontaneously into SM photons [10] [16]. It is this conversion of dark photons into SM photons that is hypothesized to account for the extra heat found in IHCs compared with what existing simulations predict.
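For general background (this is standard notation from the dark photon literature, not a result of the cited study, and sign and normalization conventions vary between papers), the dark photon A′ is usually introduced through a small kinetic-mixing term in the Lagrangian:

\mathcal{L} \supset -\tfrac{1}{4} F_{\mu\nu} F^{\mu\nu} - \tfrac{1}{4} F'_{\mu\nu} F'^{\mu\nu} - \tfrac{\epsilon}{2} F_{\mu\nu} F'^{\mu\nu} + \tfrac{1}{2} m_{A'}^{2} A'_{\mu} A'^{\mu}

Here F and F′ are the field strengths of the ordinary and dark photon, ε is the (small) kinetic-mixing parameter, and m_{A'} is the dark photon mass. Conversion of dark photons into SM photons is generally taken to be most efficient where the plasma frequency of the gas approaches m_{A'}.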
That recent investigation involved nine hydrodynamical simulations, using three values of the (ultralight) dark photon mass and three values of the kinetic mixing. The result was a “best fit” model (from among the nine simulations) at z = 0.1 that matched the observed data within 1σ, adding 5.3 eV per baryon of heating beyond earlier simulations, as compared to the observed maximum excess of 6.9 eV per baryon [10].
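As a quick check on those two figures (simple arithmetic, not part of the cited analysis), the best-fit injection covers roughly three quarters of the quoted maximum excess. The small sketch below does nothing more than compute that fraction; the parameter grid is only a hypothetical stand-in for the nine-simulation design.

# Illustrative arithmetic only.  The two energy figures are quoted in the text
# above [10]; the parameter grid is a hypothetical placeholder standing in for
# the nine-simulation (3 masses x 3 mixings) design, not the actual values used.

dark_photon_masses = ("m1", "m2", "m3")         # placeholder labels
kinetic_mixings = ("eps1", "eps2", "eps3")      # placeholder labels
simulation_grid = [(m, e) for m in dark_photon_masses for e in kinetic_mixings]
assert len(simulation_grid) == 9                # 3 x 3 combinations

best_fit_injection = 5.3     # eV per baryon, best-fit model's added heating
observed_max_excess = 6.9    # eV per baryon, maximum observed excess

print(f"best fit covers ~{best_fit_injection / observed_max_excess:.0%} "
      f"of the maximum observed excess")        # roughly 77%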
The researchers concluded that their study was a “first clear indication” that dark matter energy injection can be a compelling explication for the linewidth discrepancy. The researchers acknowledged, however, that other astrophysical, non-canonical heating processes may explain the dissimilarity instead [10].
It may be important to remember that this research ran its tests in a low-redshift environment. Very similarly designed investigations using larger redshifts (z > 2) have not found a significant discrepancy between observed Ly-α values and simulation outcomes [2] [6]. (Researchers from the larger-redshift studies therefore did not invoke dark photon explanations for their results.) One suggested factor behind this difference in outcomes is that the IHCs studied at z > 1.5 have had greater densities, as those studies concentrated on IHCs residing at the outskirts of galaxies [9]. This systematic difference in mass density is of significance to the discussion in Section 3.2.
3. The PST and the “Extra” IHC Energy
First in this section, the relevant principles of the Probabilistic Spacetime Theory (PST) are described. The PST’s explication of the “extra” IHC energy is then presented based on those tenets.
3.1. The Basic Tenets of the PST
The PST has five main principles:
1) Spacetime is the fundamental entity of the universe.
2) Once a quantum of spacetime (called a “probability”) exists, it cannot be destroyed.
3) All fields are derivative from spacetime (which in volume is called the “probability field”).
4) The probability field has phases.
5) Derivatives of the probability field cause it to be self-attractive.
Details concerning all these tenets are given in the original article presenting the theory [17]. The PST’s explanation of the etiology of the extra IHC energy involves only the first, fourth, and fifth principles. These three are therefore explained briefly here.
The first principle was described in our Part I paper in this three-part series [18] in the following way:
Briefly, the PST posits spacetime is not simply a void or empty container of energy fields but is itself composed of wave functions of probabilistic energy. These energy fragments are the most fundamental entities in the universe. Nothing else is more fundamental. Everything in the universe has its roots in the probabilistic energy we call spacetime.
The fourth principle, that the probability field has phases, means that spacetime itself goes through phase changes. As the first principle indicated, the probabilistic energy we call spacetime is fundamental to all entities in the universe; everything is derived from this energy, depending on the energy density within a volume. At baseline energy density, the probability field produces virtual particles whose energy is very rapidly reabsorbed. Magnetism is also produced from the baseline probability field energy by the constant swirling of that energy among the wave functions that are spacetime. At the next density threshold, the massless gauge bosons, the photons (and gluons), come to exist. The phase change into photons requires a local increase in the energy density of the probability field beyond a certain threshold. The photons then carry the spacetime-generated magnetism, in their role as the transmitter of the electromagnetic force. Through this ongoing interaction between the electromagnetic field and the newly created photons, the maintenance of the photons’ energy in this phase is also facilitated.
There are other energy thresholds for phase changes of the probability field, but these are not of relevance to this paper. For details of further phase thresholds, see the original article describing the theory [17] .
The PST’s fifth principle states that the probability field is self-cohesive due to its own derivatives. The probability field generates magnetism and virtual gauge bosons everywhere there is spacetime. The generated electromagnetic field brings magnetism to everything in the universe, while the generated (virtual or SM) photons facilitate the transmission of that magnetism. Both derivatives attract the charged energy within the field from which they are generated. Through this interactive process, the probability field is self-cohesive. (Interestingly, research investigating “dark matter” has also described that entity as interacting with itself [19]. However, dark matter is a construct that does not exist within the PST.) If the cohesion becomes too great (too pervasive and too persistent) within a volume, the degree to which the local field’s probabilistic energy is available for creating magnetism and gauge bosons diminishes. A strongly cohered local probability field (in the form of mass, for example) no longer generates phase changes because little available, relatively unattached energy remains.
3.2. How the PST Explains the Extra Energy
The PST explains the extra energy regularly found in z < 2 IHCs by applying the fourth and fifth principles. Photons are created through phase changes of the probability field when the local energy density exceeds a certain threshold. The baseline-density probability field is insufficient (except randomly and infrequently) to bring enough energy to any one location to reach the density threshold for a phase change. That is why most of the universe does not generate light (i.e., does not experience phase changes from the probability field to photons). However, the presence of a low-density cloud attracts an increase in probabilistic energy within the cloud’s loose bounds. That increase in energy density results in a phase change of some of the surrounding probability field into SM photons. The new photons, in turn, add heat to the IHCs that does not originate in the clouds themselves and is therefore not reflected in simulations of the clouds alone.
Additionally, the PST can explain the significant difference in the degree of excess IHC heat when comparing observations and simulations at z < 2 versus z > 2. As suggested above (in Section 2.1), the fact that this difference in findings correlates with the mass density of the IHCs is of high relevance. Studies at z > 2 have involved IHCs located at the outskirts of galaxies, which are of higher density, whereas the z < 2 investigations have largely used more isolated IHCs, which are typically of lower density [9]. The PST sees this correlation as central to explicating the differing findings.
As was described concerning principle 5, the amount of self-cohesion of the probability field is dependent on the local energy density. If the energy is near baseline, nothing happens beyond what the baseline can do. If the energy is somewhat increased to be above the field’s threshold for creating gauge bosons, the self-cohesion is increased but the swirling nature of the probabilistic energy is not significantly impeded. If the local energy is too great, however, then the cohesion of the field becomes paramount, and the likelihood for further phase changes becomes minimal.
In the proximity of low-density IHCs, the probability field is attracted and energized toward creating gauge bosons. Photons are created that serve to heat the local IHC. When the local environment is a high-density IHC (or other high-density mass), however, the attraction is stronger yet results in significantly fewer phase changes. In that environment, far fewer new photons are created, and the high-density IHC does not receive substantial extra heat from the local probability field.
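The qualitative relationship just described can be summarized in a deliberately toy-level sketch. The threshold values and the functional form below are hypothetical placeholders, not quantities specified by the PST; the only point being illustrated is the inverted-“U” shape referred to in Section 4 below: negligible photon production at baseline density, a peak at moderately elevated density (low-density IHCs), and suppression again at high density where self-cohesion dominates.

# Purely illustrative toy encoding of the qualitative PST picture described
# above.  The threshold values and the functional form are hypothetical
# placeholders, not quantities given by the theory; the only point is the
# inverted-"U" shape: little photon production at baseline density, a peak
# at moderately elevated density (low-density IHCs), and suppression again
# at high density where self-cohesion dominates.

def relative_photon_production(density: float,
                               phase_threshold: float = 1.0,
                               cohesion_threshold: float = 10.0) -> float:
    """Qualitative photon production rate vs. local probability-field density
    (arbitrary units; densities are relative to the baseline field)."""
    if density <= phase_threshold:
        return 0.0                      # baseline field: no phase change to photons
    if density >= cohesion_threshold:
        return 0.0                      # strongly cohered field: phase changes shut off
    # Between the two thresholds, production rises and then falls (inverted "U").
    midpoint = (phase_threshold + cohesion_threshold) / 2.0
    half_width = (cohesion_threshold - phase_threshold) / 2.0
    return 1.0 - ((density - midpoint) / half_width) ** 2

for rho in (0.5, 2.0, 5.5, 9.0, 12.0):
    print(f"density {rho:5.1f} -> relative production {relative_photon_production(rho):.2f}")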
4. Discussion
Both the dark photon model and the PST offer descriptions of a mechanism causing the excess heat found in observational versus simulation studies of z < 2 IHCs. These theorized mechanisms are quite different. The former relies on a specialized type of dark matter and its hypothesized transition to SM bosons under certain circumstances. The latter suggests that the mechanism is a phase transition from baseline spacetime energy. Fascinatingly, despite very different hypothesized mechanisms, both models directly express the idea that newly created SM photons are the source for the excess heat. The PST goes one step further than the dark matter model, though, by explaining why investigations of IHCs at z > 2 do not show that same thermal discrepancy.
Differentiating between these two models, to test their relative veracity, would seem very useful. Such a direct test could be conducted with a single piece (or coordinated set) of research covering all combinations of: 1) numerous thresholds in redshift and 2) numerous thresholds in IHC mass density. If the correlation between IHC mass density and excess heat relative to simulations is supported (with “correlation” here meaning the inverted-“U” relationship between IHC mass density and the production of photons), then the PST’s model would seem to have more veracity. If, however, that correlation is not supported, and if the findings are theoretically related to dark photons, then the dark photon model would be considered more accurate.
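One way such a differentiating test might be organized is sketched below. Everything in the sketch is a hypothetical placeholder: the redshift and density bins are arbitrary, and excess_heat(z, rho) stands in for an observed-minus-simulated heating measurement (for example, in eV per baryon) that would have to come from real Ly-α observations and matched simulations.

# Sketch of how the differentiating test proposed above might be organized.
# All names and values are hypothetical placeholders, not a prescription.

from typing import Callable, Dict, Tuple

REDSHIFT_BINS = (0.1, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0)   # hypothetical redshift grid
DENSITY_BINS = (0.5, 1.0, 2.0, 5.0, 10.0, 20.0)       # hypothetical relative IHC densities

def build_excess_grid(excess_heat: Callable[[float, float], float]
                      ) -> Dict[Tuple[float, float], float]:
    """Tabulate the observed-minus-simulated excess over the full z x density grid."""
    return {(z, rho): excess_heat(z, rho)
            for z in REDSHIFT_BINS for rho in DENSITY_BINS}

def peaks_in_the_middle(excess_by_density: Dict[float, float]) -> bool:
    """Crude check for an inverted-"U": the maximum excess sits at an
    intermediate density rather than at either end of the density range."""
    densities = sorted(excess_by_density)
    peak = max(densities, key=lambda d: excess_by_density[d])
    return peak not in (densities[0], densities[-1])

# Under the PST, peaks_in_the_middle would be expected to hold within each
# redshift slice; under the dark photon model there is no particular reason
# it should.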
5. Discussion across Parts I, II, and III Papers
The recent research reviewed in this three-part series all documented more energy within the studied phenomena than was predicted. Muons showed greater precession than expected by the Standard Model [20]. Black holes were found to grow without accretion or mergers [18]. Certain IHCs measured hotter than our best-designed simulations suggested they should [10]. Various explications for the extra energy were offered by the researchers, but always with dark matter or dark energy as the hypothesized cause or among the hypothesized causes. The PST (which was developed by integrating empirical and observational findings [17]), on the other hand, gives an alternative explanation for the extra energy in each case without employing any dark constructs.
This three-part series was designed to demonstrate what the PST has to offer both theorists and empiricists. In contrast to the set of dark entity models, the PST offers straightforward explanations for different kinds of extra-energy findings based on a single model with one primary tenet and a total of five principles. The primary tenet in each case is that spacetime is the fundamental entity (energy) in the universe. This premise sets the PST apart from most other models, which view spacetime either as the container of the important fields or as irrelevant, but not as the primary source of energy.
The PST rejects all dark entities because its principles make dark constructs unnecessary. For example, dark matter is not needed to explain excess gravitational lensing or why stars at the edges of galaxies are not ejected despite their relatively high velocities. The PST uses spacetime itself, the basis of gravity, as the explanation of these phenomena. Clumps of spacetime (higher-than-baseline densities of probabilistic energy) serve as a source of gravitational force beyond what visible mass causes directly [17]. Additionally, the strong cohesion of spacetime clumps around mass is what the PST posits as the real basis for the “dark matter halos” around galaxies. The PST holds that these halos are relatively dense volumes of spacetime (and hence sources of gravitation beyond that associated with the galaxy’s mass) that are significantly cohered to the large body of mass. It may be important to note that an observational investigation has documented uneven clumping within galaxy halos, contrary to some conceptualizations of dark matter [21] but consistent with the PST [17].
Investigations over scores of years have never produced confirmed evidence for dark matter particles. Designing an experiment that would differentiate between dark matter models and probabilistic spacetime is therefore clearly problematic. Additionally, the amorphously defined characteristics of dark matter can easily lead to newly discovered phenomena being interpreted as supporting that construct when the discovery instead supports the PST, or at least fails to differentiate between the two. For example, the results from a very recent observational study were interpreted as supporting the hypothesis that there is a dark matter spike (i.e., an alteration of the density distribution) surrounding black holes [22]. The spike was thought to be detectable in low-mass X-ray binaries, where it would explain the faster-than-expected orbital decays of the companion stars. However, the PST holds that any black hole with an accretion disk necessarily has a beyond-usual clumping (increased density) of spacetime surrounding it, and this would act exactly like the surmised dark matter spike by increasing the local gravity.
Likewise, another experiment reportedly being formulated would test for dark matter using the only force with which it is guaranteed to interact: gravity. This planned experiment, called Windchime, is designed to test for dark matter with a yet-to-be-created detector able to measure tiny variations in gravity (without concomitant changes in the proximity of mass) as the Earth moves through the galaxy [23]. The researchers are reportedly prepared to say that if they discover such gravitational variations, their finding will substantiate the existence of dark matter. But a proper conclusion from such a finding could be the opposite. The variations in gravity potentially found by such a study would only document that gravity varies as we move through the cosmos. The PST directly predicts gravitational variability through its concept of the probabilistic clumping of spacetime. Given the lack of positive findings from every other search for a dark matter particle, the Windchime experiment could be significantly damaging to the concept of dark matter by more directly favoring the PST’s concept of spacetime clumping.
The PST’s explanations for cosmic phenomena quite regularly contrast with explications based on dark entities. In addition to addressing the three phenomena described in this three-part series, the PST has offered non-dark resolutions to the Hubble tension [24], the black hole information paradox [25], and the enigma of how primordial black holes can be supermassive [17], despite all these phenomena having received explanations involving dark matter or dark energy. The fact that the PST offers explications for all these phenomena with the same five tenets indicates that it is quite parsimonious while being comprehensive in scope. The authors’ hope is that research will be conducted comparing the PST’s predictions against those of other models.