Projections between Euclidean Volumes with Information Due to Spontaneous Symmetry Breaking

Abstract

The goal of our research was to determine a coupling between information theory, geometry and multi-dimensional projections. This was accomplished after preliminary mathematics was presented to establish an alternative method for illustrating multi-dimensional spaces, developed from a unique series that gives structure to integer exponents of power sets. The desired coupling is concisely illustrated in a single figure comprising three cyclic phases that are isomorphic to the three phases of Euclidean rectangular cuboids. The series enables projections between n- and m-dimensional volumes. The associated figure also illustrates how vertical and/or horizontal symmetry breaking emits information, while symmetry absorbs it.


Houston, L. (2023) Projections between Euclidean Volumes with Information Due to Spontaneous Symmetry Breaking. Journal of Applied Mathematics and Physics, 11, 3519-3528. doi: 10.4236/jamp.2023.1111223.

1. Introduction

What would be the value of a simple equation that projects informational structures from one dimension to any other dimension, enabling dimensional reduction or expansion? And what if it were determined that all such reductions could produce 3-dimensional images of rectangular cuboids [1]?

If it were determined that all Euclidean dimensional reductions produced 3-dimensional images of cuboids, this would provide a useful constraint on the equation’s output. It would indicate that any dimensional reduction of an informational structure could result in a three-dimensional cuboid shape, which could simplify the analysis and visualization of the data. In addition, it provides a more viable connection to physics than traditional information theory [2].

However, it is important to note that dimensional reduction and/or projection is a complex problem that has been extensively studied in mathematics and computer science. There are already many existing techniques for dimensional reduction, such as principal component analysis [3] and t-SNE (t-Distributed Stochastic Neighbor Embedding) [4], that are widely used in data analysis and machine learning. It is unlikely that a simple equation could completely replace these methods, but it could potentially offer a novel approach to dimensional reduction/projection with unique advantages or applications.

In this paper, using a unique factorization theorem, we present an alternative way to visualize higher-dimensional space as direct “images” of information. We show that a more analytic description of information is a countable space that contains cuboid volumes extracted from exponential sets. Our paper also presents visual confirmation of John Wheeler’s famous statement, “it from bit”: every physical entity, every “it”, derives from a bit [5].

2. Standard Information Theory

From Shannon’s famous paper on communication [6] , information is defined as follows. Given n possible outcomes for an observation of a system with n randomly occurring states, the average uncertainty prior to observation is defined as the entropy, H [7] .

For equally probable states,

$H=\mathrm{log}\left(\frac{1}{p}\right)$ , (1)

where p = 1/n is the probability of observing a particular state. After the state is observed, the information gained has the magnitude H:

$H\to I$ . (2)

The equation for entropy, here from information theory, is essentially the same as that used in thermodynamics, hence the name.

Numerically, information is measured in bits (short for binary digits). One bit is equivalent to the choice between two equally likely alternatives. For example, if we know that a coin is to be tossed but are unable to see it as it falls, a message telling whether the coin came up heads or tails gives us one bit of information. When there are several equally likely choices, the number of bits is equal to the logarithm of the number of choices, taken to base two. For example, if a message specifies one of sixteen equally likely choices, it is said to contain four bits of information. When the various choices are not equally probable, the situation is more complex [8] [9].
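The bit counts above follow directly from the base-two logarithm; a minimal Python sketch (the variable names are ours):

```python
import math

# Bits of information = log2(number of equally likely choices).
coin_bits = math.log2(2)        # heads or tails
choice_bits = math.log2(16)     # one of sixteen equally likely messages
print(coin_bits, choice_bits)   # 1.0 4.0
```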

3. Information Based on Symmetry Breaking

Symmetry in everyday language refers to a sense of harmonious and beautiful proportion and balance. In mathematics, “symmetry” has a more precise definition and usually refers to an object that is invariant under some transformation, such as translation, reflection, rotation or scaling [10]. We propose that information is fundamentally hidden by symmetry and released by spontaneous symmetry breaking. This proposal is supported by the following basic geometrical figures:

These “volumes” or it-boxes are internally uniform and thus, are internally symmetric. We denote these Euclidean or rectangular volumes with the equation:

$V\left(x,n\right)={x}^{n},\text{\hspace{0.17em}}\left\{n=1,2,3,\cdots \right\}$ (3)

Equation (3) denotes an exponential set, which is uncountably infinite with the cardinality of the real numbers, $ℝ$ . In order to extract information from within (3), one would need to identify single, nearly infinitesimal points within the volumes shown in Figure 1. That would require a tremendous amount of resolution or an extremely large amount of energy.

However, if we were able to partition the volume, V, into a sum of distinct, symmetric, countable volumes, then, since these smaller volumes are countable, the it-boxes become bit-boxes. We illustrate this point in Figure 2.

Any asymmetry between the volumes within a bit-box represents 1 bit of information or the choice between two equally probable states. Note that rather than using a single cube, we will hence, use two concatenated cubes, shown in Figure 3 to construct our bit-boxes. This interprets a single cube as an it-box and two coupled it-boxes as a bit-box.

At this point, we can redefine a quantity of information as something other than “entropy”. While information is traditionally a measure of the “a priori uncertainty” in an observed message, information, in this theory, is defined as the “a posteriori asymmetries” in an observed image, due to vertical planar and/or horizontal planar symmetry breaking.

Figure 1. A 1-D line segment, a 2-D square and a 3-D cube.

Figure 2. Three bit-boxes in 1-D, 2-D, and 3-D spatial volumes.

Figure 3. An illustration of “it from bit”, based on our dimensional projection theory.

This clearly implies that symmetry does not radiate information, while asymmetry does.

We will now prove an important theorem, which is the essence of this paper.

4. Theorem I (Projection Theorem)

If $n,k\in {ℤ}^{+}\cup \left\{0\right\}$ and $m\in {ℤ}^{+}$ ,

Then

$n={\sum }_{k=0}^{m-1}⌊\frac{n+k}{m}⌋$

where $⌊\text{ }⌋$ is the floor function.

Proof:

Let:

$S=⌊\frac{n}{m}⌋+⌊\frac{n+1}{m}⌋+⌊\frac{n+2}{m}⌋+\cdots +⌊\frac{n+m-1}{m}⌋$

Case (1): $n\le m$ :

We can rewrite S as:

$S=⌊1+\frac{n-m}{m}⌋+⌊1+\frac{n-m+1}{m}⌋+⌊1+\frac{n-m+2}{m}⌋+\cdots +⌊1+\frac{n-1}{m}⌋$

If n = m, then S = m. If n = m − 1, then S = m − 1. In general, for n ≤ m, the first m − n terms vanish and the remaining n terms each equal 1, so S = n.

Case (2): n > m:

Let n = mp + R, R < m. Then S can be written as:

$S=⌊1+\frac{mp+R-m}{m}⌋+⌊1+\frac{mp+R-m+1}{m}⌋+\cdots +⌊1+\frac{mp+R-1}{m}⌋$

$S=mp+⌊1+\frac{R-m}{m}⌋+⌊1+\frac{R-m+1}{m}⌋+\cdots +⌊1+\frac{R-1}{m}⌋$

Which reduces to

$S=mp+R$

$S=n={\sum }_{k=0}^{m-1}⌊\frac{n+k}{m}⌋$ $⊡$
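Theorem I is also easy to verify numerically; a minimal Python sketch (the function name `floor_sum` is ours):

```python
def floor_sum(n: int, m: int) -> int:
    """Right-hand side of Theorem I: sum of floor((n+k)/m) for k = 0..m-1."""
    return sum((n + k) // m for k in range(m))

# The identity S = n holds for every n >= 0 and m >= 1.
assert all(floor_sum(n, m) == n for n in range(100) for m in range(1, 12))
print("Theorem I holds for all n < 100, m < 12")
```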

Corollary:

${x}^{n}={x}^{{\sum }_{k=0}^{m-1}⌊\frac{n+k}{m}⌋},\text{\hspace{0.17em}}x\in ℝ$

Or more specifically,

${x}_{m}^{n}={x}^{⌊\frac{n}{m}⌋}\ast {x}^{⌊\frac{n+1}{m}⌋}\ast {x}^{⌊\frac{n+2}{m}⌋}\ast \cdots \ast {x}^{⌊\frac{n+m-1}{m}⌋}$ . (4)

This series projects an n-dimensional “volume” into an m-dimensional “volume”.

Consider the case for m = 3:

${x}_{3}^{n}={x}^{⌊\frac{n}{3}⌋}\ast {x}^{⌊\frac{n+1}{3}⌋}\ast {x}^{⌊\frac{n+2}{3}⌋}$ . (5)

This equation specifically generates the 3 cuboid states: a cube, a long cuboid, and a wide cuboid. In this case, there is no difference between the input dimensions and the output dimensions. We can produce graphic images by treating each factor in Equation (5) as an element of an n-dimensional volume within 3-dimensional space (i.e., length, width, and height).

The unit cubes that comprise our images are called “voxels”. Voxels are essentially 3-D pixels, but instead of being squares, they are perfect cubes of unit volume. In theory, voxels are the ideal modeling technique for replicating reality. After all, our world is made of something akin to voxels (but they are much smaller, and we call them “sub-atomic particles”). With a high enough density (or “resolution”) and the proper rendering techniques, voxels can replicate real-world objects that would be impossible to differentiate from the real thing. Note that the unit cubic cells (i.e., boxes or voxels) shown in Figure 4 are contiguous, while the individual cuboids are spatially separated.

${x}^{n}$ is an exponential set that is uncountably infinite, with the same cardinality as the real numbers, $ℝ$ . This precludes the distinct, real-valued measurements required to acquire or store countable packets of information. However, geometry offers one way around this impediment: precise images of cuboids, because the images consist of countable bit-boxes within separate cuboids that can undergo continuous, invariant translation.
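As a sanity check on Equations (4) and (5), the product of the projected side lengths always recovers the original volume ${x}^{n}$; a minimal Python sketch (the function name `project` is ours):

```python
from math import prod

def project(x, n: int, m: int):
    """Side lengths x^floor((n+k)/m), k = 0..m-1, from Equation (4)."""
    return [x ** ((n + k) // m) for k in range(m)]

sides = project(3, 5, 3)    # a 3^5 volume projected into three dimensions
print(sides)                # [3, 9, 9] -> a wide cuboid
assert prod(sides) == 3 ** 5
```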

5. Cuboid Phase Evolution Using Geometry and Information Theory

Figure 4 shows exact models of cuboids that occur, commonly, in certain natural minerals, such as diamonds [11] . The pictures are also isomorphic to solid (cube), liquid (long cuboid), and gas (wide cuboid) phases of matter [12] . Cuboid phases are also cyclic with distinctive phase transitions, based on geometry and shown in Figure 5.

6. Compatibility with Standard Information Theory

Our initial goal was to make multi-dimensional projections of spatial volumes. That goal is clearly illustrated in Figure 5. By using geometry and abstract algebra, we find that we are able to construct “images” of information, while maintaining compatibility with standard information theory. Thus, we define information as a countable property of space in which the magnitude of information is the volume of a cuboid expressed as a physical image.

Figure 4. Some examples of cuboids (cubes, long cuboids, and wide cuboids) produced by the finite series given in (4). The phases evolve vertically, from top to bottom.

Figure 5. An illustration of the coupling between vertical and horizontal symmetry breaking, multidimensional cuboids and information theory, in which n indicates the dimension of the associated volume and its measure in bits, while ${2}^{n}$ is the volume.

Another way to obtain information from ${x}^{n}$ is by applying a logarithm (the inverse of exponentiation) to both sides of (4):

${\mathrm{log}}_{m}\left({x}_{m}^{n}\right)={\mathrm{log}}_{m}\left({x}^{⌊\frac{n}{m}⌋}\right)+{\mathrm{log}}_{m}\left({x}^{⌊\frac{n+1}{m}⌋}\right)+{\mathrm{log}}_{m}\left({x}^{⌊\frac{n+2}{m}⌋}\right)+\cdots +{\mathrm{log}}_{m}\left({x}^{⌊\frac{n+m-1}{m}⌋}\right)$ . (6)

Which reduces to:

${\mathrm{log}}_{m}\left({x}_{m}^{n}\right)=⌊\frac{n}{m}⌋{\mathrm{log}}_{m}\left(x\right)+⌊\frac{n+1}{m}⌋{\mathrm{log}}_{m}\left(x\right)+\cdots +⌊\frac{n+m-1}{m}⌋{\mathrm{log}}_{m}\left(x\right)$ (7)

And, for x = m, becomes:

$n=⌊\frac{n}{m}⌋+⌊\frac{n+1}{m}⌋+⌊\frac{n+2}{m}⌋+\cdots +⌊\frac{n+m-1}{m}⌋$ . (8)

This result is just Theorem I.

Based on our premise that information is countable volume, Equation (4) is an exact measure of the information within a countable subspace of a given volume $V\left(x,n\right)={x}^{n}$ .

Entropy, H, based on Information Theory, measures the average information within a message. It relies on statistics that critically depends upon individual probabilities, ${p}_{i}$ :

$H=-{\sum }_{i=1}^{n}{p}_{i}{\mathrm{log}}_{2}{p}_{i}$ (9)

In contrast, the information ( ${I}_{n,m}$ ) released or stored due to the vertical and/or horizontal symmetry breaking or symmetry increases shown in (4) and (5) is not stochastic and is defined as:

${I}_{n,m}\equiv {\mathrm{log}}_{m}\left({x}_{m}^{n}\right)$ . (10)

${I}_{n,m}\left(x\right)=⌊\frac{n}{m}⌋{\mathrm{log}}_{m}\left(x\right)+⌊\frac{n+1}{m}⌋{\mathrm{log}}_{m}\left(x\right)+\cdots +⌊\frac{n+m-1}{m}⌋{\mathrm{log}}_{m}\left(x\right)$ (11)

Nonetheless, we can still derive the same results for the scalar measure of information in terms of bits for both methods. The difference is that information theory does not include physical images for information.

Consider binary information, i.e., let x = 2 so that ${\mathrm{log}}_{2}\left(x\right)=1$ . Observe that because x = 2, the information is measured in bits, exactly in accordance with Shannon’s information measure. Here are some examples:

$\begin{array}{l}{I}_{1,2}\left(x\right)=⌊\frac{1}{2}⌋+⌊\frac{2}{2}⌋=1\text{\hspace{0.17em}}\text{bit}\\ {I}_{2,2}\left(x\right)=⌊\frac{2}{2}⌋+⌊\frac{3}{2}⌋=2\text{\hspace{0.17em}}\text{bits}\\ {I}_{3,2}\left(x\right)=⌊\frac{3}{2}⌋+⌊\frac{4}{2}⌋=3\text{\hspace{0.17em}}\text{bits}\\ {I}_{4,2}\left(x\right)=⌊\frac{4}{2}⌋+⌊\frac{5}{2}⌋=4\text{\hspace{0.17em}}\text{bits}\end{array}$ (12)
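The examples in (12) can be reproduced directly from Equation (11); a brief Python sketch (the function name `info_bits` is ours):

```python
import math

def info_bits(x, n: int, m: int) -> float:
    """I_{n,m}(x) from Equation (11): sum of floor((n+k)/m) * log_m(x)."""
    return sum(((n + k) // m) * math.log(x, m) for k in range(m))

# With x = m = 2, log2(x) = 1 and the measure returns exactly n bits.
for n in range(1, 5):
    print(f"I_{n},2 = {info_bits(2, n, 2):.0f} bits")
```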

The computer code needed for dimensional projection has minimal complexity. We illustrate this with a basic program that inputs x, n and m and outputs the length, width, and height of the associated cuboid. Looking at the images shown in Figure 4 and Figure 5, we can confirm, by counting the number of voxels in each cuboid image, that we recover the expected volume equal to (length) × (width) × (height).

7. The Basic Algorithm for Dimensional Projection

100 print "for the volume: x^n, the projection into the volume: x^m"

150 print "input x,n,m";

200 input x,n,m

255 for k = 0 to m-1

350 f = floor((k+n)/m)

390 print x^f

500 next k

600 end

Sample Runs:

>run

for the volume: x^n, the projection into the volume: x^m

input x,n,m? 3,5,3

3

9

9

>run

for the volume: x^n, the projection into the volume: x^m

input x,n,m? 3,52,3

129140163

129140163

387420489

>run

for the volume: x^n, the projection into the volume: x^m

input x,n,m? 3.14159,4,3

3.14159

3.14159

9.869588

>run

for the volume: x^n, the projection into the volume: x^m

input x,n,m? 3,3,3

3

3

3

Observe that each of these runs produced results that are easily normalized, consistent with scale invariance.
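For readers without a BASIC interpreter, the algorithm above can be transcribed into Python; the following is our sketch (the function name is ours), which reproduces the sample runs:

```python
import math

def cuboid_sides(x, n: int, m: int):
    """Transcription of the BASIC projection program: prints the m side
    lengths x^floor((k+n)/m) for k = 0..m-1."""
    sides = []
    for k in range(m):
        f = math.floor((k + n) / m)
        sides.append(x ** f)
    return sides

print(cuboid_sides(3, 52, 3))        # matches the second sample run
print(cuboid_sides(3.14159, 4, 3))   # matches the third sample run
```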

8. Conclusions

We started with a theorem proving that any positive integer can be represented as a non-linear, finite series. The series was used to create exponential sets and to visualize them as 3-D cuboids, the necessary ingredients for the storage or extraction of information based on symmetry, in particular vertical and/or horizontal symmetry. The geometric images produced made the exponential sets countable in the form of Euclidean structures that model rectangular cuboids. The equation or finite series that produces n-dimensional to m-dimensional transformations is given in (4):

${x}_{m}^{n}={x}^{⌊\frac{n}{m}⌋}\ast {x}^{⌊\frac{n+1}{m}⌋}\ast {x}^{⌊\frac{n+2}{m}⌋}\ast \cdots \ast {x}^{⌊\frac{n+m-1}{m}⌋}$ .

We then derived a countable subspace of the exponential-set model by a simple application of a logarithm (base 2) to the exponential sets, and demonstrated that its results are equivalent to those of standard information theory.

There are many promising aspects of this work that can be explored. It impacts nearly all areas of physics and mathematics and proves that information is physical and occurs specifically in units of bits.

The Euclidean cuboid space of information as a countable volume may connect it to a 3-manifold [13], a possible shape of the universe. Future research will review this topological connection. In addition, there is some allowable variance in the projection theorem that needs to be explored.

Nonetheless, the main goal of this paper was met and concisely illustrated in Figure 5. That is, there is a tangible coupling between information theory, geometry and multi-dimensional projection. This all relies on a unique visualization of higher-dimensional space.

Conflicts of Interest

The author declares no conflicts of interest regarding the publication of this paper.

[1] Robertson, S.A. (1984) Polytopes and Symmetry. Cambridge University Press, Cambridge, p. 75.
[2] Gao, X., Gallicchio, E. and Roitberg, A. (2019) The Generalized Boltzmann Distribution Is the Only Distribution in Which the Gibbs-Shannon Entropy Equals the Thermodynamic Entropy. The Journal of Chemical Physics, 151, Article 034113. https://doi.org/10.1063/1.5111333
[3] Jolliffe, I.T. and Cadima, J. (2016) Principal Component Analysis: A Review and Recent Developments. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 374, Article 20150202. https://doi.org/10.1098/rsta.2015.0202
[4] Jamieson, A.R., Giger, M.L., Drukker, K., Lui, H., Yuan, Y. and Bhooshan, N. (2010) Exploring Nonlinear Feature Space Dimension Reduction and Data Representation in Breast CADx with Laplacian Eigenmaps and t-SNE. Medical Physics, 37, 339-351. https://doi.org/10.1118/1.3267037
[5] Barzegar, A., Shafiee, A. and Taqavi, M. (2020) “It from Bit” and Quantum Mechanics. Foundations of Science, 25, 375-384. https://doi.org/10.1007/s10699-019-09644-1
[6] Shannon, C.E. and Weaver, W. (1949) The Mathematical Theory of Communication. University of Illinois Press, Champaign.
[7] Wehrl, A. (1978) General Properties of Entropy. Reviews of Modern Physics, 50, 221-260. https://doi.org/10.1103/RevModPhys.50.221
[8] Jaynes, E.T. (1957) Information Theory and Statistical Mechanics. Physical Review, 106, 620-630. https://doi.org/10.1103/PhysRev.106.620
[9] Borda, M. (2011) Fundamentals in Information Theory and Coding. Springer, Berlin. https://doi.org/10.1007/978-3-642-20347-3
[10] Noether, E. (1918) Invariante Variationsprobleme. Nachrichten von der Gesellschaft der Wissenschaften zu Göttingen, Mathematisch-Physikalische Klasse, 235-257.
[11] Pechnikov, V.A. and Kaminsky, F.V. (2008) Diamond Potential of Metamorphic Rocks in the Kokchetav Massif, Northern Kazakhstan. European Journal of Mineralogy, 20, 395-413. https://doi.org/10.1127/0935-1221/2008/0020-1813
[12] Bridgman, P.W. (1937) The Phase Diagram of Water to 45,000 kg/cm². Journal of Chemical Physics, 5, 964-966. https://doi.org/10.1063/1.1749971
[13] Hempel, J. (1976) 3-Manifolds. Princeton University Press, Princeton.