How the Units That Quantify Both the Gas Constant R and the Boltzmann Constant kB Link the Temperature Dependence of Gas Volume with the Temperature Dependence of Entropy
1. Introduction
G. N. Lewis in his seminal book on chemical thermodynamics published in 1923 had this to say about entropy:
“What is this entropy that Clausius and Gibbs have placed in a position of coordinate importance with energy, but which has proved a bugbear to so many students of thermodynamics.” [1]
It is a bugbear because entropy, unlike mechanical kinetic energy 1/2 mv2, depends upon absolute temperature T, and temperature is not a mechanical variable. The differential equations of Newtonian mechanics remain exact only when frictionally-generated heat causing a temperature increase ΔT has been explicitly ruled out with the action-equals-reaction Law III of motion [2]. Temperature is a new dimension beyond the mass, length, and time dimensions upon which the mechanical world-view is founded; and frictionally-generated heat with its higher temperature introduces inequalities into the equations of Newtonian mechanics.
Thermo-dynamics is the science of a vectored force generated by a temperature increase. The mathematics that models the relationship between temperature and force-activated work is built around a temperature increase causing a volume increase. While the relationship between a temperature increase and a gas volume increase is represented exactly in the gas equation PΔV = nRΔT, the relationship between this scalar volume increase and the portion of this volume increase that can be transformed into vectored weight-lifting work using an engine or turbine is uncertain. Industrially, the conversion rate is always less than 50% [3].
2. A Quick Survey of Relevant Experimental Data
Before getting into a detailed discussion about entropy, we must look first at some specific entropy values in Table 1. The values here are taken from the NBS tables of chemical thermodynamic properties [4]. It is immediately apparent that the entropy variable differs from the other two thermodynamic variables in four significant ways. First, the entropy value is dependent on absolute temperature. Second, the S˚ entropy value is an absolute value, unlike the relative ΔHf˚ and ΔGf˚ values. Third, the entropy value is zero at the absolute zero of temperature. Finally, entropy’s dependence on absolute temperature is smallest for solids, larger for liquids, and larger still for gases.
Before looking at the numbers, we must focus on the symbol and on the units to which the entropy number applies. The superscript (˚) specifies that all values have been normalized to the thermodynamic benchmarks for temperature and pressure in the surroundings; namely T = 298 K (25˚C; 77˚F) and P = 101.3 kPa (1.00 atm). The lack of a Δ symbol signifying an algebraic difference indicates that entropy numbers are all absolute values starting from absolute zero; not relative values benchmarked against a specific homo-atomic molecule as for ΔH and ΔG. The fact there are no negative numbers in the entropy column provides confirmation that entropy values, in J·K−1·mol−1 units are absolute. The value V quantifying volume in the ideal gas equation is also an absolute value.
Table 1. Data selected from Reference 4 showing empirical ΔHf˚ values and ΔGf˚ values. The absolute entropy S˚ values were calculated from these using the thermodynamic relationship ΔG˚ = ΔH˚ − TΔS˚.
| Substance | ΔHf˚ (kJ·mol−1) | ΔGf˚ (kJ·mol−1) | S˚ (J·K−1·mol−1) |
| --- | --- | --- | --- |
| Solids | | | |
| C (s, diamond) | 1.88 | 2.90 | 2.38 |
| B(s) | 0.0 | 0.0 | 5.90 |
| Li(s) | 0.0 | 0.0 | 29.1 |
| LiF(s) | −618 | −585 | 35.7 |
| Liquids | | | |
| H2O(l) | −286 | −237 | 70.0 |
| H2O2(l) | −188 | −120 | 110 |
| CH3OH(l) | −239 | −166 | 127 |
| Gases | | | |
| N2(g) | 0.0 | 0.0 | 192 |
| O2(g) | 0.0 | 0.0 | 205 |
| O3(g) | 143 | 163 | 239 |
It is units that provide the greatest insight into what entropy really is. Entropy (like enthalpy and Gibbs energy and volume) is an extensive variable whose value depends upon the number of molecules comprising the system under study. In Table 1, the mol−1 unit indicates that the system consists of 6.02 × 1023 molecules. When entropy at the microscopic level is quantified using the Boltzmann constant kB = 1.38 × 10−23 J·K−1·molecule−1, the per-molecule unit is not usually specified. This makes the number meaningless, since the value for an extensive variable always depends upon the system’s molecule-count. When people write about entropy without stating both the temperature of the system and the number of molecules in the system they are writing about a concept other than thermodynamic entropy [5].
The joule unit J tells us that the numbers in the entropy column are 1000-times smaller than the kJ numbers in the ΔH and ΔG columns. This indicates that enthalpy plays the major role and entropy plays a minor role in determining the thermodynamic value for the Gibbs energy parameter ΔG = ΔH − TΔS. It is the Gibbs parameter that shows where a reactants-into-products chemical reaction reaches equilibrium.
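As a numerical check on the ΔG˚ = ΔH˚ − TΔS˚ relationship, the Table 1 values for liquid water can be combined into an entropy of formation and then into a Gibbs energy. This is only a sketch: the S˚ value of 130.7 J·K−1·mol−1 for H2(g) is an assumed standard-table value not listed in Table 1.

```python
# Sketch: recomputing dGf for H2(g) + 1/2 O2(g) -> H2O(l) from Table 1 values.
# S_H2 = 130.7 J/(K·mol) is an assumed literature value, not listed in Table 1.
T = 298.15                              # K, benchmark temperature
dHf_H2O = -286e3                        # J/mol, from Table 1
S_H2O, S_H2, S_O2 = 70.0, 130.7, 205.0  # J/(K·mol)

# Entropy of formation: products minus reactants
dSf = S_H2O - S_H2 - 0.5 * S_O2         # J/(K·mol)
dGf = dHf_H2O - T * dSf                 # J/mol

print(round(dSf, 1))                    # -163.2 J/(K·mol)
print(round(dGf / 1e3))                 # -237 kJ/mol, matching Table 1
```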
The per-degree K−1 symbol for entropy tells us that these values quantify not the amount of entropy in one mole of material, but rather the amount by which a one-degree rise in temperature from 298 K to 299 K raises both the volume and the entropy level in one mole of material. To get the actual amount of entropy in one mole of N2(g) gas (Figure 1), one multiplies the per-degree value in Table 1 by 298 K.

Figure 1. At constant pressure, both volume occupied and entropy content for one mole of nitrogen gas increase as temperature increases from minimum values at absolute zero.
3. The Importance of Dimensionality
Long before Carnot, Kelvin, Clausius, Gibbs, and Boltzmann had started to think about thermodynamics, Joseph Fourier (1768-1830) working in Paris at the prestigious École Polytechnique had already introduced the powerful new method of dimensional analysis for maintaining dimensional homogeneity among the various terms in an equation. In the Gibbs energy equation ΔG = ΔH − TΔS, for example, the entropy variable ΔS must be multiplied by the absolute temperature T before it can be subtracted from the ΔH enthalpy term. Otherwise, one would be subtracting apples from oranges!
In 1822 Fourier published The Analytical Theory of Heat (flow) [6]. In this book where he lays out the simple elements of dimensional analysis, Fourier also gives us two profound insights germane to entropy and to thermodynamics.
“Whatever may be the range of mechanical theories, they do not apply to the effects of heat and temperature. These make up a special order of phenomena which cannot be explained by the principles of motion and equilibrium.” (p. 2)
“The thermometer is an instrument containing a substance whose smallest changes of volume can be observed. It serves to measure temperatures by the expansion of a liquid or a gas.” (p. 26)
In 1954, 132 years after Fourier had identified the need, the General Conference on Weights and Measures, the body responsible for international agreement on these issues, added the kelvin (unit symbol K; dimension symbol [Θ]) to the list of linearly-independent SI base units, a list that now contains seven: the kilogram for physical amounts of mass where activity does not depend on molecule-count; the mole for chemical amounts where chemical activity depends on molecule-count (in batches of 6.02 × 1023 regardless of atom-count per molecule) and not on molecule mass; the meter for length; the second for time; the ampere for electrical current; the kelvin for temperature; and the candela for luminous intensity. It is significant that in their many publications on thermodynamics neither Kelvin nor Clausius makes any reference to Fourier’s earlier insight about the dimensionality of temperature.
When the ideal gas equation is written in the form PΔV = nRΔT, the increment ΔV by which a one-degree rise in absolute temperature increases the volume of a gas sample can be calculated precisely using the known value of the gas constant R.
4. Dimensional Duality of the PΔV Term in the Gas Law
The volume increase of a one-mole gas sample heated by ΔT is most easily calculated using the ideal gas law in the form PΔV = nRΔT. Here R is the universal gas constant. The variables in this equation are external pressure P, volume occupied V, molecule-count n (in batches of 6.02 × 1023 regardless of the atom-count per molecule), and absolute temperature T. Tables of the universal constants used in science give the value R = 0.0821 L atm·K−1·mol−1 for the universal gas constant.
However, the table of constants also gives a second value for R; namely R = 8.314 J·K−1·mol−1. This is because the PΔV product in the gas equation has the [m l2·t−2] dimensional signature of energy. Pressure is defined as force-per-unit-area. Force has the dimension [m l·t−2]. Area has the dimension [l2]. Therefore, pressure has the dimension [m l−1·t−2]. Volume has dimension [l3]. Therefore, PΔV has dimension [m l2·t−2], the dimension of energy. The magnitude of the R constant has not changed; only the units by which it is expressed.
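The equivalence of the two tabulated R values can be checked directly, since one litre-atmosphere is 0.001 m3 × 101,325 Pa = 101.325 J. A minimal sketch:

```python
# Check that the two tabulated values of R are the same constant
# expressed in different units: 1 L·atm = 0.001 m^3 * 101325 Pa = 101.325 J.
R_Latm = 0.082057          # L·atm/(K·mol)
J_per_Latm = 101.325       # joules in one litre-atmosphere

R_J = R_Latm * J_per_Latm  # J/(K·mol)
print(round(R_J, 3))       # 8.314
```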
When considering the amount by which a 1-degree rise in temperature T increases volume of a gaseous system, one uses the 0.0821 L atm·K−1·mol−1 value for R. When considering macroscopically the amount by which a 1-degree rise in temperature T increases the entropy of a gaseous system, one uses the 8.314 J·K−1·mol−1 value for R. When considering microscopically the amount by which a 1-degree rise in temperature T increases the entropy of a 1-molecule gaseous system, the gas constant is scaled down by a factor of 6.02 × 1023, giving 1.38 × 10−23 J·K−1·molecule−1 as the microscopic value for R. This is the PΔV energy provided by a 1-kelvin temperature increase when expanding the volume occupied by one molecule against external pressure of 101.3 kPa.
5. The Dimensionality of Entropy Is [m l2·t−2·Θ−1·mol−1]
Entropy measured in SI units of joules per kelvin per mole (J·K−1·mol−1) quantifies what a 1-degree rise in temperature T does to expand the volume occupied by a material system; be it a solid, a liquid, or a gas. Notice that entropy, like energy, is an extensive variable. The amount of material in the system under study must be specified by molecule-count. Entropy is the surrogate for the temperature concept T implicit but not specified in the word “heat”. This temperature variable remains hidden when heat is quantified only in energy units. Absolute temperature measured in kelvin units (unit symbol K; dimensional symbol [Θ]) is another linearly-independent dimension beyond the mass, length, and time dimensions used to identify energy’s dimensionality [m l2·t−2]. Entropy’s dimensionality is therefore [m l2 t−2·Θ−1·mol−1].
6. The Relationship between Volume Occupied and Entropy Content
All materials occupy minimum volume and have minimum entropy at the absolute zero of temperature, 0 K. As temperature rises above 0 K, the per-degree increase in volume (and in entropy) is small for solids and a bit larger for liquids. For gases the per-degree increase in both volume and entropy is very large; it can be up to 100 times larger than the per-degree expansion coefficient for a solid such as diamond. This makes a gas thermometer a more sensitive instrument for measuring temperature than a liquid thermometer. The per-degree expansion coefficient for diamond is 0.00354% per degree, while the same coefficient for air near 298 K is about 0.336% per degree (1/298).
For liquids and solids where molecules are in contiguous contact with each other, held together either by the chemical bonding force or by the weaker van der Waals attractive force, volume occupied is obviously dependent on the diameter of the molecule. The bigger the molecule, the more volume it occupies. But in the gas phase well above a liquid’s boiling temperature, molecules are thermally separated from each other by at least five molecular diameters. In the gas phase all molecules, regardless of diameter, occupy the same volume. This volume is determined by the amplitude of the molecule’s temperature activated agitation; not by the molecule’s diameter. As temperature rises the volume of exclusion for each molecule gets larger because the amplitude of the molecule’s thermal agitation increases. This is why there is a gas equation V = nRT/P with constant R = 0.08206 L atm·K−1·mol−1 telling us that 1 mole of every gas at the thermodynamic benchmark temperature 298 K (25˚C; 77˚F) and 1 atmosphere pressure occupies 24.5 liters. There are no similar equations allowing us to calculate the volume occupied by a liquid or a solid because here volume occupied is dependent on molecule size; and thermal agitation in liquids and solids, although not zero, is constrained by the attractive force holding the molecules in contact.
The R = 0.08206 L atm·K−1·mol−1 constant in the gas equation PV = nRT tells us that a one-mole sample of gas (602,000 billion billion molecules) occupies 24.45 liters at T = 298 K (25˚C) when exposed to an external pressure of one atmosphere (101.3 kPa in SI units). When the temperature is raised by one degree from 298 K to 299 K, increasing the thermal activity within the sample, the increased internal pressure pushes out in all directions against the constant external atmospheric pressure and increases the volume from 24.45 liters to 24.54 liters; an increase of 0.082 liter, or 0.336%. A 1-degree temperature increase at 298 K thus increases both the volume and the entropy content of a gas by 0.336%.
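These numbers follow directly from V = nRT/P; a short sketch (note that the fractional change per kelvin at the 298 K benchmark is 1/298, about 0.336%):

```python
# Volume of one mole of ideal gas at the 298 K benchmark, and the
# fractional change produced by a 1 K rise, using V = nRT/P.
R = 0.082057     # L·atm/(K·mol)
n, P = 1.0, 1.0  # mol, atm

V_298 = n * R * 298 / P       # liters at 298 K
V_299 = n * R * 299 / P       # liters at 299 K
frac = (V_299 - V_298) / V_298

print(round(V_298, 2))        # 24.45
print(round(100 * frac, 3))   # 0.336 (% per kelvin, = 1/298)
```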
7. The Difference between Scalar PΔV “Work” and Vectored Weight-Lifting Work
Every table of universal constants lists two values for the gas constant: R = 0.08206 L atm·K−1·mol−1 and R = 8.314 J·K−1·mol−1. The first value is used when one wants to know how the volume of a gas sample varies with temperature. The second value is used when one wants to know how the entropy of a gas sample varies with temperature. The way in which both volume occupied and entropy content for one mole of nitrogen gas increase in parallel with absolute temperature is shown in Figure 1.
Table 1 shows that when one mole of N2(g) gas at 298 K temperature and 101.3 kPa pressure has its absolute temperature raised by one kelvin, its absolute entropy S˚ rises by 192 J·K−1·mol−1. Since both the absolute temperature scale and the absolute entropy scale start at zero, and since both volume and entropy are linearly dependent on temperature, Figure 1 shows that at 298 K the accumulated entropy in one mole of N2(g) gas is 298 K times 192 J·K−1·mol−1, or 57.2 kJ·mol−1.
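The arithmetic of the paper’s linear model, in which accumulated entropy is the per-kelvin Table 1 value multiplied by absolute temperature, can be sketched as:

```python
# Arithmetic behind the paper's linear entropy model: accumulated entropy
# for one mole of N2(g) as the per-kelvin Table 1 value times absolute T.
S_per_K = 192.0   # J/(K·mol), N2(g) value from Table 1
T = 298.0         # K, benchmark temperature

S_total = S_per_K * T            # J/mol
print(round(S_total / 1e3, 1))   # 57.2 kJ/mol, as stated in the text
```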
We may conclude from the units used in these two values for R that a pressure-times-volume PΔV product expressed in L·atm units in the first case can legitimately be expressed in SI joule units of energy in the second. The pascal (Pa) is the SI unit of pressure, and one atmosphere expressed in SI units is 101.3 kPa. The cubic meter (1000 L) is the SI unit of volume. When the volume and pressure variables in the gas equation are expressed in these SI units, the result is R = 8.314 J·K−1·mol−1; a value telling us the amount by which a 1-degree rise in temperature increases the expansive energy of one mole of gas against the external pressure containing the sample under study.
This is the same PΔV expansive energy that is harnessed commercially when transforming heat into weight-lifting work using an engine or turbine. While this expansive energy has frequently been called “work” in the literature of thermodynamics, the designation is misleading. Pressure (dimensionality [m·l−1·t−2]) is defined as force-per-unit-area and volume has dimensionality [l3]. While the pressure-volume product has the dimensionality of weight-lifting work, it does not have the required directionality. PΔV “work” is equally divided among the three orthogonal coordinates of space and is a scalar quantity; real weight-lifting work is oriented in the vertical direction against the force of gravity and is a vectored quantity. Only one-third of a PΔV unit of scalar expansive energy can be converted, using an engine or turbine, into vectored mgΔh weight-lifting work.
The piston-and-cylinder geometry of an engine is such that only the one-third of expansive energy that is oriented in the piston’s vectored direction does real work. The other two-thirds pass out of the cylinder as heat whose elevated temperature raises the entropy of the surroundings. In the thermodynamic literature this heat component is frequently called “unavailable energy”.
8. Scaling the R Constant from the Per-Mole Level to the Per-Molecule Level
Proof that the R constant in the ideal gas law measures not just the volume occupied by a particular gas sample, but also the amount of entropy in that same sample, comes from scaling the system size down from the macroscopic realm of mole units to the microscopic realm of molecule units. During this scaling process, everything remains invariant except the size of the system.
There are 602,000 billion billion atomic mass units (amu) in 1.00 gram unit of mass. Since the mole unit is a chemical’s molecular weight in amu expressed in gram units, there are 6.02 × 1023 molecules in one mole unit of chemical amount. When R = 8.314 J·K−1·mol−1, now expressed in energy units rather than in pressure and volume units, is scaled down from the macroscopic per-mole level to the microscopic per-molecule level using the 6.02 × 1023 scaling factor, one obtains R = 1.38 × 10−23 J·K−1·molecule−1. This very small number is the fraction of a joule by which a 1-degree rise in temperature raises the thermal expansive energy and hence the expanded volume occupied by a one molecule system. But this 1.38 × 10−23 J·K−1·molecule−1 constant has, for over a century, been known by another name and symbol; namely the Boltzmann constant symbolized by kB. This constant kB = 1.38 × 10−23 J·K−1·molecule−1 is the quantum of entropy introduced by Boltzmann to correlate entropy at the microscopic level in per-molecule amounts with the Clausius concept of entropy at the macroscopic level quantified in mole units. Thus, the Boltzmann constant is the microscopic counterpart of the macroscopic gas constant R.
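The scaling step described above is a single division by the Avogadro number; a minimal sketch:

```python
# Scaling the gas constant from per-mole to per-molecule units by
# dividing through by the Avogadro number.
R = 8.314            # J/(K·mol), gas constant in energy units
N_A = 6.022e23       # molecules per mole

k_B = R / N_A        # J/(K·molecule)
print(f"{k_B:.3e}")  # 1.381e-23, the Boltzmann constant
```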
However, when one goes to a table of universal constants today to find the value for the Boltzmann constant, one finds kB = 1.38 × 10−23 J·K−1, with the size of the sample to which the value applies left unspecified. How can this be? Entropy, like enthalpy and Gibbs energy, is an extensive variable. Without the sample size specified, as it is in Table 1, the number has no meaning. The value for an extensive variable like mass or volume or energy is directly proportional to sample size; the value for an intensive variable like density or temperature does not depend on sample size, and entropy is not an intensive variable. This failure to specify sample size for the Boltzmann constant, when it is now known to be per-molecule, is another reason students find the entropy concept confusing.
Since the kB constant tells us the amount of entropy in one gas molecule at one particular temperature, then the R constant tells us the amount of entropy in one mole of an ideal gas at the same temperature. Conversely, since the R constant tells us the volume occupied by one mole of an ideal gas, then the kB constant quantifies for us the excluded volume occupied by one gas molecule. When these molecule increments of excluded volume are scaled up by the 6.02 × 1023 factor, this gives us the volume of 24.5 liters encoded in the gas constant R. This also illustrates why volume occupied and entropy content are both linearly (not logarithmically) dependent on absolute temperature, as shown in Figure 1. The “random disorder” interpretation frequently used to describe entropy fits well with this picture. As the volume of a molecular system is thermally increased, this gives the molecules more space in which to become more disordered; thereby increasing their entropy.
In the late 1800s, before the actual existence of molecules had been verified experimentally, James Clerk Maxwell (1831-1879) in Cambridge and Ludwig Boltzmann (1844-1906) in Vienna were each trying to model entropy at the microscopic level in statistical terms; that is, in pure, very large probability numbers devoid of molecule units. This new model, known today as “statistical mechanics”, defined macroscopic entropy S as proportional to the natural logarithm of W, the number of distinct microscopic states available to the macroscopic system. But the W variable here is a pure number. There are no units specifying the size of the sample to which this “entropy” value applies.
Today we know that the number of “distinct microscopic states” available to the macroscopic system is 602,000 billion billion; the number of molecules in one mole. And today we know that the dependence of entropy upon temperature is linear, not logarithmic. Today the ratio R/kB has the value 6.02 × 1023 molecules-per-mole. This value is not a dimensionless number. It is the number of molecules in one mole, regardless of whether the system is a solid, a liquid, or a gas. And the molecule is just as much a unit of system amount as is the mole.
In the period 1893-1900 when Boltzmann in Vienna was doing his most important work, many leading physicists and applied mathematicians were anti-atomists; people highly skeptical about the real existence of microscopic particles regardless of whether they were called atoms or molecules. Until 1905 when he became convinced by the evidence from Brownian motion, even Einstein had been skeptical about the existence of molecules. During his final years Boltzmann was constantly under attack from the logical positivists (i.e. anti-atomists) of the Vienna Circle who opposed any atomistic interpretation of microscopic phenomena; entropy in particular. It was for this reason that Boltzmann, who used the terms “atom” and “molecule” interchangeably, tried to fudge the issue by attaching neither to his value for the Boltzmann constant when one or the other is clearly required because entropy is an extensive variable. Boltzmann suffered from depression, and these attacks from prominent scientists eventually led to him taking his own life in 1906.
9. A Bit of History about Entropy and Thermodynamics
The fundamental equations of thermodynamics and of entropy were laid down by the young William Thomson (1824-1907; later Lord Kelvin) in Scotland and by Rudolf Clausius (1822-1888) in Germany. Both men were trained as mathematicians and were very adept mathematically. They had established their careers working with the exact differential equations of mechanics arising out of Newton’s three laws of motion. Neither man had much (if any) familiarity with the thermo-chemistry of combustion reactions, the source of the elevated temperature in which mechanical work originates. Nor were they familiar with the mole-molecule relationship for measuring system amount.
In 1849 [7] Thomson introduced the new word “thermo-dynamics” (Greek for a temperature-generated force) to label the new pan-scientific discipline in which both men were heavily engaged. Between 1850 and 1865 Clausius published nine papers on this new topic, in the last of which the term “entropy” appears for the first time.1
Throughout this 16-year period of research, both men were engaged in acrimonious competition for priority in “discovering” the laws of this new discipline. Because the word “thermodynamics” had been introduced by Thomson, Clausius absolutely refused to use this new term. In his nine papers on the subject, Clausius always used the term “mechanical theory of heat” where Thomson used the term “thermodynamics”; oblivious of the fact that temperature is not a mechanical variable. Because Clausius had introduced the term “entropy”, Thomson seldom if ever used it; preferring to use the term “diffusion” instead. Both men were so wedded to the mechanical world-view that neither was prepared to acknowledge Fourier’s earlier insight that temperature is not a mechanical dimension; and that thermodynamics is not a mechanical science. Given this level of ambiguity and conflict over key terms at the very beginning, it is little wonder that students today have difficulty understanding thermodynamics.
Both Thomson (later Lord Kelvin) and Clausius tried, without success, to portray thermodynamics as just a logical extension of mechanics by introducing the problematic reversibility concept into their differential equations. All the equations of classical mechanics do indeed remain invariant when the time variable is reversed, provided there are no frictional losses producing an entropy increase. But chemically-generated heat always flows naturally from high to low down the temperature axis; even under time reversal. If none of it is converted into weight-lifting work using an engine or turbine, all the heat goes to increasing (very slightly) the temperature, the volume, and the entropy of the atmospheric surroundings. If some of it gets temporarily converted into weight-lifting work, even that mechanical energy eventually gets degraded by friction into heat that further raises the entropy of the surroundings. So wedded were they to the mechanical world-view that neither man was prepared to admit that, because it is built around the temperature variable, thermodynamics (like chemistry) is not a mechanical science. Neither Kelvin nor Clausius ever made reference to Fourier’s book, published before either of them was born, showing by dimensional analysis that temperature is not a mechanical variable.
Regardless of the direction in which we choose to view the time dimension, heat always flows from a hot system into cooler surroundings down the temperature axis. During this passage, it raises both the volume and the entropy of the atmospheric surroundings. This confirms the first two thermodynamic laws. As long as a heat increment descending the temperature axis keeps its temperature component hidden by being quantified in energy units only, the energy component of heat can be viewed as being conserved. The energy part of the heat increment does not change. But the temperature component of this heat is not conserved. It is the hidden temperature component of the heat unit that activates all thermodynamic (and mechanical) change, and it must necessarily be quantified in the entropy variable. Since the temperature component of heat always flows from a heat-generating system into the lower-temperature surroundings, the Clausius statement of the second thermodynamic law was deeply insightful: “The entropy of the World always increases”. It never decreases because heat never flows of its own accord from a low-temperature system into a higher-temperature one.
10. Conclusions
This study shows that, in the gas equation PΔV = nRΔT, both volume-increase and entropy-increase are quantified in the same J·K−1 units. The gas constant R = 8.314 J·K−1·mol−1 and the Boltzmann constant kB = 1.38 × 10−23 J·K−1·molecule−1, smaller by a factor of 6.02 × 1023, both measure the same thing; one macroscopically and the other microscopically. At the per-mole macroscopic level, the R constant measures the energy increment PΔV needed to effect a one-kelvin volume expansion ΔV for one mole of gas against ambient atmospheric pressure of 101.3 kPa. At the per-molecule microscopic level, the kB constant measures the energy increment PΔV needed to effect a one-kelvin volume expansion ΔV for one gas molecule against ambient atmospheric pressure of 101.3 kPa.
Dictionaries and science textbooks define “entropy” as the degree of randomized molecular disorder in a gaseous system created by a rise in temperature. Since there are no units with which to quantify “randomized molecular disorder”, scientists were forced to use statistical methods to quantify entropy. This study shows how an entropy increase is linked to a volume increase ΔV, an increase which can be measured empirically. Since a volume increase provides molecules with a larger number of micro-places in which to relocate, this new method for quantifying entropy is consistent with the dictionary definition.
NOTES
1Clausius, R. On Several Convenient Forms of the Fundamental Equations of the Mechanical Theory of Heat, Pogg. Ann. 1865; reprinted (and translated) in The Mechanical Theory of Heat, Van Voorst, London, 1867, p. 357.