Article

Anatomy of a Spin: The Information-Theoretic Structure of Classical Spin Systems

Vikram S. Vijayaraghavan, Ryan G. James and James P. Crutchfield
Complexity Sciences Center and Physics Department, University of California at Davis, One Shields Avenue, Davis, CA 95616, USA
* Author to whom correspondence should be addressed.
Entropy 2017, 19(5), 214; https://doi.org/10.3390/e19050214
Submission received: 22 January 2017 / Revised: 22 March 2017 / Accepted: 3 May 2017 / Published: 8 May 2017
(This article belongs to the Section Statistical Physics)

Abstract

Collective organization in matter plays a significant role in its expressed physical properties. Typically, it is detected via an order parameter, appropriately defined for each given system’s observed emergent patterns. Recent developments in information theory, however, suggest quantifying collective organization in a system- and phenomenon-agnostic way: decomposing the system’s thermodynamic entropy density into a localized entropy, solely contained in the dynamics at a single location, and a bound entropy, stored in space as domains, clusters, excitations, or other emergent structures. As a concrete demonstration, we compute this decomposition and related quantities explicitly for the nearest-neighbor Ising model on the 1D chain, on the Bethe lattice with coordination number k = 3, and on the 2D square lattice, illustrating its generality and the functional insights it gives near and away from phase transitions. In particular, we consider the roles that different spin motifs play (in cluster bulk, cluster edges, and the like) and how these affect the dependencies between spins.

1. Introduction

Collective behavior underlies a vast array of fascinating phenomena, many of critical importance to contemporary science and technology. Unusual material properties—such as superconductivity, metal-insulator transitions, and heavy fermions—have been attributed to collective behavior arising from the compounded interaction of system components [1]. Collective behavior is by no means limited to complex materials, however: the behavior of financial markets [2], epileptic seizures [3] and consciousness [4], and animal flocking behavior [5] are all also seen as examples. We now appreciate that it appears most prominently via phase transitions ([6] and references therein).
Operationally, collective behavior is detected and quantified using correlations or, more recently, by mutual information [7], estimated either locally via pairwise component interactions or globally. Exploring spin systems familiar from statistical mechanics, here we argue that these diagnostics can be substantially refined to become more incisive tools for quantifying collective behavior. This is part of the larger endeavor of discovering ways to automatically detect collective behavior and the emergence of organization [8].
Along these lines, much effort has been invested to explore the information-theoretic properties of the Ising model of statistical mechanics. Both Shaw [9] and Arnold [10] studied information in 1D spin-strips within the 2D Ising model. Feldman and Crutchfield [11,12,13] explored several generalizations of the excess entropy, a well known mutual information measure for time series, in 1D and 2D Ising models as a function of coupling strength, showing that they were sensitive to spin patterns and can be used as a generalized order parameter. In one of the more thoroughgoing studies to date, Lau and Grassberger [14] computed the excess entropy using adjacent rings of spins to probe criticality in the 2D cylindrical Ising model. Barnett et al. [6] tracked information flows in a kinetic Ising system using both the pairwise and global mutual information (generalized as the total correlation [15], it appears below) and the transfer entropy [16]. Abdallah and Plumbley [17] employed the dual total correlation [18,19] on small, random spin glasses. Despite their successful application, the measures used to monitor collective phenomena in these studies were motivated information theoretically rather than physically, and so their thermodynamic relevance and structural interpretation have remained unclear.
To remedy this, we decompose the thermodynamic entropy density for spin systems into a set of information measures, some already familiar in information theory [20], complex systems [21], and elsewhere [17]. We seek to illustrate how these different measures from information theory may be applied to spin systems. For example, one measure that falls out naturally—the bound entropy—is that part of the thermodynamic entropy accounting for entropy shared between spins. This is in contrast to monitoring collective behavior as the difference between the thermodynamic entropy and the entropy of a hypothetical distribution over uncorrelated spins—the total correlation mentioned above. In this way, the entropy-density decomposition provides a physical basis for informational measures of collective behavior, in particular showing how the measures lead to interpretations of emergent structure in system configurations. Paralleling the work in [22,23] with respect to decomposition of an information bit and stochastic equilibria, “anatomy” refers to dissecting something typically taken as a unitary whole—here, spin thermodynamic entropy density—to reveal its inner components.
Collective behavior can manifest itself in many ways. Perhaps the most obvious is synchronization [24]: schooling fish, flocking birds, and pulsing fireflies. It is these sorts of behaviors that maximize the total correlation. There are, however, other forms of collective behavior. Consider an error-correcting parity bit, where a set of n random bits is augmented with an (n+1)th bit that is the sum mod 2 of the others. In this case, any proper subset of the n+1 bits consists of bits that are statistically independent, but any n of them can uniquely identify the last. It is this sort of collective behavior that maximizes the dual total correlation.
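To make the contrast concrete, here is a minimal Python sketch of our own (not code from the paper) that enumerates the joint distribution of n independent fair bits plus their parity bit and evaluates both measures directly from their definitions; it shows the total correlation pinned at 1 bit while the dual total correlation grows as n.

```python
import itertools
from math import log2

def entropy(dist):
    """Shannon entropy (bits) of a {outcome: probability} dictionary."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(dist, keep):
    """Marginal distribution over the coordinates listed in `keep`."""
    out = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

n = 4                                            # number of independent fair bits
joint = {}
for bits in itertools.product((0, 1), repeat=n):
    word = bits + (sum(bits) % 2,)               # append the parity bit
    joint[word] = 1.0 / 2 ** n                   # each of the 2^n data words is equally likely

N = n + 1
H_joint = entropy(joint)
H_singles = [entropy(marginal(joint, [i])) for i in range(N)]
# H[X_i | X_rest] = H[X] - H[X_rest]
H_cond = [H_joint - entropy(marginal(joint, [j for j in range(N) if j != i])) for i in range(N)]

T = sum(H_singles) - H_joint                     # total correlation
R = sum(H_cond)                                  # residual entropy
B = H_joint - R                                  # dual total correlation

print(f"total correlation T = {T:.3f} bits, dual total correlation B = {B:.3f} bits")
# T stays at 1 bit for any n, while B = n grows with the number of bits
```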
To make this argument and illustrate its consequences, Section 2 first defines notation and lays the groundwork for our thermodynamic entropy-density decomposition. Section 3 then derives the decomposition. Though the decomposition is given for regular spin lattices, in principle it is meaningful for any system with well-defined thermodynamic entropy. Section 4 outlines the computational methods for estimating the resulting quantities, using them to interpret emergent organization in the nearest-neighbor ferromagnetic Ising model in one dimension (Section 4.1), in the Bethe lattice with coordination number k = 3 (Section 4.2), and in the two-dimensional square Ising lattice (Section 4.3). Having established the different types of entropy in spin systems and connections to their underlying physical structures, we conclude by suggesting applications and future directions.

2. Spin Entropies

We write $\sigma$ to denote a configuration of spins on a lattice $\mathcal{L}$, $\sigma_i$ for the particular spin state at lattice site $i \in \mathcal{L}$, and $\sigma_{\setminus i}$ for the collection of all spin states in a configuration excluding $\sigma_i$ at site $i$. The random variable over all possible spin configurations is likewise denoted $\sigma$, a particular spin variable is $\sigma_i$, and all spin variables but the one at site $i$ are $\sigma_{\setminus i}$; as a shorthand, sums over $\sigma$ are understood to range over the event space of the global configuration random variable. We study the ferromagnetic spin-$1/2$ Ising model with nearest-neighbor interactions in the thermodynamic limit, whose Hamiltonian is given by:
$$H(\sigma) = -J \sum_{\langle i,j \rangle} \sigma_i \sigma_j \,,$$
where $\langle i,j \rangle$ denotes all pairs $(i,j)$ such that the sites $i$ and $j$ are directly connected in $\mathcal{L}$, and the interaction strength $J$ is positive. We assume the system is in thermal equilibrium, so the probability of configuration $\sigma$ occurring is given by the Boltzmann distribution:
$$p(\sigma) = \frac{1}{Z} e^{-H(\sigma)/k_B T} \,,$$
where Z is the partition function.
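As a toy illustration of this setup (our own sketch, not the authors' code, and feasible only for very small lattices), the following enumerates every configuration of a short periodic Ising ring and assigns each its Boltzmann weight under the Hamiltonian of Equation (1); larger lattices require the Monte Carlo methods discussed in Section 4.

```python
import itertools
from math import exp

def ising_ring_distribution(N=8, J=1.0, T=2.0, kB=1.0):
    """Exact Boltzmann distribution over all 2^N configurations of an
    N-spin nearest-neighbor Ising ring (periodic boundary conditions)."""
    beta = 1.0 / (kB * T)
    weights = {}
    for config in itertools.product((-1, +1), repeat=N):
        # Hamiltonian of Equation (1): H(sigma) = -J * sum of nearest-neighbor products
        energy = -J * sum(config[i] * config[(i + 1) % N] for i in range(N))
        weights[config] = exp(-beta * energy)
    Z = sum(weights.values())                    # partition function
    return {config: w / Z for config, w in weights.items()}

dist = ising_ring_distribution()
print(f"{sum(dist.values()):.6f}")               # probabilities sum to 1
print(max(dist.values()), min(dist.values()))    # aligned configurations are most probable
```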
The Boltzmann entropy of a statistical mechanical system assumes the constituent degrees of freedom (here spins) are uncorrelated. To determine it, we use the isolated-spin entropy:
$$H[\sigma_0] = -p_{\uparrow} \log_2 p_{\uparrow} - p_{\downarrow} \log_2 p_{\downarrow} \,,$$
where $p_{\uparrow} = (1+m)/2$ is the probability of a spin being up, $p_{\downarrow} = (1-m)/2$ is the probability of a spin being down, and $m = (\#\!\uparrow - \#\!\downarrow)/|\mathcal{L}|$ is the average magnetization in a configuration. The site index $0$ was chosen arbitrarily and represents any single spin in the lattice. The system’s Boltzmann entropy is then the extensive quantity $H_B = |\mathcal{L}| \cdot H[\sigma_0]$.
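A direct translation of these definitions into a short helper (our own sketch; the magnetization m would come from the exact distribution above or from simulation):

```python
from math import log2

def isolated_spin_entropy(m):
    """Isolated-spin entropy H[sigma_0] in bits, given the average magnetization m."""
    p_up, p_down = (1 + m) / 2, (1 - m) / 2
    return -sum(p * log2(p) for p in (p_up, p_down) if p > 0)

def boltzmann_entropy(m, lattice_size):
    """Extensive Boltzmann entropy H_B = |L| * H[sigma_0], in bits."""
    return lattice_size * isolated_spin_entropy(m)

print(isolated_spin_entropy(0.0))   # 1.0 bit per spin when unmagnetized
print(isolated_spin_entropy(0.9))   # ~0.29 bits per spin when strongly magnetized
```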
As Jaynes [25] emphasizes, though correct for an ideal gas, $H_B$ is not the empirically correct system entropy if a system develops internal correlations. In our case, if spins are correlated across the lattice, one must consider entire configurations, leading to the Gibbs entropy:
$$H[\sigma] = -\sum_{\sigma} p(\sigma) \log_2 p(\sigma)$$
and the thermodynamic entropy density h, the entropy per spin when the entire lattice is considered:
$$h = \frac{H[\sigma]}{|\mathcal{L}|} \,.$$
Note that we dropped the factor $k_B$ and used the base-2 logarithm. Our use of the letter $h$, as opposed to $s$, is meant to reflect this multiplicative constant difference. It is easy to see that $H[\sigma] \leq H_B$. Thus, the Boltzmann entropy is typically an overestimate of the true thermodynamic entropy.
More to the point, in the thermodynamic limit the Boltzmann entropy $H_B$ and Gibbs entropy $H[\sigma]$ differ substantially. As Jaynes shows [25], the difference directly measures the effect of internal correlations on total energy and other thermodynamic state variables. Moreover, the difference does not vanish in the thermodynamic limit; rather, it increases proportionally to the system size $|\mathcal{L}|$.
Before leaving the differences between entropy definitions, it is important to note that Boltzmann’s more familiar definition—$S = k_B \ln W$—via the number $W$ of microstates associated with a given thermodynamic macrostate is consistent with Gibbs’ definition [25]. This follows from Shannon’s deep insight, the asymptotic equipartition property, that $2^{h|\mathcal{L}|}$ measures the volume of typical microstates: the set of almost-equiprobable configurations that are simultaneously most numerous and capture the bulk of the probability—those that are typically realized (see Section 21 of [26], Chapter 3 of [7], and [27,28]). Thus, Boltzmann’s $W$ should be interpreted as the size (phase space volume) of the typical set associated with a particular thermodynamic macrostate. Given this agreement, we focus on Gibbs’ approach. Though, as we now note, the contrast between Gibbs’ entropy $H[\sigma]$ and Boltzmann’s $H_B$ leads directly to our larger goal of interpretable decompositions of the thermodynamic entropy density.

3. Decomposing a Spin’s Thermodynamic Entropy

Since we are particularly interested here in monitoring the appearance of internal correlations, the difference between the Boltzmann and Gibbs entropies suggests itself as our first measure of a system’s internal organization. If each spin were independent, $H[\sigma_0] = h$. For example, this is true at $T = \infty$ for the Ising model and for site percolation models [29]. If there are correlations between spins, however, then $h < H[\sigma_0]$, as just noted for the extensive quantities. Their difference is the total correlation density [22]:
$$\rho = H[\sigma_0] - h = \frac{T[\sigma]}{|\mathcal{L}|} \,.$$
Here, $T[\sigma] = D\!\left[\, p(\sigma) \,\middle\|\, \prod_{i=1}^{|\mathcal{L}|} p(\sigma_i) \right]$ is the Kullback–Leibler divergence $D[\,\cdot\,\|\,\cdot\,]$ [7] between the distribution over entire configurations and the product of isolated-spin marginal distributions. Now called the total correlation [15], this measure of internal correlation was wholly anticipated by Gibbs, as Jaynes notes (Equation (4) of [25]). Since $\rho$ vanishes only when each spin is independent, it can be considered a measure of pattern or structure in a spin system. More specifically, it measures how constrained the spin distribution is and so is maximized when the spins are strongly correlated and act in concert. While this is certainly reasonable as an operational definition, our next step decomposes both $h$ and $\rho$ into more nuanced and incisive quantities.
Continuing, we recast the Gibbs thermodynamic entropy $H[\sigma]$ as the sum of two nonnegative terms. Starting from the configuration entropy in its standard statistical form, Equation (3), we manipulate it into two terms:
$$H[\sigma] = -\sum_{\sigma} p(\sigma) \log_2 \left[ \frac{p(\sigma)}{\prod_{i=1}^{|\mathcal{L}|} p(\sigma_i \mid \sigma_{\setminus i})} \prod_{i=1}^{|\mathcal{L}|} p(\sigma_i \mid \sigma_{\setminus i}) \right] = R[\sigma] + B[\sigma] \,,$$
where $R[\sigma]$ and $B[\sigma]$, known as the residual entropy and the dual total correlation [17,20,22], are the following measures:
$$R[\sigma] = -\sum_{i=1}^{|\mathcal{L}|} \sum_{\sigma} p(\sigma) \log_2 p(\sigma_i \mid \sigma_{\setminus i}) = \sum_{i=1}^{|\mathcal{L}|} H[\sigma_i \mid \sigma_{\setminus i}] \,,$$
and,
$$B[\sigma] = -\sum_{\sigma} p(\sigma) \log_2 \frac{p(\sigma)}{\prod_{i=1}^{|\mathcal{L}|} p(\sigma_i \mid \sigma_{\setminus i})} = H[\sigma] - \sum_{i=1}^{|\mathcal{L}|} H[\sigma_i \mid \sigma_{\setminus i}] \,.$$
Note that both $R[\sigma]$ and $B[\sigma]$ are nonnegative and bounded from above by $H[\sigma]$. The dual total correlation is also zero if and only if the spins are mutually independent, and so it too is a measure of collective behavior. In this case, however, it measures the actual information content of the spin distribution and is maximized when subsets of spins are independent, but collectively can determine one another. Spatial densities are denoted in lower case: $h = H[\sigma]/|\mathcal{L}|$, $r = R[\sigma]/|\mathcal{L}|$, and $b = B[\sigma]/|\mathcal{L}|$. We consider their thermodynamic limit, where $|\mathcal{L}| \to \infty$.
Similarly, we can begin with the total correlation of spins $T[\sigma]$ and break it into two terms:
$$T[\sigma] = \sum_{\sigma} p(\sigma) \log_2 \frac{p(\sigma)}{\prod_{i=1}^{|\mathcal{L}|} p(\sigma_i)} = B[\sigma] + Q[\sigma] \,,$$
where $B[\sigma]$ is as above. Though perhaps not clear here, $B[\sigma]$ is naturally a component of $T[\sigma]$; see [22], Figure 7c,d. $Q[\sigma]$ is the enigmatic information [22]:
$$Q[\sigma] = \sum_{\sigma} p(\sigma) \log_2 \frac{p(\sigma)^2}{\prod_{i=1}^{|\mathcal{L}|} p(\sigma_i) \prod_{j=1}^{|\mathcal{L}|} p(\sigma_j \mid \sigma_{\setminus j})} = T[\sigma] - B[\sigma] \,,$$
and again, we consider the spatial density $q = Q[\sigma]/|\mathcal{L}|$ in the thermodynamic limit.
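To see the whole decomposition in one place, the following sketch (again our own toy illustration, not the authors' production code) computes the densities h, r, b, ρ, and q exactly for a small Ising ring by brute-force enumeration; the identities h = r + b and ρ = b + q can then be checked numerically. Finite-size values differ from the thermodynamic-limit curves reported below.

```python
import itertools
from math import exp, log2

def entropy(dist):
    """Shannon entropy (bits) of a {outcome: probability} dictionary."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(dist, keep):
    """Marginal distribution over the sites listed in `keep`."""
    out = {}
    for config, p in dist.items():
        key = tuple(config[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

def ising_ring_distribution(N, J=1.0, T=2.0, kB=1.0):
    """Exact Boltzmann distribution for an N-spin nearest-neighbor Ising ring
    (same helper as in the earlier sketch; exp(-beta*H) with H = -J * sum of bonds)."""
    beta = 1.0 / (kB * T)
    w = {c: exp(beta * J * sum(c[i] * c[(i + 1) % N] for i in range(N)))
         for c in itertools.product((-1, +1), repeat=N)}
    Z = sum(w.values())
    return {c: v / Z for c, v in w.items()}

def decompose(dist, N):
    H = entropy(dist)                                             # Gibbs entropy of the ring
    H_single = [entropy(marginal(dist, [i])) for i in range(N)]   # isolated-spin entropies
    # H[sigma_i | sigma_rest] = H[sigma] - H[sigma_rest]
    H_cond = [H - entropy(marginal(dist, [j for j in range(N) if j != i]))
              for i in range(N)]
    h = H / N                                                     # thermodynamic entropy density
    r = sum(H_cond) / N                                           # localized (residual) entropy density
    b = h - r                                                     # bound entropy density
    rho = (sum(H_single) - H) / N                                 # total correlation density
    q = rho - b                                                   # enigmatic information density
    return h, r, b, rho, q

h, r, b, rho, q = decompose(ising_ring_distribution(N=8, T=2.0), N=8)
print(f"h={h:.4f}  r={r:.4f}  b={b:.4f}  rho={rho:.4f}  q={q:.4f}")
```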
It can be difficult at first to intuit the difference between $T[\sigma]$ and $B[\sigma]$. The key is that they both measure dependencies among spins, but against different reference ensembles. $T[\sigma]$ is the difference between the Boltzmann and Gibbs entropies (Equation (5)) and, therefore, quantifies the spin distribution’s deviation from a hypothetical distribution in which each spin is independent of the others. Since $R[\sigma]$ is the amount of actual independent entropy in the spin distribution, $B[\sigma]$ quantifies the amount of spin–spin dependency in configurations not in reference to a hypothetical independent-spin distribution, but rather to the actually realized amount of independence. This difference is pictorially represented in Figure 1.
Let us now interpret the physics captured by these entropy-density components. First, $H[\sigma_0]$ quantifies the entropy per spin ignoring any dependencies (e.g., correlations) between spins. The thermodynamic entropy density $h$, however, is the entropy per spin including dependencies. Continuing in this way, from Equation (7) $r$ is the entropy per spin remaining after these dependencies have been factored out, meaning it is the average amount of independent entropy—the entropy per spin when we know the state of the other spins in the lattice. With this logic, $b$ is that portion of the thermodynamic entropy density coming from dependencies and is, therefore, the average amount of dependent entropy per spin. As we shall see, in comparison to $\rho$, $b$ provides a complementary, quantitatively different, and more physically grounded approach to quantifying dependencies among spins.
This completes, in effect, our decomposition of the isolated-spin entropy $H[\sigma_0]$, shown schematically in Figure 2. To take stock, let us back up a bit. Recall that $\rho$ is the difference between $H[\sigma_0]$, which ignores the dependencies between the spins, and the thermodynamic entropy density $h$. That is, $\rho$ is the difference between the isolated-spin entropy and how much entropy ($h$) there actually is. $b$ is the difference between the thermodynamic entropy $h$ and the amount of independent entropy $r$ there is. In addition, $b$ is also part of $\rho$ [22], leaving $q$, which therefore quantifies potential dependencies (explained shortly) not present in the thermodynamic entropy density $h$. To mathematically reframe these results, we showed that the Boltzmann thermodynamic entropy naturally decomposes into a complete set of irreducible atoms in the algebra of information measures [30].
Let us explore the consequences of this decomposition. Although r and b are components of the thermodynamic entropy density h—itself a measure of the irreducible per-spin randomness—individually they have very different meanings that give insight into the interplay of spin thermalization and spin ordering. The first, r, is the average amount of unshared entropy, quantifying randomness remaining in a single spin given the exact configuration of the rest of the lattice. For this reason we refer to it as the localized entropy. In the limit of high and low temperature, r’s behavior is intuitive. At low temperature, spins in the lattice are entirely aligned, and so there is no randomness in individual sites given the global alignment. At high temperatures, each spin is independent, and so there is full randomness in a site, even knowing its surroundings.
The other quantity, $b$, is the entropy remaining in a spin when the localized entropy is deducted, and so it captures the thermodynamic entropy shared between a spin and the rest of the lattice. We refer to it as the bound entropy. Its limits are similarly intuitive. At low temperatures there is no randomness to share, and so $b = 0$. Similarly, at high temperatures all randomness is localized, and so no entropy can be shared and the bound entropy also vanishes. At intermediate temperatures, though spins are in flux, a spin’s context does at least in part influence its alignment and so $b > 0$. We refer to the temperature at which $b$ is maximized as $T_b$.
Finally, we consider the enigmatic information density $q$. As a component of $\rho$ it also reflects dependencies, but in a way different from $b$. The bound entropy captures dependencies that are also part of the thermodynamic entropy, but $q$ captures those that are not. As we shall see in Section 5, this means it is sensitive to the interior of clusters. At low temperatures, i.e., temperatures below $T_c$, we expect $q$ to be small, since $H[\sigma_0]$ is small. At high temperatures, there are no dependencies between spins, and so we also expect $q$ to be small. At intermediate temperatures, just like $b$, we expect $q$ to take larger values, but values that reflect the particular kinds of patterning in spin configurations.
The nonzero values of $\rho$, $b$, and $q$ are all driven by Ising spin clustering. Consider for the moment randomly permuting the sites in the lattice, resulting in a site percolation model with site occupancy probability given by a simple function of the magnetization of the original Ising lattice. Then, the spin orientation at any particular site is now statistically independent of its neighbors, and so $\rho = b = q = 0$, and therefore $H[\sigma_0] = h = r$. There are, however, clusters in this permuted lattice—in fact, there may exist a giant cluster if the magnetization is such that the probability of site occupation is above the percolation threshold. These clusters are driven purely by the lattice topology. We therefore conclude that $\rho$, $b$, and $q$ are each sensitive to differing aspects of how the pairwise Ising Hamiltonian affects the shape and distribution of spin clusters.
Finally, we note that all these information measures exhibit discontinuities in their derivative with respect to temperature at $T_c$. This behavior is not surprising, however. Each of them is a method of probing the spin statistics of the lattice. The magnetization—the mean of $\sigma_0$—also exhibits such a discontinuity, and so it is not surprising that more nuanced measures such as those considered here have a similar discontinuity.
To summarize, the disorder generated at each site is measured by the Gibbs thermodynamic entropy density $h$. One portion ($b$) participates in spatial organization; the other portion ($r$) does not. Thus, $b$ is key to monitoring the emergence of spatial structures during pattern formation dynamics.

4. Results

To explore what the entropy components reveal, we calculate them for the ferromagnetic spin- 1 / 2 Ising model on one- (1D) and two-dimensional (2D) square lattices and on the Bethe lattice. These results demonstrate that the preceding qualitative explanations of the information measure behaviors are correct, but the results also raise subtle questions regarding the details of exactly what the measures capture.
As part of this, we report analytical results for the 1D chain and the Bethe lattice and show that they match results from simulation. We use analytic results whenever possible for the 2D lattice. For example, we report analytical results for $h$, $H[\sigma_0]$, and $\rho$. Then, only $r$ needs to be estimated from simulation, and that estimate is combined with the analytic quantities to compute $b$ and $q$. By comparison, estimating $b$ purely from simulation requires accurate estimates of both $h$ and $r$. In this, we make use of the global Markovian property of spin lattices [31]: to obtain $r$ we need only condition a spin on all its nearest neighbors—spins directly coupled via the Hamiltonian of Equation (1). For $h$ estimated purely from simulation, we use the method provided in [32]. In both cases, the entropy estimator $\hat{H}_2$ found in [33] was used to decrease the number of configuration samples required while still providing convergence to known analytic results. Overall, when comparisons with analytic results are available, simulation results match to within standard error bars smaller than the plot line widths. All simulation results were obtained using the Wolff algorithm [34]. In summary, for those few reported quantities that required full or partial estimation from simulations, sufficient care was taken that the statistical errors fall far below the level needed to support the main conclusions.
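The central estimation step described here, conditioning a spin only on its nearest neighbors to obtain r, can be sketched as a simple plug-in estimator. The version below is our own minimal illustration: it uses naive frequency counts rather than the H_2 estimator of [33], and it assumes that sampled square-lattice configurations (e.g., from a Wolff simulation) are supplied by the caller.

```python
from collections import Counter
from math import log2

def localized_entropy_estimate(configs):
    """Plug-in estimate of r = H[sigma_0 | nearest neighbors] in bits per spin,
    from sampled square-lattice configurations (nested lists or arrays of +/-1,
    periodic boundary conditions assumed)."""
    joint = Counter()    # counts of (neighborhood, center spin) pairs
    for s in configs:
        L = len(s)
        for i in range(L):
            for j in range(L):
                nbrs = (s[(i - 1) % L][j], s[(i + 1) % L][j],
                        s[i][(j - 1) % L], s[i][(j + 1) % L])
                joint[(nbrs, s[i][j])] += 1
    total = sum(joint.values())
    ctx = Counter()      # marginal counts of the four-spin neighborhoods
    for (nbrs, _), c in joint.items():
        ctx[nbrs] += c
    # H[center | neighborhood] = H[center, neighborhood] - H[neighborhood]
    H_joint = -sum(c / total * log2(c / total) for c in joint.values())
    H_ctx = -sum(c / total * log2(c / total) for c in ctx.values())
    return H_joint - H_ctx

# usage (hypothetical): r_hat = localized_entropy_estimate(sampled_lattices)
```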

4.1. 1D Ising Model

Simple Ising models have been broadly adopted to understand collective phenomena in fields ranging from surface physics [35,36] and biophysics [37,38] to economics [39]. The Ising model on a one-dimensional lattice ($\mathcal{L} = \mathbb{Z}$) can be fully analyzed in closed form. Unfortunately, its emergent patterns are rather constrained; for example, it does not exhibit a phase transition [40]. However, the exact solutions provide an important benchmark, and so it is a natural starting point and a base from which to generalize [41,42,43,44]. Analytic results were computed using a transfer matrix approach combined with the aforementioned Markovian property, as sketched below. For comparison, simulations were performed on a lattice of size N = 1024. Note that the results presented are analytical solutions; simulations were performed only to double-check our calculations.
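For reference, the entropy density of the zero-field 1D chain follows from the largest transfer-matrix eigenvalue λ = 2 cosh(βJ), giving h(T) = [ln(2 cosh βJ) − βJ tanh(βJ)] / ln 2 bits per spin. A minimal numerical sketch of our own, using this textbook formula rather than the authors' code:

```python
from math import cosh, tanh, log

def h_1d_ising(T, J=1.0, kB=1.0):
    """Thermodynamic entropy density (bits per spin) of the zero-field 1D Ising chain.
    Uses the largest transfer-matrix eigenvalue lambda = 2*cosh(beta*J):
    h = [ln(2*cosh(beta*J)) - beta*J*tanh(beta*J)] / ln(2)."""
    betaJ = J / (kB * T)
    return (log(2 * cosh(betaJ)) - betaJ * tanh(betaJ)) / log(2)

for T in (0.5, 1.0117, 2.0, 10.0):
    print(f"T = {T:7.4f} J/kB   h = {h_1d_ising(T):.4f} bits/spin")
```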
Whether or not a phase transition is present, the argument that $b$ is maximized between the temperature extremes still holds, and Figure 3 verifies this. The bound entropy $b$ reaches a maximal value at $T_b \approx 1.0117\,J/k_B$, and this is not (and cannot be) associated with a phase transition. At lower temperatures, the spin-spin shared entropy $b$ is the dominant contributor to the Gibbs thermodynamic entropy density $h = r + b$. At higher temperatures, the localized entropy $r$ is the dominant contributor. Similarly, at low temperatures $q$ is the dominant contributor to $\rho$, and $b$ is its dominant contributor at high temperatures. Section 5 below isolates why $b$ peaks where it does and to which spin-configuration features each measure is sensitive. For now, let us continue with the system’s informational phenomenology.

4.2. Bethe Lattice Ising Model

What role does the underlying lattice topology play in determining the balance between local randomness and spatially shared information that equilibrium configurations achieve? Spins on the Bethe lattice, in which $\mathcal{L}$ is an infinite Cayley tree with coordination number $k$, form an ideal candidate since the absence of loops makes it possible to compute quantities analytically. Moreover, spin configurations exhibit a phase transition at the critical temperature [41]:
$$T_c = \frac{2J}{k_B \ln\!\left(\dfrac{k}{k-2}\right)} \,.$$
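A quick numerical check of this critical-temperature formula (a trivial one-liner of our own) reproduces the value quoted below for k = 3, alongside the 2D square-lattice value used in Section 4.3 for comparison:

```python
from math import log, sqrt

def bethe_Tc(k, J=1.0, kB=1.0):
    """Bethe-lattice Ising critical temperature T_c = 2J / (kB * ln(k/(k-2)))."""
    return 2 * J / (kB * log(k / (k - 2)))

print(f"Bethe lattice, k = 3: T_c = {bethe_Tc(3):.4f} J/kB")          # ~1.8205
print(f"2D square lattice:    T_c = {2 / log(1 + sqrt(2)):.4f} J/kB")  # ~2.2692 (Section 4.3)
```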
We analytically calculated the entropy decomposition for a Bethe lattice with coordination number $k$; the details appear in Appendix A. Simulation results, matching the analytic values to high precision, were also estimated numerically on a 1000-node random 3-regular graph [45], which has no boundary and is locally Bethe-lattice-like.
Figure 4 presents the results for a Bethe lattice with coordination number $k = 3$, though other values of $k$ behave similarly. Interestingly, all information measures have a discontinuity in their first derivatives, and this happens at the phase transition at $T_c$. Furthermore, the bound entropy $b$ is maximized there: $T_b = T_c \approx 1.8205\,J/k_B$. In contrast to the 1D lattice, at small $T$ the dominant contributor to the Gibbs entropy $h$ is the local randomness $r$. Thus, not only does the change in lattice topology induce a phase transition, but it also inverts the informational components’ contributions to the thermodynamic entropy density. This, in turn, indicates a rather different underlying mechanism that supports entropy production in the low temperature regime. In the 1D case entropy is spatially extended and configurations of spins are dominated by large clusters. In the Bethe lattice, deviations from uniformity come in the form of isolated spins differing from their surroundings. In short, except at high temperatures, Bethe lattice spin configurations are simply more random than those in 1D at corresponding temperatures and corresponding entropy densities. Also, unlike 1D, the spin-spin shared entropy $b$ is the dominant contributor to $\rho$ at both low and high temperatures.

4.3. 2D Ising Model

Unlike its 1D sibling, the nearest-neighbor ferromagnetic Ising model in two dimensions ($\mathcal{L} = \mathbb{Z}^2$) has a phase transition at a finite critical temperature $T_c = 2J/[k_B \ln(1+\sqrt{2})] \approx 2.2692\,J/k_B$. In the 2D case, although Onsager’s solution lets us calculate $H[\sigma_0]$, $h$, and $\rho$ [41], we do not have an analytic form for $r$, and so the curves for $r$, $b$, and $q$ in Figure 5 partly rely on estimates from simulation, as explained above. The simulations were conducted on a $128 \times 128$ lattice with periodic boundary conditions, and quantities were averaged over 200,000 configuration updates. It is important to note that the finite system with periodic boundary conditions is ergodic and so, in principle, $H[\sigma_0]$ is 1 bit for all temperatures. However, our integration time is several orders of magnitude shorter than the expected magnetic spin-flip time due to thermal fluctuations, and so the simulations are both qualitatively and quantitatively similar to the thermodynamic limit and closely match analytic results.
Again, the resulting standard error bars were smaller than the plotted line widths.
Phase transition aside, the behaviors of $h$, $r$, and $b$, seen in Figure 5, are qualitatively similar to those in the 1D lattice. However, unlike 1D, at low temperatures $r$ is the dominant contributor to $h$, similar to the Bethe lattice below its transition. Also, paralleling the Bethe lattice, $b$ is the dominant contributor to $\rho$ at both low and high temperatures. Like the 1D system, the bound entropy is maximized within the disordered phase, at $T_b \approx 2.675\,J/k_B$, above $T_c$. Thus, the measures show that the 1D system over its whole temperature range is informationally analogous to the 2D system for $T > T_c$. Let us now turn to what configurational structures lead to the overall behaviors of these measures and how they contribute to the Gibbs entropy density $h$.

5. Discussion

At first, it is striking that $T_b$ is not generically identical to $T_c$. To understand why, we need to investigate to which spin-configuration motifs the various measures are sensitive. To accomplish this, we take a local approach. Each of the measures examined above is an average density, which is important when discussing the lattice as a whole and its bulk thermodynamic properties. However, each spin configuration motif contributes differently to this average. For example, an up spin surrounded by down spins contributes to the measures differently than an up spin surrounded by other up spins.

5.1. Motif Entropies

To quantify this, we appeal to spatially- and temporally-local forms of the averaged densities considered so far. This is possible on square lattices $\mathcal{L} = \mathbb{Z}^d$ due to the existence of conditional forms of the entropy density; see [46], Theorem 2.9. The conditional forms for each of the component quantities are configuration-weighted averages over pointwise quantities of the form $-\log_2 p(\,\cdot \mid \cdot\,)$ [22]. For example, the thermodynamic entropy density of Equation (4) can be shown to be [46]:
$$h = H[\sigma_i \mid \sigma_{\prec i}] = -\sum_{\sigma} p(\sigma) \log_2 p(\sigma_i \mid \sigma_{\prec i}) = \sum_{\sigma} p(\sigma)\, h_i(\sigma) \,,$$
where $h_i(\sigma) = -\log_2 p(\sigma_i \mid \sigma_{\prec i})$ and $\sigma_{\prec i}$ is the set of spins whose indices are “lexicographically” less than $i$. $h_i(\sigma)$ is a spatially local or pointwise measure and quantifies how much the individual spin $\sigma_i$ contributes to the global entropy. We can more directly understand how local motifs contribute to the average entropy by plotting the local measures at each site within a given lattice configuration. This approach roughly parallels that in, e.g., [47], but here we employ different information measures and do not assume directionality, since we average the pointwise values resulting from each of the possible orientations centered at the given spin.
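For the 1D chain, where the equilibrium measure is Markov and the lexicographic past of a site is simply the sites to its left, the pointwise quantity reduces to h_i(σ) = −log2 p(σ_i | σ_{i−1}). The sketch below is our own illustration under that assumption; it conditions only on the left neighbor, whereas the figures discussed next also average over orientations:

```python
from math import exp, log2

def pointwise_entropy_1d(config, T, J=1.0, kB=1.0):
    """Pointwise entropies h_i(sigma) = -log2 p(sigma_i | sigma_{i-1}) for a periodic
    1D Ising configuration (list of +/-1), conditioning only on the left neighbor.
    Assumes the zero-field nearest-neighbor conditional p(same) = e^{bJ} / (2 cosh bJ)."""
    bJ = J / (kB * T)
    p_same = exp(bJ) / (exp(bJ) + exp(-bJ))
    values = []
    for i in range(len(config)):
        same = config[i] == config[i - 1]        # config[-1] wraps to the last spin
        values.append(-log2(p_same if same else 1 - p_same))
    return values

# domain-wall sites carry large pointwise entropy; bulk sites carry little
print([round(v, 3) for v in pointwise_entropy_1d([+1, +1, +1, -1, -1, +1, +1, +1], T=1.0117)])
```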
In the following, we define a cluster to be a maximal set of contiguous spins oriented identically. The edge of a cluster is that set of spins in a cluster with at least one neighbor not in the cluster. The remaining spins in the cluster are its bulk. An isolated spin is a cluster consisting of a single spin.
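To connect pointwise values with these structural categories, each site of a configuration can be labeled as bulk, edge, or isolated. The helper below (our own, for a periodic 1D configuration) implements exactly the definitions just given:

```python
def classify_sites_1d(config):
    """Label each site of a periodic 1D configuration as 'bulk', 'edge', or 'isolated',
    following the definitions above: a cluster is a maximal run of identically oriented
    spins, its edge sites have a neighbor outside the cluster, and an isolated spin is
    a cluster of size one."""
    n = len(config)
    labels = []
    for i in range(n):
        left_same = config[i] == config[(i - 1) % n]
        right_same = config[i] == config[(i + 1) % n]
        if left_same and right_same:
            labels.append("bulk")
        elif left_same or right_same:
            labels.append("edge")
        else:
            labels.append("isolated")
    return labels

print(classify_sites_1d([+1, +1, +1, -1, +1, +1, -1, -1]))
```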
Figure 6 and Figure 7 show the results of the motif entropy analysis for 1D and 2D lattices. In all cases, the spatial average of the displayed quantities corresponds (in the thermodynamic limit) with the values reported in Figure 3 and Figure 5.
There are several immediate similarities that allow for structural interpretation. For example, motif q is positive (red) within the bulk of a cluster and negative (blue) on its edges. Highlighting opposite features, motif r is negligible within the bulk, but positive along edges, particularly corners and isolated spins. The motif bound entropy b, however, is more nuanced in its behavior. Considering Figure 6, at lower temperatures it is sensitive to cluster edges more so than cluster bulk. At higher temperatures, however, this relationship flips and it is more sensitive to the bulk. For all temperatures, it is negative for isolated spins—clusters of size 1.
In contrast, we see that at not-too-high temperatures motif $b$ is largely sensitive to cluster boundaries. This leads one to speculate that, as in [6], $b$’s maximization in the disordered phase is due to complex interactions between cluster sizes and their surface area. This also sheds light on why $b$ does in fact peak at $T_c$ on the Bethe lattice. On the square lattice, on the one hand, cluster boundaries have a tendency to smooth due to correlations flowing along short loops in the lattice topology that tend to align spins on a boundary. On the other, the Bethe lattice has no such loops, and so there is no pressure to reduce surface area. This suggests that $b$ generically peaks at $T_c$ for systems where boundaries are not constrained by system energetics and that the energetic smoothing of cluster boundaries drives $b$ to peak in the more “textured” disordered phase.
Interestingly, for general lattices such as the Bethe lattice, and especially for random graph topologies, local (conditional) forms of the thermodynamic entropy density (analogs of Equation (14)) and other motif measures are unknown and may simply not exist. While techniques, such as that found in [48], exist for estimating the thermodynamic entropy density of a ferromagnetic Ising model on an arbitrary graph, the interpretation of a thermodynamic entropy density in such systems is problematic, as each spin site may have differing connectivity and therefore play a differing role in the overall lattice dynamics. Therefore, while global averages may exist for arbitrary topologies and may even be tractably estimated, their structural meaning is vastly more challenging. This meaning is critical to understanding the thermodynamic and statistical mechanical properties of such irregular systems and, more generally, correlations and information measures on network dynamical systems. If local information estimates, such as those used for the motif entropy analyses above, existed for such systems, then studies could be undertaken to determine which nodes in a lattice or network contribute most to collective behavior.

5.2. Actively Storing Information versus Communicating It

Assuming Glauber dynamics for the nearest-neighbor 2D Ising system, Barnett et al. [6] computed a global transfer entropy [16]—the average information communicated between the entire lattice and a single spin. They found that it peaks within the disordered phase at a temperature $T_{gl} \approx 2.354\,J/k_B$. Although there are serious concerns with how the transfer entropy conflates two-spin (dyadic) and multispin (polyadic) dependencies [49], those concerns are not relevant in these Ising systems due to their known, dyadic Hamiltonian.
To compare our measures with theirs, we examine the active storage of randomly generated information in spatial patterns, as measured by the ratio $b/h$—how much of the localized randomness ($h$) is converted by the system into spin-spin (spatially) shared information ($b$). Figure 8 plots the ratio as a function of temperature for the 2D Ising system: the ratio peaks in the disordered phase and, surprisingly, it appears to do so at $T \approx T_{gl}$. This implies that there is a strong connection between $b$ (storing thermal fluctuations as spatial correlation) and the potential for information communication in a system. It also suggests that Barnett et al.’s $T_{gl}$ is not a result specific to their choice of Glauber dynamics, but rather an intrinsic property of the 2D Ising model. It remains to be seen, though, why spatially shared information is maximized within the disordered phase of the 2D Ising system.

6. Conclusions

As noted at the beginning, even the earliest debates over entropy’s statistical foundation turned on contrasting its thermodynamic components—Boltzmann’s isolated-spin entropy versus Gibbs’ global entropy. From a modern perspective their difference, well known to Gibbs, is a generalized mutual information that measures the degree of individual-component independence, now called the total correlation. From this, we showed that the Gibbs thermodynamic entropy density naturally decomposes further into two functionally distinct informational components: one quantifying independence among constituent spins and the other, dependence. The one quantifying dependence, the bound entropy $b$, captures collective behavior by expressing how much of the thermodynamic entropy density is locally shared. We then demonstrated the behavior of the bound entropy and related quantities for the nearest-neighbor ferromagnetic Ising model on a variety of lattices. We found that it tends to a maximum at intermediate temperatures, though not always at the magnetic phase transition. Our analyses support our earlier hypothesis [50] that $q$, as the dominant component of the persistent mutual information [51], should generically be maximized at critical points. Though not detailed here, this observation holds in three dimensions and in simulations of the Potts model [52] with the number of states $s = 3$ on lattices in 1, 2, and 3 dimensions.
This brief phenomenological study of thermodynamic entropy components served to give a physical grounding for information measures and what they reveal in spin systems on various lattice topologies. The results suggest many avenues of potential research. One topic to explore is the behavior of b in the Potts model, which switches from exhibiting a second-order phase transition in the magnetization to a first-order transition when the number of spin states exceeds four. Another setting of particular interest is to study the behavior of b in a frustrated system, such as the antiferromagnetic Ising model on a triangular lattice, paralleling the work in [53]. Finally, as alluded to above, although beyond the present scope, a next step is to consider informational measures for spins on arbitrary graphs, with the goal of providing insight into the roles that different nodes play in information processing and storage in complex dynamical networks.

Acknowledgments

We thank Cina Aghamohammadi, Adam Dioguardi, Raissa D’Souza, Pierre-André Noël, Márton Pósfai, Paul Riechers, Adam Rupe, Rajiv Singh, and Dowman Varn for helpful conversations. We thank the Santa Fe Institute for its hospitality during visits. JPC is an SFI External Faculty member. This material is based upon work supported by, or in part by, the U.S. Army Research Laboratory and the U.S. Army Research Office under contracts W911NF-13-1-0390 and W911NF-13-1-0340.

Author Contributions

R.G.J. and J.P.C. conceived of the project. V.S.V. and R.G.J. designed and performed the simulations. All authors contributed equally to the analysis of the data and to the writing of the paper.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Bethe Lattice Spin-Neighborhood Probabilities

Consider a Cayley tree with coordination number $k$ consisting of $n$ shells, or layers of spins, centered at the root. The total number of spins in such a tree is $N = k\,[(k-1)^n - 1]/(k-2)$. We denote the spins in the lattice by $\sigma_i$, $i \in [0, N-1]$. Let us use the index $0$ for the central spin and indices $[1, k]$ for its immediate neighbors. We use $\sigma$ to denote a configuration—the state of all spins present in the Cayley tree. The Bethe lattice is the singular limit of a Cayley tree as $n \to \infty$. Following [41], we write the partition function for the Ising model on the Bethe lattice as:
$$Z = \sum_{\sigma} \exp\!\left( \beta J \sum_{\langle i,j \rangle} \sigma_i \sigma_j \right) \,,$$
where $\exp\!\big( \beta J \sum_{\langle i,j \rangle} \sigma_i \sigma_j \big)$ is the Boltzmann factor corresponding to the system being in state $\sigma$ and the sum runs over the set of all possible configurations. As shown in [41], due to the Cayley tree topology, the above expression can be written as:
$$Z = \sum_{\sigma_0} \left[ g_n(\sigma_0) \right]^{k} \,,$$
where $g_n(\sigma_0)$ is the partition function of a branch of the Cayley tree with its root at $\sigma_0$. Starting with this expression, we derive the joint probability of the central spin and its neighbors.
Let $g_n(\sigma_0) \to g(\sigma_0)$ in the limit of $n \to \infty$ shells. Then, the spins not close to the Cayley tree’s leaves behave like those on the Bethe lattice with the same coordination number. In this limit, by explicitly accounting for the bonds between the central spin and its neighbors, we can write the partition function as:
$$Z = \sum_{\sigma_0} \sum_{\sigma_1} \cdots \sum_{\sigma_k} \exp\!\left( \beta J \sum_{i \in [1,k]} \sigma_0 \sigma_i \right) \prod_{i \in [1,k]} \left[ g(\sigma_i) \right]^{k-1} \,,$$
where the product is essentially over all branches one step away from the central spin and $\sigma_i$ is the spin to which each branch is anchored. From this, we obtain the joint probability of the central spin $\sigma_0$ and its $k$ neighbors as:
$$p(\sigma_0, \sigma_1, \ldots, \sigma_k) = \exp\!\left( \beta J \sum_{i \in [1,k]} \sigma_0 \sigma_i \right) \prod_{i \in [1,k]} \left[ g(\sigma_i) \right]^{k-1} \Big/ Z \,.$$
Dividing both the numerator and the denominator by $[g(+1)]^{k(k-1)}$, we obtain:
$$p(\sigma_0, \sigma_1, \ldots, \sigma_k) = \frac{\exp\!\left( \beta J \sum_{i \in [1,k]} \sigma_0 \sigma_i \right) \prod_{i \in [1,k]} \left[ \dfrac{g(\sigma_i)}{g(+1)} \right]^{k-1}}{\left[ e^{\beta J} + e^{-\beta J} \left( \dfrac{g(-1)}{g(+1)} \right)^{k-1} \right]^{k} + \left[ e^{-\beta J} + e^{\beta J} \left( \dfrac{g(-1)}{g(+1)} \right)^{k-1} \right]^{k}} \,.$$
This is the joint probability distribution of the central spin and its neighbors. We evaluate the above expression numerically by defining $x = g(-1)/g(+1)$. From [41], Equation (4.3.14), we know that $x$ is the “stable” root of the equation:
$$x = \frac{e^{-\beta J} + e^{\beta J} x^{k-1}}{e^{\beta J} + e^{-\beta J} x^{k-1}} \,.$$
Above $T_c$, Equation (A2) has only one root, $x = 1$, which is stable. Below $T_c$, however, there are three roots: $x_0$, $1$, and $x_0^{-1}$, where $x_0 < 1$. The stable roots are $x_0$ and $x_0^{-1}$. Setting $x = x_0$ gives the distribution $p(\sigma_0, \sigma_1, \ldots, \sigma_k)$ in which the symmetry breaking prefers spins aligned up, while setting $x = x_0^{-1}$ gives the distribution with the symmetry broken such that downward spins are preferred.
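A short numerical sketch of our own (not part of the original appendix) iterates Equation (A2) to its stable root and then assembles the joint distribution of the central spin and its neighbors derived above, for the up-preferring branch; below T_c the iteration started from a small x converges to x_0 < 1, while above T_c it converges to x = 1.

```python
import itertools
from math import exp

def stable_root(T, k=3, J=1.0, kB=1.0, x_init=1e-3, iters=5000):
    """Iterate Equation (A2), x -> (e^{-bJ} + e^{bJ} x^{k-1}) / (e^{bJ} + e^{-bJ} x^{k-1}),
    to its stable fixed point (x_0 below T_c, 1 above T_c)."""
    bJ = J / (kB * T)
    x = x_init
    for _ in range(iters):
        x = (exp(-bJ) + exp(bJ) * x ** (k - 1)) / (exp(bJ) + exp(-bJ) * x ** (k - 1))
    return x

def joint_center_neighbors(T, k=3, J=1.0, kB=1.0):
    """Joint distribution p(sigma_0, sigma_1, ..., sigma_k) of the central spin and its
    k neighbors, using the up-preferring stable root x_0 (below T_c this is the
    symmetry-broken, up-aligned branch)."""
    bJ = J / (kB * T)
    x = stable_root(T, k, J, kB)
    weights = {}
    for s in itertools.product((-1, +1), repeat=k + 1):   # (sigma_0, sigma_1, ..., sigma_k)
        s0, nbrs = s[0], s[1:]
        w = exp(bJ * s0 * sum(nbrs))                      # exp(beta J sum_i sigma_0 sigma_i)
        for si in nbrs:                                   # product of (g(sigma_i)/g(+1))^(k-1)
            w *= (x if si == -1 else 1.0) ** (k - 1)
        weights[s] = w
    Z = sum(weights.values())
    return {s: w / Z for s, w in weights.items()}

p = joint_center_neighbors(T=1.5)                 # below T_c ~ 1.8205 J/kB for k = 3
print(f"{sum(p.values()):.6f}")                   # normalization check
print(p[(+1, +1, +1, +1)] > p[(-1, -1, -1, -1)])  # up-aligned branch is preferred
```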

References

  1. Antonov, V.N.; Bekenov, L.V.; Yaresko, A.N. Electronic structure of strongly correlated systems. Adv. Condens. Matter Phys. 2011, 2011, 298928. [Google Scholar] [CrossRef]
  2. Mantegna, R.N.; Stanley, H.E. Introduction to Econophysics: Correlations and Complexity in Finance; Cambridge University Press: Cambridge, UK, 1999. [Google Scholar]
  3. Raiesdana, S.; Hashemi Golpayegani, M.R.; Nasrabadi, A.M. Complexity evolution in epileptic seizure. In Proceedings of the IEEE 30th Annual International Conference of the Engineering in Medicine and Biology Society, Vancouver, BC, Canada, 20–25 August 2008; pp. 4110–4113. [Google Scholar]
  4. Mäki-Marttunen, V.; Cortes, J.M.; Villarreal, M.F.; Chialvo, D.R. Disruption of transfer entropy and inter-hemispheric brain functional connectivity in patients with disorder of consciousness. BMC Neurosci. 2013, 14 (Suppl. 1), 83. [Google Scholar] [CrossRef]
  5. Couzin, I. Collective minds. Nature 2007, 445, 715. [Google Scholar] [CrossRef] [PubMed]
  6. Barnett, L.; Lizier, J.T.; Harré, M.; Seth, A.K.; Bossomaier, T. Information flow in a kinetic Ising model peaks in the disordered phase. Phys. Rev. Lett. 2013, 111, 177203. [Google Scholar] [CrossRef] [PubMed]
  7. Cover, T.M.; Thomas, J.A. Elements of Information Theory, 2nd ed.; Wiley-Interscience: New York, NY, USA, 2006. [Google Scholar]
  8. Crutchfield, J.P. Between order and chaos. Nat. Phys. 2012, 8, 17–24. [Google Scholar] [CrossRef]
  9. Shaw, R. The Dripping Faucet as a Model Chaotic System; Aerial Press: Santa Cruz, CA, USA, 1984. [Google Scholar]
  10. Arnold, D. Information-theoretic analysis of phase transitions. Complex Syst. 1996, 10, 143–155. [Google Scholar]
  11. Crutchfield, J.P.; Feldman, D.P. Statistical complexity of simple one-dimensional spin systems. Phys. Rev. E 1997, 55, R1239. [Google Scholar] [CrossRef]
  12. Feldman, D.P.; Crutchfield, J.P. Structural information in two-dimensional patterns: Entropy convergence and excess entropy. Phys. Rev. E 2003, 67, 051104. [Google Scholar] [CrossRef] [PubMed]
  13. Feldman, D.P.; McTague, C.S.; Crutchfield, J.P. The organization of intrinsic computation: Complexity-entropy diagrams and the diversity of natural information processing. Chaos 2008, 18, 043106. [Google Scholar] [CrossRef] [PubMed]
  14. Lau, H.W.; Grassberger, P. Information theoretic aspects of the two-dimensional Ising model. Phys. Rev. E 2013, 87, 022128. [Google Scholar] [CrossRef] [PubMed]
  15. Watanabe, S. Information theoretical analysis of multivariate correlation. IBM J. Res. Dev. 1960, 4, 66–82. [Google Scholar] [CrossRef]
  16. Schreiber, T. Measuring information transfer. Phys. Rev. Lett. 2000, 85, 461–464. [Google Scholar] [CrossRef] [PubMed]
  17. Abdallah, S.A.; Plumbley, M.D. A measure of statistical complexity based on predictive information with application to finite spin systems. Phys. Lett. A 2012, 376, 275–281. [Google Scholar] [CrossRef]
  18. Han, T.S. Linear dependence structure of the entropy space. Inf. Control 1975, 29, 337–368. [Google Scholar] [CrossRef]
  19. Han, T.S. Nonnegative entropy measures of multivariate symmetric correlations. Inf. Control 1978, 36, 133–156. [Google Scholar] [CrossRef]
  20. Verdú, S.; Weissman, T. Erasure entropy. In Proceedings of the 2006 IEEE International Symposium on Information Theory, Seattle, WA, USA, 9–14 July 2006; pp. 98–102. [Google Scholar]
  21. James, R.G.; Burke, K.; Crutchfield, J.P. Chaos forgets and remembers: Measuring information creation, destruction, and storage. Phys. Lett. A 2014, 378, 2124–2127. [Google Scholar] [CrossRef]
  22. James, R.G.; Ellison, C.J.; Crutchfield, J.P. Anatomy of a bit: Information in a time series observation. Chaos 2011, 21, 037109. [Google Scholar] [CrossRef] [PubMed]
  23. Marzen, S.; Crutchfield, J.P. Information anatomy of stochastic equilibria. Entropy 2014, 16, 4713–4748. [Google Scholar] [CrossRef]
  24. Pikovsky, A.; Rosenblum, M.; Kurths, J. Synchronization: A Universal Concept in Nonlinear Sciences; Cambridge University Press: Cambridge, UK, 2003. [Google Scholar]
  25. Jaynes, E.T. Gibbs versus Boltzmann entropies. Am. J. Phys. 1965, 33, 391–398. [Google Scholar] [CrossRef]
  26. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656. [Google Scholar] [CrossRef]
  27. Grandy, W.T. Entropy and the Time Evolution of Macroscopic Systems; Oxford University Press: Oxford, UK, 2008; Volume 141. [Google Scholar]
  28. Ruelle, D. Statistical Mechanics: Rigorous Results; World Scientific: Singapore, 1969. [Google Scholar]
  29. Grimmett, G.R. Percolation, 2nd ed.; Springer: Berlin, Germany, 1999. [Google Scholar]
  30. Reza, F.M. An Introduction to Information Theory; Courier Corporation: North Chelmsford, MA, USA, 1961. [Google Scholar]
  31. Goldstein, S.; Kuik, R.; Schlijper, A.G. Entropy and global Markov properties. Commun. Math. Phys. 1990, 126, 469–482. [Google Scholar] [CrossRef]
  32. Schlijper, A.G.; Smit, B. Two-sided bounds on the free energy from local states in Monte Carlo simulations. J. Stat. Phys. 1989, 56, 247–260. [Google Scholar] [CrossRef]
  33. Schürmann, T. A Note on Entropy Estimation. arXiv, 2015; arXiv:1503.05911. [Google Scholar]
  34. Wolff, U. Collective Monte Carlo updating for spin systems. Phys. Rev. Lett. 1989, 62, 361–364. [Google Scholar] [CrossRef] [PubMed]
  35. Maniwa, Y.; Kataura, H.; Matsuda, K.; Okabe, Y. A one-dimensional Ising model for C70 molecular ordering in C70-peapods. New J. Phys. 2003, 5, 127. [Google Scholar]
  36. Zimmermann, F.M.; Pan, X. Interaction of H2 with Si(001) − (2×1): Solution of the barrier puzzle. Phys. Rev. Lett. 2000, 85, 618–621. [Google Scholar] [CrossRef] [PubMed]
  37. Zimm, B.H. Theory of “melting” of the helical form in double chains of the DNA type. J. Chem. Phys. 1960, 33, 1349–1356. [Google Scholar] [CrossRef]
  38. Wartell, R.M.; Benight, A.S. Thermal denaturation of DNA molecules: A comparison of theory with experiment. Phys. Rep. 1985, 126, 67–107. [Google Scholar] [CrossRef]
  39. Durlauf, S.N. How can statistical mechanics contribute to social science? Proc. Natl. Acad. Sci. USA 1999, 96, 10582–10584. [Google Scholar] [CrossRef] [PubMed]
  40. Pathria, R.K.; Beale, P.D. Statistical Mechanics; Elsevier Science: Amsterdam, The Netherlands, 1996. [Google Scholar]
  41. Baxter, R.J. Exactly Solved Models in Statistical Mechanics; Academic Press: New York, NY, USA, 1982. [Google Scholar]
  42. Pfeuty, P. An exact result for the 1D random Ising model in a transverse field. Phys. Lett. A 1979, 72, 245–246. [Google Scholar] [CrossRef]
  43. Feldman, D.P. Computational Mechanics of Classical Spin Systems. Ph.D. Thesis, University of California, Davis, CA, USA, 1998. [Google Scholar]
  44. Yilmaz, M.B.; Zimmermann, F.M. Exact cluster size distribution in the one-dimensional Ising model. Phys. Rev. E 2005, 71, 026127. [Google Scholar] [CrossRef] [PubMed]
  45. Bollobás, B. Random Graphs Vol. 73, Cambridge Studies in Advanced Mathematics; Cambridge University Press: Cambridge, UK, 2001. [Google Scholar]
  46. Marcus, B.; Pavlov, R. Computing bounds for entropy of stationary Zd Markov random fields. SIAM J. Discret. Math. 2013, 27, 1544–1558. [Google Scholar] [CrossRef]
  47. Lizier, J.; Prokopenko, M.; Zomaya, A. Information modification and particle collisions in distributed computation. Chaos 2010, 20, 037109. [Google Scholar] [CrossRef] [PubMed]
  48. Jerrum, M.; Sinclair, A. Polynomial-time approximation algorithms for the Ising model. SIAM J. Comput. 1993, 22, 1087–1116. [Google Scholar] [CrossRef]
  49. James, R.G.; Barnett, N.; Crutchfield, J.P. Information flows? A critique of transfer entropies. Phys. Rev. Lett. 2016, 116, 238701. [Google Scholar] [CrossRef] [PubMed]
  50. Ara, P.M.; James, R.G.; Crutchfield, J.P. The elusive present: Hidden past and future dependence and why we build models. Phys. Rev. E 2016, 93, 022143. [Google Scholar] [CrossRef] [PubMed]
  51. Ball, R.C.; Diakonova, M.; MacKay, R.S. Quantifying emergence in terms of persistent mutual information. Adv. Complex Syst. 2010, 13, 327–338. [Google Scholar] [CrossRef]
  52. Potts, R.B. Some generalized order-disorder transformations. In Mathematical Proceedings of the Cambridge Philosophical Society; Cambridge University Press: Cambridge, UK, 1952; Volume 48, pp. 106–109. [Google Scholar]
  53. Robinson, M.D.; Feldman, D.P.; McKay, S.R. Local entropy and structure in a two-dimensional frustrated system. Chaos 2011, 21, 037114. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Total correlation $T[X]$ (north-west hashing) and dual total correlation $B[X]$ (north-east hashing) for three spins $X_1$, $X_2$, and $X_3$ depicted via a 3-variable Venn diagram [30] in which areas correspond to the magnitudes of information measures. The joint entropy $H[X_{123}]$ is the union of the three single-spin marginal entropies $H[X_i]$ (full circles). The outer rectangle represents an entropy, equal to $H[X_1] + H[X_2] + H[X_3]$, with all dependencies between spins removed; each marginal entropy $H[X_i]$ still matches its original marginal entropy, but does not overlap that of the other spins. This corresponds to the distribution whose Gibbs entropy is the Boltzmann entropy of $X_{123}$. The diagram directly demonstrates that $T[X]$ measures dependence as a distance from the external reference (the independent-spin entropy represented by the rectangle), while $B[X]$ measures it with respect to the internal reference $H[X_{123}]$. The unshaded region is the residual entropy $R[X_{123}]$.
Figure 2. Decomposition of the isolated-spin (Boltzmann) thermodynamic entropy density $H[\sigma_0]$. First, decomposition into the Gibbs thermodynamic entropy density $h$ and the total correlation density $\rho$. Second, the thermodynamic entropy density into the localized entropy density $r$ and the dual total correlation (bound entropy) density $b$, and the total correlation density $\rho$ into $b$ and the enigmatic information density $q$.
Figure 3. Boltzmann and Gibbs thermodynamic entropy-density decompositions for the 1D, nearest-neighbor, ferromagnetic spin-$1/2$ Ising model: the Gibbs entropy density $h$ and the localized entropy density $r$ both monotonically increase with temperature, but the bound entropy density $b$ is maximal near $T_b \approx 1.0117\,J/k_B$. Below this temperature, the dominant contribution to the Gibbs thermodynamic entropy density switches to the entropy shared between nearby spins.
Figure 4. Thermodynamic entropy-density decompositions for the Ising model on a Bethe lattice. By far and away, and unlike the 1D spin lattice, the individual-spin disorder r is the dominant contributor to Gibbs entropy density h over the entire temperature range. Of the information generated, relatively little (b) is stored in spatial patterns.
Figure 5. Thermodynamic entropy-density decompositions for the 2D, nearest-neighbor, ferromagnetic Ising model. Curves are from numerical simulation with sufficient size that standard errors are much smaller than the line widths. As in 1D, the entropy density $h$ and the localized entropy density $r$ monotonically increase with temperature. Here, also, the bound entropy density $b$ reaches a maximal value at a nonextremal temperature, near $T_b \approx 2.675\,J/k_B$, but this peak does not occur at the critical temperature $T_c \approx 2.2692\,J/k_B$, where domain sizes become scale-free.
Figure 6. Motif entropy-component analysis of the 1D Ising model at two temperatures. Positive values are red, negative blue. A segment of a spin configuration with up spins (white cells) and down spins (black cells) is shown at the bottom.
Figure 7. Motif entropy-component analysis of the 2D Ising model at $T_b$. Positive values at a site are red, negative blue.
Figure 8. Storing locally generated thermal randomness as spatial correlation: the ratio $b/h$ as a function of temperature in the 2D Ising system. It peaks within the disordered phase at a temperature $T_{gl} \approx 2.354\,J/k_B$. Compare to Figure 2 in [6].
