Entropy of network ensembles

From Wikipedia, the free encyclopedia

A set of networks that satisfies given structural characteristics can be treated as a network ensemble.[1] Introduced by Ginestra Bianconi in 2007, the entropy of a network ensemble measures the level of order or uncertainty of a network ensemble.[2]

The entropy is the logarithm of the number of graphs in the ensemble.[3] Entropy can also be defined for a single network: basin entropy is the logarithm of the number of attractors in one Boolean network.[4]
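As a minimal sketch of the latter definition, the following enumerates the attractors of a toy Boolean network and takes the logarithm of their number. The three update rules are hypothetical, chosen only for illustration:

```python
from math import log

# A toy 3-node Boolean network with hypothetical update rules (illustration only):
#   x0' = NOT x1,  x1' = NOT x0,  x2' = x0 AND x1
def step(state):
    x0, x1, x2 = state
    return (1 - x1, 1 - x0, x0 & x1)

# Deterministic synchronous dynamics on a finite state space: every initial
# state eventually falls onto a cycle (an attractor). Enumerate them all.
attractors = set()
for s0 in [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]:
    seen = []
    s = s0
    while s not in seen:
        seen.append(s)
        s = step(s)
    cycle = tuple(sorted(seen[seen.index(s):]))  # canonical form of the cycle
    attractors.add(cycle)

basin_entropy = log(len(attractors))  # logarithm of the number of attractors
```

(Definitions that weight each attractor by the size of its basin of attraction also exist; this sketch follows the simpler counting statement above.)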

Employing approaches from statistical mechanics, the complexity, uncertainty, and randomness of networks can be described by network ensembles with different types of constraints.[5]

Gibbs and Shannon entropy

By analogy to statistical mechanics, microcanonical ensembles and canonical ensembles of networks are introduced for the implementation. A partition function Z of an ensemble can be defined as:

Z = \sum_{\{a_{ij}\}} \prod_{\mu} \delta\big(F_\mu(\{a_{ij}\}) - C_\mu\big) \exp\Big(\sum_{i<j} h_{ij}\,\Theta(a_{ij}) + r_{ij}\,a_{ij}\Big)

where C_\mu is the constraint, and a_{ij} (a_{ij} = 0, 1, 2, \ldots) are the elements of the adjacency matrix, with a_{ij} > 0 if and only if there is a link between node i and node j. \Theta(x) is a step function with \Theta(x) = 1 if x > 0, and \Theta(x) = 0 if x = 0. The auxiliary fields h_{ij} and r_{ij} have been introduced as an analogy to the bath in classical mechanics.

For simple undirected networks, the partition function can be simplified as[6]

Z = \sum_{\{a_{ij}\}} \prod_{\mu} \delta\big(F_\mu(\{a_{ij}\}) - C_\mu\big) \exp\Big(\sum_{i<j} \sum_{\alpha} h_\alpha(i,j)\,\delta_{a_{ij},\alpha}\Big)

where \delta_{a_{ij},\alpha} selects the weight of the link, \alpha is the index of the weight, and for a simple network a_{ij} \in \{0, 1\}.

In the following, microcanonical ensembles and canonical ensembles are demonstrated with simple undirected networks.

For a microcanonical ensemble, the Gibbs entropy \Sigma is defined by:

N\Sigma = \ln \mathcal{N}

where \mathcal{N} indicates the cardinality of the ensemble, i.e., the total number of networks in the ensemble.
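As a concrete brute-force illustration of this definition, the sketch below enumerates the microcanonical ensemble of simple undirected networks with a fixed number of nodes and links (the small values N = 5 and L = 4 are illustrative) and computes the Gibbs entropy from its cardinality:

```python
from itertools import combinations
from math import comb, log

# Microcanonical ensemble G(N, L): all labeled simple undirected networks
# with N nodes and exactly L links. N and L are small illustrative values.
N, L = 5, 4
pairs = list(combinations(range(N), 2))  # the N(N-1)/2 possible links

# Every choice of L distinct pairs is one network in the ensemble, so the
# cardinality is the binomial coefficient "N(N-1)/2 choose L".
cardinality = sum(1 for _ in combinations(pairs, L))
assert cardinality == comb(N * (N - 1) // 2, L)

# Gibbs entropy per node: N * Sigma = ln(cardinality)
sigma = log(cardinality) / N
```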

The probability of having a link between nodes i and j, with weight \alpha, is given by:

\pi_{ij}(\alpha) = \frac{\partial \ln Z}{\partial h_\alpha(i,j)}

For a canonical ensemble, the entropy is presented in the form of a Shannon entropy:

S = -\sum_{i<j} \sum_{\alpha} \pi_{ij}(\alpha) \ln \pi_{ij}(\alpha)

For a simple network this reduces to a sum of binary entropies, S = -\sum_{i<j} \big[\pi_{ij} \ln \pi_{ij} + (1-\pi_{ij}) \ln(1-\pi_{ij})\big], where \pi_{ij} is the probability of the link (i,j).
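A minimal sketch of this entropy for simple networks, assuming a hypothetical matrix of independent link probabilities drawn at random for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 6  # hypothetical small canonical ensemble of simple undirected networks

# Independent link probabilities for each pair i < j (random, for illustration).
P = rng.uniform(0.05, 0.95, size=(N, N))
mask = np.triu(np.ones((N, N), dtype=bool), k=1)  # each pair counted once
q = P[mask]

# Shannon entropy of the canonical ensemble: each link is an independent
# binary variable, so the entropy is a sum of binary entropies over pairs.
S = -np.sum(q * np.log(q) + (1 - q) * np.log(1 - q))
```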

Relation between Gibbs and Shannon entropy

A network ensemble G(N,L) with a given number of nodes N and links L, and its conjugate canonical ensemble G(N,p), are characterized as microcanonical and canonical ensembles, respectively, and they have the Gibbs entropy \Sigma and the Shannon entropy S. The Gibbs entropy in the G(N,L) ensemble is given by:[7]

N\Sigma = \ln \binom{N(N-1)/2}{L}

For the G(N,p) ensemble,

p = \frac{2L}{N(N-1)}

Inserting p into the Shannon entropy:[6]

S = -\frac{N(N-1)}{2}\,\big[p \ln p + (1-p) \ln(1-p)\big] \simeq N\Sigma

The relation indicates that the Gibbs entropy \Sigma and the Shannon entropy per node S/N of random graphs are equal in the thermodynamic limit N \to \infty.
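This asymptotic equality can be checked numerically. The sketch below compares the Gibbs entropy of the microcanonical ensemble (an exact log-binomial count) with the Shannon entropy of its conjugate canonical ensemble at fixed average degree; the values of N and the choice L = N are illustrative:

```python
from math import comb, log

def gibbs_NSigma(N, L):
    # N * Sigma = ln of the number of simple graphs with N nodes and L links
    M = N * (N - 1) // 2
    return log(comb(M, L))

def shannon_S(N, L):
    # Shannon entropy of the conjugate canonical ensemble, p = 2L / (N(N-1))
    M = N * (N - 1) // 2
    p = L / M
    return -M * (p * log(p) + (1 - p) * log(1 - p))

# Fix the average degree at 2 (L = N) and grow the network: the ratio of the
# two entropies approaches 1, illustrating their equivalence as N -> infinity.
ratios = [gibbs_NSigma(N, N) / shannon_S(N, N) for N in (10, 100, 1000)]
```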

Von Neumann entropy

Von Neumann entropy is the extension of the classical Gibbs entropy to a quantum context. This entropy is constructed from a density matrix \rho: historically, the first proposed candidate for such a density matrix was the Laplacian matrix L of the network rescaled by its trace, \rho = L / \operatorname{Tr} L. The average von Neumann entropy of an ensemble is calculated as:[8]

\langle S_{\mathrm{VN}} \rangle = \big\langle -\operatorname{Tr}(\rho \ln \rho) \big\rangle
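For a single graph, this entropy can be sketched in a few lines; a 4-node cycle graph is used as an arbitrary example (its Laplacian has eigenvalues 0, 2, 2, 4):

```python
import numpy as np

# Adjacency matrix of a 4-node cycle graph, chosen as a small illustration.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

Lap = np.diag(A.sum(axis=1)) - A  # combinatorial Laplacian of the graph
rho = Lap / np.trace(Lap)         # density matrix: positive semidefinite, unit trace

# Von Neumann entropy: -Tr(rho ln rho), computed from the eigenvalues of rho.
evals = np.linalg.eigvalsh(rho)
evals = evals[evals > 1e-12]      # 0 ln 0 -> 0, so zero modes are dropped
S_vn = -np.sum(evals * np.log(evals))
```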

For a random network ensemble G(N,p), the relation between the average von Neumann entropy and the Shannon entropy is nonmonotonic when the average connectivity pN is varied.

For canonical power-law network ensembles, the two entropies are linearly related.[6]

Networks with given expected degree sequences suggest that heterogeneity in the expected degree distribution implies an equivalence between a quantum and a classical description of networks, which respectively correspond to the von Neumann and the Shannon entropy.[9]

This definition of the von Neumann entropy can also be extended to multilayer networks with a tensorial approach[10] and has been used successfully to reduce their dimensionality from a structural point of view.[11]

However, it has been shown that this definition of entropy does not satisfy the property of subadditivity (see von Neumann entropy's subadditivity), which is expected to hold theoretically. A more grounded definition, satisfying this fundamental property, has been introduced by De Domenico and Biamonte[12] as a quantum-like Gibbs state

\rho(\beta) = \frac{e^{-\beta L}}{Z(\beta)}

where

Z(\beta) = \operatorname{Tr}\, e^{-\beta L}

is a normalizing factor which plays the role of the partition function, and \beta is a tunable parameter which allows multi-resolution analysis. If \beta is interpreted as a temporal parameter, this density matrix is formally proportional to the propagator of a diffusive process on top of the network.
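A sketch of this construction, again on an arbitrary 4-node cycle graph: the Gibbs-state density matrix is diagonal in the Laplacian eigenbasis, so both it and its entropy follow directly from the eigendecomposition, and varying the tunable parameter moves the entropy between its maximally mixed and ground-state limits:

```python
import numpy as np

def density_matrix(Lap, beta):
    """Quantum-like Gibbs state rho = exp(-beta * Lap) / Tr[exp(-beta * Lap)]."""
    w, V = np.linalg.eigh(Lap)   # Laplacian is real symmetric
    boltz = np.exp(-beta * w)
    return (V * (boltz / boltz.sum())) @ V.T

def spectral_entropy(Lap, beta):
    # -Tr(rho ln rho), computed directly from the Laplacian spectrum
    w = np.linalg.eigvalsh(Lap)
    p = np.exp(-beta * w)
    p /= p.sum()
    p = p[p > 1e-15]
    return -np.sum(p * np.log(p))

# Same 4-node cycle graph as a small illustration; beta tunes the resolution.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
Lap = np.diag(A.sum(axis=1)) - A

# Small beta: rho is close to maximally mixed, entropy near ln N.
# Large beta: the zero mode of the Laplacian dominates, entropy near 0.
entropies = {beta: spectral_entropy(Lap, beta) for beta in (0.01, 1.0, 100.0)}
```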

This feature has been used to build a statistical field theory of complex information dynamics, where the density matrix can be interpreted in terms of the superposition of stream operators whose action is to activate information flows among nodes.[13] The framework has been successfully applied to analyze the protein-protein interaction networks of virus-human interactomes, including that of SARS-CoV-2, to unravel the systemic features of the latter's infection at microscopic, mesoscopic, and macroscopic scales,[14] as well as to assess the importance of nodes for integrating information flows within the network and the role they play in network robustness.[15]

This approach has been generalized to deal with other types of dynamics, such as random walks, on top of multilayer networks, providing an effective way to reduce the dimensionality of such systems without altering their structure.[16] Using both classical and maximum-entropy random walks, the corresponding density matrices have been used to encode the network states of the human brain and to assess, at multiple scales, the connectome's information capacity at different stages of dementia.[17]

References

  1. ^ Levin, E.; Tishby, N.; Solla, S.A. (October 1990). "A statistical approach to learning and generalization in layered neural networks". Proceedings of the IEEE. 78 (10): 1568–1574. doi:10.1109/5.58339. ISSN 1558-2256. S2CID 5254307.
  2. ^ Bianconi, Ginestra (2008). "The entropy of randomized network ensembles". EPL (Europhysics Letters). 81 (2): 28005. arXiv:0708.0153. Bibcode:2008EL.....8128005B. doi:10.1209/0295-5075/81/28005. ISSN 0295-5075. S2CID 17269886.
  3. ^ Menichetti, Giulia; Remondini, Daniel (2014). "Entropy of a network ensemble: definitions and applications to genomic data". Theoretical Biology Forum. 107 (1–2): 77–87. ISSN 0035-6050. PMID 25936214.
  4. ^ Krawitz, Peter; Shmulevich, Ilya (27 September 2007). "Entropy of complex relevant components of Boolean networks". Physical Review E. 76 (3): 036115. arXiv:0708.1538. Bibcode:2007PhRvE..76c6115K. doi:10.1103/PhysRevE.76.036115. PMID 17930314. S2CID 6192682.
  5. ^ Bianconi, Ginestra (27 March 2009). "Entropy of network ensembles". Physical Review E. 79 (3): 036114. arXiv:0802.2888. Bibcode:2009PhRvE..79c6114B. doi:10.1103/PhysRevE.79.036114. PMID 19392025. S2CID 26082469.
  6. ^ a b c Anand, Kartik; Bianconi, Ginestra (13 October 2009). "Entropy measures for networks: Toward an information theory of complex topologies". Physical Review E. 80 (4): 045102. arXiv:0907.1514. Bibcode:2009PhRvE..80d5102A. doi:10.1103/PhysRevE.80.045102. PMID 19905379. S2CID 27419558.
  7. ^ Bogacz, Leszek; Burda, Zdzisław; Wacław, Bartłomiej (1 July 2006). "Homogeneous complex networks". Physica A: Statistical Mechanics and Its Applications. 366: 587–607. arXiv:cond-mat/0502124. Bibcode:2006PhyA..366..587B. doi:10.1016/j.physa.2005.10.024. ISSN 0378-4371. S2CID 119428248.
  8. ^ Du, Wenxue; Li, Xueliang; Li, Yiyang; Severini, Simone (30 December 2010). "A note on the von Neumann entropy of random graphs". Linear Algebra and Its Applications. 433 (11): 1722–1725. doi:10.1016/j.laa.2010.06.040. ISSN 0024-3795.
  9. ^ Anand, Kartik; Bianconi, Ginestra; Severini, Simone (18 March 2011). "Shannon and von Neumann entropy of random networks with heterogeneous expected degree". Physical Review E. 83 (3): 036109. arXiv:1011.1565. Bibcode:2011PhRvE..83c6109A. doi:10.1103/PhysRevE.83.036109. PMID 21517560. S2CID 1482301.
  10. ^ De Domenico, Manlio; Solé-Ribalta, Albert; Cozzo, Emanuele; Kivelä, Mikko; Moreno, Yamir; Porter, Mason A.; Gómez, Sergio; Arenas, Alex (4 December 2013). "Mathematical Formulation of Multilayer Networks". Physical Review X. 3 (4): 041022. arXiv:1307.4977. Bibcode:2013PhRvX...3d1022D. doi:10.1103/PhysRevX.3.041022. S2CID 16611157.
  11. ^ De Domenico, Manlio; Nicosia, Vincenzo; Arenas, Alex; Latora, Vito (23 April 2015). "Structural reducibility of multilayer networks" (PDF). Nature Communications. 6: 6864. Bibcode:2015NatCo...6.6864D. doi:10.1038/ncomms7864. PMID 25904309. S2CID 16776349.
  12. ^ De Domenico, Manlio; Biamonte, Jacob (21 December 2016). "Spectral Entropies as Information-Theoretic Tools for Complex Network Comparison". Physical Review X. 6 (4): 041062. arXiv:1609.01214. Bibcode:2016PhRvX...6d1062D. doi:10.1103/PhysRevX.6.041062. S2CID 51786781.
  13. ^ Ghavasieh, Arsham; Nicolini, Carlo; De Domenico, Manlio (10 November 2020). "Statistical physics of complex information dynamics". Physical Review E. 102 (5): 052304. arXiv:2010.04014. Bibcode:2020PhRvE.102e2304G. doi:10.1103/PhysRevE.102.052304. PMID 33327131. S2CID 222208856.
  14. ^ Ghavasieh, Arsham; Bontorin, Sebastiano; Artime, Oriol; Verstraete, Nina; De Domenico, Manlio (23 April 2021). "Multiscale statistical physics of the pan-viral interactome unravels the systemic nature of SARS-CoV-2 infections". Communications Physics. 4 (1): 83. Bibcode:2021CmPhy...4...83G. doi:10.1038/s42005-021-00582-8.
  15. ^ Ghavasieh, Arsham; Stella, Massimo; Biamonte, Jacob; De Domenico, Manlio (10 June 2021). "Unraveling the effects of multiscale network entanglement on empirical systems". Communications Physics. 4 (1): 129. arXiv:2008.05368. Bibcode:2021CmPhy...4..129G. doi:10.1038/s42005-021-00633-0. S2CID 221104066.
  16. ^ Ghavasieh, Arsham; De Domenico, Manlio (13 February 2020). "Enhancing transport properties in interconnected systems without altering their structure". Physical Review Research. 2 (1): 13–15. arXiv:2001.04450. Bibcode:2020PhRvR...2a3155G. doi:10.1103/PhysRevResearch.2.013155. S2CID 210165034.
  17. ^ Benigni, Barbara; Ghavasieh, Arsham; Corso, Alessandra; D'Andrea, Valeria; De Domenico, Manlio (22 June 2021). "Persistence of information flow: a multiscale characterization of human brain". Network Neuroscience. 5 (3): 831–850. doi:10.1162/netn_a_00203. PMC 8567833. PMID 34746629.