research-article
Public Access

An Operational Characterization of Mutual Information in Algorithmic Information Theory

Published: 19 September 2019

Abstract

We show that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings x and y is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties—one having x and the complexity profile of the pair and the other one having y and the complexity profile of the pair—can establish via a probabilistic protocol with interaction on a public channel. For ℓ > 2, the longest shared secret that can be established from a tuple of strings (x1, …, xℓ) by ℓ parties—each one having one component of the tuple and the complexity profile of the tuple—is equal, up to logarithmic precision, to the complexity of the tuple minus the minimum communication necessary for distributing the tuple to all parties. We establish the communication complexity of secret key agreement protocols that produce a secret key of maximal length for protocols with public randomness. We also show that if the communication complexity drops below the established threshold, then only very short secret keys can be obtained.
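The two characterizations in the abstract can be written schematically as follows. This is a paraphrase, not the paper's formal statement: C(·) denotes Kolmogorov complexity, n the length of the strings, S(·) is notation introduced here for the maximal achievable key length, and CO(·) abbreviates the minimum communication for distributing the whole tuple to all parties (the "communication for omniscience" of the information-theoretic literature).

```latex
% Two-party case: the longest shared secret key matches the
% algorithmic mutual information up to logarithmic precision.
S(x, y) = I(x : y) \pm O(\log n),
\qquad
I(x : y) \;\stackrel{\text{def}}{=}\; C(x) + C(y) - C(x, y).

% Multi-party case (\ell parties, party i holding x_i): the key length
% is the complexity of the tuple minus the minimum communication needed
% for every party to learn the entire tuple.
S(x_1, \dots, x_\ell)
  = C(x_1, \dots, x_\ell) - \mathrm{CO}(x_1, \dots, x_\ell) \pm O(\log n).
```

Note that for ℓ = 2 the second formula is consistent with the first, since C(x) − C(x | y) approximates the communication each party can save, which is how mutual information enters the two-party bound.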



• Published in

  Journal of the ACM, Volume 66, Issue 5
  Distributed Computing, Algorithms and Data Structures, Algorithms, Scientific Computing, Derandomizing Algorithms, Online Algorithms and Algorithmic Information Theory
  October 2019, 266 pages
  ISSN: 0004-5411
  EISSN: 1557-735X
  DOI: 10.1145/3350420

        Copyright © 2019 ACM


        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Publication History

        • Published: 19 September 2019
        • Accepted: 1 August 2019
        • Revised: 1 April 2019
        • Received: 1 August 2018


        Qualifiers

        • research-article
        • Research
        • Refereed
