An Operational Characterization of Mutual Information in Algorithmic Information Theory

Abstract
We show that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings x and y is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties can establish via a probabilistic protocol with interaction on a public channel, where one party holds x, the other holds y, and both know the complexity profile of the pair. For ℓ > 2, the longest shared secret that ℓ parties can establish from a tuple of strings (x1, …, xℓ), where each party holds one component of the tuple and the complexity profile of the tuple, is equal, up to logarithmic precision, to the complexity of the tuple minus the minimum communication necessary for distributing the tuple to all parties. We also determine the communication complexity of secret key agreement protocols with public randomness that produce a secret key of maximal length, and we show that if the communication drops below this threshold, then only very short secret keys can be obtained.
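In Kolmogorov-complexity notation, writing C(x) for the complexity of x and I(x:y) for the mutual information of a pair, the two characterizations above can be summarized schematically as follows. This is a sketch only: the symbol CO below is ad hoc notation for the minimum communication needed to distribute the whole tuple to all parties, S denotes the agreed secret key, and all equalities hold up to O(log n) additive terms for inputs of length n.

```latex
% Two parties: the longest secret key S agreed upon from x and y
% over a public channel has length equal to the mutual information:
C(S) \;=\; I(x:y) \;\pm\; O(\log n),
\qquad \text{where } I(x:y) = C(x) + C(y) - C(x,y).

% \ell > 2 parties, each holding one component of (x_1,\dots,x_\ell):
C(S) \;=\; C(x_1,\dots,x_\ell) \;-\; \mathrm{CO}(x_1,\dots,x_\ell) \;\pm\; O(\log n).
```

The second equation also explains the final claim of the abstract: a protocol whose total communication falls below CO(x1, …, xℓ) cannot reach the maximal key length, and in fact yields only very short keys.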