ABSTRACT
We study the role of the Kullback-Leibler (KL) divergence in the framework of anomaly detection, where its ability as a statistic underlying detection has never been investigated in depth. We give an in-principle analysis of network attack detection, showing explicitly how attacks may be masked at minimal cost through 'camouflage'. We illustrate our results on both synthetic distributions and distributions taken from real traffic.
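As a minimal sketch of the kind of KL-based detector the abstract refers to, the snippet below compares an observed traffic histogram against a baseline distribution and raises an alarm when the divergence exceeds a threshold. The feature bins, distributions, and threshold here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) between discrete distributions.

    A small epsilon is added before normalizing to avoid log(0) on empty bins.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Hypothetical baseline distribution over some traffic feature
# (e.g. destination-port bins), and an observed window to test.
baseline = [0.70, 0.20, 0.08, 0.02]
observed = [0.40, 0.20, 0.10, 0.30]  # mass shifted toward a normally rare bin

score = kl_divergence(observed, baseline)
alarm = score > 0.1  # detection threshold (illustrative choice)
```

An attacker performing the 'camouflage' discussed above would shape `observed` so that `score` stays below the threshold while still carrying the attack traffic.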