Open Access

Dynamic Malware Analysis in the Modern Era—A State of the Art Survey

Published: 13 September 2019

Abstract

Although malicious software (malware) has been around since the early days of computing, its sophistication and innovation have increased over the years. In particular, the latest crop of ransomware has drawn attention to the dangers malicious software poses to private users as well as corporations, public services (hospitals and transportation systems), governments, and security institutions. To protect these institutions and the public from malware attacks, malicious activity must be detected as early as possible, preferably before it carries out its harmful acts. However, it is not always easy to know what to look for, especially when dealing with new and unknown malware that has never been seen before. Analyzing a suspicious file using static or dynamic analysis methods can provide relevant and valuable information regarding the file's impact on the hosting system and help determine whether the file is malicious, based on the method's predefined rules. While malware writers can use various techniques (e.g., code obfuscation, dynamic code loading, encryption, and packing) to evade static analysis (including signature-based anti-virus tools), dynamic analysis is robust to these techniques, provides greater understanding of the analyzed file, and consequently can lead to better detection capabilities. Although dynamic analysis is more robust than static analysis, existing dynamic analysis tools and techniques are imperfect, and no single tool can cover all aspects of malware behavior. The most recent comprehensive survey in this area was published in 2012. Since then, the computing environment has changed dramatically, with new types of malware (ransomware, cryptominers), new analysis methods (volatile memory forensics, side-channel analysis), new computing environments (cloud computing, IoT devices), new machine-learning algorithms, and more.
The goal of this survey is to provide a comprehensive and up-to-date overview of existing methods used to dynamically analyze malware, including a description of each method, its strengths and weaknesses, and its resilience against malware evasion techniques. In addition, we include an overview of prominent studies that apply machine-learning methods to enhance dynamic malware analysis capabilities for detection, classification, and categorization.


      Reviews

      Massimiliano Masi

Malware is a problem. Its spread within industrial networks and critical infrastructures shows an ever-increasing need for cybersecurity expertise to detect, protect against, and react to infections. Unfortunately, analyzing how malware attacks a network can be an extremely complex task for a small team of cybersecurity specialists. Attackers can be highly motivated and, in the worst cases, can have virtually unlimited resources. Typically, malware is either statically or dynamically analyzed. However, malware writers can use "various techniques ... to evade static analysis," and "dynamic analysis tools ... are imperfect." As the article states, "there is no single tool that [can] cover all aspects of malware behavior." In this survey, the authors provide taxonomies of malware, of malware behavior, of how malware analysis can be performed, and of the techniques and tools available to perform it. The article concludes with a matrix summarizing malware behaviors and their correlations with layout and techniques. Such classifications can tremendously help malware analysts choose the best analysis strategy. Chief information security officers (CISOs) and security information and event management (SIEM) and security operations center (SOC) practitioners will benefit from reading this article, as it provides insight into the techniques of both malware authors and malware analysts.


• Published in

  ACM Computing Surveys, Volume 52, Issue 5
  September 2020, 791 pages
  ISSN: 0360-0300
  EISSN: 1557-7341
  DOI: 10.1145/3362097
  Editor: Sartaj Sahni

        Copyright © 2019 ACM


Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

• Published: 13 September 2019
• Accepted: 1 May 2019
• Revised: 1 April 2019
• Received: 1 November 2018
