SlowFuzz: Automated Domain-Independent Detection of Algorithmic Complexity Vulnerabilities

Title: SlowFuzz: Automated Domain-Independent Detection of Algorithmic Complexity Vulnerabilities
Publication Type: Conference Paper
Year of Publication: 2017
Authors: Petsios, Theofilos; Zhao, Jason; Keromytis, Angelos D.; Jana, Suman
Conference Name: Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security
Publisher: ACM
Conference Location: New York, NY, USA
ISBN Number: 978-1-4503-4946-8
Keywords: algorithmic complexity attacks, composability, DoS attacks, fuzzing, Metrics, pubcrawl, Resiliency, resource exhaustion attacks, sybil attacks
Abstract: Algorithmic complexity vulnerabilities occur when the worst-case time/space complexity of an application is significantly higher than the respective average case for particular user-controlled inputs. When such conditions are met, an attacker can launch Denial-of-Service attacks against a vulnerable application by providing inputs that trigger the worst-case behavior. Such attacks have been known to have serious effects on production systems, take down entire websites, or lead to bypasses of Web Application Firewalls. Unfortunately, existing detection mechanisms for algorithmic complexity vulnerabilities are domain-specific and often require significant manual effort. In this paper, we design, implement, and evaluate SlowFuzz, a domain-independent framework for automatically finding algorithmic complexity vulnerabilities. SlowFuzz automatically finds inputs that trigger worst-case algorithmic behavior in the tested binary. SlowFuzz uses resource-usage-guided evolutionary search techniques to automatically find inputs that maximize computational resource utilization for a given application. We demonstrate that SlowFuzz successfully generates inputs that match the theoretical worst-case performance for several well-known algorithms. SlowFuzz was also able to generate a large number of inputs that trigger different algorithmic complexity vulnerabilities in real-world applications, including various zip parsers used in antivirus software, regular expression libraries used in Web Application Firewalls, as well as hash table implementations used in Web applications. In particular, SlowFuzz generated inputs that achieve 300-times slowdown in the decompression routine of the bzip utility, discovered regular expressions that exhibit matching times exponential in the input size, and also managed to automatically produce inputs that trigger a high number of collisions in PHP's default hashtable implementation.
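The core idea the abstract describes — resource-usage-guided evolutionary search — can be illustrated with a minimal Python sketch. This is not the authors' implementation (SlowFuzz operates on instrumented binaries); the cost function, mutation strategy, and all names below are illustrative assumptions. Here a mutation-based hill climb keeps the input whose measured "resource usage" (comparison count of an insertion sort) is highest, steering toward the quadratic worst case:

```python
import random

def insertion_sort_cost(data):
    """Instrumented target: count comparisons insertion sort makes on data."""
    a = list(data)
    cost = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            cost += 1                      # one comparison "resource unit"
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return cost

def slowfuzz_like(cost_fn, length=32, rounds=2000, seed=0):
    """Toy resource-usage-guided search: mutate the current best input
    and keep a mutant only if it raises the measured resource usage."""
    rng = random.Random(seed)
    best = [rng.randrange(256) for _ in range(length)]
    best_cost = cost_fn(best)
    for _ in range(rounds):
        mutant = list(best)
        mutant[rng.randrange(length)] = rng.randrange(256)  # byte-replace mutation
        c = cost_fn(mutant)
        if c > best_cost:                  # selection guided by resource usage
            best, best_cost = mutant, c
    return best, best_cost
```

On an input of length n, insertion sort costs n-1 comparisons when the input is sorted and n(n-1)/2 when it is reverse-sorted; the search above drives a random input toward the latter without any knowledge of the target's internals, which is the domain-independence the paper claims. The real system additionally uses coverage feedback and richer mutation scheduling.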
URL: http://doi.acm.org/10.1145/3133956.3134073
DOI: 10.1145/3133956.3134073
Citation Key: petsios_slowfuzz:_2017