Predicting the Difficulty of Compromise through How Attackers Discover Vulnerabilities

PI(s), Co-PI(s), Researchers:

PI: Andrew Meneely; Co-PI: Laurie Williams; Researchers: Ben Meyers and Nasif Imtiaz

HARD PROBLEM(S) ADDRESSED
This project addresses the following Hard Problem, from the list released in November 2012:

  • Metrics

PUBLICATIONS
Papers resulting from this research during the current quarter:

  • Alexopoulos, Nikolaos; Mühlhäuser, Max; Meneely, Andrew, "Who are Vulnerability Reporters? A Large-scale Empirical Study on FOSS," 15th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM 2021), to appear, October 13, 2021. [preprint attached below]
  • Imtiaz, Nasif; Thorn, Seaver; Williams, Laurie, "A comparative study of vulnerability reporting by software composition analysis tools," 15th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM 2021), to appear, 2021. [to be uploaded next quarter]

KEY HIGHLIGHTS

  • We are building on a natural language classifier that mines apology statements from software repositories to systematically discover self-admitted mistakes. The classifier is being applied to a random sample of GitHub repositories, using language from commits, issues, and pull request conversations. Thus far, we have collected data from 17,491 repositories, drawn from the top 1,000 ranked repositories across 54 different programming languages. We are scaling up data collection and analyzing the apology results, and we are targeting a paper submission in the coming quarter. (A minimal illustrative sketch of the mining step appears after this list.)
  • We published a paper comparing the output of software composition analysis (SCA) tools, a tool category receiving increased attention due to the Executive Order on Cybersecurity's emphasis on the software supply chain. The paper is to appear in ESEM 2021. Our manual analysis of the tools' results suggests that the accuracy of the vulnerability database is a key differentiator among SCA tools. We recommend that practitioners not rely on any single tool at present, as doing so can result in missing known vulnerabilities. (The second sketch after this list illustrates this point.)
  • We conducted an empirical measurement study to help software practitioners and researchers understand how open source packages currently release security fixes. We find that security releases are typically fast and lightweight: the median release ships within 4 days of the corresponding fix and contains 402 changed lines of code. Furthermore, we find that 61.5% of the releases come with a release note documenting the security fix, while 6.4% of these releases also mention a breaking change. This study has been submitted to the USENIX conference.
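
As a rough illustration of the mining step in the first highlight, the sketch below flags apology-like phrases in commit messages from a local clone. This is a minimal sketch, not the project's actual classifier; the phrase list, regular expression, and repository path are illustrative assumptions.

    # Minimal sketch: flag apology-like statements in git commit messages.
    # NOT the project's classifier; the phrase list and repository path
    # are illustrative assumptions only.
    import re
    import subprocess

    # Hypothetical seed phrases that often signal self-admitted mistakes.
    APOLOGY_RE = re.compile(
        r"\bsorry\b|\bmy bad\b|\bapolog(?:y|ies|ize|ise)\b|\bmea culpa\b",
        re.IGNORECASE,
    )

    def commit_messages(repo_path):
        """Yield (hash, subject) pairs from a local git clone."""
        out = subprocess.run(
            ["git", "-C", repo_path, "log", "--pretty=format:%H%x09%s"],
            capture_output=True, text=True, check=True,
        ).stdout
        for line in out.splitlines():
            sha, _, msg = line.partition("\t")
            yield sha, msg

    def apology_commits(repo_path):
        """Return commits whose subject contains an apology-like phrase."""
        return [(sha, msg) for sha, msg in commit_messages(repo_path)
                if APOLOGY_RE.search(msg)]

    if __name__ == "__main__":
        # "./some-cloned-repo" is a hypothetical path to a local clone.
        for sha, msg in apology_commits("./some-cloned-repo"):
            print(sha[:8], msg)

A full pipeline would, as described above, also cover issue and pull request conversations and replace the keyword match with the trained classifier.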
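
The recommendation in the second highlight, not to rely on any single SCA tool, amounts to taking the union of findings across tools. The sketch below is again illustrative only: the report file names and JSON shape are assumptions, as real SCA tools each emit their own report formats.

    # Minimal sketch: union vulnerability findings across SCA tool reports.
    # File names and the JSON shape are illustrative assumptions.
    import json

    def load_findings(path):
        # Assume each report is a JSON list of {"package": ..., "cve": ...}.
        with open(path) as f:
            return {(v["package"], v["cve"]) for v in json.load(f)}

    reports = ["tool_a.json", "tool_b.json", "tool_c.json"]  # hypothetical
    per_tool = {p: load_findings(p) for p in reports}
    union = set().union(*per_tool.values())

    for path, found in per_tool.items():
        missed = union - found  # vulnerabilities flagged only by other tools
        print(f"{path}: reported {len(found)}, missed {len(missed)}")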

COMMUNITY ENGAGEMENT

  • Jacob Woolcutt has put the NCSU research team in touch with Stephen Magill at Sonatype to establish further collaboration.

EDUCATIONAL ADVANCES:

  • None.
ATTACHMENT

ESEM_90.pdf (PDF document, 757.06 KB)
