Predicting the Difficulty of Compromise through How Attackers Discover Vulnerabilities

PI(s), Co-PI(s), Researchers:

PI: Andrew Meneely; Co-PI: Laurie Williams; Researchers: Ben Meyers and Nasif Imtiaz

HARD PROBLEM(S) ADDRESSED
This refers to the Hard Problems released in November 2012.

  • Metrics

PUBLICATIONS
Papers written as a result of this research during the current quarter:

  • Alexopoulos, Nikolaos; Mühlhäuser, Max; Meneely, Andrew, "Who are Vulnerability Reporters? A Large-scale Empirical Study on FOSS," 15th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM 2021), to appear Oct. 13, 2021. [Preprint attached below.]
  • Imtiaz, Nasif; Thorn, Seaver; Williams, Laurie, "A comparative study of vulnerability reporting by software composition analysis tools," 15th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM 2021), to appear 2021. [To be uploaded next quarter.]
  • Bhattacharya, Saikath; Singh, Munindar P.; Williams, Laurie, "Software Security Readiness and Deployment," 2021 IEEE International Symposium on Software Reliability Engineering Workshops (ISSREW), to appear 2021.

KEY HIGHLIGHTS
The following highlights summarize specific results from this quarter, written for the general reader of IEEE S&P. They are intended to give our sponsors evidence that the funding they provide under the SoS lablet model is delivering results that more than justify the investment.

  • We are building on a natural language classifier that mines apology statements from software repositories to systematically discover self-admitted mistakes. The classifier is being applied to GitHub repositories, using language from commits, issues, and pull request conversations. Thus far, we have collected data from 17,491 repositories, drawn from the top 1,000 ranked repositories across 54 programming languages. We are scaling up data collection and analyzing initial apology results, and we are targeting a paper submission in the coming quarter. (A minimal sketch of the mining step appears after this list.)
  • Our paper "A comparative study of vulnerability reporting by software composition analysis tools" has been accepted at the 15th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM 2021).
  • We conducted an empirical study of security releases of open-source packages, focusing on the fix-to-release delay, code change size, and documentation of security releases, covering 4,377 advisories across seven open-source package ecosystems.

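The following is a minimal sketch of the apology-mining step described in the first highlight, assuming a simple keyword filter over commit, issue, and pull request text. The actual project uses a trained natural language classifier; the pattern and function names below are hypothetical illustrations only.

    import re

    # Hypothetical apology cues for illustration; the project's classifier is a
    # trained natural language model, not this keyword list.
    APOLOGY_PATTERN = re.compile(
        r"\b(sorry|apolog(?:y|ies|i[sz]e[ds]?)|my (?:bad|mistake|fault)|oops)\b",
        re.IGNORECASE,
    )

    def find_apology_candidates(messages):
        """Return (index, text) pairs for messages that match an apology cue.

        `messages` can be any iterable of strings, e.g. commit messages or
        issue / pull request comments collected from a repository.
        """
        return [(i, m) for i, m in enumerate(messages) if APOLOGY_PATTERN.search(m)]

    if __name__ == "__main__":
        sample = [
            "Fix off-by-one error in parser",
            "Sorry, the previous commit broke the build; reverting.",
            "My mistake: forgot to update the changelog.",
        ]
        for idx, msg in find_apology_candidates(sample):
            print(idx, msg)

In the full pipeline, candidate statements like these would be passed to the classifier for labeling rather than reported directly.
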
COMMUNITY ENGAGEMENT

EDUCATIONAL ADVANCES:

  • None.
Attachment: ESEM_90.pdf (PDF document, 757.06 KB)