TTP: Medium: Crowd Sourcing Annotations

Project Details

Performance Period

Oct 01, 2013 - Sep 30, 2017

Institution(s)

GRAMMATECH, INC.

Both sound software verification techniques and heuristic software flaw-finding tools benefit from the presence of software annotations that describe the behavior of software components. Function summaries (in the form of logical annotations) allow modular checking of software and more precise reasoning. However, such annotations are difficult to write and are not commonly produced by software developers, despite their benefits to static analysis. The Crowd Sourcing Annotations project will address this deficiency by encouraging the software community to generate annotations through crowd-sourcing. This effort will be supported by tools that generate, use, and translate the annotations; the results of annotation efforts will be shared through openly available repositories. We will also use pilot projects to demonstrate and encourage the use of annotations and static analysis. The project will leverage and interact with the Software Assurance Marketplace (SWAMP) project's collection of static analysis tools and example software. The technical challenges include developing uniform styles and languages for annotations, reliably validating crowd-sourced submissions, merging annotations with the corresponding source code, managing version control, and integrating with typical software development environments. The social challenges are equally important: designing and implementing a crowd-sourcing infrastructure in a way that motivates participation and delivers technical and social benefits to both the community and individual contributors.
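For concreteness, the following sketch shows what a function summary might look like as an ACSL-style contract (the specification language used by tools such as Frama-C). The choice of ACSL and the example function are illustrative assumptions, not an annotation language or code base prescribed by the project; the point is that the contract lets a modular analyzer reason about callers of the function without re-examining its body.

    #include <stddef.h>

    /* Illustrative ACSL-style function summary (assumption: ACSL syntax;
     * the project abstract does not name a specific annotation language).
     * The contract states the preconditions, side effects, and result
     * properties that a modular static analyzer can rely on at call sites. */

    /*@ requires n > 0;
      @ requires \valid_read(a + (0 .. n-1));
      @ assigns \nothing;
      @ ensures \forall integer i; 0 <= i < n ==> \result >= a[i];
      @ ensures \exists integer i; 0 <= i < n && \result == a[i];
      @*/
    int max_element(const int *a, size_t n)
    {
        int best = a[0];
        /*@ loop invariant 1 <= i <= n;
          @ loop invariant \forall integer j; 0 <= j < i ==> best >= a[j];
          @ loop invariant \exists integer j; 0 <= j < i && best == a[j];
          @ loop assigns i, best;
          @ loop variant n - i;
          @*/
        for (size_t i = 1; i < n; i++) {
            if (a[i] > best)
                best = a[i];
        }
        return best;
    }

Because the annotations live in comments, the file compiles as ordinary C; annotation-aware tools read the contract, while other tools and developers can ignore it.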