Biblio
SSL certificates are a core component of the public key infrastructure that underpins encrypted communication on the Internet. In this paper, we report the results of a longitudinal study of the characteristics of SSL certificate chains presented to clients during secure web (HTTPS) connection setup. Our data set consists of 23B SSL certificate chains collected over a period of 6 months from a global panel of over 2M residential client machines. The data informing our analyses provide perspective on the entire chain of trust, including root certificates, across a wide distribution of client machines. We identify over 35M unique certificate chains with diverse relationships at all levels of the PKI hierarchy. We report on the characteristics of valid certificates, which make up 99.7% of the total corpus. We also examine invalid certificate chains, finding that 93% of them contain an untrusted root certificate and that they have a shorter average chain length than their valid counterparts. Finally, we examine two unintended but prevalent behaviors in our data: the deprecation of root certificates and secure traffic interception. Our results corroborate aspects of prior scan-based studies of certificate characteristics but contradict other findings, highlighting the importance of the residential client-side perspective.
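As an illustration of the client-side validity classification described above, the following is a minimal sketch (not the study's measurement pipeline) that attempts a TLS handshake against a server and labels its certificate chain valid or invalid according to the local root store, using only Python's standard library. The badssl.com hostnames are public test endpoints included for demonstration.

```python
# Minimal sketch: classify a server's certificate chain as valid or
# invalid from the client's perspective, using the local trust store.
# This is illustrative only, not the paper's collection pipeline.
import socket
import ssl

def classify_certificate(host: str, port: int = 443) -> dict:
    """Attempt a TLS handshake and report basic certificate metadata."""
    ctx = ssl.create_default_context()  # verifies against the local root store
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                cert = tls.getpeercert()  # parsed leaf certificate
                return {
                    "host": host,
                    "valid": True,
                    "issuer": dict(rdn[0] for rdn in cert["issuer"]),
                    "not_after": cert["notAfter"],
                }
    except ssl.SSLCertVerificationError as err:
        # e.g., untrusted root, expired certificate, hostname mismatch
        return {"host": host, "valid": False, "reason": err.verify_message}

if __name__ == "__main__":
    for site in ("example.com", "expired.badssl.com", "untrusted-root.badssl.com"):
        print(classify_certificate(site))
```

Note that a failed verification here surfaces only the client-visible error reason; attributing the failure to an untrusted root versus, say, an expired intermediate requires inspecting the full presented chain, as the study does.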
Empirical research on the Internet is fraught with challenges. Among these is the possibility that local environmental conditions (e.g., CPU load or network load) introduce unexpected bias or artifacts in measurements that lead to erroneous conclusions. In this paper, we describe a framework for local environment monitoring that is designed to be used during Internet measurement experiments. The goals of our work are to provide a critical, expanded perspective on measurement results and to improve the reproducibility of those results. We instantiate our framework in a tool we call SoMeta, which monitors the local environment during active probe-based measurement experiments. We evaluate the runtime costs of SoMeta and conduct a series of experiments in which we intentionally perturb different aspects of the local environment during active probe-based measurements. Our experiments show how simple local monitoring can readily expose conditions that bias active probe-based measurement results. We conclude with a discussion of how our framework can be expanded to provide metadata for a broad range of Internet measurement experiments.
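To make the idea concrete, here is a minimal sketch in the spirit of SoMeta, not its actual implementation: a background thread samples local CPU utilization and network byte counters at a fixed interval while a measurement runs, so the samples can be attached to the results as metadata. It assumes the third-party psutil package is available.

```python
# Minimal sketch of local environment monitoring during a measurement
# (in the spirit of SoMeta; not its implementation). Requires psutil.
import threading
import time

import psutil

class EnvironmentMonitor:
    """Record CPU utilization and network byte counters at a fixed interval."""

    def __init__(self, interval: float = 1.0):
        self.interval = interval
        self.samples = []
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def _run(self):
        while not self._stop.is_set():
            net = psutil.net_io_counters()
            self.samples.append({
                "ts": time.time(),
                "cpu_percent": psutil.cpu_percent(interval=None),
                "bytes_sent": net.bytes_sent,
                "bytes_recv": net.bytes_recv,
            })
            self._stop.wait(self.interval)

    def __enter__(self):
        self._thread.start()
        return self

    def __exit__(self, *exc):
        self._stop.set()
        self._thread.join()

# Usage: wrap any measurement so environment metadata is captured alongside it.
with EnvironmentMonitor(interval=0.5) as mon:
    time.sleep(3)  # stand-in for an active probe-based measurement
print(f"collected {len(mon.samples)} environment samples")
```

The samples can then be compared against the measurement timeline to flag intervals where, for example, a CPU spike or competing network traffic may have biased probe timings.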
Cyber situational awareness (SA) is described as the current and predictive knowledge of cyberspace in relation to the Network, Missions and Threats across friendly, neutral and adversary forces. While this model provides a good high-level understanding of Cyber SA, it does not contain actionable information to help inform the development of capabilities to improve SA. In this paper, we present a systematic, human-centered process that uses a card sort methodology to understand and conceptualize Senior Leader Cyber SA requirements. From the data collected, we were able to build a hierarchy of high- and low-priority Cyber SA information, as well as uncover items that exhibit high levels of disagreement within and across organizations. The findings of this study serve as a first step in developing a better understanding of what Cyber SA means to Senior Leaders, and can inform the development of future capabilities to improve their SA and Mission Performance.
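For illustration only, the following sketch shows one assumed way (not the paper's actual analysis) to quantify per-item disagreement from card sort data: an item's disagreement is the fraction of participants whose priority placement differs from the majority. The item names and placements are hypothetical.

```python
# Hypothetical card-sort analysis sketch: score each item by how far
# participants' priority placements diverge from the majority choice.
from collections import Counter

# Hypothetical data: item -> list of per-participant placements.
sorts = {
    "network topology":   ["high", "high", "high", "low"],
    "threat intel feeds": ["high", "low", "high", "low"],
    "patch status":       ["low", "low", "low", "low"],
}

for item, placements in sorts.items():
    counts = Counter(placements)
    majority, votes = counts.most_common(1)[0]
    disagreement = 1 - votes / len(placements)  # 0.0 = full agreement
    print(f"{item:20s} majority={majority:4s} disagreement={disagreement:.2f}")
```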