Biblio

2017-03-27
Eberly, Wayne.  2016.  Selecting Algorithms for Black Box Matrices: Checking For Matrix Properties That Can Simplify Computations. Proceedings of the ACM on International Symposium on Symbolic and Algebraic Computation. :207–214.

Processes to automate the selection of appropriate algorithms for various matrix computations are described. In particular, processes to check for, and certify, various matrix properties of black-box matrices are presented. These include sparsity patterns and structural properties that allow "superfast" algorithms to be used in place of black-box algorithms. Matrix properties that hold generically, and allow the use of matrix preconditioning to be reduced or eliminated, can also be checked for and certified, notably including in the small-field case, where this presently has the greatest impact on the efficiency of the computation.

Batselier, Kim, Chen, Zhongming, Liu, Haotian, Wong, Ngai.  2016.  A Tensor-based Volterra Series Black-box Nonlinear System Identification and Simulation Framework. Proceedings of the 35th International Conference on Computer-Aided Design. :17:1–17:7.

Tensors are a multi-linear generalization of matrices to their d-way counterparts, and are receiving intense interest recently due to their natural representation of high-dimensional data and the availability of fast tensor decomposition algorithms. Given the input-output data of a nonlinear system/circuit, this paper presents a nonlinear model identification and simulation framework built on top of Volterra series and its seamless integration with tensor arithmetic. By exploiting partially-symmetric polyadic decompositions of sparse Toeplitz tensors, the proposed framework permits a pleasantly scalable way to incorporate high-order Volterra kernels. Such an approach largely eludes the curse of dimensionality and allows computationally fast modeling and simulation beyond weakly nonlinear systems. The black-box nature of the model also hides structural information of the system/circuit and encapsulates it in terms of compact tensors. Numerical examples are given to verify the efficacy, efficiency and generality of this tensor-based modeling and simulation framework.
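
For context, the Volterra series referred to here expands a nonlinear system's output as a sum of multi-dimensional convolutions of the input with Volterra kernels; the d-th order kernel is naturally a d-way tensor, which is what makes tensor decompositions attractive. A standard truncated discrete-time form (generic notation, with maximum order D and memory length M, not taken from the paper) is

y(t) = h_0 + \sum_{d=1}^{D} \sum_{k_1=0}^{M-1} \cdots \sum_{k_d=0}^{M-1} h_d(k_1, \dots, k_d) \, u(t-k_1) \cdots u(t-k_d)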

Doerr, Carola, Lengler, Johannes.  2016.  The (1+1) Elitist Black-Box Complexity of LeadingOnes. Proceedings of the Genetic and Evolutionary Computation Conference 2016. :1131–1138.

One important goal of black-box complexity theory is the development of complexity models that allow deriving meaningful lower bounds for whole classes of randomized search heuristics. Complementing classical runtime analysis, black-box models help us understand how algorithmic choices such as the population size, the variation operators, or the selection rules influence the optimization time. One example of such a result is the Ω(n log n) lower bound for unary unbiased algorithms on functions with a unique global optimum [Lehre/Witt, GECCO 2010], which tells us that higher-arity operators or biased sampling strategies are needed when trying to beat this bound. For lack of analysis techniques, almost no non-trivial bounds are known for other restricted models. Proving such bounds therefore remains one of the main challenges in black-box complexity theory. With this paper we contribute to the technical toolbox for lower-bound computations by proposing a new type of information-theoretic argument. We regard the permutation- and bit-invariant version of LeadingOnes and prove that its (1+1) elitist black-box complexity is Ω(n^2), a bound that is matched by (1+1)-type evolutionary algorithms. The (1+1) elitist complexity of LeadingOnes is thus considerably larger than its unrestricted one, which is known to be of order n log log n [Afshani et al., 2013].
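
For reference, the LeadingOnes function whose permutation- and bit-invariant closure is analyzed here is the standard one counting the length of the longest prefix of ones,

LO(x) = \max\{\, i \in \{0, 1, \dots, n\} : x_1 = x_2 = \cdots = x_i = 1 \,\},

and the Ω(n^2) elitist bound is matched by classic (1+1)-type evolutionary algorithms, whose expected runtime on LeadingOnes is Θ(n^2).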

Bagnères, Lénaïc, Zinenko, Oleksandr, Huot, Stéphane, Bastoul, Cédric.  2016.  Opening Polyhedral Compiler's Black Box. Proceedings of the 2016 International Symposium on Code Generation and Optimization. :128–138.

While compilers offer a fair trade-off between productivity and executable performance in single-threaded execution, their optimizations remain fragile when addressing compute-intensive code for parallel architectures with deep memory hierarchies. Moreover, these optimizations operate as black boxes, impenetrable for the user, leaving them with no alternative to time-consuming and error-prone manual optimization in cases where an imprecise cost model or a weak analysis resulted in a bad optimization decision. To address this issue, we propose a technique that automatically translates an arbitrary polyhedral optimization, as used internally by the loop-level optimization frameworks of several modern compilers, into a sequence of comprehensible syntactic transformations, as long as the optimization focuses on scheduling loop iterations. This approach opens the black box of polyhedral frameworks, enabling users to examine, refine, replay and even design complex optimizations semi-automatically in partnership with the compiler.

Natanzon, Assaf, Winokur, Alex, Bachmat, Eitan.  2016.  Black Box Replication: Breaking the Latency Limits. Proceedings of the 9th ACM International on Systems and Storage Conference. :9:1–9:9.

Synchronous replication is critical for today's enterprise IT organization. It is mandated by regulation in several countries for some types of organizations, including banks and insurance companies. The technology has been available for a long time, but due to the speed of light and maximum-latency limitations, it is usually limited to a distance of 50-100 miles. Flight data recorders, also known as black boxes, have long been used to record the last actions that happened in airplanes at times of disasters. We present an integration between an Enterprise Data Recorder and an asynchronous replication mechanism, which allows breaking the functional limits that light speed imposes on synchronous replication.

Doerr, Benjamin, Doerr, Carola, Yang, Jing.  2016.  Optimal Parameter Choices via Precise Black-Box Analysis. Proceedings of the Genetic and Evolutionary Computation Conference 2016. :1123–1130.

In classical runtime analysis it has been observed that certain working principles of an evolutionary algorithm cannot be understood by only looking at the asymptotic order of the runtime, but that more precise estimates are needed. In this work we demonstrate that the same observation applies to black-box complexity analysis. We prove that the unary unbiased black-box complexity of the classic OneMax function class is n ln(n) – cn ± o(n) for a constant c between 0.2539 and 0.2665. Our analysis yields a simple (1+1)-type algorithm achieving this runtime bound via a fitness-dependent mutation strength. When translated into a fixed-budget perspective, our algorithm with the same budget computes a solution that asymptotically is 13% closer to the optimum (given that the budget is at least 0.2675n).
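
A minimal sketch of a (1+1)-type algorithm on OneMax with a fitness-dependent mutation strength, in the spirit of the algorithm described above. The mutation_strength schedule below is a placeholder assumption; the provably optimal fitness-dependent strength is derived in the paper and is not reproduced here.

import random

def onemax(x):
    # Fitness of a bit string: the number of one-bits.
    return sum(x)

def flip_distinct_bits(x, r):
    # Unary unbiased variation: flip exactly r distinct, uniformly chosen positions.
    y = list(x)
    for i in random.sample(range(len(x)), r):
        y[i] ^= 1
    return y

def mutation_strength(f, n):
    # Placeholder schedule (assumption): flip many bits while the fitness f is low,
    # a single bit when close to the optimum. The paper derives the actual
    # fitness-dependent strength achieving the n ln(n) - cn bound.
    return max(1, n // (2 * (f + 1)))

def one_plus_one_onemax(n, seed=0):
    random.seed(seed)
    x = [random.randint(0, 1) for _ in range(n)]
    fx, evaluations = onemax(x), 1
    while fx < n:
        y = flip_distinct_bits(x, min(n, mutation_strength(fx, n)))
        fy, evaluations = onemax(y), evaluations + 1
        if fy >= fx:  # elitist (1+1) selection
            x, fx = y, fy
    return evaluations

print(one_plus_one_onemax(100))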

Buzdalov, Maxim.  2016.  An Algorithm for Computing Lower Bounds for Unrestricted Black-Box Complexities. Proceedings of the 2016 on Genetic and Evolutionary Computation Conference Companion. :147–148.

Finding and proving lower bounds on black-box complexities is one of the hardest problems in the theory of randomized search heuristics. Until recently, there were no general ways of doing this, except for information-theoretic arguments similar to that of Droste, Jansen and Wegener. In a recent paper by Buzdalov, Kever and Doerr, a theorem is proven which may yield tighter bounds on unrestricted black-box complexity using certain problem-specific information. To use this theorem, one should split the search process into a finite number of states, describe transitions between states, and for each state specify (and prove) the maximum number of different answers to any query. We augment these constraints with one more kind of state constraint, namely the maximum number of different currently possible optima. An algorithm is presented for computing lower bounds based on these constraints. We also empirically show improved lower bounds on the black-box complexity of OneMax and Mastermind.

2017-03-20
Hahn, Florian, Kerschbaum, Florian.  2016.  Poly-Logarithmic Range Queries on Encrypted Data with Small Leakage. Proceedings of the 2016 ACM on Cloud Computing Security Workshop. :23–34.

Privacy-preserving range queries allow encrypting data while still enabling queries on ciphertexts if their corresponding plaintexts fall within a requested range. This gives a data owner the possibility to outsource data collections to a cloud service provider without sacrificing privacy or losing the ability to filter the data. However, existing methods for range queries either leak additional information (like the ordering of the complete data set) or slow down the search process tremendously by requiring a query against each ciphertext in the data collection. We present a novel scheme that only leaks the access pattern while supporting amortized poly-logarithmic search time. Our construction is based on the novel idea of enabling the cloud service provider to compare requested range queries. By doing so, the cloud service provider can use the access pattern to speed up search time for range queries in the future. On the one hand, values that have fallen within a queried range are stored in an interactively built index for future requests. On the other hand, values that have not been queried do not leak any information to the cloud service provider and stay perfectly secure. To show its practicability, we have implemented our scheme and give a detailed runtime evaluation.

Karbab, ElMouatez Billah, Debbabi, Mourad, Derhab, Abdelouahid, Mouheb, Djedjiga.  2016.  Cypider: Building Community-based Cyber-defense Infrastructure for Android Malware Detection. Proceedings of the 32nd Annual Conference on Computer Security Applications. :348–362.

The popularity of the Android OS has dramatically increased the number of malware apps targeting this mobile OS. The daily volume of malware has overwhelmed the detection process. This fact has motivated the need for malware detection and family attribution solutions that require the least possible manual intervention. In response, we propose the Cypider framework, a set of techniques and tools that aim to perform systematic detection of mobile malware by building an efficient and scalable similarity-network infrastructure of malicious apps. Our detection method is based on a novel concept, the malicious community, in which we consider, for a given family, the instances that share common features. Under this concept, we assume that multiple similar Android apps with different authors are most likely to be malicious. Cypider leverages this assumption for the detection of variants of known malware families and zero-day malware. It is important to mention that Cypider does not rely on signature-based or learning-based patterns. Instead, it applies community-detection algorithms to the similarity network, which extract sub-graphs considered suspicious and most likely malicious communities. Furthermore, we propose a novel fingerprinting technique, the community fingerprint, based on a learning model for each malicious community. Cypider shows excellent results, detecting about 50% of the malware dataset in one detection iteration. Moreover, the preliminary results of the community fingerprint are promising, as we achieved an 87% detection rate.
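
As a rough illustration of the similarity-network idea (not Cypider's actual feature extraction, similarity measure, or community-detection algorithm), the sketch below builds a thresholded Jaccard-similarity graph over hypothetical per-app static feature sets and flags densely connected groups as candidate malicious communities using networkx label propagation.

import networkx as nx
from networkx.algorithms import community

def jaccard(a, b):
    # Similarity between two apps' feature sets (e.g., permissions, API calls).
    return len(a & b) / len(a | b) if a | b else 0.0

def build_similarity_network(app_features, threshold=0.6):
    # app_features: dict app_id -> set of static features (placeholder input).
    g = nx.Graph()
    g.add_nodes_from(app_features)
    apps = list(app_features)
    for i, u in enumerate(apps):
        for v in apps[i + 1:]:
            s = jaccard(app_features[u], app_features[v])
            if s >= threshold:
                g.add_edge(u, v, weight=s)
    return g

def suspicious_communities(g, min_size=2):
    # Densely connected groups of mutually similar apps are flagged as
    # candidate malicious communities (variants of one family).
    comms = community.label_propagation_communities(g)
    return [c for c in comms if len(c) >= min_size]

apps = {
    "app1": {"SEND_SMS", "READ_CONTACTS", "api:sendTextMessage"},
    "app2": {"SEND_SMS", "READ_CONTACTS", "api:sendTextMessage", "INTERNET"},
    "app3": {"CAMERA", "INTERNET"},
}
for c in suspicious_communities(build_similarity_network(apps)):
    print(sorted(c))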

Chakraborty, Supriyo, Tripp, Omer.  2016.  Eavesdropping and Obfuscation Techniques for Smartphones. Proceedings of the International Conference on Mobile Software Engineering and Systems. :291–292.

Mobile apps often collect and share personal data with untrustworthy third-party apps, which may lead to data misuse and privacy violations. Most of the collected data originates from sensors built into the mobile device, where some of the sensors are treated as sensitive by the mobile platform while others permit unconditional access. Examples of privacy-prone sensors are the microphone, camera and GPS system. Access to these sensors is always mediated by protected function calls. On the other hand, the light sensor, accelerometer and gyroscope are considered innocuous. All apps have unrestricted access to their data. Unfortunately, this gap is not always justified. State-of-the-art privacy mechanisms on Android provide inadequate access control and do not address the vulnerabilities that arise due to unmediated access to so-called innocuous sensors on smartphones. We have developed techniques to demonstrate these threats. As part of our demonstration, we illustrate possible attacks using the innocuous sensors on the phone. As a solution, we present ipShield, a framework that provides users with greater control over their resources at runtime so as to protect against such attacks. We have implemented ipShield by modifying the AOSP.

Lara-Nino, Carlos Andres, Morales-Sandoval, Miguel, Diaz-Perez, Arturo.  2016.  An evaluation of AES and PRESENT ciphers for lightweight cryptography on smartphones. :87–93.

In this work we present a study that evaluates and compares two block ciphers, AES and PRESENT, in the context of lightweight cryptography for smartphone security applications. To the best of our knowledge, this is the first comparison between these ciphers using a smartphone as the computing platform. AES is the standard for symmetric encryption and PRESENT is one of the first ultra-lightweight ciphers proposed in the literature and included in ISO/IEC 29192-2. In our study, we consider execution time, voltage consumption and memory usage as comparison metrics. The two block ciphers were evaluated through several experiments on a low-cost smartphone using Android's built-in tools. From the results we conclude that, for general-purpose encryption, AES performs statistically better, although block-by-block PRESENT delivers better results.

Vazirian, Samane, Zahedi, Morteza.  2016.  A modified language modeling method for authorship attribution. :32–37.

This paper presents an approach to the closed-class authorship attribution (AA) problem. It is based on language modeling for classification and is called modified language modeling. Modified language modeling aims to solve the AA problem by combining both bigram word weighting and unigram word weighting. It makes the relation between an unseen text and the training documents clearer by giving extra reward to training documents that include a bigram word as well as its unigram words. Moreover, the IDF value multiplied by the related word probability is used instead of removing the stop words provided by a stop-word list. We evaluate experimental results for four approaches (unigram, bigram, trigram, and modified language modeling) using two Persian poem corpora, the WMPR-AA2016-A and WMPR-AA2016-B datasets. Results show that modified language modeling attributes authors better than the other approaches. The results on WMPR-AA2016-B, the bigger dataset, are much better than on the other dataset for all approaches. This may indicate that, if adequate training data is provided, modified language modeling can be a good solution to the AA problem.
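
A rough sketch of the scoring idea described above: an unseen text is scored against each candidate author by mixing unigram and bigram log-probabilities, with each unigram term weighted by its IDF rather than filtered through a stop-word list. The mixing weight, smoothing, and reward scheme here are illustrative assumptions, not the paper's exact formulation.

import math
from collections import Counter

def bigrams(tokens):
    return list(zip(tokens, tokens[1:]))

def idf(word, docs):
    # Inverse document frequency over all training documents.
    df = sum(1 for d in docs if word in d)
    return math.log(len(docs) / (1.0 + df)) + 1.0

def score_author(text, author_docs, all_docs, lam=0.5, eps=1e-6):
    # Mix unigram and bigram log-probabilities; weight each unigram by its IDF
    # instead of removing stop words (illustrative weighting only).
    uni = Counter(w for d in author_docs for w in d)
    big = Counter(b for d in author_docs for b in bigrams(d))
    n_uni = sum(uni.values()) or 1
    n_big = sum(big.values()) or 1
    score = 0.0
    for w in text:
        p = uni[w] / n_uni if uni[w] else eps
        score += idf(w, all_docs) * math.log(p)
    for b in bigrams(text):
        p = big[b] / n_big if big[b] else eps
        score += lam * math.log(p)
    return score

def attribute(text, training):
    # training: dict author -> list of tokenized documents; returns the best-scoring author.
    all_docs = [d for docs in training.values() for d in docs]
    return max(training, key=lambda a: score_author(text, training[a], all_docs))

training = {
    "poet_a": [["rose", "garden", "rose", "moon"], ["moon", "garden"]],
    "poet_b": [["storm", "sea", "storm"], ["sea", "wind", "storm"]],
}
print(attribute(["garden", "moon", "rose"], training))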

Dormann, Will.  2016.  Google Authentication Risks on iOS. Proceedings of the 1st International Workshop on Mobile Development. :3–5.

The Google Identity Platform is a system that allows a user to sign in to applications and other services by using a Google account. Google Sign-In is one such method for providing one’s identity to the Google Identity Platform. Google Sign-In is available for Android applications and iOS applications, as well as for websites and other devices. Users of Google Sign-In find that it integrates well with the Android platform, but iOS users (iPhone, iPad, etc.) do not have the same experience. The user experience when logging in to a Google account on an iOS application can not only be more tedious than the Android experience, but it also conditions users to engage in behaviors that put the information in their Google accounts at risk.

Asharov, Gilad, Naor, Moni, Segev, Gil, Shahaf, Ido.  2016.  Searchable Symmetric Encryption: Optimal Locality in Linear Space via Two-dimensional Balanced Allocations. Proceedings of the Forty-eighth Annual ACM Symposium on Theory of Computing. :1101–1114.

Searchable symmetric encryption (SSE) enables a client to store a database on an untrusted server while supporting keyword search in a secure manner. Despite the rapidly increasing interest in SSE technology, experiments indicate that the performance of the known schemes scales badly to large databases. Somewhat surprisingly, this is not due to their usage of cryptographic tools, but rather due to their poor locality (where locality is defined as the number of non-contiguous memory locations the server accesses with each query). The only known schemes that do not suffer from poor locality suffer either from an impractical space overhead or from an impractical read efficiency (where read efficiency is defined as the ratio between the number of bits the server reads with each query and the actual size of the answer). We construct the first SSE schemes that simultaneously enjoy optimal locality, optimal space overhead, and nearly-optimal read efficiency. Specifically, for a database of size N, under the modest assumption that no keyword appears in more than N^{1 − 1/log log N} documents, we construct a scheme with read efficiency Õ(log log N). This essentially matches the lower bound of Cash and Tessaro (EUROCRYPT ’14) showing that any SSE scheme must be sub-optimal in either its locality, its space overhead, or its read efficiency. In addition, even without making any assumptions on the structure of the database, we construct a scheme with read efficiency Õ(log N). Our schemes are obtained via a two-dimensional generalization of the classic balanced allocations (“balls and bins”) problem that we put forward. We construct nearly-optimal two-dimensional balanced allocation schemes, and then combine their algorithmic structure with subtle cryptographic techniques.

Swami, Shivam, Rakshit, Joydeep, Mohanram, Kartik.  2016.  SECRET: Smartly EnCRypted Energy Efficient Non-volatile Memories. Proceedings of the 53rd Annual Design Automation Conference. :166:1–166:6.

Data persistence in emerging non-volatile memories (NVMs) poses a multitude of security vulnerabilities, motivating main memory encryption for data security. However, practical encryption algorithms demonstrate strong diffusion characteristics that increase cell flips, resulting in increased write energy/latency and reduced lifetime of NVMs. State-of-the-art security solutions have focused on reducing the encryption penalty (increased write energy/latency and reduced memory lifetime) in single-level cell (SLC) NVMs; however, the realization of low encryption penalty solutions for multi-/triple-level cell (MLC/TLC) secure NVMs remains an open area of research. This work synergistically integrates zero-based partial writes with XOR-based energy masking to realize Smartly EnCRypted Energy efficienT, i.e., SECRET MLC/TLC NVMs, without compromising the security of the underlying encryption technique. Our simulations on an MLC (TLC) resistive RAM (RRAM) architecture across SPEC CPU2006 workloads demonstrate that for 6.25% (7.84%) memory overhead, SECRET reduces write energy by 80% (63%), latency by 37% (49%), and improves memory lifetime by 63% (56%) over conventional advanced encryption standard-based (AES-based) counter mode encryption.
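
A toy illustration of the XOR-based energy-masking component only (the zero-based partial writes, the counter-mode encryption machinery, and real per-word mask selection are omitted): before overwriting a stored line, pick from a small candidate set the XOR mask that minimizes the number of cell flips, and record which mask was used so the line can be unmasked on read. The mask candidates and 64-bit line width are assumptions for illustration.

def bit_flips(old, new):
    # Number of cell flips needed to overwrite 'old' with 'new' (Hamming distance).
    return bin(old ^ new).count("1")

def mask_line(stored, ciphertext, width=64):
    # Pick the XOR mask (from a small candidate set) that minimizes flips
    # against the currently stored cells; the mask index is kept as metadata.
    candidates = [0x0, (1 << width) - 1, 0x5555_5555_5555_5555, 0xAAAA_AAAA_AAAA_AAAA]
    best = min(range(len(candidates)),
               key=lambda i: bit_flips(stored, ciphertext ^ candidates[i]))
    return ciphertext ^ candidates[best], best

stored = 0xFFFF_0000_FFFF_0000
ciphertext = 0x1234_5678_9ABC_DEF0
masked, idx = mask_line(stored, ciphertext)
print(hex(masked), idx, bit_flips(stored, masked), "<=", bit_flips(stored, ciphertext))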

Pouliot, David, Wright, Charles V..  2016.  The Shadow Nemesis: Inference Attacks on Efficiently Deployable, Efficiently Searchable Encryption. Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security. :1341–1352.

Encrypting Internet communications has been the subject of renewed focus in recent years. In order to add end-to-end encryption to legacy applications without losing the convenience of full-text search, ShadowCrypt and Mimesis Aegis use a new cryptographic technique called "efficiently deployable efficiently searchable encryption" (EDESE) that allows a standard full-text search system to perform searches on encrypted data. Compared to other recent techniques for searching on encrypted data, EDESE schemes leak a great deal of statistical information about the encrypted messages and the keywords they contain. Until now, the practical impact of this leakage has been difficult to quantify. In this paper, we show that the adversary's task of matching plaintext keywords to the opaque cryptographic identifiers used in EDESE can be reduced to the well-known combinatorial optimization problem of weighted graph matching (WGM). Using real email and chat data, we show how off-the-shelf WGM solvers can be used to accurately and efficiently recover hundreds of the most common plaintext keywords from a set of EDESE-encrypted messages. We show how to recover the tags from Bloom filters so that the WGM solver can be used with the set of encrypted messages that utilizes a Bloom filter to encode its search tags. We also show that the attack can be mitigated by carefully configuring Bloom filter parameters.

Deshotels, Luke, Deaconescu, Razvan, Chiroiu, Mihai, Davi, Lucas, Enck, William, Sadeghi, Ahmad-Reza.  2016.  SandScout: Automatic Detection of Flaws in iOS Sandbox Profiles. Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security. :704–716.

Recent literature on iOS security has focused on the malicious potential of third-party applications, demonstrating how developers can bypass application vetting and code-level protections. In addition to these protections, iOS uses a generic sandbox profile called "container" to confine malicious or exploited third-party applications. In this paper, we present the first systematic analysis of the iOS container sandbox profile. We propose the SandScout framework to extract, decompile, formally model, and analyze iOS sandbox profiles as logic-based programs. We use our Prolog-based queries to evaluate file-based security properties of the container sandbox profile for iOS 9.0.2 and discover seven classes of exploitable vulnerabilities. These attacks affect non-jailbroken devices running later versions of iOS. We are working with Apple to resolve these attacks, and we expect that SandScout will play a significant role in the development of sandbox profiles for future versions of iOS.

Bellare, Mihir, Hoang, Viet Tung, Tessaro, Stefano.  2016.  Message-Recovery Attacks on Feistel-Based Format Preserving Encryption. Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security. :444–455.

We give attacks on Feistel-based format-preserving encryption (FPE) schemes that succeed in message recovery (not merely distinguishing scheme outputs from random) when the message space is small. For 4-bit messages, the attacks fully recover the target message using 2^21 examples for the FF3 NIST standard and 2^25 examples for the FF1 NIST standard. The examples include only three messages per tweak, which is what makes the attacks non-trivial even though the total number of examples exceeds the size of the domain. The attacks are rigorously analyzed in a new definitional framework of message-recovery security. The attacks are easily put out of reach by increasing the number of Feistel rounds in the standards.

Jo, Je-Gyeong, Ryou, Jae-cheol.  2016.  HTML and PDF Fuzzing Methodology in iOS. Proceedings of the 10th International Conference on Ubiquitous Information Management and Communication. :8:1–8:5.

iOS is a well-known operating system with a reputation for strong security. However, several attack methods against iOS have recently been published, including the "Masque Attack", "Null Dereference", and the Italian Hacking Team's RCS. iOS therefore cannot simply be assumed to be secure and safe. In addition, many security researchers have difficulty analyzing iOS because its closed source makes it hard to debug. We therefore propose a new security testing method for iOS. First, we fuzz iOS's web browser, MobileSafari, which can render HTML, PDF, mp4, and other formats. We test malformed HTML and PDF documents using our fuzzing method. We hope that our research can contribute to iOS's security and safety.
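
As an illustration of the kind of input generation such a methodology relies on (not the authors' actual tooling), the sketch below produces malformed variants of a well-formed seed document by random byte mutation; each variant would then be loaded in MobileSafari and any crashes triaged. The seed.html path is a placeholder.

import pathlib
import random

def mutate(data, n_mutations=20):
    # Simple byte-level mutation: overwrite random bytes in a seed document.
    buf = bytearray(data)
    for _ in range(n_mutations):
        pos = random.randrange(len(buf))
        buf[pos] = random.randrange(256)
    return bytes(buf)

def make_test_cases(seed_path, out_dir, count=100):
    # Generate malformed variants of a well-formed HTML (or PDF) seed file.
    seed = pathlib.Path(seed_path).read_bytes()
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for i in range(count):
        (out / f"case_{i:04d}.html").write_bytes(mutate(seed))

make_test_cases("seed.html", "fuzz_cases", count=10)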

Orikogbo, Damilola, Büchler, Matthias, Egele, Manuel.  2016.  CRiOS: Toward Large-Scale iOS Application Analysis. Proceedings of the 6th Workshop on Security and Privacy in Smartphones and Mobile Devices. :33–42.

Mobile applications - or apps - are one of the main reasons for the unprecedented success smart phones and tablets have experienced over the last decade. Apps are the main interfaces that users deal with when engaging in online banking, checking travel itineraries, or browsing their social network profiles while on the go. Previous research has studied various aspects of mobile application security including data leakage and privilege escalation through confused deputy attacks. However, the vast majority of mobile application research targets Google's Android platform. Few research papers analyze iOS applications and those that focus on the Apple environment perform their analysis on comparatively small datasets (i.e., thousands in iOS vs. hundreds of thousands in Android). As these smaller datasets call into question how representative the gained results are, we propose, implement, and evaluate CRiOS, a fully-automated system that allows us to amass comprehensive datasets of iOS applications which we subject to large-scale analysis. To advance academic research into the iOS platform and its apps, we plan on releasing CRiOS as an open source project. We also use CRiOS to aggregate a dataset of 43,404 iOS applications. Equipped with this dataset we analyze the collected apps to identify third-party libraries that are common among many applications. We also investigate the network communication endpoints referenced by the applications with respect to the endpoints' correct use of TLS/SSL certificates. In summary, we find that the average iOS application consists of 60.2% library classes and only 39.8% developer-authored content. Furthermore, we find that 9.32% of referenced network connection endpoints either entirely omit to cryptographically protect network communications or present untrustworthy SSL certificates.

Suarez, Drew, Mayer, Daniel.  2016.  Faux Disk Encryption: Realities of Secure Storage on Mobile Devices. Proceedings of the International Conference on Mobile Software Engineering and Systems. :283–284.

This paper reviews the challenges faced when securing data on mobile devices. After a discussion of the state-of-the-art of secure storage for iOS and Android, the paper introduces an attack which demonstrates how Full Disk Encryption (FDE) on Android can be ineffective in practice.

Graupner, Hendrik, Jaeger, David, Cheng, Feng, Meinel, Christoph.  2016.  Automated Parsing and Interpretation of Identity Leaks. Proceedings of the ACM International Conference on Computing Frontiers. :127–134.

Identity data leaks on the Internet are more relevant than ever. Almost every month the news reports the leakage of databases with more than a million users. Smaller, but no less dangerous, leaks happen multiple times a day. The public availability of such leaked data is a major threat to the victims, but it also creates the opportunity to learn not only about the security of service providers but also about the behavior of users when choosing passwords. Our goal is to analyze this data and generate knowledge that can be used to increase security awareness and security, respectively. This paper presents a novel approach to the automatic analysis of the vast majority of bigger and smaller leaks. Our contribution is the concept and a prototype implementation of a parser, composed of a syntactic and a semantic module, and a data analyzer for identity leaks. In this context, we deal with two major challenges: the huge number of different formats and the recognition of a leak's unknown data types. Based on the data collected, this paper reveals how easy it is for criminals to collect lots of passwords, which are plain text or only weakly hashed.
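
A toy sketch of the two-stage idea described above, assuming nothing about the authors' actual implementation: a syntactic step splits a leak line on common delimiters, and a semantic step guesses each field's type from its shape (e-mail address, hash of a known length, or plain text).

import re

EMAIL = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
MD5 = re.compile(r"^[0-9a-fA-F]{32}$")
SHA1 = re.compile(r"^[0-9a-fA-F]{40}$")

def classify(field):
    # Semantic step: guess what a field is from its shape.
    if EMAIL.match(field):
        return "email"
    if MD5.match(field):
        return "md5_hash"
    if SHA1.match(field):
        return "sha1_hash"
    return "plaintext_password_or_other"

def parse_leak_line(line):
    # Syntactic step: try common delimiters used in leak dumps.
    for delim in (":", ";", "\t", ","):
        if delim in line:
            fields = [f.strip() for f in line.split(delim)]
            return [(f, classify(f)) for f in fields]
    return [(line.strip(), classify(line.strip()))]

for line in ["alice@example.com:5f4dcc3b5aa765d61d8327deb882cf99",
             "bob@example.com;hunter2"]:
    print(parse_leak_line(line))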

Johnston, Reece, Kim, Sun-il, Coe, David, Etzkorn, Letha, Kulick, Jeffrey, Milenkovic, Aleksandar.  2016.  Xen Network Flow Analysis for Intrusion Detection. Proceedings of the 11th Annual Cyber and Information Security Research Conference. :18:1–18:4.

Virtualization technology has become ubiquitous in the computing world. With it, a number of security concerns have been amplified as users run adjacently on a single host. In order to prevent attacks from both internal and external sources, the networking of such systems must be secured. Network intrusion detection systems (NIDSs) are an important tool for aiding this effort. These systems work by analyzing flow or packet information to determine malicious intent. However, it is difficult to implement a NIDS on a virtualized system due to their complexity. This is especially true for the Xen hypervisor: Xen has incredible heterogeneity when it comes to implementation, making a generic solution difficult. In this paper, we analyze the network data flow of a typical Xen implementation along with identifying features common to any implementation. We then explore the benefits of placing security checks along the data flow and promote a solution within the hypervisor itself.

Munaiah, Nuthan, Meneely, Andrew.  2016.  Beyond the Attack Surface: Assessing Security Risk with Random Walks on Call Graphs. Proceedings of the 2016 ACM Workshop on Software PROtection. :3–14.

When reasoning about software security, researchers and practitioners use the phrase "attack surface" as a metaphor for risk. Enumerate and minimize the ways attackers can break in, the metaphor says, and risk is reduced and the system is better protected. But software systems are much more complicated than their surfaces. We propose function- and file-level attack surface metrics, proximity and risky walk, that enable fine-grained risk assessment. Our risky walk metric is highly configurable: we use PageRank on a probability-weighted call graph to simulate attacker behavior of finding or exploiting a vulnerability. We provide evidence-based guidance for deploying these metrics, including an extensive parameter tuning study. We conducted an empirical study on two large open source projects, FFmpeg and Wireshark, to investigate the potential correlation between our metrics and historical post-release vulnerabilities. We found our metrics to be statistically significantly associated with vulnerable functions/files with a small-to-large Cohen's d effect size. Our prediction model achieved an increase of 36% (in FFmpeg) and 27% (in Wireshark) in the average value of F-measure over a base model built with SLOC and coupling metrics. Our prediction model outperformed comparable models from prior literature with notable improvements: a 58% reduction in false negative rate, an 81% reduction in false positive rate, and a 548% increase in F-measure. These metrics advance vulnerability prevention by (a) being flexible in terms of granularity, (b) performing better than the vulnerability prediction literature, and (c) being tunable so that practitioners can tailor the metrics to their products and better assess security risk.
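
A minimal sketch of the risky-walk idea using personalized PageRank on a probability-weighted call graph; the edge weights, entry points, and damping factor below are illustrative assumptions, not the paper's tuned parameters.

import networkx as nx

def risky_walk_scores(edges, entry_points, damping=0.85):
    # edges: (caller, callee, weight) triples; weights approximate the probability
    # that an attacker's exploration follows that call edge (placeholder values).
    # entry_points: functions on the attack surface (e.g., ones parsing network input).
    g = nx.DiGraph()
    g.add_weighted_edges_from(edges)
    seed = {f: 1.0 for f in entry_points}
    # Personalized PageRank simulates random walks that restart at the attack surface,
    # so functions reachable through many likely call paths score as riskier.
    return nx.pagerank(g, alpha=damping, personalization=seed, weight="weight")

edges = [
    ("recv_packet", "parse_header", 0.9),
    ("parse_header", "decode_payload", 0.7),
    ("decode_payload", "alloc_buffer", 0.6),
    ("main", "recv_packet", 0.5),
]
for fn, score in sorted(risky_walk_scores(edges, ["recv_packet"]).items(),
                        key=lambda kv: -kv[1]):
    print(f"{fn:15s} {score:.3f}")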

Ur Rahman, Akond Ashfaque, Williams, Laurie.  2016.  Software Security in DevOps: Synthesizing Practitioners' Perceptions and Practices. Proceedings of the International Workshop on Continuous Software Evolution and Delivery. :70–76.

In organizations that use DevOps practices, software changes can be deployed as fast as 500 times or more per day. Without adequate involvement of the security team, rapidly deployed software changes are more likely to contain vulnerabilities due to lack of adequate reviews. The goal of this paper is to aid software practitioners in integrating security and DevOps by summarizing experiences in utilizing security practices in a DevOps environment. We analyzed a selected set of Internet artifacts and surveyed representatives of nine organizations that are using DevOps to systematically explore experiences in utilizing security practices. We observe that the majority of the software practitioners have expressed the potential of common DevOps activities, such as automated monitoring, to improve the security of a system. Furthermore, organizations that integrate DevOps and security utilize additional security activities, such as security requirements analysis and performing security configurations. Additionally, these teams also have established collaboration between the security team and the development and operations teams.