Biblio

Found 5734 results

Filters: Keyword is Human Behavior
2017-05-22
To, Hien, Nguyen, Kien, Shahabi, Cyrus.  2016.  Differentially Private Publication of Location Entropy. Proceedings of the 24th ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems. :35:1–35:10.

Location entropy (LE) is a popular metric for measuring the popularity of various locations (e.g., points-of-interest). Unlike other metrics computed from only the number of (unique) visits to a location, namely frequency, LE also captures the diversity of the users' visits, and is thus more accurate than other metrics. Current solutions for computing LE require full access to the past visits of users to locations, which poses privacy threats. This paper discusses, for the first time, the problem of perturbing location entropy for a set of locations according to differential privacy. The problem is challenging because removing a single user from the dataset will impact multiple records of the database; i.e., all the visits made by that user to various locations. Towards this end, we first derive non-trivial, tight bounds for both local and global sensitivity of LE, and show that to satisfy ε-differential privacy, a large amount of noise must be introduced, rendering the published results useless. Hence, we propose a thresholding technique to limit the number of users' visits, which significantly reduces the perturbation error but introduces an approximation error. To achieve better utility, we extend the technique by adopting two weaker notions of privacy: smooth sensitivity (slightly weaker) and crowd-blending (strictly weaker). Extensive experiments on synthetic and real-world datasets show that our proposed techniques preserve original data distribution without compromising location privacy.
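
The abstract keeps the formulas out of view; as a rough, self-contained illustration of the quantities it discusses, the sketch below computes location entropy from a per-user visit log, caps the number of visits per user (the thresholding idea), and perturbs the result with Laplace noise. The visit log, the sensitivity value, and the cap are invented for the example; the paper's actual sensitivity bounds are derived analytically and are not reproduced here.

```python
import math
import random
from collections import Counter

def location_entropy(visits):
    """Shannon entropy of the distribution of visits over users at one location."""
    counts = Counter(visits)                      # visits: list of user ids observed at the location
    total = sum(counts.values())
    return -sum((c / total) * math.log(c / total) for c in counts.values())

def noisy_location_entropy(visits, epsilon, sensitivity, cap=None):
    """Laplace-perturbed entropy; 'cap' bounds each user's visit count (thresholding)."""
    if cap is not None:
        by_user = Counter(visits)
        visits = [u for u, c in by_user.items() for _ in range(min(c, cap))]
    scale = sensitivity / epsilon                 # Laplace scale b = sensitivity / epsilon
    noise = random.choice((-1, 1)) * random.expovariate(1.0) * scale
    return location_entropy(visits) + noise

# Hypothetical visit log (user ids seen at one point-of-interest), capped at 2 visits per user.
print(noisy_location_entropy(["u1", "u1", "u2", "u3", "u3", "u3"],
                             epsilon=1.0, sensitivity=1.0, cap=2))
```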

Cuff, Paul, Yu, Lanqing.  2016.  Differential Privacy As a Mutual Information Constraint. Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security. :43–54.

Differential privacy is a precise mathematical constraint meant to ensure privacy of individual pieces of information in a database even while queries are being answered about the aggregate. Intuitively, one must come to terms with what differential privacy does and does not guarantee. For example, the definition prevents a strong adversary who knows all but one entry in the database from further inferring about the last one. This strong adversary assumption can be overlooked, resulting in misinterpretation of the privacy guarantee of differential privacy. Herein we give an equivalent definition of privacy using mutual information that makes plain some of the subtleties of differential privacy. The mutual-information differential privacy is in fact sandwiched between ε-differential privacy and (ε,δ)-differential privacy in terms of its strength. In contrast to previous works using unconditional mutual information, differential privacy is fundamentally related to conditional mutual information, accompanied by a maximization over the database distribution. The conceptual advantage of using mutual information, aside from yielding a simpler and more intuitive definition of differential privacy, is that its properties are well understood. Several properties of differential privacy are easily verified for the mutual information alternative, such as composition theorems.
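
For orientation, the two notions being compared can be written out. The rendering below is a standard textbook form, not a quotation from the paper; the exact quantifiers and whether mutual information is measured in bits or nats follow the paper itself.

```latex
% epsilon-differential privacy: for all neighboring databases D, D' and all output sets S,
\Pr[M(D) \in S] \;\le\; e^{\epsilon}\,\Pr[M(D') \in S].

% Mutual-information flavor sketched in the abstract: bound the conditional mutual
% information between any single entry X_i and the output Y, given the remaining
% entries, under a worst-case (maximizing) distribution on the database X^n:
\sup_{P_{X^n}} \; \max_i \; I\!\left( X_i ; Y \mid X_{-i} \right) \;\le\; \epsilon.
```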

Hay, Michael, Machanavajjhala, Ashwin, Miklau, Gerome, Chen, Yan, Zhang, Dan.  2016.  Principled Evaluation of Differentially Private Algorithms Using DPBench. Proceedings of the 2016 International Conference on Management of Data. :139–154.

Differential privacy has become the dominant standard in the research community for strong privacy protection. There has been a flood of research into query answering algorithms that meet this standard. Algorithms are becoming increasingly complex, and in particular, the performance of many emerging algorithms is data dependent, meaning the distribution of the noise added to query answers may change depending on the input data. Theoretical analysis typically only considers the worst case, making empirical study of average case performance increasingly important. In this paper we propose a set of evaluation principles which we argue are essential for sound evaluation. Based on these principles we propose DPBench, a novel evaluation framework for standardized evaluation of privacy algorithms. We then apply our benchmark to evaluate algorithms for answering 1- and 2-dimensional range queries. The result is a thorough empirical study of 15 published algorithms on a total of 27 datasets that offers new insights into algorithm behavior, in particular the influence of dataset scale and shape, and a more complete characterization of the state of the art. Our methodology is able to resolve inconsistencies in prior empirical studies and place algorithm performance in context through comparison to simple baselines. Finally, we pose open research questions which we hope will guide future algorithm design.
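
The benchmark itself is a full framework; as a minimal, hypothetical stand-in for the kind of measurement it standardizes, the sketch below computes the average per-query L1 error of the plain Laplace mechanism on 1-dimensional range queries over a toy histogram. The dataset, workload, and error metric shown here are placeholders, not DPBench's.

```python
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise."""
    return random.choice((-1, 1)) * random.expovariate(1.0) * scale

def range_query(hist, lo, hi):
    """True answer of a 1-D range query over a histogram (inclusive bounds)."""
    return sum(hist[lo:hi + 1])

def avg_l1_error(hist, queries, epsilon, trials=200):
    """Average |noisy - true| over a workload, answering each query independently
    with Laplace noise (sensitivity of a single range query is 1)."""
    total = 0.0
    for _ in range(trials):
        for lo, hi in queries:
            true = range_query(hist, lo, hi)
            noisy = true + laplace_noise(1.0 / epsilon)
            total += abs(noisy - true)
    return total / (trials * len(queries))

# Toy histogram (counts per bin) and a small workload of range queries.
hist = [5, 0, 12, 7, 3, 9, 1, 4]
workload = [(0, 3), (2, 5), (4, 7)]
print(avg_l1_error(hist, workload, epsilon=0.5))
```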

Krishnan, Sanjay, Wang, Jiannan, Franklin, Michael J., Goldberg, Ken, Kraska, Tim.  2016.  PrivateClean: Data Cleaning and Differential Privacy. Proceedings of the 2016 International Conference on Management of Data. :937–951.

Recent advances in differential privacy make it possible to guarantee user privacy while preserving the main characteristics of the data. However, most differential privacy mechanisms assume that the underlying dataset is clean. This paper explores the link between data cleaning and differential privacy in a framework we call PrivateClean. PrivateClean includes a technique for creating private datasets of numerical and discrete-valued attributes, a formalism for privacy-preserving data cleaning, and techniques for answering sum, count, and avg queries after cleaning. We show: (1) how the degree of privacy affects subsequent aggregate query accuracy, (2) how privacy potentially amplifies certain types of errors in a dataset, and (3) how this analysis can be used to tune the degree of privacy. The key insight is to maintain a bipartite graph relating dirty values to clean values and use this graph to estimate biases due to the interaction between cleaning and privacy. We validate these results on four datasets with a variety of well-studied cleaning techniques including using functional dependencies, outlier filtering, and resolving inconsistent attributes.
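
PrivateClean's own mechanisms are not spelled out in the abstract; as a generic illustration of how a discrete-valued attribute can be privatized before cleaning, the sketch below uses generalized randomized response. The attribute domain, the dirty values, and epsilon are made up, and the bipartite dirty-to-clean bookkeeping described in the abstract is not shown.

```python
import math
import random

def randomized_response(value, domain, epsilon):
    """Generalized randomized response: keep the true value with probability p,
    otherwise report a uniformly random *other* value from the domain."""
    k = len(domain)
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p_keep:
        return value
    return random.choice([v for v in domain if v != value])

# Hypothetical dirty attribute with inconsistent spellings; a cleaner would map both
# "NYC" and "New York" to a single canonical value after privatization.
domain = ["NYC", "New York", "LA", "SF"]
private = [randomized_response(v, domain, epsilon=1.5)
           for v in ["NYC", "New York", "LA", "NYC", "SF"]]
print(private)
```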

Jamrozik, Konrad, von Styp-Rekowsky, Philipp, Zeller, Andreas.  2016.  Mining Sandboxes. Proceedings of the 38th International Conference on Software Engineering. :37–48.

We present sandbox mining, a technique to confine an application to resources accessed during automatic testing. Sandbox mining first explores software behavior by means of automatic test generation, and extracts the set of resources accessed during these tests. This set is then used as a sandbox, blocking access to resources not used during testing. The mined sandbox thus protects against behavior changes such as the activation of latent malware, infections, targeted attacks, or malicious updates. The use of test generation makes sandbox mining a fully automatic process that can be run by vendors and end users alike. Our BOXMATE prototype requires less than one hour to extract a sandbox from an Android app, with few to no confirmations required for frequently used functionality.
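
The core idea, mine the set of resources touched during testing and then deny (or confirm) anything outside that set, can be caricatured in a few lines. The resource names and the confirmation prompt below are invented for illustration and have nothing to do with BOXMATE's actual Android instrumentation.

```python
def mine_sandbox(test_runs):
    """Union of all resources observed across automatically generated test runs."""
    allowed = set()
    for accessed_resources in test_runs:
        allowed |= set(accessed_resources)
    return allowed

def enforce(sandbox, resource, ask_user=input):
    """Allow mined resources silently; require confirmation for anything new."""
    if resource in sandbox:
        return True
    answer = ask_user(f"App requests '{resource}' for the first time. Allow? [y/N] ")
    return answer.strip().lower() == "y"

# Hypothetical traces from three generated test runs.
sandbox = mine_sandbox([{"CAMERA", "STORAGE"}, {"STORAGE"}, {"CAMERA", "LOCATION"}])
print(enforce(sandbox, "STORAGE"))      # allowed: seen during testing
# enforce(sandbox, "CONTACTS")          # would prompt: never seen during testing
```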

Xia, Haijun.  2016.  Object-Oriented Interaction: Enabling Direct Physical Manipulation of Abstract Content via Objectification. Proceedings of the 29th Annual Symposium on User Interface Software and Technology. :13–16.

Touch input promises intuitive interactions with digital content as it employs our experience of manipulating physical objects: digital content can be rotated, scaled, and translated using direct manipulation gestures. However, the reliance on physical analogy also confines the scope of direct physical manipulation: the physical world provides no mechanism for interacting with abstract digital content. As such, applications on touchscreen devices either include only limited functionality or fall back on the traditional form-filling paradigm, which is tedious, slow, and error prone for touch input. My research focuses on designing a new UI framework that enables complex functionality on touchscreen devices by expanding direct physical manipulation to abstract content via objectification. I present two research projects, objectification of attributes and selection, which demonstrate considerable promise.

Ghadi, Musab, Laouamer, Lamri, Nana, Laurent, Pascu, Anca.  2016.  A Robust Associative Watermarking Technique Based on Frequent Pattern Mining and Texture Analysis. Proceedings of the 8th International Conference on Management of Digital EcoSystems. :73–81.

Nowadays, image mining plays a vital role in various areas of our lives, and numerous frameworks based on image mining have been proposed for object recognition, object tracking, image sensing, and medical image diagnosis. Nevertheless, research on image authentication based on image mining remains limited. This paper therefore presents an efficient combination of frequent pattern mining and digital watermarking that contributes significantly to the authentication of images transmitted over public networks. The proposed framework exploits robust image features to extract the frequent patterns in the image data. The maximal relevant patterns are used to discriminate between textured and smooth blocks within the image, since textured blocks are more suitable for embedding secret data than smooth blocks. Experimental results demonstrate the efficiency of the proposed framework in terms of stability and robustness against different kinds of attacks, and show that it is well suited to preserving image authenticity.

Sinha, Rohit, Costa, Manuel, Lal, Akash, Lopes, Nuno P., Rajamani, Sriram, Seshia, Sanjit A., Vaswani, Kapil.  2016.  A Design and Verification Methodology for Secure Isolated Regions. Proceedings of the 37th ACM SIGPLAN Conference on Programming Language Design and Implementation. :665–681.

Hardware support for isolated execution (such as Intel SGX) enables development of applications that keep their code and data confidential even while running in a hostile or compromised host. However, automatically verifying that such applications satisfy confidentiality remains challenging. We present a methodology for designing such applications in a way that enables certifying their confidentiality. Our methodology consists of forcing the application to communicate with the external world through a narrow interface, compiling it with runtime checks that aid verification, and linking it with a small runtime that implements the narrow interface. The runtime includes services such as secure communication channels and memory management. We formalize this restriction on the application as Information Release Confinement (IRC), and we show that it allows us to decompose the task of proving confidentiality into (a) one-time, human-assisted functional verification of the runtime to ensure that it does not leak secrets, (b) automatic verification of the application's machine code to ensure that it satisfies IRC and does not directly read or corrupt the runtime's internal state. We present /CONFIDENTIAL: a verifier for IRC that is modular, automatic, and keeps our compiler out of the trusted computing base. Our evaluation suggests that the methodology scales to real-world applications.

O'Brien, Heather L., Freund, Luanne, Kopak, Richard.  2016.  Investigating the Role of User Engagement in Digital Reading Environments. Proceedings of the 2016 ACM on Conference on Human Information Interaction and Retrieval. :71–80.

User engagement is recognized as an important component of the user experience, but relatively little is known about the effect of engagement on the learning outcomes of such interactions. This experimental user study examines the relationship between user engagement (UE) and comprehension in varied academic reading environments. Forty-one university students interacted with one of two sets of texts presented in 4 conditions in the context of preparing for a class assignment. Employing the User Engagement Scale (UES), we found evidence of a relationship between students' comprehension of the texts and their degree of engagement with them. However, this association was confined to one of the UES subscales and was not consistent across levels of engagement. An examination of additional variables found little evidence that system and content characteristics influenced engagement; however, we noted that all students reported increased knowledge, but topical interest for non-engaged students declined. Results contribute to existing literature by adding further evidence that the relationship between engagement and comprehension is complex and mediated.

Khanwalkar, Sanket, Balakrishna, Shonali, Jain, Ramesh.  2016.  Exploration of Large Image Corpuses in Virtual Reality. Proceedings of the 2016 ACM on Multimedia Conference. :596–600.

With the increasing capture of photos and their proliferation on social media, there is a pressing need for a more intuitive and versatile image search and exploration system. Image search systems have long been confined to the bounds of legacy 2D screens and the keyword text-box. With the recent advances in Virtual Reality (VR) technology, a move towards an immersive VR environment will redefine the image navigation experience. To this end, we propose a VR platform that gathers images from various sources, and addresses the 5 Ws of image search - what, where, when, who and why. We achieve this by providing the user with two modes of interactive exploration - (i) a mode that allows for graph-based navigation of an image dataset, using a steering wheel visualization, along multiple dimensions of time, location, visual concept, people, etc., and (ii) a mode that provides an intuitive exploration of the image dataset using a logical hierarchy of visual concepts. Our contributions include creating a VR image exploration experience that is intuitive and allows image navigation along multiple dimensions.

Hessar, Mehrdad, Iyer, Vikram, Gollakota, Shyamnath.  2016.  Enabling On-body Transmissions with Commodity Devices. Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing. :1100–1111.

We show for the first time that commodity devices can be used to generate wireless data transmissions that are confined to the human body. Specifically, we show that commodity input devices such as fingerprint sensors and touchpads can be used to transmit information to only wireless receivers that are in contact with the body. We characterize the propagation of the resulting transmissions across the whole body and run experiments with ten subjects to demonstrate that our approach generalizes across different body types and postures. We also evaluate our communication system in the presence of interference from other wearable devices such as smartwatches and nearby metallic surfaces. Finally, by modulating the operations of these input devices, we demonstrate bit rates of up to 50 bits per second over the human body.
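
The heart of the approach is timing: information is conveyed by when the input device is active. As a toy picture only (the slot length, threshold, and "power" readings below are invented, and the real system operates at the analog level on commodity sensors), on-off keying over fixed time slots looks like this:

```python
def ook_encode(bits, slot_ms=20):
    """Map each bit to (device_active, duration_ms): active during a '1' slot, idle during a '0' slot.
    A 20 ms slot corresponds to the order of the 50 bits/s rate quoted in the abstract."""
    return [(bit == 1, slot_ms) for bit in bits]

def ook_decode(received_power, threshold=0.5):
    """Threshold the per-slot power seen by a body-contact receiver back into bits."""
    return [1 if p > threshold else 0 for p in received_power]

print(ook_encode([1, 0, 1, 1, 0]))
print(ook_decode([0.9, 0.1, 0.8, 0.7, 0.2]))   # -> [1, 0, 1, 1, 0]
```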

Shalev, Noam, Keidar, Idit, Moatti, Yosef, Weinsberg, Yaron.  2016.  WatchIT: Who Watches Your IT Guy? Proceedings of the 8th ACM CCS International Workshop on Managing Insider Security Threats. :93–96.

System administrators have unlimited access to system resources. As the Snowden case shows, these permissions can be exploited to steal valuable personal, classified, or commercial data. In this work we propose a strategy that increases organizational information security by constraining IT personnel's view of the system and monitoring their actions. To this end, we introduce the abstraction of perforated containers: while regular Linux containers are too restrictive to be used by system administrators, by "punching holes" in them we strike a balance between information security and required administrative needs. Our system predicts which system resources should be accessible for handling each IT issue, creates a perforated container with the corresponding isolation, and deploys it in the corresponding machines as needed for fixing the problem. Under this approach, the system administrator retains superuser privileges but can only operate within the container limits. We further provide means for the administrator to bypass the isolation and perform operations beyond these boundaries. However, such operations are monitored and logged for later analysis and anomaly detection. We provide a proof-of-concept implementation of our strategy, along with a case study on the IT database of IBM Research in Israel.

Bloom, Gedare, Parmer, Gabriel, Simha, Rahul.  2016.  LockDown: An Operating System for Achieving Service Continuity by Quarantining Principals. Proceedings of the 9th European Workshop on System Security. :7:1–7:6.

This paper introduces quarantine, a new security primitive for an operating system to use in order to protect information and isolate malicious behavior. Quarantine's core feature is the ability to fork a protection domain on-the-fly to isolate a specific principal's execution of untrusted code without risk of a compromise spreading. Forking enables the OS to ensure service continuity by permitting even high-risk operations to proceed, albeit subject to greater scrutiny and constraints. Quarantine even partitions executing threads that share resources into isolated protection domains. We discuss the design and implementation of quarantine within the LockDown OS, a security-focused evolution of the Composite component-based microkernel OS. Initial performance results for quarantine show that about 98% of the overhead comes from the cost of copying memory to the new protection domain.

2017-05-19
Parkin, Simon, Fielder, Andrew, Ashby, Alex.  2016.  Pragmatic Security: Modelling IT Security Management Responsibilities for SME Archetypes. Proceedings of the 8th ACM CCS International Workshop on Managing Insider Security Threats. :69–80.

Here we model the indirect costs of deploying security controls in small-to-medium enterprises (SMEs) to manage cyber threats. SMEs may not have the in-house skills and collective capacity to operate controls efficiently, resulting in inadvertent data leakage and exposure to compromise. Aside from financial costs, attempts to maintain security can impact morale, system performance, and retraining requirements, which are modelled here. Managing the overall complexity and effectiveness of an SME's security controls has the potential to reduce unintended leakage. The UK Cyber Essentials Scheme informs basic control definitions, and Available Responsibility Budget (ARB) is modelled to understand how controls can be prioritised for both security and usability. Human factors of security and practical experience of security management for SMEs inform the modelling of deployment challenges across a set of SME archetypes differing in size, complexity, and use of IT. Simple combinations of controls are matched to archetypes, balancing capabilities to protect data assets with the effort demands placed upon employees. Experiments indicate that two-factor authentication can be readily adopted by many SMEs and their employees to protect core assets, followed by correct access privileges and anti-malware software. Service and technology providers emerge as playing an important role in improving access to usable security controls for SMEs.

Green, Benjamin, Krotofil, Marina, Hutchison, David.  2016.  Achieving ICS Resilience and Security Through Granular Data Flow Management. Proceedings of the 2nd ACM Workshop on Cyber-Physical Systems Security and Privacy. :93–101.

Modern Industrial Control Systems (ICS) rely on enterprise-to-plant-floor connectivity. Where the size, diversity, and therefore complexity of ICS increase, operational requirements, goals, and challenges defined by users across various sub-systems follow. Recent trends in Information Technology (IT) and Operational Technology (OT) convergence may cause operators to lose a comprehensive understanding of end-to-end data flow requirements. This presents a risk to system security and resilience. Sensors were once solely applied for operational process use, but now act as inputs supporting a diverse set of organisational requirements. If these are not fully understood, incomplete risk assessment and inappropriate implementation of security controls could occur. In search of a solution, operators may turn to standards and guidelines. This paper reviews popular standards and guidelines, prior to the presentation of a case study and conceptual tool, highlighting the importance of data flows, critical data processing points, and system-to-user relationships. The proposed approach forms a basis for risk assessment and security control implementation, aiding the evolution of ICS security and resilience.

Ivanov, Radoslav, Pajic, Miroslav, Lee, Insup.  2016.  Attack-Resilient Sensor Fusion for Safety-Critical Cyber-Physical Systems. ACM Trans. Embed. Comput. Syst. 15:21:1–21:24.

This article focuses on the design of safe and attack-resilient Cyber-Physical Systems (CPS) equipped with multiple sensors measuring the same physical variable. A malicious attacker may be able to disrupt system performance through compromising a subset of these sensors. Consequently, we develop a precise and resilient sensor fusion algorithm that combines the data received from all sensors by taking into account their specified precisions. In particular, we note that in the presence of a shared bus, in which messages are broadcast to all nodes in the network, the attacker’s impact depends on what sensors he has seen before sending the corrupted measurements. Therefore, we explore the effects of communication schedules on the performance of sensor fusion and provide theoretical and experimental results advocating for the use of the Ascending schedule, which orders sensor transmissions according to their precision starting from the most precise. In addition, to improve the accuracy of the sensor fusion algorithm, we consider the dynamics of the system in order to incorporate past measurements at the current time. Possible ways of mapping sensor measurement history are investigated in the article and are compared in terms of the confidence in the final output of the sensor fusion. We show that the precision of the algorithm using history is never worse than the no-history one, while the benefits may be significant. Furthermore, we utilize the complementary properties of the two methods and show that their combination results in a more precise and resilient algorithm. Finally, we validate our approach in simulation and experiments on a real unmanned ground robot.
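
A common abstraction behind precision-aware fusion of redundant sensors is to treat each reading as an interval (value plus or minus its specified precision) and keep the region covered by enough intervals, so that a bounded number of compromised sensors cannot pull the fused estimate arbitrarily far. The sketch below follows that generic idea rather than the article's exact algorithm; the readings and the fault bound are invented.

```python
def fuse_intervals(readings, max_faulty):
    """readings: list of (value, precision) pairs. Returns the smallest interval that
    contains every point covered by at least len(readings) - max_faulty sensor intervals."""
    events = []
    for value, precision in readings:
        events.append((value - precision, +1))   # a sensor interval opens
        events.append((value + precision, -1))   # a sensor interval closes
    events.sort()
    need = len(readings) - max_faulty
    depth, lo, hi = 0, None, None
    for point, delta in events:
        if delta == +1:
            depth += 1
            if depth >= need and lo is None:
                lo = point
        else:
            if depth >= need:
                hi = point                       # sufficiently-covered region ends here
            depth -= 1
    return lo, hi

# Three sensors measure the same variable; at most one of them may be faulty or compromised.
print(fuse_intervals([(10.0, 0.5), (10.2, 0.3), (12.0, 0.2)], max_faulty=1))   # -> (9.9, 10.5)
```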

Hojjati, Avesta, Adhikari, Anku, Struckmann, Katarina, Chou, Edward, Tho Nguyen, Thi Ngoc, Madan, Kushagra, Winslett, Marianne S., Gunter, Carl A., King, William P..  2016.  Leave Your Phone at the Door: Side Channels That Reveal Factory Floor Secrets. Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security. :883–894.

From pencils to commercial aircraft, every man-made object must be designed and manufactured. When it is cheaper or easier to steal a design or a manufacturing process specification than to invent one's own, the incentive for theft is present. As more and more manufacturing data comes online, incidents of such theft are increasing. In this paper, we present a side-channel attack on manufacturing equipment that reveals both the form of a product and its manufacturing process, i.e., exactly how it is made. In the attack, a human deliberately or accidentally places an attack-enabled phone close to the equipment or makes or receives a phone call on any phone nearby. The phone executing the attack records audio and, optionally, magnetometer data. We present a method of reconstructing the product's form and manufacturing process from the captured data, based on machine learning, signal processing, and human assistance. We demonstrate the attack on a 3D printer and a CNC mill, each with its own acoustic signature, and discuss the commonalities in the sensor data captured for these two different machines. We compare the quality of the data captured with a variety of smartphone models. Capturing data from the 3D printer, we reproduce the form and process information of objects previously unknown to the reconstructors. On average, our accuracy is within 1 mm in reconstructing the length of a line segment in a fabricated object's shape and within 1 degree in determining an angle in a fabricated object's shape. We conclude with recommendations for defending against these attacks.

Kocabas, Ovunc, Soyata, Tolga, Aktas, Mehmet K..  2016.  Emerging Security Mechanisms for Medical Cyber Physical Systems. IEEE/ACM Trans. Comput. Biol. Bioinformatics. 13:401–416.

The following decade will witness a surge in remote health-monitoring systems that are based on body-worn monitoring devices. These Medical Cyber Physical Systems (MCPS) will be capable of transmitting the acquired data to a private or public cloud for storage and processing. Machine learning algorithms running in the cloud and processing this data can provide decision support to healthcare professionals. There is no doubt that the security and privacy of the medical data is one of the most important concerns in designing an MCPS. In this paper, we depict the general architecture of an MCPS consisting of four layers: data acquisition, data aggregation, cloud processing, and action. Due to the differences in hardware and communication capabilities of each layer, different encryption schemes must be used to guarantee data privacy within that layer. We survey conventional and emerging encryption schemes based on their ability to provide secure storage, data sharing, and secure computation. Our detailed experimental evaluation of each scheme shows that while the emerging encryption schemes enable exciting new features such as secure sharing and secure computation, they introduce several orders-of-magnitude computational and storage overhead. We conclude our paper by outlining future research directions to improve the usability of the emerging encryption schemes in an MCPS.

Ho, Grant, Leung, Derek, Mishra, Pratyush, Hosseini, Ashkan, Song, Dawn, Wagner, David.  2016.  Smart Locks: Lessons for Securing Commodity Internet of Things Devices. Proceedings of the 11th ACM on Asia Conference on Computer and Communications Security. :461–472.

We examine the security of home smart locks: cyber-physical devices that replace traditional door locks with deadbolts that can be electronically controlled by mobile devices or the lock manufacturer's remote servers. We present two categories of attacks against smart locks and analyze the security of five commercially-available locks with respect to these attacks. Our security analysis reveals that flaws in the design, implementation, and interaction models of existing locks can be exploited by several classes of adversaries, allowing them to learn private information about users and gain unauthorized home access. To guide future development of smart locks and similar Internet of Things devices, we propose several defenses that mitigate the attacks we present. One of these defenses is a novel approach to securely and usably communicate a user's intended actions to smart locks, which we prototype and evaluate. Ultimately, our work takes a first step towards illuminating security challenges in the system design and novel functionality introduced by emerging IoT systems.

Wadhawan, Yatin, Neuman, Clifford.  2016.  Evaluating Resilience of Gas Pipeline Systems Under Cyber-Physical Attacks: A Function-Based Methodology. Proceedings of the 2nd ACM Workshop on Cyber-Physical Systems Security and Privacy. :71–80.

In this research paper, we present a function-based methodology to evaluate the resilience of gas pipeline systems under two different cyber-physical attack scenarios. The first attack scenario is the pressure integrity attack on the natural gas high-pressure transmission pipeline. Through simulations, we have analyzed the cyber attacks that propagate from cyber to the gas pipeline physical domain, the time before which the SCADA system should respond to such attacks, and finally, an attack which prevents the response of the system. We have used the combined results of simulations of a wireless mesh network for remote terminal units and of a gas pipeline simulation to measure the shortest Time to Criticality (TTC) parameter; the time for an event to reach the failure state. The second attack scenario describes how a failure of a cyber node controlling power grid functionality propagates from cyber to power to gas pipeline systems. We formulate this problem using a graph-theoretic approach and quantify the resilience of the networks by percentage of connected nodes and the length of the shortest path between them. The results show that parameters such as TTC, power distribution capacity of the power grid nodes and percentage of the type of cyber nodes compromised, regulate the efficiency and resilience of the power and gas networks. The analysis of such attack scenarios helps the gas pipeline system administrators design attack remediation algorithms and improve the response of the system to an attack.
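
The graph-theoretic part of the second scenario boils down to removing failed cyber nodes and measuring how connectivity degrades. A minimal sketch with networkx follows; the toy coupled topology, node names, and metrics are placeholders rather than the paper's model.

```python
import networkx as nx

def resilience_metrics(graph, failed_nodes, source, target):
    """Fraction of nodes still in the source's connected component after failures,
    plus the shortest-path length from source to target (None if disconnected)."""
    g = graph.copy()
    g.remove_nodes_from(failed_nodes)
    if source not in g:
        return 0.0, None
    component = nx.node_connected_component(g, source)
    fraction_connected = len(component) / graph.number_of_nodes()
    try:
        hops = nx.shortest_path_length(g, source, target)
    except (nx.NetworkXNoPath, nx.NodeNotFound):
        hops = None
    return fraction_connected, hops

# Toy coupled network: cyber nodes c*, power nodes p*, gas nodes g*.
g = nx.Graph([("c1", "p1"), ("c1", "p2"), ("p1", "g1"), ("p2", "g2"), ("g1", "g2")])
print(resilience_metrics(g, failed_nodes=["c1"], source="p1", target="g2"))   # -> (0.8, 2)
```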

Lissovoi, Andrei, Witt, Carsten.  2016.  The Impact of Migration Topology on the Runtime of Island Models in Dynamic Optimization. Proceedings of the Genetic and Evolutionary Computation Conference 2016. :1155–1162.

We introduce a simplified island model with behavior similar to the λ (1+1) islands optimizing the Maze fitness function, and investigate the effects of the migration topology on the ability of the simplified island model to track the optimum of a dynamic fitness function. More specifically, we prove that there exist choices of model parameters for which using a unidirectional ring as the migration topology allows the model to track the oscillating optimum through n Maze-like phases with high probability, while using a complete graph as the migration topology results in the island model losing track of the optimum with overwhelming probability. Additionally, we prove that if migration occurs only rarely, denser migration topologies may be advantageous. This serves to illustrate that while a less-dense migration topology may be useful when optimizing dynamic functions with oscillating behavior, and requires less problem-specific knowledge to determine when migration may be allowed to occur, care must be taken to ensure that a sufficient amount of migration occurs during the optimization process.
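
To make the topology comparison concrete, the sketch below runs independent (1+1)-style islands and, every tau generations, migrates fitter individuals under either a unidirectional ring or a complete-graph (broadcast-the-best) topology. The fitness function and all parameters are placeholders; this is not the Maze function or the simplified model analyzed in the paper.

```python
import random

def one_plus_one_step(individual, fitness):
    """(1+1) EA step: flip each bit with probability 1/n; keep the offspring if not worse."""
    n = len(individual)
    child = [1 - b if random.random() < 1.0 / n else b for b in individual]
    return child if fitness(child) >= fitness(individual) else individual

def run_islands(n_islands, n_bits, generations, tau, fitness, topology="ring"):
    islands = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(n_islands)]
    for gen in range(1, generations + 1):
        islands = [one_plus_one_step(ind, fitness) for ind in islands]
        if gen % tau == 0:                                  # migration phase
            snapshot = [list(ind) for ind in islands]
            best = max(snapshot, key=fitness)
            for i in range(n_islands):
                if topology == "ring":                      # receive only from the predecessor island
                    incoming = snapshot[(i - 1) % n_islands]
                else:                                       # complete graph: everyone sees the best
                    incoming = best
                if fitness(incoming) > fitness(islands[i]):
                    islands[i] = list(incoming)
    return max(islands, key=fitness)

onemax = sum                                                # placeholder fitness: count of ones
best = run_islands(n_islands=5, n_bits=20, generations=300, tau=10, fitness=onemax)
print(onemax(best))
```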

Bellon, Sebastien, Favi, Claudio, Malek, Miroslaw, Macchetti, Marco, Regazzoni, Francesco.  2016.  Evaluating the Impact of Environmental Factors on Physically Unclonable Functions (Abstract Only). Proceedings of the 2016 ACM/SIGDA International Symposium on Field-Programmable Gate Arrays. :279–279.

Fabrication process introduces some inherent variability to the attributes of transistors (in particular length, widths, oxide thickness). As a result, every chip is physically unique. Physical uniqueness of microelectronics components can be used for multiple security applications. Physically Unclonable Functions (PUFs) are built to extract the physical uniqueness of microelectronics components and make it usable for secure applications. However, the microelectronics components used by PUFs designs suffer from external, environmental variations that impact the PUF behavior. Variations of temperature gradients during manufacturing can bias the PUF responses. Variations of temperature or thermal noise during PUF operation change the behavior of the circuit, and can introduce errors in PUF responses. Detailed knowledge of the behavior of PUFs operating over various environmental factors is needed to reliably extract and demonstrate uniqueness of the chips. In this work, we present a detailed and exhaustive analysis of the behavior of two PUF designs, a ring oscillator PUF and a timing path violation PUF. We have implemented both PUFs using FPGA fabricated by Xilinx, and analyzed their behavior while varying temperature and supply voltage. Our experiments quantify the robustness of each design, demonstrate their sensitivity to temperature and show the impact which supply voltage has on the uniqueness of the analyzed PUFs.
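
As a minimal picture of how a ring-oscillator PUF turns process variation into bits, the sketch below compares pairs of oscillator frequencies. The frequencies are randomly generated stand-ins for measured counts, and the temperature and supply-voltage effects studied in the abstract are not modeled.

```python
import random

def ro_puf_response(frequencies):
    """One response bit per ring-oscillator pair: 1 if the first oscillator of the
    pair is faster than the second, else 0."""
    pairs = zip(frequencies[0::2], frequencies[1::2])
    return [1 if f_a > f_b else 0 for f_a, f_b in pairs]

# Stand-in for measured ring-oscillator frequencies on one chip (MHz); on silicon these
# small differences come from process variation and shift with temperature and voltage.
chip_frequencies = [random.gauss(200.0, 2.0) for _ in range(16)]
print(ro_puf_response(chip_frequencies))   # 8 chip-specific response bits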

Xia, Lixue, Tang, Tianqi, Huangfu, Wenqin, Cheng, Ming, Yin, Xiling, Li, Boxun, Wang, Yu, Yang, Huazhong.  2016.  Switched by Input: Power Efficient Structure for RRAM-based Convolutional Neural Network. Proceedings of the 53rd Annual Design Automation Conference. :125:1–125:6.

Convolutional Neural Network (CNN) is a powerful technique widely used in computer vision area, which also demands much more computations and memory resources than traditional solutions. The emerging metal-oxide resistive random-access memory (RRAM) and RRAM crossbar have shown great potential on neuromorphic applications with high energy efficiency. However, the interfaces between analog RRAM crossbars and digital peripheral functions, namely Analog-to-Digital Converters (ADCs) and Digital-to-Analog Converters (DACs), consume most of the area and energy of RRAM-based CNN design due to the large amount of intermediate data in CNN. In this paper, we propose an energy efficient structure for RRAM-based CNN. Based on the analysis of data distribution, a quantization method is proposed to transfer the intermediate data into 1 bit and eliminate DACs. An energy efficient structure using input data as selection signals is proposed to reduce the ADC cost for merging results of multiple crossbars. The experimental results show that the proposed method and structure can save 80% area and more than 95% energy while maintaining the same or comparable classification accuracy of CNN on MNIST.
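
The quantization idea in the abstract, pushing the crossbar's intermediate data down to 1 bit so DACs can be dropped, can be illustrated independently of RRAM hardware. The threshold choice below (the median) is an assumption made only for this toy example, not the paper's data-distribution-based method.

```python
import numpy as np

def binarize(activations, threshold=None):
    """Quantize intermediate data to 1 bit: values above the threshold become 1, else 0.
    By default the threshold is the median, so roughly half the inputs are driven high."""
    if threshold is None:
        threshold = np.median(activations)
    return (activations > threshold).astype(np.uint8)

layer_output = np.array([0.05, 0.8, -0.3, 1.2, 0.4, -0.9])
print(binarize(layer_output))   # -> [0 1 0 1 1 0]
```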

He, Zhezhi, Fan, Deliang.  2016.  A Low Power Current-Mode Flash ADC with Spin Hall Effect Based Multi-Threshold Comparator. Proceedings of the 2016 International Symposium on Low Power Electronics and Design. :314–319.

Current-mode Analog-to-Digital Converters (ADCs) have drawn much attention due to their high operating speed, immunity to power and ground noise, etc. However, 2^n – 1 comparators are required in a traditional n-bit current-mode ADC design, leading to inevitably high power consumption and large chip area. In this work, we propose a low-power and compact current-mode Multi-Threshold Comparator (MTC) based on the giant Spin Hall Effect (SHE). The two threshold currents of the proposed SHE-MTC are 200μA and 250μA, respectively, with a 1ns switching time. The proposed current-mode hybrid spin-CMOS flash ADC based on the SHE-MTC reduces the number of comparators almost by half (to 2^(n–1)), thus correspondingly reducing the required current mirror branches, total power consumption, and chip area. Moreover, due to the non-volatility of the SHE-MTC, the front-end analog circuits can be switched off when not required, further increasing power efficiency. The device dynamics of the SHE-MTC are simulated using a numerical device model based on the Landau-Lifshitz-Gilbert (LLG) equation with Spin-Transfer Torque (STT) and SHE terms. Device-circuit co-simulation in SPICE (45nm CMOS technology) has shown that the average power dissipation of the proposed ADC is 1.9mW, operating at 500MS/s with a 1.2V power supply. The INL and DNL are within 0.23LSB and 0.32LSB, respectively.
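
The structural saving claimed in the abstract, comparators that each provide two thresholds so only about 2^(n–1) of them are needed, can be mimicked in software with a thermometer-code conversion. The threshold values below are arbitrary and unrelated to the 200μA/250μA devices.

```python
def flash_adc(current_ua, thresholds_ua):
    """Thermometer-code flash conversion: the output code is the number of thresholds
    the input current exceeds. With dual-threshold comparators, each physical
    comparator contributes two entries to 'thresholds_ua'."""
    return sum(current_ua > t for t in thresholds_ua)

# A hypothetical 3-bit converter needs 2**3 - 1 = 7 thresholds; with dual-threshold
# comparators those 7 thresholds would come from only 4 physical devices.
thresholds = [50, 100, 150, 200, 250, 300, 350]   # micro-amps, evenly spaced
print(flash_adc(170, thresholds))                 # -> 3
```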

Calumby, Rodrigo Tripodi.  2016.  Diversity-oriented Multimodal and Interactive Information Retrieval. SIGIR Forum. 50:86–86.

Information retrieval methods, especially considering multimedia data, have evolved towards the integration of multiple sources of evidence in the analysis of the relevance of the items considering a given user search task. In this context, for attenuating the semantic gap between low-level features extracted from the content of the digital objects and high-level semantic concepts (objects, categories, etc.) and making the systems adaptive to different user needs, interactive models have brought the user closer to the retrieval loop, allowing user-system interaction mainly through implicit or explicit relevance feedback. Analogously, diversity promotion has emerged as an alternative for tackling ambiguous or underspecified queries. Additionally, several works have addressed the issue of minimizing the required user effort in providing relevance assessments while keeping an acceptable overall effectiveness. This thesis discusses, proposes, and experimentally analyzes multimodal and interactive diversity-oriented information retrieval methods. This work comprehensively covers the interactive information retrieval literature and also discusses recent advances, the great research challenges, and promising research opportunities. We have proposed and evaluated two relevance-diversity trade-off enhancement workflows, which integrate multiple information from images, such as visual features, textual metadata, geographic information, and user credibility descriptors. In turn, as an integration of interactive retrieval and diversity promotion techniques, for maximizing the coverage of multiple query interpretations/aspects and speeding up the information transfer between the user and the system, we have proposed and evaluated a multimodal online learning-to-rank method trained with relevance feedback over diversified results. Our experimental analysis shows that the joint usage of multiple information sources positively impacted the relevance-diversity balancing algorithms. Our results also suggest that the integration of multimodal-relevance-based filtering and reranking is effective in improving result relevance and also boosts diversity promotion methods. Beyond that, with a thorough experimental analysis, we have investigated several research questions related to the possibility of improving result diversity and keeping or even improving relevance in interactive search sessions. Moreover, we analyze how much the diversification effort affects overall search session results and how different diversification approaches behave for the different data modalities. By analyzing the overall and per-feedback-iteration effectiveness, we show that introducing diversity may harm initial results, whereas it significantly enhances the overall session effectiveness, not only in terms of relevance and diversity but also in how early the user is exposed to the same amount of relevant items and diversity.