Biblio

Found 12044 results

Filters: Keyword is Resiliency
2018-01-10
Kuo, J., Lal, A..  2017.  Wideband material detection for spoof resistance in GHz ultrasonic fingerprint sensing. 2017 IEEE International Ultrasonics Symposium (IUS). :1–1.
One of the primary motivations for using ultrasound reflectometry for fingerprint imaging is the promise of increased spoof resistance over conventional optical or capacitive sensing approaches due to the ability for ultrasound to determine the elastic impedance of the imaged material. A fake 3D printed plastic finger can therefore be easily distinguished from a real finger. However, ultrasonic sensors are still vulnerable to materials that are similar in impedance to tissue, such as water or rubber. Previously we demonstrated an ultrasonic fingerprint reader operating with 1.3GHz ultrasound based on pulse echo impedance imaging on the backside silicon interface. In this work, we utilize the large bandwidth of these sensors to differentiate between a finger and materials with similar impedances using the frequency response of elastic impedance obtained by transducer excitation with a wideband RF chirp signal. The reflected signal is a strong function of impedance mismatch and absorption [Hoople 2015].
Hu, P., Pathak, P. H., Shen, Y., Jin, H., Mohapatra, P..  2017.  PCASA: Proximity Based Continuous and Secure Authentication of Personal Devices. 2017 14th Annual IEEE International Conference on Sensing, Communication, and Networking (SECON). :1–9.
A user's personal portable devices such as smartphones, tablets and laptops require continuous authentication of the user to prevent illegitimate access to the device and personal data. Current authentication techniques require users to enter a password or scan a fingerprint, making frequent access to the devices inconvenient. In this work, we propose to exploit the user's on-body wearable devices to detect their proximity to her portable devices, and to use that proximity for continuous authentication of the portable devices. We present PCASA, which utilizes acoustic communication for secure proximity estimation with sub-meter accuracy. PCASA uses a Differential Pulse Position Modulation scheme that modulates data by varying the silence period between acoustic pulses, ensuring energy efficiency even when the authentication operation is performed once every second. It yields secure and accurate distance estimation even when the user is mobile, by utilizing the Doppler effect for mobility speed estimation. We evaluate PCASA using smartphones and smartwatches, and show that it supports up to 34 hours of continuous authentication on a fully charged battery.
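The gap-based modulation idea behind PCASA can be illustrated in a few lines. The sketch below is a hypothetical illustration of Differential Pulse Position Modulation, not the authors' implementation; `BASE_GAP` and `STEP` are assumed parameters.

```python
# Hypothetical DPPM sketch: each symbol is carried by the length of the
# silence period (gap) between consecutive acoustic pulses.
BASE_GAP = 4   # assumed minimum silence, in sample units
STEP = 2       # assumed gap increment per symbol value

def dppm_encode(symbols):
    """Return pulse positions; the gap after each pulse encodes one symbol."""
    positions, t = [0], 0
    for s in symbols:
        t += BASE_GAP + STEP * s   # longer silence -> larger symbol value
        positions.append(t)
    return positions

def dppm_decode(positions):
    """Recover symbols from differences between consecutive pulse positions."""
    return [(b - a - BASE_GAP) // STEP for a, b in zip(positions, positions[1:])]

print(dppm_decode(dppm_encode([3, 0, 2, 1])))  # -> [3, 0, 2, 1]
```

Because information rides in the silences rather than in continuous transmission, the transmitter can idle between pulses, which is what makes once-per-second authentication energy-efficient.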
Shi, Z., Huang, M., Zhao, C., Huang, L., Du, X., Zhao, Y..  2017.  Detection of LSSUAV using hash fingerprint based SVDD. 2017 IEEE International Conference on Communications (ICC). :1–5.
With the rapid development of science and technology, unmanned aerial vehicles (UAVs) have gradually become a worldwide focus. Not only the development and application but also the security of UAVs is of great significance to modern society. Unlike methods that use radar, optical or acoustic sensors to detect UAVs, this paper proposes a novel distance-based support vector data description (SVDD) algorithm that uses hash fingerprints as features. The algorithm does not need a large number of training samples, and its computational complexity is low. The hash fingerprint is generated by extracting features of signal preamble waveforms. The distance-based SVDD algorithm is then employed to efficiently detect and recognize low, slow, small unmanned aerial vehicles (LSSUAVs) in the 2.4GHz frequency band.
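The one-class detection idea can be sketched as follows. This is a deliberately simplified centroid-ball stand-in for SVDD (a real SVDD solves a quadratic program over support vectors), with made-up two-dimensional fingerprint features:

```python
# Illustrative one-class detector in the spirit of SVDD: enclose training
# fingerprint vectors in a ball and reject anything falling outside it.
def fit_ball(train):
    dim = len(train[0])
    center = [sum(v[i] for v in train) / len(train) for i in range(dim)]
    def dist(v):
        return sum((a - b) ** 2 for a, b in zip(v, center)) ** 0.5
    radius = max(dist(v) for v in train)  # simplest choice: cover all samples
    return dist, radius

# Train on (hypothetical) hash-fingerprint features of known UAV signals.
train = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
dist, radius = fit_ball(train)
is_uav = lambda v: dist(v) <= radius
print(is_uav([0.3, 0.3]), is_uav([5.0, 5.0]))  # -> True False
```

A genuine SVDD would shrink the ball to a soft minimum enclosing sphere and score points by distance to its center, but the accept/reject logic is the same.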
Xie, P., Feng, J., Cao, Z., Wang, J..  2017.  GeneWave: Fast authentication and key agreement on commodity mobile devices. 2017 IEEE 25th International Conference on Network Protocols (ICNP). :1–10.
Device-to-device (D2D) communication is widely used for mobile devices and the Internet of Things (IoT). Authentication and key agreement are critical to building a secure channel between two devices. However, existing approaches often rely on a pre-built fingerprint database and suffer from low key generation rates. We present GeneWave, a fast device authentication and key agreement protocol for commodity mobile devices. GeneWave first achieves bidirectional initial authentication based on the physical response interval between two devices. To preserve the accuracy of interval estimation, we eliminate time uncertainty on commodity devices through fast signal detection and redundancy time cancellation. We then derive the initial acoustic channel response (ACR) for device authentication, and design a novel coding scheme for efficient key agreement while ensuring security. As a result, two devices can authenticate each other and securely agree on a symmetric key. GeneWave requires neither special hardware nor a pre-built fingerprint database, and is therefore easy to use on commercial mobile devices. We implement GeneWave on mobile devices (i.e., Nexus 5X and Nexus 6P) and evaluate its performance through extensive experiments. Experimental results show that GeneWave efficiently accomplishes secure key agreement on commodity smartphones with a key generation rate 10x faster than the state-of-the-art approach.
2017-12-28
Vizarreta, P., Heegaard, P., Helvik, B., Kellerer, W., Machuca, C. M..  2017.  Characterization of failure dynamics in SDN controllers. 2017 9th International Workshop on Resilient Networks Design and Modeling (RNDM). :1–7.

With Software Defined Networking (SDN), the control plane logic of forwarding devices, switches and routers is extracted and moved to an entity called the SDN controller, which acts as a broker between the network applications and the physical network infrastructure. Failures of the SDN controller inhibit the network's ability to respond to new application requests and to react to events coming from the physical network. Despite the huge impact that the controller has on network performance as a whole, a comprehensive study of its failure dynamics is still missing from the state-of-the-art literature. The goal of this paper is to analyse, model and evaluate the impact that different controller failure modes have on its availability. A model in the formalism of Stochastic Activity Networks (SAN) is proposed and applied to a case study of a hypothetical controller based on commercial controller implementations. In the case study we show how the proposed model can be used to estimate the controller's steady-state availability, quantify the impact of different failure modes on controller outages, and capture the effects of software ageing and the impact of software reliability growth on transient behaviour.

Mailloux, L. O., Sargeant, B. N., Hodson, D. D., Grimaila, M. R..  2017.  System-level considerations for modeling space-based quantum key distribution architectures. 2017 Annual IEEE International Systems Conference (SysCon). :1–6.

Quantum Key Distribution (QKD) is a revolutionary technology which leverages the laws of quantum mechanics to distribute cryptographic keying material between two parties with theoretically unconditional security. Terrestrial QKD systems are limited to distances of <200 km in both optical fiber and line-of-sight free-space configurations due to severe losses during single photon propagation and the curvature of the Earth. Thus, the feasibility of fielding a low Earth orbit (LEO) QKD satellite to overcome this limitation is being explored. Moreover, in August 2016, the Chinese Academy of Sciences successfully launched the world's first QKD satellite. However, many of the practical engineering performance and security tradeoffs associated with space-based QKD are not well understood for global secure key distribution. This paper presents several system-level considerations for modeling and studying space-based QKD architectures and systems. More specifically, this paper explores the behaviors and requirements that researchers must examine to develop a model for studying the effectiveness of QKD between LEO satellites and ground stations.

Suebsombut, P., Sekhari, A., Sureepong, P., Ueasangkomsate, P., Bouras, A..  2017.  The using of bibliometric analysis to classify trends and future directions on "smart farm". 2017 International Conference on Digital Arts, Media and Technology (ICDAMT). :136–141.

Climate change has affected cultivation in all countries, with extreme drought, flooding, higher temperatures, and shifts in the seasons leaving production uncontrolled. Consequently, the smart farm has become a crucial trend that needs to be applied in certain farm areas. The aims of the smart farm are to control and enhance food production and productivity, and to increase farmers' profits. Applying the smart farm will improve the quality of production, support farm workers, and make better use of resources. This study aims to explore the research trends and identify research clusters on the smart farm, which has supported farming in improving the quality of farm production, using bibliometric analysis. Bibliometric analysis explores the relationships among articles via their co-citation network, after which science mapping is used to identify clusters in those relationships. This study examines selected research articles in the smart farm field. Using the VOSviewer tool with keywords related to smart farm, agriculture, supply chain, knowledge management, traceability, and product lifecycle management from the Web of Science (WOS) and Scopus online databases, the research on the smart farm is categorized into two clusters: soil carbon emission from farming activity, and food security and farm management. The major cluster of smart farm research is soil carbon emission from farming activity, which impacts the climate change that affects food production and productivity. The contribution is to identify trends in the smart farm by means of bibliometric analysis so as to guide future research.

Liu, H., Ditzler, G..  2017.  A fast information-theoretic approximation of joint mutual information feature selection. 2017 International Joint Conference on Neural Networks (IJCNN). :4610–4617.

Feature selection is an important step in data analysis to address the curse of dimensionality. Such dimensionality reduction techniques are particularly important when classification is required and the model scales polynomially with the number of features (applications include genomics, the life sciences, cyber-security, etc.). Feature selection is the process of finding the minimum subset of features that allows for the maximum predictive power. Many of the state-of-the-art information-theoretic feature selection approaches use a greedy forward search; however, there are concerns about the efficiency and optimality of that search. A unified framework was recently presented for information-theoretic feature selection that tied together much of the work of the past twenty years. That work showed that joint mutual information maximization (JMI) is generally the best option; however, the complexity of the greedy search for JMI scales quadratically, making it infeasible on high-dimensional datasets. In this contribution, we propose a fast approximation of JMI based on information theory. Our approach takes advantage of decomposing the calculations within JMI to speed up the typical greedy search. We benchmarked the proposed approach against JMI on several UCI datasets, and we demonstrate that it returns feature sets highly consistent with JMI while decreasing the run time required to perform feature selection.
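For discrete features, the greedy JMI criterion that this paper accelerates can be sketched with plug-in probability estimates. This is a generic illustration of standard JMI selection, not the authors' fast approximation; the inner loop over already-selected features is exactly where the quadratic cost arises.

```python
import math
from collections import Counter

def mutual_info(xs, ys):
    """Plug-in estimate of I(X;Y) for discrete sequences, in nats."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def jmi_select(features, target, k):
    """Greedy JMI: add the candidate X_f maximizing the sum, over already
    selected X_j, of the joint relevance I(X_f, X_j; Y)."""
    names = list(features)
    # Seed with the single most relevant feature, argmax I(X;Y).
    selected = [max(names, key=lambda f: mutual_info(features[f], target))]
    while len(selected) < k:
        remaining = [f for f in names if f not in selected]
        best = max(remaining, key=lambda f: sum(
            mutual_info(list(zip(features[f], features[j])), target)
            for j in selected))                    # quadratic bottleneck
        selected.append(best)
    return selected
```

Each candidate must be paired with every selected feature, so selecting k of d features costs O(k·d) mutual-information evaluations; the paper's contribution is decomposing these calculations to avoid that blow-up.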

Manoja, I., Sk, N. S., Rani, D. R..  2017.  Prevention of DDoS attacks in cloud environment. 2017 International Conference on Big Data Analytics and Computational Intelligence (ICBDAC). :235–239.

Cloud computing is emerging as a foundational technology for the future, and building on the principles of utility computing it arguably represents a wholly new paradigm for viewing and accessing computational resources. Because of security concerns, many customers hesitate to relocate their sensitive data to the cloud, despite enormous interest in cloud-based computing. Security is a serious obstacle, since many firms present an attractive target for intruders, and these concerns will continue to slow the advancement of distributed computing if not addressed. Hence, this recent line of inquiry is well suited to honeypots. Distributed Denial of Service (DDoS) is an attack that threatens the availability of cloud services. It is essential to investigate the most important features of DDoS defence procedures. This paper presents specific techniques that have been applied against the DDoS attack; these approaches are outlined, together with the technologies used for particular kinds of malfunction within the cloud.

Shafee, S., Rajaei, B..  2017.  A secure steganography algorithm using compressive sensing based on HVS feature. 2017 Seventh International Conference on Emerging Security Technologies (EST). :74–78.

Steganography is the science of hiding information in order to send secret messages using a carrier known as the stego object. Steganographic technology rests on three principles: security, robustness and capacity. In this paper, we hide a digital image using compressive sensing to increase the security of the stego image, based on human visual system (HVS) features. The results show that our proposed method provides higher security than other published methods. The bit correction rate between the original secret message and the extracted message is used to show the accuracy of the method.

Kabiri, M. N., Wannous, M..  2017.  An Experimental Evaluation of a Cloud-Based Virtual Computer Laboratory Using Openstack. 2017 6th IIAI International Congress on Advanced Applied Informatics (IIAI-AAI). :667–672.

In previous work, we proposed a solution to facilitate access to computer science courses and learning materials using cloud computing and mobile technologies. The solution was positively evaluated by the participants, but most of them indicated that it lacked support for laboratory activities. It is well known that many computer science subjects (e.g. Computer Networks, Information Security, Systems Administration, etc.) require a suitable and flexible environment where students can access a set of computers and network devices to successfully complete their hands-on activities. To meet these criteria, we created a cloud-based virtual laboratory based on the OpenStack cloud platform to facilitate access to virtual machines both locally and remotely. Cloud-based virtual labs bring many advantages, such as increased manageability, scalability, high availability and flexibility, to name a few. This arrangement was tested in a case-study exercise with a group of students as part of the Computer Networks and System Administration courses at Kabul Polytechnic University in Afghanistan. To measure success, we introduced a level test to be completed by participants before and after the experiment. The learners achieved an average of 17.1% higher scores in the post-level test after completing the practical exercises. Lastly, we distributed a questionnaire after the experiment, and students provided positive feedback on the effectiveness and usefulness of the proposed solution.

Tane, E., Fujigaki, Y..  2017.  Cross-Disciplinary Survey on "Data Science" Field Development: Historical Analysis from 1600s-2000s. 2017 Portland International Conference on Management of Engineering and Technology (PICMET). :1–10.

Over the last several decades, the rapid development of information technology and computer performance has accelerated the generation, transport and accumulation of digital data, which has come to be called "Big Data". In this context, researchers and companies are eager to utilize the data to create new value or to manage a wide range of issues, and much focus is being placed on "Data Science" to extract useful information (knowledge) from digital data. Data Science has developed from several independent fields, such as Mathematics/Operations Research, Computer Science, Data Engineering, Visualization and Statistics, since the 1800s. In addition, Artificial Intelligence has converged with this stream in recent years. Meanwhile, national projects have been established to utilize data for society amid concerns surrounding security and privacy. In this paper, through detailed analysis of the history of this field, the processes of development and integration among the related fields are discussed, along with comparative aspects between Japan and the United States. The paper also includes a brief discussion of future directions.

Chowdhary, A., Dixit, V. H., Tiwari, N., Kyung, S., Huang, D., Ahn, G. J..  2017.  Science DMZ: SDN based secured cloud testbed. 2017 IEEE Conference on Network Function Virtualization and Software Defined Networks (NFV-SDN). :1–2.

Software Defined Networking (SDN) presents a unique opportunity to manage and orchestrate cloud networks. Educational institutions, like many other industries, face numerous security threats. We have established an SDN-enabled Demilitarized Zone (DMZ), the Science DMZ, to serve as a testbed for securing the ASU Internet2 environment. The Science DMZ allows researchers to conduct in-depth analysis of security attacks and take the necessary countermeasures using an SDN-based command and control (C&C) center. Demo URL: https://www.youtube.com/watch?v=8yo2lTNV3r4.

Herley, C., Oorschot, P. C. v.  2017.  SoK: Science, Security and the Elusive Goal of Security as a Scientific Pursuit. 2017 IEEE Symposium on Security and Privacy (SP). :99–120.

The past ten years have seen increasing calls to make security research more “scientific”. On the surface, most agree that this is desirable, given universal recognition of “science” as a positive force. However, we find that there is little clarity on what “scientific” means in the context of computer security research, or consensus on what a “Science of Security” should look like. We selectively review work in the history and philosophy of science and more recent work under the label “Science of Security”. We explore what has been done under the theme of relating science and security, put this in context with historical science, and offer observations and insights we hope may motivate further exploration and guidance. Among our findings are that practices on which the rest of science has reached consensus appear little used or recognized in security, and a pattern of methodological errors continues unaddressed.

Ji, J. C. M., Chua, H. N., Lee, H. S., Iranmanesh, V..  2016.  Privacy and Security: How to Differentiate Them Using Privacy-Security Tree (PST) Classification. 2016 International Conference on Information Science and Security (ICISS). :1–4.

Privacy and security have been discussed on many occasions, and in most cases the importance these two aspects hold in the information systems domain is mentioned. Research is often carried out on individual information security or privacy measures, where the focus is commonly on the particular measure, or privacy and security are regarded as a single subject. However, there have been no attempts at establishing a proper method of categorizing objects of protection. Through the review done in this paper, we investigate the relationship between privacy and security and break down their aspects in order to provide better understanding by determining whether a measure or methodology is security oriented, privacy oriented, or both. We recommend that in further research a more refined formulation be developed to carry out this determination process. As a result, we propose a Privacy-Security Tree (PST) in this paper that distinguishes privacy from security measures.

Ibrahim, Rosziati, Omotunde, Habeeb.  2017.  A Hybrid Threat Model for Software Security Requirement Specification.

Security is often treated as a secondary or non-functional feature of software, which influences the approach of vendors and developers, who describe their products in terms of what they can do (use cases) or offer customers. However, the tide is beginning to turn, as more experienced customers demand more secure and reliable software, giving priority to confidentiality, integrity and privacy while using these applications. This paper presents the MOTH (Modeling Threats with Hybrid Techniques) framework, designed to help organizations secure their software assets from attackers and prevent any instance of SQL Injection Attacks (SQLIAs). By focusing on the attack vectors and vulnerabilities exploited by attackers, and by brainstorming over possible attacks, developers and security experts can better strategize and specify the security requirements needed to create secure software impervious to SQLIAs. A live web application was considered as a case study in this research; the results obtained from the hybrid models extensively expose the vulnerabilities deep within the application, and resolution plans are proposed for blocking the security holes exploited by SQLIAs.

Amin, S..  2016.  Security games on infrastructure networks. 2016 Science of Security for Cyber-Physical Systems Workshop (SOSCYPS). :1–4.

The theory of robust control models the controller-disturbance interaction as a game where disturbance is nonstrategic. The proviso of a deliberately malicious (strategic) attacker should be considered to increase the robustness of infrastructure systems. This has become especially important since many IT systems supporting critical functionalities are vulnerable to exploits by attackers. While the usefulness of game theory methods for modeling cyber-security is well established in the literature, new game theoretic models of cyber-physical security are needed for deriving useful insights on "optimal" attack plans and defender responses, both in terms of allocation of resources and operational strategies of these players. This whitepaper presents some progress and challenges in using game-theoretic models for security of infrastructure networks. Main insights from the following models are presented: (i) Network security game on flow networks under strategic edge disruptions; (ii) Interdiction problem on distribution networks under node disruptions; (iii) Inspection game to monitor commercial non-technical losses (e.g. energy diversion); and (iv) Interdependent security game of networked control systems under communication failures. These models can be used to analyze the attacker-defender interactions in a class of cyber-physical security scenarios.

Datta, A., Kar, S., Sinopoli, B., Weerakkody, S..  2016.  Accountability in cyber-physical systems. 2016 Science of Security for Cyber-Physical Systems Workshop (SOSCYPS). :1–3.

Our position is that a key component of securing cyber-physical systems (CPS) is to develop a theory of accountability that encompasses both control and computing systems. We envision that a unified theory of accountability in CPS can be built on a foundation of causal information flow analysis. This theory will support the design and analysis of mechanisms at various stages of the accountability regime: attack detection, responsibility assignment (e.g., attack identification or localization), and corrective measures (e.g., via resilient control). As an initial step in this direction, we summarize our results on attack detection in control systems. We use the Kullback-Leibler (KL) divergence as a causal information flow measure. We then recover, using information flow analyses, a set of existing results in the literature that were previously proved using different techniques. These results cover passive detection, stealthy attack characterization, and active detection. This research direction is related to recent work on accountability in computational systems [1], [2], [3], [4]. We envision that by casting accountability theories in computing and control systems in terms of causal information flow, we can provide a common foundation for developing a theory for CPS that composes elements from both domains.

Luo, S., Wang, Y., Huang, W., Yu, H..  2016.  Backup and Disaster Recovery System for HDFS. 2016 International Conference on Information Science and Security (ICISS). :1–4.

HDFS has been widely used for storing massive-scale data, which is vulnerable to site disaster. File system backup is an important strategy for data retention. In this paper, we present an efficient, easy-to-use backup and disaster recovery system for HDFS. The system includes a client based on HDFS with the additional feature of remote backup, and a remote server with an HDFS cluster to keep the backup data. It supports full backups and regular incremental backups to the server with very low cost and high throughput. In our experiments, the average speed of backup and recovery reaches 95 MB/s, approaching the theoretical maximum speed of gigabit Ethernet.

Lucia, W., Sinopoli, B., Franze, G..  2016.  A set-theoretic approach for secure and resilient control of Cyber-Physical Systems subject to false data injection attacks. 2016 Science of Security for Cyber-Physical Systems Workshop (SOSCYPS). :1–5.

In this paper a novel set-theoretic control framework for Cyber-Physical Systems is presented. By resorting to set-theoretic ideas, an anomaly detector module and a control remediation strategy are formally derived with the aim of countering cyber False Data Injection (FDI) attacks affecting the communication channels. The resulting scheme ensures Uniformly Ultimate Boundedness and constraint fulfillment regardless of any admissible attack scenario.

Sandberg, H., Teixeira, A. M. H..  2016.  From control system security indices to attack identifiability. 2016 Science of Security for Cyber-Physical Systems Workshop (SOSCYPS). :1–6.

In this paper, we investigate detectability and identifiability of attacks on linear dynamical systems that are subjected to external disturbances. We generalize a concept for a security index, which was previously introduced for static systems. The index exactly quantifies the resources necessary for targeted attacks to be undetectable and unidentifiable in the presence of disturbances. This information is useful for both risk assessment and for the design of anomaly detectors. Finally, we show how techniques from the fault detection literature can be used to decouple disturbances and to identify attacks, under certain sparsity constraints.
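For the static case that this paper generalizes, the security index is commonly written as follows. This is a sketch of the standard static definition, with notation assumed here rather than taken from the paper itself:

```latex
% Static model: measurements y = Cx + a, where a is the attack vector.
% The security index of sensor i is the fewest measurements an attacker
% must corrupt to mount an undetectable attack involving sensor i:
\[
  \alpha_i \;=\; \min_{x,\,a}\; \lVert a \rVert_0
  \quad \text{subject to} \quad a = Cx,\;\; a_i \neq 0 .
\]
% A small alpha_i means sensor i can be attacked stealthily with few
% resources; the paper extends such indices to dynamical systems with
% disturbances, where identifiability also comes into play.
```

The constraint `a = Cx` encodes undetectability: the corrupted measurements remain consistent with some plausible state, so no residual-based detector can flag them.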

Kwiatkowska, M..  2016.  Advances and challenges of quantitative verification and synthesis for cyber-physical systems. 2016 Science of Security for Cyber-Physical Systems Workshop (SOSCYPS). :1–5.

We are witnessing a huge growth of cyber-physical systems, which are autonomous, mobile, endowed with sensing, controlled by software, and often wirelessly connected and Internet-enabled. They include factory automation systems, robotic assistants, self-driving cars, and wearable and implantable devices. Since they are increasingly often used in safety- or business-critical contexts, to mention invasive treatment or biometric authentication, there is an urgent need for modelling and verification technologies to support the design process, and hence improve the reliability and reduce production costs. This paper gives an overview of quantitative verification and synthesis techniques developed for cyber-physical systems, summarising recent achievements and future challenges in this important field.

Thuraisingham, B., Kantarcioglu, M., Hamlen, K., Khan, L., Finin, T., Joshi, A., Oates, T., Bertino, E..  2016.  A Data Driven Approach for the Science of Cyber Security: Challenges and Directions. 2016 IEEE 17th International Conference on Information Reuse and Integration (IRI). :1–10.

This paper describes a data driven approach to studying the science of cyber security (SoS). It argues that science is driven by data. It then describes issues and approaches towards the following three aspects: (i) Data Driven Science for Attack Detection and Mitigation, (ii) Foundations for Data Trustworthiness and Policy-based Sharing, and (iii) A Risk-based Approach to Security Metrics. We believe that the three aspects addressed in this paper will form the basis for studying the Science of Cyber Security.

Chatti, S., Ounelli, H..  2017.  Fault Tolerance in a Cloud of Databases Environment. 2017 31st International Conference on Advanced Information Networking and Applications Workshops (WAINA). :166–171.

We focus on the concept of serializability in order to ensure the correct processing of transactions. However, both serializability and related properties of transaction-based applications might be affected when the system is compromised. Ensuring transaction serialization in corrupted systems is one of the demands for properly handling interrelated transactions, as it prevents blocking situations in which neither a transaction nor its related sub-transactions can commit. In addition, some transactions may be marked as malicious, compromising the serialization of the running system. In this context, this paper proposes an approach for processing transactions in a cloud-of-databases environment that is able to ensure the serializability of running transactions whether or not the system is compromised. We also propose an intrusion-tolerant scheme to ensure the continuity of running transactions. A case study and simulation results are presented to illustrate the capabilities of the suggested system.

Mondal, S. K., Sabyasachi, A. S., Muppala, J. K..  2017.  On Dependability, Cost and Security Trade-Off in Cloud Data Centers. 2017 IEEE 22nd Pacific Rim International Symposium on Dependable Computing (PRDC). :11–19.

The performance, dependability, and security of cloud service systems are vital for ongoing operation, control, and support. Thus, controlled improvement in service requires a comprehensive analysis and systematic identification of the fundamental underlying constituents of the cloud using a rigorous discipline. In this paper, we introduce a framework that helps identify areas for potential cloud service enhancement. A cloud service cannot be completed if there is a failure in any of its underlying resources. In addition, resources are kept offline for scheduled maintenance. We use redundant resources to mitigate the impact of failures and maintenance, ensuring performance and dependability, which helps enhance security as well. For example, at least 4 replicas are required to defend against the intrusion of a single instance or a single malicious attack/fault, as defined by Byzantine Fault Tolerance (BFT). Data centers with high performance, dependability, and security are outsourced to the cloud computing environment, with greater cost flexibility than owning the computing infrastructure. In this paper, we analyze the effectiveness of redundant resource usage in terms of a dependability metric and the cost of service deployment based on the priority of service requests. The trade-offs among dependability, cost, and security under different redundancy schemes are characterized through comprehensive analytical models.