Biblio

Found 314 results

Filters: Keyword is Standards
2017-04-20
Carnevale, B., Baldanzi, L., Pilato, L., Fanucci, L..  2016.  A flexible system-on-a-chip implementation of the Advanced Encryption Standard. 2016 20th International Conference on System Theory, Control and Computing (ICSTCC). :156–161.
Systems-on-a-Chip are among the best-performing and most complete solutions for complex electronic systems. This is also true in the field of network security, an application requiring high performance with low resource usage. This work presents an Advanced Encryption Standard implementation for Systems-on-a-Chip, using the Cipher Block Chaining mode as a reference. In particular, a flexible interface based on the Advanced Peripheral Bus is presented to integrate the encryption algorithm with any kind of processor. The hardware-software approach of the architecture is also analyzed and described. The final system was integrated on a Xilinx Zynq 7000 to prototype and evaluate the idea. Results show that our solution demonstrates good performance and flexibility with low resource usage, occupying less than 2% of the Zynq 7000 with a throughput of 320 Mbps. The architecture is suitable when implementations of symmetric encryption algorithms are required for modern Systems-on-a-Chip.
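
As a software-side illustration of the cipher configuration this paper prototypes in hardware, here is a minimal AES-CBC sketch. It assumes the third-party Python `cryptography` package; the key, IV, and message are placeholders, and nothing here reflects the paper's SoC interface itself.

```python
# Software sketch of AES in CBC mode, the cipher configuration prototyped in
# hardware by the paper. Requires the third-party `cryptography` package;
# key, IV, and message below are illustrative placeholders.
import os
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)   # AES-128 key
iv = os.urandom(16)    # per-message initialization vector

padder = padding.PKCS7(128).padder()
plaintext = padder.update(b"packet payload to protect") + padder.finalize()

encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
ciphertext = encryptor.update(plaintext) + encryptor.finalize()

decryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
unpadder = padding.PKCS7(128).unpadder()
recovered = unpadder.update(decryptor.update(ciphertext) + decryptor.finalize()) + unpadder.finalize()
assert recovered == b"packet payload to protect"
```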
2017-03-08
Huang, J., Hou, D., Schuckers, S., Hou, Z..  2015.  Effect of data size on performance of free-text keystroke authentication. IEEE International Conference on Identity, Security and Behavior Analysis (ISBA 2015). :1–7.

Free-text keystroke authentication has been demonstrated to be a promising behavioral biometric. But unlike physiological traits such as fingerprints, in free-text keystroke authentication there is no natural way to delimit what constitutes a sample. It remains an open problem how much keystroke data is necessary to achieve acceptable authentication performance. Using public datasets and two existing algorithms, we conduct two experiments to investigate the effect of the reference profile size and the test sample size on the False Alarm Rate (FAR) and the Imposter Pass Rate (IPR). We find that (1) larger reference profiles drive down both IPR and FAR values, provided that the test samples are large enough, and (2) larger test samples have no obvious effect on IPR, regardless of the reference profile size. We discuss the practical implications of our findings.
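
To make the paper's two error rates concrete, here is a minimal sketch of how FAR (genuine users rejected) and IPR (impostors accepted) are computed from match scores at a decision threshold. The score arrays are synthetic stand-ins, not the paper's data.

```python
# FAR = fraction of genuine samples falsely rejected at a threshold;
# IPR = fraction of impostor samples falsely accepted.
# The score arrays here are synthetic stand-ins for classifier output.
import numpy as np

rng = np.random.default_rng(0)
genuine_scores = rng.normal(0.8, 0.1, 500)    # higher score = better match
impostor_scores = rng.normal(0.4, 0.15, 500)

def far_ipr(genuine, impostor, threshold):
    far = np.mean(genuine < threshold)    # genuine samples falsely rejected
    ipr = np.mean(impostor >= threshold)  # impostor samples falsely accepted
    return far, ipr

for t in (0.5, 0.6, 0.7):
    far, ipr = far_ipr(genuine_scores, impostor_scores, t)
    print(f"threshold={t:.2f}  FAR={far:.3f}  IPR={ipr:.3f}")
```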

Çeker, H., Upadhyaya, S..  2015.  Enhanced recognition of keystroke dynamics using Gaussian mixture models. MILCOM 2015 - 2015 IEEE Military Communications Conference. :1305–1310.

Keystroke dynamics is a form of behavioral biometrics that can be used for continuous authentication of computer users. Many classifiers have been proposed for the analysis of acquired user patterns and the verification of users at computer terminals. The underlying machine learning methods that use a Gaussian density estimator for outlier detection typically assume that the digraph patterns in keystroke data are generated from a single Gaussian distribution. In this paper, we relax this assumption by allowing digraphs to fit more than one distribution via the Gaussian Mixture Model (GMM). We have conducted an experiment with a public data set collected in a controlled environment. For 30 users typing dynamic text, we obtain a 0.08% Equal Error Rate (EER) with 2 components using GMM, while a pure Gaussian model yields a 1.3% EER on the same data set (an EER improvement of 93.8%). Our results show that GMM can recognize keystroke dynamics more precisely and authenticate users with a higher confidence level.
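
The core idea can be sketched with scikit-learn's `GaussianMixture`: fit a 2-component model on a user's digraph timings and score test samples by log-likelihood. The timing data below is synthetic; the paper uses real keystroke sets.

```python
# Model a user's digraph timings with a 2-component GMM and score test
# samples by log-likelihood; higher likelihood = more consistent with the
# enrolled user. Data is synthetic for the sake of a runnable example.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Two typing "modes" for the same user (e.g., fast and hesitant digraphs).
enroll = np.vstack([rng.normal([90, 120], 10, (200, 2)),
                    rng.normal([140, 200], 15, (100, 2))])

gmm = GaussianMixture(n_components=2, covariance_type="full",
                      random_state=0).fit(enroll)

genuine_test = rng.normal([95, 125], 12, (50, 2))
impostor_test = rng.normal([60, 260], 20, (50, 2))

print("genuine :", gmm.score_samples(genuine_test).mean())
print("impostor:", gmm.score_samples(impostor_test).mean())
```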

Morales, A., Luna-Garcia, E., Fierrez, J., Ortega-Garcia, J..  2015.  Score normalization for keystroke dynamics biometrics. 2015 International Carnahan Conference on Security Technology (ICCST). :223–228.

This paper analyzes score normalization for keystroke dynamics authentication systems. Previous studies have shown that the performance of behavioral biometric recognition systems (e.g. voice and signature) can be largely improved with score normalization and target-dependent techniques. The main objective of this work is twofold: i) to analyze the effects of different thresholding techniques in 4 different keystroke dynamics recognition systems under real operational scenarios; and ii) to improve the performance of keystroke dynamics on the basis of target-dependent score normalization techniques. The experiments in this work are carried out on the keystroke patterns of 114 users from two different publicly available databases. The experiments show that there is large room for improvement in keystroke dynamics systems. The results suggest that score normalization techniques can be used to improve the performance of keystroke dynamics systems by more than 20%. These results encourage researchers to explore this research line to further improve the performance of these systems in real operational environments.
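
One widely used target-dependent normalization in this family is z-norm, sketched below: each raw score is standardized with impostor-score statistics estimated for that target, so a single global threshold works across users. The paper evaluates several normalization variants; this is only a representative one, and the values are illustrative.

```python
# Target-dependent z-norm: standardize raw scores with the mean and standard
# deviation of an impostor cohort scored against this target user.
import numpy as np

def z_norm(raw_scores, impostor_cohort_scores):
    mu = np.mean(impostor_cohort_scores)
    sigma = np.std(impostor_cohort_scores)
    return (np.asarray(raw_scores) - mu) / sigma

cohort = np.array([0.31, 0.42, 0.38, 0.45, 0.29])  # impostor scores vs. target
print(z_norm([0.70, 0.40], cohort))                # genuine-like vs. impostor-like
```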

Gonzalez, N., Calot, E. P..  2015.  Finite Context Modeling of Keystroke Dynamics in Free Text. 2015 International Conference of the Biometrics Special Interest Group (BIOSIG). :1–5.

Keystroke dynamics analysis has been applied successfully to the verification of passwords and other fixed short texts as a means to reduce their inherent security limitations, because their fixed length and the fact that they are typed often make their characteristic timings fairly stable. Free text analysis, on the other hand, has been neglected until recent years due to the inherent difficulty of dealing with short-term behavioral noise and long-term effects on typing rhythm. In this paper we examine finite context modeling of keystroke dynamics in free text and report promising results for user verification over an extensive data set, collected in a real-world environment outside the laboratory setting, that we make publicly available.
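
As a rough sketch of what "finite context" means here: condition the expected key latency on the previous k characters, then score a test sample by its deviation from the per-context averages. The modeling details below are our illustration only, not the authors' exact formulation.

```python
# Hedged sketch of finite-context timing modeling: per-context mean latency
# as the model, mean absolute deviation as the verification score.
from collections import defaultdict

def build_model(keystrokes, k=2):
    """keystrokes: list of (char, latency_ms) in typing order."""
    model = defaultdict(list)
    for i in range(k, len(keystrokes)):
        context = tuple(ch for ch, _ in keystrokes[i - k:i])
        model[context].append(keystrokes[i][1])
    return {ctx: sum(v) / len(v) for ctx, v in model.items()}

def score(model, keystrokes, k=2):
    devs = []
    for i in range(k, len(keystrokes)):
        context = tuple(ch for ch, _ in keystrokes[i - k:i])
        if context in model:
            devs.append(abs(keystrokes[i][1] - model[context]))
    return sum(devs) / len(devs) if devs else float("inf")

enroll = [("t", 0), ("h", 95), ("e", 80), ("t", 110), ("h", 90), ("e", 85)]
print(score(build_model(enroll), enroll))  # low deviation = genuine-like
```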

Buda, A., Främling, K., Borgman, J., Madhikermi, M., Mirzaeifar, S., Kubler, S..  2015.  Data supply chain in Industrial Internet. 2015 IEEE World Conference on Factory Communication Systems (WFCS). :1–7.

The Industrial Internet promises to radically change and improve many industries' daily business activities, from simple data collection and processing to context-driven, intelligent and pro-active support of workers' everyday tasks and life. The present paper first provides insight into a typical Industrial Internet application architecture, then it highlights one fundamental arising contradiction: “Who owns the data is often not capable of analyzing it”. This statement is explained by imagining a visionary data supply chain that would realize some of the Industrial Internet promises. To concretely implement such a system, recent standards published by The Open Group are presented, and we highlight the characteristics that make them suitable for Industrial Internet applications. Finally, we discuss comparable solutions and conclude with new business use cases.

Polemi, N., Papastergiou, S..  2015.  Current efforts in ports and supply chains risk assessment. 2015 10th International Conference for Internet Technology and Secured Transactions (ICITST). :349–354.

Port services and maritime supply chain processes depend upon complex, interrelated ICT systems hosted in the ports' Critical Information Infrastructures (CIIs). Current research efforts for securing the dual (cyber-physical) nature of the ports and their supply chain partners are presented here.

Mishra, A., Kumar, K., Rai, S. N., Mittal, V. K..  2015.  Multi-stage face recognition for biometric access. 2015 Annual IEEE India Conference (INDICON). :1–6.

Protecting the privacy of user-identification data is fundamental to protecting information systems from attacks and vulnerabilities. Providing access to such data only to limited and legitimate users is the key motivation for `Biometrics'. In `Biometric Systems', reliably confirming a user's claim of his/her identity is more important than focusing on `what he/she really possesses' or `what he/she remembers'. In this paper the use of face images for biometric access is proposed using two multi-stage face recognition algorithms that employ biometric facial features to validate the user's claim. The proposed algorithms use standard algorithms and classifiers such as EigenFaces, PCA and LDA in stages. Performance evaluation of both proposed algorithms is carried out using two standard datasets, the Extended Yale database and the AT&T database. Results using the proposed multi-stage algorithms are better than those using other standard algorithms. Current limitations and possible applications of the proposed algorithms are also discussed, along with the further scope of making them robust to pose, illumination and noise variations.
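
A minimal staged pipeline in the spirit of the paper, PCA (EigenFaces) for dimensionality reduction followed by LDA for class separation and a nearest-neighbor decision, can be sketched with scikit-learn. The random data below merely stands in for face images; this is not the authors' exact algorithm.

```python
# Two-stage PCA -> LDA pipeline with a 1-NN decision, as a generic
# illustration of staged face recognition. Random data replaces face images.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 1024))   # 120 "face images", 32x32 flattened
y = np.repeat(np.arange(10), 12)   # 10 subjects, 12 images each
X += y[:, None] * 0.05             # inject weak class structure

model = make_pipeline(PCA(n_components=40),
                      LinearDiscriminantAnalysis(n_components=9),
                      KNeighborsClassifier(n_neighbors=1))
model.fit(X[::2], y[::2])          # enroll on half the images
print("accuracy:", model.score(X[1::2], y[1::2]))
```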

Boykov, Y., Isack, H., Olsson, C., Ayed, I. B..  2015.  Volumetric Bias in Segmentation and Reconstruction: Secrets and Solutions. 2015 IEEE International Conference on Computer Vision (ICCV). :1769–1777.

Many standard optimization methods for segmentation and reconstruction compute ML model estimates for appearance or geometry of segments, e.g. Zhu-Yuille [23], Torr [20], Chan-Vese [6], GrabCut [18], Delong et al. [8]. We observe that the standard likelihood term in these formulations corresponds to a generalized probabilistic K-means energy. In learning it is well known that this energy has a strong bias to clusters of equal size [11], which we express as a penalty for KL divergence from a uniform distribution of cardinalities. However, this volumetric bias has been mostly ignored in computer vision. We demonstrate significant artifacts in standard segmentation and reconstruction methods due to this bias. Moreover, we propose binary and multi-label optimization techniques that either (a) remove this bias or (b) replace it by a KL divergence term for any given target volume distribution. Our general ideas apply to continuous or discrete energy formulations in segmentation, stereo, and other reconstruction problems.
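
Reading the abstract literally, the volumetric bias can be written as a KL penalty on the empirical volume distribution. A hedged rendering of that term, with notation we introduce for illustration (n pixels, K segments of cardinality |S_k|):

```latex
% v_k = |S_k|/n is the empirical volume distribution; u_k = 1/K is uniform.
\[
  \mathrm{KL}(v \,\|\, u) \;=\; \sum_{k=1}^{K} \frac{|S_k|}{n}
      \log\!\left( \frac{|S_k|/n}{1/K} \right)
\]
% This penalty is minimized when all segments have equal volume
% |S_k| = n/K -- the equal-size cluster bias the authors remove or
% retarget to a given volume distribution.
```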

2017-03-07
Allawi, M. A. A., Hadi, A., Awajan, A..  2015.  MLDED: Multi-layer Data Exfiltration Detection System. 2015 Fourth International Conference on Cyber Security, Cyber Warfare, and Digital Forensic (CyberSec). :107–112.

Due to the growing advancement of crimeware services, computer and network security has become a crucial issue. Detecting sensitive data exfiltration is a principal component of every information protection strategy. In this research, a Multi-Level Data Exfiltration Detection (MLDED) system that can handle different types of insider data leakage threats, with staircase difficulty levels and their implications for the organization's environment, has been proposed, implemented and tested. The proposed system detects exfiltration of data outside an organization's information system, where the main goal is to use the detection results of the MLDED system for digital forensic purposes. The MLDED system consists of three major levels: Hashing, Keyword Extraction and Labeling. However, it is designed only for certain types of documents, such as plain ASCII text and PDF files. In response to the challenging issue of identifying insider threats, a forensic-readiness data exfiltration system is designed that is capable of detecting and identifying sensitive information leaks. The results show that the proposed system has an overall detection accuracy of 98.93%.
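
To illustrate the kind of check the Hashing level could perform, here is a toy sketch: fingerprint known sensitive documents with SHA-256 and flag any outbound file whose digest matches. The function names and workflow are our assumptions, not the authors' implementation.

```python
# Toy hash-level exfiltration check: exact-match fingerprinting of known
# sensitive documents. Names and flow are illustrative assumptions.
import hashlib

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

sensitive_digests = {fingerprint(doc) for doc in
                     (b"Q3 acquisition plan", b"customer PII export")}

def outbound_is_sensitive(payload: bytes) -> bool:
    return fingerprint(payload) in sensitive_digests

print(outbound_is_sensitive(b"customer PII export"))   # True  -> block & log
print(outbound_is_sensitive(b"public press release"))  # False -> allow
```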

Gupta, M. K., Govil, M. C., Singh, G., Sharma, P..  2015.  XSSDM: Towards detection and mitigation of cross-site scripting vulnerabilities in web applications. 2015 International Conference on Advances in Computing, Communications and Informatics (ICACCI). :2010–2015.

With the growth of the Internet, web applications are becoming very popular in user communities. However, the presence of security vulnerabilities in the source code of these applications is rapidly raising the cybercrime rate. These vulnerabilities must be detected and mitigated before they are exploited in the execution environment. Recently, the Open Web Application Security Project (OWASP) and the Common Weakness Enumeration (CWE) reported Cross-Site Scripting (XSS) as one of the most serious vulnerabilities in web applications. Though many vulnerability detection approaches have been proposed in the past, existing approaches have limitations in terms of false positive and false negative results. This paper proposes a context-sensitive approach based on static taint analysis and pattern matching techniques to detect and mitigate XSS vulnerabilities in the source code of web applications. The proposed approach has been implemented in a prototype tool and evaluated on a public data set of 9408 samples. Experimental results show that the tool based on the proposed approach outperforms existing popular open-source tools in the detection of XSS vulnerabilities.
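
A deliberately naive sketch of source-to-sink pattern matching for reflected XSS in PHP-like code follows. The real XSSDM analysis is context-sensitive; this regex toy only conveys the flavor of flagging user input that reaches an output sink without a sanitizer, and all patterns are our own simplifications.

```python
# Naive taint-style pattern match: flag lines where $_GET/$_POST reaches an
# echo/print sink without passing through a sanitizer on the same line.
import re

TAINT_SOURCE = r"\$_(GET|POST|REQUEST|COOKIE)\b"
SANITIZER = r"(htmlspecialchars|htmlentities)\s*\("

def flag_xss_candidates(php_source: str):
    findings = []
    for lineno, line in enumerate(php_source.splitlines(), 1):
        is_sink = re.search(r"\b(echo|print)\b", line)
        if is_sink and re.search(TAINT_SOURCE, line) and not re.search(SANITIZER, line):
            findings.append((lineno, line.strip()))
    return findings

code = """<?php
echo "Hello " . htmlspecialchars($_GET['name']);
echo "Search: " . $_GET['q'];
"""
for lineno, line in flag_xss_candidates(code):
    print(f"line {lineno}: possible XSS -> {line}")
```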

2017-02-27
Zheng, Y., Zheng, S..  2015.  Cyber Security Risk Assessment for Industrial Automation Platform. 2015 International Conference on Intelligent Information Hiding and Multimedia Signal Processing (IIH-MSP). :341–344.

Because cyber security risks exist in industrial control systems, this paper discusses risk assessment of an Industrial Automation Platform (IAP). The cyber security assessment model for the IAP is built on relevant international standards. A fuzzy analytic hierarchy process and a fuzzy comprehensive evaluation method based on entropy theory are utilized to evaluate the risk of the communication links in the IAP software. As a result, the risk weights of the communication links that affect the platform, and the risk level of the platform itself, are given for further study of protective strategies. The assessment results show that the methods used can evaluate this platform efficiently and practically.
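
The entropy-based weighting the abstract mentions typically follows the standard entropy-weight method: criteria whose scores vary more across alternatives carry more information and receive larger weights. A sketch with an illustrative risk matrix (not the paper's data):

```python
# Standard entropy-weight computation over a risk matrix:
# rows = communication links, cols = risk criteria (e.g., threat, impact).
import numpy as np

R = np.array([[0.7, 0.2, 0.5],
              [0.6, 0.3, 0.4],
              [0.2, 0.9, 0.6]])

P = R / R.sum(axis=0)                          # normalize each criterion
k = 1.0 / np.log(R.shape[0])
entropy = -k * np.sum(P * np.log(P + 1e-12), axis=0)
weights = (1 - entropy) / np.sum(1 - entropy)  # low entropy -> high weight
print("criterion weights:", np.round(weights, 3))
```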

Zhang, L., Li, B., Zhang, L., Li, D..  2015.  Fuzzy clustering of incomplete data based on missing attribute interval size. 2015 IEEE 9th International Conference on Anti-counterfeiting, Security, and Identification (ASID). :101–104.

The fuzzy c-means algorithm is used to identify clusters of similar objects within a data set, but it cannot be applied directly to incomplete data. In this paper, we propose a novel fuzzy c-means algorithm based on the size of missing-attribute intervals for the clustering of incomplete data. In the new algorithm, the incomplete data set is transformed into an interval data set according to the nearest-neighbor rule. Each missing attribute value is replaced by the median of the corresponding interval, and the interval size is kept as an additional property of the incomplete data to control the effect of the interval size on clustering. Experiments on standard UCI data sets show that our approach outperforms other clustering methods for incomplete data.
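
The preprocessing step the abstract describes can be sketched as follows: for a sample with a missing attribute, find its nearest neighbors on the observed attributes, form the interval spanned by their values of that attribute, substitute the interval median, and keep the interval size as a weighting property. The details (k, distance metric) are our assumptions.

```python
# Nearest-neighbor interval imputation for one missing attribute.
import numpy as np

def impute_interval(X, i, j, k=3):
    """Impute X[i, j] (NaN) from the k nearest complete rows."""
    complete = X[~np.isnan(X).any(axis=1)]
    observed = [c for c in range(X.shape[1]) if c != j and not np.isnan(X[i, c])]
    dists = np.linalg.norm(complete[:, observed] - X[i, observed], axis=1)
    neighbor_vals = complete[np.argsort(dists)[:k], j]
    lo, hi = neighbor_vals.min(), neighbor_vals.max()
    return (lo + hi) / 2.0, hi - lo   # interval median, interval size

X = np.array([[1.0, 2.0], [1.1, 2.2], [0.9, 1.9], [1.05, np.nan]])
median, size = impute_interval(X, i=3, j=1)
print(f"imputed value={median:.2f}, interval size={size:.2f}")
```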

2017-02-23
Jia, L., Sen, S., Garg, D., Datta, A..  2015.  "A Logic of Programs with Interface-Confined Code". 2015 IEEE 28th Computer Security Foundations Symposium. :512–525.

Interface-confinement is a common mechanism that secures untrusted code by executing it inside a sandbox. The sandbox limits (confines) the code's interaction with key system resources to a restricted set of interfaces. This practice is seen in web browsers, hypervisors, and other security-critical systems. Motivated by these systems, we present a program logic, called System M, for modeling and proving safety properties of systems that execute adversary-supplied code via interface-confinement. In addition to using computation types to specify the effects of computations, System M includes a novel invariant type to specify the properties of interface-confined code. The interpretation of the invariant type includes terms whose effects satisfy an invariant. We construct a step-indexed model built over traces and prove the soundness of System M relative to the model. System M is the first program logic that allows proofs of safety for programs that execute adversary-supplied code without forcing the adversarial code to be available for deep static analysis. System M can be used to model and verify protocols as well as system designs. We demonstrate the reasoning principles of System M by verifying the state integrity property of the design of Memoir, a previously proposed trusted computing system.

P. Jain, S. Nandanwar.  2015.  "Securing the Clustered Database Using Data Modification Technique". 2015 International Conference on Computational Intelligence and Communication Networks (CICN). :1163-1166.

In the new era of information and communication technology (ICT), everyone wants to store and share their data online, whether in cloud databases, mobile databases, grid databases, drives, etc. When data is stored in online media, the main problem that arises concerns privacy, because various kinds of hackers, attackers or crackers want to disclose that private information publicly. Security is a continuous process of protecting data or information from attacks. To secure information against such unauthorized parties, we propose and implement a technique based on the data modification concept, evaluated on the Iris dataset using the Weka tool. The approach provides high privacy in distributed clustered database environments.

2017-02-14
M. Bere, H. Muyingi.  2015.  "Initial investigation of Industrial Control System (ICS) security using Artificial Immune System (AIS)". 2015 International Conference on Emerging Trends in Networks and Computer Communications (ETNCC). :79-84.

Industrial Control Systems (ICS), which among others comprise Supervisory Control and Data Acquisition (SCADA) and Distributed Control Systems (DCS), are used to control industrial processes. ICS have now been connected to other Information Technology (IT) systems and have, as a result, become vulnerable to Advanced Persistent Threats (APTs). APTs are targeted attacks that commonly use zero-day exploits to compromise systems. Current ICS security mechanisms fail to deter APTs from infiltrating ICS. An analysis of possible solutions to deter APTs was done. This paper proposes the use of Artificial Immune Systems to secure ICS from APTs.

M. Ussath, F. Cheng, C. Meinel.  2015.  "Concept for a security investigation framework". 2015 7th International Conference on New Technologies, Mobility and Security (NTMS). :1-5.

The number of detected and analyzed Advanced Persistent Threat (APT) campaigns has increased over the last years. Two of the main objectives of such campaigns are to maintain long-term access to the environment of the target and to stay undetected. To achieve these goals the attackers use sophisticated and customized techniques for lateral movement, to ensure that these activities are not detected by existing security systems. During the investigation of an APT campaign, all of its stages are relevant for clarifying important details, like the initial infection vector or the compromised systems and credentials. Most of the approaches currently utilized within security systems are not able to detect the different stages of a complex attack, and therefore a comprehensive security investigation is needed. In this paper we describe a concept for a Security Investigation Framework (SIF) that supports the analysis and tracing of multi-stage APTs. The concept includes different automatic and semi-automatic approaches that support the investigation of such attacks. Furthermore, the framework leverages different information sources, like log files and details from forensic investigations and malware analyses, to give a comprehensive overview of the different stages of an attack. The overall objective of the SIF is to improve the efficiency of investigations and reveal undetected details of an attack.

S. Majumdar, A. Maiti, A. Nath.  2015.  "New Secured Steganography Algorithm Using Encrypted Secret Message inside QR™ Code: System Implemented in Android Phone". 2015 International Conference on Computational Intelligence and Communication Networks (CICN). :1130-1134.

Steganography is a method of hiding information, whereas the goal of cryptography is to make data unreadable. Both of these methodologies have their own advantages and disadvantages. Encrypted messages are easily detectable: if someone is spying on a communication channel, he/she can easily identify encrypted messages, and encryption may draw unnecessary attention to the transferred messages. This may lead to cryptanalysis of the encrypted message if the spy tries to recover it, and if the encryption technique is not strong enough, the message may be deciphered. In contrast, steganography tries to hide the data from third parties by smartly embedding it in some other file that is not at all related to the message; here, care must be taken to minimize the modification of the container file in the process of embedding the data. The disadvantage of steganography, however, is that it is not as secure as cryptography. In the present method the authors have introduced three-step security. First, the secret message is encrypted using the bit-level columnar transposition method introduced by Nath et al., and after that the encrypted message is embedded in some image file along with its size. Finally the modified image is encoded into a QR Code™. The entire method has also been implemented for the Android mobile environment. This method may be used to transfer confidential messages through Android mobile phones.
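
The paper's first security step is a bit-level columnar transposition; as a simplified stand-in, here is a character-level columnar transposition (the image-embedding and QR Code stages are omitted, and the key and message are examples).

```python
# Character-level columnar transposition: write the message row by row,
# read the columns in the order given by sorting the key's characters.
def columnar_encrypt(message: str, key: str) -> str:
    cols = len(key)
    rows = -(-len(message) // cols)          # ceiling division
    padded = message.ljust(rows * cols, "_")
    order = sorted(range(cols), key=lambda c: key[c])
    return "".join(padded[r * cols + c] for c in order for r in range(rows))

def columnar_decrypt(cipher: str, key: str) -> str:
    cols = len(key)
    rows = len(cipher) // cols
    order = sorted(range(cols), key=lambda c: key[c])
    grid = [""] * (rows * cols)
    for idx, c in enumerate(order):
        for r in range(rows):
            grid[r * cols + c] = cipher[idx * rows + r]
    return "".join(grid).rstrip("_")

ct = columnar_encrypt("meet me at dawn", "ZEBRA")
print(ct, "->", columnar_decrypt(ct, "ZEBRA"))
```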

2017-02-13
S. V. Trivedi, M. A. Hasamnis.  2015.  "Development of platform using NIOS II soft core processor for image encryption and decryption using AES algorithm". 2015 International Conference on Communications and Signal Processing (ICCSP). :1147-1151.

In our digital world, the Internet is a widespread channel for the transmission of information. The information transmitted can be in the form of messages, images, audio and video. Due to this escalating use of digital data exchange, cryptography and network security have become very important in modern digital communication networks. Cryptography is a method of storing and transmitting data in a particular form so that only those for whom it is intended can read and process it. The term cryptography is most often associated with scrambling plaintext into ciphertext, a process called encryption. Today images are used very frequently in industrial processes, so it has become essential to protect confidential image data from unauthorized access. In this paper the Advanced Encryption Standard (AES), a symmetric algorithm, is used for the encryption and decryption of images. The performance of the AES algorithm is further enhanced by adding a W7 key stream generator. A NIOS II soft-core processor is used for the implementation of the encryption and decryption algorithm. A system is designed with the help of the SOPC (System on a Programmable Chip) Builder tool, which is available in the QUARTUS II (version 10.1) environment, using the NIOS II soft-core processor. The developed single-core system is implemented on an Altera DE2 FPGA board (Cyclone II EP2C35F672). Using MATLAB the image is read and then compressed with the DWT (Discrete Wavelet Transform). The image obtained after compression is given as input to the proposed AES encryption algorithm, and the output of the encryption algorithm is given as input to the decryption algorithm in order to recover the original image. The implementation is done on the developed single-core platform using the NIOS II processor. Finally, the output is analyzed in MATLAB by plotting histograms of the original and encrypted images.

B. Boyadjis, C. Bergeron, S. Lecomte.  2015.  "Auto-synchronized selective encryption of video contents for an improved transmission robustness over error-prone channels". 2015 IEEE International Conference on Image Processing (ICIP). :2969-2973.

Selective encryption designates a technique that aims at scrambling a message's content while preserving its syntax. Such an approach allows encryption to be transparent towards middle-boxes and/or end-user devices, and to fit easily within existing pipelines. In this paper, we propose to apply this property to a real-time diffusion scenario - or broadcast - over an RTP session. The main challenge of this setting is preserving the synchronization between encryption and decryption. Our solution is based on the Advanced Encryption Standard in counter mode, which has been modified to fit our auto-synchronization requirement. Setting up the proposed synchronization scheme does not induce any latency and requires no additional bandwidth in the RTP session (no additional information is sent). Moreover, its parallel structure allows decryption to start on any given frame of the video, while leaving a lot of room for further optimization.
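
The auto-synchronization idea can be sketched as follows: derive the CTR-mode counter block deterministically from the frame index, so a receiver joining mid-stream can decrypt any frame independently. This uses the Python `cryptography` package, and the counter-block layout is our assumption, not the paper's exact modified counter mode.

```python
# Frame-indexed AES-CTR: any frame can be decrypted without the preceding
# ones, mimicking the auto-synchronization property described in the paper.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)

def frame_cipher(frame_index: int) -> Cipher:
    # 16-byte counter block: 8-byte frame index, 8-byte zero block counter.
    nonce = frame_index.to_bytes(8, "big") + bytes(8)
    return Cipher(algorithms.AES(key), modes.CTR(nonce))

def encrypt_frame(frame_index: int, payload: bytes) -> bytes:
    enc = frame_cipher(frame_index).encryptor()
    return enc.update(payload) + enc.finalize()

def decrypt_frame(frame_index: int, payload: bytes) -> bytes:
    dec = frame_cipher(frame_index).decryptor()
    return dec.update(payload) + dec.finalize()

# A receiver that missed frames 0..41 can still decrypt frame 42.
ct = encrypt_frame(42, b"frame 42 slice data")
print(decrypt_frame(42, ct))
```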

2015-05-06
Ochian, A., Suciu, G., Fratu, O., Voicu, C., Suciu, V..  2014.  An overview of cloud middleware services for interconnection of healthcare platforms. Communications (COMM), 2014 10th International Conference on. :1-4.

Using heterogeneous clouds has been considered as a way to improve the performance of big-data analytics for healthcare platforms. However, the delay introduced when transferring big data over the network needs to be addressed. The purpose of this paper is to analyze and compare existing cloud computing environments (PaaS, IaaS) in order to implement middleware services. Understanding the differences and similarities between cloud technologies will help in the interconnection of healthcare platforms. The paper provides a general overview of the techniques and interfaces for cloud computing middleware services, and proposes a cloud architecture for healthcare. Cloud middleware enables heterogeneous devices to act as data sources and to integrate data from other healthcare platforms, but specific APIs need to be developed. Furthermore, security and management problems need to be addressed, given the heterogeneous nature of the communication and computing environment. The present paper fills a gap in the electronic healthcare register literature by providing an overview of cloud computing middleware services and standardized interfaces for integration with medical devices.

Talamo, M., Barchiesi, M.L., Merella, D., Schunck, C.H..  2014.  Global convergence in digital identity and attribute management: Emerging needs for standardization. ITU Kaleidoscope Academic Conference: Living in a converged world - Impossible without standards?, Proceedings of the 2014. :15-21.

In a converging world, where borders between countries are surpassed in the digital environment, it is necessary to develop systems that effectively replace “vis-a-vis” recognition with digital means of recognizing and identifying entities and people. In this work we summarize the current standardization efforts in the area of digital identity management. We identify a number of open challenges that need to be addressed in the near future to ensure the interoperability and usability of digital identity management services in an efficient and privacy-maintaining international framework. These challenges for standardization include: the management of identifiers for digital identities at the global level; attribute management, including attribute format, structure, and assurance; and procedures and protocols to link attributes to digital identities. Attention is drawn to key elements that should be considered in addressing these issues through standardization.

Yakut, S., Ozer, A.B..  2014.  HMAC based one time password generator. Signal Processing and Communications Applications Conference (SIU), 2014 22nd. :1563-1566.

A One Time Password is a fixed-length string used only once to perform authentication in electronic media. In this paper, One Time Password production methods based on hash functions are investigated. The Keccak digest algorithm is used for the production of the One Time Password. This algorithm was selected as the latest standard for hash algorithms in October 2012 by the National Institute of Standards and Technology, and is preferred because it is faster and safer than the alternatives. One Time Password production methods based on hash functions use the Hash-Based Message Authentication Code (HMAC) structure, in which a key value is combined with the hash function to generate the HMAC value; the produced One Time Password is based on this value. In this application, the One Time Password is eight characters long so as to be useful in practice.
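
For reference, here is an HMAC-based One Time Password in the style of RFC 4226, using SHA-3 (the standardized form of Keccak) as the digest and the abstract's eight-character output. The truncation step follows RFC 4226; pairing it with SHA-3 is our assumption about the paper's construction.

```python
# HOTP-style generator (RFC 4226 dynamic truncation) over HMAC-SHA3-256.
import hashlib
import hmac
import struct

def hotp(key: bytes, counter: int, digits: int = 8) -> str:
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha3_256).digest()
    offset = mac[-1] & 0x0F                    # dynamic truncation offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"shared-secret-key"
for c in range(3):                             # each counter gives a fresh OTP
    print(c, hotp(secret, c))
```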

Kishore, N., Kapoor, B..  2014.  An efficient parallel algorithm for hash computation in security and forensics applications. Advance Computing Conference (IACC), 2014 IEEE International. :873-877.

Hashing algorithms are used extensively in information security and digital forensics applications. This paper presents an efficient parallel algorithm for hash computation. It is a modification of the SHA-1 algorithm for faster parallel implementation in applications such as digital signatures and data preservation in digital forensics. The algorithm implements a recursive hash to break the chain dependencies of the standard hash function. We discuss the theoretical foundation for the work, including the collision probability and the performance implications. The algorithm is implemented using the OpenMP API, and experiments were performed on machines with multicore processors. The results show a performance gain of more than a factor of 3 when running on the 8-core configuration of the machine.
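
The "recursive hash" structure can be illustrated as follows: split the input into independent chunks, hash them in parallel (breaking the chain dependency of a standard hash), then hash the concatenated chunk digests. The paper modifies SHA-1 with OpenMP; this Python multiprocessing sketch only mirrors the structure, and the chunk size is an assumption.

```python
# Parallel "hash of hashes": independent chunk digests computed in parallel,
# then a single SHA-1 over the concatenated digests.
import hashlib
from multiprocessing import Pool

CHUNK = 1 << 20  # 1 MiB chunks (illustrative)

def chunk_digest(chunk: bytes) -> bytes:
    return hashlib.sha1(chunk).digest()

def parallel_recursive_hash(data: bytes, workers: int = 8) -> str:
    chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
    with Pool(workers) as pool:
        digests = pool.map(chunk_digest, chunks)
    return hashlib.sha1(b"".join(digests)).hexdigest()

if __name__ == "__main__":
    print(parallel_recursive_hash(b"\x00" * (8 * CHUNK)))
```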
