Bibliography
Interchange of information through cell phones, tablets and PDAs (Personal Digital Assistants) is the new trend in the era of digitization. In day-to-day activities, sensitive information is exchanged among users through mobile phones. This sensitive information can be in the form of text messages, images, location, etc. Research on Android mobile applications conducted at MIT found that applications leak an enormous amount of information to third-party servers: 73 percent of 55 Android applications examined were found to leak personal information of their users [8]. Transmitting files securely on Android is therefore a significant issue, and it is important to protect the privacy of user data on the Android operating system. The main aim of this paper is to protect the privacy of data on the Android platform by allowing textual data, location and pictures to be transmitted in encrypted form. By doing so, we achieve confidentiality and integrity of data.
Software metrics are widely used to measure the quality of software and to give an early indication of the efficiency of the development process in industry. There are many well-established frameworks for measuring the quality of source code through metrics, but limited attention has been paid to the quality of software models. In this article, we evaluate the quality of state machine models specified using the Analytical Software Design (ASD) tooling. We discuss how we applied a number of metrics to ASD models in an industrial setting and report about results and lessons learned while collecting these metrics. Furthermore, we recommend some quality limits for each metric and validate them on models developed in a number of industrial projects.
Parameter estimation in wireless sensor networks (WSNs) using encrypted non-binary quantized data is studied. In a WSN, sensors transmit their observations to a fusion center through a wireless medium where the observations are susceptible to unauthorized eavesdropping. Encryption approaches for WSNs with fixed-threshold binary quantization were previously explored. However, fixed-threshold binary quantization limits parameter estimation to scalar parameters. In this paper, we propose a stochastic encryption approach for WSNs that can operate on non-binary quantized observations and has the capability for vector parameter estimation. We extend a binary stochastic encryption approach proposed previously to the non-binary generalized case. Sensor outputs are quantized using a quantizer with R + 1 levels, where R ∈ {1, 2, 3, ...}, encrypted by flipping them with certain flipping probabilities, and then transmitted. Optimal maximum-likelihood estimators are derived from the perspectives of both a legitimate fusion center (LFC) and a third-party fusion center (TPFC). We assume the TPFC is unaware of the encryption. Asymptotic analysis of the estimators is performed by deriving the Cramér-Rao lower bound for LFC estimation, and the asymptotic bias and variance for TPFC estimation. Numerical results validating the asymptotic analysis are presented.
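To make the encryption step concrete, the following minimal Python sketch illustrates the general idea of stochastically flipping non-binary quantized observations before transmission. The uniform flipping rule, the threshold values and the single flipping probability are illustrative assumptions, not the paper's actual probability design.

import numpy as np

def quantize(x, thresholds):
    # Map a real-valued observation to one of R + 1 levels using fixed thresholds.
    return int(np.searchsorted(thresholds, x))

def stochastic_encrypt(level, num_levels, flip_prob, rng):
    # With probability flip_prob, replace the level by a uniformly chosen other level.
    if rng.random() < flip_prob:
        return int(rng.choice([l for l in range(num_levels) if l != level]))
    return level

rng = np.random.default_rng(seed=7)        # the shared seed plays the role of the key
thresholds = [-1.0, 0.0, 1.0]              # R = 3 thresholds -> R + 1 = 4 levels
observations = rng.normal(loc=0.5, size=10)
levels = [quantize(x, thresholds) for x in observations]
ciphertext = [stochastic_encrypt(l, 4, flip_prob=0.2, rng=rng) for l in levels]
print(levels, ciphertext)

A legitimate fusion center that knows the flipping probabilities can fold them into its likelihood function, whereas a third-party fusion center that ignores them ends up with biased estimates, which is what the asymptotic analysis quantifies.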
In contrast to electronic travel documents (e.g. ePassports), the standardisation of breeder documents (e.g. birth certificates) regarding harmonisation of their content and embedded security features is still in its infancy. Because breeder documents can be used as evidence of identity and enable the application for electronic travel documents, they constitute the weakest link in the identity life cycle and represent a security gap for identity management. In this work, we present a cost-efficient way to enhance the long-term security of breeder documents by utilizing blockchain technology. A conceptual architecture for enhancing breeder document long-term security is presented and the concept's constituent system components are introduced. Our investigations provide evidence that the Bitcoin blockchain is most suitable for breeder document long-term security.
This paper introduces a multi-factor security key generation mechanism for self-organising Internet of Things (IoT) networks and nodes. The mechanism enables users to generate a unique set of security keys to enhance IoT security while meeting various business needs. The multi-factor security keys present an additional security layer on top of existing security standards and practices currently adopted by the IoT community. The proposed key generation mechanism enables a user to define and choose any physical and logical parameters he or she prefers when generating a set of security keys, which are then encrypted and distributed to registered IoT nodes. IoT applications and services are only activated after verifying that all security keys are present. Multiple levels of authorisation for different user groups can easily be created through the mix and match of the generated multi-factor security keys. A use case covering indoor and outdoor field tests was conducted. The results of the tests showed that the mechanism is easily adaptable to diverse multi-vendor IoT devices and is scalable for various applications.
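The abstract does not spell out how the chosen parameters are combined into keys; the sketch below is only a hypothetical Python illustration, using PBKDF2 as a stand-in derivation function and invented factor names (gps_zone, device_mac, and so on).

import hashlib, json, os

def derive_multifactor_key(factors: dict, salt: bytes, length: int = 32) -> bytes:
    # Serialize the user-chosen physical/logical factors deterministically and
    # stretch them into a key; PBKDF2 is an assumption, not the paper's mechanism.
    material = json.dumps(factors, sort_keys=True).encode()
    return hashlib.pbkdf2_hmac("sha256", material, salt, 100_000, dklen=length)

salt = os.urandom(16)
factors = {"gps_zone": "indoor-lab-3", "device_mac": "aa:bb:cc:dd:ee:ff",
           "deployment_time": "2018-06", "user_group": "maintenance"}
node_key = derive_multifactor_key(factors, salt)
print(node_key.hex())

Different user groups would simply mix different factor sets, yielding distinct keys and hence distinct authorisation levels.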
Security has always been a concern when it comes to data sharing in cloud computing. Cloud computing provides high computation power and memory and is a convenient way to share data. However, users may sometimes need to outsource shared data to a cloud server even though it contains valuable and sensitive information. It is therefore necessary to provide cryptographically enhanced access control for data sharing systems. This paper discusses identity-based encryption, a promising access-control approach for data sharing in the cloud. We introduce an efficient revocation scheme for the system, revocable-storage identity-based encryption, which provides both forward and backward security of ciphertexts. We then give an overview of the architecture and steps involved in identity-based encryption. Finally, we propose a system that provides secure file sharing using the identity-based encryption scheme.
Industrial Control Systems (ICS) are found in critical infrastructure such as power generation and water treatment. When security requirements are incorporated into an ICS, one needs to test the additional code and devices added to improve the prevention and detection of cyber attacks. Conducting such tests in legacy systems is a challenge due to the high availability requirement. An approach using Timed Automata (TA) is proposed to overcome this challenge. This approach enables assessment of the effectiveness of an attack detection method based on process invariants. The approach has been demonstrated in a case study on one stage of a 6-stage operational water treatment plant. The model constructed captured the interactions among components in the selected stage. In addition, a set of attacks, attack detection mechanisms, and security specifications were also modeled using TA. These TA models were conjoined into a network and implemented in UPPAAL. The models so implemented were found effective in detecting the attacks considered. The study suggests the use of TA as an effective tool to model an ICS and study its attack detection mechanisms as a complement to doing so in a real plant, whether operational or under design.
In recent works, numerous physical-layer security systems have been proposed as alternatives to classic cryptography. Such systems aim to use the intrinsic properties of radio signals and the wireless medium to provide confidentiality and authentication to wireless devices. However, fundamental vulnerabilities are often discovered in these systems shortly after their inception. We therefore challenge the assumptions made by existing physical-layer security systems, and postulate that weaker assumptions are needed in order to adapt for practical scenarios. We also argue that if no computational advantage over an adversary can be ensured, secure communication cannot be realistically achieved.
Highly accurate time synchronization is very important for many applications and industrial environments. In a computer network, time synchronization for connected devices is provided by the Precision Time Protocol (PTP), which in principle allows device time synchronization down to the microsecond level. However, PTP and network infrastructures are vulnerable to cyber-attacks, which can de-synchronize an entire network, leading to potentially devastating consequences. This paper focuses on the issue of internal attacks on time synchronization networks and discusses how counter-measures based on public key infrastructures, trusted platform modules, network intrusion detection systems and time synchronization supervisors can be adopted to defeat, or at least detect, such internal attacks.
Transform-based image steganography methods are commonly used in security applications. However, the application of several recent transforms to image steganography remains unexplored. This paper presents a bit-plane-based steganography method using different transforms. In this work, a bit-plane of the transform coefficients is selected to embed the secret message. The characteristics of the four transforms used in the steganography are analyzed and their results are compared experimentally.
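As one illustration of bit-plane embedding in transform coefficients, the Python sketch below hides a single bit in a chosen bit-plane of a rounded DCT coefficient; the DCT is used only as an example transform, and the specific coefficient index, bit-plane and rounding policy are assumptions for illustration, not the paper's method.

import numpy as np
from scipy.fftpack import dct, idct

def embed_bit(coeffs, index, bit, plane=1):
    # Round one coefficient to an integer and force the chosen bit-plane to `bit`.
    c = int(round(coeffs[index]))
    coeffs[index] = (c & ~(1 << plane)) | (bit << plane)
    return coeffs

cover = np.arange(8, dtype=float)              # one 8-sample row of a cover image
coeffs = embed_bit(dct(cover, norm="ortho"), index=5, bit=1)
stego = idct(coeffs, norm="ortho")             # stego signal handed to the receiver
recovered = (int(round(dct(stego, norm="ortho")[5])) >> 1) & 1
print(recovered)                               # -> 1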
The principal mission of Multi-Source Multicast (MSM) is to disseminate all messages from all sources in a network to all destinations. MSM is utilized in numerous applications, and in many of them securing the disseminated messages is critical. A common security model considers a network with an eavesdropper able to observe a subset of the network links, and seeks a code which keeps the eavesdropper ignorant of all the messages. While this is solved when all messages are located at a single source, Secure MSM (SMSM) is an open problem, and the rates required are hard to characterize in general. In this paper, we consider individual security, which promises that the eavesdropper has zero mutual information with each message individually. We completely characterize the rate region for SMSM under individual security, and show that this security level is achievable at the full capacity of the network, that is, the cut-set bound is the matching converse, as in non-secure MSM. Moreover, we show that the required field size is similar to non-secure MSM and does not have to be larger due to the security constraint.
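For clarity, the individual-security requirement can be written as follows; this is the standard formalization and our gloss of the abstract, where Z_W denotes the eavesdropper's observation of the wiretapped link subset W and M_1, ..., M_k are the source messages:

I(M_i ; Z_W) = 0 \qquad \text{for every } i \in \{1, \dots, k\},

in contrast to joint secrecy, which would require I(M_1, \dots, M_k ; Z_W) = 0.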
This publication presents techniques for addressing insider threats and cryptographic protocols in secure processes dedicated to the information management of strategically split data. Strategic data splitting serves enterprise management processes as well as methods for securely storing and managing this type of data. Because strategic data are usually not sufficiently secured against unauthorized leakage, we propose a new protocol that protects data in different management structures. The presented data splitting techniques concern cryptographic information splitting algorithms, as well as data sharing algorithms making use of cognitive data analysis techniques. The insider threat techniques concern data reconstruction methods and cognitive data analysis techniques. Systems for semantic analysis and secure information management are used to conceal strategic information about the condition of the enterprise. The new approach, based on cognitive systems, guarantees these security features and makes the management processes more efficient.
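The abstract does not disclose the concrete splitting algorithm; as a generic stand-in, the Python sketch below shows an n-of-n XOR split, the simplest cryptographic information-splitting scheme, in which every share must be gathered before the strategic record becomes readable.

import os
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_secret(secret: bytes, n_shares: int) -> list:
    # n-of-n split: n - 1 random shares plus one share that XORs back to the secret.
    shares = [os.urandom(len(secret)) for _ in range(n_shares - 1)]
    shares.append(reduce(xor_bytes, shares, secret))
    return shares

def reconstruct(shares: list) -> bytes:
    return reduce(xor_bytes, shares)

parts = split_secret(b"Q3 acquisition plan: confidential", 4)   # invented example record
print(reconstruct(parts))

Any proper subset of shares is statistically independent of the secret, so an insider holding only part of the split learns nothing; threshold schemes would relax the all-shares requirement.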
The growing computerization of critical infrastructure as well as the pervasiveness of computing in everyday life has led to increased interest in secure application development. We observe a flurry of new security technologies like ARM TrustZone and Intel SGX, but a lack of a corresponding architectural vision. We are convinced that point solutions are not sufficient to address the overall challenge of secure system design. In this paper, we outline our take on a trusted component ecosystem of small individual building blocks with strong isolation. In our view, applications should no longer be designed as massive stacks of vertically layered frameworks, but instead as horizontal aggregates of mutually isolated components that collaborate across machine boundaries to provide a service. Lateral thinking is needed to build secure systems going forward.
In hostile military environments such as war zones or unfriendly terrain, systems must operate under harsh conditions that strain the defined network architecture. Deploying Disruption-Tolerant Networks (DTNs) improves connectivity between the remote devices issued to soldiers in the war zone, but it also puts reliable data transmission under scrutiny. The ciphertext-policy approach to attribute-based encryption acts on the attributes or roles of users and is a successful cryptographic strategy for enforcing access control while allowing reliable data transfer. In particular, because such systems are decentralized and data-constrained, applying Ciphertext-Policy Attribute-Based Encryption (CP-ABE) raises important issues of key revocation, key escrow, and the coordination of attributes issued by the various key authorities, for which this strategy provides a new security and data-protection approach. This paper concentrates on a reliable data retrieval system based on CP-ABE for Disruption-Tolerant Networks in which multiple key authorities manage their respective attributes safely and securely. We performed a comparison analysis of existing schemes against the recommended system components configured in the respective decentralized disruption-tolerant military system for reliable data retrieval.
Emerging communication technologies in distributed network systems require the transfer of biometric digital images with high security. Network security is characterized by changes in system behavior, which may be either dynamic or deterministic. Performance computation is complex in dynamic systems, where classical cryptographic techniques are not highly suitable. Chaos theory addresses complex problems of nonlinear deterministic systems, and several chaotic methods can be combined into a hyper-chaotic system for greater security. Chaos theory combined with DNA sequences enhances the security of biometric image encryption. The implementation shows that the encrypted image is highly chaotic and resistant to various attacks.
Named Data Networks provide a clean-slate redesign of the Future Internet for efficient content distribution. Because the Internet of Things is expected to compose a significant part of the Future Internet, most content will be managed by constrained devices. Such devices are often equipped with limited CPU, memory, bandwidth, and energy supply. However, the current Named Data Networks design neglects the specific requirements of Internet of Things scenarios and many data structures need to be further optimized. The purpose of this research is to provide an efficient routing strategy for Named Data Networks by constructing a Forwarding Information Base using Iterated Bloom Filters, defined as I(FIB)F. We propose the use of content names based on iterative hashes. This strategy reduces packet overhead, and the memory and complexity required in the forwarding strategy are lower than in current solutions. We compare our proposal with solutions based on hierarchical names and standard Bloom filters, and show how to further optimize I(FIB)F by exploiting the structure information contained in hierarchical content names. Finally, two strategies may be followed to reduce either (i) the overall memory for routing or (ii) the probability of false positives.
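The sketch below is a much-simplified Python illustration of the two ingredients named in the abstract: content names built from iterative hashes and a per-face Forwarding Information Base backed by a Bloom filter. The class layout, filter parameters and the iterated_name construction are assumptions for illustration, not the exact I(FIB)F design.

import hashlib

class BloomFilter:
    def __init__(self, size=1024, hashes=3):
        self.size, self.hashes, self.bits = size, hashes, bytearray(size)

    def _positions(self, item: bytes):
        for i in range(self.hashes):
            digest = hashlib.sha256(bytes([i]) + item).digest()
            yield int.from_bytes(digest[:4], "big") % self.size

    def add(self, item: bytes):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def __contains__(self, item: bytes):
        return all(self.bits[pos] for pos in self._positions(item))

def iterated_name(components):
    # Each name prefix is the hash of the previous prefix hash plus the next component,
    # so a full hierarchical name collapses into fixed-size digests.
    digest = b""
    for comp in components:
        digest = hashlib.sha256(digest + comp.encode()).digest()
        yield digest

face_fib = BloomFilter()                      # one filter per outgoing face
for prefix in iterated_name(["building1", "floor2", "temperature"]):
    face_fib.add(prefix)

interest = list(iterated_name(["building1", "floor2", "temperature"]))[-1]
print(interest in face_fib)                   # True, up to Bloom-filter false positives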
Radio Frequency IDentification (RFID) is one of the most important sensing techniques for the Internet of Things (IoT), and RFID systems have been applied to many different fields. However, an RFID system usually communicates over open wireless radio waves, which poses a serious threat to its privacy and security. The most popular RFID tags today are low-cost passive tags whose computation and storage resources are very limited, so it is not feasible for them to perform complicated cryptographic operations, and protecting the security and privacy of an RFID system is therefore very difficult. Lightweight authentication protocols are considered an effective approach. Many typical authentication protocols use hash functions and therefore require more computation and storage resources. Based on a CRC function, we propose a lightweight RFID authentication protocol which needs fewer computation and storage resources than hash-based protocols. The protocol exploits an on-chip CRC function and a pseudorandom number generator to ensure the anonymity and freshness of communications between reader and tag. It provides forward security and confidential communication, and it can effectively prevent eavesdropping, location tracing, replay attacks, spoofing and DoS attacks. It is well suited to RFID systems.
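The following Python sketch illustrates only the general shape of such a challenge-response exchange using a CRC and a pseudorandom number generator on the tag side. The message layout, the shared key and the use of CRC-32 from zlib are illustrative assumptions; the actual protocol in the paper includes further steps for anonymity and forward security.

import os, random, zlib

SHARED_KEY = 0x5A5A1234                       # secret shared by tag and reader (illustrative)

def crc_response(challenge: int, nonce: int, key: int) -> int:
    # The tag mixes the reader's challenge, its own fresh nonce and the key,
    # then returns a CRC instead of an expensive hash.
    data = challenge.to_bytes(4, "big") + nonce.to_bytes(4, "big") + key.to_bytes(4, "big")
    return zlib.crc32(data)

challenge = int.from_bytes(os.urandom(4), "big")   # reader's fresh challenge
tag_nonce = random.getrandbits(32)                 # output of the tag's on-chip PRNG
response = crc_response(challenge, tag_nonce, SHARED_KEY)

# The reader recomputes the expected value; a match authenticates the tag,
# and the fresh challenge/nonce pair resists simple replay.
print(response == crc_response(challenge, tag_nonce, SHARED_KEY))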
In recent years, chaos-based cryptographic algorithms have enabled new and efficient ways to develop secure image encryption techniques. In this paper, we propose a new approach to image encryption based on chaotic maps in order to meet the requirements of secure image encryption. The technique uses simple chaotic maps which are very sensitive to initial conditions. Mixing chaotic maps with simple substitution and transposition techniques to encrypt the original image yields better performance with lower computational complexity, which in turn gives high secrecy. The initial conditions of the chaotic maps act as the seed, and only a receiver holding that seed can decrypt the message. The results of the experiments, statistical analysis and key sensitivity tests show that the proposed image encryption scheme provides an efficient and secure way to encrypt images.
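A minimal Python sketch of the substitution stage follows: a logistic-map keystream, seeded by the secret initial condition, is XOR-ed with the pixel values. The specific parameter values and the omission of the transposition stage are simplifications for illustration.

import numpy as np

def logistic_keystream(x0, r, n):
    # Iterate the logistic map x <- r * x * (1 - x) and turn each state into a key byte.
    x, stream = x0, np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)
        stream[i] = int(x * 255) & 0xFF
    return stream

def chaotic_substitute(image, x0=0.3141592, r=3.9999):
    # XOR pixels with the chaotic keystream; running it again with the same seed decrypts.
    flat = image.flatten()
    return (flat ^ logistic_keystream(x0, r, flat.size)).reshape(image.shape)

img = np.random.randint(0, 256, (4, 4), dtype=np.uint8)   # stand-in for a grayscale image
enc = chaotic_substitute(img)
dec = chaotic_substitute(enc)
print(np.array_equal(img, dec))                            # -> True

A tiny change in x0 produces a completely different keystream, which is the key-sensitivity property the tests in the paper evaluate.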
Image encryption has been used by armies and governments to support top-secret communication. Nowadays, it is frequently used to protect information in various civilian systems. The proposed system performs secure image encryption by means of various chaotic maps, such that only a legitimate party can decrypt the image with the encryption key. The reversible chaotic encryption technique makes use of Arnold's cat map, in which pixel shuffling scrambles the image pixels over a number of iterations decided by the authorized image owner. This is followed by further chaotic encryption stages based on the logistic map and the tent map, which ensure secure image encryption. Simulation results show that the proposed system achieves good NPCR, UACI, MSE and PSNR values.
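The pixel-shuffling step can be sketched in a few lines of Python; the map convention (x, y) -> (x + y, x + 2y) mod N and the image size are illustrative choices, and the subsequent logistic-map and tent-map stages are omitted.

import numpy as np

def arnold_cat_map(image, iterations):
    # Shuffle an N x N image by repeatedly applying (x, y) -> (x + y, x + 2y) mod N.
    n = image.shape[0]
    result = image.copy()
    for _ in range(iterations):
        shuffled = np.empty_like(result)
        for x in range(n):
            for y in range(n):
                shuffled[(x + y) % n, (x + 2 * y) % n] = result[x, y]
        result = shuffled
    return result

img = np.arange(64, dtype=np.uint8).reshape(8, 8)   # stand-in for a square image
scrambled = arnold_cat_map(img, iterations=5)
# The map is periodic on an N x N grid, so the authorized owner, knowing the iteration
# count, can invert the shuffle; an observer without it sees only scrambled pixels.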
As data sizes grow, cloud storage is becoming an increasingly common way to store significant amounts of private information, and government and private organizations need to transfer large numbers of business files from one end to another. However, privacy is lost if information is exchanged without data encryption and a secure communication mechanism. To protect data from attackers we can use symmetric encryption, but it suffers from a key exchange problem; asymmetric encryption deals with this limitation of symmetric encryption, yet it can only encrypt a limited amount of data, which is not feasible for large data files. In this paper, we propose a probabilistic approach to the Pretty Good Privacy technique for encrypting large data, named ``BigCrypt'', in which both symmetric and asymmetric encryption are used. Our goal is to achieve zero-tolerance security for encryption of significant amounts of data. We have experimentally evaluated our technique on three different platforms.
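The combination of symmetric bulk encryption with asymmetric key wrapping described above can be illustrated with the Python cryptography package; this is a generic hybrid-encryption sketch under that assumption, not the BigCrypt implementation itself.

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Symmetric layer: a fresh Fernet (AES-based) key encrypts the large payload quickly.
session_key = Fernet.generate_key()
bulk_ciphertext = Fernet(session_key).encrypt(b"... large business files ...")

# Asymmetric layer: only the short session key passes through RSA, avoiding its size limit.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = private_key.public_key().encrypt(session_key, oaep)

# The receiver unwraps the session key with the private key and decrypts the bulk data.
recovered_key = private_key.decrypt(wrapped_key, oaep)
plaintext = Fernet(recovered_key).decrypt(bulk_ciphertext)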
Traffic normalization, i.e. enforcing a constant stream of fixed-length packets, is a well-known measure to completely prevent attacks based on traffic analysis. In simple configurations, the enforced traffic rate can be statically configured by a human operator, but in large virtual private networks (VPNs) the traffic pattern of many connections may need to be adjusted whenever the overlay topology or the transport capacity of the underlying infrastructure changes. We propose a rate-based congestion control mechanism for automatic adjustment of traffic patterns that does not leak any information about the actual communication. Overly strong rate throttling in response to packet loss is avoided, as the control mechanism does not change the sending rate immediately when a packet loss is detected. Instead, an estimate of the current packet loss rate is obtained and the sending rate is adjusted proportionally. We evaluate our control scheme based on a measurement study in a local network testbed. The results indicate that the proposed approach avoids network congestion, enables protected TCP flows to achieve an increased goodput, and yet ensures appropriate traffic flow confidentiality.
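A toy Python controller conveys the idea: the sending rate is adjusted in proportion to a smoothed loss estimate rather than dropped sharply on each individual loss. The gain, target loss ratio and EWMA smoothing factor are invented values, not the parameters studied in the paper.

def update_loss_estimate(previous, lost, sent, alpha=0.1):
    # Exponentially weighted moving average of the per-interval loss ratio.
    return (1 - alpha) * previous + alpha * (lost / max(sent, 1))

def adjust_rate(rate, loss_estimate, target_loss=0.01, gain=0.25,
                min_rate=64_000, max_rate=10_000_000):
    # Move the constant sending rate (bit/s) proportionally to the deviation from the
    # target loss ratio instead of halving it on a single detected loss.
    new_rate = rate * (1.0 + gain * (target_loss - loss_estimate))
    return max(min_rate, min(max_rate, new_rate))

rate, loss = 1_000_000, 0.0
for lost, sent in [(0, 500), (5, 500), (20, 500), (2, 500)]:   # per-interval statistics
    loss = update_loss_estimate(loss, lost, sent)
    rate = adjust_rate(rate, loss)
    print(f"loss~{loss:.3f}  rate~{rate:,.0f} bit/s")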
Currently, no major browser fully checks for TLS/SSL certificate revocations. This is largely due to the fact that the deployed mechanisms for disseminating revocations (CRLs, OCSP, OCSP Stapling, CRLSet, and OneCRL) are each either incomplete, insecure, inefficient, slow to update, not private, or some combination thereof. In this paper, we present CRLite, an efficient and easily-deployable system for proactively pushing all TLS certificate revocations to browsers. CRLite servers aggregate revocation information for all known, valid TLS certificates on the web, and store them in a space-efficient filter cascade data structure. Browsers periodically download and use this data to check for revocations of observed certificates in real-time. CRLite does not require any additional trust beyond the existing PKI, and it allows clients to adopt a fail-closed security posture even in the face of network errors or attacks that make revocation information temporarily unavailable. We present a prototype of CRLite that processes TLS certificates gathered by Rapid7, the University of Michigan, and Google's Certificate Transparency on the server-side, with a Firefox extension on the client-side. Comparing CRLite to an idealized browser that performs correct CRL/OCSP checking, we show that CRLite reduces latency and eliminates privacy concerns. Moreover, CRLite has low bandwidth costs: it can represent all certificates with an initial download of 10 MB (less than 1 byte per revocation) followed by daily updates of 580 KB on average. Taken together, our results demonstrate that complete TLS/SSL revocation checking is within reach for all clients.
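The filter-cascade lookup can be sketched as follows in Python; the single-hash Bloom filter, the per-level salting and the set sizes are simplifications chosen for brevity, not CRLite's actual parameters.

import hashlib

class Bloom:
    def __init__(self, items, size, salt):
        self.size, self.salt, self.bits = size, salt, bytearray(size)
        for item in items:
            self.bits[self._pos(item)] = 1
    def _pos(self, item):
        digest = hashlib.sha256(bytes([self.salt]) + item).digest()
        return int.from_bytes(digest[:4], "big") % self.size
    def __contains__(self, item):
        return self.bits[self._pos(item)] == 1

def build_cascade(revoked, valid, size=256):
    # Level 1 encodes the revoked set; each further level encodes the previous
    # level's false positives, alternating between the two certificate populations.
    levels, include, other = [], set(revoked), set(valid)
    while include:
        level = Bloom(include, size, salt=len(levels))
        levels.append(level)
        include, other = {c for c in other if c in level}, include
    return levels

def is_revoked(cert, levels):
    # Walk the cascade; the depth at which the cert first misses decides the answer.
    for depth, level in enumerate(levels):
        if cert not in level:
            return depth % 2 == 1
    return len(levels) % 2 == 1

revoked = [b"serial-1001", b"serial-1002"]
valid = [f"serial-{i}".encode() for i in range(2000, 2050)]
cascade = build_cascade(revoked, valid)
print(is_revoked(b"serial-1001", cascade), is_revoked(b"serial-2010", cascade))  # True False

Because the cascade is built over the full enumerated set of known certificates, answers are exact for every certificate in that universe, which is what allows a fail-closed posture.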
The Internet of Things (IoT) is an advanced information technology that has drawn society's attention. Sensors and actuators are usually recognized as the smart devices of our environment. At the same time, IoT security brings up new issues: Internet connectivity and the possibility of interacting with smart devices cause those devices to become more involved in human life, so safety is a fundamental requirement in designing the IoT. The IoT has three remarkable features: overall perception, reliable transmission and intelligent processing. Because of the IoT's scale, securing the conveyed data is an essential factor for system security. Hybrid encryption is a new model that can be used in the IoT; this type of encryption provides strong security with low computational cost. In this paper, we propose a hybrid encryption algorithm designed to reduce security risks while increasing encryption speed and lowering computational complexity. The purpose of this hybrid algorithm is to provide information integrity, confidentiality and non-repudiation in data exchange for the IoT. The proposed algorithm was simulated in MATLAB and its speed and security efficiency were evaluated in comparison with a conventional encryption algorithm.