Biblio

Found 1717 results

Filters: First Letter Of Last Name is J
2015-05-06
Al-Anzi, F.S., Salman, A.A., Jacob, N.K., Soni, J..  2014.  Towards robust, scalable and secure network storage in Cloud Computing. Digital Information and Communication Technology and its Applications (DICTAP), 2014 Fourth International Conference on. :51-55.

The term cloud computing did not appear overnight; it may trace back to the time when computer systems accessed applications and services remotely. Cloud computing is a ubiquitous technology that is receiving huge attention in the scientific and industrial communities. It is a ubiquitous, next-generation information technology architecture that offers on-demand network access; it is a dynamic, virtualized, scalable, pay-per-use model delivered over the Internet. In a cloud computing environment, a cloud service provider offers a “house of resources” that includes applications, data, runtime, middleware, operating systems, virtualization, servers, data storage and sharing, and networking, and it tries to take on most of the client's overhead. Cloud computing offers many benefits, but the journey to the cloud is not easy: it has several pitfalls along the road, because most services are outsourced to third parties, which adds a considerable level of risk. Cloud computing suffers from several issues, the most significant being security, privacy, service availability, confidentiality, integrity, authentication, and compliance. Security is a shared responsibility of both the client and the service provider, and we believe security must be information-centric, adaptive, proactive, and built in. Cloud computing and its security are emerging study areas. In this paper, we discuss data security in the cloud at the service provider end and propose a network storage architecture for data that ensures availability, reliability, scalability, and security.

Junho Hong, Chen-Ching Liu, Govindarasu, M..  2014.  Integrated Anomaly Detection for Cyber Security of the Substations. Smart Grid, IEEE Transactions on. 5:1643-1653.

Cyber intrusions into substations of a power grid are a source of vulnerability, since most substations are unmanned and have limited physical security protection. In the worst case, simultaneous intrusions into multiple substations can lead to severe cascading events, causing catastrophic power outages. In this paper, an integrated Anomaly Detection System (ADS) is proposed which contains host- and network-based anomaly detection systems for the substations, and simultaneous anomaly detection for multiple substations. Potential scenarios of simultaneous intrusions into the substations have been simulated using a substation automation testbed. The host-based anomaly detection considers temporal anomalies in the substation facilities, e.g., user interfaces, Intelligent Electronic Devices (IEDs), and circuit breakers. The malicious behaviors of substation automation based on multicast messages, e.g., Generic Object Oriented Substation Event (GOOSE) and Sampled Measured Value (SMV), are incorporated in the proposed network-based anomaly detection. The proposed simultaneous intrusion detection method is able to identify the same type of attacks at multiple substations and their locations. The result is a new integrated tool for detection and mitigation of cyber intrusions at a single substation or multiple substations of a power grid.

Tang, Lu-An, Han, Jiawei, Jiang, Guofei.  2014.  Mining sensor data in cyber-physical systems. Tsinghua Science and Technology. 19:225-234.

A Cyber-Physical System (CPS) integrates physical devices (i.e., sensors) with cyber (i.e., informational) components to form a context-sensitive system that responds intelligently to dynamic changes in real-world situations. Such a system has wide applications in scenarios such as traffic control, battlefield surveillance, and environmental monitoring. A core element of CPS is the collection and assessment of information from noisy, dynamic, and uncertain physical environments integrated with many types of cyber-space resources. The potential of this integration is unbounded. To achieve this potential, the raw data acquired from the physical world must be transformed into usable knowledge in real time. Therefore, CPS brings a new dimension to knowledge discovery because of the emerging synergism of the physical and the cyber. The various properties of the physical world must be addressed in information management and knowledge discovery. This paper discusses the problems of mining sensor data in CPS: with a large number of wireless sensors deployed in a designated area, the task is real-time detection of intruders that enter the area based on noisy sensor data. The framework of IntruMine is introduced to discover intruders from untrustworthy sensor data. IntruMine first analyzes the trustworthiness of sensor data, then detects the intruders' locations, and verifies the detections based on a graph model of the relationships between sensors and intruders.

Junho Hong, Chen-Ching Liu, Govindarasu, M..  2014.  Detection of cyber intrusions using network-based multicast messages for substation automation. Innovative Smart Grid Technologies Conference (ISGT), 2014 IEEE PES. :1-5.

This paper proposes a new network-based cyber intrusion detection system (NIDS) using multicast messages in substation automation systems (SASs). The proposed network-based intrusion detection system monitors anomalies and malicious activities of multicast messages based on IEC 61850, e.g., Generic Object Oriented Substation Event (GOOSE) and Sampled Value (SV). The NIDS detects anomalies and intrusions that violate predefined security rules using a specification-based algorithm. A performance test has been conducted for different cyber intrusion scenarios (e.g., packet modification, replay, and denial-of-service attacks) using a cyber security testbed. The IEEE 39-bus system model has been used for testing the proposed intrusion detection method against simultaneous cyber attacks. The false negative ratio (FNR) is the number of misclassified abnormal packets divided by the total number of abnormal packets. The results demonstrate that the proposed NIDS achieves a low false negative ratio.
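
As a rough illustration of the metric reported above, the sketch below computes the false negative ratio over a handful of hand-made packet records; the GOOSE state-number rule and the packet fields are assumptions for illustration, not the paper's IEC 61850 rule set.

```python
# Minimal sketch: false negative ratio (FNR) as defined in the abstract,
# i.e. misclassified abnormal packets / total abnormal packets.

def violates_rules(packet):
    """Toy specification check: flag GOOSE packets whose state number
    decreases, a common replay symptom (assumed rule, for illustration)."""
    return packet["proto"] == "GOOSE" and packet["stNum"] < packet["prev_stNum"]

def false_negative_ratio(packets):
    abnormal = [p for p in packets if p["is_abnormal"]]       # ground truth
    missed = [p for p in abnormal if not violates_rules(p)]   # abnormal but not flagged
    return len(missed) / len(abnormal) if abnormal else 0.0

packets = [
    {"proto": "GOOSE", "stNum": 5,  "prev_stNum": 9, "is_abnormal": True},   # replayed, caught
    {"proto": "GOOSE", "stNum": 10, "prev_stNum": 9, "is_abnormal": False},  # normal
    {"proto": "SV",    "stNum": 0,  "prev_stNum": 0, "is_abnormal": True},   # missed by the toy rule
]
print(false_negative_ratio(packets))  # 0.5 with this toy data
```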

Farzan, F., Jafari, M.A., Wei, D., Lu, Y..  2014.  Cyber-related risk assessment and critical asset identification in power grids. Innovative Smart Grid Technologies Conference (ISGT), 2014 IEEE PES. :1-5.

This paper proposes a methodology to assess cyber-related risks and to identify critical assets both at the power grid and substation levels. The methodology is based on a two-pass engine model. The first pass engine is developed to identify the most critical substation(s) in a power grid. A mixture of the Analytic Hierarchy Process (AHP) and (N-1) contingency analysis is used to calculate risks. The second pass engine is developed to identify risky assets within a substation and to reduce the vulnerability of a substation to intrusion and malicious acts by cyber hackers. The risk methodology uniquely combines asset reliability, vulnerability, and cost of attack into a risk index. A methodology is also presented to improve the overall security of a substation by optimally placing security agent(s) on the automation system.
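
A minimal sketch of the kind of risk index the abstract describes, folding asset reliability, vulnerability, and cost of attack into one score; the weights, the linear combination, and the asset values below are assumed for illustration rather than taken from the paper's AHP analysis.

```python
# Toy risk index: higher vulnerability raises risk; higher reliability and
# a higher attacker cost lower it (cheap attacks are more likely).
def risk_index(reliability, vulnerability, attack_cost, weights=(0.3, 0.5, 0.2)):
    w_r, w_v, w_c = weights  # assumed weights, not the paper's AHP-derived values
    return w_v * vulnerability + w_r * (1.0 - reliability) + w_c * (1.0 - attack_cost)

# Hypothetical substation assets, attributes normalised to [0, 1].
assets = {
    "RTU":        (0.95, 0.7, 0.4),
    "HMI":        (0.90, 0.8, 0.3),
    "IED feeder": (0.98, 0.4, 0.6),
}
ranked = sorted(assets, key=lambda a: risk_index(*assets[a]), reverse=True)
print(ranked)  # assets ordered from riskiest to least risky
```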

Zhen Ling, Junzhou Luo, Kui Wu, Wei Yu, Xinwen Fu.  2014.  TorWard: Discovery of malicious traffic over Tor. INFOCOM, 2014 Proceedings IEEE. :1402-1410.

Tor is a popular low-latency anonymous communication system. However, it is currently abused in various ways. Tor exit routers are frequently troubled by administrative and legal complaints. To gain insight into such abuse, we design and implement a novel system, TorWard, for the discovery and systematic study of malicious traffic over Tor. The system can avoid legal and administrative complaints and allows the investigation to be performed in a sensitive environment such as a university campus. An IDS (Intrusion Detection System) is used to discover and classify malicious traffic. We performed comprehensive analysis and extensive real-world experiments to validate the feasibility and effectiveness of TorWard. Our data shows that around 10% of Tor traffic can trigger IDS alerts. Malicious traffic includes P2P traffic, malware traffic (e.g., botnet traffic), DoS (Denial-of-Service) attack traffic, spam, and others. Around 200 known malware have been identified. To the best of our knowledge, we are the first to perform malicious traffic categorization over Tor.

Shaohua Tang, Lingling Xu, Niu Liu, Xinyi Huang, Jintai Ding, Zhiming Yang.  2014.  Provably Secure Group Key Management Approach Based upon Hyper-Sphere. Parallel and Distributed Systems, IEEE Transactions on. 25:3253-3263.

Secure group communication systems have become increasingly important for many emerging network applications. An efficient and robust group key management approach is indispensable to a secure group communication system. Motivated by the theory of hyper-spheres, this paper presents a new group key management approach with a group controller (GC). In our new design, a hyper-sphere is constructed for a group, and each member in the group corresponds to a point on the hyper-sphere, which is called the member's private point. The GC computes the central point of the hyper-sphere; intuitively, its “distance” from each member's private point is identical. The central point is published such that each member can compute a common group key, using a function that takes the member's private point and the central point of the hyper-sphere as input. This approach is provably secure under the pseudo-random function (PRF) assumption. Compared with other similar schemes, by both theoretical analysis and experiments, our scheme (1) significantly reduces the memory and computation load for each group member; (2) can efficiently deal with massive membership change with only two re-keying messages, i.e., the central point of the hyper-sphere and a random number; and (3) is efficient and very scalable for large-size groups.
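
A toy geometric illustration of the hyper-sphere idea (2-D, floating point), not the paper's provably secure finite-field construction: every private point lies on one circle, so each member's distance to the published centre is the same radius, from which a common key can be derived.

```python
import hashlib, math, random

def setup(num_members):
    # The GC picks a circle; each member's private point lies on it.
    centre = (random.uniform(-100, 100), random.uniform(-100, 100))
    radius = random.uniform(10, 50)
    points = []
    for _ in range(num_members):
        theta = random.uniform(0, 2 * math.pi)
        points.append((centre[0] + radius * math.cos(theta),
                       centre[1] + radius * math.sin(theta)))
    return centre, points  # centre is published, points stay private

def member_key(private_point, centre):
    dist = math.dist(private_point, centre)  # equals the radius for every member
    return hashlib.sha256(f"{dist:.6f}".encode()).hexdigest()

centre, points = setup(4)
keys = {member_key(p, centre) for p in points}
print(len(keys))  # 1: every member derives the same group key
```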

Jin Li, Xiaofeng Chen, Mingqiang Li, Jingwei Li, Lee, P.P.C., Wenjing Lou.  2014.  Secure Deduplication with Efficient and Reliable Convergent Key Management. Parallel and Distributed Systems, IEEE Transactions on. 25:1615-1625.

Data deduplication is a technique for eliminating duplicate copies of data, and has been widely used in cloud storage to reduce storage space and upload bandwidth. Promising as it is, an emerging challenge is to perform secure deduplication in cloud storage. Although convergent encryption has been extensively adopted for secure deduplication, a critical issue in making convergent encryption practical is to efficiently and reliably manage a huge number of convergent keys. This paper makes the first attempt to formally address the problem of achieving efficient and reliable key management in secure deduplication. We first introduce a baseline approach in which each user holds an independent master key for encrypting the convergent keys and outsourcing them to the cloud. However, such a baseline key management scheme generates an enormous number of keys as the number of users grows and requires users to dedicatedly protect the master keys. To this end, we propose Dekey, a new construction in which users do not need to manage any keys on their own but instead securely distribute the convergent key shares across multiple servers. Security analysis demonstrates that Dekey is secure in terms of the definitions specified in the proposed security model. As a proof of concept, we implement Dekey using the Ramp secret sharing scheme and demonstrate that Dekey incurs limited overhead in realistic environments.
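
For context, a minimal sketch of convergent encryption, the primitive whose keys Dekey manages: the key is derived from the data itself, so identical files encrypt identically and can be deduplicated. The XOR keystream below is a toy cipher for illustration only, not the scheme used in the paper.

```python
import hashlib

def convergent_key(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()            # K = H(M)

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    # Toy keystream cipher for illustration; NOT a secure construction.
    stream = hashlib.shake_256(key).digest(len(data))
    return bytes(a ^ b for a, b in zip(data, stream))

def dedup_tag(ciphertext: bytes) -> str:
    return hashlib.sha256(ciphertext).hexdigest()   # index the cloud can deduplicate on

m = b"quarterly-report.pdf contents"
k1, k2 = convergent_key(m), convergent_key(m)
c1, c2 = toy_encrypt(m, k1), toy_encrypt(m, k2)
print(k1 == k2, dedup_tag(c1) == dedup_tag(c2))     # True True: identical data deduplicates
```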

Nicanfar, H., Jokar, P., Beznosov, K., Leung, V.C.M..  2014.  Efficient Authentication and Key Management Mechanisms for Smart Grid Communications. Systems Journal, IEEE. 8:629-640.

A smart grid (SG) consists of many subsystems and networks, all working together as a system of systems, many of which are vulnerable and can be attacked remotely. Therefore, security has been identified as one of the most challenging topics in SG development, and designing a mutual authentication scheme and a key management protocol is the first important step. This paper proposes an efficient scheme that mutually authenticates a smart meter of a home area network and an authentication server in the SG by utilizing an initial password, decreasing the number of steps in the secure remote password protocol from five to three and the number of exchanged packets from four to three. Furthermore, we propose an efficient key management protocol based on our enhanced identity-based cryptography for secure SG communications using the public key infrastructure. Our proposed mechanisms are capable of preventing various attacks while reducing the management overhead. The improved efficiency for key management is realized by periodically refreshing all public/private key pairs as well as any multicast keys in all the nodes using only one newly generated function broadcast by the key generator entity. Security and performance analyses are presented to demonstrate these desirable attributes.

Ying Zhang, Ji Pengfei.  2014.  An efficient and hybrid key management for heterogeneous wireless sensor networks. Control and Decision Conference (2014 CCDC), The 26th Chinese. :1881-1885.

Key management is the core of ensuring the communication security of a wireless sensor network. How to establish efficient key management in wireless sensor networks (WSN) is a challenging problem given the constrained energy, memory, and computational capabilities of the sensor nodes. Previous research on sensor network security mainly considers homogeneous sensor networks with symmetric key cryptography. Recent research has shown that using asymmetric key cryptography in heterogeneous sensor networks (HSN) can improve network performance, such as connectivity, resilience, etc. Considering the advantages and disadvantages of symmetric and asymmetric key cryptography, this paper proposes an efficient hybrid key management method for heterogeneous wireless sensor networks: cluster heads and base stations use a public key encryption method based on elliptic curve cryptography (ECC), while adjacent nodes within a cluster use a symmetric encryption method. The analysis and simulation results show that the proposed key management method can provide better security, perfect scalability, and connectivity while saving storage space.
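
A hedged sketch of the hybrid idea: an asymmetric exchange on the cluster-head/base-station link and cheap symmetric pairwise keys inside the cluster. The toy Diffie-Hellman group and the HMAC-based derivation are stand-ins chosen for a self-contained example; the paper uses ECC.

```python
import hashlib, hmac, secrets

# Toy DH group (2**127 - 1 is prime, but far too small for real use).
P = 2**127 - 1
G = 3

def dh_keypair():
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

# Cluster head <-> base station: asymmetric exchange (ECC in the paper).
ch_priv, ch_pub = dh_keypair()
bs_priv, bs_pub = dh_keypair()
shared = pow(bs_pub, ch_priv, P)
assert shared == pow(ch_pub, bs_priv, P)
link_key = hashlib.sha256(str(shared).encode()).digest()

# Inside the cluster: lightweight symmetric pairwise keys derived by the
# cluster head from the link key and node identifiers (assumed derivation).
def pairwise_key(link_key, node_a, node_b):
    label = "|".join(sorted([node_a, node_b])).encode()
    return hmac.new(link_key, label, hashlib.sha256).digest()

print(pairwise_key(link_key, "node7", "node3") == pairwise_key(link_key, "node3", "node7"))  # True
```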

Faraji, M., Joon-Myung Kang, Bannazadeh, H., Leon-Garcia, A..  2014.  Identity access management for Multi-tier cloud infrastructures. Network Operations and Management Symposium (NOMS), 2014 IEEE. :1-9.

This paper presents a novel architecture to manage identity and access (IAM) in a multi-tier cloud infrastructure, in which most services are supported by massive-scale data centres over the Internet. A multi-tier cloud infrastructure uses a tier-based model from software engineering to provide resources in different tiers. In this paper, we focus on the design and implementation of a centralized identity and access management system for the multi-tier cloud infrastructure. First, we discuss identity and access management requirements in such an environment and propose our solution to address these requirements. Next, we discuss approaches to improve the performance of the IAM system and make it scalable to billions of users. Finally, we present experimental results based on the current deployment in the SAVI Testbed. We show that our IAM system outperforms previously proposed IAM systems for cloud infrastructure by a factor of 9 in throughput when the number of users is small, and it handles about 50 times more requests at peak usage. Because our architecture combines green threads and load-balanced processes, it uses fewer system resources and easily scales up to handle a large number of requests.

Jøsang, A..  2014.  Identity management and trusted interaction in internet and mobile computing. Information Security, IET. 8:67-79.

The convergence of the Internet and mobile computing enables personalised access to online services anywhere and anytime. This potent access capability creates opportunities for new business models which stimulates vigorous investment and rapid innovation. Unfortunately, this innovation also produces new vulnerabilities and threats, and the new business models also create incentives for attacks, because criminals will always follow the money. Unless the new threats are balanced with appropriate countermeasures, growth in the Internet and mobile services will encounter painful setbacks. Security and trust are two fundamental factors for sustainable development of identity management in online markets and communities. The aim of this study is to present an overview of the central aspects of identity management in the Internet and mobile computing with respect to security and trust.

Jingkuan Song, Yi Yang, Xuelong Li, Zi Huang, Yang Yang.  2014.  Robust Hashing With Local Models for Approximate Similarity Search. Cybernetics, IEEE Transactions on. 44:1225-1236.

Similarity search plays an important role in many applications involving high-dimensional data. Due to the known dimensionality curse, the performance of most existing indexing structures degrades quickly as the feature dimensionality increases. Hashing methods, such as locality sensitive hashing (LSH) and its variants, have been widely used to achieve fast approximate similarity search by trading search quality for efficiency. However, most existing hashing methods make use of randomized algorithms to generate hash codes without considering the specific structural information in the data. In this paper, we propose a novel hashing method, namely, robust hashing with local models (RHLM), which learns a set of robust hash functions to map the high-dimensional data points into binary hash codes by effectively utilizing local structural information. In RHLM, for each individual data point in the training dataset, a local hashing model is learned and used to predict the hash codes of its neighboring data points. The local models from all the data points are globally aligned so that an optimal hash code can be assigned to each data point. After obtaining the hash codes of all the training data points, we design a robust method by employing ℓ2,1-norm minimization on the loss function to learn effective hash functions, which are then used to map each database point into its hash code. Given a query data point, the search process first maps it into the query hash code by the hash functions and then explores the buckets, which have similar hash codes to the query hash code. Extensive experimental results conducted on real-life datasets show that the proposed RHLM outperforms the state-of-the-art methods in terms of search quality and efficiency.
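
For orientation, a minimal random-hyperplane LSH sketch, i.e. the generic randomized baseline the paper improves on (not RHLM itself): points are mapped to binary codes and a query only probes the bucket with a matching code.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)
dim, n_bits = 64, 16
X = rng.standard_normal((10_000, dim))        # toy database
planes = rng.standard_normal((n_bits, dim))   # random hyperplanes

def hash_code(v):
    # One bit per hyperplane: which side of the plane the point falls on.
    return tuple((planes @ v > 0).astype(int))

buckets = defaultdict(list)
for i, x in enumerate(X):
    buckets[hash_code(x)].append(i)

q = rng.standard_normal(dim)
candidates = buckets.get(hash_code(q), [])    # approximate neighbours to rerank
print(len(candidates))
```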

Jae Min Cho, Kiyoung Choi.  2014.  An FPGA implementation of high-throughput key-value store using Bloom filter. VLSI Design, Automation and Test (VLSI-DAT), 2014 International Symposium on. :1-4.

This paper presents an efficient implementation of a key-value store using Bloom filters on an FPGA. Bloom filters are used to reduce the number of unnecessary accesses to the hash tables, thereby improving the performance. Additionally, for better hash table utilization, we use a modified cuckoo hashing algorithm for the implementation. Both are implemented on an FPGA to further improve the performance. Experimental results show significant performance improvement over existing approaches.
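
A software sketch of the Bloom-filter front end (the paper implements it in FPGA hardware): a membership probe that can answer "definitely absent" and so skip useless hash-table accesses. Filter size and hash construction below are assumptions for illustration.

```python
import hashlib

class BloomFilter:
    def __init__(self, n_bits=1 << 16, n_hashes=4):
        self.n_bits, self.n_hashes = n_bits, n_hashes
        self.bits = bytearray(n_bits // 8)

    def _positions(self, key: bytes):
        # Derive k bit positions from salted SHA-256 digests (assumed choice).
        for i in range(self.n_hashes):
            h = hashlib.sha256(i.to_bytes(1, "big") + key).digest()
            yield int.from_bytes(h[:8], "big") % self.n_bits

    def add(self, key: bytes):
        for p in self._positions(key):
            self.bits[p // 8] |= 1 << (p % 8)

    def might_contain(self, key: bytes) -> bool:
        # False means definitely absent; True means "probe the hash table".
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(key))

bf = BloomFilter()
bf.add(b"user:42")
print(bf.might_contain(b"user:42"), bf.might_contain(b"user:99"))  # True, almost surely False
```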

Premnath, A.P., Ju-Yeon Jo, Yoohwan Kim.  2014.  Application of NTRU Cryptographic Algorithm for SCADA Security. Information Technology: New Generations (ITNG), 2014 11th International Conference on. :341-346.

Critical Infrastructure represents the basic facilities, services and installations necessary for the functioning of a community, such as water, power lines, transportation, or communication systems. Any act or practice that causes a real-time Critical Infrastructure System to impair its normal function and performance will have a debilitating impact on security and the economy, with direct implications for society. A SCADA (Supervisory Control and Data Acquisition) system is a control system which is widely used in Critical Infrastructure Systems to monitor and control industrial processes autonomously. As SCADA architecture relies on computers, networks, applications and programmable controllers, it is more vulnerable to security threats/attacks. Traditional SCADA communication protocols such as IEC 60870, DNP3, IEC 61850, or Modbus did not provide any security services. Newer standards such as IEC 62351 and AGA-12 offer security features to handle attacks on SCADA systems. However, there are performance issues with the cryptographic solutions of these specifications when applied to SCADA systems. This research is aimed at improving the performance of SCADA security standards by employing NTRU, a faster and more lightweight public key algorithm, to provide end-to-end security.

Miyoung Jang, Min Yoon, Jae-Woo Chang.  2014.  A privacy-aware query authentication index for database outsourcing. Big Data and Smart Computing (BIGCOMP), 2014 International Conference on. :72-76.

Recently, cloud computing has been spotlighted as a new paradigm for database management systems. In this environment, databases are outsourced and deployed on a service provider in order to reduce the cost of data storage and maintenance. However, the service provider might be untrusted, so the two issues of data security, namely data confidentiality and query result integrity, become major concerns for users. Existing bucket-based data authentication methods have the problem that the original spatial data distribution can be disclosed from the data authentication index due to unsophisticated data grouping strategies. In addition, the transmission overhead of the verification object is high. In this paper, we propose a privacy-aware query authentication method which guarantees data confidentiality and query result integrity for users. A periodic function-based data grouping scheme is designed to privately partition a spatial database into small groups for generating a signature of each group. The group signature is used to check the correctness and completeness of the outsourced data when answering a range query for users. Through performance evaluation, it is shown that the proposed method outperforms the existing method by up to 3 times in range query processing time.
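
A hedged sketch of the high-level workflow: the owner groups spatial records with a periodic function, signs each group, and the client checks group signatures to verify a range-query answer. The sine-based grouping and the HMAC "signatures" are illustrative stand-ins, not the paper's construction.

```python
import hashlib, hmac, math

SECRET = b"owner-signing-key"   # shared with the verifying client here, for simplicity

def group_id(x, y, period=10.0, n_groups=8):
    # Periodic grouping: nearby and far-apart points can share a group,
    # which hides the raw spatial distribution of any single group.
    phase = math.sin(2 * math.pi * x / period) + math.sin(2 * math.pi * y / period)
    return int((phase + 2) / 4 * n_groups) % n_groups

def sign_group(records):
    blob = "|".join(f"{x},{y}" for x, y in sorted(records)).encode()
    return hmac.new(SECRET, blob, hashlib.sha256).hexdigest()

points = [(1.2, 3.4), (5.0, 9.1), (2.2, 7.7), (8.8, 0.3)]
groups = {}
for p in points:
    groups.setdefault(group_id(*p), []).append(p)
signatures = {g: sign_group(recs) for g, recs in groups.items()}

# The verifier recomputes the signature of every returned group; a missing
# or altered record in the answer changes the group signature and is caught.
print(all(sign_group(groups[g]) == sig for g, sig in signatures.items()))  # True
```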

Yueying Huang, Jingang Zhang, Houyan Chen.  2014.  On the security of a certificateless signcryption scheme. Electronics, Computer and Applications, 2014 IEEE Workshop on. :664-667.

Signcryption is a cryptographic primitive that simultaneously realizes both the functions of public key encryption and digital signature in a logically single step, and with a cost significantly lower than that required by the traditional “signature and encryption” approach. Recently, an efficient certificateless signcryption scheme without bilinear pairings was proposed by Zhu et al., which is claimed to be secure based on the assumptions that the computational Diffie-Hellman problem and the discrete logarithm problem are difficult. Although some security arguments were provided to show that the scheme is secure, in this paper we find that the signcryption construction due to Zhu et al. is not as secure as claimed. Specifically, we describe an adversary that can break the IND-CCA2 security of the scheme without any Unsigncryption query. Moreover, we demonstrate that the scheme is insecure against a key replacement attack by describing a concrete attack approach.

Djouadi, S.M., Melin, A.M., Ferragut, E.M., Laska, J.A., Jin Dong.  2014.  Finite energy and bounded attacks on control system sensor signals. American Control Conference (ACC), 2014. :1716-1722.

Control system networks are increasingly being connected to enterprise-level networks. These connections leave critical industrial control systems vulnerable to cyber-attacks. Most of the effort in protecting these cyber-physical systems (CPS) from attacks has been in securing the networks using information security techniques. Effort has also been applied to increasing the protection and reliability of the control system against random hardware and software failures. However, the inability of information security techniques to protect against all intrusions means that the control system must be resilient to various signal attacks, for which new analysis methods need to be developed. In this paper, sensor signal attacks are analyzed for observer-based controlled systems. The threat surface for sensor signal attacks is subdivided into denial of service, finite energy, and bounded attacks. In particular, the error signals between the states of attack-free systems and systems subject to these attacks are quantified. Optimal sensor and actuator signal attacks for finite and infinite horizon linear quadratic (LQ) control, in terms of maximizing the corresponding cost functions, are computed. The closed-loop systems under optimal signal attacks are provided. Finally, an illustrative numerical example using a power generation network is provided, together with distributed LQ controllers.
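
A hedged numerical sketch of the setting: a scalar plant with a Luenberger observer and state feedback, hit by a finite-energy additive attack on the sensor signal. The gains and the attack profile are toy assumptions, not the paper's LQ-optimal attack.

```python
import numpy as np

A, B, C = 1.0, 1.0, 1.0        # scalar plant: x+ = A x + B u, y = C x + attack
K, L = 0.5, 0.5                # stabilising feedback and observer gains (assumed)
steps = 60

x, x_hat = 1.0, 0.0
attack = np.zeros(steps)
attack[20:30] = 2.0            # finite-energy burst injected into the sensor signal

est_error = []
for k in range(steps):
    y = C * x + attack[k]      # corrupted measurement reaches the observer
    u = -K * x_hat             # controller acts on the (possibly fooled) estimate
    x_hat = A * x_hat + B * u + L * (y - C * x_hat)
    x = A * x + B * u
    est_error.append(x - x_hat)

print(f"peak estimation error during attack: {max(np.abs(est_error)):.2f}")
```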

Janbeglou, M., Naderi, H., Brownlee, N..  2014.  Effectiveness of DNS-Based Security Approaches in Large-Scale Networks. Advanced Information Networking and Applications Workshops (WAINA), 2014 28th International Conference on. :524-529.

The Domain Name System (DNS) is widely seen as a vital protocol of the modern Internet. For example, popular services like load balancers and Content Delivery Networks heavily rely on DNS. Because of its important role, DNS is also a desirable target for malicious activities such as spamming, phishing, and botnets. To protect networks against these attacks, a number of DNS-based security approaches have been proposed. The key insight of our study is to measure the effectiveness of security approaches that rely on DNS in large-scale networks. For this purpose, we answer the following questions: How often is DNS used? Are most Internet flows established after contacting DNS? In this study, we collected data from the University of Auckland campus network, with more than 33,000 Internet users, and processed it to find out how DNS is being used. Moreover, we studied the flows that were established with and without contacting DNS. Our results show that less than 5 percent of the observed flows use DNS. Therefore, we argue that those security approaches that solely depend on DNS are not sufficient to protect large-scale networks.

Wei Peng, Feng Li, Xukai Zou, Jie Wu.  2014.  Behavioral Malware Detection in Delay Tolerant Networks. Parallel and Distributed Systems, IEEE Transactions on. 25:53-63.

The delay-tolerant-network (DTN) model is becoming a viable communication alternative to the traditional infrastructural model for modern mobile consumer electronics equipped with short-range communication technologies such as Bluetooth, NFC, and Wi-Fi Direct. Proximity malware is a class of malware that exploits the opportunistic contacts and distributed nature of DTNs for propagation. Behavioral characterization of malware is an effective alternative to pattern matching in detecting malware, especially when dealing with polymorphic or obfuscated malware. In this paper, we first propose a general behavioral characterization of proximity malware based on a naive Bayesian model, which has been successfully applied in non-DTN settings such as filtering email spam and detecting botnets. We identify two unique challenges for extending Bayesian malware detection to DTNs ("insufficient evidence versus evidence collection risk" and "filtering false evidence sequentially and distributedly"), and propose a simple yet effective method, look ahead, to address the challenges. Furthermore, we propose two extensions to look ahead, dogmatic filtering and adaptive look ahead, to address the challenge of "malicious nodes sharing false evidence." Real mobile network traces are used to verify the effectiveness of the proposed methods.
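
A minimal sketch of the Bayesian mechanism behind behavioral detection: a node accumulates behavioural evidence about a neighbour over successive contacts and updates the posterior probability that it is infected. The priors, likelihoods, and cut-off threshold are illustrative assumptions.

```python
import math

P_SUSPICIOUS_GIVEN_MAL = 0.7    # chance an infected node shows a suspicious action (assumed)
P_SUSPICIOUS_GIVEN_OK = 0.1     # chance a healthy node does (assumed)
PRIOR_MAL = 0.05                # prior belief that a random neighbour is infected (assumed)

def posterior_malicious(evidence):
    """evidence: list of booleans, True = suspicious behaviour observed."""
    log_odds = math.log(PRIOR_MAL / (1 - PRIOR_MAL))
    for suspicious in evidence:
        if suspicious:
            log_odds += math.log(P_SUSPICIOUS_GIVEN_MAL / P_SUSPICIOUS_GIVEN_OK)
        else:
            log_odds += math.log((1 - P_SUSPICIOUS_GIVEN_MAL) / (1 - P_SUSPICIOUS_GIVEN_OK))
    return 1 / (1 + math.exp(-log_odds))

obs = [True, True, False, True, True]        # evidence collected over contacts
p = posterior_malicious(obs)
print(f"P(malicious | evidence) = {p:.3f}, cut contact: {p > 0.9}")
```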

Wei Zhu, Jun Tang, Shuang Wan, Jie-Li Zhu.  2014.  Outlier-resistant adaptive filtering based on sparse Bayesian learning. Electronics Letters. 50:663-665.

In adaptive processing applications, the design of the adaptive filter requires estimation of the unknown interference-plus-noise covariance matrix from secondary training data. The presence of outliers in the training data can severely degrade the performance of adaptive processing. By exploiting the sparse prior of the outliers, a Bayesian framework to develop a computationally efficient outlier-resistant adaptive filter based on sparse Bayesian learning (SBL) is proposed. The expectation-maximisation (EM) algorithm is used therein to obtain a maximum a posteriori (MAP) estimate of the interference-plus-noise covariance matrix. Numerical simulations demonstrate the superiority of the proposed method over existing methods.

Bin Sun, Shutao Li, Jun Sun.  2014.  Scanned Image Descreening With Image Redundancy and Adaptive Filtering. Image Processing, IEEE Transactions on. 23:3698-3710.

Currently, most electrophotographic printers use halftoning technique to print continuous tone images, so scanned images obtained from such hard copies are usually corrupted by screen like artifacts. In this paper, a new model of scanned halftone image is proposed to consider both printing distortions and halftone patterns. Based on this model, an adaptive filtering based descreening method is proposed to recover high quality contone images from the scanned images. Image redundancy based denoising algorithm is first adopted to reduce printing noise and attenuate distortions. Then, screen frequency of the scanned image and local gradient features are used for adaptive filtering. Basic contone estimate is obtained by filtering the denoised scanned image with an anisotropic Gaussian kernel, whose parameters are automatically adjusted with the screen frequency and local gradient information. Finally, an edge-preserving filter is used to further enhance the sharpness of edges to recover a high quality contone image. Experiments on real scanned images demonstrate that the proposed method can recover high quality contone images from the scanned images. Compared with the state-of-the-art methods, the proposed method produces very sharp edges and much cleaner smooth regions.
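
A hedged sketch of the adaptive-filtering step: blend a heavily smoothed version of the scan with a lightly smoothed one according to local gradient strength, so flat regions are descreened aggressively while edges stay sharp. This isotropic per-pixel blend is a simplification of the paper's anisotropic, screen-frequency-driven kernel.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def adaptive_descreen(scan, sigma_flat=3.0, sigma_edge=0.8):
    smooth_flat = gaussian_filter(scan, sigma_flat)   # removes the halftone screen
    smooth_edge = gaussian_filter(scan, sigma_edge)   # keeps edges crisp
    grad = np.hypot(sobel(scan, axis=0), sobel(scan, axis=1))
    w = grad / (grad.max() + 1e-8)                    # ~0 in flat areas, ~1 on edges
    return (1 - w) * smooth_flat + w * smooth_edge

scan = np.random.rand(128, 128)                       # stand-in for a scanned page
print(adaptive_descreen(scan).shape)
```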

Jian Wang, Lin Mei, Yi Li, Jian-Ye Li, Kun Zhao, Yuan Yao.  2014.  Variable Window for Outlier Detection and Impulsive Noise Recognition in Range Images. Cluster, Cloud and Grid Computing (CCGrid), 2014 14th IEEE/ACM International Symposium on. :857-864.

To improve the overall performance of denoising range images, an impulsive noise (IN) denoising method with variable windows is proposed in this paper. Based on several discriminant criteria, principles for dropout IN detection and outlier IN detection are provided. Subsequently, a nearest non-IN neighbors searching process and an Index Distance Weighted Mean filter are combined for IN denoising. As key factors in the adaptability of the proposed denoising method, the sizes of the two windows for outlier IN detection and IN denoising are investigated. Based on a theoretical model of invader occlusion, a variable window is presented that adapts the window size to the dynamic environment of each point, accompanied by practical criteria for determining the adaptive variable window size. Experiments on real range images of a multi-line surface are conducted, with evaluations in terms of computational complexity and quality assessment, including comparative analysis against a few other popular methods. The results indicate that the proposed method detects impulsive noise with high accuracy and, with the help of the variable window, removes it with strong adaptability.
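
A hedged sketch of an inverse-distance-weighted mean over the non-outlier neighbours inside a window, in the spirit of the Index Distance Weighted Mean filter; the outlier mask and the fixed window size here are simple assumptions, not the paper's variable-window criteria.

```python
import numpy as np

def idw_mean_filter(depth, outlier_mask, window=5):
    half = window // 2
    out = depth.copy()
    rows, cols = depth.shape
    for r, c in zip(*np.nonzero(outlier_mask)):           # only repair flagged pixels
        r0, r1 = max(0, r - half), min(rows, r + half + 1)
        c0, c1 = max(0, c - half), min(cols, c + half + 1)
        rr, cc = np.mgrid[r0:r1, c0:c1]
        valid = ~outlier_mask[r0:r1, c0:c1]               # exclude other outliers
        if not valid.any():
            continue
        dist = np.hypot(rr - r, cc - c)[valid]
        weights = 1.0 / (dist + 1e-6)                     # closer neighbours count more
        out[r, c] = np.sum(weights * depth[r0:r1, c0:c1][valid]) / weights.sum()
    return out

depth = np.random.rand(64, 64) * 5.0                      # toy range image
mask = np.random.rand(64, 64) < 0.02                      # ~2% impulsive-noise pixels
print(idw_mean_filter(depth, mask).shape)
```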

Markman, A., Javidi, B., Tehranipoor, M..  2014.  Photon-Counting Security Tagging and Verification Using Optically Encoded QR Codes. Photonics Journal, IEEE. 6:1-9.

We propose an optical security method for object authentication using photon-counting encryption implemented with phase encoded QR codes. By combining the full phase double-random-phase encryption with photon-counting imaging method and applying an iterative Huffman coding technique, we are able to encrypt and compress an image containing primary information about the object. This data can then be stored inside of an optically phase encoded QR code for robust read out, decryption, and authentication. The optically encoded QR code is verified by examining the speckle signature of the optical masks using statistical analysis. Optical experimental results are presented to demonstrate the performance of the system. In addition, experiments with a commercial Smartphone to read the optically encoded QR code are presented. To the best of our knowledge, this is the first report on integrating photon-counting security with optically phase encoded QR codes.
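
For background, a sketch of classical double-random-phase encoding (DRPE), the optical transform the proposed method builds on (without the photon-counting and QR-code stages): two random phase masks, one in the image plane and one in the Fourier plane, turn the input into noise-like data that only the mask holder can invert.

```python
import numpy as np

rng = np.random.default_rng(1)
img = rng.random((64, 64))                           # stand-in for the primary image

mask1 = np.exp(2j * np.pi * rng.random(img.shape))   # image-plane phase mask
mask2 = np.exp(2j * np.pi * rng.random(img.shape))   # Fourier-plane phase mask

def drpe_encrypt(f):
    return np.fft.ifft2(np.fft.fft2(f * mask1) * mask2)

def drpe_decrypt(c):
    # Undo the Fourier-plane mask, transform back, undo the image-plane mask.
    return np.fft.ifft2(np.fft.fft2(c) * np.conj(mask2)) * np.conj(mask1)

cipher = drpe_encrypt(img)
recovered = np.real(drpe_decrypt(cipher))
print(np.allclose(recovered, img))                   # True: the masks invert the encoding
```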

2015-05-05
Koch, S., John, M., Wörner, M., Müller, A., Ertl, T..  2014.  VarifocalReader - In-Depth Visual Analysis of Large Text Documents. Visualization and Computer Graphics, IEEE Transactions on. 20:1723-1732.

Interactive visualization provides valuable support for exploring, analyzing, and understanding textual documents. Certain tasks, however, require that insights derived from visual abstractions are verified by a human expert perusing the source text. So far, this problem is typically solved by offering overview-detail techniques, which present different views with different levels of abstractions. This often leads to problems with visual continuity. Focus-context techniques, on the other hand, succeed in accentuating interesting subsections of large text documents but are normally not suited for integrating visual abstractions. With VarifocalReader we present a technique that helps to solve some of these approaches' problems by combining characteristics from both. In particular, our method simplifies working with large and potentially complex text documents by simultaneously offering abstract representations of varying detail, based on the inherent structure of the document, and access to the text itself. In addition, VarifocalReader supports intra-document exploration through advanced navigation concepts and facilitates visual analysis tasks. The approach enables users to apply machine learning techniques and search mechanisms as well as to assess and adapt these techniques. This helps to extract entities, concepts and other artifacts from texts. In combination with the automatic generation of intermediate text levels through topic segmentation for thematic orientation, users can test hypotheses or develop interesting new research questions. To illustrate the advantages of our approach, we provide usage examples from literature studies.