Biblio

Found 15086 results

Filters: Keyword is pubcrawl
2017-08-18
Bi, Yu, Hu, X. Sharon, Jin, Yier, Niemier, Michael, Shamsi, Kaveh, Yin, Xunzhao.  2016.  Enhancing Hardware Security with Emerging Transistor Technologies. Proceedings of the 26th Edition on Great Lakes Symposium on VLSI. :305–310.

We consider how the I-V characteristics of emerging transistors (particularly those sponsored by STARnet) might be employed to enhance hardware security. An emphasis of this work is to move beyond hardware implementations of physically unclonable functions (PUFs) and random number generators (RNGs). We highlight how new devices (i) may enable more sophisticated logic obfuscation for IP protection, (ii) could help to prevent fault injection attacks, and (iii) could prevent differential power analysis in lightweight cryptographic systems, among other uses.

Sion, Laurens, Van Landuyt, Dimitri, Yskout, Koen, Joosen, Wouter.  2016.  Towards Systematically Addressing Security Variability in Software Product Lines. Proceedings of the 20th International Systems and Software Product Line Conference. :342–343.

With the increasingly pervasive role of software in society, security is becoming an important quality concern, with a growing emphasis on security by design, which requires intensive specialization. Security in families of systems is even harder, as diverse variants of security solutions must be considered, possibly with different security goals per product. Furthermore, security is not a static object but a moving target, adding further variability. An approach is therefore needed to systematically address security concerns in software product lines, considering security separately from other variability dimensions. The main challenges to realize this are: (i) expressing security and its variability, (ii) selecting the right solution, (iii) properly instantiating a solution, and (iv) verifying and validating it. In this paper, we present our research agenda towards addressing the aforementioned challenges.

Ren, Wenyu, Nahrstedt, Klara, Yardley, Tim.  2016.  Operation-level Traffic Analyzer Framework for Smart Grid. Proceedings of the Symposium and Bootcamp on the Science of Security. :112–114.

Smart Grid control systems need to be protected from internal attacks within the perimeter. In the Smart Grid, Intelligent Electronic Devices (IEDs) are resource-constrained devices that cannot provide security analysis and protection by themselves, and the commonly used industrial control system protocols offer few security guarantees. To guarantee security inside the system, analysis and inspection of both internal network traffic and device status need to be placed close to the IEDs to provide timely information to power grid operators. For that, we have designed a unique, extensible, and efficient operation-level traffic analyzer framework. A timing evaluation of the analyzer overhead confirms its efficiency under Smart Grid operational traffic.

Gu, Peng, Li, Shuangchen, Stow, Dylan, Barnes, Russell, Liu, Liu, Xie, Yuan, Kursun, Eren.  2016.  Leveraging 3D Technologies for Hardware Security: Opportunities and Challenges. Proceedings of the 26th Edition on Great Lakes Symposium on VLSI. :347–352.

3D die stacking and 2.5D interposer design are promising technologies to improve integration density, performance and cost. Current approaches face serious issues in dealing with emerging security challenges such as side channel attacks, hardware trojans, secure IC manufacturing and IP piracy. By utilizing intrinsic characteristics of 2.5D and 3D technologies, we propose novel opportunities in designing secure systems. We present: (i) a 3D architecture for shielding side-channel information; (ii) split fabrication using active interposers; (iii) circuit camouflage on monolithic 3D IC, and (iv) 3D IC-based security processing-in-memory (PIM). Advantages and challenges of these designs are discussed, showing that the new designs can improve existing countermeasures against security threats and further provide new security features.

Nivethan, Jeyasingam, Papa, Mauricio.  2016.  A SCADA Intrusion Detection Framework That Incorporates Process Semantics. Proceedings of the 11th Annual Cyber and Information Security Research Conference. :6:1–6:5.

SCADA security is an increasingly important research area as these systems, used for process control and automation, are being exposed to the Internet due to their use of TCP/IP protocols as a transport mechanism for control messages. Most of the existing research work on SCADA systems has focused on addressing SCADA security by monitoring attacks or anomalies at the network level. The main issue affecting these systems today is that, by focusing attention on network-level monitoring needs, security practitioners may remain unaware of process-level constraints. The proposed framework ensures that a mechanism is in place to map process-level constraints, as described by process engineers, to network-level monitoring needs. Existing solutions have tried to address this problem but have not been able to fully bridge the gap between the process and the network. The goal of this research is to provide a solution that (i) leverages the knowledge process engineers have about the system (to help strengthen cyber security) and (ii) has the ability to seamlessly monitor process constraints at the network level using standard network security tools. A prototype system for the Modbus TCP protocol and the Bro IDS has been built to validate the approach.
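
For illustration only (this is not the paper's Bro-based implementation), the following Python sketch shows the kind of process-level constraint check that such a framework maps to network-level monitoring: hypothetical per-register limits, of the sort a process engineer would supply, are evaluated against values parsed from observed Modbus write requests.

```python
# Hedged sketch: a process-level constraint check of the kind the framework
# pushes to network-level monitoring. The register numbers and limits are
# hypothetical placeholders, not taken from the paper.

# Engineer-specified limits per Modbus holding register (hypothetical values).
PROCESS_CONSTRAINTS = {
    40001: {"name": "pump_speed_rpm",    "min": 0,  "max": 1800},
    40002: {"name": "tank_setpoint_pct", "min": 10, "max": 90},
}

def check_modbus_write(register, value):
    """Return violation messages for one observed write request."""
    violations = []
    constraint = PROCESS_CONSTRAINTS.get(register)
    if constraint is None:
        violations.append(f"write to unmonitored register {register}")
    elif not (constraint["min"] <= value <= constraint["max"]):
        violations.append(
            f"{constraint['name']}={value} outside "
            f"[{constraint['min']}, {constraint['max']}]"
        )
    return violations

if __name__ == "__main__":
    # Values as they might be parsed from Modbus 'write single register' PDUs.
    for reg, val in [(40001, 1200), (40002, 95), (40005, 1)]:
        for msg in check_modbus_write(reg, val):
            print("ALERT:", msg)
```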

Huang, Yuanwen, Bhunia, Swarup, Mishra, Prabhat.  2016.  MERS: Statistical Test Generation for Side-Channel Analysis Based Trojan Detection. Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security. :130–141.

Hardware Trojan detection has emerged as a critical challenge to ensure security and trustworthiness of integrated circuits. A vast majority of research efforts in this area has utilized side-channel analysis for Trojan detection. Functional test generation for logic testing is a promising alternative, but it may not be helpful if a Trojan cannot be fully activated or the Trojan effect cannot be propagated to the observable outputs. Side-channel analysis, on the other hand, can achieve significantly higher detection coverage for Trojans of all types and sizes, since it does not require activation/propagation of an unknown Trojan. However, it often has limited effectiveness due to poor detection sensitivity under large process variations and the small footprint of a Trojan in the side-channel signature. In this paper, we address this critical problem through a novel side-channel-aware test generation approach, based on a concept of Multiple Excitation of Rare Switching (MERS), that can significantly increase Trojan detection sensitivity. The paper makes several important contributions: i) it presents in detail the statistical test generation method, which can generate a high-quality test set for creating high relative activity in arbitrary Trojan instances; ii) it analyzes the effectiveness of the generated test set in terms of Trojan coverage; and iii) it describes two judicious reordering methods that can further tune the test set and greatly improve the side-channel sensitivity. Simulation results demonstrate that the tests generated by MERS can significantly increase Trojan detection sensitivity, thereby making Trojan detection effective using side-channel analysis.

Jaeger, Trent.  2016.  Configuring Software and Systems for Defense-in-Depth. Proceedings of the 2016 ACM Workshop on Automated Decision Making for Active Cyber Defense. :1–1.

The computer security community has long advocated defense in depth, building multiple layers of defense to protect a system. Realizing this vision is not yet practical, as software often ships with inadequate defenses, typically developed in an ad hoc fashion. Currently, programmers reason about security manually and lack tools to validate that security controls provide satisfactory defenses. In this keynote talk, I will discuss how achieving defense in depth has a significant configuration component. In particular, we advocate configuring security requirements for various layers of software defenses (e.g., privilege separation, authorization, and auditing) and generating software and systems defenses that implement such configurations (mostly) automatically. I will focus mainly on the challenge of retrofitting software with authorization code automatically to demonstrate the configuration problems faced by the community, and discuss how we may leverage these lessons to configure software and systems for defense in depth.

DiScala, Michael, Abadi, Daniel J..  2016.  Automatic Generation of Normalized Relational Schemas from Nested Key-Value Data. Proceedings of the 2016 International Conference on Management of Data. :295–310.

Self-describing key-value data formats such as JSON are becoming increasingly popular as application developers choose to avoid the rigidity imposed by the relational model. Database systems designed for these self-describing formats, such as MongoDB, encourage users to use denormalized, heavily nested data models so that relationships across records and other schema information need not be predefined or standardized. Such data models contribute to long-term development complexity, as their lack of explicit entity and relationship tracking burdens new developers unfamiliar with the dataset. Furthermore, the large amount of data repetition present in such data layouts can introduce update anomalies and poor scan performance, which reduce both the quality and performance of analytics over the data. In this paper we present an algorithm that automatically transforms the denormalized, nested data commonly found in NoSQL systems into traditional relational data that can be stored in a standard RDBMS. This process includes a schema generation algorithm that discovers relationships across the attributes of the denormalized datasets in order to organize those attributes into relational tables. It further includes a matching algorithm that discovers sets of attributes that represent overlapping entities and merges those sets together. These algorithms reduce data repetition, allow the use of data analysis tools targeted at relational data, accelerate scan-intensive algorithms over the data, and help users gain a semantic understanding of complex, nested datasets.
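
As a hedged illustration of one ingredient of such a schema-generation pipeline (not DiScala and Abadi's actual algorithm), the sketch below tests whether one attribute functionally determines another in a set of flattened records; attributes related by such dependencies are candidates for being factored into the same relational table. The field names and records are hypothetical.

```python
# Hedged sketch: test whether attribute `a` functionally determines attribute
# `b` in flattened records, a basic signal a schema-generation algorithm can
# use to group attributes into one relational table. Field names and records
# below are hypothetical, not from the paper.

def functionally_determines(records, a, b):
    """True if every observed value of `a` maps to exactly one value of `b`."""
    seen = {}
    for rec in records:
        if a not in rec or b not in rec:
            continue
        if rec[a] in seen and seen[rec[a]] != rec[b]:
            return False
        seen[rec[a]] = rec[b]
    return True

records = [  # flattened key-value records, e.g. parsed from nested JSON
    {"order_id": 1, "cust_id": 7, "cust_name": "Ada",  "item": "disk"},
    {"order_id": 2, "cust_id": 7, "cust_name": "Ada",  "item": "fan"},
    {"order_id": 3, "cust_id": 9, "cust_name": "Alan", "item": "disk"},
]

# cust_id -> cust_name holds, so cust_name can be factored into a customer
# table; cust_id -> item does not hold, so item stays with each order.
print(functionally_determines(records, "cust_id", "cust_name"))  # True
print(functionally_determines(records, "cust_id", "item"))       # False
```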

Thoma, Cory, Lee, Adam J., Labrinidis, Alexandros.  2016.  PolyStream: Cryptographically Enforced Access Controls for Outsourced Data Stream Processing. Proceedings of the 21st ACM on Symposium on Access Control Models and Technologies. :227–238.

With data becoming available in larger quantities and at higher rates, new data processing paradigms have been proposed to handle high-volume, fast-moving data. Data Stream Processing is one such paradigm, wherein transient data streams flow through sets of continuous queries, only returning results when data is of interest to the querier. To avoid the large costs associated with maintaining the infrastructure required for processing these data streams, many companies outsource their computation to third-party cloud services. This outsourcing, however, can lead to private data being accessed by parties that a data provider may not trust. The literature offers solutions to this confidentiality and access control problem, but they fall short of a complete solution due to either immense overheads or trust requirements placed on the third-party services. To address these issues, we have developed PolyStream, an enhancement to existing data stream management systems that enables data providers to specify attribute-based access control policies that are cryptographically enforced while simultaneously allowing many types of in-network data processing. We detail the access control models and mechanisms used by PolyStream, and describe a novel use of security punctuations that enables flexible, online policy management and key distribution. We detail how queries are submitted and executed using an unmodified Data Stream Management System, and show through an extensive evaluation that PolyStream yields a 550x performance gain over the state-of-the-art system StreamForce (CODASPY 2014), while providing greater functionality to the querier.

Afanasyev, Alexander, Halderman, J. Alex, Ruoti, Scott, Seamons, Kent, Yu, Yingdi, Zappala, Daniel, Zhang, Lixia.  2016.  Content-based Security for the Web. Proceedings of the 2016 New Security Paradigms Workshop. :49–60.

The World Wide Web has become the most common platform for building applications and delivering content. Yet despite years of research, the web continues to face severe security challenges related to data integrity and confidentiality. Rather than continuing the exploit-and-patch cycle, we propose addressing these challenges at an architectural level, by supplementing the web's existing connection-based and server-based security models with a new approach: content-based security. With this approach, content is directly signed and encrypted at rest, enabling it to be delivered via any path and then validated by the browser. We explore how this new architectural approach can be applied to the web and analyze its security benefits. We then discuss a broad research agenda to realize this vision and the challenges that must be overcome.

Ha, Duy An, Nguyen, Kha Tho, Zao, John K..  2016.  Efficient Authentication of Resource-constrained IoT Devices Based on ECQV Implicit Certificates and Datagram Transport Layer Security Protocol. Proceedings of the Seventh Symposium on Information and Communication Technology. :173–179.

This paper introduces the design and implementation of a security scheme for the Internet of Things (IoT) based on ECQV implicit certificates and the Datagram Transport Layer Security (DTLS) protocol. In the proposed security scheme, the elliptic-curve-cryptography-based ECQV implicit certificate plays a key role, allowing mutual authentication and key establishment between two resource-constrained IoT devices. We present how IoT devices obtain ECQV implicit certificates and use them for authenticated key exchange in DTLS. An evaluation of the execution time of the implementation is also conducted to assess the efficiency of the solution.
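
For readers unfamiliar with ECQV, the toy sketch below illustrates only the implicit-certificate algebra, i.e., why any party can reconstruct a device's public key as Q_U = e*P_U + Q_CA from the certificate and the CA's public key. The curve, hash truncation, and identifiers are toy placeholders; real deployments use standardized curves and reduce scalars modulo the group order, and the paper's DTLS integration is not reproduced here.

```python
# Toy sketch of the ECQV implicit-certificate algebra on a tiny curve. It only
# demonstrates that the public key reconstructed from the certificate,
# e*P_U + Q_CA, equals the key d_U*G derived by the device. The curve, hash
# truncation, and identifiers are placeholders, not the paper's parameters.
import hashlib
import random

P, A, B = 211, 2, 3              # toy curve y^2 = x^3 + 2x + 3 over GF(211)

def add(p1, p2):
    """Elliptic-curve point addition (None represents the point at infinity)."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (s * s - x1 - x2) % P
    return (x3, (s * (x1 - x3) - y1) % P)

def mul(k, pt):
    """Double-and-add scalar multiplication."""
    acc = None
    while k:
        if k & 1:
            acc = add(acc, pt)
        pt, k = add(pt, pt), k >> 1
    return acc

# Find some affine point on the toy curve to serve as the generator G.
G = next((x, y) for x in range(1, P) for y in range(P)
         if (y * y - (x ** 3 + A * x + B)) % P == 0)

d_ca = random.randrange(2, 500); Q_ca = mul(d_ca, G)   # CA key pair
k_u = random.randrange(2, 500);  R_u = mul(k_u, G)     # device's certificate request
k = random.randrange(2, 500)
P_u = add(R_u, mul(k, G))                              # public-key reconstruction data
cert = f"device-42|{P_u}".encode()                     # implicit certificate (toy encoding)
e = int.from_bytes(hashlib.sha256(cert).digest(), "big") % 500
r = e * k + d_ca                                       # CA's private-key contribution
d_u = e * k_u + r                                      # device private key
assert mul(d_u, G) == add(mul(e, P_u), Q_ca)           # anyone recomputes Q_U = e*P_U + Q_CA
print("reconstructed public key matches the device's key pair")
```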

Dang, Hung, Chong, Yun Long, Brun, Francois, Chang, Ee-Chien.  2016.  Practical and Scalable Sharing of Encrypted Data in Cloud Storage with Key Aggregation. Proceedings of the 4th ACM Workshop on Information Hiding and Multimedia Security. :69–80.

We study a sensor network setting in which samples are encrypted individually using different keys and maintained on a cloud storage. For large systems, e.g. those that generate several millions of samples per day, fine-grained sharing of encrypted samples is challenging. Existing solutions, such as Attribute-Based Encryption (ABE) and Key Aggregation Cryptosystem (KAC), can be utilized to address the challenge, but only to a certain extent. They are often computationally expensive and thus unlikely to operate at scale. We propose an algorithmic enhancement and two heuristics to improve KAC's key reconstruction cost, while preserving its provable security. The improvement is particularly significant for range and down-sampling queries, reducing the reconstruction cost from quadratic to linear running time. An experimental study shows that for queries of size 32k samples, the proposed fast reconstruction techniques speed up the original KAC by at least 90 times on range and down-sampling queries, and by eight times on general (arbitrary) queries. It also shows that, at the expense of splitting the query into 16 sub-queries and correspondingly issuing that number of different aggregated keys, reconstruction time can be reduced by 19 times. As such, the proposed techniques make KAC more applicable in practical scenarios such as sensor networks or the Internet of Things.

Cangialosi, Frank, Chung, Taejoong, Choffnes, David, Levin, Dave, Maggs, Bruce M., Mislove, Alan, Wilson, Christo.  2016.  Measurement and Analysis of Private Key Sharing in the HTTPS Ecosystem. Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security. :628–640.

The semantics of online authentication in the web are rather straightforward: if Alice has a certificate binding Bob's name to a public key, and if a remote entity can prove knowledge of Bob's private key, then (barring key compromise) that remote entity must be Bob. However, in reality, many websites, including the majority of the most popular ones, are hosted at least in part by third parties such as Content Delivery Networks (CDNs) or web hosting providers. Put simply: administrators of websites who deal with (extremely) sensitive user data are giving their private keys to third parties. Importantly, this sharing of keys is undetectable by most users, and widely unknown even among researchers. In this paper, we perform a large-scale measurement study of key sharing in today's web. We analyze the prevalence with which websites trust third-party hosting providers with their secret keys, as well as the impact that this trust has on responsible key management practices, such as revocation. Our results reveal that key sharing is extremely common, with a small handful of hosting providers having keys from the majority of the most popular websites. We also find that hosting providers often manage their customers' keys, and that they tend to react more slowly yet more thoroughly to compromised or potentially compromised keys.

Sicari, Sabrina, Rizzardi, Alessandra, Miorandi, Daniele, Coen-Porisini, Alberto.  2016.  Internet of Things: Security in the Keys. Proceedings of the 12th ACM Symposium on QoS and Security for Wireless and Mobile Networks. :129–133.

Security threats may hinder the large-scale adoption of the emerging Internet of Things (IoT) technologies. Although efforts have already been made in the direction of data integrity preservation, confidentiality, and privacy, several issues are still open. The existing solutions are mainly based on encryption techniques, but little attention is actually paid to key management. A clever key distribution system, along with a key replacement mechanism, is essential for assuring a secure approach. In this paper, two popular key management systems, conceived for wireless sensor networks, are integrated in a real IoT middleware and compared in order to evaluate their performance in terms of overhead, delay, and robustness towards malicious attacks.

Sayler, Andy, Andrews, Taylor, Monaco, Matt, Grunwald, Dirk.  2016.  Tutamen: A Next-Generation Secret-Storage Platform. Proceedings of the Seventh ACM Symposium on Cloud Computing. :251–264.

The storage and management of secrets (encryption keys, passwords, etc) are significant open problems in the age of ephemeral, cloud-based computing infrastructure. How do we store and control access to the secrets necessary to configure and operate a range of modern technologies without sacrificing security and privacy requirements or significantly curtailing the desirable capabilities of our systems? To answer this question, we propose Tutamen: a next-generation secret-storage service. Tutamen offers a number of desirable properties not present in existing secret-storage solutions. These include the ability to operate across administrative domain boundaries and atop minimally trusted infrastructure. Tutamen also supports access control based on contextual, multi-factor, and alternate-band authentication parameters. These properties have allowed us to leverage Tutamen to support a variety of use cases not easily realizable using existing systems, including supporting full-disk encryption on headless servers and providing fully-featured client-side encryption for cloud-based file-storage services. In this paper, we present an overview of the secret-storage challenge, Tutamen's design and architecture, the implementation of our Tutamen prototype, and several of the applications we have built atop Tutamen. We conclude that Tutamen effectively eases the secret-storage burden and allows developers and systems administrators to achieve previously unattainable security-oriented goals while still supporting a wide range of feature-oriented requirements.

Grover, Kanika, Lim, Alvin.  2016.  Performance Comparison Between Broadcast Authentication Methods for Vehicular Networks. Proceedings of the 4th International Conference on Information and Network Security. :39–44.

For authenticating time-critical broadcast messages, the IEEE 1609.2 security standard for Vehicular Ad hoc Networks (VANETs) suggests the use of the secure Elliptic Curve Digital Signature Algorithm (ECDSA). Since ECDSA verification is expensive in terms of time, the most commonly suggested alternative algorithms are TESLA and signature amortization. Unfortunately, these algorithms lack immediate authentication and non-repudiation. Therefore, we introduce a probabilistic verification scheme for an ECDSA-based authentication protocol. Using ns2 simulation tools, we compare the performance of all the above-mentioned broadcast authentication algorithms. The results show that, with our proposed scheme, the packet-processed ratio increases over that of all the other algorithms.
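
The paper's exact probabilistic verification scheme is not reproduced here; as a hedged illustration of the general idea, the sketch below uses the Python `cryptography` package to sign beacons with ECDSA and has the receiver verify each one only with probability p, trading verification load against the chance of accepting a forged beacon unverified. The message format and probability value are placeholders.

```python
# Hedged sketch: a receiver that ECDSA-verifies each signed beacon only with
# probability p. This illustrates the general idea of probabilistic
# verification, not the specific scheme proposed in the paper. Requires the
# third-party 'cryptography' package.
import random
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

sender_key = ec.generate_private_key(ec.SECP256R1())   # on-board unit's key pair
sender_pub = sender_key.public_key()

def make_beacon(seq):
    msg = f"pos=12.34,56.78 speed=13.9 seq={seq}".encode()   # placeholder payload
    return msg, sender_key.sign(msg, ec.ECDSA(hashes.SHA256()))

def receive(msg, sig, p=0.3):
    """Verify the ECDSA signature only with probability p."""
    if random.random() > p:
        return "accepted without verification"
    try:
        sender_pub.verify(sig, msg, ec.ECDSA(hashes.SHA256()))
        return "verified and accepted"
    except InvalidSignature:
        return "rejected"

if __name__ == "__main__":
    for seq in range(5):
        msg, sig = make_beacon(seq)
        print(seq, receive(msg, sig))
```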

Chow, Sherman S.M..  2016.  Functional Credentials for Internet of Things. Proceedings of the 2nd ACM International Workshop on IoT Privacy, Trust, and Security. :1–1.

To ensure the authenticity and integrity, data are traditionally signed by digital signatures, which will be invalidated by any processing of the data. With the vast amount of data generated every day, it is however desirable to allow flexible processing of the signed data via applying computations or functions on them, without losing the authenticity. Signatures can also serve as credentials for access control, which appears in many aspects of life, ranging from unlocking security gates of buildings, to virtual access of data by computer programs. With the prolific use of Internet-of-Things (IoT), everything is getting connected together. There is an emerging need for more versatile credentials to secure new application scenarios, for instance, assigning different credentials to different devices, such that they can authenticate and cooperate with each other to jointly perform some computation tasks. To realize the above, we envision a general framework called functional credentials. Functional credentials allow multiple entities to (jointly) issue, combine, delegate, present, verify, escrow, and decrypt different forms of credentials, by operating on the associated "cryptographic objects" including secret keys, attributes, ciphertexts, and auxiliary data (e.g., pseudonym, expiry date, or policies for combination / delegation / revocation). Instantiating this framework with different functions can provide a spectrum of solutions for securing IoT. This talk covers both the practical applications and theoretic foundations. I will first motivate the versatility of functional credentials by case studies on IoT, which identify the need of new credential systems. I will then formulate the definition of functional credentials. Finally, I will share some initial ideas in realizing functional credentials, and discuss the obstacles ahead.

Libert, Benoît, Mouhartem, Fabrice, Peters, Thomas, Yung, Moti.  2016.  Practical "Signatures with Efficient Protocols" from Simple Assumptions. Proceedings of the 11th ACM on Asia Conference on Computer and Communications Security. :511–522.

Digital signatures are perhaps the most important basis for authentication and trust relationships in large scale systems. More specifically, various applications of signatures provide privacy and anonymity preserving mechanisms and protocols, and these, in turn, are becoming critical (due to the recently recognized need to protect individuals according to national rules and regulations). A specific type of signatures called "signatures with efficient protocols", as introduced by Camenisch and Lysyanskaya (CL), efficiently accommodates various basic protocols and extensions like zero-knowledge proofs, signing committed messages, or re-randomizability. These are, in fact, typical operations associated with signatures used in typical anonymity and privacy-preserving scenarios. To date, there are no "signatures with efficient protocols" that are based on simple assumptions and truly practical. These two properties assure us of a robust primitive: first, simple assumptions are needed to ensure that this basic primitive is mathematically robust and does not require special ad hoc assumptions that are more risky, imply less efficiency, are more tuned to the protocol itself, and are perhaps less trusted. In the other dimension, efficiency is a must given the anonymity applications of the protocol, since without a proper level of efficiency the future adoption of the primitives is always questionable (in spite of their need). In this work, we present a new CL-type signature scheme that is re-randomizable under a simple, well-studied, and by now standard assumption (SXDH). The signature is efficient (built on the recent QA-NIZK constructions) and is, by design, suitable to work in extended contexts that typify privacy settings (like anonymous credentials, group signatures, and offline e-cash). We demonstrate its power by presenting practical protocols based on it.

Ali, Muqeet, Reaz, Rezwana, Gouda, Mohamed.  2016.  Two-phase Nonrepudiation Protocols. Proceedings of the 7th International Conference on Computing Communication and Networking Technologies. :22:1–22:8.

A nonrepudiation protocol from party S to party R performs two tasks. First, the protocol enables party S to send to party R some text x along with a proof (that can convince a judge) that x was indeed sent by S. Second, the protocol enables party R to receive text x from S and to send to S a proof (that can convince a judge) that x was indeed received by R. A nonrepudiation protocol from one party to another is called two-phase iff the two parties execute the protocol as specified until one of the two parties receives its complete proof. Then and only then does this party refrain from sending any message specified by the protocol because these messages only help the other party complete its proof. In this paper, we present methods for specifying and verifying two-phase nonrepudiation protocols.

Ali, Muqeet, Gouda, Mohamed.  2016.  Nonrepudiation Protocols in Cloud Systems. Proceedings of the 7th International Conference on Computing Communication and Networking Technologies. :23:1–23:6.

A nonrepudiation protocol from a sender S to a set of potential receivers {R1, R2, ..., Rn} performs two functions. First, this protocol enables S to send to every potential receiver Ri a copy of file F along with a proof that can convince an unbiased judge that F was indeed sent by S to Ri. Second, this protocol also enables each Ri to receive from S a copy of file F and to send back to S a proof that can convince an unbiased judge that F was indeed received by Ri from S. When a nonrepudiation protocol from S to {R1, R2, ..., Rn} is implemented in a cloud system, the communications between S and the set of potential receivers {R1, R2, ..., Rn} are not carried out directly. Rather, these communications are carried out through a cloud C. In this paper, we present a nonrepudiation protocol that is implemented in a cloud system and show that this protocol is correct. We also show that this protocol has two clear advantages over nonrepudiation protocols that are not implemented in cloud systems.

Boroumand, Mehdi, Fridrich, Jessica.  2016.  Boosting Steganalysis with Explicit Feature Maps. Proceedings of the 4th ACM Workshop on Information Hiding and Multimedia Security. :149–157.

Explicit non-linear transformations of existing steganalysis features are shown to boost their ability to detect steganography in combination with existing simple classifiers, such as the FLD-ensemble. The non-linear transformations are learned from a small number of cover features using Nyström approximation on pilot vectors obtained with kernelized PCA. The best performance is achieved with the exponential form of the Hellinger kernel, which improves the detection accuracy by up to 2-3% for spatial-domain content-adaptive steganography. Since the non-linear map depends only on the cover source and its learning has a low computational complexity, the proposed approach is a practical and low-cost method for boosting the accuracy of existing detectors built as binary classifiers. The map can also be used to significantly reduce the feature dimensionality (by up to a factor of ten) without performance loss with respect to the non-transformed features.
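
As a hedged numerical illustration of the Nyström construction (with synthetic data and a plain, non-exponential Hellinger kernel, so not the exact map used in the paper), the sketch below learns an explicit feature map phi(x) = K_xZ U Lambda^(-1/2) from a small set of pilot vectors Z and applies it to new feature vectors.

```python
# Hedged sketch of the Nystrom construction of an explicit non-linear feature
# map from a kernel evaluated on a few pilot (landmark) vectors. The kernel
# below is the plain Hellinger kernel; the exponential form used in the paper
# is not reproduced here. Data is synthetic and assumed non-negative.
import numpy as np

def hellinger_kernel(X, Z):
    """K[i, j] = sum_k sqrt(X[i, k] * Z[j, k]) for non-negative feature vectors."""
    return np.sqrt(X) @ np.sqrt(Z).T

def nystrom_map(Z, kernel, eps=1e-10):
    """Build phi(x) = K_xZ @ U @ Lambda^(-1/2) from the pilot vectors Z."""
    K_zz = kernel(Z, Z)
    lam, U = np.linalg.eigh(K_zz)     # eigendecomposition of the m x m pilot kernel
    keep = lam > eps                  # drop numerically zero eigenvalues
    W = U[:, keep] / np.sqrt(lam[keep])
    return lambda X: kernel(X, Z) @ W

rng = np.random.default_rng(0)
Z = rng.random((50, 200))      # 50 pilot cover-feature vectors, 200-dimensional
X = rng.random((1000, 200))    # features to be transformed

phi = nystrom_map(Z, hellinger_kernel)
F = phi(X)                     # explicit non-linear features for a simple linear/FLD classifier
print(F.shape)
```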

Francis-Christie, Christopher A..  2016.  Detecting Insider Attacks with Video Websites Using Distributed Image Steganalysis (Abstract Only). Proceedings of the 47th ACM Technical Symposium on Computing Science Education. :725–725.

The safety of information inside cloud networks is of interest to network administrators. In a new insider attack, inside attackers merge confidential information with videos using digital video steganography. The video can be uploaded to video websites, where the information can be distributed online, potentially costing firms millions in damages. Standard behavior-based exfiltration detection does not always prevent these attacks, and this form of steganography is almost invisible. Existing compressed-video steganalysis only detects small-payload watermarks. We develop a detection strategy using distributed algorithms to classify videos, and compare existing algorithms to new ones. We find our approach improves on behavior-based exfiltration detection and on existing online video steganalysis.

Chefranov, Alexander G., Narimani, Amir.  2016.  Participant Authenticating, Error Detecting, and 100% Multiple Errors Repairing Chang-Chen-Wang's Secret Sharing Method Enhancement. Proceedings of the 9th International Conference on Security of Information and Networks. :112–115.

Chang-Chen-Wang's method for (3,n) Secret grayscale image Sharing among n grayscale cover images, with participant Authentication and damaged-pixel Repairing (SSAR) properties, is analyzed; it restores the secret image from any three of the cover images used. We show that SSAR may fail, is not able to recognize a fake participant, and has a repairing ability limited to 62.5%. We propose a (4,n) enhancement of SSAR, SSAR-E, allowing 100% exact restoration of a corrupted pixel using any four of the n covers, and recognizing a fake participant with the help of cryptographic hash functions with 5-bit values, which allow better error detection than 4-bit values. Using a special permutation with only one loop including all the secret image pixels, SSAR-E is able to restore all damaged pixels of the secret image with just one correct pixel left. Contrary to SSAR, SSAR-E allows restoring the secret image to authorized parties only. The performance and size of cover images for SSAR-E are the same as for SSAR.
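
SSAR and SSAR-E are specific constructions that are not reproduced here; purely for intuition about the underlying (k,n) threshold idea, the sketch below shares one pixel value with a generic Shamir (3,n) scheme over GF(257), so that any three shares (covers) reconstruct the pixel.

```python
# For intuition only: a generic Shamir (3, n) threshold sharing of one pixel
# value over GF(257), where any 3 of the n shares recover the pixel. This is
# NOT the Chang-Chen-Wang SSAR method or the proposed SSAR-E enhancement; it
# only illustrates the (k, n) sharing/reconstruction idea they build on.
import random

P = 257  # smallest prime above 255, so every 8-bit pixel value fits in GF(P)

def share_pixel(pixel, n):
    """Split one pixel into n shares; any 3 shares recover it."""
    a1, a2 = random.randrange(P), random.randrange(P)   # random quadratic polynomial
    return [(x, (pixel + a1 * x + a2 * x * x) % P) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 using exactly 3 shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = share_pixel(173, n=6)
print(reconstruct(random.sample(shares, 3)))   # prints 173 for any 3 of the 6 shares
```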

Trivedi, Munesh Chandra, Sharma, Shivani, Yadav, Virendra Kumar.  2016.  Analysis of Several Image Steganography Techniques in Spatial Domain: A Survey. Proceedings of the Second International Conference on Information and Communication Technology for Competitive Strategies. :84:1–84:7.

Steganography enables a user to hide confidential data in any digital medium such that its existence cannot be detected by a third party. Considerable research work is being conducted to improve the efficiency of steganography algorithms. Recent trends in computing technology use steganography as an important tool for hiding confidential data. This paper summarizes some of the research work conducted in the field of image steganography in the spatial domain, along with their advantages and disadvantages. Future research directions and experimental results of some techniques are also discussed. The key goal is to show the powerful impact of steganography in the information hiding and image processing domains.
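
As a hedged example of the simplest spatial-domain technique such surveys cover (least-significant-bit substitution, not any particular algorithm evaluated in the paper), the sketch below embeds and extracts a payload in the LSBs of a grayscale cover image.

```python
# Minimal sketch of classic spatial-domain LSB substitution: each payload bit
# replaces the least-significant bit of one cover pixel. Illustrative only;
# this is not a specific algorithm from the survey.
import numpy as np

def embed_lsb(cover, payload):
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten().copy()
    if bits.size > flat.size:
        raise ValueError("payload too large for this cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite the LSBs
    return flat.reshape(cover.shape)

def extract_lsb(stego, n_bytes):
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = np.random.default_rng(1).integers(0, 256, size=(64, 64), dtype=np.uint8)
stego = embed_lsb(cover, b"secret")
print(extract_lsb(stego, 6))   # b'secret'
```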

Abdulrahman, Hasan, Chaumont, Marc, Montesinos, Philippe, Magnier, Baptiste.  2016.  Color Image Steganalysis Based On Steerable Gaussian Filters Bank. Proceedings of the 4th ACM Workshop on Information Hiding and Multimedia Security. :109–114.

This article deals with color image steganalysis based on machine learning. The proposed approach enriches the features from the Color Rich Model by adding new features obtained by applying steerable Gaussian filters and then computing the co-occurrence of pixel pairs. Adding these new features to those obtained from the Color Rich Model allows us to increase the detectability of hidden messages in color images. The Gaussian filters are angled in different directions to precisely compute the tangent of the gradient vector. Then, the gradient magnitude and the derivative of this tangent direction are estimated. This refined method of estimation enables us to unearth the minor changes that have occurred in the image when a message is embedded. The efficiency of the proposed framework is demonstrated on three steganographic algorithms designed to hide messages in images: S-UNIWARD, WOW, and Synch-HILL. Each algorithm is tested using different payload sizes. The proposed approach is compared to three color image steganalysis methods based on computation features and Ensemble Classifier classification: the Spatial Color Rich Model, the CFA-aware Rich Model and the RGB Geometric Color Rich Model.
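
As a hedged sketch of the kind of features involved (the sigma, quantization step, truncation, and angles below are placeholders, not the paper's exact feature set), the code steers a first-order Gaussian derivative to an angle theta as cos(theta)*Gx + sin(theta)*Gy and then builds a co-occurrence histogram of quantized, truncated residuals of horizontally adjacent pixels.

```python
# Hedged sketch (not the paper's exact feature set): steer a first-order
# Gaussian derivative to angle theta as cos(theta)*Gx + sin(theta)*Gy, then
# form a co-occurrence histogram of quantized, truncated residuals of
# horizontally adjacent pixels. Sigma, step, truncation, angles are placeholders.
import numpy as np
from scipy.ndimage import gaussian_filter

def steered_residual(channel, theta, sigma=1.0):
    gx = gaussian_filter(channel, sigma, order=(0, 1))   # derivative along x (columns)
    gy = gaussian_filter(channel, sigma, order=(1, 0))   # derivative along y (rows)
    return np.cos(theta) * gx + np.sin(theta) * gy

def cooccurrence(residual, q=1.0, T=2):
    r = np.clip(np.round(residual / q), -T, T).astype(int)    # quantize and truncate
    pairs = (r[:, :-1] + T) * (2 * T + 1) + (r[:, 1:] + T)    # encode horizontal pairs
    hist = np.bincount(pairs.ravel(), minlength=(2 * T + 1) ** 2)
    return hist / hist.sum()

rng = np.random.default_rng(0)
channel = rng.random((256, 256))        # stand-in for one color channel of an image
features = np.concatenate([
    cooccurrence(steered_residual(channel, theta))
    for theta in np.linspace(0, np.pi, 4, endpoint=False)
])
print(features.shape)                   # 4 angles x 25 co-occurrence bins = (100,)
```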