Bibliography
Cloud storage is no longer a novel notion; it has been accepted by an increasing number of people. It offers cloud users many conveniences, such as relief from local storage and location-independent access. Nevertheless, the correctness, completeness, and privacy of outsourced data remain concerns for cloud users. As a result, many people are unwilling to store data in the cloud for fear that sensitive information will be disclosed. Only when people feel worry-free can they accept cloud storage more readily. Many experts have considered this problem and tried to solve it. In this paper, we survey solutions to the auditing problems in cloud computing and compare them. The methods, performance, and pros and cons of the state-of-the-art auditing protocols are discussed.
Compressed sensing (CS), or compressive sampling, is the process of reconstructing a signal from samples obtained at a rate far below the Nyquist rate. In this work, Differential Pulse Coded Modulation (DPCM) is coupled with block-based CS reconstruction using the Robbins-Monro (RM) approach, an iterative stochastic-approximation reconstruction technique. Extensive simulations show that RM gives better performance than the existing DPCM block-based Smoothed Projected Landweber (SPL) reconstruction technique, and the noise seen in the block SPL algorithm is much less evident in the RM approach. To compress the data further, the Lempel-Ziv-Welch coding technique is proposed.
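The Robbins-Monro recursion at the core of such iterative reconstruction can be illustrated in isolation. The sketch below is a toy stochastic-approximation example, not the paper's DPCM/CS pipeline: with step sizes a_n = 1/n (which satisfy the RM conditions), the recursion converges to the root of E[y - x] = 0, i.e. the mean of a noisy stream.

```python
import random

def robbins_monro_mean(samples):
    """Estimate the mean of a noisy stream via the Robbins-Monro recursion
    x_{n+1} = x_n + a_n * (y_n - x_n) with step sizes a_n = 1/n."""
    x = 0.0
    for n, y in enumerate(samples, start=1):
        x += (y - x) / n  # a_n = 1/n satisfies the RM step-size conditions
    return x

random.seed(0)
stream = [3.0 + random.gauss(0, 1) for _ in range(20000)]
est = robbins_monro_mean(stream)
print(abs(est - 3.0) < 0.1)  # the iterate has converged near the true mean
```

With a_n = 1/n this recursion reduces exactly to the running sample mean, which is what makes the convergence easy to verify.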
Steganography is the art of hiding data in such a way that detection of the hidden information is prevented. As the need for security and privacy increases, so does the need to hide secret data. This paper proposes an enhanced scheme combining 1-2-4 LSB steganography with RSA cryptography for grayscale and color images. For color images, we apply 1-2-4 LSB embedding across the RGB components after encrypting the information with the RSA technique. For grayscale images, we encrypt the information, detect the edges of the image, and then embed with LSB substitution. In the experiments, we calculate PSNR and MSE; the peak signal-to-noise ratio measures quality and brightness. The method ensures that the information is encrypted before it is hidden in the input image, so even if the ciphertext is recovered from the input image, an intermediary other than the receiver cannot access the information, as it is in encrypted form.
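The 1-2-4 LSB pattern named above distributes payload bits unevenly over one RGB pixel: 1 bit in the red LSB, 2 in the green, 4 in the blue, exploiting the eye's lower sensitivity to blue. A minimal sketch of embedding and extraction; the pixel values and bit ordering are illustrative assumptions, not taken from the paper:

```python
def embed_124(pixel, bits):
    """Embed 7 bits into one RGB pixel: 1 bit in R, 2 in G, 4 in B LSBs."""
    r, g, b = pixel
    r = (r & ~0b1) | bits[0]
    g = (g & ~0b11) | (bits[1] << 1) | bits[2]
    b = (b & ~0b1111) | (bits[3] << 3) | (bits[4] << 2) | (bits[5] << 1) | bits[6]
    return (r, g, b)

def extract_124(pixel):
    """Recover the 7 embedded bits from the pixel's LSBs."""
    r, g, b = pixel
    return [r & 1,
            (g >> 1) & 1, g & 1,
            (b >> 3) & 1, (b >> 2) & 1, (b >> 1) & 1, b & 1]

stego = embed_124((200, 130, 77), [1, 0, 1, 1, 0, 0, 1])
print(extract_124(stego))  # -> [1, 0, 1, 1, 0, 0, 1]
```

In the paper's scheme the payload would be RSA ciphertext bits rather than plaintext, so recovering the LSBs alone still reveals nothing usable.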
Globalization of semiconductor design, manufacturing, packaging, and testing has led to several security issues, such as overproduction of chips, shipping of faulty or partially functional chips, intellectual property infringement, cloning, counterfeit chips, and insertion of hardware Trojans in the design house or at the foundry. Adversaries extract chips from obsolete PCBs and release the used parts into the supply chain as new chips. Faulty or partially functioning chips can enter the supply chain from untrusted Assembly, Packaging, and Test (APT) centers. These counterfeit parts are unreliable and can cause catastrophic consequences in critical applications. Secure Split Test (SST) is a promising solution for keeping counterfeits out of the supply chain, protecting the Intellectual Property (IP) rights of owners, and metering chips. CSST (Connecticut SST) improves on SST by simplifying the communication required between the APT center and the design house. CSST addresses scan tests but not the functional testing of chips, which has become critical during production testing for weeding out faulty chips. In this paper, we present a method called PUF-SST (Physical Unclonable Function SST) that performs both scan tests and functional tests without compromising the security features described in CSST.
The enormous size of video data of natural scenes and objects is a practical threat to storage and transmission. Efficient handling of video data requires compression for economic use of storage space, access time, and the available network bandwidth of the public channel. In addition, important video must be protected from malicious interception, attack, or alteration by unauthorized users, so security and privacy have become important issues. Over the past few years, many researchers have studied how to develop efficient video encryption for secure video transmission, and a large number of multimedia encryption schemes have been proposed in the literature, including selective encryption, complete encryption, and entropy-coding-based encryption; each of these three kinds of algorithms has its shortcomings. In this paper, we propose a lightweight selective encryption algorithm for video conferencing based on an efficient XOR operation and symmetric hierarchical encryption, overcoming the weakness of complete encryption while offering better security. The proposed algorithm guarantees security, speed, and error tolerance without increasing the video size.
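The XOR core of such a selective scheme can be sketched as follows. Everything here is an illustrative assumption rather than the paper's algorithm: the keystream is derived by counter-mode SHA-256 hashing (a deployed system would use a vetted stream cipher), and only the "important" frame is encrypted while the rest pass through untouched.

```python
import hashlib

def keystream(key, nonce, length):
    """Derive a pseudorandom keystream by hashing key || nonce || counter.
    Illustrative only; not a substitute for a vetted stream cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_encrypt(data, key, nonce):
    """XOR the data with the keystream; applying it twice decrypts."""
    ks = keystream(key, nonce, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

# Selective encryption: encrypt only the high-value frame (e.g. an I-frame),
# leaving dependent frames in the clear to save computation.
frames = [b"I-frame-data", b"P-frame-data", b"P-frame-data2"]
key, nonce = b"sessionkey", b"frame0"
enc = [xor_encrypt(frames[0], key, nonce)] + frames[1:]
dec = xor_encrypt(enc[0], key, nonce)  # XOR is its own inverse
print(dec == frames[0])  # -> True
```

Because each frame uses its own nonce, a transmission error in one frame does not propagate to others, which is one source of the error tolerance claimed for selective schemes.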
Denial of Service (DoS) attacks are among the major threats and hardest security problems on the Internet. Of particular concern are Distributed Denial of Service (DDoS) attacks, whose impact can be proportionally severe. With little or no advance warning, an attacker can exhaust the computing resources of its victim within a short period of time. In this paper, we study the impact of a UDP flood attack on TCP throughput, round-trip time, and CPU utilization for a web server running a recent Linux platform, Ubuntu 13. The paper also evaluates the impact of various defense mechanisms, including Access Control Lists (ACLs), Threshold Limit, Reverse Path Forwarding (IP Verify), and Network Load Balancing; Threshold Limit is found to be the most effective defense.
This paper applies a con-resistant trust mechanism to enhance the effectiveness and resiliency of a communications-based special protection system. Smart grids incorporate modern information technologies to increase reliability and efficiency through better situational awareness. However, the benefits of this new technology come with added risks from threats and vulnerabilities to the technology and to the critical infrastructure it supports. The research in this paper uses con-resistant trust to quickly identify malicious or malfunctioning (untrustworthy) protection-system nodes and thereby mitigate instabilities. The con-resistant trust mechanism allows protection-system nodes to make trust assessments based on each node's cooperative and defective behaviors, observed via periodically reported frequency readings. The trust architecture is tested in experiments that compare a simulated special protection system with the con-resistant trust mechanism to one without it, using an analysis-of-variance statistical model. Simulation results show promise for the proposed con-resistant trust mechanism.
Cryptography and steganography are the two major fields of data security. While cryptography scrambles information into unintelligible gibberish during transmission, steganography conceals the very existence of the information. Combining the two domains gives a higher level of security: even if the use of a covert channel is revealed, the true information is not exposed. This paper focuses on concealing multiple secret images in a single 24-bit cover image using LSB-substitution-based image steganography. Each secret image is encrypted with the Arnold transform before being hidden in the cover image. Results show that the proposed method secures high-capacity data while keeping the visual quality of the transmitted image satisfactory.
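The Arnold transform used for pre-encryption here is the cat map (x, y) -> (x + y mod N, x + 2y mod N) on an N x N image: a bijective shuffle that destroys spatial correlation but is exactly invertible. A minimal sketch of the forward map and its inverse (the 4 x 4 test image is an illustrative assumption):

```python
def arnold(img):
    """One iteration of the Arnold cat map on an N x N image (list of lists):
    pixel (x, y) moves to ((x + y) mod N, (x + 2y) mod N)."""
    n = len(img)
    out = [[0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            out[(x + y) % n][(x + 2 * y) % n] = img[x][y]
    return out

def arnold_inverse(img):
    """Inverse map, from the inverse of the matrix [[1, 1], [1, 2]]:
    pixel (u, v) returns to ((2u - v) mod N, (v - u) mod N)."""
    n = len(img)
    out = [[0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            out[(2 * u - v) % n][(v - u) % n] = img[u][v]
    return out

img = [[r * 4 + c for c in range(4)] for r in range(4)]
scrambled = arnold(img)
print(arnold_inverse(scrambled) == img)  # -> True
```

In practice several iterations are applied, and the iteration count acts as part of the key; because the map is periodic, the original image also reappears after a fixed number of forward iterations.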
Video streaming between a sender and a receiver involves multiple unsecured hops where the video data can be illegally copied if nodes run malicious forwarding logic. This paper introduces a novel method that streams video through dual channels using dual data paths, with the frames' pixels also scrambled. The video frames are divided into two frame streams; at the receiver side, the video is reconstructed and played for a limited time period, and as soon as a small chunk of the merged video has been played, it is deleted from the video buffer. We have attempted to formalize the approach, and initial simulations have been carried out in MATLAB. Preliminary results are promising, and a refined approach may lead to the formal design of a network-layer routing protocol with corrections at the transport layer.
The modern approach to boulevard (road) networks centers on efficient, safe routing, which must favor low-risk cities. Routing troubles are a perennial problem confronting people day in and day out. The common goal of everyone using a road network is to reach the desired point as fast as possible, which involves balancing multiple expected and unexpected factors such as time, distance, security, and cost. Travel is an almost inherent aspect of everyone's daily routine, yet with the gigantic and complex road network of a modern city or country, finding a low-risk community through which to traverse the distance is not easy. This paper applies code-based community detection to the boulevard network and a fuzzy technique for identifying low-risk communities.
Botnets are emerging as the most serious cyber threat among the different forms of malware. Botnets today facilitate many cybercriminal activities, such as DDoS, click fraud, and phishing attacks; their main purpose is massive financial gain, and many large organizations, banks, and social networks have become targets of bot masters. Botnets can also be leased to support cybercriminal activities. Recently, considerable research effort has gone into detecting bots, C&C channels, and bot masters; in response, bot masters have strengthened their activities with sophisticated techniques. Many botnet detection techniques are based on payload analysis, and most of these are ineffective against encrypted C&C channels. In this paper, we explore the different categories of botnets and propose a detection methodology that distinguishes bot hosts from normal hosts by analyzing traffic-flow characteristics over time intervals instead of inspecting payloads. This makes it possible to detect botnet activity even when encrypted C&C channels are used.
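Time-interval flow analysis of the kind described can be sketched as grouping packets into fixed windows per source host and computing simple per-window statistics; note this works on headers and timing only, so payload encryption does not hide it. The feature set, window length, and packet tuples below are illustrative assumptions, not the paper's methodology:

```python
from collections import defaultdict

def flow_features(packets, window=60.0):
    """Group packets by (source, window index) and compute per-window flow
    statistics: packet count, mean size, and number of unique destinations.
    Each packet is a (timestamp, src, dst, size) tuple."""
    windows = defaultdict(list)
    for ts, src, dst, size in packets:
        windows[(src, int(ts // window))].append((dst, size))
    feats = {}
    for key, pkts in windows.items():
        sizes = [s for _, s in pkts]
        feats[key] = {
            "pkt_count": len(pkts),
            "mean_size": sum(sizes) / len(sizes),
            "uniq_dsts": len({d for d, _ in pkts}),
        }
    return feats

pkts = [(1.0, "h1", "c2", 90), (5.0, "h1", "c2", 92), (70.0, "h1", "c2", 91)]
f = flow_features(pkts)
print(f[("h1", 0)]["pkt_count"])  # -> 2 (two packets fall in the first window)
```

Bots contacting their C&C server tend to produce windows with regular counts, near-constant packet sizes, and few distinct destinations; a classifier trained on such per-window vectors can then separate bot hosts from normal hosts.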
The integrity of image data plays an important role in data communication. Image data contain confidential information, so it is very important to protect the data from intruders; when data are transmitted through the network, they may be lost or damaged. Existing systems do not provide all the functionality needed to secure an image during transmission, i.e., image compression, encryption, and user authentication. In this paper, a hybrid cryptosystem is proposed in which a biometric fingerprint is used to generate the key used for encryption. The secret-fragment-visible mosaic image method is used for secure transmission of the image, and a lossless compression technique reduces the image size, leading to faster transmission of the image data through the channel. The biometric fingerprint is also used for authentication; biometrics is a more secure method of authentication because it requires the physical presence of the person and is difficult to duplicate.
The limited battery lifetime and rapidly increasing functionality of portable multimedia devices demand energy-efficient designs. The filters employed in these devices are mainly based on Gaussian smoothing, which is slow and severely affects performance. In this paper, we propose a novel energy-efficient approximate 2D Gaussian smoothing filter (2D-GSF) architecture that exploits "nearest pixel approximation" and rounds off the Gaussian kernel coefficients. The proposed architecture significantly improves the Speed-Power-Area-Accuracy (SPAA) metrics of energy-efficient filter designs. The efficacy of the proposed approximate 2D-GSF is demonstrated on a real application, edge detection. Simulation results show 72%, 79%, and 76% reductions in area, power, and delay, respectively, with an acceptable 0.4 dB loss in PSNR as compared to the well-known approximate 2D-GSF.
This paper addresses magnetic resonance (MR) image reconstruction under the compressive sampling (compressed sensing) paradigm, followed by segmentation. To improve reconstruction from a low measurement space, weighted linear prediction and random noise injection are first applied in the unobserved space, followed by spatial-domain de-noising through adaptive recursive filtering. The reconstructed image, however, suffers from imprecise and/or missing edges, boundaries, lines, and curvatures, as well as residual noise. The curvelet transform is used to remove noise and enhance edges through hard thresholding and suppression of the approximate sub-bands, respectively. Finally, genetic algorithm (GA)-based clustering segments the sharpened MR image using a weighted contribution of variance and entropy values. Extensive simulation results highlight the performance improvement on both the reconstruction and segmentation problems.
The increasing exploitation of the Internet leads to new uncertainties due to interdependencies and links between the cyber and physical layers. For example, the integration of telecommunication and physical processes that occurs when the power grid is managed and controlled gives rise to epistemic uncertainty. This uncertainty can be managed using specific frameworks, usually coming from fuzzy theory, such as Evidence Theory. The approach is attractive because of its flexibility in managing uncertainty by means of simple rule-based systems with data from heterogeneous sources. In this paper, Evidence Theory is applied to risk evaluation. The authors propose a frame of discernment with a specific relationship among its elements, based on a graph representation. This relationship leads to a smaller power set (the Reduced Power Set) that can be used in place of the classical power set when the most common combination rules, such as Dempster's or Smets', are applied. The paper demonstrates how the Reduced Power Set yields more efficient algorithms for combining evidence and supports the application of Evidence Theory to risk assessment.
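Dempster's rule of combination, one of the combination rules named above, merges two basic mass assignments over the frame of discernment and renormalizes away the conflicting (empty-intersection) mass. A minimal sketch over an illustrative two-element frame; the masses are invented for the example and the Reduced Power Set optimization is not modeled:

```python
def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions whose focal elements are
    frozensets, discarding and renormalizing the conflicting mass."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # empty intersection: conflicting evidence
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

A, B = frozenset({"attack"}), frozenset({"normal"})
theta = A | B  # the whole frame of discernment
m1 = {A: 0.6, theta: 0.4}           # evidence from source 1
m2 = {A: 0.5, B: 0.2, theta: 0.3}   # evidence from source 2
m = dempster_combine(m1, m2)
print(round(m[A], 3))  # -> 0.773: agreement on "attack" reinforces its mass
```

Note the cost: the double loop ranges over all focal elements, which is what a smaller (reduced) power set shrinks.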
In this paper, we focus on principal-agent problems in continuous time when the participants have ambiguity about the output process, in the framework of g-expectation. The first-best (risk-sharing) case is studied. A necessary condition for the optimal contract is derived by means of optimal control theory. Finally, we present some examples to clarify our results.
Quadrature compressive sampling (QuadCS) is a recently introduced sub-Nyquist sampling system for acquiring the in-phase and quadrature components of radio-frequency signals. This paper develops a target detection scheme for pulsed radars in the presence of digital radio frequency memory (DRFM) repeat jammers, with the radar echoes sampled by the QuadCS system. For diversified pulses, the proposed scheme first separates the target echoes from the DRFM repeat jammers using CS recovery algorithms, and then removes the jammers before performing target detection. Because of this separation processing, the jammer leakage through range-sidelobe variation seen in classical matched-filter processing does not appear. Simulation results confirm our findings: with data at one eighth the Nyquist rate, the proposed scheme outperforms classical processing with Nyquist samples in the presence of DRFM repeat jammers.
Sampling multiband radar signals is an essential issue for multiband/multifunction radar. This paper proposes a multiband quadrature compressive sampling (MQCS) system that performs the sampling at a sub-Landau rate. The MQCS system randomly projects the multiband signal onto a compressive multiband signal by modulating each subband with a low-pass signal, and then samples the compressive multiband signal at its Landau rate, outputting compressive measurements. The compressive in-phase and quadrature (I/Q) components of each subband are extracted from the compressive measurements and exploited to recover the baseband I/Q components. Since the effective bandwidth of the compressive multiband signal is much less than that of the received multiband signal, the sampling rate is much lower than the received signal's Landau rate. Simulation results validate that the proposed MQCS system can effectively acquire and reconstruct the baseband I/Q components of multiband signals.
Compressive sensing (CS) is a novel technology for sparse signal acquisition at a sub-Nyquist sampling rate but with relatively high resolution. Photonics-assisted CS has attracted much attention recently due to the wide bandwidth provided by photonics. This paper discusses approaches to realizing photonics-assisted CS.