Biblio
An Intrusion Detection System (IDS) is an application that monitors network or system activity and detects malicious operations. Implementing an IDS on a Software Defined Networking (SDN) architecture has drawbacks: it may degrade network Quality of Service (QoS), so that the network can no longer serve the existing traffic adequately. Throughput, delay, and packet loss are important QoS parameters. Snort IDS and Bro IDS are common tools for deploying an IDS on a network. They differ, among other things, in their detection method: Snort uses signature-based detection while Bro uses anomaly-based detection. This difference affects how each tool handles the network traffic passing through it. In this research, we compare both tools using throughput, delay, packet loss, CPU usage, and memory usage as test parameters. The tests show that Bro outperforms Snort on the throughput, delay, and packet loss parameters; however, Bro requires more CPU and memory resources than Snort.
In this paper, we develop a statistical framework for image steganography in which the cover and stego messages are modeled as multivariate Gaussian random variables. By minimizing the detection error of an optimal detector within the adopted generalized statistical model, we propose a novel Gaussian embedding method. Furthermore, we extend the formulation to cost-based steganography, resulting in a universal embedding scheme that works with embedding costs as well as variance estimators. Experimental results show that the proposed approach avoids embedding in smooth regions and significantly improves the security of state-of-the-art methods such as HILL, MiPOD, and S-UNIWARD.
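As a rough illustration of the variance-driven behaviour only (this is not the paper's minimum-detectability derivation), the sketch below maps estimated pixel variances to change rates scaled to a target payload, so that smooth (low-variance) regions receive almost no embedding changes; the variance values and the proportional allocation rule are assumptions for illustration.

```python
import numpy as np

def ternary_entropy(beta):
    """Entropy (bits) of a +1/-1 change applied with probability beta each."""
    beta = np.clip(beta, 1e-12, 0.5 - 1e-12)
    return -2 * beta * np.log2(beta) - (1 - 2 * beta) * np.log2(1 - 2 * beta)

def change_rates(sigma2, payload_bits):
    """Illustrative allocation: change rates grow with local variance,
    scaled by bisection on a global factor to hit the target payload."""
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2
        beta = np.clip(mid * sigma2 / sigma2.max(), 0, 1 / 3)
        if ternary_entropy(beta).sum() < payload_bits:
            lo = mid
        else:
            hi = mid
    return np.clip(lo * sigma2 / sigma2.max(), 0, 1 / 3)

# Toy example: a smooth region (small variance) next to a textured one.
sigma2 = np.concatenate([np.full(1000, 0.01), np.full(1000, 4.0)])
beta = change_rates(sigma2, payload_bits=400)
print(beta[:3], beta[-3:])   # near-zero in the smooth region, larger in texture
```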
The effects of quantum confinement on the charge distribution in planar Double-Gate (DG) SOI (Silicon-on-Insulator) MOSFETs were examined for sub-10 nm SOI film thicknesses (t_si ≤ 10 nm), by modeling the potential experienced by the charge carriers as an anharmonic-oscillator potential, consistent with the inherent structural symmetry of nanoscale symmetric DG SOI MOSFETs. By solving the 1-D Poisson equation with this potential, the results obtained were validated through comparisons with TCAD simulations. The present model satisfactorily predicts the electron density and channel charge density for a wide range of SOI channel thicknesses and gate voltages.
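As a hedged sketch of the modeling ingredients named above (the coefficients and boundary conditions used in the paper are not reproduced here), the carriers are assumed to see a symmetric anharmonic-oscillator potential across the film, coupled to the 1-D Poisson equation for the electrostatics:

```latex
% Illustrative form only: symmetric anharmonic potential across the SOI film,
% with x measured from the film centre and |x| <= t_si / 2
V(x) \approx \tfrac{1}{2} k_2 x^{2} + \tfrac{1}{4} k_4 x^{4}

% 1-D Poisson equation for the electrostatic potential (electrons only shown)
\frac{d^{2}\psi(x)}{dx^{2}} = \frac{q\,n(x)}{\varepsilon_{\mathrm{Si}}},
\qquad
n(x) \propto \sum_{i} \lvert \phi_i(x) \rvert^{2} f(E_i)

% \phi_i, E_i: bound states in V(x); f: Fermi-Dirac occupation
```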
Due to the importance of securing electronic transactions, many cryptographic protocols have been employed, relying mainly on keys distributed between the intended parties. In classical computers, the security of these protocols depends on the mathematical complexity of the encoding functions and on the length of the key. However, the existing classical algorithms are 100% breakable given enough computational power, which quantum machines can provide. Moving to quantum computation, security shifts into a new area of cryptographic solutions, namely quantum cryptography. The era of quantum computers is at its beginning, and there are few practical implementations and evaluations of quantum protocols. Therefore, this paper describes the well-known BB84 quantum key distribution protocol and then provides a practical implementation of it on the IBM QX software. The implementation shows differences between the theoretically expected results of BB84 and the practical results; accordingly, the paper provides a statistical analysis of the experiments by comparing the standard deviation of the results. Using the BB84 protocol, the existence of a third-party eavesdropper can be detected, so the probabilities of detecting and of failing to detect a third-party eavesdropper have been calculated and compared to the theoretical expectations. The calculations show that the greater the number of qubits, the higher the probability of detecting the eavesdropper.
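To illustrate the last claim, the standard intercept-resend analysis gives a detection probability of 1 - (3/4)^n when n sifted bits are compared; the short sketch below is a plain classical calculation (not the IBM QX circuits used in the paper) showing how the probability approaches 1 as the number of qubits grows.

```python
# Probability of catching an intercept-resend eavesdropper on BB84,
# assuming Eve measures every qubit and n sifted bits are compared.
def detection_probability(n: int) -> float:
    # Each compared bit independently reveals Eve with probability 1/4.
    return 1 - (3 / 4) ** n

for n in (4, 8, 16, 32, 64):
    print(f"{n:3d} compared bits -> P(detect) = {detection_probability(n):.4f}")
```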
Originally implemented by Google, QUIC is gathering growing interest by providing, on top of UDP, the same service as the classical TCP/TLS/HTTP/2 stack. The IETF will finalise the QUIC specification in 2019. A key feature of QUIC is that almost all of its packets, including most of their headers, are fully encrypted. This prevents eavesdropping and interference caused by middleboxes. Thanks to this feature and its clean design, QUIC is easier to extend than TCP. In this paper, we revisit the reliable transmission mechanisms included in QUIC. More specifically, we design, implement, and evaluate Forward Erasure Correction (FEC) extensions to QUIC. These extensions are mainly intended for high-delay and lossy communications such as in-flight communications. Our design includes a generic FEC frame, and our implementation supports the XOR, Reed-Solomon, and Convolutional RLC error-correcting codes. We also conservatively avoid hindering the loss-based congestion signal by distinguishing the packets that have been received from the packets that have been recovered by the FEC. We evaluate the performance by applying an experimental design covering a wide range of delay and packet loss conditions with reproducible experiments. These experiments confirm that our modular design allows the protocol to adapt to the network conditions. For long data transfers, or when the loss rate and delay are small, the FEC overhead negatively impacts the download completion time. However, with high packet loss rates and long delays, or for smaller files, FEC drastically reduces the download completion time by avoiding costly retransmission timeouts. These results show that FEC needs to be used adaptively to the network conditions.
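As a minimal illustration of the FEC idea for the simplest of the three codes (XOR), the sketch below builds one repair symbol over a block of source packets and recovers a single lost packet; the actual QUIC FEC frame layout and the Reed-Solomon/RLC codes from the paper are not modeled here.

```python
from functools import reduce

def xor_repair(packets: list[bytes]) -> bytes:
    """Repair symbol = byte-wise XOR of all source packets (equal lengths assumed)."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), packets)

def recover(received: dict[int, bytes], repair: bytes, block_size: int) -> bytes:
    """Recover the single missing packet of the block from the repair symbol."""
    missing = [i for i in range(block_size) if i not in received]
    assert len(missing) == 1, "XOR can repair at most one loss per block"
    return xor_repair(list(received.values()) + [repair])

# Example: packet 2 of a 4-packet block is lost in transit.
block = [bytes([i] * 8) for i in range(4)]
repair = xor_repair(block)
received = {i: p for i, p in enumerate(block) if i != 2}
assert recover(received, repair, block_size=4) == block[2]
```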
With the proliferation of smartphones, a novel sensing paradigm called Mobile Crowd Sensing (MCS) has recently emerged. However, attacks and faults in MCS cause a serious false data problem. Observing the intrinsic low dimensionality of general monitoring data and the sparsity of false data, false data detection can be performed by separating normal data from anomalies. Although the existing separation algorithm based on Direct Robust Matrix Factorization (DRMF) is proven to be effective, iteratively performing Singular Value Decomposition (SVD) for low-rank matrix approximation results in a prohibitively high accumulated computation cost when the data matrix is large. In this work, we observe from our empirical study of DRMF that false data can be located quickly, and based on this we propose an intelligent Lightweight Low Rank and False Matrix Separation algorithm (LightLRFMS) that reuses the result of the previous matrix decomposition to deduce the one for the current iteration step. Our algorithm largely speeds up the whole iteration process. From a theoretical perspective, we validate that LightLRFMS requires only one round of SVD computation and thus has a very low computation cost. We have carried out extensive experiments using a PM2.5 air quality trace and a road traffic trace. Our results demonstrate that LightLRFMS achieves very good false data detection performance, matching the highest detection accuracy of DRMF while running up to 10 times faster thanks to its lower computation cost.
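A condensed sketch of the underlying low-rank-plus-sparse separation step (DRMF-style, not the authors' LightLRFMS reuse trick): truncate an SVD to obtain the low-rank "normal data" part and flag the largest residuals as false data. The rank, outlier budget, and data are placeholders.

```python
import numpy as np

def separate(M: np.ndarray, rank: int, n_outliers: int, n_iter: int = 10):
    """Alternate: (1) low-rank fit with suspected false entries zeroed out,
    (2) mark the entries with the largest residuals as outliers (false data)."""
    S = np.zeros_like(M, dtype=bool)               # outlier mask
    for _ in range(n_iter):
        X = np.where(S, 0.0, M)                    # mask out suspected false data
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # rank-r approximation
        resid = np.abs(M - L)
        thresh = np.partition(resid.ravel(), -n_outliers)[-n_outliers]
        S = resid >= thresh                        # keep only the largest residuals
    return L, S

# Toy example: a rank-2 sensing matrix with two injected false readings.
rng = np.random.default_rng(0)
M = rng.normal(size=(50, 2)) @ rng.normal(size=(2, 40))
M[3, 5] += 10; M[10, 7] -= 8                       # injected false data
L, S = separate(M, rank=2, n_outliers=2)
print(np.argwhere(S))                              # expected: [[3 5], [10 7]]
```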
Vehicular Named Data Network (VNDN) uses Named Data Network (NDN) as a communication enabler. Communication is achieved using content names instead of host addresses. NDN integrates content caching at the network level rather than the application level; hence, the network becomes aware of content caching and delivery. Content caching is a fundamental element of VNDN communication. However, due to the limited capacity of the cache store, only the most used content should be cached while the less used content should be evicted. Traditional cache replacement policies may not work efficiently in VNDN because of the large volume and diversity of the exchanged content. To solve this issue, we propose an efficient cache replacement policy that takes quality of service into consideration. The idea consists of classifying the traffic into different classes and splitting the cache store into a set of sub-cache stores, one per traffic class, with storage capacities set according to the network requirements. Each content is assigned a popularity-density value that balances the content's popularity against its size. Content with the highest popularity-density value is cached, while content with the lowest is evicted. Simulation results prove the efficiency of the proposed solution in enhancing the overall network quality of service.
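A minimal sketch of the eviction rule described above, under assumed definitions (popularity taken as the request count, density as popularity divided by content size); the per-class sub-cache sizing is reduced to a simple dictionary of capacities.

```python
class SubCache:
    """One sub-cache store for a traffic class; evicts the lowest popularity-density content."""
    def __init__(self, capacity_bytes: int):
        self.capacity = capacity_bytes
        self.items = {}                          # content name -> (size, request count)

    def _density(self, name):
        size, hits = self.items[name]
        return hits / size                       # popularity balanced against content size

    def insert(self, name: str, size: int):
        hits = self.items.get(name, (size, 0))[1] + 1
        self.items[name] = (size, hits)
        # Evict the lowest popularity-density content until the store fits.
        while sum(s for s, _ in self.items.values()) > self.capacity and len(self.items) > 1:
            del self.items[min(self.items, key=self._density)]

# The cache store is split into per-class sub-stores with different capacities.
caches = {"safety": SubCache(10_000), "infotainment": SubCache(50_000)}
caches["safety"].insert("/road/hazard/segment42", size=800)
```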
Securing the Internet of Things (IoT) is a major concern, since IoT data are personal, need to be reliable, and can direct and manipulate device decisions in harmful ways. In addition, the data generation process is heterogeneous, the data are immense in volume, and their management is complex. Combining quantum computing with the Internet of Things, coined Quantum IoT, defines a concept of stronger security design that harnesses the laws of quantum mechanics in IoT security management and ensures secure data storage, processing, communication, and data dynamics. In this paper, a hybrid IoT security infrastructure is introduced, with an extra layer that ensures a quantum state. This state prevents harmful actions by eavesdroppers on the communication channel and the cyber side by maintaining its state and protecting the key with the BB84 quantum cryptography protocol; an adapted version specific to this IoT scenario is introduced. A classical cryptographic system, the one-time pad (OTP), is used in the hybrid management. The novelty of this paper lies in the integration of classical and quantum communication for IoT security.
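The classical half of the hybrid scheme is easy to sketch: below, a one-time pad encrypts an IoT payload with a key assumed to have been established by the BB84 layer (the key here is a random placeholder; the quantum exchange itself is not simulated).

```python
import secrets

def one_time_pad(data: bytes, key: bytes) -> bytes:
    """XOR one-time pad; the same call decrypts. The key must be at least as long
    as the message and, in the hybrid scheme, would come from the BB84 layer."""
    assert len(key) >= len(data), "OTP key must not be shorter than the message"
    return bytes(d ^ k for d, k in zip(data, key))

payload = b'{"sensor":"temp","value":21.7}'
bb84_key = secrets.token_bytes(len(payload))     # stand-in for the quantum-derived key
ciphertext = one_time_pad(payload, bb84_key)
assert one_time_pad(ciphertext, bb84_key) == payload
```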
This paper presents the results of a qualitative study on discussions about two major law enforcement interventions against Dark Net Market (DNM) users extracted from relevant Reddit forums. We assess the impact of Operation Hyperion and Operation Bayonet (combined with the closure of the site Hansa) by analyzing posts and comments made by users of two Reddit forums created for the discussion of Dark Net Markets. The operations are compared in terms of the size of the discussions, the consequences recorded, and the opinions shared by forum users. We find that Operation Bayonet generated a higher number of discussions on Reddit, and from the qualitative analysis of such discussions it appears that this operation also had a greater impact on the DNM ecosystem.
Quantum low probability of intercept transmits ciphertext in a way that prevents an eavesdropper possessing the decryption key from recovering the plaintext. It is capable of Gbps communication rates on optical fiber over metropolitan-area distances.
Quickest detection of false data injection attacks (FDIAs) in dynamic smart grids is considered in this paper. The unknown time-varying state variables of the smart grid and the FDIAs pose a significant challenge for designing a computationally efficient detector. To address this challenge, we propose new Cumulative-Sum-type algorithms whose computational complexity scales linearly with the number of meters. Moreover, for any constraint on the expected false alarm period, a lower bound on the threshold employed in the proposed algorithm is provided. For any given threshold employed in the proposed algorithm, an upper bound on the worst-case expected detection delay is also derived. The proposed algorithm is numerically investigated in the context of an IEEE standard power system under FDIAs and is shown to outperform a representative existing algorithm in the test case.
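A schematic CUSUM-type recursion (the paper's exact statistic, which copes with unknown time-varying states, is not reproduced here): a per-step score computed over all meters is accumulated, clipped at zero, and an alarm is raised when the sum crosses a threshold. The score function, drift, and threshold below are placeholders.

```python
import numpy as np

def cusum_fdia(residuals: np.ndarray, drift: float, threshold: float) -> int | None:
    """CUSUM-style quickest detection over per-time-step meter residuals.
    residuals: array of shape (T, n_meters); drift/threshold are tuning parameters."""
    W = 0.0
    for t, r in enumerate(residuals):
        score = np.sum(r ** 2) - drift          # grows when measurements deviate
        W = max(0.0, W + score)                 # CUSUM recursion, clipped at zero
        if W > threshold:
            return t                            # alarm: possible false data injection
    return None

# Toy example: clean residuals, then an injected bias from t = 60 onward.
rng = np.random.default_rng(1)
r = rng.normal(size=(100, 20))
r[60:] += 0.8
print(cusum_fdia(r, drift=25.0, threshold=50.0))   # alarm shortly after t = 60
```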
The application of random numbers to the information security of data, communication lines, computer units, and automated driving systems is considered. The possibilities for building quantum random number generators and the existing solutions for acquiring sufficiently random sequences are analyzed. The authors have developed a method for creating quantum generators on the basis of semiconductor electronic components. An electron-quantum generator based on electron tunneling is demonstrated experimentally. It is shown to produce random sequences of a high security level that satisfy the known NIST statistical tests (P-value > 0.9). The generator can be used for the formation of both private and public cryptographic keys in computer systems and other platforms, and has great potential for the realization of random walks and probabilistic computing based on neural networks and other IT problems.
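As a pointer to what "P-value" refers to, the sketch below implements the simplest NIST SP 800-22 test (the frequency, or monobit, test); the bit stream here is a software placeholder for the tunneling-based hardware output.

```python
import math, secrets

def nist_monobit_pvalue(bits: list[int]) -> float:
    """NIST SP 800-22 frequency (monobit) test: P-value of the +/-1 bit sum."""
    n = len(bits)
    s = sum(2 * b - 1 for b in bits)             # map 0/1 -> -1/+1 and sum
    return math.erfc(abs(s) / math.sqrt(2 * n))

# Placeholder bit stream; in the paper this would be the tunneling generator's output.
bits = [b for byte in secrets.token_bytes(1250) for b in map(int, f"{byte:08b}")]
print(nist_monobit_pvalue(bits))                 # random data typically passes (P >= 0.01)
```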