Biblio
Information, not just data, is key to today's global challenges. Solving these challenges requires not only advancing geospatial and big data analytics but also new analysis and decision-making environments that enable reliable decisions from trustworthy, understandable information, going beyond current approaches to machine learning and artificial intelligence. These environments succeed when they effectively couple human decision making with advanced, guided spatial analytics in human-computer collaborative discourse and decision making (HCCD). Our HCCD approach builds upon visual analytics, natural scale templates, traceable information, human-guided analytics, and explainable and interactive machine learning, focusing on empowering the decision-maker through interactive visual spatial analytic environments where non-digital human expertise and experience can be combined with state-of-the-art, transparent analytical techniques. When we combine this approach with real-world, application-driven research, not only does the pace of scientific innovation accelerate, but impactful change occurs. I'll describe how we have applied these techniques to challenges in sustainability, security, resiliency, public safety, and disaster management.
The Internet of Things (IoT) and RFID devices are essential parts of the new generation of information technology. They are mostly characterized by their limited power and computing resources. To ensure their security under these computing and power constraints, a number of lightweight cryptography algorithms have emerged. This paper presents a performance analysis of six lightweight block ciphers with different structures - LED, PRESENT, HIGHT, LBlock, PICCOLO and TWINE - on the LEON3 open-source processor. We implemented these ciphers in C on an FPGA board running the LEON3 processor. The ciphers are evaluated against various benchmark parameters, including throughput, execution time, CPU performance, AHB bandwidth, simulator performance, and speed, tested with the different key sizes supported by each algorithm.
The dendritic cell algorithm (DCA) is an immune-inspired classification algorithm developed for anomaly detection in computer networks. In its context detection phase, the DCA uses a weighted function to map three categories of input signal - safe, danger, and pathogen-associated molecular pattern - to three output context values, termed co-stimulatory, mature, and semi-mature, which are then used to perform classification. The weighted function requires either manually pre-defined weights, usually provided by immunologists, or weights derived empirically from the training dataset. Neither is sufficiently flexible to produce optimal classification results across different datasets. To address this limitation, this work proposes an approach for computing the three output context values of the DCA by employing the recently proposed TSK+ fuzzy inference system, such that the weights are always optimal for the provided dataset and its specific application. The proposed approach was validated and evaluated on two popular datasets, KDD99 and UNSW-NB15. The experimental results demonstrate that the proposed approach outperforms the conventional DCA in terms of classification accuracy.
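As a point of reference for the conventional scheme this paper replaces, the following is a minimal sketch of the DCA's weighted context computation; the weight matrix is illustrative only (in the standard DCA it is hand-tuned by immunologists or derived empirically, which is exactly the inflexibility addressed above).

# Minimal sketch of the conventional DCA context computation (not the
# TSK+ fuzzy variant proposed in the paper). Weights are illustrative.
# Rows: output contexts (csm = co-stimulatory, semi = semi-mature,
# mat = mature); columns: input signals (PAMP, danger, safe).
WEIGHTS = {
    "csm":  (2.0, 1.0,  2.0),
    "semi": (0.0, 0.0,  3.0),
    "mat":  (2.0, 1.0, -3.0),
}

def context_values(pamp: float, danger: float, safe: float) -> dict:
    """Map the three input signals to the three output context values."""
    signals = (pamp, danger, safe)
    return {ctx: sum(w * s for w, s in zip(ws, signals))
            for ctx, ws in WEIGHTS.items()}

# Strong PAMP/danger signals push the 'mat' (anomalous) context up:
print(context_values(pamp=0.8, danger=0.6, safe=0.1))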
In recent years, the Modular Multilevel Matrix Converter (M3C) has been investigated for its capacity to operate at high voltage and power levels. This converter is appropriate for Wind Energy Conversion Systems (WECSs) due to advantages such as redundancy, high power quality, expandability and control flexibility. For Doubly-Fed Induction Generator (DFIG) WECSs, the M3C offers additional benefits, for instance high power density in the rotor, a more compact modular converter, and control of bidirectional reactive power flow. Therefore, this paper presents a WECS composed of a DFIG and an M3C. The modelling and control of this WECS topology are described and analyzed, and simulation results are presented to validate the effectiveness of the proposal.
The wide use of smart devices has given rise to huge amounts of information, which requires new processing methods; from this perspective the concept of big data arose. Most work on big data concentrates on handling the data rather than securing it. Encryption is the standard means of keeping data safe from malicious users, but ordinary encryption methods are not suitable for big data. Selective encryption is an encryption method that encrypts only the important part of the message; however, evaluating which part of the message is important involves uncertainty, and a problem arises when an important part is left unencrypted. This is the motivation of the paper. We propose a security framework that secures both the important and the unimportant portions of the message to overcome this uncertainty, with each portion encrypted by a different technique for better performance without losing security. The framework encrypts the important parts of the message with a strong algorithm and the remaining parts with a medium-strength algorithm. The importance of a word is defined by how frequently its origin appears. The framework is deployed on Amazon EC2 (Elastic Compute Cloud). We compare the proposed framework with the full encryption method and the Toss-A-Coin method in terms of encryption time and throughput; the results show that the proposed method outperforms full encryption on both metrics.
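A minimal sketch of the selection-plus-tiered-encryption idea, assuming word importance is scored by the frequency of its origin (stem) as described above. Fernet stands in for the strong cipher; the byte-reversal "medium" cipher is a deliberately trivial placeholder, not an algorithm from the paper.

from collections import Counter
from cryptography.fernet import Fernet   # stands in for the strong cipher

strong = Fernet(Fernet.generate_key())

def medium_encrypt(word: bytes) -> bytes:
    # Trivial placeholder for a cheaper, medium-strength cipher.
    return word[::-1]

def selective_encrypt(message: str, threshold: int = 2) -> list:
    words = message.split()
    stems = [w.lower().strip(".,!?") for w in words]
    freq = Counter(stems)                # crude stand-in for origin frequency
    out = []
    for word, stem in zip(words, stems):
        if freq[stem] >= threshold:      # frequent origin => important word
            out.append(strong.encrypt(word.encode()))
        else:
            out.append(medium_encrypt(word.encode()))
    return out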
Protocols for securely testing the equality of two encrypted integers are common building blocks for a number of proposals in the literature that aim for privacy preservation. Because equality tests are used repeatedly in many cryptographic protocols, designing efficient ones matters for both computation and communication overhead. In this work, we consider a scenario with two parties where party A has two integers encrypted using an additively homomorphic scheme and party B has the decryption key. Party A would like to obtain an encrypted bit that shows whether the integers are equal, but nothing more. We propose three secure equality testing protocols, which are more efficient in terms of communication, computation or both compared to the existing work. To support our claims, we present experimental results, which show that our protocols achieve up to 99% computation-wise improvement compared to the state-of-the-art protocols in a fair experimental set-up.
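To make the setting concrete, here is a toy version of the two-party interaction using the python-paillier (phe) library; unlike the paper's protocols, this naive blinded-difference variant leaks the equality result to party B, which is precisely the leakage efficient protocols must avoid.

import secrets
from phe import paillier

pub, priv = paillier.generate_paillier_keypair()  # party B holds priv

# Party A: holds Enc(a) and Enc(b), wants Enc(a == b).
enc_a, enc_b = pub.encrypt(42), pub.encrypt(42)
r = secrets.randbelow(2**64) + 1       # nonzero multiplicative blinding
blinded = (enc_a - enc_b) * r          # Enc(r * (a - b)), sent to B

# Party B: decrypts; learns whether a == b (the leak to avoid).
enc_bit = pub.encrypt(1 if priv.decrypt(blinded) == 0 else 0)

# Party A: now holds Enc(1) iff a == b (decrypted here only to check).
assert priv.decrypt(enc_bit) == 1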
State estimation allows continuous monitoring of a power system by estimating the power system state variables from measurement data. Unfortunately, the measurement data provided by these devices can serve as attack vectors for false data injection attacks. As more components are connected to the internet, the power system is exposed to various known and unknown cyber threats. Previous investigations have shown that false data can be injected into measurements from traditional meters in a way that bypasses bad data detection systems. This paper extends this line of investigation by giving an overview of cyber security threats to phasor measurement units, assessing the impact of false data injection on hybrid state estimators, and suggesting security recommendations. Simulations are performed on the IEEE 30-bus and 118-bus test systems.
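For context, the standard bypass result (our summary, not the paper's derivation) is as follows: with a linearised measurement model $z = Hx + e$, bad data detection flags measurements when the residual $\|z - H\hat{x}\|$ exceeds a threshold; an injected attack vector of the form $a = Hc$ shifts the estimate from $\hat{x}$ to $\hat{x} + c$ while leaving the residual unchanged, so it passes the detector.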
Distributed Denial-of-Service (DDoS) attacks are becoming more and more sophisticated, and existing defence systems are no longer capable of withstanding wide-ranging attacks by themselves. Collaborative mitigation has thus become a necessary way to extend defence mechanisms. However, existing coordinated DDoS mitigation approaches either require complex configuration or are highly priced. Blockchain technology offers a solution that reduces the complexity of signalling in a DDoS defence system, as well as a platform where many autonomous systems (ASes) can share hardware resources and defence capabilities for an effective DDoS defence. In this work we also use a deep learning DDoS detection system: we identify the class of each DDoS attack and determine whether incoming traffic is legitimate or malicious. By classifying the attack traffic flows separately, our proposed mitigation technique can deny only the specific traffic causing the attack, instead of blocking all the traffic coming towards the victim(s).
This paper presents a novel technique to quantify the operational resilience of power electronic-based components affected by High-Impact Low-Frequency (HILF) weather-related events such as high-speed winds. In this study, resilience quantification is used to investigate how promptly the system returns to the pre-disturbance state or another stable operational state. A complexity quantification metric is used to assess the system resilience. The test system is a Solid-State Transformer (SST), representing a complex, nonlinear interconnected system. Results show the effectiveness of the proposed technique for quantifying operational resilience in systems affected by weather-related disturbances.
Blockchain-based cryptocurrencies secure a decentralized consensus protocol by incentives. The protocol participants, called miners, generate (mine) a series of blocks, each containing monetary transactions created by system users. As incentive for participation, miners receive newly minted currency and transaction fees paid by transaction creators. Blockchain bandwidth limits lead users to pay increasing fees in order to prioritize their transactions. However, most prior work focused on models where fees are negligible. In a notable exception, Carlsten et al. [17] postulated that if incentives come only from fees then a mining gap would form: miners would avoid mining when the available fees are insufficient. In this work, we analyze cryptocurrency security in realistic settings, taking into account all elements of expenses and rewards. To study when gaps form, we analyze the system as a game we call the gap game. We analyze the game with a combination of symbolic and numeric analysis tools in a wide range of scenarios. Our analysis confirms Carlsten et al.'s postulate; indeed, we show that gaps form well before fees are the only incentive, and analyze the implications on security. Perhaps surprisingly, we show that different miners choose different gap sizes to optimize their utility, even when their operating costs are identical. Alarmingly, we see that the system incentivizes large miner coalitions, reducing system decentralization. We describe the required conditions to avoid the incentive misalignment, providing guidelines for future cryptocurrency design.
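A back-of-the-envelope version of the gap condition (our simplification for illustration, not the paper's full game model): consider a miner with hash-rate share $p$ and operating cost rate $c$ in a system with block-finding rate $\lambda$, base reward $R$, and fees $F(t)$ accumulated since the last block. Mining at time $t$ is profitable in expectation only if $p\,\lambda\,(R + F(t)) \ge c$. When $R$ dominates, this holds immediately after every block, so miners mine continuously; as $R \to 0$, a miner waits until $F(t) \ge c/(p\lambda)$, creating a gap whose size depends on $p$ and $c$.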
Due to privacy threats associated with computation over outsourced data, processing data in the encrypted domain has become a viable alternative. Secure computation over encrypted data is relevant for analysing datasets in areas that require basic arithmetic operations, such as genome processing, private data aggregation, and cloud computation. Performing division over all-encrypted inputs has not been achieved using homomorphic schemes in non-interactive modes, and in interactive protocols the cost of obtaining an encrypted quotient from encrypted values is computationally expensive. To the best of our knowledge, existing homomorphic solutions for encrypted division are often relaxed to consider a public or private divisor. We acknowledge that other techniques such as secret sharing and garbled circuits have been adopted to compute secure division, but we are interested in homomorphic solutions. We propose an efficient, interactive two-party protocol that computes the fixed-point quotient of two encrypted inputs, using an efficient and secure comparison protocol as a sub-protocol. Our proposal provides a computational advantage, with complexity linear in the digit precision of the quotient. We provide a proof of security in the universally composable framework and complexity analyses. We present experimental results for two cryptosystem implementations in order to compare performance: an efficient prototype of our protocol is implemented using an additively homomorphic scheme (Paillier), while a less efficient fully homomorphic scheme (BGV) version is also presented as a proof of concept, together with analyses of our proposal.
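The following plaintext sketch shows why comparison-driven fixed-point division is linear in the bit precision of the quotient; in the actual protocol both operands remain encrypted and the comparison is realised by the secure two-party sub-protocol, whereas here geq is a plaintext stand-in.

def geq(x: int, y: int) -> bool:
    return x >= y                  # stand-in for the secure comparison

def fixed_point_quotient(a: int, b: int, frac_bits: int) -> int:
    """floor((a << frac_bits) / b) using only geq, shifts and additions."""
    q, r = 0, a << frac_bits       # scale dividend for fixed-point output
    for i in reversed(range(r.bit_length())):  # one comparison per bit
        if geq(r, b << i):         # restoring long division, bit by bit
            q |= 1 << i
            r -= b << i
    return q

# 7/3 with 8 fractional bits: 597 / 2**8 = 2.33203125 (true value 2.333...)
print(fixed_point_quotient(7, 3, 8) / 2**8)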
In this paper, the security performance of a dual-hop underlay cognitive radio (CR) system is investigated. In this system, we consider that the information transmitted by a source node S is forwarded by a multi-antenna relay R to its intended destination D. The relay performs maximal-ratio combining (MRC) to process the multiple copies of the received signal. We also consider the presence of an eavesdropper attempting to intercept the transmitted information on both communication links (i.e., S-R and R-D). In underlay cognitive radio networks (CRNs), the source and the relay are required to adjust their transmission power to avoid causing interference to the primary user. Under this constraint, a closed-form expression for the secrecy outage probability is derived under the Nakagami-m fading model. The derived expression is validated by Monte Carlo simulation for various values of the fading severity parameters as well as the number of MRC branches.
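For context, the quantities involved follow the standard definitions (our summary, not quoted from the paper): the achievable secrecy capacity of a link is $C_s = [\log_2(1+\gamma_D) - \log_2(1+\gamma_E)]^{+}$, where $\gamma_D$ and $\gamma_E$ are the legitimate and eavesdropper SNRs, and the secrecy outage probability for a target secrecy rate $R_s$ is $P_{\mathrm{SOP}} = \Pr\{C_s < R_s\}$; the underlay constraint additionally caps the transmit power at $P \le I_p / |h_{sp}|^2$, with $I_p$ the primary user's interference threshold and $h_{sp}$ the channel to the primary user.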
The collection of students' sensitive data provokes adverse reactions against Learning Analytics, decreasing confidence in its adoption. The laws and policies that surround the use of educational data are not enough to ensure the privacy, security, validity, integrity and reliability of students' data. This problem has been identified through a literature review and can be addressed by adding a technological layer of automated checking rules on top of these policies. The aim of this thesis is to investigate an emerging technology, blockchain, to preserve the identity of students and secure their data. In a first stage, a systematic literature review will be conducted in order to set the context of the research. Afterwards, following the scientific method, we will develop a blockchain-based solution to automate rules and constraints, with the aim of giving students governance of their data and ensuring data privacy and security.
With the growth of technology, designs have become more complex and may contain bugs, which makes verification an indispensable part of product development. UVM describes a standard method for the verification of designs which is reusable and portable. This paper verifies the IIC bus protocol using the Universal Verification Methodology. The IIC controller is designed in Verilog using Vivado; it has an APB interface, and its functional and code coverage analysis is carried out in Mentor Graphics QuestaSim 10.4e. This work achieved 83.87% code coverage and 91.11% functional coverage.
This article describes a privacy policy framework that can represent and reason about complex privacy policies. By using a Common Data Model together with a formal shareability theory, this framework enables the specification of expressive policies in a concise way without burdening the user with technical details of the underlying formalism. We also build a privacy policy decision engine that implements the framework and that has been deployed as the policy decision point in a novel enterprise privacy prototype system. Our policy decision engine supports two main uses: (1) interfacing with user interfaces for the creation, validation, and management of privacy policies; and (2) interfacing with systems that manage data requests and replies by coordinating privacy policy engine decisions and access to (encrypted) databases using various privacy enhancing technologies.
In this paper, we analyze the effectiveness of quantum algorithms in addressing some classical computing complexities. Specifically, we focus on several famous quantum algorithms and discuss their impact on classical computing as used in computer science.
Being able to describe a specific network as consistent is a large step towards resiliency. Next to the importance of security lies the necessity of consistency verification. Attackers currently focus on small but crucial targets such as network configurations or flow tables; attacks of this kind defeat the whole purpose of a security system built on top of an inconsistent network. Advances in Artificial Intelligence (AI) are playing a key role in ensuring a fast response to the large number of evolving threats. Software Defined Networking (SDN), being centralized by design, offers a global overview of the network. Robustness and adaptability are part of the package offered by programmable networking, which drove us to consider the integration of AI and SDN. The general goal of our series is to achieve an Artificial Intelligence Resiliency System (ARS). The aim of this paper is to propose a new AI-based consistency verification system, which will be part of ARS in our future work. The comparison of different deep learning architectures shows that Convolutional Neural Networks (CNNs) give the best results, with an accuracy of 99.39% on our dataset and 96% on our consistency test scenario.
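A minimal sketch of a CNN-based consistency classifier of the kind compared above (Keras shown for brevity; the 64-feature flow/configuration encoding, the layer sizes, and the binary consistent/inconsistent labels are our illustrative assumptions, not the paper's exact setup):

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(64, 1)),            # encoded flow-table/config entry
    layers.Conv1D(32, 3, activation="relu"),
    layers.MaxPooling1D(2),
    layers.Conv1D(64, 3, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # consistent vs. inconsistent
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Placeholder training data; real labels come from consistency checks.
X = np.random.rand(1000, 64, 1)
y = np.random.randint(0, 2, size=(1000,))
model.fit(X, y, epochs=5, batch_size=32, verbose=0)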
A Stoner-Wohlfarth-type model is used to demonstrate the effect of the buildup of magnetic charges near the grain boundaries of low anisotropy polycrystalline materials, revealed by measuring the magnetization during positive-field warming after negative-field cooling. The remanent magnetization after negative-field cooling has two different contributions. The temperature-dependent component is modeled as an assembly of particles with thermal relaxation. The temperature-independent component is modeled as an assembly of particles overcoming variable phenomenological energy barriers corresponding to the change in susceptibility when the anisotropy constant changes its sign. The model is applicable to soft-magnetic materials where the buildup of the magnetic charges near the grain boundaries creates demagnetizing fields opposing, and comparable in magnitude to, the anisotropy field. The results of the model are in qualitative agreement with published data revealing the magneto-thermal characteristics of polycrystalline gadolinium.
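For reference, the single-particle energy density underlying Stoner-Wohlfarth-type models takes the standard form (our summary, not quoted from the paper) $E(\theta) = K\sin^2(\theta - \psi) - \mu_0 M_s H \cos\theta$, where $K$ is the anisotropy constant, $\psi$ the angle between the easy axis and the applied field $H$, $\theta$ the angle of the magnetization, and $M_s$ the saturation magnetization; the model above additionally includes the demagnetizing fields from grain-boundary magnetic charges that oppose, and are comparable in magnitude to, the anisotropy field $2K/\mu_0 M_s$.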
We introduce FairSwap – an efficient protocol for the fair exchange of digital goods using smart contracts. A fair exchange protocol allows a sender S to sell a digital commodity x for a fixed price p to a receiver R. The protocol is said to be secure if R only pays when he receives the correct x. Our solution guarantees fairness by relying on smart contracts executed over decentralized cryptocurrencies, where the contract takes the role of an external judge that completes the exchange in case of disagreement. While in the past there have been several proposals for building fair exchange protocols over cryptocurrencies, our solution has two distinctive features that make it particularly attractive when users deal with large commodities: (1) minimizing the cost of running the smart contract on the blockchain, and (2) avoiding expensive cryptographic tools such as zero-knowledge proofs. In addition to our new protocols, we provide formal security definitions for smart contract based fair exchange, and prove the security of our construction. Finally, we illustrate several applications of our basic protocol and evaluate the practicality of our approach via a prototype implementation for fairly selling large files over the cryptocurrency Ethereum.
This paper presents Checked C, an extension to C designed to support spatial safety, implemented in Clang and LLVM. Checked C's design is distinguished by its focus on backward-compatibility, incremental conversion, developer control, and enabling highly performant code. Like past approaches to a safer C, Checked C employs a form of checked pointer whose accesses can be statically or dynamically verified. Performance evaluation on a set of standard benchmark programs shows overheads to be relatively low. More interestingly, Checked C introduces the notions of a checked region and bounds-safe interfaces.