Biblio
The problems of applying random numbers to the information security of data, communication lines, computer units, and automated driving systems are considered. The possibilities for building quantum random number generators and the existing solutions for obtaining sufficiently random sequences are analyzed. The authors propose a method for creating quantum generators on the basis of semiconductor electronic components. An electron-quantum generator based on electron tunneling is demonstrated experimentally. It is shown to produce random sequences with a high security level that pass the known NIST statistical tests (P-value > 0.9). The generator can be used to form both private and public cryptographic keys in computer systems and other platforms, and it has great potential for realizing random walks and probabilistic computing based on neural networks, as well as for other IT problems.
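As an illustration of the kind of check the NIST suite performs, the sketch below implements the frequency (monobit) test, one of the simplest tests in NIST SP 800-22; the bit sequence here is generated with Python's PRNG purely as a hypothetical stand-in for the tunneling generator's output.

```python
import math
import random

def monobit_p_value(bits):
    """NIST SP 800-22 frequency (monobit) test: P-value for a bit sequence."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)       # map bits to +/-1 and sum
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))      # P-value; >= 0.01 counts as a pass

# Hypothetical sample standing in for bits produced by the electron-quantum generator.
sample = [random.getrandbits(1) for _ in range(100_000)]
print(f"P-value = {monobit_p_value(sample):.4f}")
```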
The paper discusses the architectural, algorithmic, and computational aspects of creating and operating a class of expert systems for managing the technological safety of an enterprise under a large flow of diagnostic variables. The algorithm for finding a faulty technological chain uses expert information formed as a set of evidence on the influence of diagnostic variables on the correctness of the technological process. Using the Dempster-Shafer belief function allows an overall probability measure to be determined on subsets of faulty process chains. To combine different pieces of evidence, the orthogonal sums of the basic probability assignments determined for each piece of evidence are calculated, as shown in the sketch below. The procedure described above is converted into production rules of the knowledge base. A description of the developed prototype of the expert system, its architecture, algorithms, and software is given. The functionality of the expert system and the configuration tools for a specific type of production are also discussed.
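A minimal sketch of the orthogonal sum (Dempster's rule of combination) mentioned above, assuming basic probability assignments are represented as dicts mapping focal sets to masses; the technological chains T1-T3 and the mass values are hypothetical, not taken from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Orthogonal sum of two basic probability assignments (BPAs).

    m1, m2: dicts mapping frozensets (focal elements) to masses that sum to 1.
    Returns the combined, normalized BPA.
    """
    combined = {}
    conflict = 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc                 # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Hypothetical evidence about which technological chain (T1, T2, T3) is faulty.
e1 = {frozenset({"T1"}): 0.6, frozenset({"T1", "T2"}): 0.3,
      frozenset({"T1", "T2", "T3"}): 0.1}
e2 = {frozenset({"T2"}): 0.5, frozenset({"T1", "T2", "T3"}): 0.5}
print(dempster_combine(e1, e2))
```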
Science gateways open the possibility of reproducible science because they integrate reusable techniques, data and workflow management systems, security mechanisms, and high-performance computing (HPC). We introduce BioinfoPortal, a science gateway that integrates a suite of bioinformatics applications using the HPC and data management resources provided by the Brazilian National HPC System (SINAPAD). BioinfoPortal follows the Software as a Service (SaaS) model, and the web server is freely available for academic use. The goal of this paper is to describe the science gateway and its usage, addressing the challenges of designing a multiuser computational platform for parallel/distributed execution of large-scale bioinformatics applications on Brazilian HPC resources. We also present a study of the performance and scalability of several bioinformatics applications executed in these HPC environments, and we perform machine learning analyses to predict the HPC allocation/usage features under which the bioinformatics applications perform best via BioinfoPortal.
Complex software is built by composing components that implement largely independent blocks of functionality. However, once the sources are compiled into an executable, that modularity is lost. This is unfortunate for code recipients, for whom knowing the components has many potential benefits, such as improved program understanding for reverse engineering, identifying shared code across different programs, binary code reuse, and authorship attribution. This paper proposes a novel approach for decomposing such source-free program executables into components. Given an executable, the approach first statically builds a decomposition graph, where nodes are functions and edges capture three types of relationships: code locality, data references, and function calls. It then applies a graph-theoretic approach to partition the functions into disjoint components. A prototype implementation, BCD, demonstrates the approach's efficacy: an evaluation of BCD on 25 C++ binary programs, recovering the methods belonging to each class, achieves high precision and recall scores on the tested programs.
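A minimal sketch of the decomposition-graph idea, assuming the functions and their pairwise relationships have already been extracted from the binary; the edge weights, the threshold, and the simple union-find clustering are illustrative stand-ins, not BCD's actual partitioning algorithm.

```python
from collections import defaultdict

# Illustrative weights per relationship type (hypothetical, not BCD's values).
WEIGHTS = {"locality": 1.0, "data_ref": 2.0, "call": 3.0}
THRESHOLD = 2.5  # hypothetical cut-off for keeping an edge in the partition

def build_graph(relations):
    """relations: iterable of (func_a, func_b, kind) tuples -> weighted edges."""
    graph = defaultdict(float)
    for a, b, kind in relations:
        graph[frozenset((a, b))] += WEIGHTS[kind]
    return graph

def components(funcs, graph):
    """Partition functions with union-find, keeping only strongly related pairs."""
    parent = {f: f for f in funcs}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]       # path halving
            x = parent[x]
        return x
    for edge, weight in graph.items():
        if weight >= THRESHOLD:
            a, b = tuple(edge)
            parent[find(a)] = find(b)
    groups = defaultdict(set)
    for f in funcs:
        groups[find(f)].add(f)
    return list(groups.values())

# Hypothetical functions recovered from a stripped binary.
funcs = ["f1", "f2", "f3", "f4"]
rels = [("f1", "f2", "call"), ("f1", "f2", "locality"),
        ("f3", "f4", "data_ref"), ("f2", "f3", "locality")]
print(components(funcs, build_graph(rels)))
```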
In this paper, we propose a graph-based algorithmic technique for malware detection, utilizing the System-call Dependency Graphs (ScDG) obtained through taint-analysis traces. We leverage the grouping of system calls into system-call groups, with respect to their functionality, to merge vertices of ScDG graphs, transforming them into Group Relation Graphs (GrG); because GrG graphs represent a malware's behavior, they are more resilient to probable mutations of its structure. More precisely, we extend the use of GrG graphs by mapping their vertices onto the plane, utilizing the degrees and the vertex weights of a specific underlying graph of the GrG graph, in order to compute domination relations. Furthermore, we investigate how the activity of each system-call group can be utilized to distinguish graph representations of malware and benign software. The domination relations among the vertices of GrG graphs result in a new graph representation that we call the Coverage Graph of the GrG graph. Finally, we evaluate the potential of our detection model using graph similarity between the Coverage Graphs of known malicious and benign software samples of various types.
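A minimal sketch of the ScDG-to-GrG transformation described above, assuming a hypothetical mapping of system calls to functional groups; the group names and the trace edges are illustrative, not taken from the paper.

```python
from collections import defaultdict

# Hypothetical mapping of system calls to functional groups.
SYSCALL_GROUP = {
    "open": "FILE", "read": "FILE", "write": "FILE",
    "socket": "NET", "send": "NET", "recv": "NET",
    "fork": "PROC", "execve": "PROC",
}

def scdg_to_grg(scdg_edges):
    """Collapse a system-call dependency graph (edges between system calls)
    into a group relation graph (weighted edges between system-call groups)."""
    grg = defaultdict(int)
    for src, dst in scdg_edges:
        grg[(SYSCALL_GROUP[src], SYSCALL_GROUP[dst])] += 1  # weight = merged dependencies
    return dict(grg)

# Illustrative taint-analysis trace: dependencies between system calls.
trace = [("open", "read"), ("read", "send"), ("socket", "send"),
         ("read", "write"), ("fork", "execve"), ("execve", "open")]
print(scdg_to_grg(trace))
```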
Assured Mission Delivery Network (AMDN) is a collaborative network to support data-intensive scientific collaborations in a multi-cloud environment. Each scientific collaboration group, called a mission, specifies a set of rules to handle computing and network resources. Security is an integral part of the AMDN design since the rules must be set by authorized users and the data generated by each mission may be privacy-sensitive. In this paper, we propose a CertificateLess cryptography-based Rule-management Protocol (CL-RP) for AMDN, which supports authenticated rule registrations and updates with non-repudiation. We evaluate CL-RP through test-bed experiments and compare it with other standard protocols.
Aerial photography is fast becoming essential in scientific research that requires multi-agent systems operating from several perspectives, and we propose a secure system using a well-known public-key cryptosystem, NTRU, which is somewhat homomorphic in nature. We process aerial photographs captured by multiple agents. The agents encrypt the images and upload them to an untrusted cloud server. Cloud computing is pervasive in the modern era, and the public cloud is used everywhere for its shared, on-demand nature, yet the cloud environment faces many security and privacy issues that need to be solved. This paper focuses on using the cloud in such a way that there is no possibility of data or computation breaches originating from the cloud server itself, which is prone to various kinds of treacherous attack. The cloud server computes on the encrypted data without knowing the contents of the images. After concatenation, the encrypted result is delivered to the concerned authority, where it is decrypted while retaining its originality. We set up our experiment on Amazon EC2, where several instances acted as the agents and one instance acted as the server. We varied several parameters to minimize encryption time. The experiments produced the desired result within feasible time while sustaining image quality. This work ensures data security in the public cloud, which was our main concern.
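The core idea above is that the untrusted server operates on ciphertexts only. As a hedged illustration of that workflow, and explicitly not the paper's NTRU-based scheme, the sketch below uses a toy Paillier cryptosystem, which is additively homomorphic; the key sizes and values are deliberately tiny and illustrative only.

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic) as a stand-in for the
# NTRU-based somewhat homomorphic scheme: the server combines ciphertexts
# without ever seeing the plaintexts. Illustration only, not secure key sizes.

def keygen(p=999983, q=1000003):            # small known primes, illustration only
    n = p * q
    n2 = n * n
    lam = math.lcm(p - 1, q - 1)
    g = n + 1
    mu = pow((pow(g, lam, n2) - 1) // n, -1, n)
    return (n, g), (lam, mu, n)

def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    while True:
        r = random.randrange(2, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    lam, mu, n = priv
    n2 = n * n
    return ((pow(c, lam, n2) - 1) // n * mu) % n

pub, priv = keygen()
c1, c2 = encrypt(pub, 123), encrypt(pub, 456)   # two agents encrypt their values
c_sum = (c1 * c2) % (pub[0] ** 2)               # untrusted server combines ciphertexts
assert decrypt(priv, c_sum) == 123 + 456        # authority decrypts the combined result
```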
Industrial networking faces many issues that depend on the type of industry, including data storage, data centers, and cloud computing. Green data storage improves the scientific, commercial, and industrial profile of the networking. Future industries are looking for low-cost cybersecurity solutions, and energy consumption is the main problem in industrial networking. To address these problems, green data storage should be the priority, because data centers and cloud computing deal with data storage. In this analysis, we use a solar energy source and different light rays; the methodologies include a prism and Li-Fi techniques. In this approach, light rays are sent through the prism, which allows us to transmit the data at different frequencies. The approach provides green energy and maximum protection within the data center. As a result, we illustrate that cloud services within a green data center in industrial networking can achieve better protection with low-cost energy. Finally, we conclude that Li-Fi enhances the use of green energy and protection, which benefits current and future industrial networking.
Security and privacy of big data become challenging as data grows and becomes accessible to more and more clients. Large-scale data storage is becoming a necessity for healthcare, business segments, government departments, scientific endeavors, and individuals. Our research focuses on privacy, security, and how we can make sure that big data is secured. Managing the security policy for big data is a challenge that our framework handles. The privacy policy needs to be integrated, flexible, context-aware, and customizable. We build a framework that receives data from the customer, analyzes the received data, extracts the privacy policy, and then identifies the sensitive data. In this paper we present the techniques for the privacy policy that will be created and used in our framework.
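A minimal sketch of the "extract the policy, then identify the sensitive data" step described above, assuming the policy is expressed as field-level rules plus regex patterns for free text; the field names, patterns, and sample record are hypothetical, not the paper's actual policy format.

```python
import re

# Hypothetical privacy policy: sensitive field names plus patterns for scanning text.
POLICY = {
    "sensitive_fields": {"ssn", "diagnosis", "credit_card"},
    "patterns": {
        "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
        "phone": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
    },
}

def identify_sensitive(record, policy):
    """Return the subset of a record's fields flagged as sensitive, with the reason."""
    flagged = {}
    for field, value in record.items():
        if field in policy["sensitive_fields"]:
            flagged[field] = "listed in policy"
        elif isinstance(value, str):
            for name, pattern in policy["patterns"].items():
                if pattern.search(value):
                    flagged[field] = f"matches {name} pattern"
                    break
    return flagged

# Hypothetical customer record received by the framework.
record = {"name": "A. Customer", "ssn": "123-45-6789",
          "notes": "contact me at someone@example.org"}
print(identify_sensitive(record, POLICY))
```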