Bibliography
With the rapid development of the Internet of Things (IoT) and big data, the number of network terminal devices and the volume of transmitted data are increasing rapidly, and traditional cloud computing faces a great challenge in dealing with this massive amount of data. Fog computing, which extends computation to the edge of the network, can provide both computation and data storage. Attribute-based encryption can effectively achieve fine-grained access control; however, the computational complexity of encryption and decryption grows linearly with the number of attributes. In order to reduce the computational cost and guarantee the confidentiality of data, this paper proposes distributed access control with outsourced computation in fog computing. In the proposed scheme, the fog device bears most of the computational cost in the encryption and decryption phases, so the computational cost of the sender and receiver is reduced. Moreover, the private key of the user is generated by multiple authorities, which enhances the security of the data. The analysis of security and performance shows that the proposed scheme is effective and secure.
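To make the outsourcing idea concrete, here is a minimal structural sketch in which an ElGamal-style group stands in for the paper's pairing-based ABE construction: the user hands the fog node a blinded transformation key, the fog node performs the heavy exponentiation, and the user finishes decryption with its retained blinding factor. All parameters and names are illustrative assumptions, not the scheme itself; in the real construction the user's remaining work is a single exponentiation instead of many pairings.

```python
# Structural sketch only: an ElGamal-style stand-in for outsourced decryption.
# The fog node holds tk (a blinded key); the user keeps the blinding factor z.
import math
import secrets

p = (1 << 127) - 1                       # a Mersenne prime; toy group Z_p*
g = 5
x = secrets.randbelow(p - 2) + 1         # user's long-term secret key
y = pow(g, x, p)                         # public key

while True:                              # blinding factor, invertible mod p-1
    z = secrets.randbelow(p - 2) + 1
    if math.gcd(z, p - 1) == 1:
        break
tk = (x * pow(z, -1, p - 1)) % (p - 1)   # transformation key given to the fog

def encrypt(m, y):
    k = secrets.randbelow(p - 2) + 1
    return pow(g, k, p), (m * pow(y, k, p)) % p

def fog_partial_decrypt(c1):
    return pow(c1, tk, p)                # heavy step, done by the fog device

def user_final_decrypt(partial, c2):
    s = pow(partial, z, p)               # (c1^(x/z))^z = c1^x mod p
    return (c2 * pow(s, -1, p)) % p      # recover m

c1, c2 = encrypt(42, y)
assert user_final_decrypt(fog_partial_decrypt(c1), c2) == 42
```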
As cloud services enter the Internet market, cloud security issues are gradually being exposed. In the era of the knowledge economy, the unique potential value of big data is gradually being explored, but controlling data security faces many challenges. Based on the development status and characteristics of databases in the cloud environment, this paper presents a preliminary study of the database security risks faced by the "three clouds" of the State Grid Corporation of China. Building on mature information security standards, it studies the database security requirements of the cloud environment in depth and presents a six-step method for cloud database protection, which plays an important role in advancing security work for cloud databases. Four key technologies of cloud database security protection are introduced: database firewall technology, sensitive data encryption, production data desensitization, and database security audit technology. The work supports the popularization of graded protection for cloud database security and contributes to the security construction of the state grid.
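As one concrete illustration of the desensitization technique listed above, the following hedged sketch masks a phone number and deterministically pseudonymizes a name before production data leaves the secure zone. The field names, masking rule, and key handling are invented for illustration, not taken from the paper.

```python
# Illustrative production-data desensitization: deterministic pseudonyms
# (so joins across tables still work) plus structural masking of phone digits.
import hashlib
import hmac
import re

SECRET = b"site-local pseudonymization key"   # assumption: kept off the test site

def pseudonymize(value: str) -> str:
    # deterministic replacement: the same input always maps to the same token
    return hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()[:12]

def mask_phone(phone: str) -> str:
    # keep enough structure for testing, hide the middle digits
    return re.sub(r"(\d{3})\d{4}(\d{4})", r"\1****\2", phone)

row = {"name": "Zhang San", "phone": "13812345678"}
safe_row = {"name": pseudonymize(row["name"]), "phone": mask_phone(row["phone"])}
print(safe_row)   # e.g. {'name': '3f2a...', 'phone': '138****5678'}
```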
With the advent of the big data era, information systems have exhibited some new features, including boundary obfuscation, system virtualization, unstructured and diversified data types, and low coupling between functions and data. These features not only lead to a big difference between big data technology (DT) and information technology (IT), but also drive the upgrading and evolution of network security technology. In response to these changes, this paper compares the characteristics of the IT era and the DT era, and then proposes four DT security principles: privacy, integrity, traceability, and controllability, as well as an active and dynamic defense strategy based on "propagation prediction, audit prediction, dynamic management and control". We further discuss the security challenges faced by DT and the corresponding assurance strategies. On this basis, big data security technologies can be divided into four levels: elimination, continuation, improvement, and innovation. These technologies are analyzed, organized, and explained according to six categories: access control, identification and authentication, data encryption, data privacy, intrusion prevention, and security audit and disaster recovery. The results will support the evolution of security technologies in the DT era, the construction of big data platforms, the design of security assurance strategies, and the choice of security technologies suitable for big data.
The wide use of smart devices has given rise to huge amounts of information, and this information requires new handling methods; from that perspective the concept of big data arose. Most of the attention on big data has gone to handling the data without concentrating on its security. Encryption is the standard means of keeping data safe from malicious users, but ordinary encryption methods are not suitable for big data. Selective encryption is a method that encrypts only the important part of the message, yet evaluating which part of the message is important involves uncertainty, and a problem arises when an important part is left unencrypted. This is the motivation of the paper. We propose a security framework that secures both the important and the unimportant portions of the message to overcome this uncertainty, with each portion using a different encryption technique for better performance without losing security. The framework encrypts the important parts of the message with a strong algorithm and the remaining parts with a medium-strength algorithm; the importance of a word is defined according to how frequently its root form appears. The framework is deployed on Amazon EC2 (Elastic Compute Cloud). A comparison between the proposed framework, the full encryption method, and the Toss-A-Coin method is performed with respect to encryption time and throughput. The results show that the proposed method achieves better encryption time and throughput than full encryption.
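A hedged sketch of the selective-encryption idea follows, treating the most frequent word roots as the "important" part (one possible reading of the paper) and sealing them with AES-256-GCM while the rest get a deliberately cheap toy cipher. It relies on the third-party cryptography package, and the stemmer, thresholds, and weak cipher are illustrative stand-ins, not the paper's algorithms.

```python
# Selective encryption sketch: strong AEAD for "important" words,
# a toy XOR stream (explicitly NOT secure) for the rest.
from collections import Counter
import hashlib
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

strong_key = AESGCM.generate_key(bit_length=256)

def stem(word: str) -> str:
    return word.lower().rstrip("s")          # crude stand-in for a real stemmer

def encrypt_strong(word: bytes) -> bytes:
    nonce = os.urandom(12)
    return nonce + AESGCM(strong_key).encrypt(nonce, word, None)

def encrypt_weak(word: bytes, key: bytes = b"weak") -> bytes:
    ks = hashlib.sha256(key).digest()        # toy keystream, illustration only
    return bytes(b ^ ks[i % len(ks)] for i, b in enumerate(word))

def selective_encrypt(message: str, top_k: int = 10):
    words = message.split()
    freq = Counter(stem(w) for w in words)
    important = {s for s, _ in freq.most_common(top_k)}
    return [("S", encrypt_strong(w.encode())) if stem(w) in important
            else ("W", encrypt_weak(w.encode())) for w in words]

print(selective_encrypt("big data needs big encryption for big messages"))
```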
Nowadays, many applications involve big data, and big data analysis methods appear in many fields. As a preliminary attempt to address the challenge of big data analysis, this paper presents a distributed online learning algorithm based on differential privacy. Since online learning can effectively process sensitive data, we introduce the concept of differential privacy into distributed online learning, with the aim of ensuring data privacy during learning so that adversarial nodes cannot infer any important data information. In particular, for different adversary models, we consider different graph types that tolerate either a limited number of adversaries near each regular node or a globally limited number of adversaries.
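The following minimal sketch shows the core mechanism such algorithms typically rely on: each node clips its local gradient and adds Laplace noise before sharing the update, so neighboring (possibly adversarial) nodes only ever see the perturbed value. The squared loss, epsilon, clip bound, and noise calibration below are illustrative assumptions, not the paper's exact protocol.

```python
# Differentially private online gradient step: clip, perturb, then share.
import numpy as np

rng = np.random.default_rng(0)
EPS, CLIP, LR = 0.5, 1.0, 0.1

def private_step(w, x, y):
    grad = (w @ x - y) * x                                   # squared-loss gradient
    grad *= min(1.0, CLIP / (np.linalg.norm(grad) + 1e-12))  # bound sensitivity
    # crude calibration: a real analysis ties the scale to the exact sensitivity
    noisy = grad + rng.laplace(scale=2 * CLIP / EPS, size=grad.shape)
    return w - LR * noisy                                    # share `noisy`, never raw data

w = np.zeros(3)
for t in range(100):                                         # simulated data stream
    x = rng.normal(size=3)
    y = x @ np.array([1.0, -2.0, 0.5])
    w = private_step(w, x, y)
print(w)   # drifts toward the true weights, blurred by privacy noise
```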
CFRS (Collaborative Filtering Recommendation System) is one of the most widely used individualized recommendation systems. However, CFRS is susceptible to shilling attacks based on profile injection. Current research on shilling attacks mainly focuses on recognizing false user profiles, but these methods depend on specific attack models and carry a huge computational cost. From the item perspective, several abnormal-item detection methods have been proposed that are independent of attack models and overcome the defects of user-profile models, but their detection rate, false-alarm rate, and time overhead need further improvement. To address these problems, this paper proposes an abnormal-item detection method based on time-window merging. The method first uses small windows to partition the rating time series and determines whether each window is suspicious based on the number of abnormal ratings within it. Then, the suspicious small windows are merged to form suspicious intervals. Four rating-distribution characteristics over the suspicious intervals, RAR (Ratio of Abnormal Rating), ATIAR (Average Time Interval of Abnormal Rating), DAR (Deviation of Abnormal Rating), and DTIAR (Deviation of Time Interval of Abnormal Rating), are used to determine whether an item is under attack. Experimental results on the MovieLens 100K data set show that the method achieves a high detection rate and a low false-alarm rate.
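A hedged sketch of the partition-and-merge step is given below. "Abnormal" is approximated as a rating far from the item's mean, and the window size, thresholds, and the RAR computation are illustrative rather than the paper's exact definitions; the other three indicators are analogous aggregations over the same intervals.

```python
# Partition a rating time series into small windows, flag windows with many
# abnormal ratings, merge adjacent flagged windows, then compute RAR per interval.
import numpy as np

def suspicious_intervals(times, ratings, win=3600, dev=1.5, min_abn=5):
    ratings = np.asarray(ratings, dtype=float)
    abnormal = np.abs(ratings - ratings.mean()) > dev        # toy abnormality test
    order = np.argsort(times)
    t, abn = np.asarray(times)[order], abnormal[order]
    # 1) partition into fixed-size small windows
    bins = ((t - t[0]) // win).astype(int)
    flagged = {b for b in np.unique(bins) if abn[bins == b].sum() >= min_abn}
    # 2) merge adjacent suspicious windows into suspicious intervals
    intervals, cur = [], None
    for b in sorted(flagged):
        if cur is not None and b == cur[1] + 1:
            cur[1] = b
        else:
            cur = [b, b]
            intervals.append(cur)
    # 3) one of the four indicators: RAR = ratio of abnormal ratings
    rar = [abn[np.isin(bins, range(lo, hi + 1))].mean() for lo, hi in intervals]
    return intervals, rar
```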
Cyber criminals have been extensively using malicious ransomware software for years. Ransomware is a subset of malware in which the data on a victim's computer is locked, typically by encryption, and payment is demanded before the ransomed data is decrypted and access is returned to the victim. The motives for such attacks are not limited to economic gain: illegal attacks on official databases may also target people with political or social power. Although billions of dollars have been spent on preventing, or at least reducing, the tremendous losses, these malicious ransomware attacks have kept expanding and growing. It is therefore critical to perform technical analysis of such malicious code and, if possible, determine the source of the attacks. It may be almost impossible to recover the affected files due to the strong encryption imposed on them; however, determining the source of ransomware attacks has become significantly important for criminal justice. Unfortunately, there are only a few technical analyses of real-life attacks in the literature. In this work, a real-life ransomware attack on an official institute is investigated and fully analyzed using both static and dynamic methods. The results show that the source of the attack is traceable from the server's whois information.
The evolution of the microelectronics manufacturing industry is characterized by increased complexity, analysis, integration, distribution, data sharing, and collaboration, all of which are enabled by the big data explosion. This evolution affords a number of opportunities for improved productivity and quality and for reduced cost; however, it also brings a number of risks associated with maintaining the security of data systems. The International Roadmap for Devices and Systems Factory Integration International Focus Team (IRDS FI IFT) determined that a security technology roadmap for the industry is needed to better understand the needs, challenges, and potential solutions for security in the microelectronics industry and its supply chain. As a first step toward providing this roadmap, the IFT conducted a security survey, soliciting input from users, suppliers, and OEMs. Preliminary results indicate that data partitioning with IP protection is the number one topic of concern, with the need for industry-wide standards the second most important topic. Further, the "fear" of a security breach is considered a significant hindrance to Advanced Process Control efforts as well as to the use of cloud-based solutions. The IRDS FI IFT will endeavor to provide components of a security roadmap for the industry in the 2018 FI chapter, leveraging the output of the survey combined with follow-up discussions with users and consultations with experts.
When intelligent mobile terminals transfer files, ensuring the safety of data transmission is essential: files must be protected from eavesdropping and tampering in transit. Based on an analysis of encryption algorithms and covert channels, this paper proposes a double-encryption method over a covert channel, in which an asymmetric algorithm encrypts the key of a symmetric algorithm to form the hidden information, which is then transmitted covertly through the covert channel to enhance the security of mobile terminal data transmission. Simulating this scenario on an intelligent mobile terminal shows that the confidentiality and concealment of important information are preserved during transmission.
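The double-encryption construction described here is the standard hybrid pattern, so a short sketch with the third-party cryptography package can illustrate it: an RSA public key wraps a fresh AES session key, and AES-GCM encrypts the file itself. Embedding the wrapped key and ciphertext in a covert channel is the paper's contribution and is out of scope for this sketch.

```python
# Hybrid (double) encryption: asymmetric RSA-OAEP wraps the symmetric AES key.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pub = priv.public_key()

# sender: encrypt the file with a fresh AES key, then wrap that key with RSA
session_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"file bytes to transfer", None)
wrapped_key = pub.encrypt(session_key, oaep)
# (wrapped_key, nonce, ciphertext) would be the hidden information on the channel

# receiver: unwrap the session key, then decrypt the file
key = priv.decrypt(wrapped_key, oaep)
assert AESGCM(key).decrypt(nonce, ciphertext, None) == b"file bytes to transfer"
```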
Today, as surveillance systems are widely used for indoor and outdoor monitoring, there is growing interest in real-time object detection, and real-time detection and analysis have many different applications. Two-dimensional video is used in multimedia content-based indexing, information retrieval, visual surveillance, distributed cross-camera surveillance systems, human tracking, traffic monitoring, and similar applications. Tracking a moving target is of great importance for the development of national security systems in military applications. In this research, a more efficient solution is proposed in addition to existing methods: we apply YOLO, a recent approach to object detection, to military applications.
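For reference, a generic YOLOv3 inference pass with OpenCV's DNN module looks roughly like the sketch below. The file paths, thresholds, and post-processing are assumptions for illustration; this is not the authors' surveillance pipeline, and a complete detector would also apply non-maximum suppression.

```python
# Generic single-frame YOLOv3 inference via OpenCV DNN.
import cv2
import numpy as np

net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")  # assumed files
frame = cv2.imread("frame.jpg")                                   # assumed input
blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                             swapRB=True, crop=False)
net.setInput(blob)
outputs = net.forward(net.getUnconnectedOutLayersNames())

h, w = frame.shape[:2]
for out in outputs:
    for det in out:                 # det = [cx, cy, bw, bh, objectness, scores...]
        scores = det[5:]
        cls, conf = int(np.argmax(scores)), float(scores.max())
        if conf > 0.5:
            cx, cy, bw, bh = det[:4] * np.array([w, h, w, h])
            print(f"class {cls} conf {conf:.2f} "
                  f"box ({cx - bw/2:.0f},{cy - bh/2:.0f},{bw:.0f},{bh:.0f})")
```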
With the rapid development of big data technology, requirements for data processing capacity and efficiency have caused a number of legacy security technologies to fail, especially in the data security domain, and data security risks have become extremely important for big data usage. We introduce a novel method for big data security control, which comprises three steps: user context recognition based on zero trust, fine-grained data access authentication control, and data access audit based on full network traffic, in order to recognize and intercept risky data access in a big data environment. Experiments on a drug-related information analysis system show that this fine-grained, zero-trust-based method can identify the majority of data security risks.
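A toy sketch of the three steps follows, with invented field names and thresholds: score the user's context on every request (the zero-trust stance), apply a fine-grained attribute rule, and append an audit record for each decision so risky access can be reviewed later.

```python
# Zero-trust style access decision with per-request context scoring and audit.
import json
import time

def context_score(ctx):
    score = 0
    score += 40 if ctx["device_registered"] else 0
    score += 30 if ctx["network"] == "corporate" else 10
    score += 30 if ctx["recent_mfa"] else 0
    return score                                  # 0..100, thresholds illustrative

def authorize(user, ctx, resource, audit_log):
    allowed = (context_score(ctx) >= 70
               and resource["sensitivity"] <= user["clearance"]
               and resource["project"] in user["projects"])
    audit_log.append(json.dumps({"ts": time.time(), "user": user["id"],
                                 "resource": resource["id"], "allowed": allowed}))
    return allowed

audit = []
user = {"id": "u1", "clearance": 2, "projects": {"drug-analysis"}}
ctx = {"device_registered": True, "network": "corporate", "recent_mfa": True}
res = {"id": "case-db", "sensitivity": 2, "project": "drug-analysis"}
print(authorize(user, ctx, res, audit), audit[-1])
```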
With the development of cloud computing, the topological properties of data center networks have become important to computing resources. Recently, a data center network structure called BCCC was proposed; it is recursively built and has many good properties, including expandability. Hamiltonicity and expandability play an extremely important role in network communication. This paper describes the Hamiltonicity and expandability of the BCCC structure and their important role in network traffic.
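To make the Hamiltonicity property concrete, the brute-force check below searches a small graph for a Hamiltonian cycle (a cycle visiting every node exactly once). The paper establishes the property for BCCC constructively; this exponential backtracking search is only feasible on tiny examples and is purely illustrative.

```python
# Backtracking search for a Hamiltonian cycle in an adjacency-list graph.
def hamiltonian_cycle(adj):
    n = len(adj)
    path, seen = [0], {0}

    def extend():
        if len(path) == n:
            return path[0] in adj[path[-1]]   # can we close the cycle?
        for v in adj[path[-1]]:
            if v not in seen:
                path.append(v)
                seen.add(v)
                if extend():
                    return True
                path.pop()
                seen.remove(v)
        return False

    return path[:] if extend() else None

# 4-cycle 0-1-2-3-0: prints [0, 1, 2, 3]
print(hamiltonian_cycle({0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}))
```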
An abundance of data in many disciplines of science, engineering, national security, health care, and business has led to the emerging field of Big Data Analytics that runs in a cloud computing environment. To process massive quantities of data in the cloud, developers leverage Data-Intensive Scalable Computing (DISC) systems such as Google's MapReduce, Hadoop, and Spark. Currently, developers do not have easy means to debug DISC applications. The use of cloud computing makes application development feel more like batch jobs, and the nature of debugging is therefore post-mortem. Developers of big data applications write code that implements a data processing pipeline and test it on their local workstation with a small sample of data downloaded from a TB-scale data warehouse. They cross their fingers and hope that the program works in the expensive production cloud. When a job fails or they get a suspicious result, data scientists spend hours guessing at the source of the error, digging through post-mortem logs. In such cases, the data scientists may want to pinpoint the root cause of errors by investigating a subset of corresponding input records. The vision of my work is to provide interactive, real-time, and automated debugging services for big data processing programs in modern DISC systems with minimum performance impact. My work investigates the following research questions in the context of big data analytics: (1) What are the necessary debugging primitives for interactive big data processing? (2) What scalable fault localization algorithms are needed to help the user localize and characterize the root causes of errors? (3) How can we improve testing efficiency during iterative development of DISC applications by reasoning about the semantics of dataflow operators and the user-defined functions used inside them in tandem? To answer these questions, we synthesize and innovate ideas from software engineering, big data systems, and program analysis, and coordinate innovations across the software stack from the user-facing API all the way down to the systems infrastructure.
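One debugging primitive this vision calls for is isolating a small failing subset of input records. The sketch below does this by recursive halving, in the spirit of delta debugging; the failure oracle is a hypothetical stand-in for rerunning the pipeline on a record subset, and a full implementation would handle failures that need records from both halves.

```python
# Narrow a failing input down to a small culprit subset by recursive halving.
def isolate(records, pipeline_fails):
    if len(records) <= 1:
        return records
    mid = len(records) // 2
    left, right = records[:mid], records[mid:]
    if pipeline_fails(left):
        return isolate(left, pipeline_fails)
    if pipeline_fails(right):
        return isolate(right, pipeline_fails)
    return records          # failure needs records from both halves; stop here

# stand-in oracle: pretend record 42 crashes the job
records = list(range(100))
print(isolate(records, lambda rs: 42 in rs))   # -> [42]
```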
In big data environments with large numbers of users and high volumes of data, we need to manage a correspondingly huge number of security policies. Because these policies are managed in a distributed way, they may contain several anomalies, such as conflicts and redundancies, which may lead to both safety and availability problems. The distributed systems guided by such security policies produce huge numbers of access logs, and due to potential security breaches, these logs may reveal non-allowed accesses, which may also be a consequence of conflicting rules in the policies. In this paper, we present ongoing work on an environment for verifying and correcting security policies. To make the approach efficient, an access log is used as input to determine suspicious parts of the policy that should be considered, and both the policy and the access log are clustered so that the obtained clusters can be treated separately. The clustering technique and the use of the access log significantly reduce the complexity of the suggested approach, making it scalable for large amounts of data.
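On a toy rule format (subject, resource, action, effect), the basic anomaly checks can be sketched as below. The paper's clustering and log-guided narrowing are omitted; what remains are the elementary conflict and redundancy tests plus a check of logged accesses against the policy, all with invented example data.

```python
# Detect conflicting and redundant rules, and flag uncovered logged accesses.
from collections import defaultdict

rules = [("alice", "/data/hr",  "read",  "allow"),
         ("alice", "/data/hr",  "read",  "deny"),    # conflict with rule above
         ("bob",   "/data/fin", "write", "allow"),
         ("bob",   "/data/fin", "write", "allow")]   # redundancy

effects = defaultdict(set)
count = defaultdict(int)
for subj, res, act, eff in rules:
    effects[(subj, res, act)].add(eff)
    count[(subj, res, act, eff)] += 1

conflicts = [k for k, v in effects.items() if len(v) > 1]
redundant = [k for k, v in count.items() if v > 1]

# flag accesses in the log that no "allow" rule covers
log = [("carol", "/data/hr", "read")]
suspicious = [e for e in log if "allow" not in effects.get(e, set())]
print(conflicts, redundant, suspicious)
```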
Recently, IoT, 5G mobile, big data, and artificial intelligence have been increasingly used in the real world. These technologies converge in Cyber-Physical Systems (CPS), which require core technologies to ensure reliability, real-time behavior, safety, autonomy, and security. A CPS connects cyberspace and physical space, so attacks from cyberspace spill over into the real world and cause substantial damage. The personal information handled in a CPS is highly confidential, so policies and techniques are needed to protect against attacks in advance; if a CPS is attacked, not only personal information but also national confidential data can be leaked. To prevent this, risk is measured using the Factor Analysis of Information Risk (FAIR) model, which can measure risk element by element for situational awareness in the CPS environment. To reduce risk by preventing attacks in the CPS, this paper measures risk after applying the concept of Crime Prevention Through Environmental Design (CPTED).
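A worked FAIR-style estimate can be sketched as a small Monte Carlo simulation: loss-event frequency is threat-event frequency times vulnerability, and annualized risk is loss-event frequency times loss magnitude. All the input ranges below are invented for illustration; a real FAIR assessment calibrates them per risk element.

```python
# Monte Carlo estimate of annualized loss in the FAIR decomposition.
import random

def fair_risk(trials=100_000):
    total = 0.0
    for _ in range(trials):
        tef = random.uniform(2, 10)           # threat events per year (assumed)
        vuln = random.uniform(0.1, 0.4)       # P(event becomes a loss) (assumed)
        magnitude = random.uniform(5e4, 5e5)  # loss per event, dollars (assumed)
        total += tef * vuln * magnitude       # LEF x LM, with LEF = TEF x Vuln
    return total / trials                     # expected annualized loss

print(f"estimated annualized loss: ${fair_risk():,.0f}")
```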
Malicious traffic has garnered more attention in recent years, owing to the rapid growth of information technology. In 2007 alone, losses from malware attacks were estimated at 13 billion dollars. Malware data in today's context is massive, and understanding such volumes of information with primitive methods would be tedious. In this publication we apply two established machine learning techniques, a multilayer perceptron (MLP) and the J48 decision tree (the Weka implementation of C4.5, a successor of ID3), to our selected dataset, Advanced Security Network Metrics & Non-Payload-Based Obfuscations (ASNM-NPBO), to show that the answer to managing cyber security threats lies in these methodologies.
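For orientation, the two classifier families can be reproduced in a few lines of scikit-learn on synthetic stand-in data (the ASNM-NPBO dataset itself must be obtained separately). Note that scikit-learn's decision tree is CART with an entropy criterion, used here only as a rough stand-in for Weka's J48/C4.5.

```python
# MLP and decision-tree baselines on synthetic data standing in for ASNM-NPBO.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("MLP",  MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500)),
                  ("Tree", DecisionTreeClassifier(criterion="entropy"))]:
    clf.fit(Xtr, ytr)
    print(name, f"accuracy = {clf.score(Xte, yte):.3f}")
```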