Bibliography
Privacy is now paramount in everyone's life. With the growth of the IoT (Internet of Things), wearable devices are becoming widely popular for real-time user monitoring and intelligent service support. However, in contrast with traditional short-range communications, these resource-constrained devices face various vulnerabilities and security threats during their interactions. Designing a security solution for these devices while dealing with their limited communication and computation capabilities is therefore a challenging task. In this work, a PUF (Physical Unclonable Function) and lightweight cryptographic parameters are used together to perform two-way authentication between wearable devices and a smartphone, while simultaneous verification is performed by providing yoking-proofs to the cloud server. Finally, it is shown that the proposed scheme satisfies many security requirements while remaining flexible and lightweight.
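To make the PUF-based authentication idea concrete, here is a minimal sketch of the challenge-response pattern such schemes rest on. A real PUF derives its response from physical hardware variation; in this illustration an HMAC keyed with a device-unique secret stands in for that behavior, and all names are hypothetical rather than taken from the paper's protocol.

```python
# Hypothetical sketch of PUF-style challenge-response authentication.
# A real PUF derives the response from hardware variation; an HMAC keyed
# with a device-unique secret stands in for it here.
import hmac, hashlib, os

DEVICE_SECRET = os.urandom(32)  # stands in for the device's physical fingerprint

def puf_response(challenge: bytes) -> bytes:
    """Device side: derive a response from the verifier's challenge."""
    return hmac.new(DEVICE_SECRET, challenge, hashlib.sha256).digest()

# Enrollment: the verifier stores one challenge-response pair (CRP).
challenge = os.urandom(16)
stored_response = puf_response(challenge)

# Authentication: the verifier replays the challenge and compares responses.
claimed = puf_response(challenge)
assert hmac.compare_digest(claimed, stored_response)
print("device authenticated")
```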
Developments in IoT applications have made wearable devices a popular choice for collecting user data in order to monitor users and provide intelligent service support. Since wearable devices continuously collect and transport a user's sensitive data over the network, the security challenges are heightened. Moreover, wearable devices lack the computational capabilities of traditional short-range communication devices. In this paper, the authors propose a Yoking-Proof-based remote Authentication scheme for Cloud-aided Wearable devices (YPACW), which combines PUFs and cryptographic functions to achieve mutual authentication between the wearable devices and a smartphone via a cloud server, performing simultaneous verification of these devices using the established yoking-proofs. Relative to Liu et al.'s scheme, YPACW achieves better results, reducing communication and processing costs significantly.
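The yoking-proof concept itself is worth a sketch: two devices chain MACs over a verifier nonce so the cloud can later check that both were present in the same session. The key names and message layout below are illustrative only, not the YPACW wire format.

```python
# Hedged sketch of a yoking (grouping) proof between two devices.
import hmac, hashlib, os

K_A, K_B = os.urandom(32), os.urandom(32)  # keys shared with the cloud

def mac(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()

nonce = os.urandom(16)              # issued by the verifier (cloud)
m_a = mac(K_A, nonce)               # device A answers the nonce
m_b = mac(K_B, nonce + m_a)         # device B binds its answer to A's
proof = (nonce, m_a, m_b)

# Cloud-side verification: recompute both MACs; success implies both
# devices participated in the same session.
n, a, b = proof
assert hmac.compare_digest(a, mac(K_A, n))
assert hmac.compare_digest(b, mac(K_B, n + a))
print("yoking proof verified")
```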
In recent times, cloud services have been widely adopted, and as a consequence attacks on cloud devices have multiplied. One major attack is the DDoS (distributed denial-of-service) attack, which has notably targeted Memcached, a caching system developed to speed up websites and networks through Memcached's in-memory database. A DDoS attack tries to overwhelm the database by flooding the targeted server with internet traffic. Attackers send spoofed requests to vulnerable UDP Memcached servers, manipulating the apparent identity of a legitimate sender. In this work, we propose a vector quantization approach based on supervised deep learning to detect Memcached attacks performed by malicious firmware on different types of cloud-attached devices. The approach detects DDoS attacks performed by malicious firmware on different types of cloud devices, and also classifies which applications are vulnerable to attacks on cloud-based services. Results computed during testing show 98.2% true positives and 0.034% false negatives.
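As a rough illustration of vector-quantization-based classification (not the paper's exact model), the LVQ1-style sketch below assigns each traffic sample to the nearest learned codebook vector, whose label decides benign versus attack. Features and data are toy values.

```python
# Illustrative LVQ-style vector quantization classifier for traffic samples.
import numpy as np

rng = np.random.default_rng(0)
# Toy feature vectors, e.g. (packet rate, mean payload size), scaled to [0, 1].
benign = rng.normal([0.2, 0.3], 0.05, (200, 2))
attack = rng.normal([0.8, 0.9], 0.05, (200, 2))
X = np.vstack([benign, attack])
y = np.array([0] * 200 + [1] * 200)

# One codebook vector per class, nudged toward/away from samples (LVQ1 rule).
codebook = np.array([[0.5, 0.5], [0.6, 0.6]])
labels = np.array([0, 1])
lr = 0.05
for x, t in zip(X, y):
    i = np.argmin(np.linalg.norm(codebook - x, axis=1))  # nearest prototype
    sign = 1.0 if labels[i] == t else -1.0               # attract or repel
    codebook[i] += sign * lr * (x - codebook[i])

def predict(x):
    return labels[np.argmin(np.linalg.norm(codebook - x, axis=1))]

print(predict(np.array([0.85, 0.88])))  # expected: 1 (attack)
```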
The number of applications and services hosted on cloud platforms is constantly increasing; more and more applications are hosted as services, co-existing with other services in a mutually untrusted environment. Facilities such as virtual machines, containers, and encrypted communication channels aim to offer isolation between the various applications and protect sensitive user data. However, such techniques are not always able to provide a secure execution environment for sensitive applications, nor do they offer guarantees that data are not monitored by an honest-but-curious provider once they reach the cloud infrastructure. Recent advancements in trusted execution environments within commodity processors, such as Intel SGX, provide a secure reverse sandbox, where code and data are isolated even from the underlying operating system. Moreover, Intel SGX provides a remote attestation mechanism, allowing the communicating parties to verify their identity as well as prove that code is executed in hardware-assisted software enclaves. Many approaches try to ensure code and data integrity and to enforce channel encryption schemes such as TLS; however, these techniques are not enough to achieve complete isolation and secure communication without hardware assistance, or they are not efficient in terms of performance. In this work, we design and implement a practical attestation system that allows the service provider to offer a seamless attestation service between the hosted applications and the end clients. Furthermore, we implement a novel caching system that is capable of eliminating the latencies introduced by the remote attestation process. Our approach allows the parties to attest one another before each communication attempt, with improved performance compared to a standard TLS handshake.
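A hedged sketch of the caching idea: attestation verdicts are keyed by the enclave measurement and expire after a TTL, so repeated connections skip the costly remote attestation round trip. Names and structure are illustrative assumptions, not the paper's implementation.

```python
# Sketch of caching attestation verdicts keyed by enclave measurement.
import time

class AttestationCache:
    def __init__(self, ttl_seconds: float = 300.0):
        self.ttl = ttl_seconds
        self._verdicts = {}  # enclave measurement (hex) -> (ok, timestamp)

    def check(self, measurement: str, attest_fn):
        """Return a cached verdict if still fresh, otherwise attest and cache."""
        entry = self._verdicts.get(measurement)
        if entry and time.monotonic() - entry[1] < self.ttl:
            return entry[0]
        ok = attest_fn(measurement)   # expensive remote attestation round trip
        self._verdicts[measurement] = (ok, time.monotonic())
        return ok

cache = AttestationCache(ttl_seconds=60)
slow_attest = lambda m: True  # placeholder for real SGX quote verification
assert cache.check("a1b2c3", slow_attest)   # performs attestation
assert cache.check("a1b2c3", slow_attest)   # served from cache
```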
Enforcing security and resilience in a cloud platform is an essential but challenging problem due to the presence of a large number of heterogeneous applications running on shared resources. A security analysis system that can detect threats or malware must exist inside the cloud infrastructure. Much research has been done on machine-learning-driven malware analysis, but existing approaches are limited in computational complexity and detection accuracy. To overcome these drawbacks, we propose a new malware detection system based on clustering and Trend Micro Locality Sensitive Hashing (TLSH). We used the Cuckoo sandbox, which provides dynamic analysis reports by executing files in an isolated environment, and a novel feature extraction algorithm to extract essential features from the resulting malware reports. The most important features are then selected using principal component analysis (PCA), random forest, and Chi-square feature selection methods. Subsequently, experimental results are obtained for clustering and non-clustering approaches on three classifiers: Decision Tree, Random Forest, and Logistic Regression. The model shows better classification accuracy and false positive rate (FPR) than state-of-the-art works and the non-clustering approach, at significantly lower computational cost.
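A minimal sketch of TLSH-based grouping, assuming the py-tlsh package (`pip install py-tlsh`): samples within a distance threshold land in the same cluster. The threshold value and the single-linkage-style loop are simplifications, not the paper's clustering method.

```python
# Minimal TLSH similarity clustering sketch (assumes py-tlsh is installed).
import os
import tlsh

samples = [os.urandom(512) for _ in range(5)]   # stand-ins for report bytes
digests = [tlsh.hash(s) for s in samples]

THRESHOLD = 100  # smaller TLSH diff = more similar; value is illustrative
clusters = []
for d in digests:
    for cluster in clusters:
        if tlsh.diff(d, cluster[0]) <= THRESHOLD:
            cluster.append(d)
            break
    else:
        clusters.append([d])
print(f"{len(clusters)} cluster(s) from {len(digests)} digests")
```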
The Open Data Cube (ODC) initiative, with support from the Committee on Earth Observation Satellites (CEOS) System Engineering Office (SEO), has developed a state-of-the-art suite of software tools and products to facilitate the analysis of Earth Observation data. This paper presents a short summary of a novel architecture, developed in a project related to the ODC community, that provides users with their own ODC sandbox environment for running Jupyter notebooks that leverage the ODC. This layout removes the need to host multiple users on a single Jupyter notebook server and provides better management tooling for handling resource usage. Each user receives their own credentials, which grant access to a personal Jupyter notebook server with a fully deployed ODC environment, enabling exploration of solutions to problems that can be supported by Earth observation data.
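One plausible way to realize per-user notebook sandboxes of this kind is a JupyterHub configuration with a container spawner; the fragment below is a hypothetical sketch under that assumption (the image name and limits are invented, and the paper may use different tooling entirely).

```python
# Hypothetical jupyterhub_config.py fragment: each authenticated user gets an
# isolated, resource-capped container built from an ODC-enabled notebook image.
c = get_config()  # noqa: F821 - injected by JupyterHub at load time

c.JupyterHub.spawner_class = "dockerspawner.DockerSpawner"
c.DockerSpawner.image = "example/odc-sandbox-notebook:latest"  # hypothetical
c.DockerSpawner.mem_limit = "4G"       # per-user resource cap
c.DockerSpawner.cpu_limit = 2
c.DockerSpawner.remove = True          # discard the container on logout
# Persist each user's notebooks in a named volume.
c.DockerSpawner.volumes = {"odc-user-{username}": "/home/jovyan/work"}
```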
The dynamicity and complexity of clouds highlight the importance of automated root cause analysis solutions for explaining what might have caused a security incident. Most existing works focus on either locating malfunctioning cloud components, e.g., switches, or tracing changes at lower abstraction levels, e.g., system calls. A management-level solution, on the other hand, can provide a big picture of the root cause in a more scalable manner. In this paper, we propose DOMINOCATCHER, a novel provenance-based solution for explaining the root cause of security incidents in terms of management operations in clouds. Specifically, we first define our provenance model to capture the interdependencies between cloud management operations, virtual resources, and inputs. Based on this model, we design a framework to intercept cloud management operations and to extract and prune provenance metadata. We implement DOMINOCATCHER on the OpenStack platform as attached middleware and validate its effectiveness using security incidents based on real-world attacks. We also evaluate its performance through experiments on our testbed; the results demonstrate that DOMINOCATCHER incurs insignificant overhead and is scalable for clouds.
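The core mechanism can be illustrated with a toy provenance graph: management operations and virtual resources form a dependency graph, and the root cause of an incident is found by walking ancestors back from the affected resource. Event names below are invented, and the real model is considerably richer.

```python
# Illustrative provenance backtracking over a toy dependency graph.
from collections import defaultdict

edges = defaultdict(list)  # effect -> causes it depends on

def record(effect: str, cause: str):
    edges[effect].append(cause)

# Captured management operations (toy trace).
record("vm_1", "op:create_vm(user=alice)")
record("port_9", "vm_1")
record("port_9", "op:update_port(allowed_pairs=*)")   # suspicious change
record("incident:spoofed_traffic", "port_9")

def root_causes(node, seen=None):
    """Return the leaf ancestors (originating operations) of a node."""
    seen = seen or set()
    parents = edges.get(node, [])
    if not parents:
        return {node}
    out = set()
    for p in parents:
        if p not in seen:
            seen.add(p)
            out |= root_causes(p, seen)
    return out

print(root_causes("incident:spoofed_traffic"))
# -> {'op:create_vm(user=alice)', 'op:update_port(allowed_pairs=*)'}
```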
Cloud computing (CC) systems have become the prevailing computational paradigm for offering massively scalable and elastic services. Computing resources in the cloud environment must be scheduled so that providers utilize their resources efficiently and users obtain low-cost applications. The most prominent need in job scheduling is to ensure quality of service (QoS) for the user. Because scheduling takes place within the boundary of a third party, assuring its security is a significant requirement. The main objective of our work is to offer QoS, i.e., cost, makespan, and minimized task migration, with security enforcement; moreover, the proposed algorithm guarantees that admitted requests are executed without violating the service level agreement (SLA). These objectives are attained by the proposed Fuzzy Ant Bee Colony algorithm. Experimental outcomes confirm that the proposed algorithm achieves secure job scheduling with assured QoS.
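As a hedged sketch of the kind of multi-objective fitness such metaheuristics minimize, the function below combines cost, makespan, and migrations, with hard penalties for SLA violations and insecure placements. The weights and penalty terms are assumptions, not the paper's exact formulation.

```python
# Sketch of a multi-objective scheduling fitness with SLA/security penalties.
def fitness(schedule, w_cost=0.4, w_makespan=0.4, w_migration=0.2,
            sla_deadline=100.0, penalty=1000.0):
    cost = sum(task["cost"] for task in schedule)
    makespan = max(task["finish"] for task in schedule)
    migrations = sum(task["migrations"] for task in schedule)
    value = w_cost * cost + w_makespan * makespan + w_migration * migrations
    # Hard penalties keep SLA-violating or insecure schedules out of the swarm.
    if makespan > sla_deadline:
        value += penalty
    if any(not task["secure_vm"] for task in schedule):
        value += penalty
    return value

schedule = [
    {"cost": 3.0, "finish": 40.0, "migrations": 0, "secure_vm": True},
    {"cost": 5.0, "finish": 90.0, "migrations": 1, "secure_vm": True},
]
print(fitness(schedule))  # lower is better; here 39.4, no penalties
```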
Cloud, Software-Defined Networking (SDN), and Network Function Virtualization (NFV) technologies have introduced a new era of cybersecurity threats and challenges. To protect cloud infrastructure, in our earlier work we proposed the Software Defined Security Service (SDS2) to tackle security challenges centered around a new policy-based interaction model. The security architecture consists of three main components: a Security Controller, Virtual Security Functions (VSF), and a Sec-Manage Protocol. The architecture requires an agile, purpose-built protocol to transfer interaction parameters and security messages between its components, since OpenFlow serves mainly as a network routing protocol. The Sec-Manage protocol has therefore been designed specifically for exchanging policy-based interaction parameters among cloud entities, between the Security Controller and its VSFs. This paper focuses on the design and implementation of the Sec-Manage protocol and demonstrates its use in setting, monitoring, and conveying relevant policy-based interaction security parameters.
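To ground the controller-to-VSF interaction, here is a purely hypothetical sketch of what a policy-setting message might look like; the field names and JSON encoding are invented for illustration and are not the Sec-Manage specification.

```python
# Hypothetical controller-to-VSF policy message (illustrative fields only).
import json
from dataclasses import dataclass, asdict

@dataclass
class SecManageMessage:
    msg_type: str       # e.g. "SET_POLICY", "MONITOR", "ALERT"
    vsf_id: str         # target virtual security function
    policy_id: str
    parameters: dict    # policy-based interaction parameters

    def encode(self) -> bytes:
        return json.dumps(asdict(self)).encode()

msg = SecManageMessage(
    msg_type="SET_POLICY",
    vsf_id="vsf-firewall-01",
    policy_id="deny-east-west-23",
    parameters={"src": "tenant-a/*", "dst": "tenant-b/*", "action": "drop"},
)
print(msg.encode())
```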
The rapid development of cloud computing and the arrival of the big data era have brought users and the cloud closer together. Cloud computing has powerful data computing and data storage capabilities and can ubiquitously provide users with resources. However, users do not fully trust the cloud server's storage services, so much data is encrypted before being uploaded to the cloud. Searchable encryption can protect the confidentiality of data while providing retrieval functions over the encrypted data. In this paper, we propose a time-controlled searchable encryption scheme with regular language support over encrypted big data, which provides flexible search patterns and convenient data sharing. Our solution allows users who hold the data's secret keys to generate trapdoors by themselves, while users without the secret keys can generate trapdoors with the help of a trusted third party, without revealing the data owner's secret key. The system uses a time-controlled mechanism to collect the keywords queried by users and ensures that the querying user's identity is not directly exposed; the collected keywords form the basis for subsequent big data analysis. We conducted a security analysis of the proposed scheme and proved that it is secure. Simulation experiments and comparisons show that the system achieves practical efficiency.
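A greatly simplified sketch of the searchable-encryption core: keywords are indexed under deterministic HMAC tokens so the server can match a trapdoor without seeing plaintext keywords. The paper's scheme (regular-language queries, trusted third party, time control) is far richer; this shows only the basic trapdoor idea, with invented names.

```python
# Simplified searchable symmetric encryption: trapdoor lookup on an index.
import hmac, hashlib, os

k_search = os.urandom(32)  # data owner's search key

def trapdoor(keyword: str) -> str:
    return hmac.new(k_search, keyword.encode(), hashlib.sha256).hexdigest()

# Owner builds the encrypted index: token -> (encrypted) document ids.
index = {trapdoor("diabetes"): ["doc_7", "doc_12"],
         trapdoor("hypertension"): ["doc_3"]}

# A querying user (holding the key, or aided by the third party) sends a
# trapdoor; the server matches tokens without learning the keyword.
print(index.get(trapdoor("diabetes"), []))  # -> ['doc_7', 'doc_12']
```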
With the advent of cloud computing, a new era of computing has come into existence. There are undoubtedly numerous advantages associated with cloud computing, but there is another side of the picture too: the associated challenges demand a more convincing answer as far as the security of data at rest, in process, and in transit is concerned. This paper puts forth a cloud computing model that addresses these data security questions in terms of four cryptographic techniques, namely Homomorphic Encryption (HE), Verifiable Computation (VC), Secure Multi-Party Computation (SMPC), and Functional Encryption (FE). It surveys these important existing cryptographic tools/techniques through a proposed cloud computation model that can be used for big data applications. The tools are also considered in terms of the CIA triad, and are then analyzed by comparing them on the basis of certain parameters of concern.
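Of the four techniques, homomorphic encryption is the easiest to demonstrate concretely. The toy Paillier example below shows the additive homomorphism HE relies on: two values are encrypted, the ciphertexts are multiplied, and decryption yields their sum. Tiny primes are used for readability only; this is utterly insecure in practice.

```python
# Toy Paillier: decrypt(Enc(a) * Enc(b) mod n^2) == a + b.
import math, random

p, q = 61, 53
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)         # Carmichael's function for n = p*q
mu = pow(lam, -1, n)                 # valid simplification for g = n + 1

def encrypt(m: int) -> int:
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n) * mu % n

c1, c2 = encrypt(42), encrypt(100)
print(decrypt((c1 * c2) % n2))  # -> 142: addition computed on ciphertexts
```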
Data mining visualization is an important aspect of big data visualization and analysis. This paper studies the impact of nature-inspired algorithms, along with the impact of established computing traditions, on the complete visualization of storage and data-communication needs. It also explores the possibilities of hybridizing data mining with cloud computing, and examines the data-analytical view of these approaches with respect to data storage in big data. On this basis, the methodological advances and open problem statements are analyzed, which should help in exploring computational capability and gaining new insights in this domain.
In cyber-physical systems, cybersecurity and data privacy are among the most critical considerations when dealing with the communication, processing, and storage of data. Geospatial data and medical data are examples of big data that require seamless integration with computational algorithms, as outlined in Industry 4.0, toward adoption of the fourth industrial revolution. Healthcare Industry 4.0 applies the design principles of Industry 4.0 to the medical domain. Mobile applications are now widely used to accomplish important business functions in almost all industries. These mobile devices, however, are resource-poor and have proved insufficient for many important medical applications. In the mobile cloud computing paradigm, resource-rich cloud services are used to augment poor mobile device resources for data- and compute-intensive applications. However, the performance of cloud services is undesirable for data-intensive, latency-sensitive mobile applications due to the increased hop count between the mobile device and the cloud server. Cloudlets, virtual machines hosted on a server placed near the mobile device, offer an attractive alternative in the form of mobile edge computing. This paper outlines cybersecurity and data privacy aspects of communicating measured patient data from wearable wireless biosensors to a nearby cloudlet host server, in order to facilitate cloudlet-based preliminary and essential complex analytics for medical big data.
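One privacy measure this setting calls for is authenticated encryption of biosensor readings on the wearable or mobile side before they leave for the cloudlet. The sketch below assumes the `cryptography` package and a key already shared with the cloudlet (key exchange is out of scope); it illustrates the general idea, not the paper's specific mechanism.

```python
# AES-GCM protection of a biosensor reading before transmission to a cloudlet.
import json, os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # assumed pre-shared with cloudlet
aead = AESGCM(key)

reading = json.dumps({"sensor": "ecg", "bpm": 72, "t": 1700000000}).encode()
nonce = os.urandom(12)                       # must be unique per message
ciphertext = aead.encrypt(nonce, reading, b"patient-42")  # AAD binds identity

# Cloudlet side: decrypt-and-verify before running analytics.
plain = aead.decrypt(nonce, ciphertext, b"patient-42")
print(json.loads(plain))
```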
This paper first introduces big data services and cloud computing models based on different process forms, and analyzes the authentication technology and security services of existing big data systems to understand their processing characteristics. The operating principles and complexity of big data services and cloud computing are also studied, and their suitable environments, advantages, and disadvantages are summarized. Building on cloud computing, the author puts forward the Model of Big Data Cloud Computing based on Extended Subjective Logic (MBDCC-ESL), which introduces Jøsang's subjective logic to test data credibility and extends it to address the trustworthiness of big data in the cloud computing environment. Simulation results show that the model works well.
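For readers unfamiliar with the building block such a model extends, the sketch below shows a subjective-logic opinion (belief, disbelief, uncertainty) and Jøsang's standard cumulative fusion of two observers' opinions about the same data source. The operator is standard; its use here is illustrative of the approach, not the paper's full extended model.

```python
# Subjective-logic opinions and cumulative fusion (Jøsang's consensus rule).
from dataclasses import dataclass

@dataclass
class Opinion:
    b: float  # belief
    d: float  # disbelief
    u: float  # uncertainty (b + d + u = 1)

def fuse(A: Opinion, B: Opinion) -> Opinion:
    """Cumulative fusion, valid when u_A, u_B > 0."""
    k = A.u + B.u - A.u * B.u
    return Opinion(b=(A.b * B.u + B.b * A.u) / k,
                   d=(A.d * B.u + B.d * A.u) / k,
                   u=(A.u * B.u) / k)

sensor_log = Opinion(b=0.7, d=0.1, u=0.2)   # one source vouches for the data
audit_node = Opinion(b=0.5, d=0.2, u=0.3)
print(fuse(sensor_log, audit_node))         # combined credibility opinion
```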
Big data provides a way to handle and analyze large or complex data sets, including their systematic extraction. This paper discusses a hybrid security analysis based on intelligent adaptive learning in big data in light of current trends, and explores the possibility of collaboration between cloud computing and big data. The advantages, and the impact on overall platform evaluation, are discussed against traditional approaches, which is useful for analysis and for exploring future research. The discussion also covers computational variability and its connotations for data reliability, availability, and management in big data, together with data security aspects.
Lately, mining information from social media has been attracting more attention due to the explosive growth of Big Data. In security, Big Data deals with an assortment of huge digital data sets for analyzing, visualizing, and drawing insights for the prediction and prevention of cyber attacks. Big Data Analytics (BDA) is the term coined by practitioners to describe the art of managing, processing, and gathering large amounts of data for future evaluation. Data is being created at an alarming rate; the rapid development of the Internet, the Internet of Things (IoT), and other innovative technologies are the principal drivers behind this continued growth. The data created is a reflection of the environment it is produced in, so data extracted from systems can be used to understand the internal workings of those systems. This has become an important element in cyber security, where the objective is to protect assets; moreover, the growing value of information has made big data a high-value target. In this work, we survey recent research in cyber security as it relates to big data, and highlight both how big data is protected and how big data can be used as a tool for cyber security. At the same time, a Big Data based centralized log analysis system is implemented to identify the network traffic generated by attackers through DDoS, SQL injection, and brute-force attacks. Log files are automatically transmitted to the centralized cloud server, where the big data analysis process is initiated.
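A minimal sketch of the centralized log-analysis idea: scan aggregated logs for signatures of the three attack classes named above. The patterns and thresholds are simplified placeholders for illustration, not the implemented system's actual rules.

```python
# Toy centralized log scan for SQL injection, brute-force, and DDoS signs.
import re
from collections import Counter

SQLI = re.compile(r"('|%27)\s*(or|union|--)", re.IGNORECASE)
FAILED_LOGIN = re.compile(r"login failed", re.IGNORECASE)

def analyze(log_lines, ddos_threshold=100, brute_threshold=5):
    alerts, per_ip, fails = [], Counter(), Counter()
    for line in log_lines:
        ip = line.split()[0]           # assume source IP leads each line
        per_ip[ip] += 1
        if SQLI.search(line):
            alerts.append(("sql_injection", ip))
        if FAILED_LOGIN.search(line):
            fails[ip] += 1
    alerts += [("brute_force", ip) for ip, n in fails.items() if n >= brute_threshold]
    alerts += [("ddos_suspect", ip) for ip, n in per_ip.items() if n >= ddos_threshold]
    return alerts

logs = ["10.0.0.5 GET /item?id=1%27 OR 1=1--",
        *["10.0.0.9 POST /login login failed"] * 6]
print(analyze(logs))
# -> [('sql_injection', '10.0.0.5'), ('brute_force', '10.0.0.9')]
```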