Bibliography
In the big-data era, to prevent malicious access and information leakage during data services, researchers have explored location big data encryption methods based on privacy protection. As information networks have developed in recent years, users' location information is often harvested indiscriminately in the network environment, which not only threatens their privacy but also impairs the effective transmission of information. This study therefore proposes a big data encryption method for location data with privacy protection at its core. The method first defines how location big data are represented and how positioning information is obtained, distinguishes known location information from unknown information, then applies fuzzy encryption theory to dynamically regroup the location data, and finally constructs an encryption algorithm centered on privacy protection. Empirical results show that this method not only effectively blocks the intrusion of attack data but also keeps the error of location data encryption well controlled.
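The abstract does not spell out the fuzzy-encryption step, so the following is a minimal illustrative sketch assuming a grid-based blurring of coordinates followed by symmetric encryption; the grid size and the use of Fernet (from the third-party cryptography package) are our assumptions, not the paper's construction.

```python
# Illustrative sketch only: the paper's fuzzy-encryption details are not given.
# Assumes grid-based coordinate blurring, then symmetric encryption with Fernet.
import json
from cryptography.fernet import Fernet

GRID = 0.01  # hypothetical grid size in degrees (~1 km); not from the paper

def fuzz(lat: float, lon: float, grid: float = GRID) -> tuple[float, float]:
    """Snap a coordinate to a coarse grid so the exact position is blurred."""
    return (round(lat / grid) * grid, round(lon / grid) * grid)

def encrypt_location(lat: float, lon: float, key: bytes) -> bytes:
    """Blur the coordinate, then encrypt the blurred record."""
    blurred_lat, blurred_lon = fuzz(lat, lon)
    payload = json.dumps({"lat": blurred_lat, "lon": blurred_lon}).encode()
    return Fernet(key).encrypt(payload)

key = Fernet.generate_key()
token = encrypt_location(40.7128, -74.0060, key)
print(Fernet(key).decrypt(token))  # decrypts to the blurred coordinate record
```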
Faced with the huge and complex data handled in cloud computing, the parallel computing capability of quantum computing is particularly important. Quantum principal component analysis was originally proposed as a method of quantum state tomography. Building on it, we perform feature extraction on the eigenvalue matrix obtained from the eigendecomposition of the density matrix to achieve dimensionality reduction, and propose a quantum principal component extraction algorithm (QPCE). Compared with the classical algorithm, QPCE achieves an exponential speedup under certain conditions, and we give a concrete quantum circuit realization. Considering the limited computing power of clients, we further propose a quantum homomorphic ciphertext dimension-reduction scheme (QHEDR): the client encrypts quantum data and uploads it to the cloud for computation, with security guaranteed by the quantum homomorphic encryption scheme. After the computation is completed, the client updates the key locally and decrypts the ciphertext result. The resulting quantum ciphertext dimensionality-reduction scheme runs in the quantum cloud without interaction while ensuring security. In addition, we experimentally verify the QPCE algorithm on IBM's real quantum computing platform. Experimental results show that the algorithm can perform ciphertext dimension reduction securely and effectively.
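As a rough classical analogue of the principal-component extraction step (the quantum circuit and the homomorphic-encryption layer are beyond a short sketch), the numpy code below builds a unit-trace density matrix from a data matrix, eigendecomposes it, and projects onto the top-k components; all names and parameters here are illustrative, not the paper's.

```python
# Classical analogue of the QPCE dimensionality-reduction step, as a sketch.
import numpy as np

def density_matrix(X: np.ndarray) -> np.ndarray:
    """Covariance-style density matrix rho = X^T X, normalized to unit trace."""
    rho = X.T @ X
    return rho / np.trace(rho)

def principal_components(rho: np.ndarray, k: int):
    """Eigendecompose rho and keep the k components with largest eigenvalues."""
    vals, vecs = np.linalg.eigh(rho)    # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]  # indices of the top-k eigenvalues
    return vals[order], vecs[:, order]

X = np.random.default_rng(0).normal(size=(100, 8))
vals, vecs = principal_components(density_matrix(X), k=2)
X_reduced = X @ vecs                    # dimensionality-reduced data
print(X_reduced.shape)                  # (100, 2)
```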
Data value (DV) is a novel concept introduced as one of the features of the Big Data phenomenon. Continuing an investigation of the DV ontology and its relationship with data quality (DQ) at the conceptual level, this paper studies possible applications of DV in the practical design of security and privacy protection systems and tools. We present a novel approach to DV evaluation that maps DQ metrics into a DV value. The developed methods allow DV and DQ to be used in a wide range of application domains. To demonstrate the employment of the DQ and DV concepts in real tasks, we present two real-life scenarios. The first use case demonstrates the use of DV in crowdsensing application design, showing how DV can be calculated by integrating various metrics that characterize data application functionality, accuracy, and security. The second incorporates privacy considerations into the DV calculus by exploring the relationship between privacy, DQ, and DV in the defense against website fingerprinting in The Onion Router (TOR) networks. These examples demonstrate how our methods of DV and DQ evaluation may be employed in the design of real systems with security and privacy considerations.
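The abstract does not give the exact DQ-to-DV mapping, so the sketch below assumes a simple weighted aggregation of normalized DQ metrics into a scalar DV score; the metric names and weights are hypothetical, not the paper's calculus.

```python
# Hypothetical sketch of mapping DQ metrics to a scalar DV score; the paper's
# actual mapping is not specified in the abstract, so the weights are invented.
from typing import Mapping

def data_value(dq: Mapping[str, float], weights: Mapping[str, float]) -> float:
    """Weighted aggregation of normalized DQ metrics (each assumed in [0, 1])."""
    total = sum(weights.values())
    return sum(weights[m] * dq.get(m, 0.0) for m in weights) / total

dq_metrics = {"accuracy": 0.9, "functionality": 0.8, "security": 0.6}
weights = {"accuracy": 2.0, "functionality": 1.0, "security": 1.0}
print(round(data_value(dq_metrics, weights), 3))  # 0.8
```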
Data privacy has been an important area of research in recent years. Datasets often contain sensitive data fields whose exposure may jeopardize the interests of the individuals associated with the data. To resolve this issue, privacy techniques can be used to hinder the identification of a person by anonymizing the sensitive data in the dataset, while the anonymized dataset can still be used by third parties for analysis without obstruction. In this research, we investigate a privacy technique, k-anonymity, for different values of k applied to different numbers of columns of the dataset. Next, the information loss due to k-anonymity is computed. The anonymized files then go through a classification process with several machine-learning algorithms, i.e., Naive Bayes, J48, and a neural network, in order to check the balance between data anonymity and data utility. Based on the classification accuracy, the optimal value of k and the optimal number of columns are obtained, and these can then be used by the k-anonymity algorithm to anonymize the optimal number of columns of the dataset.
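As a concrete illustration of the technique the study evaluates, the sketch below generalizes two quasi-identifier columns (age ranges and truncated ZIP codes) and checks the k-anonymity property; the column choices and generalization widths are our assumptions, not the study's setup.

```python
# Minimal k-anonymity sketch: generalize quasi-identifier columns, then verify
# that every quasi-identifier combination occurs at least k times.
from collections import Counter

def generalize_age(age: int, width: int = 10) -> str:
    """Replace an exact age with a range of the given width, e.g. 23 -> '20-29'."""
    lo = (age // width) * width
    return f"{lo}-{lo + width - 1}"

def is_k_anonymous(rows, quasi_ids, k):
    """True if every combination of quasi-identifier values appears >= k times."""
    counts = Counter(tuple(r[c] for c in quasi_ids) for r in rows)
    return all(n >= k for n in counts.values())

rows = [{"age": generalize_age(a), "zip": z[:3] + "**"}
        for a, z in [(23, "10001"), (27, "10002"), (25, "10003")]]
print(is_k_anonymous(rows, ["age", "zip"], k=3))  # True: all rows generalize alike
```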
The purpose of the General Data Protection Regulation (GDPR) is to provide improved privacy protection. If an app controls personal data from users, it needs to be compliant with GDPR. However, GDPR lists general rules rather than exact, step-by-step guidelines on how to develop an app that fulfills its requirements. Therefore, GDPR compliance violations may exist in current apps, posing severe privacy threats to app users. In this paper, we take mobile health applications (mHealth apps) as a peephole to examine the status quo of GDPR compliance in Android apps. We first propose an automated system, named HPDROID, to bridge the semantic gap between the general rules of GDPR and app implementations by identifying the data practices declared in the app privacy policy and the data-relevant behaviors in the app code. Then, based on HPDROID, we detect three kinds of GDPR compliance violations: incompleteness of the privacy policy, inconsistency of data collection, and insecurity of data transmission. We perform an empirical evaluation of 796 mHealth apps. The results reveal that 189 (23.7%) of them do not provide complete privacy policies. Moreover, 59 apps collect sensitive data through various means, and 46 (77.9%) of them contain at least one inconsistent collection behavior. Even worse, among the 59 apps, only 8 try to ensure the transmission security of collected data, yet all of them contain at least one encryption or SSL misuse. Our work exposes severe privacy issues to raise awareness of privacy protection among app users and developers.
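HPDROID itself is not described in implementable detail in the abstract; as a hedged illustration of one insecure-transmission signal such a tool could check (cleartext HTTP endpoints), the sketch below performs a naive regex scan over decompiled Java sources. A real analysis pipeline would need far more than a string match.

```python
# Illustrative sketch (not HPDROID): a naive static scan of decompiled app
# sources for string literals that use plain HTTP instead of HTTPS.
import re
from pathlib import Path

HTTP_URL = re.compile(r'"http://[^"\s]+"')

def find_cleartext_urls(src_root: str):
    """Yield (file, line_no, url) for each cleartext-HTTP string literal."""
    for path in Path(src_root).rglob("*.java"):
        for i, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            for m in HTTP_URL.finditer(line):
                yield (str(path), i, m.group(0))

# "decompiled_app/" is a placeholder directory of decompiled sources.
for hit in find_cleartext_urls("decompiled_app/"):
    print(hit)
```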