Biblio
Location-based services (LBSs) provide services based on a user's current or past location. Over the past decade, LBSs have become increasingly popular as a result of the widespread use of mobile devices with positioning capabilities. Location information is secondary information that can reveal personal insights about one's life, and this becomes a concern when location data are shared through cloud services. For example, a hospital is a public place, and its location alone carries no sensitive information; however, the location may become sensitive once the hospital's specialty is taken into account. The design proposed in this paper combines several methods to provide data privacy protection for LBSs that rely on cloud services. The work is built on a zero-trust assumption, and access to the system is managed through different levels. The proposal is based on a model that stores user location data on supplementary servers rather than in untrusted third-party applications. The approach of this research is to analyze the privacy protection that can be achieved through data partitioning: data collected from different sources are distributed across servers according to a partitioning model based on a multi-level policy. Third-party applications are granted access only to designated servers, and because no server is trusted, the privacy of the user profile is also preserved on each individual server.
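As a rough illustration of the partitioning idea summarized above, the sketch below shows how a location record might be split by sensitivity level across separate untrusted servers, with a third-party application restricted to the servers it was granted. The server levels, record fields, and policy shown here are illustrative assumptions, not the paper's actual design.

```python
# Hypothetical sketch of multi-level partitioning of location data across servers.
# Levels, fields, and the example record are assumptions for illustration only.
from collections import defaultdict

# Multi-level policy: each attribute of a location record is assigned a level.
POLICY = {
    "user_id": "identity",         # who
    "coordinates": "location",     # where
    "place_category": "semantic",  # what kind of place (e.g., hospital specialty)
}

# Each level is stored on its own (untrusted) server, so no single server
# ever holds the full user profile.
servers = defaultdict(list)

def partition_record(record):
    """Split one location record across servers according to the policy."""
    for attribute, level in POLICY.items():
        if attribute in record:
            servers[level].append({attribute: record[attribute]})

def third_party_view(granted_levels):
    """A third-party application sees only the servers it was granted."""
    return {level: servers[level] for level in granted_levels}

# The raw record is partitioned and never stored in one place.
partition_record({
    "user_id": "u42",
    "coordinates": (47.4979, 19.0402),
    "place_category": "hospital/oncology",
})

# An application granted only the "location" level cannot link the coordinates
# to the user's identity or to the semantic category of the place.
print(third_party_view({"location"}))
```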
Emerging cyber-physical systems (CPS) often require collecting end users' data to support data-informed decision making processes. There has been a long-standing argument about the tradeoff between privacy and data utility. In this paper, we adopt a multiparametric programming approach to rigorously study the conditions under which data utility must be sacrificed to protect privacy, and the situations where free-lunch privacy can be achieved, i.e., data can be concealed without hurting the optimality of the decision making underlying the CPS. We formalize the concept of free-lunch privacy and establish various results on its existence, geometry, and efficient computation methods. We propose the free-lunch privacy mechanism, a pragmatic mechanism that exploits free-lunch privacy whenever it exists while still guaranteeing optimal use of the data. We study the resilience of this mechanism against attacks that attempt to infer the parameters of a user's data-generating process. We close the paper with a case study on occupancy-adaptive smart home temperature control to demonstrate the efficacy of the mechanism.
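One hedged way to read the free-lunch idea, using our own notation rather than the paper's definitions: if the CPS decision problem is parameterized by the user's data and solved via multiparametric programming, then the data can be perturbed within a region over which the decision is unaffected, at no cost in utility. The sketch below makes this intuition explicit; the symbols (the parameter theta, decision x, cost J) are assumptions for illustration.

```latex
% Illustrative formalization (our notation, not necessarily the paper's).
% The CPS solves a decision problem parameterized by the user's data \theta:
%   x^*(\theta) \in \arg\min_{x \in X(\theta)} J(x, \theta).
% Reporting \tilde{\theta} in place of \theta yields "free-lunch privacy" if
% the concealment costs nothing in utility under the true parameter:
\[
  J\bigl(x^*(\tilde{\theta}), \theta\bigr) \;=\; J\bigl(x^*(\theta), \theta\bigr),
\]
% which holds in particular whenever \tilde{\theta} lies in a region of the
% parameter space over which the multiparametric solution x^*(\cdot) is constant.
```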