Bibliography
With billions of devices already connected at the network's edge, the Internet of Things (IoT) is shaping the future of pervasive computing. Nonetheless, IoT applications still cannot escape the need for the computing resources available at the fog layer. This is challenging because fog nodes are not necessarily secure or reliable, which further widens the IoT threat surface. Moreover, the security risk appetites of heterogeneous IoT applications in different domains or deployment contexts should not be assessed identically. To address this challenge, this paper proposes a new approach to optimizing the allocation of secure and reliable fog computing resources among IoT applications with varying security risk levels. First, the security and reliability levels of fog nodes are quantitatively evaluated, and a security risk assessment methodology is defined for IoT services. Then, an online, incentive-compatible mechanism is designed to allocate secure fog resources to high-risk IoT offloading requests. Compared to the offline Vickrey auction, the proposed mechanism is computationally efficient and yields an acceptable approximation of the social welfare of the IoT devices, making it possible to attenuate security risk within the edge network.
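As a rough illustration of how an online, truthful allocation rule can avoid the offline Vickrey computation, the sketch below uses a posted-price scheme in which prices depend only on a node's security and reliability scores and on the request's assessed risk, never on the submitted bid. The names (`FogNode`, `posted_price`, `allocate_online`), the pricing rule, and the feasibility test are all assumptions for illustration; they are not the paper's mechanism.

```python
# A minimal sketch of an online, bid-independent (posted-price) allocation
# of secure fog resources to IoT offloading requests. The pricing rule and
# feasibility test are illustrative assumptions, not the paper's mechanism.

class FogNode:
    def __init__(self, node_id, security_level, reliability, capacity):
        self.node_id = node_id
        self.security_level = security_level  # quantified score in [0, 1]
        self.reliability = reliability        # quantified score in [0, 1]
        self.capacity = capacity              # remaining task slots

def posted_price(node, risk):
    # Assumed rule: more secure/reliable nodes cost more, and high-risk
    # requests pay a premium proportional to their assessed risk level.
    return (node.security_level + node.reliability) * (1.0 + risk)

def allocate_online(nodes, request):
    """Serve an arriving request on the cheapest feasible node whose
    security level covers the request's risk, if the bid meets the posted
    price. Because the price never depends on the submitted bid, reporting
    the true valuation is a dominant strategy (incentive compatibility)."""
    bid, risk = request["bid"], request["risk"]
    feasible = [n for n in nodes if n.capacity > 0 and n.security_level >= risk]
    if not feasible:
        return None, 0.0
    node = min(feasible, key=lambda n: posted_price(n, risk))
    price = posted_price(node, risk)
    if bid >= price:
        node.capacity -= 1
        return node, price
    return None, 0.0

# Example: a high-risk request is matched to the more secure node f1.
nodes = [FogNode("f1", 0.9, 0.95, 2), FogNode("f2", 0.5, 0.80, 3)]
winner, price = allocate_online(nodes, {"bid": 3.5, "risk": 0.7})
```

A posted-price rule of this kind processes each request in time linear in the number of fog nodes, which is where the computational advantage over an offline Vickrey auction comes from, at the cost of only approximating the optimal social welfare.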
We study the value of data privacy in a game-theoretic model of trading private data, where a data collector purchases private data from strategic data subjects (individuals) through an incentive mechanism. The private data of each individual represents her knowledge about an underlying state, which is the information the data collector desires to learn. Unlike most existing work on privacy-aware surveys, our model does not assume that the data collector is trustworthy. Each individual therefore retains full control of her own data privacy and reports only a privacy-preserving version of her data. In this paper, the value of ε units of privacy is measured by the minimum payment over all nonnegative payment mechanisms under which an individual's best response at a Nash equilibrium is to report her data with a privacy level of ε. The higher ε is, the less private the reported data is. We derive lower and upper bounds on the value of privacy that are asymptotically tight as the number of data subjects grows large. Specifically, the lower bound shows that it is impossible to buy ε units of privacy with a smaller payment, and the upper bound is attained by an achievable payment mechanism that we design. Based on these fundamental limits, we further derive lower and upper bounds on the minimum total payment for the data collector to achieve a given learning accuracy target, and show that the total payment of the designed mechanism is at most one individual's payment away from the minimum.
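To make the notion of "reporting data with a privacy level of ε" concrete, the sketch below uses ε-locally-differentially-private randomized response for a binary state, together with a bias-corrected estimator for the data collector and a placeholder payment rule. The flip probability is the standard randomized-response choice, but the linear `payment` function is a hypothetical stand-in; the paper's contribution is precisely to characterize the minimum such payment.

```python
import math
import random

def randomized_response(bit, eps):
    """Report the true bit with probability e^eps / (1 + e^eps), which
    satisfies eps-local differential privacy: a larger eps means a less
    private (more truthful) report."""
    keep_prob = math.exp(eps) / (1.0 + math.exp(eps))
    return bit if random.random() < keep_prob else 1 - bit

def estimate_state(reports, eps):
    """Bias-corrected estimate of the underlying state (the population
    mean of the true bits) from the noisy reports."""
    keep_prob = math.exp(eps) / (1.0 + math.exp(eps))
    raw_mean = sum(reports) / len(reports)
    return (raw_mean - (1.0 - keep_prob)) / (2.0 * keep_prob - 1.0)

def payment(eps, rate=1.0):
    # Hypothetical linear per-individual payment for eps units of privacy;
    # the paper instead derives tight lower/upper bounds on this quantity.
    return rate * eps

# Example: 10,000 individuals each holding bit 1 with probability 0.3,
# all reporting at privacy level eps = 1.
random.seed(0)
eps = 1.0
true_bits = [1 if random.random() < 0.3 else 0 for _ in range(10_000)]
reports = [randomized_response(b, eps) for b in true_bits]
print(estimate_state(reports, eps), len(reports) * payment(eps))
```

As ε grows, the keep probability approaches 1 and the estimator's variance shrinks, which is exactly the accuracy-versus-payment trade-off behind the bounds on the collector's minimum total payment.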