Bibliography
A conventional visible light communication (VLC) system consists of a transmitter, a jammer comprising a few light-emitting diodes, a legitimate listener, and an eavesdropper. In this work, a similar system is designed with a collimating lens in order to add an extra layer of practical physical-layer security. The collimating lens makes it possible to spatially confine data transmission to the area directly beneath the lensed transmitter. Focusing the transmission through the optical lens also increases the secrecy rate. To investigate the applicability of the proposed design, we built a sample experimental setup using USRP hardware and implemented it in a laboratory environment. In the proposed setup, the receiver is in a fixed position; however, compared to beamforming-based VLC, which steers a directional beam toward a moving target, this approach admits an easy, practical, and inexpensive hardware solution. In addition, the size of the area in which a receiver can access the data can be controlled by adjusting the distance between the optical lens and the transmitter.
Federated learning (FL) enables training on massive amounts of data privately thanks to its decentralized structure. Stochastic gradient descent (SGD) is commonly used for FL due to its good empirical performance, but sensitive user information can still be inferred from the weight updates shared during FL iterations. We consider Gaussian mechanisms to preserve local differential privacy (LDP) of user data in the FL model with SGD. The trade-offs between user privacy, global utility, and transmission rate are proved by defining appropriate metrics for FL with LDP. Compared to existing results, the query sensitivity used in LDP is treated as a variable, and a tighter privacy accounting method is applied. The proposed utility bound allows heterogeneous parameters across users. Our bounds characterize how much utility decreases and transmission rate increases when a stronger privacy regime is targeted. Furthermore, given a target privacy level, our results guarantee a significantly larger utility and a smaller transmission rate compared to existing privacy accounting methods.
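The core primitive described above, a Gaussian mechanism applied to clipped SGD updates, can be illustrated with a minimal sketch. This is the textbook (epsilon, delta) Gaussian mechanism with L2 clipping, not the paper's tighter accounting; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def ldp_sgd_update(grad, clip_norm, epsilon, delta, rng):
    """Illustrative sketch: clip a local gradient to bound its L2
    sensitivity, then add Gaussian noise calibrated for
    (epsilon, delta)-DP via the classical analytic bound
    sigma = clip_norm * sqrt(2 ln(1.25/delta)) / epsilon."""
    norm = np.linalg.norm(grad)
    # Rescale so the update's L2 norm (the query sensitivity) is at most clip_norm.
    clipped = grad * min(1.0, clip_norm / norm) if norm > 0 else grad
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    # Each user perturbs locally before sharing, so privacy holds without a trusted server.
    return clipped + rng.normal(0.0, sigma, size=grad.shape)
```

Smaller epsilon (stronger privacy) yields a larger sigma, which is exactly the privacy-utility trade-off the bounds above quantify.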
Current implementations of Differential Privacy (DP) focus primarily on the privacy of the data release. The planned thesis will investigate steps toward a user-centric approach to DP in the scope of the Internet of Things (IoT), one that considers data subjects, IoT developers, and data analysts. We will conduct user studies to learn more about the often conflicting interests of the involved parties and the challenges they encounter. Furthermore, a technical solution will be developed to assist data subjects and analysts in making better-informed decisions. As a result, we expect our contributions to be a step toward usable DP for IoT sensor data.