Biblio

Filters: Author is Pejo, Balazs
Pejo, Balazs, Tang, Qiang, Biczók, Gergely.  2018.  The Price of Privacy in Collaborative Learning. Proceedings of the 2018 ACM SIGSAC Conference on Computer and Communications Security. :2261–2263.

Machine learning algorithms have reached mainstream status and are widely deployed in many applications. The accuracy of such algorithms depends significantly on the size of the underlying training dataset; in reality, a small or medium-sized organization often does not have enough data to train a reasonably accurate model. For such organizations, a realistic solution is to train machine learning models on a joint dataset (the union of the individual ones). Unfortunately, privacy concerns prevent them from doing so straightforwardly. While a number of privacy-preserving solutions exist that let collaborating organizations securely aggregate parameters while training the models, we are not aware of any work that provides a rational framework for the participants to precisely balance the privacy loss and accuracy gain of their collaboration. In this paper, we model the collaborative training process as a two-player game in which each player aims to achieve higher accuracy while preserving the privacy of its own dataset. We introduce the notion of Price of Privacy, a novel measure of the impact of privacy protection on accuracy in the proposed framework. Furthermore, we develop a game-theoretical model for different player types, and then either find or prove the existence of a Nash Equilibrium with regard to the strength of privacy protection for each player.
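
The abstract does not give the formal definition of Price of Privacy; purely as a non-authoritative illustration of the idea, the short sketch below assumes a simple relative-accuracy-loss formulation with hypothetical accuracy values. The names price_of_privacy, acc_private and acc_non_private are introduced here for illustration and are not taken from the paper.

# Illustrative sketch only: the paper gives the exact definition of Price of Privacy;
# here a simple relative-accuracy-loss formulation is assumed.

def price_of_privacy(acc_private: float, acc_non_private: float) -> float:
    """Fraction of accuracy given up when the collaboration is run with
    privacy protection, relative to collaborating without any protection.
    Both arguments are hypothetical model accuracies in (0, 1]."""
    return 1.0 - acc_private / acc_non_private

# Hypothetical numbers: joint training without protection reaches 0.90 accuracy,
# with privacy-preserving noise added it reaches 0.81.
print(price_of_privacy(acc_private=0.81, acc_non_private=0.90))  # ~0.10, i.e. 10% accuracy lost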

Pejo, Balazs, Tang, Qiang.  2017.  To Cheat or Not to Cheat: A Game-Theoretic Analysis of Outsourced Computation Verification. Proceedings of the Fifth ACM International Workshop on Security in Cloud Computing. :3–10.

In the cloud computing era, many organizations outsource their computations to third-party cloud servers to avoid the computational burden. To protect service quality, the integrity of the computation results needs to be guaranteed. In this paper, we develop a game-theoretic framework that helps the outsourcer maximize its payoff while ensuring the desired level of integrity for the outsourced computation. We define two Stackelberg games and analyze the sensitivity of the optimal setting to the parameters of the model.
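
The abstract does not spell out the games themselves; as a hedged sketch of the kind of Stackelberg analysis described, the example below assumes the outsourcer (leader) commits to a verification probability and the server (follower) then decides whether to cheat. All function names, parameter names and numbers (server_cheats, outsourcer_cost, compute_cost, fine, reward, verify_cost, loss_if_cheated) are hypothetical and are not taken from the paper.

# Hedged sketch of a Stackelberg-style verification game, not the paper's exact model.
# Assumptions: the outsourcer pays `reward` per job and re-computes a fraction
# `p_verify` of jobs at cost `verify_cost`; a cheating server saves `compute_cost`
# but pays `fine` when caught, and an undetected wrong result costs the outsourcer
# `loss_if_cheated`.

def server_cheats(p_verify: float, compute_cost: float, fine: float) -> bool:
    """Follower's best response: cheat iff the saved computation exceeds the expected fine."""
    return compute_cost > p_verify * fine

def outsourcer_cost(p_verify, compute_cost, fine, reward, verify_cost, loss_if_cheated):
    """Leader's expected cost per job, anticipating the follower's best response."""
    cost = reward + p_verify * verify_cost
    if server_cheats(p_verify, compute_cost, fine):
        cost += (1.0 - p_verify) * loss_if_cheated - p_verify * fine
    return cost

# Leader commits to the verification probability that minimizes its anticipated cost
# (coarse grid search over p_verify, with toy parameter values).
best = min((outsourcer_cost(p / 100, 1.0, 10.0, 2.0, 1.0, 5.0), p / 100) for p in range(101))
print(best)  # under these toy numbers: verify just often enough (p = 0.10) to deter cheating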