Biblio

Filters: Author is Krishnamurthy, P.
Palanisamy, B., Li, C., Krishnamurthy, P. 2017. Group Differential Privacy-Preserving Disclosure of Multi-level Association Graphs. 2017 IEEE 37th International Conference on Distributed Computing Systems (ICDCS). :2587–2588.

Traditional privacy-preserving data disclosure solutions have focused on protecting the privacy of individuals' information under the assumption that all aggregate (statistical) information about individuals is safe for disclosure. Such schemes fail to support group privacy, where aggregate information about a group of individuals may also be sensitive and users of the published data may have different levels of access privileges entitled to them. We propose the notion of εg-Group Differential Privacy, which protects sensitive information of groups of individuals at various defined privacy levels, enabling data users to obtain the level of access entitled to them. We present a preliminary evaluation of the proposed notion of group privacy through experiments on real association graph data that demonstrate the group privacy guarantees on the disclosed data.
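To make the group-privacy setting concrete, the following is a minimal Python sketch, not the paper's mechanism: it releases a group-level count via the standard Laplace mechanism, with the noise scale enlarged to the group size so that the joint contribution of the whole group is hidden (the standard group-privacy property of differential privacy). The record set, group, and per-level budgets eps_g are invented for illustration.

# Minimal illustrative sketch (not the mechanism from the paper): releasing a
# group-level count under an assumed per-level budget eps_g. Hiding the joint
# contribution of a group of size k in a counting query requires Laplace noise
# of scale k / eps_g rather than the usual 1 / eps_g.
import numpy as np

def noisy_group_count(records, group, eps_g):
    """Noisy count of records in `group`, protecting the whole group jointly."""
    true_count = sum(1 for r in records if r in group)
    scale = len(group) / eps_g  # group-size-scaled sensitivity
    return true_count + np.random.laplace(loc=0.0, scale=scale)

records = ["u1", "u2", "u3", "u4", "u5", "u6"]
group = {"u2", "u3", "u5"}
print(noisy_group_count(records, group, eps_g=0.5))  # restricted access level: more noise
print(noisy_group_count(records, group, eps_g=2.0))  # privileged access level: less noise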

Palanisamy, B., Li, C., Krishnamurthy, P. 2017. Group Privacy-Aware Disclosure of Association Graph Data. 2017 IEEE International Conference on Big Data (Big Data). :1043–1052.

In the age of Big Data, we are witnessing a huge proliferation of digital data capturing our lives and our surroundings. Data privacy is a critical barrier to data analytics, and privacy-preserving data disclosure becomes a key aspect of leveraging large-scale data analytics in the face of serious privacy risks. Traditional privacy-preserving data publishing solutions have focused on protecting individuals' private information while considering all aggregate information about individuals as safe for disclosure. This paper presents a new privacy-aware data disclosure scheme that considers group privacy requirements of individuals in bipartite association graph datasets (e.g., graphs that represent associations between entities such as customers and products bought from a pharmacy store), where even aggregate information about groups of individuals may be sensitive and need protection. We propose the notion of εg-Group Differential Privacy, which protects sensitive information of groups of individuals at various defined group protection levels, enabling data users to obtain the level of information entitled to them. Based on this notion of group privacy, we develop a suite of differentially private mechanisms that protect group privacy in bipartite association graphs at different group privacy levels based on specialization hierarchies. We evaluate the proposed techniques through extensive experiments on three real-world association graph datasets, and our results demonstrate that they are effective and efficient and provide the required guarantees on group privacy.
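As a rough illustration of the bipartite setting, here is a hedged Python sketch, again not the paper's mechanism suite: a toy customer-product association graph is rolled up along an assumed product specialization hierarchy, and Laplace noise with a level-specific budget is added to the association counts, so that different access levels receive differently perturbed views. The edge list, hierarchy, and budgets are assumptions made for illustration.

# Illustrative sketch only: level-dependent noisy disclosure of association
# counts from a toy customer-product bipartite graph. The hierarchy and the
# per-level budgets are assumptions, not values from the paper.
from collections import Counter
import numpy as np

# Bipartite associations: (customer, product) purchase edges.
edges = [("c1", "aspirin"), ("c2", "aspirin"), ("c2", "insulin"),
         ("c3", "ibuprofen"), ("c4", "insulin")]

# Assumed specialization hierarchy: product -> coarser category.
hierarchy = {"aspirin": "painkiller", "ibuprofen": "painkiller",
             "insulin": "diabetes care"}

def noisy_association_counts(edges, eps, roll_up=None):
    """Per-product (or per-category, if roll_up is given) counts with Laplace noise."""
    counts = Counter(roll_up[p] if roll_up else p for _, p in edges)
    # Counting queries have sensitivity 1 per edge, so the noise scale is 1 / eps.
    return {k: v + np.random.laplace(scale=1.0 / eps) for k, v in counts.items()}

print(noisy_association_counts(edges, eps=0.5))                     # fine-grained level
print(noisy_association_counts(edges, eps=2.0, roll_up=hierarchy))  # coarser level, less noise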

Paul-Pena, D., Krishnamurthy, P., Karri, R., Khorrami, F. 2017. Process-aware side channel monitoring for embedded control system security. 2017 IFIP/IEEE International Conference on Very Large Scale Integration (VLSI-SoC). :1–6.

Cyber-physical systems (CPS) are interconnections of heterogeneous hardware and software components (e.g., sensors, actuators, physical systems/processes, computational nodes and controllers, and communication subsystems). Increasing network connectivity of CPS computational nodes facilitates maintenance and on-demand reprogrammability and reduces operator workload. However, such increasing connectivity also raises the potential for cyber-attacks that attempt unauthorized modifications of run-time parameters or control logic in the computational nodes to hamper process stability or performance. In this paper, we analyze the effectiveness of real-time monitoring using digital and analog side channels. While analog side channels typically do not provide sufficient granularity to observe each iteration of a periodic loop in the code running on the CPS device, the temporal averaging inherent in side-channel sensing modalities enables observation of persistent changes to the contents of a computational loop through their resulting effect on the device's level of activity. Changes to code can therefore be detected by observing readings from side-channel sensors over a period of time. Experimental studies are performed on an ARM-based single-board computer.
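The temporal-averaging idea can be sketched as follows; this is a hedged Python illustration, not the experimental setup from the paper. A baseline profile is learned from windowed averages of known-good analog side-channel samples (e.g., supply-current readings), and later windows whose averages drift persistently from that profile are flagged. The window size, detection threshold, and synthetic sample values are assumptions.

# Illustrative sketch only (not the paper's setup): detect a persistent change
# in a control loop's activity by comparing moving averages of analog
# side-channel samples against a baseline profile from known-good operation.
import numpy as np

def baseline_profile(samples, window):
    """Mean and standard deviation of windowed averages over a known-good trace."""
    windows = np.convolve(samples, np.ones(window) / window, mode="valid")
    return windows.mean(), windows.std()

def detect_change(samples, window, mean, std, k=4.0):
    """Flag windows whose average deviates more than k standard deviations from baseline."""
    windows = np.convolve(samples, np.ones(window) / window, mode="valid")
    return np.abs(windows - mean) > k * std

# Known-good trace, then a trace where injected code raises average activity.
rng = np.random.default_rng(0)
good = rng.normal(1.00, 0.05, 5000)                   # hypothetical current samples
bad = np.concatenate([rng.normal(1.00, 0.05, 2500),   # normal operation
                      rng.normal(1.15, 0.05, 2500)])  # persistent shift in activity
mu, sigma = baseline_profile(good, window=200)
print(detect_change(bad, 200, mu, sigma).any())       # True once the change persists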