Biblio

Filters: Author is Jiang, Jun
2018-06-07
Jiang, Jun, Zhao, Xinghui, Wallace, Scott, Cotilla-Sanchez, Eduardo, Bass, Robert.  2017.  Mining PMU Data Streams to Improve Electric Power System Resilience. Proceedings of the Fourth IEEE/ACM International Conference on Big Data Computing, Applications and Technologies. :95–102.
Phasor measurement units (PMUs) provide high-fidelity situational awareness of electric power grid operations. PMU data are used in real time to inform wide-area state estimation, monitor area control error, and detect events. As PMU data become more reliable, these devices are finding roles within control systems such as demand response programs and early fault detection systems. As with other cyber-physical systems, maintaining data integrity and security is a significant challenge for power system operators. In this paper, we present a comprehensive study of multiple machine learning techniques for detecting malicious data injection within PMU data streams. The two datasets used in this study come from the Bonneville Power Administration's PMU network and an inter-university PMU network spanning three universities in the U.S. Pacific Northwest, and they contain data from both the transmission and distribution levels. Our results show that both SVM and ANN are generally effective in detecting spoofed data, and that TensorFlow, a newly released tool, shows potential for distributing the training workload and achieving higher performance. We expect these results to inform future work on adopting machine learning and data analytics techniques in the electric power industry.
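To make the kind of approach described above concrete, the following is a minimal, hypothetical sketch of an SVM-based detector for spoofed PMU samples. The feature set (voltage magnitude, phase angle, frequency), the synthetic data, and the scikit-learn pipeline are illustrative assumptions only; they are not the authors' datasets, features, or implementation.

```python
# Illustrative sketch only: a minimal SVM classifier that flags spoofed
# PMU-style samples. Feature choices and data are invented for demonstration.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Synthetic "normal" samples: [voltage magnitude (pu), phase angle (rad), frequency (Hz)]
normal = np.column_stack([
    rng.normal(1.0, 0.01, 2000),
    rng.normal(0.0, 0.05, 2000),
    rng.normal(60.0, 0.02, 2000),
])

# Synthetic "spoofed" samples with an injected bias and drift
spoofed = np.column_stack([
    rng.normal(1.0, 0.01, 2000) + rng.normal(0.05, 0.02, 2000),
    rng.normal(0.0, 0.05, 2000) + 0.2,
    rng.normal(60.0, 0.02, 2000) + rng.normal(0.1, 0.05, 2000),
])

X = np.vstack([normal, spoofed])
y = np.concatenate([np.zeros(len(normal)), np.ones(len(spoofed))])  # 1 = spoofed

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Scale features, then fit an RBF-kernel SVM
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)

print(classification_report(y_test, clf.predict(X_test),
                            target_names=["normal", "spoofed"]))
```

In the paper's setting, a pipeline of this shape would be trained on labeled PMU measurements from the real networks rather than synthetic samples; an ANN or a TensorFlow-based distributed model could be substituted for the SVM stage.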
2017-04-10
Burcham, Morgan, Al-Zyoud, Mahran, Carver, Jeffrey C., Alsaleh, Mohammed, Du, Hongying, Gilani, Fida, Jiang, Jun, Rahman, Akond, Kafalı, Özgür, Al-Shaer, Ehab et al.  2017.  Characterizing Scientific Reporting in Security Literature: An Analysis of ACM CCS and IEEE S&P Papers. Proceedings of the Hot Topics in Science of Security: Symposium and Bootcamp. :13–23.

Scientific advancement is fueled by solid fundamental research, followed by replication, meta-analysis, and theory building. To support such advancement, researchers and government agencies have been working towards a "science of security". As in other sciences, security science requires high-quality fundamental research addressing important problems and reporting approaches that capture the information necessary for replication, meta-analysis, and theory building. The goal of this paper is to aid security researchers in establishing a baseline of the state of scientific reporting in security through an analysis of indicators of scientific research as reported in top security conferences, specifically the 2015 ACM CCS and 2016 IEEE S&P proceedings. To conduct this analysis, we employed a series of rubrics to analyze the completeness of information reported in papers relative to the type of evaluation used (e.g., empirical study, proof, discussion). Our findings indicate that some important information, including explicit documentation of research objectives and threats to validity, is often missing from papers. Our findings also show a relatively small number of replications reported in the literature. We hope that this initial analysis will serve as a baseline against which we can measure the advancement of the science of security.
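As a rough illustration of the rubric-based tallying the abstract describes, the sketch below counts how often hypothetical rubric items are reported, grouped by evaluation type. The rubric items, paper records, and scores are invented for illustration and are not the authors' actual rubrics or data.

```python
# Illustrative sketch only: tally rubric completeness per evaluation type.
from collections import defaultdict

# Each record: (evaluation_type, {rubric_item: reported?}) -- hypothetical examples
papers = [
    ("empirical study", {"research objectives": True,  "threats to validity": False}),
    ("empirical study", {"research objectives": False, "threats to validity": False}),
    ("proof",           {"research objectives": True,  "threats to validity": True}),
    ("discussion",      {"research objectives": False, "threats to validity": False}),
]

# type -> item -> [reported_count, total_count]
counts = defaultdict(lambda: defaultdict(lambda: [0, 0]))
for eval_type, rubric in papers:
    for item, reported in rubric.items():
        counts[eval_type][item][1] += 1
        if reported:
            counts[eval_type][item][0] += 1

for eval_type, items in counts.items():
    print(eval_type)
    for item, (reported, total) in items.items():
        print(f"  {item}: {reported}/{total} papers reported")
```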
