Biblio

Filters: Keyword is fair information practice principles
2020-10-12
Foreman, Zackary, Bekman, Thomas, Augustine, Thomas, Jafarian, Haadi.  2019.  PAVSS: Privacy Assessment Vulnerability Scoring System. 2019 International Conference on Computational Science and Computational Intelligence (CSCI). :160–165.
Currently, the guidelines for business entities to collect and use consumer information from online sources are set forth by the Fair Information Practice Principles issued by the Federal Trade Commission in the United States. These guidelines are inadequate, outdated, and provide little protection for consumers. Moreover, many techniques exist to anonymize the stored data collected by large companies and governments. However, what does not exist is a framework capable of evaluating and scoring the effects of this information in the event of a data breach. In this work, a framework for scoring and evaluating the vulnerability of private data is presented. This framework is created to be used in parallel with currently adopted frameworks that score and evaluate other areas of software deficiencies, including CVSS and CWSS. It is dubbed the Privacy Assessment Vulnerability Scoring System (PAVSS) and quantifies the privacy-breach vulnerability an individual takes on when using an online platform. The framework is based on a set of hypotheses about user behavior, inherent properties of an online platform, and the usefulness of available data in performing a cyber attack. The weight each of these metrics carries within our model is determined by surveying cybersecurity experts. Finally, we test the validity of our user-behavior-based hypotheses, and indirectly our model, by analyzing user posts from a large Twitter data set.
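The abstract describes combining expert-weighted metrics into a single score, in the style of CVSS-like systems. The paper's actual metrics and weights are not given here, so the sketch below is purely illustrative: the metric names, weight values, and 0-10 rescaling are assumptions, not PAVSS's published formula.

```python
# Hypothetical sketch of a PAVSS-style weighted score.
# Metric names and weight values below are illustrative assumptions,
# not figures from the paper.

def weighted_score(metrics, weights):
    """Combine per-metric ratings (each in 0-1) into a 0-10 score.

    `weights` stand in for the expert-survey-derived importance values;
    the weighted average is rescaled to the 0-10 range familiar from
    CVSS-like scoring systems.
    """
    total_weight = sum(weights.values())
    weighted_sum = sum(metrics[name] * w for name, w in weights.items())
    return round(10 * weighted_sum / total_weight, 1)

# Illustrative inputs: hypothetical user-behavior and platform metrics.
weights = {"data_exposure": 0.40, "user_oversharing": 0.35, "attack_utility": 0.25}
metrics = {"data_exposure": 0.8, "user_oversharing": 0.5, "attack_utility": 0.9}

print(weighted_score(metrics, weights))  # 7.2
```

A normalized weighted average keeps the score stable if metrics are added or reweighted after a new expert survey, since the result is always bounded by the 0-10 scale.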
2017-05-18
Landwehr, Carl E..  2016.  How Can We Enable Privacy in an Age of Big Data Analytics? Proceedings of the 2016 ACM on International Workshop on Security And Privacy Analytics. :47–47.

Even though some seem to think privacy is dead, we are all still wearing clothes, as Bruce Schneier observed at a recent conference on surveillance [1]. Yet big data and big data analytics are leaving some of us feeling a bit more naked than before. This talk will provide some personal observations on privacy today and then outline some research areas where progress is needed to enable society to gain the benefits of analyzing large datasets without giving up more privacy than necessary. Not since the early 1970s, when computing pioneer Willis Ware chaired the committee that produced the initial Fair Information Practice Principles [2], has privacy been so much in the U.S. public eye. Snowden's revelations, as well as a growing awareness that merely living our lives seems to generate an expanding "digital exhaust," have triggered many workshops and meetings. A national strategy for privacy research is in preparation by a Federal interagency group. The ability to analyze large datasets rapidly and to extract commercially useful insights from them is spawning new industries. Must this industrial growth come at the cost of substantial privacy intrusions?