ICSI

news

Visible to the public "Access Management Issues May Create Security Holes"

According to a study by the security vendor strongDM, which polled 600 IT, security, and DevOps workers, access restrictions meant to secure corporate systems may have the adverse effect of prompting employees to find workarounds and share credentials with co-workers, creating potential security vulnerabilities. The study found that in many cases, users will find alternative ways to reach their containers, cloud services, and other important tools when the managed company channels do not give them access. The problem stems from a natural tension between security requirements and the deadline pressure employees face: while executives and managers press IT administrators to update network services to the most recent versions and to implement secure, well-maintained access protocols, end-users, particularly developers and DevOps teams who rely on stored code and containers, still need access to those resources.

The survey found that end-users need about 15 minutes of access per day to get the data they need for work, yet nearly 39 percent of administrators polled said that simply connecting new tools to their existing access management systems takes several days. While new systems are being integrated with access management controls, end-users still have deadlines and projects to complete, so they are likely to operate outside those controls. Workarounds can include directly accessing a cloud service or system with personal credentials or even a shared login. Of those polled, 55 percent said they had seen their teams maintain a backdoor access method, and 53 percent said they had shared credentials to important services. This is where major security risks emerge, as those credentials become vulnerable to hackers through account theft, malware, or other common methods. This article continues to discuss key findings from strongDM's study regarding how access management issues can create security vulnerabilities.

TechTarget reports "Access Management Issues May Create Security Holes"

file

Perspectives of Stakeholders in Data Governance

ABSTRACT

event

Summer'21 Science of Security Quarterly Lablet Meeting
Jul 13, 2021 10:00 am - Jul 14, 2021 2:30 pm CDT

The Summer'21 Science of Security Quarterly Lablet Meeting will be hosted by Carnegie Mellon University on July 13 and 14, 2021. The theme of the meeting is AI and Machine Learning. Day 1 will begin with a panel on the Science of Security Hard Problems. Following the panel, there will be briefs from the Lablets; Day 2 will continue with further briefs from the Lablets.

The meeting will be virtual. To gain access to the meeting, please register here: https://cps-vo.org/LabletQTRLY/2021/CMU-register

group_project

Scalable Privacy Analysis

One major shortcoming of the current "notice and consent" privacy framework is that the constraints for data usage stated in policies--be they stated privacy practices, regulation, or laws--cannot easily be compared against the technologies that they govern. To that end, we are developing a framework to automatically compare policy against practice. Broadly, this involves identifying the relevant data usage policies and practices in a given domain, then measuring the real-world exchanges of data restricted by those rules.
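As a purely illustrative sketch of the comparison step, assuming policy rules that name a data type and its permitted recipients, and flow records captured by some measurement tool, the Python below flags observed exchanges that no declared rule permits. The class names, fields, and example values are hypothetical placeholders, not the project's actual framework:

    # Hypothetical sketch: flag observed data flows that are not permitted
    # by the declared policy rules. Rule and flow fields are illustrative.

    from dataclasses import dataclass
    from typing import List

    @dataclass(frozen=True)
    class PolicyRule:
        data_type: str                 # e.g. "location", "contacts"
        allowed_recipients: frozenset  # domains permitted to receive this data type

    @dataclass(frozen=True)
    class ObservedFlow:
        app: str         # app or service that sent the data
        data_type: str   # kind of data observed in the exchange
        recipient: str   # destination domain

    def find_violations(rules: List[PolicyRule], flows: List[ObservedFlow]) -> List[ObservedFlow]:
        """Return observed flows that no declared rule permits."""
        permitted = {r.data_type: r.allowed_recipients for r in rules}
        return [f for f in flows
                if f.recipient not in permitted.get(f.data_type, frozenset())]

    if __name__ == "__main__":
        rules = [PolicyRule("location", frozenset({"maps.example.com"}))]
        flows = [ObservedFlow("weather_app", "location", "ads.example.net")]
        for v in find_violations(rules, flows):
            print(f"{v.app} sent {v.data_type} to {v.recipient}, which no policy rule allows")

In practice, the hard parts the project addresses lie around this step: extracting such rules from stated privacy practices, regulation, and law, and mapping real-world data exchanges onto them.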

group_project

Contextual Integrity for Computer Systems

Despite the success of Contextual Integrity (see the project "Operationalizing Contextual Integrity"), its uptake by computer scientists has been limited because the philosophical framework does not meet them on their own terms. In this project we will both refine Contextual Integrity (CI) to better fit the problems computer scientists face and express it in the mathematical terms they expect.

group_project

Operationalizing Contextual Integrity

According to Nissenbaum's theory of contextual integrity (CI), protecting privacy means ensuring that personal information flows appropriately; it does not mean that no information flows (e.g., confidentiality), or that it flows only if the information subject allows it (e.g., control). Flow is appropriate if it conforms to legitimate, contextual informational norms. Contextual informational norms prescribe information flows in terms of five parameters: actors (sender, subject, recipient), information types, and transmission principles.
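To make the five-parameter structure concrete, here is a minimal sketch assuming a simple wildcard-matching semantics that is not part of Nissenbaum's theory; the class names, the example healthcare norm, and the matching rule are illustrative assumptions introduced here:

    # Illustrative encoding of a contextual informational norm and a flow;
    # a flow is appropriate in a context if it conforms to at least one of
    # the context's norms. All names and the example norm are hypothetical.

    from dataclasses import dataclass
    from typing import List

    @dataclass(frozen=True)
    class Flow:
        sender: str
        subject: str
        recipient: str
        info_type: str
        transmission_principle: str  # e.g. "with consent", "under confidentiality"

    @dataclass(frozen=True)
    class Norm:
        sender: str
        subject: str
        recipient: str
        info_type: str
        transmission_principle: str

        def matches(self, flow: Flow) -> bool:
            """A '*' in a norm parameter matches any value of that parameter."""
            return all(n in ("*", f) for n, f in zip(
                (self.sender, self.subject, self.recipient,
                 self.info_type, self.transmission_principle),
                (flow.sender, flow.subject, flow.recipient,
                 flow.info_type, flow.transmission_principle)))

    def appropriate(flow: Flow, context_norms: List[Norm]) -> bool:
        """A flow is appropriate iff it conforms to some norm of the context."""
        return any(norm.matches(flow) for norm in context_norms)

    # Example: a healthcare norm allowing a patient's physician to share test
    # results with a specialist under confidentiality.
    norms = [Norm("physician", "patient", "specialist", "test results", "under confidentiality")]
    flow = Flow("physician", "patient", "advertiser", "test results", "for payment")
    print(appropriate(flow, norms))  # False: no norm permits this flow

The sketch mirrors the definition above: appropriateness is judged against the context's legitimate norms rather than by confidentiality or subject control alone.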

group_project

Governance for Big Data

Privacy governance for Big Data is challenging--data may be rich enough to allow the inference of private information that has been removed, redacted, or minimized. We must protect against both malicious and accidental inference, both by data analysts and by automated systems. To do this, we are extending existing methods for controlling the inference risks of common analysis tools (drawn from literature on the related problem of nondiscriminatory data analysis). We are coupling these methods with auditing tools such as verifiably integral audit logs.
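The description does not say how the audit logs are made verifiably integral; one common construction, shown here only as a hedged sketch with hypothetical record fields, is a hash chain in which each entry commits to the previous entry's hash so that tampering is detectable when the log is verified:

    # Hypothetical sketch of a hash-chained audit log: each entry commits to
    # the previous entry's hash, so any later modification or reordering
    # breaks the chain and is detected by verify().

    import hashlib
    import json
    from typing import Dict, List

    def _entry_hash(prev_hash: str, record: Dict) -> str:
        payload = json.dumps({"prev": prev_hash, "record": record}, sort_keys=True)
        return hashlib.sha256(payload.encode("utf-8")).hexdigest()

    def append(log: List[Dict], record: Dict) -> None:
        """Append an analysis event (e.g. which analyst ran which query)."""
        prev_hash = log[-1]["hash"] if log else "0" * 64
        log.append({"record": record, "prev": prev_hash,
                    "hash": _entry_hash(prev_hash, record)})

    def verify(log: List[Dict]) -> bool:
        """Recompute the chain; return False if any entry was altered."""
        prev_hash = "0" * 64
        for entry in log:
            if entry["prev"] != prev_hash:
                return False
            if entry["hash"] != _entry_hash(prev_hash, entry["record"]):
                return False
            prev_hash = entry["hash"]
        return True

    log: List[Dict] = []
    append(log, {"analyst": "alice", "query": "SELECT AVG(age) FROM patients"})
    append(log, {"analyst": "bob", "query": "SELECT zip, COUNT(*) FROM patients GROUP BY zip"})
    print(verify(log))   # True
    log[0]["record"]["analyst"] = "mallory"
    print(verify(log))   # False: tampering detected

A log of this kind records which analyses were run, so that potentially risky inferences, whether malicious or accidental, can be reviewed after the fact.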

group_project

Designing for Privacy

Methods, approaches, and tools that identify the correct conceptualization of privacy early in the design and engineering process are important. For example, early whole-body imaging technology for airport security was analyzed by the Department of Homeland Security through a Privacy Impact Assessment focused on the collection of personally identifiable information (PII); the assessment found that the images of individuals' bodies were not detailed enough to constitute PII and therefore would not pose a privacy problem.