Biblio
In order to be more environmentally friendly, many parts of everyday life are being electrified to reduce the use of fossil fuels, as can be seen in the growing number of electric vehicles. Electrification only benefits the environment, however, if the electricity is produced in an environmentally friendly way from renewable sources. Once green electrical power is produced, it still needs to be transported to where it is needed, which is not necessarily near the production site. In China, one way to accomplish this transport is High Voltage Direct Current (HVDC) technology. This means that the current has to be converted to DC before being transported to the end user, so the converter stations are of great importance for grid security, and precise monitoring of the stations is necessary. Ideally, this could be accomplished with wireless sensor nodes with an autarkic energy supply, in which a thermoelectric generator (TEG) could play a role. To assess the power generated in this specific environment, a simulation is highly desirable for evaluating the power gained from the temperature differences in the converter station. This paper proposes a method to simulate the generated power by combining a model of the generator with a Computational Fluid Dynamics (CFD) model of the converter.
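To make the generator half of such a coupled simulation concrete, the sketch below computes the steady-state electrical power a TEG delivers from a given temperature difference via the Seebeck effect. All parameter values are illustrative assumptions, not taken from the paper; in the proposed method, the temperature difference would come from the CFD model of the converter station.

```python
# Minimal sketch of TEG output power from a temperature difference.
# alpha, r_internal, and r_load are assumed illustrative values.

def teg_power(dT, alpha=0.05, r_internal=2.0, r_load=2.0):
    """Steady-state TEG output power via the Seebeck effect.

    dT         -- temperature difference across the module in kelvin
    alpha      -- effective Seebeck coefficient of the module in V/K
    r_internal -- internal electrical resistance of the module in ohm
    r_load     -- load resistance in ohm (a matched load maximizes power)
    """
    v_open = alpha * dT                       # open-circuit voltage
    current = v_open / (r_internal + r_load)  # current in the series circuit
    return current**2 * r_load                # power dissipated in the load

for dT in (5, 10, 20, 40):
    print(f"dT = {dT:3d} K -> P = {teg_power(dT) * 1e3:7.2f} mW")
```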
Identifying the services that constitute traffic from given IP network flows is essential to various applications, such as quality of service (QoS) management and the prevention of security issues. Typical methods for achieving this objective identify services based on IP addresses and port numbers. However, such methods are not sufficiently accurate and require improvement. Deep Packet Inspection (DPI) is one of the most promising methods for improving identification accuracy. In addition, many current IP flows are encrypted using Transport Layer Security (TLS), so identification methods must be able to analyze flows encrypted with TLS. For that reason, a service identification method based on DPI and n-grams that focuses only on the non-encrypted parts of the TLS session establishment was proposed. However, there is room for improvement in identification accuracy, because this method analyzes all the non-encrypted parts, including Random Values, without protocol analysis. In this paper, we propose a method for identifying the service behind given IP flows based on analysis of the Server Name Indication (SNI). The proposed method clusters flows according to the value of the SNI and identifies services from the occurrences of all clusters. Our evaluations, which involve identifying services on Google and Yahoo sites, demonstrate that the proposed method can identify services more accurately than the existing method.
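A minimal sketch of the clustering step is shown below, assuming the SNI values have already been parsed from the unencrypted server_name extension of each flow's TLS ClientHello. The flow data, labels, and names are illustrative assumptions, not the paper's data set or exact procedure.

```python
# Cluster flows by SNI value, then identify each cluster's service from labels.
from collections import defaultdict

# (flow_id, sni) pairs, e.g. parsed from the TLS ClientHello server_name extension
flows = [
    (1, "mail.google.com"), (2, "mail.google.com"),
    (3, "www.youtube.com"), (4, "news.yahoo.co.jp"),
]

# labeled examples mapping an SNI value to a known service (assumed training data)
training = {"mail.google.com": "Gmail", "www.youtube.com": "YouTube",
            "news.yahoo.co.jp": "Yahoo News"}

# cluster flows by their SNI value
clusters = defaultdict(list)
for flow_id, sni in flows:
    clusters[sni].append(flow_id)

# identify each cluster's service from the labels that occur in it
for sni, members in clusters.items():
    label = training.get(sni, "unknown")
    print(f"SNI {sni!r}: flows {members} -> service {label}")
```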
The era of information technology has, unfortunately, contributed to a tremendous rise in the number of criminal activities. However, digital artifacts can be utilized to convict cybercriminals and expose their activities. The science of digital forensics is concerned with all aspects of cybercrime; it seeks digital evidence, following standard methodologies so that the evidence can be admitted in courtrooms. This paper focuses on memory forensics because of the unique artifacts memory holds: it contains information about the current state of systems and applications. Moreover, an application's data explains how a criminal had been interacting with the application just before the memory was acquired. Memory forensics at the application level is currently ad hoc and cumbersome, and targeted analysis of specific applications is what forensic researchers and practitioners are currently striving to provide. This paper suggests a general solution for investigating any application. Our solution utilizes an application's data structures and variable information in the investigation process, since an application's data must be stored and retrieved by means of variables. Data structure and variable information can be generated by compilers for debugging purposes. We show that this application information is a valuable resource to the investigator.
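As a concrete illustration of compiler-generated variable information, the sketch below lists the variable names recorded in a binary's DWARF debug data. It assumes an ELF binary compiled with debug information and uses the third-party pyelftools library; neither the tool choice nor the path is prescribed by the paper.

```python
# Minimal sketch: enumerate variables from DWARF debug info with pyelftools.
from elftools.elf.elffile import ELFFile

def list_debug_variables(path):
    """Print the names of variables the compiler recorded in DWARF info."""
    with open(path, "rb") as f:
        elf = ELFFile(f)
        if not elf.has_dwarf_info():
            print("no DWARF debug info present")
            return
        dwarf = elf.get_dwarf_info()
        for cu in dwarf.iter_CUs():          # one CU per compiled source file
            for die in cu.iter_DIEs():       # debug information entries
                if die.tag == "DW_TAG_variable":
                    name = die.attributes.get("DW_AT_name")
                    if name:
                        print(name.value.decode())

list_debug_variables("/usr/bin/some_app")  # hypothetical target binary
```

An investigator could then map these names and their types onto the acquired memory image to interpret an application's state.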
A prioritized cyber defense remediation plan is critical for effective risk management in cyber-physical systems (CPS). The increased integration of Information Technology (IT) and Operational Technology (OT) in CPS has led to the need to identify the critical assets which, when affected, will impact resilience and safety. In this work, we propose a methodology for a prioritized cyber risk remediation plan that balances operational resilience and economic loss (safety impacts) in CPS. We present a platform for modeling and analyzing the effect of cyber threats and random system faults on the safety of CPS that could lead to catastrophic damage. We propose to develop a data-driven attack-graph- and fault-graph-based model to characterize the exploitability and impact of threats in CPS, and we develop an operational impact assessment to quantify the damage. Finally, we propose the development of a strategic response decision capability that recommends optimal mitigation actions and policies balancing the trade-off between operational resilience (Tactical Risk) and Strategic Risk.
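The sketch below illustrates one simple way such a prioritization could work: each threat is scored by its exploitability weighted against a blend of tactical (operational) and strategic (economic/safety) impact. The scoring scheme, weights, and threat entries are illustrative assumptions, not the paper's actual model.

```python
# Illustrative prioritization of remediation actions in a CPS setting.

threats = [
    # (name, exploitability 0-1, operational impact 0-1 (tactical),
    #  economic/safety impact 0-1 (strategic)) -- all values assumed
    ("PLC firmware exploit", 0.9, 0.8, 0.9),
    ("HMI default credentials", 0.7, 0.5, 0.3),
    ("Historian SQL injection", 0.4, 0.2, 0.6),
]

W_TACTICAL, W_STRATEGIC = 0.6, 0.4  # assumed trade-off weights

def priority(threat):
    """Score = exploitability x weighted blend of tactical and strategic risk."""
    _, exploitability, tactical, strategic = threat
    return exploitability * (W_TACTICAL * tactical + W_STRATEGIC * strategic)

for t in sorted(threats, key=priority, reverse=True):
    print(f"{t[0]:26s} priority = {priority(t):.2f}")
```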
Information, not just data, is key to today's global challenges. Solving these challenges requires not only advancing geospatial and big data analytics but also new analysis and decision-making environments that enable reliable decisions from trustworthy, understandable information, going beyond current approaches to machine learning and artificial intelligence. These environments are successful when they effectively couple human decision making with advanced, guided spatial analytics in human-computer collaborative discourse and decision making (HCCD). Our HCCD approach builds upon visual analytics, natural scale templates, traceable information, human-guided analytics, and explainable and interactive machine learning, focusing on empowering the decision maker through interactive visual spatial analytic environments where non-digital human expertise and experience can be combined with state-of-the-art, transparent analytical techniques. When we combine this approach with real-world application-driven research, not only does the pace of scientific innovation accelerate, but impactful change occurs. I will describe how we have applied these techniques to challenges in sustainability, security, resiliency, public safety, and disaster management.
FastChain is a simulator built on NS-3 that simulates networked battlefield scenarios with military applications, connecting tanks, soldiers, and drones to form the Internet of Battlefield Things (IoBT). Computing, storage, and communication resources in IoBT are limited in certain situations, and under these circumstances they must be carefully combined to handle the task at hand and accomplish the mission. The FastChain simulator uses a sharding approach to combine the resources of IoBT devices efficiently by identifying the best set of IoBT devices for a given scenario; this set of devices then collaborates through sharding-enabled blockchain technology (a simple selection heuristic is sketched below). Interested researchers, policy makers, and developers can download and use the FastChain simulator to design, develop, and evaluate blockchain-enabled IoBT scenarios, helping them make robust, trustworthy, and informed decisions in mission-critical IoBT environments.
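The sketch below shows one plausible way to pick a resource-sufficient subset of devices for a shard: greedily preferring devices that offer the most resources per unit of energy cost until the scenario's requirements are met. The heuristic, device attributes, and thresholds are assumptions for illustration, not FastChain's actual selection algorithm.

```python
# Illustrative greedy selection of IoBT devices for a shard.

devices = [
    # (name, compute units, storage GB, energy cost) -- assumed values
    ("drone-1", 4, 8, 3.0),
    ("tank-2", 10, 64, 5.0),
    ("soldier-kit-3", 2, 16, 1.0),
    ("drone-4", 3, 4, 2.0),
]

need_compute, need_storage = 12, 70  # assumed scenario requirements

# prefer devices that offer the most resources per unit of energy
ranked = sorted(devices, key=lambda d: (d[1] + d[2]) / d[3], reverse=True)

shard, compute, storage = [], 0, 0
for name, c, s, _ in ranked:
    if compute >= need_compute and storage >= need_storage:
        break  # requirements met; stop adding devices
    shard.append(name)
    compute += c
    storage += s

print("shard members:", shard, "| compute:", compute, "| storage:", storage)
```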
Anomaly detection generally involves extracting features from entities' or users' properties and designing anomaly detection models using machine learning or deep learning algorithms. However, considering only entities' property information can lead to high false-positive rates. We posit the importance of also considering the connections or relationships between entities when detecting anomalous behaviors and the associated threat groups. Therefore, in this paper, we design a graph convolutional network (GCN) based anomaly detection model to detect anomalous behaviors of users and malicious threat groups. The GCN model characterizes both entities' properties and the structural information between them as graphs, which allows the anomaly detection model to detect both anomalous behaviors of individuals and the associated anomalous groups. We then evaluate the proposed model using a real-world insider threat data set. The results show that the proposed model outperforms several state-of-the-art baseline methods (i.e., random forest, logistic regression, SVM, and CNN). Moreover, the proposed model can also be applied to other anomaly detection applications.
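To make the core mechanism concrete, the sketch below performs a single GCN propagation step in the style of Kipf and Welling: node features are mixed with neighbor features through a symmetrically normalized adjacency matrix. The toy graph, feature dimensions, and random weights are illustrative, not the paper's trained model.

```python
# One GCN layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 X W)
import numpy as np

rng = np.random.default_rng(0)

A = np.array([[0, 1, 0, 0],   # adjacency: which users interact with which
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
X = rng.normal(size=(4, 5))   # per-user behavioral features (assumed)

A_hat = A + np.eye(4)                        # add self-loops
d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(1))
A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]  # normalize

W = rng.normal(size=(5, 3))                  # learnable layer weights
H = np.maximum(A_norm @ X @ W, 0.0)          # ReLU activation
print(H)  # embeddings that blend each user's features with its neighbors'
```

Because each row of H now encodes both a user's own behavior and that of connected users, a downstream classifier can flag anomalous individuals and the groups they belong to.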
Continued advances in IoT technology have prompted new investigation into its usage for military operations, both to augment and complement existing military sensing assets and to support next-generation artificial intelligence and machine learning systems. Under the emerging Internet of Battlefield Things (IoBT) paradigm, current operational conditions necessitate the development of novel security techniques, centered on the establishment of trust for individual assets and support for the resilience of broader systems. To advance current IoBT efforts, a collection of previously developed cybersecurity techniques is reviewed for applicability to the conditions presented by IoBT operational environments (e.g., diverse asset ownership, degraded networking infrastructure, adversary activities), using supporting case study examples. The techniques covered focus on two themes: (1) supporting trust assessment for known/unknown IoT assets, and (2) ensuring continued trust of known IoT assets and IoBT systems.
We classify .NET files as either benign or malicious by examining directed graphs derived from the set of functions comprising the given file. Each graph is viewed probabilistically as a Markov chain in which each node represents a code block of the corresponding function, and by computing the PageRank vector (Perron vector with transport), a probability measure can be defined over the nodes of the given graph. Each graph is vectorized by computing Lebesgue antiderivatives of hand-engineered functions defined on the vertex set of the given graph against the PageRank measure. Files are subsequently vectorized by aggregating the set of vectors corresponding to the set of graphs resulting from decompiling the given file. The result is a fast, intuitive, and easy-to-compute glass-box vectorization scheme, which can be leveraged to train a standalone classifier or to augment an existing feature space. We refer to this vectorization technique as PageRank Measure Integration Vectorization (PMIV). We demonstrate the efficacy of PMIV by training a vanilla random forest on 2.5 million samples of decompiled .NET, evenly split between benign and malicious, from our in-house corpus, and compare this model to a baseline model that leverages a text-only feature space. The median time needed for decompilation and scoring was 24 ms. Code is available at https://github.com/gtownrocks/grafuple.
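The sketch below illustrates the core PMIV idea on a toy function graph: compute PageRank over the code-block graph, then integrate hand-engineered node functions against that measure. For brevity it computes plain integrals (sums of f(v) weighted by the PageRank mass) rather than the paper's full Lebesgue antiderivative construction, and the graph and node functions are assumptions, not the paper's actual features.

```python
# Toy PMIV-style vectorization: PageRank measure + integrals of node functions.
import networkx as nx

# toy control-flow-style digraph: nodes are code blocks of one function
G = nx.DiGraph([(0, 1), (1, 2), (1, 3), (2, 4), (3, 4), (4, 1)])

pi = nx.pagerank(G, alpha=0.85)  # PageRank vector (Perron vector with teleport)

# hand-engineered functions on the vertex set (illustrative structural features)
features = {
    "out_degree": lambda v: G.out_degree(v),
    "in_degree": lambda v: G.in_degree(v),
    "is_branch": lambda v: float(G.out_degree(v) > 1),
}

# integrate each function against the PageRank measure: sum_v f(v) * pi(v)
vector = {name: sum(f(v) * pi[v] for v in G) for name, f in features.items()}
print(vector)  # one fixed-length vector per function graph
```

Per-graph vectors like this would then be aggregated across all graphs obtained by decompiling a file to produce the file-level feature vector.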
IoT devices introduce unprecedented threats into home and professional networks. Because they fail to adhere to security best practices, they are broadly exploited by malicious actors to build botnets or steal sensitive information. Their adoption challenges established security standards, as classic security measures are often inappropriate for securing them. This is even more problematic in sensitive environments where the presence of insecure IoT devices can be exploited to bypass strict security policies. In this paper, we demonstrate an attack against a highly secured network using a Bluetooth smart bulb; the attack allows a malicious actor to take advantage of the bulb to exfiltrate data from an air-gapped network.