Biblio
Cover time measures the time (or number of steps) required for a mobile agent to visit each node in a network (graph) at least once. A short cover time is important for search or foraging applications that require mobile agents to quickly inspect or monitor nodes in a network, such as providing situational awareness or security. Speed can be achieved if details about the graph are known or if the agent maintains a history of visited nodes; however, these requirements may not be feasible for agents with limited resources, are difficult to meet in dynamic graph topologies, and do not scale easily to large networks. This paper introduces a set-based form of heading (directional bias) that allows an agent to explore any connected graph, static or dynamic, more efficiently. When deciding the next node to visit, agents are discouraged from visiting nodes that neighbor both their previous and current locations. Modifying a traditional movement method, e.g., a random walk, with this concept encourages an agent to move toward nodes that are less likely to have been previously visited, reducing cover time. Simulation results with grid, scale-free, and minimum-distance graphs demonstrate that heading can consistently reduce cover time compared to non-heading movement techniques.
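A minimal sketch of the heading idea, assuming the graph is an adjacency dict mapping each node to its neighbor set; the `penalty` weight and the choice to also down-weight the previous node itself are illustrative assumptions, not the paper's exact formulation:

```python
import random

def heading_step(graph, prev, curr, penalty=0.25):
    """Choose the next node from curr's neighbors, down-weighting
    candidates that neighbor both prev and curr (assumed more likely
    to have been visited already)."""
    neighbors = list(graph[curr])
    weights = [penalty if prev is not None and (n == prev or prev in graph[n])
               else 1.0
               for n in neighbors]
    return random.choices(neighbors, weights=weights, k=1)[0]

def cover_time(graph, start, penalty=0.25):
    """Number of steps until every node has been visited at least once."""
    visited, prev, curr, steps = {start}, None, start, 0
    while len(visited) < len(graph):
        prev, curr = curr, heading_step(graph, prev, curr, penalty)
        visited.add(curr)
        steps += 1
    return steps
```

With `penalty=1.0` this degenerates to an unbiased random walk, which makes it easy to compare cover times with and without the heading bias on the same graph.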
One important direction of research in situational awareness is the implementation of visual analytics techniques that can be applied efficiently when working with big security data in critical operational domains. The paper considers a visual analytics technique for displaying a set of security metrics used to assess overall network security status and evaluate the efficiency of protection mechanisms. The technique can assist in solving security tasks that are important for security information and event management (SIEM) systems. The suggested approach is suitable for displaying the security metrics of large networks and supports historical analysis of the data. To demonstrate and evaluate the usefulness of the proposed technique, we implemented a use case corresponding to the Olympic Games scenario.
This article discusses online deception, which the authors define as a deliberate act intended to mislead others, where the recipients are neither made aware of nor expect that such an act is taking place, and where the deceiver's goal is to transfer a false belief to those deceived. Understanding how online deception works through social media and future technologies remains a significant challenge. Addressing this challenge requires designing social media applications with rules and norms that our traditional physical space does not have.
Cyber scanning refers to the task of probing enterprise networks or Internet-wide services, searching for vulnerabilities or ways to infiltrate IT assets. This activity is often the primary methodology adopted by attackers prior to launching a targeted cyber attack. Hence, it is of paramount importance to research and adopt methods for the detection and attribution of cyber scanning. Nevertheless, with the surge of complex offered services on one side and the proliferation of hackers' refined, advanced, and sophisticated techniques on the other, the task of containing cyber scanning poses serious issues and challenges. Furthermore, a cyber phenomenon dubbed cyber scanning campaigns has recently flourished: scanning techniques that are highly distributed, possess composite stealth capabilities, and are highly coordinated, rendering almost all current detection techniques ineffective. This paper presents a comprehensive survey of the cyber scanning topic. It categorizes cyber scanning by elaborating on its nature, strategies, and approaches. It also provides the reader with a classification and an exhaustive review of its techniques. Moreover, it offers a taxonomy of the current literature, focusing on distributed cyber scanning detection methods. To tackle cyber scanning campaigns, this paper uniquely reports on the analysis of two recent cyber scanning incidents. Finally, several concluding remarks are discussed.
Zero-day polymorphic worms pose a serious threat to Internet security. With their ability to propagate rapidly, these worms increasingly threaten Internet hosts and services. Not only can they exploit unknown vulnerabilities, but they can also change their own representations on each new infection or encrypt their payloads using a different key per infection. The signatures of the same worm thus have many variations, making fingerprinting very difficult. Therefore, signature-based defenses and traditional security layers miss these stealthy and persistent threats. This paper provides a detailed survey outlining research efforts on the detection of modern zero-day malware in the form of zero-day polymorphic worms.
Our vision in this paper is that agency, as the individual ability to intervene in and tailor the system, is a crucial element in building trust in IoT technologies. Following up on this vision, we first address the issue of agency, namely the individual capability to make free decisions, as a relevant driver in building trusted human-IoT relations, and discuss how agency should be embedded in digital systems. We then present the main challenges posed by existing approaches to implementing this vision. Finally, we describe our proposal for a model-based approach that realizes the agency concept, including a prototype implementation.
Specifics of an alias-free digitizer application for compressed digitizing and recording of wideband signals are considered. Signal sampling in this case is performed on the basis of picosecond-resolution event timing: the digitizer is actually a subsystem of the Event Timer A033-ET, and the specific events that are detected and then timed are the crossings of the signal and a reference sine wave. The approach used to develop this subsystem is described, and some results of experimental studies are given.
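A rough sketch of the crossing-based sampling idea, under the assumption that each timed event marks an instant when the input signal crosses a known reference sine wave, so that evaluating the reference at that instant yields a (nonuniform) sample of the signal; the reference parameters and timestamps below are hypothetical:

```python
import math

# Hypothetical, assumed-calibrated reference sine-wave parameters.
REF_AMPLITUDE = 1.0    # volts
REF_FREQUENCY = 10e6   # hertz

def samples_from_crossings(crossing_times):
    """At a signal/reference crossing the two waveforms are equal, so
    the reference's value at the timed instant is a sample of the
    signal; picosecond timing resolution makes these samples precise."""
    return [(t, REF_AMPLITUDE * math.sin(2 * math.pi * REF_FREQUENCY * t))
            for t in crossing_times]

# Example timestamps (in seconds) as an event timer might report them.
for t, v in samples_from_crossings([1.02e-7, 2.31e-7, 3.57e-7]):
    print(f"t = {t:.3e} s  sample = {v:+.4f} V")
```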
We present a new paradigm for unification arising out of a technique commonly used in cryptographic protocol analysis tools that employ unification modulo equational theories. This paradigm relies on: (i) a decomposition of an equational theory into (R, E), where R is confluent, terminating, and coherent modulo E, and (ii) reducing unification problems to a set of problems s =? t under the constraint that t remains R/E-irreducible. We call this method asymmetric unification. We first present a general-purpose generic asymmetric unification algorithm, and then outline an approach for converting special-purpose conventional unification algorithms to asymmetric ones, demonstrating it for exclusive-or with uninterpreted function symbols. We demonstrate how asymmetric unification can improve performance by running the algorithm on a set of benchmark problems. We also give results on the complexity and decidability of asymmetric unification.
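A toy illustration of the decomposition for exclusive-or, assuming the standard presentation R = {x ⊕ x → 0, x ⊕ 0 → x} modulo E = AC of ⊕: the sketch normalizes flattened XOR terms and checks the R/E-irreducibility constraint that asymmetric unification imposes on right-hand sides (it is not itself a unification algorithm):

```python
from collections import Counter

ZERO = "0"  # the unit of exclusive-or

def xor_normalize(atoms):
    """Normal form of a flattened XOR term, given as a list of atom
    names, under R modulo AC: cancel pairs of equal atoms (x+x -> 0)
    and drop units (x+0 -> x); an empty result denotes 0."""
    counts = Counter(a for a in atoms if a != ZERO)
    nf = sorted(a for a, count in counts.items() if count % 2 == 1)
    return nf if nf else [ZERO]

def is_irreducible(atoms):
    """The asymmetric constraint: the term admits no R/E rewrite step,
    i.e., it already equals its own normal form up to AC reordering."""
    return sorted(atoms) == xor_normalize(atoms)

print(xor_normalize(["x", "y", "x"]))   # ['y']  (x + y + x -> y)
print(is_irreducible(["x", "y", "x"]))  # False  (reducible by x+x -> 0)
print(is_irreducible(["x", "y"]))       # True
```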
Atomic sets are a synchronization mechanism in which the programmer specifies the groups of data that must be accessed as a unit. The compiler can check this specification for consistency, detect deadlocks, and automatically add the primitives to prevent interleaved access. Atomic sets relieve the programmer from the burden of recognizing and pruning execution paths which lead to interleaved access, thereby reducing the potential for data races. However, manually converting programs from lock-based synchronization to atomic sets requires reasoning about the program's concurrency structure, which can be a challenge even for small programs. Our analysis eliminates the challenge by automating the reasoning. Our implementation of the analysis allowed us to derive the atomic sets for large code bases, such as the Java collections framework, in a matter of minutes. The analysis is based on execution traces; assuming all traces reflect intended behavior, our analysis enables safe concurrency by preventing unobserved interleavings which may harbor latent Heisenbugs.
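A minimal sketch of the trace-based inference, under assumed simplifications: a trace is a list of (thread, unit-of-work, field) access events, and fields observed together within the same unit of work are merged into one candidate atomic set with union-find. The event format and grouping rule are illustrative, not the paper's actual analysis:

```python
from collections import defaultdict

def infer_atomic_sets(trace):
    """Merge fields that are accessed within the same unit of work
    into candidate atomic sets (union-find with path compression)."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    fields_by_unit = defaultdict(list)
    for thread, unit, field in trace:
        fields_by_unit[(thread, unit)].append(field)

    for fields in fields_by_unit.values():
        find(fields[0])                        # register even singletons
        for f in fields[1:]:
            parent[find(fields[0])] = find(f)  # union

    groups = defaultdict(set)
    for f in parent:
        groups[find(f)].add(f)
    return list(groups.values())

# Hypothetical trace of a hash-table-like class.
trace = [("T1", "put",    "table"), ("T1", "put",    "size"),
         ("T2", "get",    "table"),
         ("T2", "resize", "capacity"), ("T2", "resize", "table")]
print(infer_atomic_sets(trace))  # e.g. [{'table', 'size', 'capacity'}]
```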