Biblio
The limited battery lifetime and rapidly increasing functionality of portable multimedia devices demand energy-efficient designs. The filters employed in these devices are mainly based on Gaussian smoothing, which is slow and severely affects performance. In this paper, we propose a novel energy-efficient approximate 2D Gaussian smoothing filter (2D-GSF) architecture by exploiting "nearest pixel approximation" and rounding off the Gaussian kernel coefficients. The proposed architecture significantly improves the Speed-Power-Area-Accuracy (SPAA) metrics in designing energy-efficient filters. The efficacy of the proposed approximate 2D-GSF is demonstrated on a real application, edge detection. The simulation results show 72%, 79% and 76% reductions in area, power and delay, respectively, with an acceptable 0.4 dB loss in PSNR as compared to the well-known approximate 2D-GSF.
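As an illustration of the kernel-rounding idea described above, the sketch below builds a small Gaussian kernel, rounds its coefficients to integers over a fixed scale (so a hardware datapath could replace multipliers with shifts and adds), and applies it to an image. The kernel size, scale factor, and padding scheme are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

def rounded_gaussian_kernel(size=3, sigma=1.0, scale=16):
    """Build a Gaussian kernel, then round its coefficients to small integers.

    Rounding relative to a fixed scale is one way an approximate hardware
    filter can avoid full-precision multipliers (assumed here for illustration).
    """
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    kernel = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    kernel /= kernel.sum()
    rounded = np.round(kernel * scale).astype(int)   # integer coefficients
    return rounded, int(rounded.sum())               # sum acts as the divisor

def approx_gaussian_smooth(image, size=3, sigma=1.0):
    """Apply the rounded-coefficient kernel with edge padding (integer arithmetic)."""
    coeffs, norm = rounded_gaussian_kernel(size, sigma)
    pad = size // 2
    padded = np.pad(image.astype(np.int64), pad, mode="edge")
    out = np.zeros(image.shape, dtype=np.int64)
    h, w = image.shape
    for i in range(h):
        for j in range(w):
            window = padded[i:i + size, j:j + size]
            out[i, j] = (window * coeffs).sum() // norm
    return out.astype(image.dtype)

# Toy usage on a random 8-bit image.
img = np.random.default_rng(0).integers(0, 256, size=(16, 16), dtype=np.uint8)
smoothed = approx_gaussian_smooth(img)
```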
In this paper, an optimization model of an automobile supply chain network with risks under fuzzy price is put forward. The supply chain network is composed of component suppliers, plants, and distribution centers. The total costs of the automobile supply chain consist of variable costs, fixed costs, and transportation costs. The objective of this study is to minimize the risk of total profits. In order to deal with this model, the paper puts forward an approximation method that transforms the continuous fuzzy problem into a discrete fuzzy problem. The model is solved using Cplex 12.6. The results show that Cplex 12.6 can solve this model effectively, that the expected value and lower semi-variance of total profits converge as the number of discretization points increases, and that the structure of the automobile supply chain network remains unchanged as the number of discretization points increases.
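A minimal sketch of the discretization step, assuming a triangular fuzzy price sampled at equally weighted alpha-cut bounds; the full network model solved with Cplex 12.6 is not reproduced, and the profit function below is a hypothetical stand-in.

```python
import numpy as np

def discretize_triangular(a, b, c, n_points):
    """Sample a triangular fuzzy number (a <= b <= c) at n_points alpha levels.

    Each alpha level contributes its lower and upper bound; equal weighting of
    the sampled points is a simplification used only for this illustration.
    """
    alphas = np.linspace(0.0, 1.0, n_points)
    lowers = a + alphas * (b - a)
    uppers = c - alphas * (c - b)
    points = np.concatenate([lowers, uppers])
    weights = np.full(points.size, 1.0 / points.size)
    return points, weights

def expected_value_and_lower_semivariance(profits, weights):
    """Risk measure on the discretized profits: lower semi-variance below the mean."""
    mean = np.dot(weights, profits)
    downside = np.minimum(profits - mean, 0.0)
    return mean, np.dot(weights, downside**2)

# Toy usage: profit is linear in the fuzzy price for a fixed sales volume.
prices, w = discretize_triangular(a=9.0, b=10.0, c=12.0, n_points=50)
profit = 1000 * prices - 8000          # hypothetical revenue minus costs
print(expected_value_and_lower_semivariance(profit, w))
```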
Compressed sensing (CS) or compressive sampling deals with the reconstruction of signals from limited observations/measurements far below the Nyquist rate requirement. This is essential in many practical imaging systems, as sampling at the Nyquist rate may not always be possible due to limited storage, slow sampling rates, or extremely expensive measurements, e.g., magnetic resonance imaging (MRI). Mathematically, CS addresses the problem of finding the root of an unknown distribution comprising both unknown and known observations. Robbins-Monro (RM) stochastic approximation, a non-parametric approach, is explored here as a solution to the CS reconstruction problem. A distance-based linear prediction using the observed measurements is performed to obtain the unobserved samples, followed by the addition of random noise acting as the residual (prediction error). A spatial-domain adaptive Wiener filter is then used to diminish the noise and to reveal the new features from the degraded observations. Extensive simulation results highlight the relative performance gain over the existing work.
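The sketch below shows only the generic Robbins-Monro root-finding iteration that such a reconstruction builds on; the distance-based prediction and adaptive Wiener filtering stages are not reproduced, and the step-size schedule and toy target function are assumptions for illustration.

```python
import numpy as np

def robbins_monro(noisy_g, x0, steps=500, a0=1.0):
    """Robbins-Monro stochastic approximation for finding a root of g.

    noisy_g(x) returns g(x) corrupted by zero-mean noise; step sizes
    a_n = a0 / (n + 1) satisfy the usual sum / square-sum conditions.
    """
    x = x0
    for n in range(steps):
        a_n = a0 / (n + 1)
        x = x - a_n * noisy_g(x)
    return x

# Toy usage: root of g(x) = x - 3 observed through additive Gaussian noise.
rng = np.random.default_rng(0)
estimate = robbins_monro(lambda x: (x - 3.0) + rng.normal(scale=0.5), x0=0.0)
print(estimate)   # close to 3.0
```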
In distributed wireless storage systems, the failed recovery probability depends not only on wireless channel conditions but also on the storage size of each distributed storage node. For efficient utilization of limited storage capacity, we asymptotically analyze the failed recovery probability of a distributed wireless storage system with a sum storage capacity constraint as the signal-to-noise ratio goes to infinity, and find the optimal storage allocation strategy across distributed storage nodes in terms of the asymptotic failed recovery probability. It is also shown that when the number of storage nodes is sufficiently large, the storage size required at each node to achieve a high exponential order of the failed recovery probability is not large.
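A toy Monte Carlo comparison of allocation strategies under a sum storage constraint; the unit file size, the independent link-success model, and the two candidate allocations below are illustrative assumptions rather than the paper's asymptotic analysis.

```python
import numpy as np

def failed_recovery_prob(allocation, p_success, trials=100_000, rng=None):
    """Monte Carlo estimate of the failed recovery probability.

    Recovery succeeds when the storage collected from the nodes whose wireless
    links succeed (each independently with probability p_success) reaches the
    unit file size; `allocation` sums to the total storage budget.
    """
    rng = rng or np.random.default_rng(0)
    allocation = np.asarray(allocation, dtype=float)
    ok = rng.random((trials, allocation.size)) < p_success
    collected = ok.astype(float) @ allocation
    return np.mean(collected < 1.0)

# Toy usage: budget T = 1.5 spread over 6 nodes, two candidate allocations.
uniform = [1.5 / 6] * 6                      # maximal spreading
concentrated = [0.75, 0.75, 0, 0, 0, 0]      # store on two nodes only
for alloc in (uniform, concentrated):
    print(alloc, failed_recovery_prob(alloc, p_success=0.9))
```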
Game theory can provide a useful tool to study the security problem in mobile ad hoc networks (MANETs). Most existing work on applying game theory to security considers only two players in the security game model: an attacker and a defender. While this assumption may be valid for a network with centralized administration, it is not realistic in MANETs, where centralized administration is not available. In this paper, using recent advances in mean field game theory, we propose a novel game theoretic approach with multiple players for security in MANETs. The mean field game theory provides a powerful mathematical tool for problems with a large number of players. The proposed scheme can enable an individual node in MANETs to make strategic security defence decisions without centralized administration. In addition, since security defence mechanisms consume precious system resources (e.g., energy), the proposed scheme considers not only the security requirement of MANETs but also the system resources. Moreover, each node in the proposed scheme only needs to know its own state information and the aggregate effect of the other nodes in the MANET. Therefore, the proposed scheme is a fully distributed scheme. Simulation results are presented to illustrate the effectiveness of the proposed scheme.
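Purely as an illustration of the mean-field flavour of such an approach (not the paper's actual game formulation), the sketch below iterates a damped best response for a node choosing a defence effort against the population-average effort; the quadratic cost, the risk term, and the weights are all hypothetical.

```python
def mean_field_defence_equilibrium(cost_weight=1.0, risk_weight=4.0,
                                   damping=0.2, n_iters=500, tol=1e-8):
    """Fixed-point sketch in the spirit of a mean-field best response.

    Each node chooses a defence effort d in [0, 1] minimizing a private cost
        0.5 * cost_weight * d**2 + risk_weight * (1 - d) * (1 - m),
    where m is the population-average effort (the mean field). The unique best
    response is d* = min(1, risk_weight * (1 - m) / cost_weight); the mean
    field is updated with damping until it is consistent with d*.
    """
    m = 0.0
    for _ in range(n_iters):
        best = min(1.0, risk_weight * (1.0 - m) / cost_weight)
        new_m = m + damping * (best - m)
        if abs(new_m - m) < tol:
            break
        m = new_m
    return m

print(mean_field_defence_equilibrium())   # approaches 0.8 for these toy weights
```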
The longstanding debate on a fundamental science of security has led to advances in systems, software, and network security. However, existing efforts have done little to inform how an environment should react to emerging and ongoing threats and compromises. The authors explore the goals and structures of a new science of cyber-decision-making in the Cyber-Security Collaborative Research Alliance, which seeks to develop a fundamental theory for reasoning, under uncertainty, about the best possible action in a given cyber environment. They also explore the needs and limitations of detection mechanisms; agile systems; and the users, adversaries, and defenders that use and exploit them, and conclude by considering how environmental security can be cast as a continuous optimization problem.
Web applications need to validate and sanitize user inputs in order to avoid attacks such as Cross Site Scripting (XSS) and SQL Injection. Writing string manipulation code for input validation and sanitization is an error-prone process leading to many vulnerabilities in real-world web applications. Automata-based static string analysis techniques can be used to automatically compute vulnerability signatures (represented as automata) that characterize all the inputs that can exploit a vulnerability. However, there are several factors that limit the applicability of static string analysis techniques in general: 1) the undecidability of static string analysis requires the use of approximations, leading to false positives; 2) static string analysis tools do not handle all string operations; 3) the dynamic nature of scripting languages makes static analysis difficult. In this paper, we show that vulnerability signatures computed for deliberately insecure web applications (developed for demonstrating different types of vulnerabilities) can be used to generate test cases for other applications. Given a vulnerability signature represented as an automaton, we present algorithms for test case generation based on state, transition, and path coverage. These automatically generated test cases can be used to test applications that are not analyzable statically, and to discover attack strings that demonstrate how the vulnerabilities can be exploited.
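A minimal sketch of transition-coverage test generation from a signature automaton: for every transition, a shortest prefix reaching its source state, the transition symbol, and a shortest accepting suffix are concatenated into one test string. The automaton encoding and the toy signature below are assumptions for illustration; the paper's state- and path-coverage algorithms are not reproduced.

```python
from collections import deque

def shortest_word(automaton, source, targets):
    """BFS: shortest word labelling a path from `source` to any state in `targets`."""
    queue = deque([(source, "")])
    seen = {source}
    while queue:
        state, word = queue.popleft()
        if state in targets:
            return word
        for symbol, nxt in automaton.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, word + symbol))
    return None

def transition_coverage_tests(automaton, start, accepting):
    """One candidate attack string per transition of the signature automaton."""
    tests = set()
    for src, edges in automaton.items():
        for symbol, dst in edges:
            prefix = shortest_word(automaton, start, {src})
            suffix = shortest_word(automaton, dst, accepting)
            if prefix is not None and suffix is not None:
                tests.add(prefix + symbol + suffix)
    return sorted(tests)

# Toy signature: strings containing "<s" (a stand-in for a script-tag prefix).
auto = {
    0: [("<", 1), ("a", 0)],
    1: [("s", 2), ("a", 0)],
    2: [("a", 2)],
}
print(transition_coverage_tests(auto, start=0, accepting={2}))
```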
The deployment and operation of global network architectures can exhibit complex, dynamic behavior, and the comprehensive validation of their properties, without actually building and running the systems, can only be achieved with the help of simulations. Packet-level models are not feasible at Internet scale, but we are still interested in the phenomena that emerge when the systems are run in their intended environment. We argue for a high-level simulation methodology and introduce a simulation environment based on aggregate models built on the state-of-the-art datasets available, while respecting invariants observed in measurements. The models developed are aimed at studying a clean-slate name-based interdomain routing architecture and provide an abundance of parameters for sensitivity analysis and a modular design with a balanced level of detail in different aspects of the model. In addition to introducing several reusable models for traffic, topology, and deployment, we report our experiences in using the high-level simulation approach and potential pitfalls related to it.
This paper presents a novel design of content fingerprints based on maximization of the mutual information across the distortion channel. We use the information bottleneck method to optimize the filters and quantizers that generate these fingerprints. A greedy optimization scheme is used to select filters from a dictionary and allocate fingerprint bits. We test the performance of this method for audio fingerprinting and show substantial improvements over existing learning-based fingerprints.
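The sketch below captures the selection idea only in a highly simplified form: candidate filter output bits are scored independently by the empirical mutual information between the clean bit and the bit observed after the distortion channel, and the top-scoring filters are kept. The paper's information bottleneck optimization and joint bit allocation are not reproduced; the synthetic data and flip probabilities are assumptions.

```python
import numpy as np

def empirical_mutual_information(x, y):
    """I(X; Y) in bits for two binary arrays of equal length."""
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_ab = np.mean((x == a) & (y == b))
            p_a, p_b = np.mean(x == a), np.mean(y == b)
            if p_ab > 0:
                mi += p_ab * np.log2(p_ab / (p_a * p_b))
    return mi

def select_filters(clean_bits, distorted_bits, n_bits):
    """Keep the n_bits filters whose output bit best survives the distortion channel."""
    scores = [empirical_mutual_information(clean_bits[:, j], distorted_bits[:, j])
              for j in range(clean_bits.shape[1])]
    order = np.argsort(scores)[::-1][:n_bits]
    return list(order), [scores[j] for j in order]

# Toy usage: 8 candidate filters whose bits flip with increasing probability.
rng = np.random.default_rng(1)
clean = rng.integers(0, 2, size=(2000, 8))
flip_prob = np.linspace(0.05, 0.45, 8)            # per-filter channel noise
distorted = np.where(rng.random(clean.shape) < flip_prob, 1 - clean, clean)
print(select_filters(clean, distorted, n_bits=3))
```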
In Wireless Mesh Networks (WMNs), Network-Wide Broadcasts (NWBs) are a fundamental operation, required by routing and other mechanisms that distribute information to all nodes in the network. However, due to the characteristics of wireless communication, NWBs are generally problematic. Optimizing them is thus a prime target when improving the overall performance and dependability of WMNs. Most existing optimizations neglect the real nature of WMNs and are based on simple graph models, which provide optimistic assumptions of NWB dissemination. On the other hand, models that fully consider the complex propagation characteristics of NWBs quickly become unsolvable due to their complexity. In this paper, we present the Monte Carlo method Probabilistic Breadth-First Search (PBFS) to approximate the reachability of NWB protocols. PBFS simulates individual NWBs on graphs with probabilistic edge weights, which reflect the link qualities of individual wireless links in the WMN, and estimates reachability over a configurable number of simulated runs. This approach is not only more efficient than existing ones, but also provides additional information, such as the distribution of path lengths. Furthermore, it is easily extensible to NWB schemes other than flooding. The applicability of PBFS is validated both theoretically and empirically, in the latter case by comparing reachability as calculated by PBFS and as measured in a real-world WMN. Validation shows that PBFS quickly converges to the theoretically correct value and approximates the behavior of real-life testbeds very well. The feasibility of PBFS to support research on NWB optimizations or higher-level protocols that employ NWBs is demonstrated in two use cases.
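A minimal sketch of the PBFS idea: each simulated run samples which wireless links deliver the broadcast according to their link probabilities, a breadth-first search records the reached nodes and their hop counts, and reachability is averaged over runs. The adjacency/probability encoding and the toy chain topology are assumptions for illustration.

```python
import random
from collections import deque

def pbfs(neighbors, link_prob, source, runs=1000, rng=None):
    """Probabilistic Breadth-First Search (sketch).

    neighbors[u] lists u's neighbours; link_prob[(u, v)] is the delivery
    probability of the wireless link u -> v. Each run samples one NWB:
    a link forwards the broadcast with its probability, and BFS records
    which nodes were reached and at which hop count.
    """
    rng = rng or random.Random(0)
    reached_total = 0.0
    hop_counts = {}
    n_nodes = len(neighbors)
    for _ in range(runs):
        dist = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in neighbors[u]:
                if v not in dist and rng.random() < link_prob[(u, v)]:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        reached_total += len(dist) / n_nodes
        for d in dist.values():
            hop_counts[d] = hop_counts.get(d, 0) + 1
    return reached_total / runs, hop_counts

# Toy 4-node chain with lossy links.
nbrs = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
probs = {(u, v): 0.8 for u in nbrs for v in nbrs[u]}
print(pbfs(nbrs, probs, source=0))
```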
Existing methods for multi-objective optimization usually provide only an approximation of the Pareto front, and there is little theoretical guarantee of finding the true Pareto front. This paper is concerned with the possibility of fully determining the true Pareto front for those continuous multi-objective optimization problems for which there are a finite number of local optima in terms of each single objective function and there is an effective method to find all such local optima. To this end, some generalized theoretical conditions are first given to guarantee a complete cover of the actual Pareto front for both discrete and continuous problems. Then, based on such conditions, an effective search procedure inspired by the rising sea level phenomenon is proposed particularly for continuous problems of the concerned class. Even for general continuous problems for which not all local optima are available, the new method may still work well to approximate the true Pareto front. The practicability of the proposed method is further underpinned by multi-optima evolutionary algorithms. The advantages of the proposed method in terms of both solution quality and computational efficiency are illustrated by the simulation results.
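As a small illustration of the final filtering step (not the paper's rising-sea-level procedure itself), the sketch below takes a finite candidate set, such as the local optima collected for each single objective, and keeps the non-dominated points under minimization; the candidate values are hypothetical.

```python
def non_dominated(points):
    """Return the non-dominated subset of objective vectors (minimization).

    A point p is dropped if some other point q is no worse in every objective
    and differs from p (hence strictly better in at least one objective).
    """
    front = []
    for p in points:
        dominated = any(q != p and all(q[i] <= p[i] for i in range(len(p)))
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# Toy bi-objective candidates (both objectives to be minimized).
candidates = [(1.0, 4.0), (2.0, 2.0), (3.0, 3.0), (4.0, 1.0), (2.5, 2.5)]
print(non_dominated(candidates))   # [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
```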