Ruiz-Martin, Cristina, Wainer, Gabriel, Lopez-Paredes, Adolfo.
2020.
Studying Communications Resiliency in Emergency Plans. 2020 Spring Simulation Conference (SpringSim). :1–12.
Recent disasters have shown that hazards can be unpredictable and can have catastrophic consequences. Emergency plans are essential for dealing with these situations, and communications play a key role in emergency management. In this paper, we provide a formalism to design emergency plans that are resilient in terms of communications. We exemplify how to use the formalism with a case study of a Nuclear Emergency Plan.
Yang, Chien-Sheng, Avestimehr, A. Salman.
2020.
Coded Computing for Boolean Functions. 2020 International Symposium on Information Theory and Its Applications (ISITA). :141–145.
The growing size of modern datasets necessitates splitting a large-scale computation into smaller computations and operating in a distributed manner to improve overall performance. However, adversarial servers in a distributed computing system may deliberately send erroneous data in order to affect the computation for their benefit. Computing Boolean functions is a key component of many applications of interest, e.g., classification problems, verification functions in the blockchain, and the design of cryptographic algorithms. In this paper, we consider the problem of computing a Boolean function in which the computation is carried out distributively across several workers, with a particular focus on security against Byzantine workers. We note that any Boolean function can be modeled as a multivariate polynomial, which can have high degree in general. Hence, the recently proposed Lagrange Coded Computing (LCC) can be used to simultaneously provide resiliency, security, and privacy. However, the security threshold (i.e., the maximum number of adversarial workers that can be tolerated) provided by LCC can be extremely low if the degree of the polynomial is high. Our goal is to design an efficient coding scheme that achieves the optimal security threshold. We propose two novel schemes called coded Algebraic Normal Form (ANF) and coded Disjunctive Normal Form (DNF). Instead of modeling the Boolean function as a general polynomial, the key idea of the proposed schemes is to model it as a concatenation of some linear functions and threshold functions. The proposed coded ANF and coded DNF outperform LCC by providing a security threshold that is independent of the polynomial's degree.
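The ANF idea this abstract builds on can be illustrated with a short, generic sketch: any Boolean function equals an XOR of AND-monomials, and the monomial coefficients are the GF(2) Mobius transform of its truth table. The code below is only an illustration of computing and evaluating an ANF, not the paper's coded ANF scheme; the majority-function example is a hypothetical stand-in.

```python
def anf_coefficients(truth_table):
    """Compute ANF (GF(2) Mobius transform) coefficients of a Boolean function.

    truth_table: list of 0/1 values of length 2**n, indexed by the integer
    whose bits encode the input assignment. Returns c where c[m] == 1 means
    the AND-monomial over the variables in bit mask m appears in the ANF.
    """
    n = (len(truth_table) - 1).bit_length()
    c = list(truth_table)
    # Butterfly-style subset transform over GF(2)
    for i in range(n):
        for m in range(len(c)):
            if m & (1 << i):
                c[m] ^= c[m ^ (1 << i)]
    return c

def evaluate_anf(coeffs, x_bits):
    """Evaluate the ANF (XOR of monomials) at the input with bit mask x_bits."""
    result = 0
    for m, cm in enumerate(coeffs):
        if cm and (m & x_bits) == m:  # monomial m uses only variables set in x
            result ^= 1
    return result

# Hypothetical example: 3-input majority function
tt = [1 if bin(x).count("1") >= 2 else 0 for x in range(8)]
coeffs = anf_coefficients(tt)
assert all(evaluate_anf(coeffs, x) == tt[x] for x in range(8))
print(coeffs)  # nonzero entries give the XOR-of-ANDs (ANF) representation
```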
Xiao, Wenli, Jiang, Hao, Xia, Song.
2020.
A New Black Box Attack Generating Adversarial Examples Based on Reinforcement Learning. 2020 Information Communication Technologies Conference (ICTC). :141–146.
Machine learning can be misled by adversarial examples, which are formed by making small changes to the original data. There are now various methods to produce adversarial examples; however, they cannot simultaneously be applied to non-differentiable models, reduce the amount of computation, and shorten sample generation time. In this paper, we propose a new black-box attack that generates adversarial examples based on reinforcement learning. By using a deep Q-learning network, we can train the substitute model and generate adversarial examples at the same time. Experimental results show that this method needs only 7.7 ms to produce an adversarial example, addressing the problems of low efficiency, large amounts of computation, and inapplicability to non-differentiable models.
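As a rough, generic illustration of attacking a query-only (black-box, non-differentiable) model with Q-learning, the sketch below rewards an agent for perturbations that lower the target's confidence. It is not the authors' deep Q-network attack; the toy target model, the tabular Q-table, and all hyperparameters (`blackbox_confidence`, `eps`, and so on) are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical black-box model the attacker can only query, not differentiate.
w_secret = rng.normal(size=8)
def blackbox_confidence(x):
    """Confidence of the (unknown) model for the original class."""
    return 1.0 / (1.0 + np.exp(-x @ w_secret))

x0 = rng.normal(size=8)   # original input, assumed confidently classified
eps = 0.5                 # per-feature perturbation magnitude
n_features = x0.size

# Tabular Q-learning: state = bit mask of already-perturbed features,
# action = which feature to perturb next.
Q = np.zeros((2 ** n_features, n_features))
alpha, gamma, explore = 0.1, 0.9, 0.2

def apply_action(x, feature):
    """Perturb one feature in whichever direction lowers the black-box confidence."""
    lo, hi = x.copy(), x.copy()
    lo[feature] -= eps
    hi[feature] += eps
    return lo if blackbox_confidence(lo) < blackbox_confidence(hi) else hi

for episode in range(200):
    state, x = 0, x0.copy()
    for _ in range(n_features):
        a = rng.integers(n_features) if rng.random() < explore else int(np.argmax(Q[state]))
        x_next = apply_action(x, a)
        reward = blackbox_confidence(x) - blackbox_confidence(x_next)  # confidence drop
        next_state = state | (1 << a)
        Q[state, a] += alpha * (reward + gamma * Q[next_state].max() - Q[state, a])
        state, x = next_state, x_next
        if blackbox_confidence(x) < 0.5:  # decision flipped: adversarial example found
            break

print("final black-box confidence:", blackbox_confidence(x))
```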
Driss, Maha, Aljehani, Amani, Boulila, Wadii, Ghandorh, Hamza, Al-Sarem, Mohammed.
2020.
Servicing Your Requirements: An FCA and RCA-Driven Approach for Semantic Web Services Composition. IEEE Access. 8:59326–59339.
The evolution of Service-Oriented Computing (SOC) provides more efficient software development methods for building and engineering new value-added service-based applications. SOC is a computing paradigm that relies on Web services as fundamental elements. Research and technical advancements in Web services composition have been considered an effective opportunity to develop new service-based applications satisfying complex requirements rapidly and efficiently. In this paper, we present a novel approach enhancing the composition of semantic Web services. The novelty of our approach, as compared to others reported in the literature, rests on: i) mapping user's/organization's requirements with Business Process Modeling Notation (BPMN) and semantic descriptions using ontologies, ii) considering functional requirements as well as different types of non-functional requirements, such as quality of service (QoS), quality of experience (QoE), and quality of business (QoBiz), iii) using the Formal Concept Analysis (FCA) technique to select the optimal set of Web services, iv) considering composability levels between sequential Web services using the Relational Concept Analysis (RCA) technique to decrease the required adaptation efforts, and finally, v) validating the obtained service-based applications by performing an analytical technique, namely monitoring. The approach was evaluated on an extended version of the OWLS-TC dataset, which includes more than 10,830 Web service descriptions from various domains. The obtained results demonstrate that our approach allows us to successfully and effectively compose Web services satisfying different types of user functional and non-functional requirements.
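The FCA step can be pictured with a small, self-contained sketch: a formal context relating candidate services (objects) to the requirements they satisfy (attributes), from which formal concepts (maximal service sets sharing a requirement set) are enumerated and matched against the requested requirements. The context, service names, and requirement labels below are hypothetical and the enumeration is naive; this only illustrates FCA-style selection, not the paper's pipeline.

```python
from itertools import combinations

# Hypothetical formal context: which candidate Web services satisfy which
# (functional or QoS) requirements -- the object/attribute incidence of FCA.
context = {
    "FlightBookingA": {"book_flight", "high_availability"},
    "FlightBookingB": {"book_flight", "low_latency"},
    "HotelBookingA":  {"book_hotel", "high_availability"},
    "PaymentGateway": {"process_payment", "low_latency", "high_availability"},
}

def common_attributes(services):
    """Attributes shared by every service in the set (derivation on objects)."""
    sets = [context[s] for s in services]
    return set.intersection(*sets) if sets else {a for v in context.values() for a in v}

def services_with(attributes):
    """Services that have every attribute in the set (derivation on attributes)."""
    return {s for s, attrs in context.items() if attributes <= attrs}

def formal_concepts():
    """Enumerate all (extent, intent) pairs of the context (naive, fine for small contexts)."""
    concepts = set()
    objects = list(context)
    for r in range(len(objects) + 1):
        for subset in combinations(objects, r):
            intent = frozenset(common_attributes(set(subset)))
            extent = frozenset(services_with(intent))
            concepts.add((extent, intent))
    return concepts

# Concepts whose intent covers the requested requirements identify candidate services.
required = {"book_flight", "high_availability"}
for extent, intent in formal_concepts():
    if extent and required <= intent:
        print(sorted(extent), "->", sorted(intent))
```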
Basu, Prithwish, Salonidis, Theodoros, Kraczek, Brent, Saghaian, Sayed M., Sydney, Ali, Ko, Bongjun, La Porta, Tom, Chan, Kevin.
2020.
Decentralized placement of data and analytics in wireless networks for energy-efficient execution. IEEE INFOCOM 2020 - IEEE Conference on Computer Communications. :486–495.
We address energy-efficient placement of data and analytics components of composite analytics services on a wireless network to minimize execution-time energy consumption (computation and communication) subject to compute, storage and network resource constraints. We introduce an expressive analytics service hypergraph model for representing k-ary composability relationships (k ≥ 2) between various analytics and data components and leverage binary quadratic programming (BQP) to minimize the total energy consumption of a given placement of the analytics hypergraph nodes on the network subject to resource availability constraints. Then, after defining a potential energy functional Φ(·) to model the affinities of analytics components and network resources using analogs of attractive and repulsive forces in physics, we propose a decentralized Metropolis Monte Carlo (MMC) sampling method which seeks to minimize Φ by moving analytics and data on the network. Although Φ is non-convex, using a potential game formulation, we identify conditions under which the algorithm provably converges to a local minimum energy equilibrium placement configuration. Trace-based simulations of the placement of a deep-neural-network analytics service on a realistic wireless network show that for smaller problem instances our MMC algorithm yields placements with total energy within a small factor of BQP and more balanced workload distributions; for larger problems, it yields low-energy configurations while the BQP approach fails.
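The Metropolis Monte Carlo idea can be sketched generically: propose moving a single component to another node, always accept moves that lower the energy, and accept uphill moves with probability exp(-delta/T). The instance below (component demands, node capacities, hop distances, the overload penalty, and the added cooling schedule) is entirely hypothetical and far simpler than the paper's hypergraph and potential-game formulation.

```python
import math
import random

random.seed(0)

# Hypothetical instance: place analytics components on network nodes so that
# total (compute + communication) energy is low without exceeding node capacity.
components = {"cam_feed": 2, "feature_extract": 4, "classifier": 3}   # compute demand
edges = [("cam_feed", "feature_extract", 5.0), ("feature_extract", "classifier", 2.0)]  # data volume
nodes = {"n1": 6, "n2": 6, "n3": 4}                                    # compute capacity
hop = {("n1", "n2"): 1, ("n2", "n3"): 1, ("n1", "n3"): 2}              # network distance

def dist(u, v):
    return 0 if u == v else hop.get((u, v), hop.get((v, u)))

def energy(placement):
    """Potential-energy-style objective: communication cost plus capacity penalty."""
    comm = sum(vol * dist(placement[a], placement[b]) for a, b, vol in edges)
    penalty = 0.0
    for n, cap in nodes.items():
        load = sum(d for c, d in components.items() if placement[c] == n)
        penalty += 10.0 * max(0, load - cap)   # repulsive term for overloaded nodes
    return comm + penalty

# Metropolis Monte Carlo: propose moving one component, accept downhill moves
# always and uphill moves with probability exp(-delta / T).
placement = {c: random.choice(list(nodes)) for c in components}
T = 2.0
for step in range(5000):
    c = random.choice(list(components))
    proposal = dict(placement, **{c: random.choice(list(nodes))})
    delta = energy(proposal) - energy(placement)
    if delta <= 0 or random.random() < math.exp(-delta / T):
        placement = proposal
    T = max(0.01, T * 0.999)   # slow cooling, added here purely for illustration

print(placement, energy(placement))
```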
Fatehi, Nina, Shahhoseini, HadiShahriar.
2020.
A Hybrid Algorithm for Evaluating Trust in Online Social Networks. 2020 10th International Conference on Computer and Knowledge Engineering (ICCKE). :158–162.
The accelerating popularity of Online Social Networks (OSNs), thanks to the various services they provide to people, is inevitable. This is why security, as a way to protect users' private data from being abused by unauthorized people, has a vital role to play in OSNs. Trust evaluation is a security approach that has been utilized since the advent of OSNs. Graph-based approaches are among the most popular methods for trust evaluation. However, graph-based models need to impose limitations on the search process of finding trusted paths, which contributes to a reduction in trust accuracy. In this investigation, a learning-based model is proposed that, without such limitations, is able to find reliable users for any target user. Experimental results show a 12% improvement in trust accuracy compared to models based on the graph-based approach.
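A learning-based trust model of the general kind described can be sketched as a supervised classifier over pairwise user features. Everything below is hypothetical (the features, the synthetic labels, and the logistic-regression choice) and is meant only to illustrate replacing trusted-path search with a learned trust predictor, not to reproduce the paper's model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical pairwise features for (source user, target user):
# [common friends, interaction count, profile similarity] -- stand-ins for the
# kinds of signals a learning-based trust model might use.
n_pairs = 1000
X = np.column_stack([
    rng.poisson(5, n_pairs),    # common friends
    rng.poisson(10, n_pairs),   # interactions
    rng.random(n_pairs),        # profile similarity
])
# Synthetic trust/distrust labels, generated only for illustration.
logits = 0.3 * X[:, 0] + 0.1 * X[:, 1] + 2.0 * X[:, 2] - 3.5
y = (rng.random(n_pairs) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("trust prediction accuracy:", model.score(X_test, y_test))

# Ranking candidate users for a target user by predicted trust probability.
candidates = np.array([[8, 20, 0.9], [1, 2, 0.1], [4, 7, 0.5]])
print(model.predict_proba(candidates)[:, 1])
```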