Biblio
Box queries over a dataset in a multidimensional data space are queries that specify a set of allowed values for each dimension. Indexing a dataset in a multidimensional Non-ordered Discrete Data Space (NDDS) to support efficient box queries is becoming increasingly important in many application domains, such as genome sequence analysis. The BoND-tree was recently introduced as an index structure specifically designed for box queries in an NDDS. Earlier work focused on developing strategies for building an effective BoND-tree to achieve high query performance, while developing efficient and effective techniques for deleting indexed vectors from the BoND-tree remains an open issue. In this paper, we present three deletion algorithms based on different underflow handling strategies in an NDDS. Our study shows that incorporating a new BoND-tree-inspired heuristic can provide improved performance compared to the traditional underflow handling heuristics in NDDSs.
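To make the query model concrete, the minimal Python sketch below shows what a box query means in an NDDS: each dimension ranges over a discrete, non-ordered alphabet, the query lists an allowed value set per dimension, and a vector matches if every component lies in the corresponding set. The names and the naive linear scan are purely illustrative; the BoND-tree index and its deletion heuristics are not reproduced here.

```python
# Minimal sketch of a box query over a non-ordered discrete data space (NDDS).
# This illustrates the query semantics only; it is NOT the BoND-tree index,
# whose node structure and deletion heuristics are the subject of the paper.

from typing import Sequence, Set, List

def matches_box(vector: Sequence[str], box: Sequence[Set[str]]) -> bool:
    """A vector satisfies a box query iff, in every dimension,
    its component belongs to that dimension's allowed value set."""
    return all(v in allowed for v, allowed in zip(vector, box))

def box_query_scan(dataset: List[Sequence[str]], box: Sequence[Set[str]]) -> List[Sequence[str]]:
    """Naive linear scan; an index such as the BoND-tree exists to avoid this."""
    return [vec for vec in dataset if matches_box(vec, box)]

# Example: genome-like vectors over the alphabet {a, c, g, t}.
data = [("a", "c", "g"), ("t", "t", "a"), ("a", "g", "g")]
query = [{"a", "t"}, {"c", "g"}, {"g"}]          # allowed values per dimension
print(box_query_scan(data, query))               # [('a', 'c', 'g'), ('a', 'g', 'g')]
```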
Critical business applications in domains ranging from technical support to healthcare increasingly rely on large-scale, automatically constructed knowledge graphs. These applications use the results of complex queries over knowledge graphs to help users make crucial decisions, such as which drug to administer or whether certain actions comply with all regulatory requirements. However, these knowledge graphs constantly evolve, and newer versions may adversely impact the results of queries on which previously taken business decisions were based. We propose a framework based on provenance polynomials to track the impact of knowledge graph changes on arbitrary SPARQL query results. Focusing on the deletion of facts, we show how to efficiently determine the queries impacted by the change, develop ways to incrementally maintain these polynomials, and present an efficient implementation on top of RDF graph databases. Our experimental evaluation over large-scale RDF/SPARQL benchmarks shows the effectiveness of our proposal.
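As a rough illustration of the bookkeeping involved, the sketch below keeps each query answer's provenance as a sum of monomials (each monomial a set of fact identifiers representing one derivation) and, when a fact is deleted, drops the monomials that used it; an answer is impacted exactly when its polynomial becomes empty. The data structures and function names are hypothetical and only convey the idea, not the paper's RDF/SPARQL implementation.

```python
# Hedged sketch: tracking query answers with provenance polynomials.
# Each answer's provenance is a set of monomials (frozensets of fact ids);
# a monomial is one derivation of the answer. This mirrors the semiring view
# of provenance but is only an illustration.

from typing import Dict, Set, FrozenSet

Polynomial = Set[FrozenSet[str]]   # sum of monomials; monomial = product of fact ids

def delete_fact(prov: Dict[str, Polynomial], fact_id: str) -> Set[str]:
    """Incrementally maintain polynomials when one fact is deleted.
    Returns the answers whose provenance became empty (no derivation survives)."""
    invalidated = set()
    for answer, poly in prov.items():
        poly.difference_update({m for m in poly if fact_id in m})
        if not poly:
            invalidated.add(answer)
    return invalidated

# Answer a1 has two independent derivations; a2 depends solely on fact f2.
provenance = {
    "a1": {frozenset({"f1", "f2"}), frozenset({"f3"})},
    "a2": {frozenset({"f2"})},
}
print(delete_fact(provenance, "f2"))   # {'a2'}  -- a1 survives via f3
```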
Searching and retrieving information from the Web is a primary activity needed to monitor the development and usage of Web resources. Possible benefits include improving user experience (e.g., by optimizing query results) and enforcing data/user security (e.g., by identifying harmful websites). Motivated by the lack of ready-to-use solutions, in this paper we present a flexible and accessible toolkit for structure and content mining, able to crawl, download, extract, and index resources from the Web. While easily configurable to work on the "surface" Web, our suite is specifically tailored to explore the Tor dark Web, i.e., the ensemble of Web servers composing the world's most famous darknet. Notably, the toolkit is not just a Web scraper: it includes two mining modules, respectively able to prepare content to be fed to an (external) semantic engine and to reconstruct the graph structure of the explored portion of the Web. In addition to discussing in detail the design, features, and performance of our toolkit, we report the findings of a preliminary run over Tor that clarify the potential of our solution.
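For readers unfamiliar with the structure-mining side, the toy sketch below shows one way such a graph-reconstruction step could work in principle: extract hyperlinks from already-downloaded pages and build a host-level directed graph. Everything here (URLs, regular expression, granularity) is an assumption made for illustration and does not reflect the toolkit's actual code, its crawler, or its Tor connectivity layer.

```python
# Illustrative sketch only: reconstructing a directed link graph from crawled pages.
# Page contents are assumed to have been fetched already (crawler not shown).

import re
from collections import defaultdict
from urllib.parse import urljoin, urlparse

HREF = re.compile(r'href=["\'](.*?)["\']', re.IGNORECASE)

def link_graph(pages: dict) -> dict:
    """pages maps a URL to its HTML; returns adjacency sets between hosts
    (e.g., .onion services), a common granularity for Web-graph studies."""
    graph = defaultdict(set)
    for url, html in pages.items():
        src = urlparse(url).netloc
        for href in HREF.findall(html):
            dst = urlparse(urljoin(url, href)).netloc
            if dst and dst != src:
                graph[src].add(dst)
    return dict(graph)

pages = {
    "http://exampleonionaaaa.onion/index.html":
        '<a href="http://exampleonionbbbb.onion/">mirror</a> <a href="/about">about</a>',
}
print(link_graph(pages))   # {'exampleonionaaaa.onion': {'exampleonionbbbb.onion'}}
```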
Existing probabilistic privacy enforcement approaches permit the execution of a program that processes sensitive data only if the information it leaks is within the bounds specified by a given policy. Thus, to extract any information, users must manually design a program that satisfies the policy. In this work, we present a novel synthesis approach that automatically transforms a program into one that complies with a given policy. Our approach consists of two ingredients. First, we phrase the problem of determining the amount of leaked information as Bayesian inference, which enables us to leverage existing probabilistic programming engines. Second, we present two synthesis procedures that add uncertainty to the program's outputs as a way of reducing the amount of leaked information: an optimal one based on SMT solving and a greedy one with quadratic running time. We implemented and evaluated our approach on 10 representative programs from multiple application domains. We show that our system can successfully synthesize a permissive enforcement mechanism for all examples.
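To ground the idea of phrasing leakage as Bayesian inference, the toy example below enumerates a discrete prior over a secret, pushes it through a deterministic program, and computes the mutual information (in bits) between secret and output; coarsening the output is one simple way of adding uncertainty that reduces leakage. The prior, programs, and bucketing are invented for illustration and stand in for neither the paper's probabilistic programming backend nor its SMT-based and greedy synthesis procedures.

```python
# Toy illustration of quantifying leakage by Bayesian inference: enumerate a
# discrete prior over the secret, push it through the program, and measure how
# much an observer learns from the output (mutual information, in bits).

from collections import defaultdict
from math import log2

def mutual_information(prior, program):
    """prior: {secret: prob}; program: secret -> output (deterministic here)."""
    joint = defaultdict(float)                       # P(secret, output)
    for s, p in prior.items():
        joint[(s, program(s))] += p
    p_out = defaultdict(float)
    for (_, o), p in joint.items():
        p_out[o] += p
    return sum(p * log2(p / (prior[s] * p_out[o]))
               for (s, o), p in joint.items() if p > 0)

prior = {age: 1 / 64 for age in range(64)}           # uniform secret: age in 0..63
exact    = lambda age: age                           # leaks everything: 6 bits
bucketed = lambda age: age // 16                     # coarsened output: 2 bits

print(mutual_information(prior, exact))              # 6.0
print(mutual_information(prior, bucketed))           # 2.0
```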
The rapid expansion of networking technologies has remarkably driven the usage of distributed computing and services, such as task offloading to the cloud. However, security and privacy concerns are restricting the adoption of cloud computing because of threats from both outsiders and insiders. A primary approach to protecting users' data is to develop a Fully Homomorphic Encryption (FHE) scheme, which can cover both data protection and data processing in the cloud. Despite many previous attempts at this approach, none of the proposed schemes simultaneously satisfies two requirements: noise-free accuracy and efficient execution. This paper focuses on the issue of FHE design and proposes a novel FHE scheme, called Optimal Fully Homomorphic Encryption (O-FHE). Our approach utilizes the properties of the Kronecker Product (KP) to design a mechanism for achieving FHE that considers both accuracy and efficiency. We have assessed our scheme through both theoretical proofs and experimental evaluations, with confirmed and exceptional results.
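The abstract does not detail the construction, but the algebraic ingredient it names is the Kronecker Product. The snippet below merely verifies the well-known mixed-product property (A ⊗ B)(C ⊗ D) = (AC) ⊗ (BD), the kind of structure that lets computation on encoded matrices mirror computation on the underlying ones; it is an illustration of the mathematical tool, not of the O-FHE scheme itself.

```python
# Verify the Kronecker mixed-product property (A kron B)(C kron D) = (AC) kron (BD).
# This is standard linear algebra, shown only to motivate why KP structure is
# attractive for homomorphic-style computation.

import numpy as np

rng = np.random.default_rng(0)
A, B = rng.integers(0, 5, (2, 2)), rng.integers(0, 5, (3, 3))
C, D = rng.integers(0, 5, (2, 2)), rng.integers(0, 5, (3, 3))

lhs = np.kron(A, B) @ np.kron(C, D)
rhs = np.kron(A @ C, B @ D)
print(np.array_equal(lhs, rhs))   # True
```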
Emerging cyber-physical systems (CPS) often require collecting end users' data to support data-informed decision making processes. There has been a long-standing argument about the tradeoff between privacy and data utility. In this paper, we adopt a multiparametric programming approach to rigorously study conditions under which data utility has to be sacrificed to protect privacy and situations where free-lunch privacy can be achieved, i.e., data can be concealed without hurting the optimality of the decision making underlying the CPS. We formalize the concept of free-lunch privacy and establish various results on its existence, geometry, and efficient computation. We propose the free-lunch privacy mechanism, a pragmatic mechanism that exploits free-lunch privacy whenever it exists while always guaranteeing optimal usage of the data. We study the resilience of this mechanism against attacks that attempt to infer the parameter of a user's data-generating process. We close the paper with a case study on occupancy-adaptive smart home temperature control to demonstrate the efficacy of the mechanism.
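A discrete caricature of the phenomenon: when the optimal decision is constant over a whole region of the private parameter, that parameter can be concealed (or reported coarsely) at no cost to utility. The toy cost function and occupancy values below are invented to show this; the paper's multiparametric-programming machinery and its actual mechanism are not represented.

```python
# Toy illustration of "free-lunch privacy": if the optimal decision does not change
# over a region of the private parameter, the exact value can be concealed for free.

def optimal_setpoint(occupancy: float) -> float:
    """Pick the temperature setpoint minimizing a toy discomfort + energy cost."""
    candidates = [18.0, 20.0, 22.0]
    cost = lambda sp: occupancy * (22.0 - sp) ** 2 + 0.5 * (sp - 18.0) ** 2
    return min(candidates, key=cost)

# Occupancies 0.3, 0.6 and 0.9 all yield the same optimal setpoint (20.0), so within
# that region the exact occupancy can be hidden without hurting optimality; at 0.1
# the decision changes, so concealing it there would cost utility.
print({occ: optimal_setpoint(occ) for occ in (0.1, 0.3, 0.6, 0.9)})
# {0.1: 18.0, 0.3: 20.0, 0.6: 20.0, 0.9: 20.0}
```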
Data storage in the cloud should come with high safety and confidentiality. It is the responsibility of the cloud service provider to guarantee the availability and security of client data. Various alternatives exist for storage services, but confidentiality and complexity solutions for database as a service are still not satisfactory. The proposed system offers an alternative database-as-a-service solution that integrates the benefits of different services with advanced encryption techniques. It makes it possible to apply concurrency on encrypted data. The alternative also allows geographically dispersed clients to connect directly, eliminating the intermediate proxy and thereby simplifying the architecture. The performance of the proposed system is evaluated on the basis of theoretical analysis.
The prevalence of mobile devices and location-based services (LBS) has generated great concerns regarding LBS users' privacy, which can be compromised by statistical analysis of their movement patterns. A number of algorithms have been proposed to protect the privacy of users in such systems, but the fundamental underpinnings of such protection remain unexplored. Recently, the concept of perfect location privacy was introduced and its achievability was studied for anonymization-based LBS systems, where user identifiers are permuted at regular intervals to prevent identification based on statistical analysis of long time sequences. In this paper, we significantly extend that investigation by incorporating the other major tool commonly employed to obtain location privacy: obfuscation, where user locations are purposely obscured to protect their privacy. Since anonymization and obfuscation reduce user utility in LBS systems, we investigate how location privacy varies with the degree to which each of these two methods is employed. We provide: (1) achievability results for the case where the location of each user is governed by an i.i.d. process; (2) converse results for the i.i.d. case as well as the more general Markov chain model. We show that, as the number of users in the network grows, the obfuscation-anonymization plane can be divided into two regions: in the first region, all users have perfect location privacy; in the second region, no user has location privacy.
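The two mechanisms under study can be pictured with a small simulation: obfuscation replaces each reported location with a random one with some probability, and anonymization re-draws the users' pseudonyms every few time steps. The parameters and data below are arbitrary, and the sketch only shows the mechanics of the mechanisms; the paper's perfect-privacy results are asymptotic and analytical, not simulated.

```python
# Toy simulation of the two mechanisms: obfuscation (reported location is replaced
# by a random region with probability `noise`) and anonymization (pseudonyms are
# re-permuted every `interval` steps). Purely illustrative.

import random

def obfuscate(location, regions, noise):
    return random.choice(regions) if random.random() < noise else location

def release_traces(true_traces, regions, noise, interval, rng=random):
    """true_traces: one location sequence per user; returns the released
    stream of (pseudonym, obfuscated location) pairs."""
    n, horizon = len(true_traces), len(true_traces[0])
    released = []
    for t in range(horizon):
        if t % interval == 0:                       # fresh random pseudonyms
            pseudonyms = rng.sample(range(n), n)
        for user in range(n):
            loc = obfuscate(true_traces[user][t], regions, noise)
            released.append((pseudonyms[user], loc))
    return released

regions = list(range(4))
traces = [[random.choice(regions) for _ in range(6)] for _ in range(3)]
print(release_traces(traces, regions, noise=0.25, interval=3))
```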
Enabled by artificial intelligence techniques, we are witnessing the rise of a new paradigm of computational creativity support: mixed-initiative creative interfaces put human and computer in a tight interactive loop where each suggests, produces, evaluates, modifies, and selects creative outputs in response to the other. This paradigm could broaden and amplify creative capacity for all, but has so far remained mostly confined to artificial intelligence for game content generation, and faces many unsolved interaction design challenges. This workshop therefore convenes CHI and game researchers to advance mixed-initiative approaches to creativity support.
We have presented a unique PT-VCSEL arrangement which experimentally demonstrates the process of optically controlled current confinement. Lessons learned will be transferred to future generations of solitary devices, which will be optimized with respect to the degree of confinement (depending on the parameters of the PT, in particular the current gain), threshold current, and electro-optic efficiency.
We demonstrate high-speed operation of ultracompact electroabsorption modulators based on epsilon-near-zero confinement in indium oxide (In$_2$O$_3$) on silicon using field-effect carrier density tuning. Additionally, we discuss strategies to enhance modulator performance and reduce confinement-related losses by introducing high-mobility conducting oxides such as cadmium oxide (CdO).
The paper introduces a smart, sensor-based system that is useful for internal and external security. The system is intended for people living in houses and apartments, for high officials, and for banks and offices. It is developed in two phases: one for internal security, such as homes, and another for external security, such as open areas and streets. The system consists of a mobile application, capacitive sensing, and smart routing, features intended to ensure the safety of life and property. This security system is based on wireless sensors and is an effective alternative to CCTV cameras and other available security systems. The efficiency of the system was established through practical studies and prototyping. The results demonstrate the feasibility, positive impact, and reliability of the system, and indicate that further research can build on it.