Bibliography
In the era of mass agriculture, keeping up with the increasing demand for food production requires advanced monitoring systems to handle several challenges, such as perishable products, food waste, unpredictable supply variations, and stringent food safety and sustainability requirements. The evolution of the Internet of Things has provided means for collecting, processing, and communicating data associated with agricultural processes. This has opened several opportunities to sustain and improve productivity and to reduce waste at every step of the food supply chain. On the other hand, it has introduced several new challenges, such as data security, recording and representation of data, real-time control, system reliability, and dealing with big data. This paper proposes an architecture for securing big data in the agricultural supply chain management system. This can help reduce food waste, increase the reliability of the supply chain, and enhance the performance of the food supply chain system.
Utilization of wireless sensor networks is growing with the development of modern technologies, while the electromagnetic spectrum remains a limited resource. Wireless communication applications are expanding day by day, which directly threatens to congest the electromagnetic spectrum bands. Cognitive radio addresses this issue by exploiting unused frequency bands, known as "white spaces". Another important factor that gets attention in the cognitive model is wireless security. One prominent cause of security threats is the presence of malicious nodes in cognitive radio wireless sensor networks (CRWSNs). This paper focuses on security issues related to CRWSNs, including fusion techniques and cooperative spectrum sensing, along with two dangerous attacks in cognitive radio: primary user emulation (PUE) and spectrum sensing data falsification (SSDF).
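To make the fusion-rule discussion concrete, the following minimal Python sketch (not from the paper) shows hard-decision cooperative spectrum sensing with k-out-of-n fusion, and how a single SSDF node that always reports "present" can mislead the OR rule while a majority rule stays robust. The detector, threshold, and noise parameters are illustrative assumptions.

```python
import random

def local_decision(snr_db, threshold_db=0.0):
    """Toy energy detector: report 'primary present' if the measured
    energy (signal plus Gaussian noise, in dB) exceeds a threshold."""
    noise = random.gauss(0, 1)  # illustrative measurement noise
    return (snr_db + noise) > threshold_db

def fuse(decisions, k):
    """k-out-of-n hard-decision fusion at the fusion center.
    k=1 is the OR rule, k=len(decisions) is the AND rule, and
    k=len(decisions)//2 + 1 is the majority rule."""
    return sum(decisions) >= k

# Example: 10 sensing nodes, one of them an SSDF attacker that always
# reports 'present' in order to cause false alarms.
decisions = [local_decision(snr_db=-2.0) for _ in range(9)]
decisions.append(True)  # falsified report from the malicious node
print("OR rule:      ", fuse(decisions, k=1))   # usually True (false alarm)
print("Majority rule:", fuse(decisions, k=6))   # usually False (robust)
print("AND rule:     ", fuse(decisions, k=10))
```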
In military operations, Commander's Intent describes the desired end state and purpose of the operation, expressed in a concise and clear manner. Command by intent is a paradigm that empowers subordinate units to exercise measured initiative to meet mission goals and accept prudent risk within the commander's intent. It improves the agility of military operations by allowing exploitation of local opportunities without an explicit directive from the commander to do so. This paper discusses what the paradigm entails in terms of architectural decisions for data fusion systems tasked with real-time information collection to satisfy operational mission goals. In our system, the information needs of decisions are expressed at a high level and shared among relevant nodes. The selected nodes then jointly operate to meet mission information needs by forwarding and caching relevant data without explicit directives regarding the objects to fetch and sources to contact. A preliminary evaluation of the system is presented using a target tracking application, set in the context of a NATO-based mission scenario called Anglova. Evaluation results show that delegating some decision authority to the data fusion system (in terms of objects to fetch and sources to contact) allows it to conserve network resources while also increasing mission success rate. The system is therefore particularly well-suited to operation in partially denied or contested environments, where resource bottlenecks caused by adversarial activity impair one's ability to collect real-time information for mission-critical decision making.
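The abstract does not give the matching logic, so the hypothetical Python sketch below only illustrates the architectural idea: nodes decide locally which data to cache and forward against declared, high-level information needs, rather than receiving explicit per-object directives. All field names and thresholds are invented.

```python
# Mission information needs are declared once, at a high level; each node
# then filters its observations locally against these declarations.
INFORMATION_NEEDS = [
    {"topic": "track", "region": "north", "min_confidence": 0.6},
]

def relevant(item: dict) -> bool:
    """A node caches and forwards an item iff it satisfies some need."""
    return any(item["topic"] == need["topic"]
               and item["region"] == need["region"]
               and item["confidence"] >= need["min_confidence"]
               for need in INFORMATION_NEEDS)

observations = [
    {"topic": "track",   "region": "north", "confidence": 0.8},   # keep
    {"topic": "track",   "region": "south", "confidence": 0.9},   # drop
    {"topic": "weather", "region": "north", "confidence": 0.99},  # drop
]
cache = [o for o in observations if relevant(o)]
print(cache)
```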
In this paper, a distributed architecture for implementing a smart city is proposed to facilitate various smart features, such as solid waste management, efficient urban mobility and public transport, smart parking, robust IT connectivity, and safety and security of citizens, together with a roadmap for achieving them. The paper explains how massive volumes of IoT data can be analyzed and presents a layered IoT architecture. It also discusses why data integration is important for analyzing and processing the data collected by different smart devices such as sensors, actuators, and RFIDs. A wireless sensor network can sense data from various locations, but there has to be more to it than placing sensors everywhere for everything: the paper explains why sensors alone are not sufficient for data collection and how human beings can be engaged to collect data. Communication protocols between the volunteers engaged in collecting data restrict the sharing of data and ensure that the target area is covered with a minimum number of volunteers, with every volunteer covering a predefined area. In the proposed architecture, all data is stored on one central server, where both data processing and the processing of user queries take place.
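The paper does not specify how volunteers are assigned to areas; one standard way to cover a target area with a minimum number of volunteers is the greedy set-cover heuristic, sketched below in Python with invented grid cells and coverage sets.

```python
def greedy_cover(target_area, volunteers):
    """Greedy set-cover heuristic: repeatedly pick the volunteer whose
    predefined area covers the most still-uncovered cells."""
    uncovered = set(target_area)
    chosen = []
    while uncovered:
        best = max(volunteers, key=lambda v: len(volunteers[v] & uncovered))
        gain = volunteers[best] & uncovered
        if not gain:  # remaining cells cannot be covered by anyone
            break
        chosen.append(best)
        uncovered -= gain
    return chosen, uncovered

# Target area modeled as grid cells; each volunteer covers a fixed subset.
area = {(x, y) for x in range(4) for y in range(4)}
volunteers = {
    "v1": {(0, 0), (0, 1), (1, 0), (1, 1)},
    "v2": {(2, 2), (2, 3), (3, 2), (3, 3)},
    "v3": {(0, 2), (0, 3), (1, 2), (1, 3)},
    "v4": {(2, 0), (2, 1), (3, 0), (3, 1)},
    "v5": {(1, 1), (1, 2), (2, 1), (2, 2)},  # redundant overlap
}
chosen, left = greedy_cover(area, volunteers)
print("selected volunteers:", chosen, "uncovered cells:", left)
```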
Wearable devices for fitness tracking and health monitoring have gained considerable popularity and have become one of the fastest growing smart device markets. More and more companies are offering integrated health and activity monitoring solutions for fitness trackers. Recently, insurance companies have started offering their customers better conditions in exchange for health and activity monitoring. However, the extensive sensitive information collected by tracking products, and its accessibility by third-party service providers, poses vital security and privacy challenges for the employed solutions. In this paper, we present our security analysis of a representative sample of current fitness tracking products on the market. In particular, we focus on a malicious user setting that aims at injecting false data into the cloud-based services, leading to erroneous data analytics. We show that none of these products provides data integrity, authenticity, or confidentiality.
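As a point of contrast with the paper's findings, the Python sketch below shows one standard countermeasure against data injection: HMAC-SHA256 message authentication with a per-device key. This is a generic illustration of how a cloud service could detect tampered records, not a description of any analyzed product; the record fields and key provisioning are assumptions.

```python
import hmac, hashlib, json, os

# Assumption: device and cloud share a per-device key provisioned at pairing.
DEVICE_KEY = os.urandom(32)

def sign_record(record: dict, key: bytes) -> dict:
    """Attach an HMAC-SHA256 tag so the cloud can detect tampered or
    injected records."""
    payload = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"payload": record, "tag": tag}

def verify_record(signed: dict, key: bytes) -> bool:
    payload = json.dumps(signed["payload"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signed["tag"])

signed = sign_record({"steps": 9001, "ts": 1700000000}, DEVICE_KEY)
print(verify_record(signed, DEVICE_KEY))   # True: record is authentic
signed["payload"]["steps"] = 999999        # attacker inflates step count
print(verify_record(signed, DEVICE_KEY))   # False: injection detected
```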
Wireless sensor networks are responsible for sensing, gathering, and processing information about the objects in the network coverage area. Basic data fusion technology generally does not provide a data privacy protection mechanism, yet such a mechanism is usually indispensable in application areas such as health care, military reconnaissance, and smart homes. In this paper, we consider privacy, confidentiality, and the accuracy of fusion results, and propose a privacy-preserving data fusion algorithm (PPND). The algorithm relies on the characteristics of data fusion and uses random numbers pre-distributed to the nodes to meet the privacy protection requirements of the original data. Theoretical analysis shows the difficulty a malicious attacker faces in attempting to steal node privacy under the PPND algorithm. TOSSIM simulation results also show that, compared with the TAG and SMART algorithms, PPND performs well in terms of data traffic and convergence accuracy.
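The abstract does not detail PPND's exact construction. One common way to realize pre-distributed random numbers is pairwise additive masking, where each node pair shares a mask that one node adds and the other subtracts, so the masks cancel in the fused sum while individual readings stay hidden. The Python sketch below illustrates this idea with invented node IDs and values; it is not the paper's algorithm.

```python
import random

def predistribute_masks(node_ids, seed=42):
    """For every unordered node pair (i, j), derive a shared random mask.
    Node i adds it and node j subtracts it, so all masks cancel in the sum."""
    rng = random.Random(seed)  # stands in for pre-deployment key material
    masks = {i: 0 for i in node_ids}
    ids = sorted(node_ids)
    for a in range(len(ids)):
        for b in range(a + 1, len(ids)):
            m = rng.randint(0, 10_000)
            masks[ids[a]] += m
            masks[ids[b]] -= m
    return masks

readings = {"n1": 21, "n2": 35, "n3": 17, "n4": 28}  # private sensor values
masks = predistribute_masks(readings)
perturbed = {i: readings[i] + masks[i] for i in readings}

# The aggregator sees only perturbed values, yet their sum is exact.
print("perturbed:", perturbed)
print("fused sum:", sum(perturbed.values()), "==", sum(readings.values()))
```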
In recent years, the usage of unmanned aircraft systems (UAS) for security-related purposes has increased, ranging from military applications to different areas of civil protection. The deployment of UAS can support security forces in achieving an enhanced situational awareness. However, in order to provide useful input to a situational picture, sensor data provided by UAS has to be integrated with information about the area and objects of interest from other sources. The aim of this study is to design a high-level data fusion component combining probabilistic information processing with logical and probabilistic reasoning, to support human operators in their situational awareness and to improve their capabilities for making efficient and effective decisions. To this end, a fusion component based on the ISR (Intelligence, Surveillance and Reconnaissance) Analytics Architecture (ISR-AA) [1] is presented, incorporating an object-oriented world model (OOWM) for information integration, an expressive knowledge model, and a reasoning component for the detection of critical events. Approaches for translating the information contained in the OOWM into either an ontology for logical reasoning or a Markov logic network for probabilistic reasoning are presented.
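A full Markov logic network performs joint probabilistic inference over ground atoms; the heavily simplified Python sketch below only conveys the flavor of weighted-rule reasoning over an object-oriented world model, scoring each object independently. The rules, weights, and attributes are all invented and do not come from the paper.

```python
import math

# World model: attribute dicts produced by fusing UAS tracks with other
# sources (all values invented for illustration).
objects = [
    {"id": 1, "type": "vehicle", "in_restricted_zone": True,  "speed_kmh": 70},
    {"id": 2, "type": "person",  "in_restricted_zone": False, "speed_kmh": 5},
]

# Weighted rules in the spirit of Markov logic: each satisfied rule adds
# its weight; 'critical event' probability is the logistic of the total.
RULES = [
    (2.0,  lambda o: o["type"] == "vehicle" and o["in_restricted_zone"]),
    (1.0,  lambda o: o["speed_kmh"] > 60),
    (-3.0, lambda o: not o["in_restricted_zone"]),
]

def critical_probability(obj):
    score = sum(w for w, rule in RULES if rule(obj))
    return 1.0 / (1.0 + math.exp(-score))

for obj in objects:
    print(obj["id"], round(critical_probability(obj), 3))
```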
DeepDive is a system for extracting relational databases from dark data: the mass of text, tables, and images that are widely collected and stored but which cannot be exploited by standard relational tools. If the information in dark data (scientific papers, Web classified ads, customer service notes, and so on) were instead in a relational database, it would give analysts access to a massive and highly valuable new set of "big data" to exploit. DeepDive is distinctive when compared to previous information extraction systems in its ability to obtain very high precision and recall at reasonable engineering cost; in a number of applications, we have used DeepDive to create databases with accuracy that matches that of human annotators. To date we have successfully deployed DeepDive to create data-centric applications for insurance, materials science, genomics, paleontology, law enforcement, and others. The data unlocked by DeepDive represents a massive opportunity for industry, government, and scientific researchers. DeepDive is enabled by an unusual design that combines large-scale probabilistic inference with a novel developer interaction cycle, made possible by several core innovations around probabilistic training and inference.
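DeepDive's actual developer interface is not reproduced here; the toy Python sketch below illustrates distant supervision, a labeling idea characteristic of such extraction systems, in which candidate extractions are labeled positive when they match a small knowledge base of known facts. All names, facts, and sentences are invented.

```python
# Distant supervision: label candidate (company, city) mention pairs as
# positive if the pair appears in a knowledge base of known headquarters.
KNOWN_HQ = {("Acme Corp", "Berlin"), ("Globex", "Springfield")}

candidate_mentions = [
    ("Acme Corp", "Berlin", "Acme Corp opened its Berlin headquarters ..."),
    ("Acme Corp", "Paris",  "Acme Corp attended a trade fair in Paris ..."),
]

# Noisy by design: co-occurrence does not guarantee the relation holds,
# which is why downstream probabilistic inference is needed.
training_set = [
    {"pair": (c, city), "text": text, "label": (c, city) in KNOWN_HQ}
    for c, city, text in candidate_mentions
]
for example in training_set:
    print(example["pair"], "->", example["label"])
```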
We present a process of linking various Deutsche Bundesbank data sources on companies based on semi-automatic classification. The linkage process involves data cleaning and harmonization, blocking, construction of comparison features, and training and testing a statistical classification model on a "ground-truth" subset of known matches and non-matches. The evaluation of our method shows that the process limits the need for manual classification to a small percentage of ambiguously classified match candidates.
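As a hypothetical miniature of such a linkage pipeline, the Python sketch below cleans and harmonizes company names, blocks candidate pairs on postcode, computes a token-Jaccard comparison feature, and classifies with a fixed threshold standing in for the trained statistical model. All record contents are invented.

```python
def normalize(name: str) -> str:
    """Data cleaning/harmonization: lower-case and strip legal forms."""
    name = name.lower()
    for suffix in (" gmbh", " ag", " se", " kg"):
        name = name.replace(suffix, "")
    return " ".join(name.split())

def jaccard(a: str, b: str) -> float:
    """Comparison feature: token-set Jaccard similarity."""
    sa, sb = set(a.split()), set(b.split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

source_a = [("a1", "Musterfirma GmbH", "60311"), ("a2", "Beispiel AG", "10115")]
source_b = [("b1", "Musterfirma", "60311"), ("b2", "Andere Firma KG", "10115")]

# Blocking: only compare record pairs that share a postcode.
candidates = [(ra, rb) for ra in source_a for rb in source_b if ra[2] == rb[2]]

# Classification: a threshold stands in for the trained model; ambiguous
# scores near the threshold would go to manual review.
for ra, rb in candidates:
    sim = jaccard(normalize(ra[1]), normalize(rb[1]))
    print(ra[0], rb[0], round(sim, 2), "match" if sim >= 0.5 else "non-match")
```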
It is common practice for data scientists to acquire and integrate disparate data sources to achieve higher quality results. But even with a perfectly cleaned and merged data set, two fundamental questions remain: (1) is the integrated data set complete and (2) what is the impact of any unknown (i.e., unobserved) data on query results? In this work, we develop and analyze techniques to estimate the impact of the unknown data (a.k.a., unknown unknowns) on simple aggregate queries. The key idea is that the overlap between different data sources enables us to estimate the number and values of the missing data items. Our main techniques are parameter-free and do not assume prior knowledge about the distribution. Through a series of experiments, we show that estimating the impact of unknown unknowns is invaluable to better assess the results of aggregate queries over integrated data sources.
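The paper's estimators are more elaborate, but the core overlap idea can be illustrated with the classical Lincoln-Petersen (capture-recapture) estimator for two sources, sketched below in Python with invented counts: the smaller the overlap relative to the source sizes, the more unseen items the estimator infers.

```python
def lincoln_petersen(n1: int, n2: int, overlap: int) -> float:
    """Estimate total population size from two independent samples
    (capture-recapture): N_hat = n1 * n2 / m, where m is the overlap."""
    if overlap == 0:
        raise ValueError("no overlap: estimator undefined")
    return n1 * n2 / overlap

# Two integrated data sources list 120 and 90 businesses; 60 appear in both.
n1, n2, m = 120, 90, 60
n_hat = lincoln_petersen(n1, n2, m)
observed = n1 + n2 - m
print(f"observed: {observed}, estimated total: {n_hat:.0f}, "
      f"estimated unknown unknowns: {n_hat - observed:.0f}")
```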
Using heterogeneous clouds has been considered as a way to improve the performance of big-data analytics for healthcare platforms. However, the problem of delay when transferring big data over the network needs to be addressed. The purpose of this paper is to analyze and compare existing cloud computing environments (PaaS, IaaS) in order to implement middleware services. Understanding the differences and similarities between cloud technologies will help in the interconnection of healthcare platforms. The paper provides a general overview of the techniques and interfaces for cloud computing middleware services, and proposes a cloud architecture for healthcare. Cloud middleware enables heterogeneous devices to act as data sources and to integrate data from other healthcare platforms, but specific APIs need to be developed. Furthermore, security and management problems need to be addressed, given the heterogeneous nature of the communication and computing environment. The present paper fills a gap in the electronic healthcare register literature by providing an overview of cloud computing middleware services and standardized interfaces for integration with medical devices.
Information fusion deals with the integration and merging of data and information from multiple (heterogeneous) sources. In many cases, the information that needs to be fused has a security classification. The result of the fusion process is then by necessity restricted with the strictest information security classification of the inputs. This has severe drawbacks and limits the possible dissemination of the fusion results. It leads to decreased situational awareness: the organization possesses information that would enable a better situation picture, but since parts of the information are restricted, it is not possible to distribute the most correct situational information. In this paper, we take steps towards defining fusion and data mining processes that can be used even when the underlying data cannot be disseminated. The method we propose here could be used to produce a classifier from which all the sensitive information has been removed and for which it can be shown that an antagonist cannot, even in principle, obtain knowledge about the classified information by using the classifier or the situation picture.
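The paper's guarantee is stronger than simple feature removal, but as a naive first ingredient the Python sketch below drops every feature whose security classification exceeds the release level before training, so the released classifier never sees restricted attributes. The classifications, features, and records are invented for illustration.

```python
# Assumption: each feature carries a security classification, and only
# UNCLASSIFIED features may influence the releasable classifier.
FEATURE_CLASSIFICATION = {
    "radar_cross_section": "SECRET",
    "speed": "UNCLASSIFIED",
    "altitude": "UNCLASSIFIED",
}
RELEASABLE = {"UNCLASSIFIED"}

def sanitize(record: dict) -> dict:
    """Keep only features whose classification permits release."""
    return {k: v for k, v in record.items()
            if FEATURE_CLASSIFICATION.get(k) in RELEASABLE}

train = [
    {"radar_cross_section": 3.2, "speed": 900, "altitude": 11000, "label": 1},
    {"radar_cross_section": 0.1, "speed": 80,  "altitude": 300,   "label": 0},
]
sanitized = [
    (sanitize({k: v for k, v in r.items() if k != "label"}), r["label"])
    for r in train
]
print(sanitized)  # the SECRET feature never reaches the training step
```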
Big data's explosive growth has prompted the US government to release new reports that address the resulting issues, particularly those related to privacy. The Web extra at http://youtu.be/j49eoe5g8-c is an audio recording from the Computing and the Law column, in which authors Brian M. Gaff, Heather Egan Sussman, and Jennifer Geetter discuss these developments.