Bibliography
Information on cyber incidents and threats is currently collected and processed with a strong technical focus. Threat and vulnerability information alone is not a solid basis for effective, affordable, or actionable security advice for decision makers. They need more than a narrow technical slice of the broader situational picture if they are to combat, and not merely mitigate, the cyber threat. We first give a short overview of the related work found in the literature. We found that existing approaches mostly analysed “what” was done, instead of looking more generically beyond the technical aspects at the tactics, techniques, and procedures to identify “how” it was done, by whom, and why. We then examine which information categories and data already exist to answer questions about an adversary's capabilities and objectives. Just as traditional intelligence serves a better understanding of adversaries' capabilities, actions, and intent, the same is feasible in cyberspace with cyber intelligence. We therefore identify information sources in the military and civil environments before proposing to link that traditional information with technical data for a better situational picture. We give examples of information that can be collected from traditional intelligence for correlation with technical data. In this way, an intelligence operational picture could be developed for the cyber sphere, analogous to the one traditionally fed by conventional intelligence disciplines. Finally, we propose a way of including intelligence processing in cyber analysis and outline the requirements that are key to a successful exchange of information and intelligence between military and civil information providers.
The University of Illinois at Urbana-Champaign (Illinois), Pacific Northwest National Laboratory (PNNL), and University of Southern California Information Sciences Institute (USC-ISI) consortium is working toward providing tools and expertise to enable collaborative research to improve the security and resiliency of cyber-physical systems. In this extended abstract we discuss the challenges and the solution space. We demonstrate the feasibility of some of the proposed components through a wide-area situational awareness experiment for the power grid across the three sites.
Information fusion deals with the integration and merging of data and information from multiple (heterogeneous) sources. In many cases, the information that needs to be fused carries a security classification. The result of the fusion process is then by necessity restricted under the strictest security classification among its inputs. This has severe drawbacks and limits the possible dissemination of the fusion results. It leads to decreased situational awareness: the organization holds information that would enable a better situation picture, but since parts of that information are restricted, it is not possible to distribute the most correct situational information. In this paper, we take steps towards defining fusion and data mining processes that can be used even when all of the underlying data cannot be disseminated. The method we propose could be used to produce a classifier from which all the sensitive information has been removed and for which it can be shown that an antagonist cannot, even in principle, obtain knowledge about the classified information by using the classifier or the situation picture.
The power network is an important part of a nation's comprehensive energy transmission system, underpinning both energy security and the functioning of the economy and society. Moreover, because many industries are involved, grid development can drive national innovation capability. Advances in materials science, computing, and information and communication technology are making the smart grid flourish. This paper investigates the functions and forms of the smart grid along the energy, geography, and technology dimensions. The technology dimension is analysed in two aspects: network control and interaction with the customer. A mapping is given between smart grid functions and eight key technologies: large-capacity flexible transmission, DC power distribution, distributed power generation, large-scale energy storage, real-time tracking simulation, intelligent electricity application, big data analysis and cloud computing, and wide-area situational awareness. Research priorities for these key technologies are proposed.
Speakers, environments, and other sources produce signal variations on calls, on top of the interference generated by different communication devices. Despite these convolutions, the signal variations produced by different mobile devices leave intrinsic fingerprints on recorded calls, thus allowing the models and brands of the devices involved to be tracked. This study investigates the use of recorded Voice over Internet Protocol calls in the blind identification of source mobile devices. The proposed scheme employs a combination of entropy and mel-frequency cepstrum coefficients to extract the intrinsic features of mobile devices and analyzes these features with a multi-class support vector machine classifier. The experimental results show accurate identification of 10 source mobile devices with an average accuracy of 99.72%.
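As a minimal sketch of this kind of pipeline (not the paper's implementation), the fragment below computes MFCC statistics plus a spectral-entropy stand-in for the entropy feature and trains a multi-class SVM; the "calls" are synthetic, with each device simulated by a different toy filter coloration.

```python
# Sketch: entropy + MFCC features fed to a multi-class SVM for device
# identification. Synthetic audio stands in for real VoIP recordings.
import numpy as np
import librosa
from scipy.signal import lfilter
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
DEVICE_FILTERS = {0: [1.0, 0.3], 1: [1.0, -0.4], 2: [1.0, 0.0, 0.5]}  # toy colorations

def features(y, sr=8000):
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    p = np.abs(librosa.stft(y)) ** 2
    p /= p.sum(axis=0, keepdims=True) + 1e-12
    ent = -(p * np.log2(p + 1e-12)).sum(axis=0)          # per-frame spectral entropy
    return np.hstack([mfcc.mean(1), mfcc.std(1), ent.mean(), ent.std()])

X, y = [], []
for dev, b in DEVICE_FILTERS.items():
    for _ in range(30):                                   # 30 synthetic calls per device
        call = lfilter(b, [1.0], rng.standard_normal(8000)).astype(np.float32)
        X.append(features(call)); y.append(dev)
Xtr, Xte, ytr, yte = train_test_split(np.array(X), np.array(y), random_state=0)
clf = SVC(kernel="rbf", C=10).fit(Xtr, ytr)               # multi-class SVM (one-vs-one)
print("accuracy:", clf.score(Xte, yte))
```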
In large-scale systems, user authentication usually needs assistance from a remote central authentication server via the network. The authentication service, however, could be slow or unavailable due to natural disasters or various cyber attacks on communication channels. This has raised serious concerns for systems that need robust authentication in emergency situations. The contribution of this paper is two-fold. For slow-connection situations, we present a secure generic multi-factor authentication protocol that speeds up the whole authentication process. Compared with another generic protocol in the literature, the new proposal provides the same function with significant improvements in computation and communication. A second authentication mechanism, which we name stand-alone authentication, can authenticate users when the connection to the central server is down. We investigate several issues in stand-alone authentication and show how to add it to multi-factor authentication protocols in an efficient and generic way.
Educational software systems have an increasingly significant presence in engineering sciences. They aim to improve students' attitudes and knowledge acquisition, typically through visual representation and simulation of complex algorithms and mechanisms or hardware systems that are often not available to educational institutions. This paper presents a novel software system for CryptOgraphic ALgorithm visuAl representation (COALA), developed to support a Data Security course at the School of Electrical Engineering, University of Belgrade. The system allows users to follow the execution of several complex algorithms (DES, AES, RSA, and Diffie-Hellman) on real-world examples in a detailed step-by-step view, with the possibility of forward and backward navigation. The benefits of the COALA system for students are observed through an increase in the percentage of students who passed the exam and in the average exam grade over one school year.
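To give a flavor of the step-by-step traces such a tool presents, here is a toy Diffie-Hellman exchange with deliberately tiny, insecure parameters (purely illustrative; not COALA code):

```python
# Toy Diffie-Hellman walk-through with tiny, insecure parameters,
# mirroring the step-by-step view an educational tool might show.
p, g = 23, 5                    # public prime modulus and generator
a, b = 6, 15                    # private keys of Alice and Bob
A = pow(g, a, p)                # Alice sends A = g^a mod p -> 8
B = pow(g, b, p)                # Bob sends   B = g^b mod p -> 19
shared_alice = pow(B, a, p)     # Alice computes B^a mod p
shared_bob = pow(A, b, p)       # Bob computes   A^b mod p
assert shared_alice == shared_bob == 2   # both arrive at the same secret
```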
Being the most important critical infrastructure in Cyber-Physical Systems (CPSs), a smart grid exhibits the complicated nature of a large-scale, distributed, and dynamic environment. A taxonomy of attacks is an effective tool for systematically classifying attacks, and it has been named a top research topic in CPS by a National Science Foundation (NSF) workshop. Most existing taxonomies of attacks in CPSs are inadequate in addressing the tight coupling of cyber and physical processes and/or lack systematic construction. This paper introduces a taxonomy of attacks on agent-based smart grids as an effective tool to provide a structured framework. The proposed idea of introducing the structure of space-time and information flow direction, security features, and cyber-physical causality is innovative, and it establishes a taxonomy design mechanism that can systematically construct the taxonomy of cyber attacks that could affect the normal operation of agent-based smart grids. Based on the cyber-physical relationship revealed in the taxonomy, a concrete physical-process-based cyber attack detection scheme is proposed. A numerical example is provided to validate the proposed detection scheme.
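The abstract does not detail the detection scheme itself; as a generic illustration of the physical-process-based idea, the sketch below flags measurements that deviate from a simple assumed state model beyond a threshold (all dynamics and thresholds are placeholders):

```python
# Generic residual-based detector: compare sensor measurements against a
# physical-model prediction and alarm on large residuals (illustrative only).
import numpy as np

A = np.array([[0.95, 0.05], [0.02, 0.90]])   # assumed linear state dynamics

def detect(measurements, x0, threshold=0.5):
    x = x0
    alarms = []
    for k, z in enumerate(measurements):
        x = A @ x                             # predict next state from the model
        residual = np.linalg.norm(z - x)      # deviation of measurement from model
        if residual > threshold:
            alarms.append(k)                  # physically implausible -> suspect attack
        x = z                                 # re-anchor on the measurement
    return alarms
```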
This paper proposes an algorithm for multi-channel SAR ground moving target detection and estimation using the Fractional Fourier Transform (FrFT). To detect moving targets with low speed, the clutter is first suppressed by the Displaced Phase Center Antenna (DPCA) technique, which enhances the signal-to-clutter ratio. Once the clutter has been suppressed, the echo of the moving target remains and can be regarded as a chirp signal whose parameters can be estimated by the FrFT. The FrFT, one of the most widely used tools for time-frequency analysis, is utilized to estimate the Doppler parameters, from which the motion parameters, including velocity and acceleration, can be obtained. The effectiveness of the proposed method is validated by simulation.
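A full FrFT implementation is lengthy, so the sketch below conveys the same peak-search principle by dechirping with candidate chirp rates and selecting the rate that maximizes spectral concentration; the signal parameters are invented for illustration.

```python
# Estimate a chirp's rate (Doppler rate) by a dechirping grid search --
# the same peak-search principle as scanning FrFT orders (illustrative).
import numpy as np

fs, T = 1000.0, 1.0
t = np.arange(0, T, 1 / fs)
true_f0, true_rate = 50.0, 80.0                  # assumed target parameters
sig = np.exp(2j * np.pi * (true_f0 * t + 0.5 * true_rate * t**2))
sig += 0.5 * (np.random.randn(t.size) + 1j * np.random.randn(t.size))

best = max(
    (np.abs(np.fft.fft(sig * np.exp(-1j * np.pi * r * t**2))).max(), r)
    for r in np.arange(0.0, 200.0, 1.0)          # candidate chirp rates
)
print("estimated chirp rate:", best[1])          # ~80 Hz/s -> motion parameters
```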
Tracking moving objects is a task of the utmost importance to the defence community. As this task requires high accuracy, rather than employing a single detector, it has become common to use multiple ones. In such cases, the tracks produced by these detectors need to be correlated (if they belong to the same sensing modality) or associated (if they were produced by different sensing modalities). In this work, we introduce Computational-Intelligence-based methods for correlating and associating various contacts and tracks pertaining to maritime vessels in an area of interest. Fuzzy k-Nearest Neighbours will be used to conduct track correlation and Fuzzy C-Means clustering will be applied for association. In that way, the uncertainty of the track correlation and association is handled through fuzzy logic. To better model the state of the moving target, the traditional Kalman Filter will be extended using an Echo State Network. Experimental results on five different types of sensing systems will be discussed to justify the choices made in the development of our approach. In particular, we will demonstrate the judiciousness of using Fuzzy k-Nearest Neighbours and Fuzzy C-Means on our tracking system and show how the extension of the traditional Kalman Filter by a recurrent neural network is superior to its extension by other methods.
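As a compact illustration of the association step, here is textbook Fuzzy C-Means in NumPy (a generic implementation, not the authors'):

```python
# Minimal Fuzzy C-Means: soft-assigns track feature vectors to c clusters,
# as used for associating tracks from different sensing modalities.
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, eps=1e-5, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)           # fuzzy memberships, rows sum to 1
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2 / (m - 1)))      # standard FCM membership update
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < eps:
            break
        U = U_new
    return centers, U                            # cluster centers and memberships

pts = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 4])
centers, U = fuzzy_c_means(pts, c=2)
print(np.round(centers, 2))
```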
Since the massive deployment of Cyber-Physical Systems (CPSs) calls for long-range and reliable communication services with manageable cost, it is believed to be an inevitable trend that a significant portion of CPS traffic will be relayed through existing networking infrastructures such as the Internet. Adversaries who have access to networking infrastructures can therefore eavesdrop on network traffic and perform traffic analysis attacks in order to identify CPS sessions and subsequently launch various attacks. As we can hardly prevent all adversaries from accessing network infrastructures, thwarting traffic analysis attacks becomes indispensable, and traffic morphing serves as an effective means toward this end. In this paper, a novel traffic morphing algorithm, CPSMorph, is proposed to protect CPS sessions. CPSMorph maintains a number of network sessions whose distributions of inter-packet delays are statistically indistinguishable from those of typical network sessions. A CPS message is sent through one of these sessions with assured satisfaction of its time constraint. CPSMorph strives to minimize overhead by dynamically adjusting the morphing process, and it is characterized by low complexity as well as high adaptivity to the changing dynamics of CPS sessions. Experimental results show that CPSMorph can effectively perform traffic morphing for real-time CPS messages with moderate overhead.
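The abstract leaves the algorithm's internals open; the following sketch illustrates the core idea as described, under assumed parameters: cover sessions whose inter-packet delays follow a typical traffic model, with each CPS message mapped to the earliest cover slot that meets its deadline.

```python
# Sketch of the morphing idea: each cover session schedules packets with
# inter-packet delays drawn from a "typical" distribution; a CPS message is
# mapped onto whichever session next transmits before the message's deadline.
# (Illustrative reconstruction; not the CPSMorph algorithm itself.)
import random

class CoverSession:
    def __init__(self, now=0.0):
        self.next_slot = now + self._cover_delay()
    def _cover_delay(self):
        return random.expovariate(1 / 0.05)     # assumed typical delay model
    def advance(self):
        self.next_slot += self._cover_delay()

def send_cps_message(sessions, now, deadline):
    ok = [s for s in sessions if s.next_slot <= deadline]
    if not ok:
        return None                              # would violate the time constraint
    s = min(ok, key=lambda s: s.next_slot)       # earliest indistinguishable slot
    slot = s.next_slot
    s.advance()
    return slot                                  # message rides this cover slot

sessions = [CoverSession() for _ in range(4)]
print(send_cps_message(sessions, now=0.0, deadline=0.2))
```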
In this paper, we propose techniques for combating source selective jamming attacks in tactical cognitive MANETs. Secure, reliable, and seamless communications are important for facilitating tactical operations. Selective jamming attacks pose a serious security threat to wireless tactical MANETs, since selective strategies can completely isolate a portion of the network from other nodes without giving a clear indication of a problem. Our proposed mitigation techniques use the concept of address manipulation and differ from other techniques in the open literature in that they employ a decentralized architecture rather than a centralized framework and require no extra overhead. Experimental results show that the proposed techniques enable communications in the presence of source selective jamming attacks. When a source selective jammer blocks transmissions completely, the proposed flipped address mechanism increases the expected number of required transmission attempts by only one. The probability that our second approach, random address assignment, fails to resolve the correct source MAC address can be as small as 10^-7 with appropriate parameter selection.
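A minimal sketch of the flipped-address concept (our reading of the abstract, not the paper's code): the sender complements the source MAC so an address-triggered jammer misses the frame, and the receiver inverts the flip.

```python
# Sketch of the flipped-address idea: complement the source MAC so a jammer
# triggering on that address misses the retransmission; the receiver inverts
# the flip to recover the true source. (Conceptual illustration only.)
def flip_mac(mac: bytes) -> bytes:
    return bytes(b ^ 0xFF for b in mac)   # bitwise complement of each octet

src = bytes.fromhex("0242ac110002")       # hypothetical source MAC
tx_addr = flip_mac(src)                   # address seen on air by the jammer
assert flip_mac(tx_addr) == src           # receiver recovers the real source
```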
This paper presents the design and implementation of an information flow tracking framework based on code rewriting to prevent sensitive information leaks in browsers, combining the ideas of taint tracking and information flow analysis. Our system has two main stages. First, it abstracts the semantics of JavaScript code and converts it to a general intermediate representation based on the JavaScript abstract syntax tree. Second, the abstract intermediate representation is processed by a special taint engine to analyze tainted information flows. Our approach ensures fine-grained isolation for both the confidentiality and the integrity of information. We have implemented a proof-of-concept prototype, named JSTFlow, and deployed it as a browser proxy that rewrites web applications at runtime. The experimental results show that JSTFlow can guarantee the security of sensitive data and detect XSS attacks with about 3x performance overhead. Because it does not involve any modifications to the target system, our system is readily deployable in practice.
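As a language-neutral illustration of taint propagation over an abstract syntax tree (Python's ast module standing in for the paper's JavaScript intermediate representation; source and sink names are hypothetical):

```python
# Tiny taint-propagation demo over an AST, mirroring the idea of tracking
# tainted data flows from sources to sinks.
import ast

SOURCE_VARS = {"document_cookie"}          # assumed taint sources

def is_tainted(node, tainted):
    if isinstance(node, ast.Name):
        return node.id in tainted
    return any(is_tainted(c, tainted) for c in ast.iter_child_nodes(node))

code = "x = document_cookie\ny = 'u=' + x\nsend(y)"
tainted = set(SOURCE_VARS)
for stmt in ast.parse(code).body:
    if isinstance(stmt, ast.Assign) and is_tainted(stmt.value, tainted):
        tainted.add(stmt.targets[0].id)    # taint flows through assignment
    if isinstance(stmt, ast.Expr) and isinstance(stmt.value, ast.Call):
        if is_tainted(stmt.value, tainted):
            print("tainted data reaches sink:", ast.unparse(stmt))
```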
CSRFGuard is a tool running on the Java EE platform to defend against Cross-Site Request Forgery (CSRF) attacks, but it has several shortcomings: scripts must be inserted manually, dynamically created requests are not handled effectively, and the defense can be bypassed through Cross-Site Scripting (XSS). This paper makes corresponding improvements. A Servlet filter is used to intercept responses, and a custom response wrapper class stores the page source so that script tags can be added, making script insertion automatic. The JavaScript event-delegation mechanism is used to bind forms to onfocus and onsubmit events, so that dynamically created requests are handled effectively. Tokens added dynamically on event triggers prevent the defense from being bypassed through XSS. The experimental results show that the improved CSRFGuard can effectively defend against CSRF attacks.
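CSRFGuard and these improvements are Java EE code; as a language-neutral sketch of the same response-wrapping idea, here is a hypothetical WSGI middleware that buffers an HTML response and injects the defense script automatically:

```python
# Analogous idea in Python/WSGI: buffer the HTML response and inject the
# CSRF-defense script tag automatically, like the Servlet response wrapper.
# (Conceptual analogue; CSRFGuard itself is a Java EE filter.)
def csrf_script_injector(app, script_url="/csrfguard.js"):
    def middleware(environ, start_response):
        captured = {}
        def capture(status, headers, exc_info=None):
            captured["status"], captured["headers"] = status, headers
        body = b"".join(app(environ, capture))          # buffer the response
        tag = ('<script src="%s"></script>' % script_url).encode()
        body = body.replace(b"</body>", tag + b"</body>")   # auto-insert script
        headers = [(k, v) for k, v in captured["headers"]
                   if k.lower() != "content-length"]
        headers.append(("Content-Length", str(len(body))))
        start_response(captured["status"], headers)
        return [body]
    return middleware
```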
Cross-Site Scripting (XSS) is a common attack technique that lets attackers inject code into the output of a web application; the injected code is delivered to the visitor's browser, where it executes automatically and steals sensitive information. To protect users from XSS attacks, many client-side solutions have been implemented; most of those in use are filters that sanitize malicious input. However, many of these filters provide no protection against newly designed, sophisticated attacks such as multiple injection points or injection into scripts. This paper proposes and implements an approach based on encoding unfiltered reflections for detecting vulnerable web applications that can be exploited by the aforementioned sophisticated attacks. Results show that the proposed approach achieves a higher detection rate for exploits. In addition, an implementation that blocks the execution of malicious scripts has been contributed to XSS-Me, an open-source Mozilla Firefox security extension that detects reflected XSS vulnerabilities; this could be an effective solution if integrated inside the browser rather than enforced as an extension.
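A minimal probe in the spirit of reflection testing (hypothetical target URL and parameter): submit a marker payload and check whether it comes back unencoded.

```python
# Minimal reflected-XSS probe: send a marker payload and check whether the
# response reflects it without encoding. (Illustrative only; test only
# against systems you are authorized to assess.)
import requests

def probe_reflection(url, param):
    marker = '<xssmark id="7f3a">'
    r = requests.get(url, params={param: marker}, timeout=10)
    if marker in r.text:
        return "VULNERABLE: payload reflected unencoded"
    if "xssmark" in r.text:
        return "payload reflected but encoded/filtered"
    return "no reflection detected"

print(probe_reflection("http://testsite.example/search", "q"))
```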
Physical impairments in long-haul optical networks mandate that optical signals be regenerated within the (so-called translucent) network. Being expensive devices, regenerators are expected to be allocated sparsely and must be judiciously utilized. Next-generation optical-transport networks will include multiple domains with diverse technologies, protocols, granularities, and carriers. Because of confidentiality and scalability concerns, the scope of network-state information (e.g., topology, wavelength availability) may be limited to within a domain. In such networks, the problem of routing and wavelength assignment (RWA) aims to find an adequate route and wavelength(s) for lightpaths carrying end-to-end service demands. Some state information may have to be explicitly exchanged among the domains to facilitate the RWA process. The challenge is to determine which information is the most critical and make a wise choice for the path and wavelength(s) using the limited information. Recently, a framework for multidomain path computation called backward-recursive path-computation (BRPC) was standardized by the Internet Engineering Task Force. In this paper, we consider the RWA problem for connections within a single domain and interdomain connections so that the quality of transmission (QoT) requirement of each connection is satisfied, and the network-level performance metric of blocking probability is minimized. Cross-layer heuristics that are based on dynamic programming to effectively allocate the sparse regenerators are developed, and extensive simulation results are presented to demonstrate their effectiveness.
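As a simplified illustration of dynamic programming for sparse regenerator allocation (a toy single-path model with assumed hop lengths, regenerator sites, and reach; not the paper's cross-layer heuristics):

```python
# Simplified DP: place the fewest regenerators along a fixed route so that
# no transparent segment exceeds the maximum QoT reach.
import math

def min_regens(hop_len, regen_site, reach):
    n = len(hop_len)                      # hops between nodes 0..n
    best = [math.inf] * (n + 1)           # best[i]: min regens, signal fresh at node i
    best[0] = 0
    answer = math.inf
    for i in range(n + 1):
        if math.isinf(best[i]):
            continue
        acc = 0.0
        for j in range(i + 1, n + 1):
            acc += hop_len[j - 1]
            if acc > reach:
                break                     # QoT would be violated on this segment
            if j == n:
                answer = min(answer, best[i])        # destination reached transparently
            elif regen_site[j]:
                best[j] = min(best[j], best[i] + 1)  # regenerate at node j
    return None if math.isinf(answer) else answer

# Route with 4 hops; nodes 1 and 2 host regenerators; max reach 900 km.
print(min_regens([300, 400, 350, 500], [False, True, True, False, False], 900))  # -> 1
```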
In 2013, Biswas and Misic proposed a new privacy-preserving authentication scheme for WAVE-based vehicular ad hoc networks (VANETs), claiming to use a variant of the Elliptic Curve Digital Signature Algorithm (ECDSA). However, our study discovered that their authentication scheme is vulnerable to a private key reveal attack: any malicious receiving vehicle that obtains a valid signature from a legal signing vehicle can recover the signing vehicle's private key from that signature. Hence, the authentication scheme proposed by Biswas and Misic is insecure. We propose an improved version to overcome this weakness. The improved scheme also supports identity revocation and tracing; based on this property, the CA and a receiving entity (RSU or OBU) can check whether a received signature was generated by a revoked vehicle. A security analysis is also conducted to evaluate the strength of the proposed authentication scheme.
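The abstract does not spell out the flaw's mechanics, but the classic algebra behind such private-key reveals is instructive: in ECDSA-style schemes, learning the per-signature secret k lets anyone solve the signing equation for the private key.

```python
# The algebra behind a private-key reveal: in ECDSA-style schemes,
# s = k^-1 (e + d*r) mod n, so anyone who learns the per-signature
# secret k recovers the private key d = (s*k - e) * r^-1 mod n.
# (Toy numbers; the elliptic-curve step that yields r is elided.)
n = 97                      # toy group order (a real one is ~2^256)
d, k, e, r = 23, 5, 41, 17  # private key, nonce, msg hash, pretend r = (kG).x
s = pow(k, -1, n) * (e + d * r) % n            # signing equation
d_recovered = (s * k - e) * pow(r, -1, n) % n
assert d_recovered == d                        # key fully recovered from (r, s, k, e)
```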
In view of the difficulty of selecting the wavelet basis and decomposition level for wavelet-based de-noising methods, this paper proposes an adaptive de-noising method based on Ensemble Empirical Mode Decomposition (EEMD). An autocorrelation/cross-correlation method is used to adaptively find the signal-to-noise boundary layer of the EEMD. The noise-dominant layers are then filtered out directly, and the signal-dominant layers are threshold de-noised. Finally, the de-noised signal is reconstructed from the de-noised layer components. The method solves the mode-mixing problem of Empirical Mode Decomposition (EMD) by using EEMD and combines the advantages of wavelet thresholding. In this paper, we focus on analyzing and verifying the correctness of the adaptive determination of the noise-dominant layers. Simulation results show that this de-noising method is efficient and has good adaptability.
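A minimal sketch of the pipeline using the PyEMD package; a crude lag-1 autocorrelation heuristic stands in for the paper's autocorrelation/cross-correlation boundary test, and the per-layer thresholding step is omitted.

```python
# Sketch of EEMD-based adaptive denoising (requires the PyEMD package).
import numpy as np
from PyEMD import EEMD

t = np.linspace(0, 1, 1000)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.4 * np.random.randn(t.size)

imfs = EEMD().eemd(noisy, t)                  # ensemble empirical mode decomposition

def lag1_autocorr(x):
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / (np.dot(x, x) + 1e-12)

# Noise-dominant IMFs decorrelate quickly; find the first "signal-like" layer.
boundary = next((i for i, imf in enumerate(imfs) if lag1_autocorr(imf) > 0.9),
                len(imfs))
denoised = imfs[boundary:].sum(axis=0)        # drop the noise-dominant layers
print("boundary layer:", boundary)
```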
The discrete fractional Fourier transform (DFRFT) is a generalization of the discrete Fourier transform. There are a number of DFRFT proposals, which are useful for various signal processing applications. This paper investigates practical solutions toward the construction of unconditionally secure communication systems based on the DFRFT via a cross-layer approach. By introducing a distorted signal parameter, the sender randomly flip-flops between the distorted signal parameter and the general signal parameter to confuse the attacker, guaranteeing the advantages of the legitimate partners. We extend these advantages by developing novel security codes on top of the proposed cross-layer DFRFT secure communication model, aiming to achieve an error-free legitimate channel while preventing the eavesdropper from obtaining any useful information. A strong cross-layer secure mobile communication model is thus built.
This paper proposes MIMO cross-layer precoding for secure communications, with the precoding pattern controlled by higher-layer cryptography. In contrast to physical-layer security systems, the proposed scheme can enhance security in adverse situations that physical-layer security alone can hardly handle. Two typical situations are considered: one in which the attackers have ideal CSI, and another in which the eavesdropper's channel is highly correlated with the legitimate channel. Our scheme integrates upper-layer and physical-layer security to guarantee security in real communication systems. Extensive theoretical analysis and simulations are conducted to demonstrate its effectiveness. The proposed method can be extended to many other communication scenarios.
Virtualized environments are widely thought to cause problems for software-based random number generators (RNGs), due to use of virtual machine (VM) snapshots as well as fewer and believed-to-be lower quality entropy sources. Despite this, we are unaware of any published analysis of the security of critical RNGs when running in VMs. We fill this gap, using measurements of Linux's RNG systems (without the aid of hardware RNGs, the most common use case today) on Xen, VMware, and Amazon EC2. Despite CPU cycle counters providing a significant source of entropy, various deficiencies in the design of the Linux RNG make its first output vulnerable during VM boots and, more critically, make it suffer from catastrophic reset vulnerabilities. We show cases in which the RNG will output the exact same sequence of bits each time it is resumed from the same snapshot. This can compromise, for example, cryptographic secrets generated after resumption. We explore legacy-compatible countermeasures, as well as a clean-slate solution. The latter is a new RNG called Whirlwind that provides a simpler, more secure solution for providing system randomness.
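The reset vulnerability is easy to illustrate in miniature: any RNG whose full state is captured in a snapshot replays identical output after every resume unless fresh entropy is injected, as this toy demonstration shows.

```python
# Miniature illustration of a snapshot "reset vulnerability": restoring the
# same PRNG state (as a VM snapshot does) replays identical "random" output
# unless the generator is refreshed with new entropy after resume.
import random

rng = random.Random(1234)
snapshot = rng.getstate()                  # state frozen into the VM image

rng.setstate(snapshot)                     # resume #1
run1 = [rng.getrandbits(32) for _ in range(4)]
rng.setstate(snapshot)                     # resume #2 from the same snapshot
run2 = [rng.getrandbits(32) for _ in range(4)]
assert run1 == run2                        # identical "secrets" after each resume

rng.setstate(snapshot)
rng.seed(random.SystemRandom().getrandbits(64))  # reseed from fresh entropy post-resume
assert [rng.getrandbits(32) for _ in range(4)] != run1
```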
The growing popularity and development of data mining technologies bring serious threats to the security of individuals' sensitive information. An emerging research topic in data mining, known as privacy-preserving data mining (PPDM), has been extensively studied in recent years. The basic idea of PPDM is to modify the data in such a way that data mining algorithms can be performed effectively without compromising the security of the sensitive information contained in the data. Current studies of PPDM mainly focus on how to reduce the privacy risk brought by data mining operations, while in fact unwanted disclosure of sensitive information may also happen in the processes of data collection, data publishing, and information (i.e., data mining result) delivery. In this paper, we view the privacy issues related to data mining from a wider perspective and investigate various approaches that can help to protect sensitive information. In particular, we identify four different types of users involved in data mining applications, namely, the data provider, data collector, data miner, and decision maker. For each type of user, we discuss their privacy concerns and the methods that can be adopted to protect sensitive information. We briefly introduce the basics of related research topics, review state-of-the-art approaches, and present some preliminary thoughts on future research directions. Besides exploring the privacy-preserving approaches for each type of user, we also review game-theoretical approaches, which are proposed for analyzing the interactions among different users in a data mining scenario, each of whom has his own valuation of the sensitive information. By differentiating the responsibilities of different users with respect to the security of sensitive information, we would like to provide some useful insights into the study of PPDM.
With the advent of social networks and cloud computing, the amount of multimedia data produced and communicated within social networks is rapidly increasing. Meanwhile, social networking platforms based on cloud computing have made multimedia big-data sharing in social networks easier and more efficient. The growth of social multimedia, as demonstrated by social networking sites such as Facebook and YouTube, combined with advances in multimedia content analysis, underscores potential risks for malicious use such as illegal copying, piracy, plagiarism, and misappropriation. Therefore, secure multimedia sharing and traitor tracing have become critical and urgent issues in social networks. In this paper, we propose a scheme for implementing the Tree-Structured Haar (TSH) transform in a homomorphic encrypted domain for fingerprinting, using social network analysis to protect media distribution in social networks. The motivation is to map the hierarchical community structure of the social network onto the tree structure of the TSH transform for JPEG2000 coding, encryption, and fingerprinting. First, the fingerprint code is produced using social network analysis. Second, the encrypted content is decomposed by the TSH transform. Third, the content is fingerprinted in the TSH transform domain. Finally, the encrypted and fingerprinted contents are delivered to users via hybrid multicast-unicast. The use of fingerprinting along with encryption provides a double layer of protection for media sharing in social networks. Theoretical analysis and experimental results show the effectiveness of the proposed scheme.
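Because the Haar/TSH analysis steps are linear, they can be evaluated under an additively homomorphic cipher. A one-step sketch with the python-paillier (phe) library, standing in for the paper's full TSH/JPEG2000 pipeline:

```python
# One Haar analysis step computed on Paillier ciphertexts: the transform is
# linear, so sums/differences can be evaluated without decrypting.
# (Illustrative; uses the `phe` python-paillier library.)
from phe import paillier

pub, priv = paillier.generate_paillier_keypair(n_length=1024)
a, b = 52, 40                       # two neighbouring coefficients (plaintexts)
ea, eb = pub.encrypt(a), pub.encrypt(b)

e_avg = (ea + eb) * 0.5             # homomorphic Haar average
e_diff = (ea - eb) * 0.5            # homomorphic Haar difference
e_marked = e_diff + pub.encrypt(3)  # embed a fingerprint symbol additively

print(priv.decrypt(e_avg), priv.decrypt(e_diff), priv.decrypt(e_marked))
# -> 46.0 6.0 9.0
```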
Recently, cloud computing has been spotlighted as a new paradigm for database management. In this environment, databases are outsourced and deployed on a service provider's infrastructure in order to reduce the cost of data storage and maintenance. However, the service provider might be untrusted, so two data security issues, data confidentiality and query result integrity, become major concerns for users. Existing bucket-based data authentication methods have the problem that the original spatial data distribution can be disclosed from the data authentication index due to unsophisticated data grouping strategies; in addition, the transmission overhead of the verification object is high. In this paper, we propose a privacy-aware query authentication scheme that guarantees data confidentiality and query result integrity for users. A periodic-function-based data grouping scheme is designed to privately partition a spatial database into small groups and generate a signature for each group. The group signatures are used to check the correctness and completeness of the outsourced data when answering a user's range query. A performance evaluation shows that the proposed method outperforms the existing method in range query processing time by up to 3x.
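A toy sketch of the grouping-plus-signature idea; this is our illustrative reading, where the periodic mapping, bucket counts, and HMAC-as-signature are all assumptions.

```python
# Sketch of periodic-function-based grouping with per-group authentication
# tags: points are bucketed by a periodic transform of their coordinates
# (so positions one period apart share a group, hiding the raw spatial
# distribution), and each bucket gets a MAC a client can verify for
# correctness/completeness. HMAC stands in for the paper's signatures.
import hmac, hashlib
from collections import defaultdict

KEY, PERIOD, BUCKETS = b"owner-secret", 10.0, 4

def group_id(x, y):
    # periodic mapping: coordinates one PERIOD apart land in the same group
    return (int(x // (PERIOD / BUCKETS)) % BUCKETS,
            int(y // (PERIOD / BUCKETS)) % BUCKETS)

points = [(1.2, 3.4), (11.3, 13.5), (6.0, 7.7)]   # toy spatial database
groups = defaultdict(list)
for p in points:
    groups[group_id(*p)].append(p)

signatures = {
    gid: hmac.new(KEY, repr(sorted(pts)).encode(), hashlib.sha256).hexdigest()
    for gid, pts in groups.items()
}   # per-group tag returned with range-query results for verification
print(signatures)
```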
This paper analyzes a data spill incident case to study policy issues for ICT security from a social science perspective focused on risk. The results of the case analysis are as follows. First, ICT risk can be categorized as 'severe, strong, intensive, or individual' according to the levels of both probability and impact. Second, risk management strategies can be designated as 'avoid, transfer, mitigate, or accept' by understanding the culture type of the group involved, such as 'hierarchy, egalitarianism, fatalism, or individualism'. Third, personal data exhibits big data characteristics, such as 'volume, velocity, and variety', in each risk situation. Therefore, governments need to establish a standing organization responsible for ICT risk policy and management in the new big data era, and ICT risk management policy needs to balance considerations of 'technology, norms, laws, and the market'.