Biblio

Found 758 results

Filters: First Letter Of Last Name is E
2015-05-06
Mokhtar, B., Eltoweissy, M..  2014.  Towards a Data Semantics Management System for Internet Traffic. New Technologies, Mobility and Security (NTMS), 2014 6th International Conference on. :1-5.

Although current Internet operations generate voluminous data, they remain largely oblivious of traffic data semantics. This poses many inefficiencies and challenges due to emergent or anomalous behavior impacting the vast array of Internet elements such as services and protocols. In this paper, we propose a Data Semantics Management System (DSMS) for learning Internet traffic data semantics to enable smarter semantics-driven networking operations. We extract networking semantics and build and utilize a dynamic ontology of network concepts to better recognize and act upon emergent or abnormal behavior. Our DSMS utilizes: (1) the Latent Dirichlet Allocation algorithm (LDA) for latent feature extraction and semantics reasoning; (2) big tables as a cloud-like data storage technique to maintain large-scale data; and (3) the Locality Sensitive Hashing algorithm (LSH) for reducing data dimensionality. Our preliminary evaluation using real Internet traffic shows the efficacy of DSMS for learning the behavior of normal and abnormal traffic data and for accurately detecting anomalies at low cost.
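
A minimal Python sketch of the LDA step described above, using scikit-learn on a few invented traffic "documents" (tokenized flow records); the tokens, names and topic count are illustrative assumptions, not the authors' implementation.

    # Hypothetical LDA step of a semantics-extraction pipeline; input documents are invented.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    flow_docs = [
        "tcp port80 syn ack http get",
        "tcp port80 syn ack http post",
        "udp port53 dns query a_record",
        "tcp port22 syn syn syn syn",      # scan-like behaviour
    ]

    counts = CountVectorizer().fit_transform(flow_docs)   # bag-of-tokens matrix
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    topic_mix = lda.fit_transform(counts)                 # per-document topic weights

    # Documents whose topic mixture deviates strongly from the usual topics
    # can be flagged as candidate anomalies for closer inspection.
    print(topic_mix)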
 

Eddeen, L.M.H.N., Saleh, E.M., Saadah, D..  2014.  Genetic Hash Algorithm. Computer Science and Information Technology (CSIT), 2014 6th International Conference on. :23-26.

Security is becoming a major concern in computing. New techniques are evolving every day; one of these techniques is Hash Visualization. Hash Visualization uses complex, randomly generated images for security; these images can be used to hide data (watermarking). This proposed new technique improves hash visualization by using genetic algorithms. Genetic algorithms are a search optimization technique based on the evolution of living creatures. The proposed technique uses genetic algorithms to improve hash visualization. The genetic algorithm used was far faster than previous traditional ones, and it improved hash visualization by evolving the tree used to generate the images, in order to obtain a better and larger tree that will generate images with higher security. Security was satisfied by calculating the fitness value for each chromosome based on a specifically designed algorithm.
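
A generic genetic-algorithm skeleton in Python, included only to illustrate the selection/crossover/mutation loop the abstract refers to; the fitness function is a placeholder, not the paper's image-tree scoring, and all parameters are assumptions.

    # Generic GA loop (placeholder fitness; parameters are illustrative).
    import random

    def fitness(chromosome):
        # Stand-in objective: the paper scores the image-generating tree instead.
        return sum(chromosome)

    def evolve(pop_size=20, genes=16, generations=50, mutation_rate=0.05):
        population = [[random.randint(0, 1) for _ in range(genes)] for _ in range(pop_size)]
        for _ in range(generations):
            # Tournament selection
            parents = [max(random.sample(population, 3), key=fitness) for _ in range(pop_size)]
            children = []
            for a, b in zip(parents[::2], parents[1::2]):
                cut = random.randrange(1, genes)           # single-point crossover
                children += [a[:cut] + b[cut:], b[:cut] + a[cut:]]
            for child in children:                         # bit-flip mutation
                for i in range(genes):
                    if random.random() < mutation_rate:
                        child[i] ^= 1
            population = children
        return max(population, key=fitness)

    print(evolve())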
 

Alshammari, H., Elleithy, K., Almgren, K., Albelwi, S..  2014.  Group signature entanglement in e-voting system. Systems, Applications and Technology Conference (LISAT), 2014 IEEE Long Island. :1-4.

In any security system, there are many security issues that are related to either the sender or the receiver of the message. Quantum computing has proven to be a plausible approach to solving many security issues such as eavesdropping, replay attacks and man-in-the-middle attacks. In the e-voting system, one of these issues has been solved, namely, the integrity of the data (ballot). In this paper, we propose a scheme that solves the problem of repudiation that could occur when the voter denies the value of the ballot, either for cheating purposes or because of a real change in the value by a third party. By using an entanglement concept between two randomly chosen parties, the person who is going to verify the ballots creates the entangled state and keeps it in a database for future use, for the purpose of non-repudiation by either of these two voters.
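
A plain state-vector illustration (NumPy) of the entanglement property the scheme relies on, namely that measurements of the two qubits of a Bell state are perfectly correlated; this shows only the underlying concept, not the paper's e-voting protocol.

    # Bell state (|00> + |11>)/sqrt(2): the two measured bits always agree.
    import numpy as np

    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
    probs = np.abs(bell) ** 2                  # probabilities over |00>, |01>, |10>, |11>

    rng = np.random.default_rng(0)
    for outcome in rng.choice(4, size=10, p=probs):
        bit_a, bit_b = (outcome >> 1) & 1, outcome & 1
        assert bit_a == bit_b                  # outcomes are perfectly correlated
    print("all sampled joint outcomes were correlated")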

Burley, Diana L., Eisenberg, Jon, Goodman, Seymour E..  2014.  Would Cybersecurity Professionalization Help Address the Cybersecurity Crisis? Commun. ACM. 57:24–27.

Evaluating the trade-offs involved in cybersecurity professionalization.

Endicott-Popovsky, Barbara E., Popovsky, Viatcheslav M..  2014.  Application of Pedagogical Fundamentals for the Holistic Development of Cybersecurity Professionals. ACM Inroads. 5:57–68.

Nowhere is the problem of lack of human capital more keenly felt than in the field of cybersecurity where the numbers and quality of well-trained graduates are woefully lacking [10]. In 2005, the National Academy of Sciences indicted the US education system as the culprit contributing to deficiencies in our technical workforce, sounding the alarm that we are at risk of losing our competitive edge [14]. While the government has made cybersecurity education a national priority, seeking to stimulate university and community college production of information assurance (IA) expertise, they still have thousands of IA jobs going unfilled. The big question for the last decade [17] has been 'where will we find the talent we need?' In this article, we describe one university's approach to begin addressing this problem and discuss an innovative curricular model that holistically develops future cybersecurity professionals.

 

Mukaddam, A., Elhajj, I., Kayssi, A., Chehab, A..  2014.  IP Spoofing Detection Using Modified Hop Count. Advanced Information Networking and Applications (AINA), 2014 IEEE 28th International Conference on. :512-516.

With the global widespread usage of the Internet, more and more cyber-attacks are being performed. Many of these attacks utilize IP address spoofing. This paper describes IP spoofing attacks and the proposed methods currently available to detect or prevent them. In addition, it presents a statistical analysis of the Hop Count parameter used in our proposed IP spoofing detection algorithm. We propose an algorithm, inspired by the Hop Count Filtering (HCF) technique, that changes the learning phase of HCF to include all the possible available Hop Count values. Compared to the original HCF method and its variants, our proposed method increases the true positive rate by at least 9% and consequently increases the overall accuracy of an intrusion detection system by at least 9%. In general, our proposed method performs better than the HCF method and its variants.
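
A small Python sketch of the hop-count check that HCF-style detection builds on; the learned table, IP address and TTL values are invented, and the paper's modified learning phase is only hinted at in the comments.

    # Hop count is inferred as (assumed initial TTL) - (observed TTL), using the
    # smallest common initial TTL that is not below the observed value.
    COMMON_INITIAL_TTLS = (32, 64, 128, 255)

    def hop_count(observed_ttl):
        initial = min(t for t in COMMON_INITIAL_TTLS if t >= observed_ttl)
        return initial - observed_ttl

    # Learning phase: plausible hop counts per source IP (the paper's modification
    # widens this set to all possible available hop count values).
    learned = {"203.0.113.7": {11, 12, 13}}

    def looks_spoofed(src_ip, observed_ttl):
        return hop_count(observed_ttl) not in learned.get(src_ip, set())

    print(looks_spoofed("203.0.113.7", 52))    # hop count 12 -> consistent
    print(looks_spoofed("203.0.113.7", 120))   # hop count 8  -> flagged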
 

2015-05-05
Koch, S., John, M., Worner, M., Muller, A., Ertl, T..  2014.  VarifocalReader — In-Depth Visual Analysis of Large Text Documents. Visualization and Computer Graphics, IEEE Transactions on. 20:1723-1732.

Interactive visualization provides valuable support for exploring, analyzing, and understanding textual documents. Certain tasks, however, require that insights derived from visual abstractions are verified by a human expert perusing the source text. So far, this problem is typically solved by offering overview-detail techniques, which present different views with different levels of abstractions. This often leads to problems with visual continuity. Focus-context techniques, on the other hand, succeed in accentuating interesting subsections of large text documents but are normally not suited for integrating visual abstractions. With VarifocalReader we present a technique that helps to solve some of these approaches' problems by combining characteristics from both. In particular, our method simplifies working with large and potentially complex text documents by simultaneously offering abstract representations of varying detail, based on the inherent structure of the document, and access to the text itself. In addition, VarifocalReader supports intra-document exploration through advanced navigation concepts and facilitates visual analysis tasks. The approach enables users to apply machine learning techniques and search mechanisms as well as to assess and adapt these techniques. This helps to extract entities, concepts and other artifacts from texts. In combination with the automatic generation of intermediate text levels through topic segmentation for thematic orientation, users can test hypotheses or develop interesting new research questions. To illustrate the advantages of our approach, we provide usage examples from literature studies.

Eun Hee Ko, Klabjan, D..  2014.  Semantic Properties of Customer Sentiment in Tweets. Advanced Information Networking and Applications Workshops (WAINA), 2014 28th International Conference on. :657-663.

An increasing number of people are using online social networking services (SNSs), and a significant amount of information related to experiences in consumption is shared in this new media form. Text mining is an emerging technique for mining useful information from the web. We aim at discovering, in particular in tweets, semantic patterns in consumers' discussions on social media. Specifically, the purposes of this study are twofold: 1) finding the similarity and dissimilarity between two sets of textual documents that contain consumers' sentiment polarities, in the two forms of positive vs. negative opinions, and 2) deriving actual content with a semantic trend from the textual data. The considered tweets include consumers' opinions on US retail companies (e.g., Amazon, Walmart). Cosine similarity and K-means clustering methods are used to achieve the former goal, and Latent Dirichlet Allocation (LDA), a popular topic modeling algorithm, is used for the latter purpose. This is the first study to discover semantic properties of textual data in a consumption context beyond sentiment analysis. In addition to the major findings, we apply LDA to the same data and derive latent topics that represent consumers' positive and negative opinions on social media.
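
An illustrative scikit-learn pipeline for the two stated goals on an invented mini-corpus: TF-IDF vectors with cosine similarity between the positive and negative sets, and K-means clustering; the tweets and parameters are assumptions, not the study's data.

    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity
    from sklearn.cluster import KMeans

    positive = ["great prices and fast delivery", "love this store, friendly staff"]
    negative = ["terrible support and late delivery", "never shopping here again"]

    X = TfidfVectorizer().fit_transform(positive + negative)

    # Similarity between the centroids of the two sentiment sets
    pos_centroid = np.asarray(X[:2].mean(axis=0))
    neg_centroid = np.asarray(X[2:].mean(axis=0))
    print("cosine similarity:", cosine_similarity(pos_centroid, neg_centroid)[0, 0])

    # K-means clustering over all tweets
    print("cluster labels:", KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X))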

Heimerl, F., Lohmann, S., Lange, S., Ertl, T..  2014.  Word Cloud Explorer: Text Analytics Based on Word Clouds. System Sciences (HICSS), 2014 47th Hawaii International Conference on. :1833-1842.

Word clouds have emerged as a straightforward and visually appealing visualization method for text. They are used in various contexts as a means to provide an overview by distilling text down to those words that appear with highest frequency. Typically, this is done in a static way as pure text summarization. We think, however, that there is a larger potential to this simple yet powerful visualization paradigm in text analytics. In this work, we explore the usefulness of word clouds for general text analysis tasks. We developed a prototypical system called the Word Cloud Explorer that relies entirely on word clouds as a visualization method. It equips them with advanced natural language processing, sophisticated interaction techniques, and context information. We show how this approach can be effectively used to solve text analysis tasks and evaluate it in a qualitative user study.
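
The frequency-counting core of a word cloud, as a short Python sketch (illustrative only; the Word Cloud Explorer described above adds natural language processing, interaction techniques and context information on top of this).

    import re
    from collections import Counter

    text = "word clouds summarize text by showing frequent words in larger fonts"
    stopwords = {"by", "in", "the", "a", "of"}

    words = [w for w in re.findall(r"[a-z]+", text.lower()) if w not in stopwords]
    freq = Counter(words)

    max_count = max(freq.values())
    for word, count in freq.most_common(5):
        font_size = 10 + 30 * count / max_count    # scale frequency to a font size
        print(f"{word}: {font_size:.0f}pt")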

Okathe, T., Heydari, S.S., Sood, V., El-khatib, K..  2014.  Unified multi-critical infrastructure communication architecture. Communications (QBSC), 2014 27th Biennial Symposium on. :178-183.

Recent events have brought to light the increasingly intertwined nature of modern infrastructures. As a result, much effort is being put towards protecting these vital infrastructures, without which modern society suffers dire consequences. These infrastructures, due to their intricate nature, behave in complex ways. Improving their resilience and understanding their behavior requires a collaborative effort between the private sector that operates these infrastructures and the government sector that regulates them. This collaboration in the form of information sharing requires a new type of information network whose goal is twofold: to enable infrastructure operators to share status information among interdependent infrastructure nodes, and to allow for the sharing of vital information concerning threats and other contingencies in the form of alerts. A communication model that meets these requirements while maintaining flexibility and scalability is presented in this paper.
 

Chenine, M., Ullberg, J., Nordstrom, L., Wu, Y., Ericsson, G.N..  2014.  A Framework for Wide-Area Monitoring and Control Systems Interoperability and Cybersecurity Analysis. Power Delivery, IEEE Transactions on. 29:633-641.

Wide-area monitoring and control (WAMC) systems are the next-generation operational-management systems for electric power systems. The main purpose of such systems is to provide high-resolution real-time situational awareness in order to improve the operation of the power system by detecting and responding to fast-evolving phenomena in power systems. From an information and communication technology (ICT) perspective, the nonfunctional qualities of these systems are increasingly becoming important, and there is a need to evaluate and analyze the factors that impact these nonfunctional qualities. Enterprise architecture methods, which capture properties of ICT systems in architecture models and use these models as a basis for analysis and decision making, are a promising approach to meet these challenges. This paper presents a quantitative architecture analysis method for the study of WAMC ICT architectures, focusing primarily on the interoperability and cybersecurity aspects.
 

Hussain, A., Faber, T., Braden, R., Benzel, T., Yardley, T., Jones, J., Nicol, D.M., Sanders, W.H., Edgar, T.W., Carroll, T.E. et al..  2014.  Enabling Collaborative Research for Security and Resiliency of Energy Cyber Physical Systems. Distributed Computing in Sensor Systems (DCOSS), 2014 IEEE International Conference on. :358-360.

The University of Illinois at Urbana Champaign (Illinois), Pacific Northwest National Labs (PNNL), and the University of Southern California Information Sciences Institute (USC-ISI) consortium is working toward providing tools and expertise to enable collaborative research to improve security and resiliency of cyber physical systems. In this extended abstract we discuss the challenges and the solution space. We demonstrate the feasibility of some of the proposed components through a wide-area situational awareness experiment for the power grid across the three sites.
 

Thompson, M., Evans, N., Kisekka, V..  2014.  Multiple OS rotational environment an implemented Moving Target Defense. Resilient Control Systems (ISRCS), 2014 7th International Symposium on. :1-6.

Cyber-attacks continue to pose a major threat to existing critical infrastructure. Although suggestions for defensive strategies abound, Moving Target Defense (MTD) has only recently gained attention as a possible solution for mitigating cyber-attacks. The current work proposes an MTD technique that provides enhanced security through a rotation of multiple operating systems. The MTD solution developed in this research utilizes existing technology to provide a feasible dynamic defense solution that can be deployed easily in a real networking environment. In addition, the system we developed was tested extensively for effectiveness using CORE Impact Pro (CORE), Nmap, and manual penetration tests. The test results showed that platform diversity and rotation offer improved security. In addition, the likelihood of a successful attack decreased proportionally with the time between rotations.
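
An illustrative rotation scheduler in Python, sketching the "rotation of multiple operating systems" idea under an assumed platform pool and interval; it is not the authors' implementation, which switches live systems rather than printing names.

    import random
    import time

    PLATFORMS = ["ubuntu", "centos", "fedora", "freebsd"]   # assumed OS pool

    def next_platform(current):
        # Pick the next platform at random, avoiding an immediate repeat.
        return random.choice([p for p in PLATFORMS if p != current])

    def run(rotation_interval_s=300, rounds=3):
        current = random.choice(PLATFORMS)
        for _ in range(rounds):
            print("active platform:", current)
            time.sleep(rotation_interval_s)    # in a real deployment, switch the live VM here
            current = next_platform(current)

    # run()  # commented out: sleeps for several minutes per rotation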
 

Elwell, J., Riley, R., Abu-Ghazaleh, N., Ponomarev, D..  2014.  A Non-Inclusive Memory Permissions architecture for protection against cross-layer attacks. High Performance Computer Architecture (HPCA), 2014 IEEE 20th International Symposium on. :201-212.

Protecting modern computer systems and complex software stacks against the growing range of possible attacks is becoming increasingly difficult. The architecture of modern commodity systems allows attackers to subvert privileged system software often using a single exploit. Once the system is compromised, inclusive permissions used by current architectures and operating systems easily allow a compromised high-privileged software layer to perform arbitrary malicious activities, even on behalf of other software layers. This paper presents a hardware-supported page permission scheme for the physical pages that is based on the concept of non-inclusive sets of memory permissions for different layers of system software such as hypervisors, operating systems, and user-level applications. Instead of viewing privilege levels as an ordered hierarchy with each successive level being more privileged, we view them as distinct levels each with its own set of permissions. Such a permission mechanism, implemented as part of a processor architecture, provides a common framework for defending against a range of recent attacks. We demonstrate that such a protection can be achieved with negligible performance overhead, low hardware complexity and minimal changes to the commodity OS and hypervisor code.
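
A toy data-structure sketch of the non-inclusive idea: each software layer holds its own permission set for physical pages, and higher privilege does not imply access to other layers' pages. Layers, page names and permissions are invented; the paper implements this in processor hardware, not software.

    PERMS = {
        "hypervisor": {"vmcs_page": {"read", "write"}},
        "os":         {"kernel_page": {"read", "write", "execute"}},
        "user":       {"app_page": {"read", "execute"}},
    }

    def allowed(layer, page, access):
        # A layer may access a page only if its own table grants it.
        return access in PERMS.get(layer, {}).get(page, set())

    print(allowed("os", "kernel_page", "write"))   # True
    print(allowed("os", "app_page", "read"))       # False: not in the OS's own set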
 

Sarikaya, Y., Ercetin, O., Koksal, C.E..  2014.  Confidentiality-Preserving Control of Uplink Cellular Wireless Networks Using Hybrid ARQ. Networking, IEEE/ACM Transactions on. PP:1-1.

We consider the problem of cross-layer resource allocation with information-theoretic secrecy for uplink transmissions in time-varying cellular wireless networks. In particular, each node in an uplink cellular network injects two types of traffic, confidential and open, at rates chosen in order to maximize a global utility function while keeping the data queues stable and meeting a constraint on the secrecy outage probability. The transmitting node only knows the distribution of channel gains. Our scheme is based on Hybrid Automatic Repeat Request (HARQ) transmission with incremental redundancy. We prove that our scheme achieves a utility arbitrarily close to the maximum achievable. Numerical experiments are performed to verify the analytical results and to show the efficacy of the dynamic control algorithm.
 

Rieke, R., Repp, J., Zhdanova, M., Eichler, J..  2014.  Monitoring Security Compliance of Critical Processes. Parallel, Distributed and Network-Based Processing (PDP), 2014 22nd Euromicro International Conference on. :552-560.

Enforcing security in process-aware information systems at runtime requires the monitoring of systems' operation using process information. Analysis of this information with respect to security and compliance aspects is growing in complexity with the increase in functionality, connectivity, and dynamics of process evolution. To tackle this complexity, the application of models is becoming standard practice. Considering today's frequent changes to processes, model-based support for security and compliance analysis is not only needed in pre-operational phases but also at runtime. This paper presents an approach to support evaluation of the security status of processes at runtime. The approach is based on operational formal models derived from process specifications and security policies comprising technical, organizational, regulatory and cross-layer aspects. A process behavior model is synchronized by events from the running process and utilizes prediction of expected close-future states to find possible security violations and allow early decisions on countermeasures. The applicability of the approach is exemplified by a misuse case scenario from a hydroelectric power plant.

Everspaugh, A., Yan Zhai, Jellinek, R., Ristenpart, T., Swift, M..  2014.  Not-So-Random Numbers in Virtualized Linux and the Whirlwind RNG. Security and Privacy (SP), 2014 IEEE Symposium on. :559-574.

Virtualized environments are widely thought to cause problems for software-based random number generators (RNGs), due to use of virtual machine (VM) snapshots as well as fewer and believed-to-be lower quality entropy sources. Despite this, we are unaware of any published analysis of the security of critical RNGs when running in VMs. We fill this gap, using measurements of Linux's RNG systems (without the aid of hardware RNGs, the most common use case today) on Xen, VMware, and Amazon EC2. Despite CPU cycle counters providing a significant source of entropy, various deficiencies in the design of the Linux RNG make its first output vulnerable during VM boots and, more critically, make it suffer from catastrophic reset vulnerabilities. We show cases in which the RNG will output the exact same sequence of bits each time it is resumed from the same snapshot. This can compromise, for example, cryptographic secrets generated after resumption. We explore legacy-compatible countermeasures, as well as a clean-slate solution. The latter is a new RNG called Whirlwind that provides a simpler, more secure solution for providing system randomness.
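
A small Python demonstration of the "same snapshot, same bits" failure mode, using the userspace PRNG state as a stand-in for a VM snapshot; the paper concerns the Linux kernel RNG, so this is an analogy, not a reproduction of the attack.

    import random

    random.seed(1234)
    snapshot = random.getstate()           # analogue of taking a VM snapshot

    first_run = [random.randrange(256) for _ in range(4)]

    random.setstate(snapshot)              # analogue of resuming from the snapshot
    second_run = [random.randrange(256) for _ in range(4)]

    print(first_run == second_run)         # True: identical "random" output after reset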
 

Eckhoff, D., Sommer, C..  2014.  Driving for Big Data? Privacy Concerns in Vehicular Networking. Security & Privacy, IEEE. 12:77-79.

Communicating vehicles will change road traffic as we know it. With current versions of European and US standards in mind, the authors discuss privacy and traffic surveillance issues in vehicular network technology and outline research directions that could address these issues.

Marchal, S., Xiuyan Jiang, State, R., Engel, T..  2014.  A Big Data Architecture for Large Scale Security Monitoring. Big Data (BigData Congress), 2014 IEEE International Congress on. :56-63.

Network traffic is a rich source of information for security monitoring. However, the increasing volume of data to process raises issues, rendering holistic analysis of network traffic difficult. In this paper we propose a solution to cope with the tremendous amount of data to analyse for security monitoring purposes. We introduce an architecture dedicated to security monitoring of local enterprise networks. The application domain of such a system is mainly network intrusion detection and prevention, but it can be used as well for forensic analysis. This architecture integrates two systems, one dedicated to scalable distributed data storage and management and the other dedicated to data exploitation. DNS data, NetFlow records, HTTP traffic and honeypot data are mined and correlated in a distributed system that leverages state-of-the-art big data solutions. Data correlation schemes are proposed and their performance is evaluated against several well-known big data frameworks, including Hadoop and Spark.
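
A toy example of one correlation scheme, joining DNS answers with NetFlow records on the resolved IP; the records are invented and the sketch ignores the distributed storage layer described in the paper.

    dns_answers = [
        {"qname": "update.example.org", "answer": "198.51.100.9"},
        {"qname": "cdn.example.net",    "answer": "203.0.113.50"},
    ]
    netflows = [
        {"src": "10.0.0.5", "dst": "198.51.100.9", "bytes": 920000},
        {"src": "10.0.0.8", "dst": "192.0.2.77",   "bytes": 1200},
    ]

    # Index DNS answers by resolved IP, then annotate each flow with the domain it maps to.
    by_ip = {d["answer"]: d["qname"] for d in dns_answers}
    for flow in netflows:
        domain = by_ip.get(flow["dst"], "<no DNS answer seen>")
        print(flow["src"], "->", flow["dst"], f"({domain})", flow["bytes"], "bytes")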

2015-05-04
Shaobu Wang, Shuai Lu, Ning Zhou, Guang Lin, Elizondo, M., Pai, M.A..  2014.  Dynamic-Feature Extraction, Attribution, and Reconstruction (DEAR) Method for Power System Model Reduction. Power Systems, IEEE Transactions on. 29:2049-2059.

In interconnected power systems, dynamic model reduction can be applied to generators outside the area of interest (i.e., study area) to reduce the computational cost associated with transient stability studies. This paper presents a method of deriving the reduced dynamic model of the external area based on dynamic response measurements. The method consists of three steps, namely dynamic-feature extraction, attribution, and reconstruction (DEAR). In this method, a feature extraction technique, such as singular value decomposition (SVD), is applied to the measured generator dynamics after a disturbance. Characteristic generators are then identified in the feature attribution step for matching the extracted dynamic features with the highest similarity, forming a suboptimal “basis” of system dynamics. In the reconstruction step, generator state variables such as rotor angles and voltage magnitudes are approximated with a linear combination of the characteristic generators, resulting in a quasi-nonlinear reduced model of the original system. The network model is unchanged in the DEAR method. Tests on several IEEE standard systems show that the proposed method yields a better reduction ratio and smaller response errors than traditional coherency-based reduction methods.
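
A numerical sketch of the DEAR steps on synthetic data (NumPy): SVD-based feature extraction, a simplified attribution rule, and least-squares reconstruction from the characteristic generators; the signals, sizes and the attribution rule are assumptions, not the paper's exact procedure.

    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 200)
    modes = np.vstack([np.sin(2 * np.pi * 0.5 * t), np.sin(2 * np.pi * 1.2 * t)])
    X = rng.normal(size=(8, 2)) @ modes + 0.01 * rng.normal(size=(8, 200))  # 8 generators x 200 samples

    # Feature extraction: dominant singular vectors capture the shared dynamics
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    print("energy in first 2 features:", s[:2].sum() / s.sum())

    # Attribution (simplified): keep the generators best aligned with those features
    characteristic = np.argsort(np.abs(U[:, :2]).sum(axis=1))[-2:]

    # Reconstruction: approximate every generator as a linear combination of them
    basis = X[characteristic]
    coeffs, *_ = np.linalg.lstsq(basis.T, X.T, rcond=None)
    print("relative error:", np.linalg.norm(X - coeffs.T @ basis) / np.linalg.norm(X))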
 

Zurek, E.E., Gamarra, A.M.R., Escorcia, G.J.R., Gutierrez, C., Bayona, H., Perez, R., Garcia, X..  2014.  Spectral analysis techniques for acoustic fingerprints recognition. Image, Signal Processing and Artificial Vision (STSIVA), 2014 XIX Symposium on. :1-5.

This article presents results of the recognition process of acoustic fingerprints from a noise source using spectral characteristics of the signal. Principal Components Analysis (PCA) is applied to reduce the dimensionality of the extracted features, and then a classifier is implemented using the method of the k-nearest neighbors (KNN) to identify the pattern of the audio signal. This classifier is compared with an Artificial Neural Network (ANN) implementation. It is necessary to apply a filtering system to the acquired signals to reduce the 60 Hz noise generated by imperfections in the acquisition system. The methods described in this paper were used for vessel recognition.
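
An illustrative PCA + k-NN pipeline in scikit-learn on synthetic feature vectors standing in for the spectral features; class structure, dimensions and parameters are assumptions, not the article's data.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(loc=0.0, size=(50, 40)),
                   rng.normal(loc=1.5, size=(50, 40))])   # two "vessel" classes
    y = np.array([0] * 50 + [1] * 50)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    pca = PCA(n_components=5).fit(X_tr)                    # reduce dimensionality
    knn = KNeighborsClassifier(n_neighbors=3).fit(pca.transform(X_tr), y_tr)
    print("accuracy:", knn.score(pca.transform(X_te), y_te))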

2015-05-01
El Masri, A., Sardouk, A., Khoukhi, L., Merghem-Boulahia, L., Gaiti, D..  2014.  Multimedia Support in Wireless Mesh Networks Using Interval Type-2 Fuzzy Logic System. New Technologies, Mobility and Security (NTMS), 2014 6th International Conference on. :1-5.

Wireless mesh networks (WMNs) are attracting more and more real-time applications. This kind of application is constrained in terms of Quality of Service (QoS). Existing works in this area are mostly designed for mobile ad hoc networks, which, unlike WMNs, are mainly sensitive to energy and mobility. However, WMNs have their own specific characteristics (e.g. static routers and heavy traffic load), which require dedicated QoS protocols. This paper proposes a novel traffic regulation scheme for multimedia support in WMNs. The proposed scheme aims to regulate the traffic sending rate according to the network state, based on the buffer evolution at mesh routers and on the priority of each traffic type. By monitoring the buffer evolution at mesh routers, our scheme is able to predict possible congestion, or QoS violation, early enough before their occurrence; each flow is then regulated according to its priority and to its QoS requirements. The idea behind the proposed scheme is to maintain lightly loaded buffers in order to minimize the queuing delays, as well as to avoid congestion. Moreover, the regulation process is made smoothly in order to ensure the continuity of real-time and interactive services. We use the interval type-2 fuzzy logic system (IT2 FLS), known for its adequacy in uncertain environments, to make suitable regulation decisions. The performance of our scheme is proved through extensive simulations in different network and traffic load scales.

Seifi, Y., Suriadi, S., Foo, E., Boyd, C..  2014.  Security properties analysis in a TPM-based protocol. International Journal of Security and Networks. 9:85-103.

Security protocols are designed in order to provide security properties (goals). They achieve their goals using cryptographic primitives such as key agreement or hash functions. Security analysis tools are used in order to verify whether a security protocol achieves its goals or not. The properties analysed by special-purpose tools are predefined properties such as secrecy (confidentiality), authentication or non-repudiation. There are also security goals that are defined by the user in systems with security requirements. Analysis of these properties is possible with general-purpose analysis tools such as coloured Petri nets (CPN). This research analyses two security properties that are defined in a protocol based on the trusted platform module (TPM). The analysed protocol, proposed by Delaune, uses TPM capabilities and secrets in order to open only one of two submitted secrets to a recipient.

2015-04-30
Kholidy, H.A., Erradi, A., Abdelwahed, S., Azab, A..  2014.  A Finite State Hidden Markov Model for Predicting Multistage Attacks in Cloud Systems. Dependable, Autonomic and Secure Computing (DASC), 2014 IEEE 12th International Conference on. :14-19.

Cloud computing significantly increased the security threats because intruders can exploit the large amount of cloud resources for their attacks. However, most of the current security technologies do not provide early warnings about such attacks. This paper presents a Finite State Hidden Markov prediction model that uses an adaptive risk approach to predict multi-staged cloud attacks. The risk model measures the potential impact of a threat on assets given its occurrence probability. The attacks prediction model was integrated with our autonomous cloud intrusion detection framework (ACIDF) to raise early warnings about attacks to the controller so it can take proactive corrective actions before the attacks pose a serious security risk to the system. According to our experiments on DARPA 2000 dataset, the proposed prediction model has successfully fired the early warning alerts 39.6 minutes before the launching of the LLDDoS1.0 attack. This gives the auto response controller ample time to take preventive measures.
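
A minimal forward-algorithm sketch (NumPy) of predicting the next attack stage from an observed alert sequence; the transition and emission matrices are invented here, whereas the paper estimates them from data and couples the model with a risk measure and ACIDF.

    import numpy as np

    A = np.array([[0.7, 0.3, 0.0],      # stage transitions: probe -> compromise -> DDoS
                  [0.0, 0.6, 0.4],
                  [0.0, 0.0, 1.0]])
    B = np.array([[0.8, 0.2, 0.0],      # emission probabilities per alert type
                  [0.1, 0.7, 0.2],
                  [0.0, 0.1, 0.9]])
    pi = np.array([1.0, 0.0, 0.0])
    alerts = [0, 0, 1, 1]               # observed alert types so far

    alpha = pi * B[:, alerts[0]]        # forward pass over the alert sequence
    for obs in alerts[1:]:
        alpha = (alpha @ A) * B[:, obs]

    next_stage = (alpha / alpha.sum()) @ A      # predicted distribution over stages
    print("P(next stage):", np.round(next_stage, 3))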

El Masri, A., Wechsler, H., Likarish, P., Kang, B.B..  2014.  Identifying users with application-specific command streams. Privacy, Security and Trust (PST), 2014 Twelfth Annual International Conference on. :232-238.

This paper proposes and describes an active authentication model based on user profiles built from user-issued commands when interacting with a GUI-based application. Previous behavioral models derived from user-issued commands were limited to analyzing the user's interaction with the *Nix (Linux or Unix) command shell program. Human-computer interaction (HCI) research has explored the idea of building user profiles based on their behavioral patterns when interacting with such graphical interfaces. It did so by analyzing the user's keystroke and/or mouse dynamics. However, none had explored the idea of creating profiles by capturing users' usage characteristics when interacting with a specific application beyond how a user strikes the keyboard or moves the mouse across the screen. We obtain and utilize a dataset of user command streams collected from working with Microsoft (MS) Word to serve as a test bed. User profiles are first built using MS Word commands, and identification takes place using machine learning algorithms. Best performance in terms of both accuracy and Area under the Curve (AUC) for the Receiver Operating Characteristic (ROC) curve is reported using Random Forests (RF) and AdaBoost with random forests.
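
An illustrative comparison of the two reported classifiers in scikit-learn on synthetic bag-of-commands counts; the features, users and parameters are assumptions standing in for the MS Word command-stream dataset.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    X = np.vstack([rng.poisson(2.0, size=(100, 30)),       # user A's command counts
                   rng.poisson(3.0, size=(100, 30))])      # user B's command counts
    y = np.array([0] * 100 + [1] * 100)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    for name, clf in [("random forest", RandomForestClassifier(random_state=0)),
                      ("adaboost", AdaBoostClassifier(random_state=0))]:
        scores = clf.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
        print(name, "AUC:", round(roc_auc_score(y_te, scores), 3))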