Biblio

Found 2371 results

Filters: First Letter Of Last Name is G
2015-05-06
Junho Hong, Chen-Ching Liu, Govindarasu, M..  2014.  Detection of cyber intrusions using network-based multicast messages for substation automation. Innovative Smart Grid Technologies Conference (ISGT), 2014 IEEE PES. :1-5.

This paper proposes a new network-based cyber intrusion detection system (NIDS) using multicast messages in substation automation systems (SASs). The proposed network-based intrusion detection system monitors anomalies and malicious activities of multicast messages based on IEC 61850, e.g., Generic Object Oriented Substation Event (GOOSE) and Sampled Value (SV). The NIDS detects anomalies and intrusions that violate predefined security rules using a specification-based algorithm. Performance tests have been conducted for different cyber intrusion scenarios (e.g., packet modification, replay and denial-of-service attacks) using a cyber security testbed. The IEEE 39-bus system model has been used for testing the proposed intrusion detection method against simultaneous cyber attacks. The false negative ratio (FNR) is the number of misclassified abnormal packets divided by the total number of abnormal packets. The results demonstrate that the proposed NIDS achieves a low false negative ratio.
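
Stated as a formula (a direct restatement of the definition above, with placeholder symbols for the two packet counts):

```latex
\mathrm{FNR} = \frac{N_{\text{misclassified abnormal packets}}}{N_{\text{abnormal packets}}}
```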
 

Goseva-Popstojanova, K., Dimitrijevikj, A..  2014.  Distinguishing between Web Attacks and Vulnerability Scans Based on Behavioral Characteristics. Advanced Information Networking and Applications Workshops (WAINA), 2014 28th International Conference on. :42-48.

The number of vulnerabilities and reported attacks on Web systems shows an increasing trend, which clearly illustrates the need for better understanding of malicious cyber activities. In this paper we use clustering to classify attacker activities aimed at Web systems. The empirical analysis is based on four datasets, each several months in duration, collected by high-interaction honeypots. The results show that behavioral clustering analysis can be used to distinguish between attack sessions and vulnerability scan sessions. However, the performance heavily depends on the dataset. Furthermore, the results show that attacks differ from vulnerability scans in a small number of features (i.e., session characteristics). Specifically, for each dataset, the best feature selection method (in terms of high probability of detection and low probability of false alarm) selects only three features and results in three to four clusters, significantly improving the performance of clustering compared to the case when all features are used. The best subset of features and the extent of the improvement, however, also depend on the dataset.
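
The headline result, that a handful of session features separate attack sessions from vulnerability-scan sessions, can be illustrated with a minimal clustering sketch. The feature names, values, and the choice of k-means below are assumptions for illustration only, not the authors' datasets or feature-selection pipeline.

```python
# Minimal sketch: cluster web sessions on a small, assumed feature subset.
# Hypothetical features; the paper's actual features and methods may differ.
import numpy as np
from sklearn.cluster import KMeans

# Each row is a session: [request count, mean inter-request time (s), fraction of 4xx responses]
sessions = np.array([
    [12,  1.80, 0.05],   # human-paced, attack-like session
    [15,  2.10, 0.10],
    [900, 0.02, 0.65],   # fast, broad probing, scan-like session
    [850, 0.03, 0.70],
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(sessions)
print(kmeans.labels_)    # sessions grouped into attack-like vs. scan-like clusters
```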

Odelu, Vanga, Das, Ashok Kumar, Goswami, Adrijit.  2014.  A Secure Effective Key Management Scheme for Dynamic Access Control in a Large Leaf Class Hierarchy. Inf. Sci.. 269:270–285.

Lo et al. (2011) proposed an efficient key assignment scheme for access control in a large leaf class hierarchy where alterations in leaf classes are more frequent than in non-leaf classes. Their scheme is based on a public-key cryptosystem and a hash function, where operations like modular exponentiations are very costly compared to symmetric-key encryptions and decryptions and hash computations. Their scheme performs better than previously proposed schemes. However, in this paper, we show that Lo et al.'s scheme fails to preserve the forward security property: a security class can still derive the secret keys of its successor classes even after it has been deleted from the hierarchy. We propose a new key management scheme for dynamic access control in a large leaf class hierarchy, which makes use of a symmetric-key cryptosystem and a one-way hash function. We show that our scheme requires significantly less storage and computational overhead compared to Lo et al.'s scheme and other related schemes. Through informal and formal security analysis, we further show that our scheme is secure against possible attacks and preserves forward security. In addition, our scheme supports dynamic access control more efficiently than Lo et al.'s scheme and other related schemes. Thus, higher security along with low storage and computational costs makes our scheme more suitable for practical applications than other schemes.
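
Since the proposed scheme relies only on a symmetric-key cryptosystem and a one-way hash function, the general flavour of hash-based key derivation down a class hierarchy can be sketched as follows. This is a generic illustration with hypothetical class identifiers, not the authors' exact construction.

```python
# Generic one-way key derivation down a class hierarchy (illustrative only).
import hashlib

def derive_child_key(parent_key: bytes, child_class_id: str) -> bytes:
    # The one-way hash binds the child key to the parent key and the child id,
    # so a child class cannot recover its parent's key from its own key.
    return hashlib.sha256(parent_key + child_class_id.encode()).digest()

root_key = b"\x00" * 32                       # hypothetical root secret
k_branch = derive_child_key(root_key, "SC_branch")
k_leaf   = derive_child_key(k_branch, "SC_leaf_17")
```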

Gandino, F., Montrucchio, B., Rebaudengo, M..  2014.  Key Management for Static Wireless Sensor Networks With Node Adding. Industrial Informatics, IEEE Transactions on. 10:1133-1143.

Wireless sensor networks offer benefits in several applications but are vulnerable to various security threats, such as eavesdropping and hardware tampering. In order to reach secure communications among nodes, many approaches employ symmetric encryption. Several key management schemes have been proposed in order to establish symmetric keys. This paper presents an innovative key management scheme called random seed distribution with transitory master key, which adopts the random distribution of secret material and a transitory master key used to generate pairwise keys. The proposed approach addresses the main drawbacks of previous approaches based on these techniques. Moreover, it outperforms the state-of-the-art protocols by always providing a high security level.
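
The transitory-master-key idea, a master secret that exists only during the deployment window and is used to derive pairwise keys before being erased, can be sketched generically as below. The identifiers and the HMAC construction are illustrative assumptions, not the paper's exact protocol.

```python
# Illustrative pairwise-key derivation from a transitory master key.
import hashlib
import hmac

def pairwise_key(master_key: bytes, node_a: bytes, node_b: bytes) -> bytes:
    first, second = sorted((node_a, node_b))      # order ids so both nodes agree
    return hmac.new(master_key, first + second, hashlib.sha256).digest()

master = b"transitory-master-key"                 # erased after the deployment phase
k_ab = pairwise_key(master, b"node-01", b"node-02")
k_ba = pairwise_key(master, b"node-02", b"node-01")
assert k_ab == k_ba                               # both endpoints derive the same key
```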

Arias Cabarcos, Patricia, Almenárez, Florina, Gómez Mármol, Félix, Marín, Andrés.  2014.  To Federate or Not To Federate: A Reputation-Based Mechanism to Dynamize Cooperation in Identity Management. Wirel. Pers. Commun.. 75:1769–1786.

Identity Management systems cannot be centralized anymore. Nowadays, users have multiple accounts, profiles and personal data distributed throughout the web and hosted by different providers. However, the online world is currently divided into identity silos, forcing users to deal with repetitive authentication and registration processes and hindering a faster development of large scale e-business. Federation has been proposed as a technology to bridge different trust domains, allowing user identity information to be shared in order to improve usability. But further research is required to shift from the current static model, where manual bilateral agreements must be pre-configured to enable cooperation between unknown parties, to a more dynamic one, where trust relationships are established on demand in a fully automated fashion. This paper presents IdMRep, the first completely decentralized reputation-based mechanism which makes dynamic federation a reality. Initial experiments demonstrate its accuracy as well as an acceptable overhead in scenarios with and without malicious nodes.

Ghosh, S..  2014.  On the implementation of McEliece with CCA2 indeterminacy by SHA-3. Circuits and Systems (ISCAS), 2014 IEEE International Symposium on. :2804-2807.

This paper deals with the design and implementation of the post-quantum public-key algorithm McEliece. Seamless incorporation of a new error generator and a new SHA-3 module provides higher indeterminacy and more randomization of the original McEliece algorithm and achieves the CCA2 security standard. Due to the lightweight and high-speed implementation of the SHA-3 module, the proposed 128-bit secure McEliece architecture provides 6% higher performance in only 0.78 times the area of the best known existing design.
 

Nemoianu, I.-D., Greco, C., Cagnazzo, M., Pesquet-Popescu, B..  2014.  On a Hashing-Based Enhancement of Source Separation Algorithms Over Finite Fields With Network Coding Perspectives. Multimedia, IEEE Transactions on. 16:2011-2024.

Blind Source Separation (BSS) deals with the recovery of source signals from a set of observed mixtures, when little or no knowledge of the mixing process is available. BSS can find an application in the context of network coding, where relaying linear combinations of packets maximizes the throughput and increases the loss immunity. Relieving the nodes of the need to send the combination coefficients largely reduces the overhead cost. However, the scaling ambiguity of the technique and the quasi-uniformity of compressed media sources make it unfit, at its present state, for multimedia transmission. In order to open new practical applications for BSS in the context of multimedia transmission, we have recently proposed to use a non-linear encoding to increase the discriminating power of the classical entropy-based separation methods. Here, we propose to append to each source a non-linear message digest, which incurs a smaller overhead than per-symbol encoding and can be more easily tuned. Our results show that our algorithm is able to provide high decoding rates for different media types such as image, audio, and video, when the transmitted messages are less than 1.5 kilobytes, which is typically the case in a realistic transmission scenario.

Subramanyan, P., Tsiskaridze, N., Wenchao Li, Gascon, A., Wei Yang Tan, Tiwari, A., Shankar, N., Seshia, S.A., Malik, S..  2014.  Reverse Engineering Digital Circuits Using Structural and Functional Analyses. Emerging Topics in Computing, IEEE Transactions on. 2:63-80.

Integrated circuits (ICs) are now designed and fabricated in a globalized multivendor environment making them vulnerable to malicious design changes, the insertion of hardware Trojans/malware, and intellectual property (IP) theft. Algorithmic reverse engineering of digital circuits can mitigate these concerns by enabling analysts to detect malicious hardware, verify the integrity of ICs, and detect IP violations. In this paper, we present a set of algorithms for the reverse engineering of digital circuits starting from an unstructured netlist and resulting in a high-level netlist with components such as register files, counters, adders, and subtractors. Our techniques require no manual intervention and experiments show that they determine the functionality of >45% and up to 93% of the gates in each of the test circuits that we examine. We also demonstrate that our algorithms are scalable to real designs by experimenting with a very large, highly-optimized system-on-chip (SOC) design with over 375000 combinational elements. Our inference algorithms cover 68% of the gates in this SOC. We also demonstrate that our algorithms are effective in aiding a human analyst to detect hardware Trojans in an unstructured netlist.
 

Vollala, S., Varadhan, V.V., Geetha, K., Ramasubramanian, N..  2014.  Efficient modular multiplication algorithms for public key cryptography. Advance Computing Conference (IACC), 2014 IEEE International. :74-78.

Modular exponentiation is an important operation for cryptographic transformations in public key cryptosystems such as the Rivest-Shamir-Adleman (RSA), Diffie-Hellman and ElGamal schemes. Computing a^x mod n and a^x b^y mod n for very large x, y and n is fundamental to the efficiency of almost all public key cryptosystems and digital signature schemes. To achieve a high level of security, the word length in the modular exponentiations should be significantly large. The performance of public key cryptography is primarily determined by the implementation efficiency of the modular multiplication and exponentiation. As the words are usually large, and in order to optimize the time taken by these operations, it is essential to minimize the number of modular multiplications. In this paper we present efficient algorithms for computing a^x mod n and a^x b^y mod n. We propose four algorithms to evaluate modular exponentiation: Bit Forwarding (BFW) algorithms to compute a^x mod n, and two algorithms, Substitute and Reward (SRW) and Store and Forward (SFW), to compute a^x b^y mod n. All the proposed algorithms are efficient in terms of time and at the same time demand only minimal additional space to store pre-computed values. These algorithms are suitable for devices with low computational power and limited storage.
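
For context, the textbook binary (square-and-multiply) method is the usual baseline that such algorithms aim to improve; the sketch below shows that baseline for a^x mod n and a naive combination for a^x b^y mod n, not the BFW/SRW/SFW algorithms proposed in the paper.

```python
# Textbook square-and-multiply baseline (not the paper's BFW/SRW/SFW algorithms).
def mod_exp(a: int, x: int, n: int) -> int:
    result = 1
    a %= n
    while x > 0:
        if x & 1:                     # multiply in the current exponent bit
            result = (result * a) % n
        a = (a * a) % n               # square for the next bit
        x >>= 1
    return result

def mod_exp_double(a: int, x: int, b: int, y: int, n: int) -> int:
    # Naive a^x * b^y mod n via two exponentiations; multi-exponentiation
    # techniques (and the paper's algorithms) reduce the multiplication count.
    return (mod_exp(a, x, n) * mod_exp(b, y, n)) % n

assert mod_exp(7, 560, 561) == pow(7, 560, 561)
```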
 

Burley, Diana L., Eisenberg, Jon, Goodman, Seymour E..  2014.  Would Cybersecurity Professionalization Help Address the Cybersecurity Crisis? Commun. ACM. 57:24–27.

Evaluating the trade-offs involved in cybersecurity professionalization.

Alomari, E., Manickam, S., Gupta, B.B., Singh, P., Anbar, M..  2014.  Design, deployment and use of HTTP-based botnet (HBB) testbed. Advanced Communication Technology (ICACT), 2014 16th International Conference on. :1265-1269.

Botnets are among the most widespread and serious kinds of malware occurring in today's cyber attacks. A botnet is a group of Internet-connected computer programs communicating with other similar programs in order to perform various attacks. HTTP-based botnets are the most dangerous among the different botnets available today. In botnet detection, behaviour-based approaches in particular suffer from the unavailability of benchmark datasets, which leads to a lack of precise evaluation, comparison, and deployment of botnet detection systems. Most of the datasets in the botnet field come from local environments, cannot be used at large scale due to privacy problems, do not reflect common trends, and also lack some statistical features. To the best of our knowledge, there is no benchmark dataset available that is infected by an HTTP-based botnet (HBB) performing Distributed Denial of Service (DDoS) attacks against Web servers with the HTTP-GET flooding method. In addition, no Web access log infected by a botnet is available to researchers. Therefore, in this paper, a complete testbed is described in order to implement a real-time HTTP-based botnet for performing a variety of DDoS attacks against Web servers using the HTTP-GET flooding method. In addition, Web access logs with HTTP bot traces are also generated. These real-time datasets and Web access logs can be useful for studying the behaviour of HTTP-based botnets as well as for evaluating the different solutions proposed by various researchers to detect HTTP-based botnets.
 

Arora, D., Verigin, A., Godkin, T., Neville, S.W..  2014.  Statistical Assessment of Sybil-Placement Strategies within DHT-Structured Peer-to-Peer Botnets. Advanced Information Networking and Applications (AINA), 2014 IEEE 28th International Conference on. :821-828.

Botnets are a well recognized global cyber-security threat, as they enable attack communities to command large collections of compromised computers (bots) on demand. Peer-to-peer (P2P) distributed hash tables (DHT) have become particularly attractive botnet command and control (C&C) solutions due to the high resiliency gained via the diffuse random graph overlays they produce. The injection of Sybils, computers pretending to be valid bots, remains a key defensive strategy against DHT-structured P2P botnets. This research uses packet-level network simulations to explore the relative merits of random, informed, and partially informed Sybil placement strategies. It is shown that random placements perform nearly as effectively as the tested more informed strategies, which require higher levels of inter-defender co-ordination. Moreover, it is shown that aspects of DHT-structured P2P botnets behave as statistically nonergodic processes when viewed from the perspective of stochastic processes. This suggests that although optimal Sybil placement strategies appear to exist, they would need careful tuning to each specific P2P botnet instance.

Gelenbe, E..  2014.  A Software Defined Self-Aware Network: The Cognitive Packet Network. Network Cloud Computing and Applications (NCCA), 2014 IEEE 3rd Symposium on. :9-14.

This article is a summary description of the Cognitive Packet Network (CPN), which is an example both of a completely software defined network (SDN) and of a self-aware computer network (SAN) that has been fully implemented and used in numerous experiments. CPN is able to observe its own internal performance as well as the interfaces of the external systems that it interacts with, in order to modify its behaviour so as to adaptively achieve objectives such as discovering services for its users, improving their Quality of Service (QoS), reducing its own energy consumption, compensating for components which fail or malfunction, detecting and reacting to intrusions, and defending itself against attacks.
 

Zerguine, A., Hammi, O., Abdelhafiz, A.H., Helaoui, M., Ghannouchi, F..  2014.  Behavioral modeling and predistortion of nonlinear power amplifiers based on adaptive filtering techniques. Multi-Conference on Systems, Signals Devices (SSD), 2014 11th International. :1-5.

In this paper, the use of some of the most popular adaptive filtering algorithms for the purpose of linearizing power amplifiers by the well-known digital predistortion (DPD) technique is investigated. First, an introduction to the problem of power amplifier linearization is given, followed by a discussion of the model used for this purpose. Next, a variety of adaptive algorithms are used to construct the digital predistorter function for a highly nonlinear power amplifier and their performance is comparatively analyzed. Based on the simulations presented in this paper, conclusions regarding the choice of algorithm are derived.
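
As a concrete instance of the algorithm family being compared, a least-mean-squares (LMS) update identifies a filter from input/desired signal pairs. The signals, filter length and step size below are illustrative assumptions; the predistorter models studied in the paper are more elaborate.

```python
# Minimal LMS adaptive filter (illustrative; DPD models in the paper are richer).
import numpy as np

rng = np.random.default_rng(0)
n_taps, mu = 4, 0.01
true_w = np.array([0.8, -0.3, 0.1, 0.05])           # unknown system to identify

x = rng.standard_normal(2000)                       # input signal
d = np.convolve(x, true_w, mode="full")[:len(x)]    # desired output of the unknown system

w = np.zeros(n_taps)
for k in range(n_taps, len(x)):
    x_k = x[k - n_taps + 1:k + 1][::-1]             # most recent n_taps input samples
    e = d[k] - w @ x_k                              # a priori estimation error
    w += mu * e * x_k                               # LMS coefficient update

print(np.round(w, 3))                               # converges toward true_w
```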

Sumit, S., Mitra, D., Gupta, D..  2014.  Proposed Intrusion Detection on ZRP based MANET by effective k-means clustering method of data mining. Optimization, Reliabilty, and Information Technology (ICROIT), 2014 International Conference on. :156-160.

Mobile Ad-Hoc Networks (MANETs) consist of peer-to-peer, infrastructure-less communicating nodes that are highly dynamic. As a result, routing data becomes more challenging. Ultimately, routing protocols for such networks face the challenges of random topology change, the nature of the link (symmetric or asymmetric) and power requirements during data transmission. Under such circumstances both proactive and reactive routing are usually inefficient. We consider the zone routing protocol (ZRP), which combines the qualities of the proactive (IARP) and reactive (IERP) protocols. In ZRP, an updated topological map of the zone centered on each node is maintained. Immediate routes are available inside each zone. In order to communicate outside a zone, a route discovery mechanism is employed. The local routing information of the zones helps in this route discovery procedure. In a MANET, security is always an issue. It is possible that a node can turn malicious and hamper the normal flow of packets in the MANET. In order to overcome this issue we have used a clustering technique to separate the nodes having intrusive behavior from those with normal behavior. We call this technique effective k-means clustering, which is motivated by k-means. We propose to implement an Intrusion Detection System on each node of the MANET that uses ZRP for packet flow. Then we use effective k-means to separate the malicious nodes from the network. Thus, our ad-hoc network will be free from any malicious activity and the normal flow of packets will be possible.

2015-05-05
Dey, L., Mahajan, D., Gupta, H..  2014.  Obtaining Technology Insights from Large and Heterogeneous Document Collections. Web Intelligence (WI) and Intelligent Agent Technologies (IAT), 2014 IEEE/WIC/ACM International Joint Conferences on. 1:102-109.

Keeping up with rapid advances in research in various fields of engineering and technology is a challenging task. Decision makers including academics, program managers, venture capital investors, industry leaders and funding agencies not only need to be abreast of the latest developments but also be able to assess the effect of growth in certain areas on their core business. Though analyst agencies like Gartner, McKinsey, etc. provide such reports for some areas, thought leaders of all organisations still need to amass data from heterogeneous collections like research publications, analyst reports, patent applications, competitor information, etc. to help them finalize their own strategies. Text mining and data analytics researchers have been looking at integrating statistics, text analytics and information visualization to aid the process of retrieval and analytics. In this paper, we present our work on automated topical analysis and insight generation from large heterogeneous text collections of publications and patents. While most of the earlier work in this area provides search-based platforms, ours is an integrated platform for search and analysis. We present several methods and techniques that help in the analysis and better comprehension of search results. We also present methods for generating insights about emerging and popular trends in research, along with contextual differences between academic research and patenting profiles. We also present novel techniques to show topic evolution that help users understand how a particular area has evolved over time.
 

Kaci, A., Kamwa, I., Dessaint, L.-A., Guillon, S..  2014.  Phase angles as predictors of network dynamic security limits and further implications. PES General Meeting | Conference Exposition, 2014 IEEE. :1-6.

In the United States, the number of Phasor Measurement Units (PMU) will increase from 166 networked devices in 2010 to 1043 in 2014. According to the Department of Energy, they are being installed in order to “evaluate and visualize reliability margin (which describes how close the system is to the edge of its stability boundary).” However, there is still a lot of debate in academia and industry around the usefulness of phase angles as unambiguous predictors of dynamic stability. In this paper, using 4 years of actual data from the Hydro-Québec EMS, it is shown that phase angles enable satisfactory predictions of power transfer and dynamic security margins across critical interfaces using random forest models, with both the explanation level and R-squared accuracy exceeding 99%. A generalized linear model (GLM) is next implemented to predict phase angles from day-ahead to hour-ahead time frames, using historical phase angle values and load forecasts. Combining the GLM-based angle forecast with the random forest mapping of phase angles to power transfers results in a new data-driven approach for dynamic security monitoring.
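
A hedged sketch of the second ingredient, mapping phase-angle differences to an interface power transfer with a random forest, is given below on synthetic data; the angle features and the toy transfer relation are assumptions, not the Hydro-Québec measurements or models.

```python
# Illustrative random-forest mapping from phase-angle differences to power transfer.
# Synthetic data only; not the Hydro-Québec EMS data or the paper's models.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
angles = rng.uniform(-30, 30, size=(5000, 3))            # assumed angle differences (degrees)
transfer = 40 * np.sin(np.radians(angles)).sum(axis=1)   # toy transfer relation (MW)
transfer += rng.normal(0, 1, size=len(transfer))         # measurement noise

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(angles[:4000], transfer[:4000])
print(model.score(angles[4000:], transfer[4000:]))       # R^2 on held-out samples
```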
 

Toshiro Yano, E., Bhatt, P., Gustavsson, P.M., Ahlfeldt, R.-M..  2014.  Towards a Methodology for Cybersecurity Risk Management Using Agents Paradigm. Intelligence and Security Informatics Conference (JISIC), 2014 IEEE Joint. :325-325.

In order to deal with shortcomings of security management systems, this work proposes a methodology based on the agents paradigm for cybersecurity risk management. In this approach a system is decomposed into agents that may be used to attain goals established by attackers. Threats to the business are realized through the attacker's goals in service and deployment agents. To support proactive behavior, sensors linked to security mechanisms are analyzed in accordance with a model for Situational Awareness (SA) [4].
 

Kaci, A., Kamwa, I., Dessaint, L.A., Guillon, S..  2014.  Synchrophasor Data Baselining and Mining for Online Monitoring of Dynamic Security Limits. Power Systems, IEEE Transactions on. 29:2681-2695.

When the system is in normal state, actual SCADA measurements of power transfers across critical interfaces are continuously compared with limits determined offline and stored in look-up tables or nomograms in order to assess whether the network is secure or insecure and inform the dispatcher to take preventive action in the latter case. However, synchrophasors could change this paradigm by enabling new features, the phase-angle differences, which are well-known measures of system stress, with the added potential to increase system visibility. The paper develops a systematic approach to baseline the phase-angles versus actual transfer limits across system interfaces and enable synchrophasor-based situational awareness (SBSA). Statistical methods are first used to determine seasonal exceedance levels of angle shifts that can allow real-time scoring and detection of atypical conditions. Next, key buses suitable for SBSA are identified using correlation and partitioning around medoid (PAM) clustering. It is shown that angle shifts of this subset of 15% of the network backbone buses can be effectively used as features in ensemble decision tree-based forecasting of seasonal security margins across critical interfaces.
 

Han Huang, Jun Zhang, Guanglong Xie.  2014.  Research on the future functions and modality of smart grid and its key technologies. Electricity Distribution (CICED), 2014 China International Conference on. :1241-1245.

The power network is an important part of the national comprehensive energy transmission system, underpinning both energy security and the running of the economy and society. Meanwhile, because many industries are involved, the development of the grid can boost national innovation capability. Nowadays, advances in materials science, computer technology and information and communication technology make the smart grid flourish. This paper researches the functions and modality of the smart grid along the energy, geography and technology dimensions. The analysis along the technology dimension addresses two aspects: network control and interaction with the customer. A mapping is given between the functions of the smart grid and eight key technologies: large-capacity flexible transmission technology, DC power distribution technology, distributed power generation technology, large-scale energy storage technology, real-time tracking simulation technology, intelligent electricity application technology, big data analysis and cloud computing technology, and wide-area situational awareness technology. The research emphasis of each key technology is proposed.
 

Bhandari, P., Gujral, M.S..  2014.  Ontology based approach for perception of network security state. Engineering and Computational Sciences (RAECS), 2014 Recent Advances in. :1-6.

This paper presents an ontological approach to perceiving the current security status of the network. A computer network is a dynamic entity whose state changes with the introduction of new services, the installation of a new network operating system, the addition of new hardware components, the creation of new user roles, and attacks from various actors instigated by aggressors. The various security mechanisms employed in the network do not give a complete picture of the security of the whole network. In this paper we propose a taxonomy and ontology which may be used to infer the impact of various events happening in the network on its security status. Vulnerability, Network and Attack are the main taxonomy classes in the ontology. The Vulnerability class describes various types of vulnerabilities in the network, which may be in hardware components like storage devices, computing devices or network devices. The Attack class has many subclasses: the Actor class is the entity executing the attack, the Goal class describes the goal of the attack, the Attack mechanism class defines the attack methodology, the Scope class describes the size and utility of the target, and the Automation level class describes the automation level of the attack. Evaluation of the security status of the network is required for network security situational awareness. The Network class has network operating system, users, roles, hardware components and services as its subclasses. Based on this taxonomy, an ontology has been developed to perceive the network security status. Finally a framework which uses this ontology as a knowledge base has been proposed.
 

Fink, G.A., Griswold, R.L., Beech, Z.W..  2014.  Quantifying cyber-resilience against resource-exhaustion attacks. Resilient Control Systems (ISRCS), 2014 7th International Symposium on. :1-8.

Resilience in the information sciences is notoriously difficult to define much less to measure. But in mechanical engineering, the resilience of a substance is mathematically well-defined as an area under the stress-strain curve. We combined inspiration from mechanics of materials and axioms from queuing theory in an attempt to define resilience precisely for information systems. We first examine the meaning of resilience in linguistic and engineering terms and then translate these definitions to information sciences. As a general assessment of our approach's fitness, we quantify how resilience may be measured in a simple queuing system. By using a very simple model we allow clear application of established theory while being flexible enough to apply to many other engineering contexts in information science and cyber security. We tested our definitions of resilience via simulation and analysis of networked queuing systems. We conclude with a discussion of the results and make recommendations for future work.
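
In the spirit of the stress-strain analogy, one simple way to turn "area under a curve" into a number for an information system is to integrate the gap between nominal and degraded service levels over the duration of a resource-exhaustion event. The curves below are synthetic placeholders, not the authors' queuing model or metric.

```python
# Toy "area under the curve" degradation measure (synthetic curves, illustrative only).
import numpy as np

t = np.linspace(0.0, 100.0, 1001)                    # time (s)
nominal = np.ones_like(t)                            # normalized nominal service level
degraded = np.clip(1.0 - 0.8 * np.exp(-((t - 40.0) / 15.0) ** 2), 0.0, 1.0)

dt = t[1] - t[0]
lost_service = np.sum(nominal - degraded) * dt       # area between the two curves
print(f"service lost during the event: {lost_service:.1f} (normalized units x s)")
```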
 

Jian Wu, Yongmei Jiang, Gangyao Kuang, Jun Lu, Zhiyong Li.  2014.  Parameter estimation for SAR moving target detection using Fractional Fourier Transform. Geoscience and Remote Sensing Symposium (IGARSS), 2014 IEEE International. :596-599.

This paper proposes an algorithm for multi-channel SAR ground moving target detection and estimation using the Fractional Fourier Transform (FrFT). To detect moving targets with low speed, the clutter is first suppressed by Displaced Phase Center Antenna (DPCA) processing, so that the signal-to-clutter ratio can be enhanced. Once the clutter has been suppressed, the echo of the moving target remains and can be regarded as a chirp signal whose parameters can be estimated by the FrFT. The FrFT, one of the most widely used tools for time-frequency analysis, is utilized to estimate the Doppler parameters, from which the motion parameters, including the velocity and the acceleration, can be obtained. The effectiveness of the proposed method is validated by simulation.
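
After clutter suppression, the slow-time echo of the moving target is treated as a chirp; in generic notation (these symbols are illustrative, not necessarily the paper's), it can be written as

```latex
s(\tau) \approx A \exp\!\Big( j 2\pi \big( f_d \,\tau + \tfrac{1}{2} k_a \,\tau^{2} \big) \Big),
```

where the Doppler centroid f_d and the Doppler rate k_a estimated via the FrFT map to the target's radial velocity and acceleration, respectively.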
 

Yue-Bin Luo, Bao-Sheng Wang, Gui-Lin Cai.  2014.  Effectiveness of Port Hopping as a Moving Target Defense. Security Technology (SecTech), 2014 7th International Conference on. :7-10.

Port hopping is a typical moving target defense, which constantly changes the service port number to thwart reconnaissance attacks. It is effective in hiding service identities and confusing potential attackers, but it is still unknown how effective port hopping is and under what circumstances it is a viable proactive defense, because existing works are limited and usually discuss only a few parameters and give some empirical studies. This paper introduces an urn model and quantifies the likelihood of attacker success in terms of the port pool size, the number of probes, the number of vulnerable services, and the hopping frequency. Theoretical analysis shows that port hopping is an effective and promising proactive defense technology for thwarting network attacks.
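
To make the kind of quantity such a model captures concrete, the sketch below computes an attacker's hit probability for a static service port versus a port re-randomized between probes. This is an elementary combinatorial baseline under simplifying assumptions (uniform ports, perfect probes), not the paper's urn-model analysis.

```python
# Elementary baseline: probability that k probes find at least one of v vulnerable
# services in a pool of N ports (simplified assumptions, not the paper's model).
from math import comb

def p_hit_static(N: int, v: int, k: int) -> float:
    # Service ports fixed during the scan; the k probes hit k distinct ports.
    return 1 - comb(N - v, k) / comb(N, k)

def p_hit_hopping(N: int, v: int, k: int) -> float:
    # Service port re-randomized between probes, so probes are independent.
    return 1 - (1 - v / N) ** k

N, v, k = 65536, 1, 1000
print(p_hit_static(N, v, k), p_hit_hopping(N, v, k))
```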
 

Abgrall, E., le Traon, Y., Gombault, S., Monperrus, M..  2014.  Empirical Investigation of the Web Browser Attack Surface under Cross-Site Scripting: An Urgent Need for Systematic Security Regression Testing. Software Testing, Verification and Validation Workshops (ICSTW), 2014 IEEE Seventh International Conference on. :34-41.

One of the major threats against web applications is Cross-Site Scripting (XSS). The final target of XSS attacks is the client running a particular web browser. During the last decade, several competing web browsers (IE, Netscape, Chrome, Firefox) have evolved to support new features. In this paper, we explore whether the evolution of web browsers is accompanied by systematic security regression testing. Beginning with an analysis of their current degree of exposure to XSS, we extend the empirical study to a decade of the most popular web browser versions. We use XSS attack vectors as unit test cases and we propose a new method, supported by a tool, to address this XSS vector testing issue. The analysis of a decade of releases of the most popular web browsers, including mobile ones, shows an urgent need for XSS regression testing. We advocate the use of a shared security testing benchmark as good practice and propose a first set of publicly available XSS vectors as a basis to ensure that security is not sacrificed when a new version is delivered.
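
The "attack vectors as unit test cases" idea can be sketched as an ordinary regression test suite that feeds each vector to the filtering or escaping layer under test. The vectors and the html.escape stand-in below are illustrative assumptions, not the authors' tool or the proposed public benchmark.

```python
# Illustrative XSS regression test: each attack vector becomes a unit test case.
import html
import unittest

XSS_VECTORS = [
    "<script>alert(1)</script>",
    '"><img src=x onerror=alert(1)>',
    "<svg/onload=alert(1)>",
]

def sanitize(user_input: str) -> str:
    # Stand-in for the component under test (a real browser/filter in the paper).
    return html.escape(user_input, quote=True)

class XssRegressionTests(unittest.TestCase):
    def test_vectors_are_neutralized(self):
        for vector in XSS_VECTORS:
            out = sanitize(vector)
            # After escaping, no raw tag delimiters should survive.
            self.assertNotIn("<", out)
            self.assertNotIn(">", out)

if __name__ == "__main__":
    unittest.main()
```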