Biblio

Found 314 results

Filters: Keyword is Standards
2019-12-02
Khan, Rafiullah, McLaughlin, Kieran, Laverty, David, Hastings, John, Sezer, Sakir.  2018.  Demonstrating Cyber-Physical Attacks and Defense for Synchrophasor Technology in Smart Grid. 2018 16th Annual Conference on Privacy, Security and Trust (PST). :1–10.
Synchrophasor technology is used for real-time control and monitoring in the smart grid. Previous works in the literature identified critical vulnerabilities in the IEEE C37.118.2 synchrophasor communication standard. To protect synchrophasor-based systems, stealthy cyber-attacks and effective defense mechanisms still need to be investigated. This paper investigates how an attacker can develop a custom tool to execute stealthy man-in-the-middle attacks against synchrophasor devices. In particular, four different types of attack capabilities have been demonstrated in a real synchrophasor-based synchronous islanding testbed in the laboratory: (i) command injection attack, (ii) packet drop attack, (iii) replay attack and (iv) stealthy data manipulation attack. With a deep technical understanding of the attack capabilities and potential physical impacts, this paper also develops and tests a distributed Intrusion Detection System (IDS) following NIST recommendations. The functionalities of the proposed IDS have been validated in the testbed for detecting the aforementioned cyber-attacks. The paper identifies that a distributed IDS with decentralized decision-making capability and the ability to learn system behavior could effectively detect stealthy malicious activities and improve synchrophasor network security.
2019-10-02
Chao, H., Ringlee, R. J..  2018.  Analytical Challenges in Reliability and Resiliency Modeling. 2018 IEEE International Conference on Probabilistic Methods Applied to Power Systems (PMAPS). :1–5.
A significant number of the generation, transmission and distribution facilities in North America were designed and configured for serving electric loads and economic activities under certain reliability and resiliency requirements over 30 years ago. With the changing generation mix, the electric grid is tasked to deliver electricity generated by fuel-uncertain and energy-limited resources. How adequate are the existing facilities to meet the industry's expectations on reliability? What level of grid resiliency should be designed and built to sustain reliable electric services given the increasing exposure to frequent and long-lasting severe weather conditions? There is a need to review the modeling assumptions and the operating and maintenance records before we can answer these questions.
2019-09-26
Miletić, M., Vukušić, M., Mauša, G., Grbac, T. G..  2018.  Cross-Release Code Churn Impact on Effort-Aware Software Defect Prediction. 2018 41st International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO). :1460-1466.

Code churn has been successfully used to identify defect-inducing changes in software development. Our recent analysis of cross-release code churn showed that several design metrics exhibit moderate correlation with the number of defects in complex systems. The goal of this paper is to explore whether cross-release code churn can be used to identify critical design changes and contribute to the prediction of defects for software in evolution. In our case study, we used two types of data from consecutive releases of open-source projects, with and without cross-release code churn, to build standard prediction models. The prediction models were trained on earlier releases and tested on the following ones, evaluating the performance in terms of AUC, GM and the effort-aware measure Pop. The comparison of their performance was used to answer our research question. The obtained results showed that the prediction model performs better when cross-release code churn is included. A practical implication of this research is to use cross-release code churn to aid in the safe planning of the next release in software development.

Jackson, K. A., Bennett, B. T..  2018.  Locating SQL Injection Vulnerabilities in Java Byte Code Using Natural Language Techniques. SoutheastCon 2018. :1-5.

With so much of our daily lives relying on digital devices like personal computers and cell phones, there is a growing demand for code that not only functions properly, but is secure and keeps user data safe. However, ensuring this is not an easy task, and many developers do not have the required skills or resources to ensure their code is secure. Many code analysis tools have been written to find vulnerabilities in newly developed code, but this technology tends to produce many false positives and is still not able to identify all of the problems. Other methods of finding software vulnerabilities automatically are required. This proof-of-concept study applied natural language processing on Java byte code to locate SQL injection vulnerabilities in a Java program. Preliminary findings show that, due to the high number of terms in the dataset, using single decision trees will not produce a suitable model for locating SQL injection vulnerabilities, while random forest structures proved more promising. Still, further work is needed to determine the best classification tool.

2019-09-11
Moyne, J., Mashiro, S., Gross, D..  2018.  Determining a Security Roadmap for the Microelectronics Industry. 2018 29th Annual SEMI Advanced Semiconductor Manufacturing Conference (ASMC). :291–294.

The evolution of the microelectronics manufacturing industry is characterized by increased complexity, analysis, integration, distribution, data sharing and collaboration, all of which is enabled by the big data explosion. This evolution affords a number of opportunities in improved productivity and quality, and reduced cost; however, it also brings with it a number of risks associated with maintaining the security of data systems. The International Roadmap for Devices and Systems Factory Integration International Focus Team (IRDS FI IFT) determined that a security technology roadmap for the industry is needed to better understand the needs, challenges and potential solutions for security in the microelectronics industry and its supply chain. As a first step in providing this roadmap, the IFT conducted a security survey, soliciting input from users, suppliers and OEMs. Preliminary results indicate that data partitioning with IP protection is the number one topic of concern, with the need for industry-wide standards as the second most important topic. Further, the "fear" of a security breach is considered to be a significant hindrance to Advanced Process Control efforts as well as to the use of cloud-based solutions. The IRDS FI IFT will endeavor to provide components of a security roadmap for the industry in the 2018 FI chapter, leveraging the output of the survey effort combined with follow-up discussions with users and consultations with experts.

2019-08-05
Ogundokun, A., Zavarsky, P., Swar, B..  2018.  Cybersecurity assurance control baselining for smart grid communication systems. 2018 14th IEEE International Workshop on Factory Communication Systems (WFCS). :1–6.

Cybersecurity assurance plays an important role in managing trust in smart grid communication systems. In this paper, cybersecurity assurance controls for smart grid communication networks and devices are delineated from the more technical functional controls to provide insights on recent innovative risk-based approaches to cybersecurity assurance in smart grid systems. The cybersecurity assurance control baselining presented in this paper is based on requirements and guidelines of the new family of IEC 62443 standards on network and systems security of industrial automation and control systems. The paper illustrates how key cybersecurity control baselining and tailoring concepts of the U.S. NIST SP 800-53 can be adopted in smart grid security architecture. The paper outlines the application of IEC 62443 standards-based security zoning and the assignment of security levels to the zones in smart grid system architectures. To manage trust in the smart grid system architecture, cybersecurity assurance baselining concepts are applied per security impact level. The selection and justification of security assurance controls presented in the paper utilizes the approach common in Security Technical Implementation Guides (STIGs) of the U.S. Defense Information Systems Agency. As shown in the paper, enhanced granularity for managing trust on both the overall system and subsystem levels of smart grid systems can be achieved by implementing the instructions of CNSSI 1253 of the U.S. Committee on National Security Systems on security categorization and control selection for national security systems.

2019-07-01
Rosa, F. De Franco, Jino, M., Bueno, P. Marcos Siqueira, Bonacin, R..  2018.  Coverage-Based Heuristics for Selecting Assessment Items from Security Standards: A Core Set Proposal. 2018 Workshop on Metrology for Industry 4.0 and IoT. :192-197.

In the realm of the Internet of Things (IoT), information security is a critical issue. Security standards, including their assessment items, are essential instruments in the evaluation of systems security. However, a key question remains open: ``Which test cases are most effective for security assessment?'' To create security assessment designs with suitable assessment items, we need to know the security properties and assessment dimensions covered by a standard. We propose an approach for selecting and analyzing security assessment items; its foundations come from a set of assessment heuristics and it aims to increase the coverage of assessment dimensions and security characteristics in assessment designs. The main contribution of this paper is the definition of a core set of security assessment heuristics. We systematize the security assessment process by means of a conceptual formalization of the security assessment area. Our approach can be applied to security standards to select or to prioritize assessment items with respect to 11 security properties and 6 assessment dimensions. The approach is flexible, allowing the inclusion of new dimensions and properties. Our proposal was applied to a well-known security standard (ISO/IEC 27001) and its assessment items were analyzed. The proposal is meant to support: (i) the generation of high-coverage assessment designs, which include security assessment items with assured coverage of the main security characteristics, and (ii) the evaluation of security standards with respect to the coverage of security aspects.
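As a reader's aside, the coverage idea behind such heuristics can be illustrated (this is not the authors' algorithm) with a greedy selection that repeatedly picks the assessment item covering the most not-yet-covered security properties and assessment dimensions. The item names and coverage sets below are invented for the example:

```python
# Hypothetical sketch: greedy coverage-based selection of assessment items.
# Each item maps to the set of security properties / assessment dimensions
# it covers (labels invented, not taken from ISO/IEC 27001).

def greedy_select(items):
    """items: dict name -> set of covered aspects; returns selection and coverage."""
    covered, selected = set(), []
    universe = set().union(*items.values())
    while covered != universe:
        # pick the item adding the most new coverage
        best = max(items, key=lambda i: len(items[i] - covered))
        if not items[best] - covered:
            break  # remaining items add nothing new
        selected.append(best)
        covered |= items[best]
    return selected, covered

items = {
    "A.9.2 access provisioning": {"confidentiality", "authenticity", "organizational"},
    "A.12.3 backup":             {"availability", "technical"},
    "A.12.4 logging":            {"accountability", "technical"},
    "A.18.2 security review":    {"confidentiality", "organizational"},
}
selected, covered = greedy_select(items)
print(selected)  # three items suffice to cover all six aspects
```

In this toy run the redundant review item is never selected, which is the kind of prioritization the abstract describes.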

Saleem, Jibran, Hammoudeh, Mohammad, Raza, Umar, Adebisi, Bamidele, Ande, Ruth.  2018.  IoT Standardisation: Challenges, Perspectives and Solution. Proceedings of the 2Nd International Conference on Future Networks and Distributed Systems. :1:1-1:9.

The success and widespread adoption of the Internet of Things (IoT) has increased manyfold over the last few years. Industries, technologists and home users recognise the importance of IoT in their lives. Essentially, IoT has brought a vast industrial revolution and has helped automate many processes within organisations and homes. However, the rapid growth of IoT is also a cause for significant concern. IoT is not only plagued with security, authentication and access control issues, it also does not work as well as it should with the fourth industrial revolution, commonly known as Industry 4.0. The absence of effective regulation and standards, together with weak governance, has led to a continual downward trend in the security of IoT networks and devices, as well as giving rise to a broad range of privacy issues. This paper examines the IoT industry and discusses the urgent need for standardisation, the benefits of governance and the issues affecting the IoT sector due to the absence of regulation. Additionally, through this paper, we introduce an IoT security framework (IoTSFW) for organisations to bridge the current lack of guidelines in the IoT industry. Implementation of the guidelines defined in the proposed framework will assist organisations in achieving security, privacy, sustainability and scalability within their IoT networks.

2019-06-28
Plasencia-Balabarca, F., Mitacc-Meza, E., Raffo-Jara, M., Silva-Cárdenas, C..  2018.  Robust Functional Verification Framework Based in UVM Applied to an AES Encryption Module. 2018 New Generation of CAS (NGCAS). :194-197.

Since the past century, the digital design industry has performed an outstanding role in the development of electronics. A great variety of designs are developed daily, and these designs must be submitted to high standards of verification in order to ensure 100% reliability and the achievement of all design requirements. The Universal Verification Methodology (UVM) is the current industry standard for the verification process due to its reusability, scalability, time-efficiency and feasibility of handling high-level designs. This research proposes a functional verification framework using UVM for an AES encryption module, based on a very detailed and robust verification plan. This document describes the complete verification process as done in the industry for a popular module used in information-security applications in the field of cryptography, defining the basis for future projects. The overall results show the achievement of the high verification standards required in industry applications and highlight the advantages of UVM over SystemVerilog-based functional verification and direct verification methodologies previously developed for the AES module.

2019-06-24
You, Y., Li, Z., Oechtering, T. J..  2018.  Optimal Privacy-Enhancing And Cost-Efficient Energy Management Strategies For Smart Grid Consumers. 2018 IEEE Statistical Signal Processing Workshop (SSP). :826–830.

The design of optimal energy management strategies that trade off consumers' privacy and expected energy cost by using energy storage is studied. The Kullback-Leibler divergence rate is used to assess the privacy risk of unauthorized testing on consumers' behavior. We further show how this design problem can be formulated as a belief-state Markov decision process problem so that standard tools of the Markov decision process framework can be utilized, and the optimal solution can be obtained by using Bellman dynamic programming. Finally, we illustrate the privacy enhancement and cost saving by numerical examples.
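For readers unfamiliar with the privacy measure used in this abstract, the Kullback-Leibler divergence between two discrete distributions can be computed directly; the toy load-state distributions below are invented for illustration and are not from the paper:

```python
# Sketch of the KL-divergence privacy measure: D(P || Q) quantifies how
# distinguishable the consumer's true behavior P is from the masked
# behavior Q an observer sees. Smaller values mean better privacy.
import math

def kl_divergence(p, q):
    """D(P || Q) = sum_x p(x) * log(p(x)/q(x)); distributions given as dicts."""
    return sum(px * math.log(px / q[x]) for x, px in p.items() if px > 0)

# Invented example: true load-state distribution P vs. the distribution Q
# produced by battery-masked metering.
p = {"low": 0.7, "high": 0.3}
q = {"low": 0.5, "high": 0.5}
print(round(kl_divergence(p, q), 4))
```

An energy management policy that drives the metered distribution toward Q = P would push this quantity toward zero, which is the trade-off the paper optimizes against energy cost.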

2019-05-09
Zhang, Z., Chang, C., Lv, Z., Han, P., Wang, Y..  2018.  A Control Flow Anomaly Detection Algorithm for Industrial Control Systems. 2018 1st International Conference on Data Intelligence and Security (ICDIS). :286-293.

Industrial control systems are fundamental infrastructures of a country. Since intrusion attack methods for industrial control systems have become complex and concealed, traditional protection methods, such as vulnerability databases, virus databases and rule matching, cannot cope with attacks hidden inside the terminals of industrial control systems. In this work, we propose a control flow anomaly detection algorithm based on the control flow of business programs. First, a basic group partition method based on key paths is proposed to reduce the performance burden caused by the tabbed-assert control flow analysis method through expanding the basic research units. Second, the algorithm phases of standard path set acquisition and path matching are introduced. By judging whether the current control flow path deviates from the standard set or not, abnormal operating conditions of industrial control can be detected. Finally, the effectiveness of the control flow anomaly detection (checking) algorithm based on Path Matching (CFCPM) is demonstrated by anomaly detection ability analysis and experiments.
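The path-matching phase described in this abstract can be sketched in miniature (identifiers invented; this is not the authors' CFCPM implementation): collect a standard set of control-flow paths from trusted runs, then flag any runtime path that falls outside the set.

```python
# Hedged sketch of path matching for control flow anomaly detection.
# Paths are modeled as tuples of basic-block/group identifiers; the
# "standard path set" would in practice be acquired from trusted runs.

STANDARD_PATHS = {
    ("entry", "read_sensor", "check_range", "actuate", "exit"),
    ("entry", "read_sensor", "check_range", "alarm", "exit"),
}

def is_anomalous(observed, standard=STANDARD_PATHS):
    """Flag a control-flow path that deviates from every standard path."""
    return tuple(observed) not in standard

print(is_anomalous(["entry", "read_sensor", "check_range", "actuate", "exit"]))  # False
print(is_anomalous(["entry", "read_sensor", "inject_cmd", "actuate", "exit"]))   # True
```

The paper's group-partition step exists precisely to keep these path identifiers coarse enough that set membership stays cheap at runtime.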

2019-05-08
Ning, W., Zhi-Jun, L..  2018.  A Layer-Built Method to the Relevancy of Electronic Evidence. 2018 2nd IEEE Advanced Information Management,Communicates,Electronic and Automation Control Conference (IMCEC). :416–420.

To combat cyber crimes, electronic evidence has played an increasing role, but in judicial practice electronic evidence has not been widely applied because of the natural contradiction between the epistemic uncertainty of electronic evidence and the principle of discretionary evidence of the judge in court. In this paper, we put forward a layer-built method to analyze the relevancy of electronic evidence, and discuss the analytical process combined with a case study. Initial practice shows the model is feasible and has consulting value in analyzing the relevancy of electronic evidence.

Ölvecký, M., Gabriška, D..  2018.  Wiping Techniques and Anti-Forensics Methods. 2018 IEEE 16th International Symposium on Intelligent Systems and Informatics (SISY). :000127–000132.

This paper presents the theoretical background of our main research activity, which is focused on the evaluation of wiping/erasure standards that are mostly implemented in specific software products developed for data wiping. The information saved in storage devices often consists of metadata and trace data. These kinds of data in particular are very important in the process of forensic analysis because they sometimes contain information about interconnections with other files. Most people save their sensitive information on local storage devices and later want to securely erase these files, but there is usually a problem with this operation. Secure file destruction is one of many anti-forensics methods. The outcome of this paper is to define future research activities focused on the establishment of a suitable digital environment. This environment will be prepared for testing and evaluating selected wiping standards and appropriate eraser software.

2019-05-01
Hajny, J., Dzurenda, P., Ricci, S., Malina, L., Vrba, K..  2018.  Performance Analysis of Pairing-Based Elliptic Curve Cryptography on Constrained Devices. 2018 10th International Congress on Ultra Modern Telecommunications and Control Systems and Workshops (ICUMT). :1–5.

The paper deals with the implementation aspects of the bilinear pairing operation over an elliptic curve on constrained devices, such as smart cards, embedded devices, smart meters and similar devices. Although cryptographic constructions, such as group signatures, anonymous credentials or identity-based encryption schemes, often rely on the pairing operation, the implementation of such schemes in practical applications is not straightforward; in fact, it may become very difficult. In this paper, we show that the implementation is difficult not only due to the high computational complexity, but also due to the lack of cryptographic libraries and programming interfaces. In particular, we show how difficult it is to implement pairing-based schemes on constrained devices and show the performance of various libraries on different platforms. Furthermore, we show performance estimates of a fundamental cryptographic construction, the group signature. The purpose of this paper is to reduce the gap between cryptographic designers and developers and give performance results that can be used for estimating the implementability and performance of novel, upcoming schemes.

Valenta, L., Sullivan, N., Sanso, A., Heninger, N..  2018.  In Search of CurveSwap: Measuring Elliptic Curve Implementations in the Wild. 2018 IEEE European Symposium on Security and Privacy (EuroS P). :384–398.

We survey elliptic curve implementations from several vantage points. We perform internet-wide scans for TLS on a large number of ports, as well as SSH and IPsec to measure elliptic curve support and implementation behaviors, and collect passive measurements of client curve support for TLS. We also perform active measurements to estimate server vulnerability to known attacks against elliptic curve implementations, including support for weak curves, invalid curve attacks, and curve twist attacks. We estimate that 1.53% of HTTPS hosts, 0.04% of SSH hosts, and 4.04% of IKEv2 hosts that support elliptic curves do not perform curve validity checks as specified in elliptic curve standards. We describe how such vulnerabilities could be used to construct an elliptic curve parameter downgrade attack called CurveSwap for TLS, and observe that there do not appear to be combinations of weak behaviors we examined enabling a feasible CurveSwap attack in the wild. We also analyze source code for elliptic curve implementations, and find that a number of libraries fail to perform point validation for JSON Web Encryption, and find a flaw in the Java and NSS multiplication algorithms.
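The point-validity check whose absence this survey measures is a one-line congruence test. As a toy illustration (the curve parameters below are invented, not a standardized curve), a server should reject any received point (x, y) that does not satisfy the curve equation:

```python
# Sketch of elliptic curve point validation: a host that skips this test
# on received points is exposed to invalid-curve attacks of the kind the
# paper scans for. Toy short-Weierstrass curve y^2 = x^3 + ax + b over GF(p).

def on_curve(x, y, a, b, p):
    """True iff (x, y) satisfies y^2 = x^3 + a*x + b (mod p)."""
    return (y * y - (x ** 3 + a * x + b)) % p == 0

a, b, p = 2, 3, 97               # invented toy parameters
print(on_curve(3, 6, a, b, p))   # a valid point on the curve
print(on_curve(3, 7, a, b, p))   # an invalid point -> must be rejected
```

Real implementations additionally check that the point is not the identity and lies in the correct subgroup; the congruence above is only the first of those checks.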

2019-04-01
Rathour, N., Kaur, K., Bansal, S., Bhargava, C..  2018.  A Cross Correlation Approach for Breaking of Text CAPTCHA. 2018 International Conference on Intelligent Circuits and Systems (ICICS). :6–10.
Online web service providers generally protect themselves through CAPTCHAs. A CAPTCHA is a type of challenge-response test used in computing as an attempt to ensure that the response is generated by a person. CAPTCHAs are mainly presented as distorted text which the user must correctly transcribe. Numerous schemes have been proposed to date in order to prevent attacks by bots. This paper presents a cross-correlation based approach to breaking the text CAPTCHAs of a famous service provider, PayPal.com, and of India's most visited website, IRCTC.co.in. The procedure can be broken down into three tightly coupled tasks: pre-processing, segmentation and classification. The pre-processing of the image is performed to remove all the background noise of the image; the noise in the CAPTCHA consists of unwanted on-pixels in the background. The segmentation is performed by scanning the image for on-pixels. The classification is performed by using the correlation values of the inputs and templates. Two types of templates have been used for classification purposes: standard templates, which give a 30% success rate, and noisy templates made from the CAPTCHA images, with which the success rate achieved is 100%.
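The classification step this abstract describes can be sketched as normalized cross-correlation between a segmented character and each stored template, with the highest-scoring template winning. The 3x3 glyph bitmaps below are invented toys, not real CAPTCHA templates:

```python
# Hedged sketch of template matching via normalized cross-correlation.
# Images are flattened binary pixel lists; templates are invented glyphs.
import math

def ncc(a, b):
    """Normalized cross-correlation of two equal-length pixel vectors."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

templates = {
    "I": [0, 1, 0,  0, 1, 0,  0, 1, 0],
    "L": [1, 0, 0,  1, 0, 0,  1, 1, 1],
}
segment = [0, 1, 0,  0, 1, 0,  0, 1, 0]  # a segmented character to classify
best = max(templates, key=lambda c: ncc(segment, templates[c]))
print(best)
```

The "noisy templates" the paper reports 100% success with would simply replace the clean glyphs above with characters cut from real CAPTCHA images.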
2019-03-06
Yan, Li, Hao, Xiaowei, Cheng, Zelei, Zhou, Rui.  2018.  Cloud Computing Security and Privacy. Proceedings of the 2018 International Conference on Big Data and Computing. :119-123.
Cloud computing is an emerging technology that can provide organizations, enterprises and governments with cheaper, more convenient and larger scale computing resources. However, cloud computing will bring potential risks and threats, especially on security and privacy. We make a survey on potential threats and risks and existing solutions on cloud security and privacy. We also put forward some problems to be addressed to provide a secure cloud computing environment.
2019-02-25
Lesisa, T. G., Marnewick, A., Nel, H..  2018.  The Identification of Supplier Selection Criteria Within a Risk Management Framework Towards Consistent Supplier Selection. 2018 IEEE International Conference on Industrial Engineering and Engineering Management (IEEM). :913–917.
The aim of the study is to evaluate the consistency of supplier risk assessment performed during the supplier selection process. Existing literature indicates that current supplier selection processes yield inconsistent results. Consistent supplier selection cannot be accomplished without stable risk assessment performed during the process. A case study was conducted in a train manufacturer in South Africa, and document analysis, interviews and questionnaires were employed to source information and data. Triangulation and pattern matching enabled a comparative study between literature and practice from which findings were derived. The study suggests selection criteria that may be considered when performing supplier risk assessment during the selection process. The findings indicate that structured supplier risk assessment with predefined supplier selection criteria may eliminate inconsistencies in supplier assessment and selection.
2019-02-21
Bi, Q., Huang, Y..  2018.  A Self-organized Shape Formation Method for Swarm Controlling. 2018 37th Chinese Control Conference (CCC). :7205–7209.
This paper presents a new approach to shape formation based on an artificial field method. It refers to a basic concept in swarm intelligence: complex behaviors of the swarm can be formed with simple rules designed into the agents. In the framework, the distance image is used to generate not only an attraction field to keep all the agents in the given shape, but also a repulsive force field among the agents to make them distribute uniformly. Compared to traditional methods based on centralized control, the algorithm has the properties of distributed and simple computation, convergence and robustness, which is very suitable for swarm robots in the real world considering the limitations of communication, collision avoidance and calculation problems. We also show that some initialization-sensitive methods can be improved in a similar way. The simulation results prove the proposed approach is suitable for convex, non-convex and line shapes.
2019-01-16
Kreuk, F., Adi, Y., Cisse, M., Keshet, J..  2018.  Fooling End-To-End Speaker Verification With Adversarial Examples. 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). :1962–1966.
Automatic speaker verification systems are increasingly used as the primary means to authenticate customers. Recently, it has been proposed to train speaker verification systems using end-to-end deep neural models. In this paper, we show that such systems are vulnerable to adversarial example attacks. Adversarial examples are generated by adding a peculiar noise to original speaker examples, in such a way that the two are almost indistinguishable by a human listener. Yet, the generated waveforms, which sound like speaker A, can be used to fool such a system into believing the waveforms were uttered by speaker B. We present white-box attacks on a deep end-to-end network that was trained on either YOHO or NTIMIT. We also present two black-box attacks. In the first one, we generate adversarial examples with a system trained on NTIMIT and perform the attack on a system trained on YOHO. In the second one, we generate the adversarial examples with a system trained using Mel-spectrum features and perform the attack on a system trained using MFCCs. Our results show that one can significantly decrease the accuracy of a target system even when the adversarial examples are generated with a different system potentially using different features.
2018-12-03
Shearon, C. E..  2018.  IPC-1782 standard for traceability of critical items based on risk. 2018 Pan Pacific Microelectronics Symposium (Pan Pacific). :1–3.

Traceability has grown from being a specialized need for certain safety-critical segments of the industry to now being a recognized value-add tool for the industry as a whole, one that can be utilized for manual to automated processes end to end throughout the supply chain. The perception persists that traceability data collection is a burden that provides value only when the rarest and most disastrous of events take place. Disparate standards have evolved in the industry, mainly dictated by large OEM companies in the market, creating confusion as a multitude of requirements and definitions proliferate. The intent of the IPC-1782 project is to bring the whole principle of traceability up to date and enable business to move faster, increase revenue, increase productivity, and decrease costs as a result of increased trust. Traceability, as defined in this standard, will represent the most effective quality tool available, becoming an intrinsic part of best-practice operations. It encourages automated data collection from existing manufacturing systems, which works well with Industry 4.0, integrating quality, reliability, product safety, predictive (routine, preventative, and corrective) maintenance, throughput, manufacturing, engineering and supply-chain data; this reduces the cost of ownership and ensures timeliness and accuracy all the way from a finished product back through to the initial materials and granular attributes about the processes along the way. The goal of this standard is to create a single expandable and extendable data structure that can be adopted for all levels of traceability and enable easily exchanged information, as appropriate, across many industries. The scope includes support for the most demanding instances of detail and integrity, such as those required by critical safety systems, all the way through to situations where only basic traceability, such as for simple consumer products, is required.
A key driver for the adoption of the standard is the ability to find a relevant and achievable level of traceability that exactly meets the requirements following a risk assessment of the business. The wealth of data accessible from traceability for analysis (e.g., Big Data) can easily and quickly yield information that can raise expectations of very significant quality and performance improvements, as well as providing the necessary protection against the costs of issues in the market and providing very timely information to regulatory bodies, along with consumers/customers as appropriate. This information can also be used to quickly raise yields, drive product innovation that resonates with consumers, and help drive development tests and design requirements that are meaningful to the marketplace. The paper concludes with how to leverage IPC-1782 to create the best value of component traceability for a business.

2018-11-19
Chen, Y., Lai, Y., Liu, Y..  2017.  Transforming Photos to Comics Using Convolutional Neural Networks. 2017 IEEE International Conference on Image Processing (ICIP). :2010–2014.

In this paper, inspired by Gatys's recent work, we propose a novel approach that transforms photos to comics using deep convolutional neural networks (CNNs). While Gatys's method that uses a pre-trained VGG network generally works well for transferring artistic styles such as painting from a style image to a content image, for more minimalist styles such as comics, the method often fails to produce satisfactory results. To address this, we further introduce a dedicated comic style CNN, which is trained for classifying comic images and photos. This new network is effective in capturing various comic styles and thus helps to produce better comic stylization results. Even with a grayscale style image, Gatys's method can still produce colored output, which is not desirable for comics. We develop a modified optimization framework such that a grayscale image is guaranteed to be synthesized. To avoid converging to poor local minima, we further initialize the output image using grayscale version of the content image. Various examples show that our method synthesizes better comic images than the state-of-the-art method.

Carlin, D., O'Kane, P., Sezer, S., Burgess, J..  2018.  Detecting Cryptomining Using Dynamic Analysis. 2018 16th Annual Conference on Privacy, Security and Trust (PST). :1–6.

With the rise in worth and popularity of cryptocurrencies, a new opportunity for criminal gain is being exploited, with little currently offered in the way of defence. The cost of mining (i.e., earning cryptocurrency through CPU-intensive calculations that underpin blockchain technology) can be prohibitively expensive, with hardware costs and electrical overheads previously resulting in a loss compared to the cryptocurrency gained. Off-loading these costs onto a distributed network of machines via malware offers an instantly profitable scenario, though standard anti-virus (AV) products offer some defences against file-based threats. However, newer fileless malicious attacks, occurring through the browser on seemingly legitimate websites, can easily evade detection and surreptitiously engage the victim machine in computationally expensive cryptomining (cryptojacking). With no current academic literature on the dynamic opcode analysis of cryptomining, to the best of our knowledge, we present the first such experimental study. Indeed, this is the first such work presenting opcode analysis on non-executable files. Our results show that browser-based cryptomining within our dataset can be detected by dynamic opcode analysis, with accuracies of up to 100%. Further to this, our model can distinguish between cryptomining sites, weaponized benign sites, de-weaponized cryptomining sites and real-world benign sites. As it is process-based, our technique offers an opportunity to rapidly detect, prevent and mitigate such attacks, a novel contribution which should encourage further future work.
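The core of dynamic opcode analysis is representing each execution trace as an opcode-frequency vector and classifying it. The paper trains classifiers on full traces; the nearest-profile toy below, with invented opcode sequences, only stands in for that pipeline:

```python
# Hedged sketch of opcode-frequency classification for cryptomining
# detection. Traces and profile sequences are invented illustrations.
from collections import Counter

VOCAB = ["mov", "add", "mul", "xor", "cmp", "jmp"]

def freq_vector(trace, vocab=VOCAB):
    """Normalized opcode-frequency vector over a fixed vocabulary."""
    c = Counter(trace)
    total = sum(c.values()) or 1
    return [c[op] / total for op in vocab]

profiles = {
    # mining loops are dominated by arithmetic/hashing-style opcodes
    "cryptomining": freq_vector(["mul", "xor", "add", "mul", "xor", "add"]),
    "benign":       freq_vector(["mov", "cmp", "jmp", "mov", "cmp", "jmp"]),
}

def classify(trace):
    """Assign the trace to the nearest profile by squared Euclidean distance."""
    v = freq_vector(trace)
    dist = lambda u, w: sum((x - y) ** 2 for x, y in zip(u, w))
    return min(profiles, key=lambda k: dist(v, profiles[k]))

print(classify(["mul", "xor", "mul", "add", "xor", "mul"]))
```

Because the features come from the running process rather than a file on disk, the same representation applies to the fileless, browser-based miners the abstract highlights.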

2018-11-14
Afanasev, M. Y., Krylova, A. A., Shorokhov, S. A., Fedosov, Y. V., Sidorenko, A. S..  2018.  A Design of Cyber-Physical Production System Prototype Based on an Ethereum Private Network. 2018 22nd Conference of Open Innovations Association (FRUCT). :3–11.

The concept of cyber-physical production systems is highly discussed amongst researchers and industry experts, however, the implementation options for these systems rely mainly on obsolete technologies. Despite the fact that the blockchain is most often associated with cryptocurrency, it is fundamentally wrong to deny the universality of this technology and the prospects for its application in other industries. For example, in the insurance sector or in a number of identity verification services. This article discusses the deployment of the CPPS backbone network based on the Ethereum private blockchain system. The structure of the network is described as well as its interaction with the help of smart contracts, based on the consumption of cryptocurrency for various operations.

2018-09-28
Song, Youngho, Shin, Young-sung, Jang, Miyoung, Chang, Jae-Woo.  2017.  Design and implementation of HDFS data encryption scheme using ARIA algorithm on Hadoop. 2017 IEEE International Conference on Big Data and Smart Computing (BigComp). :84–90.

Hadoop was developed as a distributed data processing platform for analyzing big data. Enterprises can analyze big data containing users' sensitive information by using Hadoop and utilize it for their marketing. Therefore, research on data encryption has been widely conducted to protect against the leakage of sensitive data stored in Hadoop. However, the existing research supports only the AES international-standard data encryption algorithm. Meanwhile, the Korean government selected the ARIA algorithm as a standard data encryption scheme for domestic use. In this paper, we propose an HDFS data encryption scheme which supports both the ARIA and AES algorithms on Hadoop. First, the proposed scheme provides an HDFS block-splitting component that performs ARIA/AES encryption and decryption under the Hadoop distributed computing environment. Second, the proposed scheme provides a variable-length data processing component that can perform encryption and decryption by adding dummy data, in cases where the last data block does not contain 128 bits of data. Finally, we show from a performance analysis that our proposed scheme is efficient for various applications, such as word counting, sorting, k-means, and hierarchical clustering.
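The variable-length handling the abstract describes, padding the final block out to the 128-bit (16-byte) ARIA/AES block size with dummy data, can be sketched as follows. PKCS#7-style padding is used here as a stand-in; the paper's exact dummy-data scheme may differ:

```python
# Hedged sketch of padding a trailing data block to the 128-bit cipher
# block size before ARIA/AES encryption. PKCS#7-style: append n dummy
# bytes, each holding the value n, so unpadding is self-describing.

BLOCK = 16  # 128 bits

def pad(data: bytes) -> bytes:
    """Append 1..16 dummy bytes so len(result) is a multiple of BLOCK."""
    n = BLOCK - (len(data) % BLOCK)
    return data + bytes([n]) * n

def unpad(data: bytes) -> bytes:
    """Strip the dummy bytes added by pad()."""
    return data[:-data[-1]]

msg = b"hello hadoop"          # 12 bytes -> padded to one full 16-byte block
padded = pad(msg)
print(len(padded), unpad(padded) == msg)  # prints: 16 True
```

Note that data already a multiple of 16 bytes still gains a full dummy block, so unpadding never misreads real data as padding.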