Biblio

Filters: Keyword is cost reduction
2022-04-01
Dabthong, Hachol, Warasart, Maykin, Duma, Phongsaphat, Rakdej, Pongpat, Majaroen, Natt, Lilakiatsakun, Woraphon.  2021.  Low Cost Automated OS Security Audit Platform Using Robot Framework. 2021 Research, Invention, and Innovation Congress: Innovation Electricals and Electronics (RI2C). :31—34.
Security baseline hardening is a baseline configuration framework that aims to improve the robustness of the operating system, lowering the risk and impact of breach incidents. In typical best practice, security baseline hardening requires regular checks and follow-up to keep the system in line with the baseline; this set of activities is called a "Security Baseline Audit". The Security Baseline Audit process is the responsibility of the IT department and, from a business perspective, consumes a fair amount of resources such as man-hours, time, and technical knowledge. In a large production environment, those resources are multiplied by the number of systems. This research proposes improving the process with automation while maintaining quality and security at the required standard. Robot Framework, a flexible open-source automation framework, is used to run the audit automatically and verify that the configuration is aligned with CIS (Center for Internet Security) benchmarks, with very successful results: the time and manual effort required are greatly reduced while the configuration remains compliant with the standard.
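As a rough illustration of the kind of check such an audit automates, the minimal Python sketch below verifies one CIS-style setting (SSH root login disabled). The file path, rule, and expected value are illustrative assumptions, not items taken from the paper, which drives such checks from Robot Framework rather than plain Python.

```python
# Minimal sketch of one automated baseline-audit check (illustrative only;
# the paper runs such checks from Robot Framework against CIS benchmarks).
from pathlib import Path

def check_ssh_root_login_disabled(config_path="/etc/ssh/sshd_config"):
    """Return True if PermitRootLogin is explicitly set to 'no' (hypothetical CIS-style rule)."""
    try:
        lines = Path(config_path).read_text().splitlines()
    except FileNotFoundError:
        return False  # treat a missing config file as a failed check
    for line in lines:
        stripped = line.strip()
        if stripped.startswith("PermitRootLogin"):
            return stripped.split()[-1].lower() == "no"
    return False  # setting absent: flag as non-compliant rather than trust defaults

if __name__ == "__main__":
    status = "PASS" if check_ssh_root_login_disabled() else "FAIL"
    print(f"CIS-style check 'PermitRootLogin no': {status}")
```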
2021-03-30
Gillen, R. E., Carter, J. M., Craig, C., Johnson, J. A., Scott, S. L..  2020.  Assessing Anomaly-Based Intrusion Detection Configurations for Industrial Control Systems. 2020 IEEE 21st International Symposium on "A World of Wireless, Mobile and Multimedia Networks" (WoWMoM). :360—366.

To reduce cost and ease maintenance, industrial control systems (ICS) have adopted Ethernet-based interconnections that integrate operational technology (OT) systems with information technology (IT) networks. This integration has made these critical systems vulnerable to attack. Security solutions tailored to ICS environments are an active area of research. Anomaly-based network intrusion detection systems are well-suited for these environments. Often these systems must be optimized for their specific environment. In prior work, we introduced a method for assessing the impact of various anomaly-based network IDS settings on security. This paper reviews the experimental outcomes when we applied our method to a full-scale ICS test bed using actual attacks. Our method provides new and valuable data to operators, enabling more informed decisions about IDS configurations.

2020-12-01
Zhang, Y., Deng, L., Chen, M., Wang, P..  2018.  Joint Bidding and Geographical Load Balancing for Datacenters: Is Uncertainty a Blessing or a Curse? IEEE/ACM Transactions on Networking. 26:1049—1062.

We consider the scenario where a cloud service provider (CSP) operates multiple geo-distributed datacenters to provide Internet-scale service. Our objective is to minimize the total electricity and bandwidth cost by jointly optimizing electricity procurement from wholesale markets and geographical load balancing (GLB), i.e., dynamically routing workloads to locations with cheaper electricity. Under the ideal setting where exact values of market prices and workloads are given, this problem reduces to a simple linear program and is easy to solve. However, under the realistic setting where only distributions of these variables are available, the problem unfolds into a non-convex infinite-dimensional one and is challenging to solve. One of our main contributions is to develop an algorithm that is proven to solve the challenging problem optimally, by exploring the full design space of strategic bidding. Trace-driven evaluations corroborate our theoretical results, demonstrate fast convergence of our algorithm, and show that it can reduce the cost for the CSP by up to 20% as compared with baseline alternatives. This paper highlights the intriguing role of uncertainty in workloads and market prices, measured by their variances. While uncertainty in workloads deteriorates the cost-saving performance of joint electricity procurement and GLB, counter-intuitively, uncertainty in market prices can be exploited to achieve a cost reduction even larger than the setting without price uncertainty.
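For the ideal, deterministic setting the abstract mentions, a toy linear program of this flavor can be written down directly. The sketch below (with made-up prices, capacities, and a single aggregate workload) uses scipy.optimize.linprog and only illustrates the structure, not the authors' formulation.

```python
# Toy deterministic version of joint procurement + geographical load balancing:
# route a fixed workload W across datacenters to minimize electricity cost.
# Prices, capacities, and energy-per-request values are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

price = np.array([30.0, 45.0, 25.0])       # $/MWh at each datacenter's market
energy_per_req = 0.002                      # MWh per unit of workload (assumed)
capacity = np.array([400.0, 300.0, 200.0])  # max workload each site can serve
W = 700.0                                   # total workload to be served

c = price * energy_per_req                  # cost per unit of workload at each site
A_eq = np.ones((1, 3))                      # all workload must be served somewhere
b_eq = [W]
bounds = [(0, cap) for cap in capacity]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("allocation:", res.x, "total cost:", res.fun)
```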

2020-11-02
Bloom, Gedare, Alsulami, Bassma, Nwafor, Ebelechukwu, Bertolotti, Ivan Cibrario.  2018.  Design patterns for the industrial Internet of Things. 2018 14th IEEE International Workshop on Factory Communication Systems (WFCS). :1—10.
The Internet of Things (IoT) is a vast collection of interconnected sensors, devices, and services that share data and information over the Internet with the objective of leveraging multiple information sources to optimize related systems. The technologies associated with the IoT have significantly improved the quality of many existing applications by reducing costs, improving functionality, increasing access to resources, and enhancing automation. The adoption of IoT by industries has led to the next industrial revolution: Industry 4.0. The rise of the Industrial IoT (IIoT) promises to enhance factory management, process optimization, worker safety, and more. However, the rollout of the IIoT is not without significant issues, and many of these act as major barriers that prevent fully achieving the vision of Industry 4.0. One major area of concern is the security and privacy of the massive datasets that are captured and stored, which may leak information about intellectual property, trade secrets, and other competitive knowledge. As a way forward toward solving security and privacy concerns, we aim in this paper to identify common input-output (I/O) design patterns that exist in applications of the IIoT. These design patterns enable constructing an abstract model representation of data flow semantics used by such applications, and therefore better understand how to secure the information related to IIoT operations. In this paper, we describe communication protocols and identify common I/O design patterns for IIoT applications with an emphasis on data flow in edge devices, which, in the industrial control system (ICS) setting, are most often involved in process control or monitoring.
2020-06-01
Nikolaidis, Fotios, Kossifidis, Nick, Leibovici, Thomas, Zertal, Soraya.  2018.  Towards a TRansparent I/O Solution. 2018 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW). :1221–1228.
The benefits of data distribution to multiple storage platforms with different characteristics have been widely acknowledged. Such systems are more tolerant to outages and bottlenecks and allow for more flexible policies regarding cost reduction, security, and workload diversity. To leverage multiple platforms simultaneously, additional orchestration steps are needed. Existing approaches either implement such steps in the application's source code, resulting in minimal reusability across applications, or handle them at the infrastructure level. The latter usually involves over-engineering to handle different application behaviors and binds the system to a specific infrastructure. In this paper we present a middleware that decouples the I/O path from the application's source code and performs in-transit processing before data lands on the storage platforms. Abstracting the I/O process as a graph of reusable components allows developers to easily implement complex storage solutions without the burden of writing custom code. Similarly, administrators can create their own graph that reflects the infrastructure setup and append it to the preceding graph, so that various policies and infrastructure-related changes can be performed transparently to the application. Users can also extend the graph chain to enhance the application's functionality by using plug-ins. Our approach eliminates the need for custom I/O management code and allows applications to evolve independently of the storage back-end. To evaluate our system we employed a secure web service scenario that was seamlessly adapted to changes in its storage back-end.
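The idea of expressing the I/O path as a chain of reusable components can be pictured with a small composition sketch like the one below; the stage names (compress, obfuscate, store) and the code itself are illustrative assumptions, not the middleware's actual API.

```python
# Illustrative sketch of an I/O path built as a graph (here, a simple chain)
# of reusable stages that transform data before it reaches the storage back-ends.
import zlib
from typing import Callable, List

Stage = Callable[[bytes], bytes]

def compress(data: bytes) -> bytes:
    return zlib.compress(data)

def xor_obfuscate(key: int) -> Stage:
    # Stand-in for a real encryption stage; NOT secure, purely illustrative.
    return lambda data: bytes(b ^ key for b in data)

def make_pipeline(stages: List[Stage]) -> Stage:
    def run(data: bytes) -> bytes:
        for stage in stages:
            data = stage(data)
        return data
    return run

backends = {}
def store(name: str) -> Stage:
    def sink(data: bytes) -> bytes:
        backends[name] = data      # in reality: POSIX file, object store, etc.
        return data
    return sink

# The application writes through the pipeline without any storage-specific code.
pipeline = make_pipeline([compress, xor_obfuscate(0x5A), store("object-store")])
pipeline(b"application payload that never needs storage-specific code")
print(len(backends["object-store"]), "bytes landed on the (mock) back-end")
```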
2020-03-09
Portolan, Michele, Savino, Alessandro, Leveugle, Regis, Di Carlo, Stefano, Bosio, Alberto, Di Natale, Giorgio.  2019.  Alternatives to Fault Injections for Early Safety/Security Evaluations. 2019 IEEE European Test Symposium (ETS). :1–10.
Functional Safety standards like ISO 26262 require a detailed analysis of the dependability of components subjected to perturbations. Radiation testing or even much more abstract RTL fault injection campaigns are costly and complex to set up, especially for SoCs and Cyber Physical Systems (CPSs) comprising intertwined hardware and software. Moreover, some approaches are only applicable at the very end of the development cycle, making potential iterations difficult when market pressure and cost reduction are paramount. In this tutorial, we present a summary of classical state-of-the-art approaches, then alternative approaches to dependability analysis that can give an early yet accurate estimation of the safety or security characteristics of HW-SW systems. Designers can rely on these tools to identify issues in their design to be addressed by protection mechanisms, ensuring that system dependability constraints are met with limited risk when the design is later subjected to the usual fault injections and to, e.g., radiation testing or laser attacks for certification.
2020-02-10
Lakshminarayana, Subhash, Belmega, E. Veronica, Poor, H. Vincent.  2019.  Moving-Target Defense for Detecting Coordinated Cyber-Physical Attacks in Power Grids. 2019 IEEE International Conference on Communications, Control, and Computing Technologies for Smart Grids (SmartGridComm). :1–7.
This work proposes a moving target defense (MTD) strategy to detect coordinated cyber-physical attacks (CCPAs) against power grids. A CCPA consists of a physical attack, such as disconnecting a transmission line, followed by a coordinated cyber attack that injects false data into the sensor measurements to mask the effects of the physical attack. Such attacks can lead to undetectable line outages and cause significant damage to the grid. The main idea of the proposed approach is to invalidate the knowledge that the attackers use to mask the effects of the physical attack by actively perturbing the grid's transmission line reactances using distributed flexible AC transmission system (D-FACTS) devices. We identify the MTD design criteria in this context to thwart CCPAs. The proposed MTD design consists of two parts. First, we identify the subset of links for D-FACTS device deployment that enables the defender to detect CCPAs against any link in the system. Then, in order to minimize the defense cost during the system's operational time, we use a game-theoretic approach to identify the best subset of links (within the D-FACTS deployment set) to perturb while still providing adequate protection. Extensive simulations performed using the MATPOWER simulator on IEEE bus systems verify the effectiveness of our approach in detecting CCPAs and reducing the operator's defense cost.
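A minimal numerical picture of why perturbing reactances exposes a stale false-data attack, using a toy DC state-estimation model with made-up numbers; the matrices and scaling factors below are assumptions for illustration, not the paper's test systems.

```python
# Toy illustration: a false-data injection crafted from the old measurement
# matrix passes the residual test, but fails after MTD perturbs line reactances.
import numpy as np

def residual_norm(z, H):
    """Least-squares state estimate and the norm of the measurement residual."""
    x_hat, *_ = np.linalg.lstsq(H, z, rcond=None)
    return np.linalg.norm(z - H @ x_hat)

# Hypothetical 4-measurement, 2-state DC model (rows ~ line susceptances).
H_old = np.array([[10.0,  0.0],
                  [ 0.0,  8.0],
                  [10.0, -8.0],
                  [ 5.0,  3.0]])
x_true = np.array([0.05, -0.02])

# Attacker builds an injection in the column space of H_old: undetectable.
a = H_old @ np.array([0.01, 0.03])
print(residual_norm(H_old @ x_true + a, H_old))           # ~0, attack hidden

# MTD: D-FACTS devices change reactances, so the true model is now H_new,
# while the attacker's injection is still based on the stale H_old.
H_new = H_old * np.array([[1.10], [0.90], [1.05], [1.20]])
print(residual_norm(H_new @ x_true + a, H_new))           # clearly > 0, exposed
```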
2018-12-03
Shearon, C. E..  2018.  IPC-1782 standard for traceability of critical items based on risk. 2018 Pan Pacific Microelectronics Symposium (Pan Pacific). :1–3.

Traceability has grown from being a specialized need for certain safety-critical segments of the industry to being a recognized value-add tool for the industry as a whole, usable for manual through automated processes end to end throughout the supply chain. The perception persists that traceability data collection is a burden that provides value only when the rarest and most disastrous of events take place. Disparate standards, mainly dictated by large OEM companies, have evolved in the industry and create confusion as a multitude of requirements and definitions proliferate. The intent of the IPC-1782 project is to bring the whole principle of traceability up to date and enable business to move faster, increase revenue, increase productivity, and decrease costs as a result of increased trust. Traceability, as defined in this standard, will represent the most effective quality tool available, becoming an intrinsic part of best-practice operations, with the encouragement of automated data collection from existing manufacturing systems, which works well with Industry 4.0; it integrates quality, reliability, product safety, predictive (routine, preventative, and corrective) maintenance, throughput, manufacturing, engineering, and supply-chain data, reducing cost of ownership as well as ensuring timeliness and accuracy all the way from a finished product back through to the initial materials and granular attributes of the processes along the way. The goal of this standard is to create a single expandable and extendable data structure that can be adopted for all levels of traceability and enable information to be exchanged easily, as appropriate, across many industries. The scope includes support for the most demanding instances of detail and integrity, such as those required by critical safety systems, all the way through to situations where only basic traceability, such as for simple consumer products, is required. A key driver for adoption of the standard is the ability to find a relevant and achievable level of traceability that exactly meets the requirement following a risk assessment of the business. The wealth of data that traceability makes accessible for analysis (e.g., Big Data) can quickly yield information that supports very significant quality and performance improvements, provides protection against the costs of issues in the market, and supplies very timely information to regulatory bodies and to consumers/customers as appropriate. This information can also be used to quickly raise yields, drive product innovation that resonates with consumers, and help shape development tests and design requirements that are meaningful to the marketplace. The standard thus helps businesses leverage IPC-1782 to obtain the best value from component traceability.
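Purely as an illustrative guess at what a "single expandable data structure" for traceability might look like at its simplest, the Python sketch below links a finished item back to materials and process attributes; the field names are assumptions, not the IPC-1782 schema.

```python
# Hypothetical, minimal traceability record linking a finished item back to
# its materials and process steps; field names are illustrative, not IPC-1782.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ProcessStep:
    station: str                                          # e.g. "reflow-oven-3"
    attributes: Dict[str, str] = field(default_factory=dict)

@dataclass
class TraceRecord:
    item_id: str                                          # finished-product serial number
    materials: List[str] = field(default_factory=list)    # lot/batch identifiers
    steps: List[ProcessStep] = field(default_factory=list)

record = TraceRecord(
    item_id="SN-000123",
    materials=["solder-lot-77", "pcb-batch-A4"],
    steps=[ProcessStep("placement-line-1", {"feeder": "F12"}),
           ProcessStep("reflow-oven-3", {"peak_temp_C": "245"})],
)
print(record.item_id, "built from", record.materials)
```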

2018-04-11
Alsaiari, U., Gebali, F., Abd-El-Barr, M..  2017.  Programmable Assertion Checkers for Hardware Trojan Detection. 2017 1st Conference on PhD Research in Microelectronics and Electronics Latin America (PRIME-LA). :1–4.

Due to the increase in design complexity and cost of VLSI chips, a number of design houses outsource manufacturing and import designs in order to reduce cost. This reduces the authenticity and security of the manufactured product. Since product development involves outside sources, circuit designers cannot guarantee that their hardware has not been altered. It is often possible for attackers to include additional hardware in order to gain privileges over the original circuit or cause damage to the product. These added circuits are called "Hardware Trojans". In this paper, we investigate the modules needed for the detection of hardware Trojans. We also introduce a programmable logic fabric that can be used to implement hardware assertion checkers. Our goal is to utilize the provided programmable fabric in a System on Chip (SoC) and optimize the hardware assertions to cover the detection of most hardware Trojans in each core of the target SoC.
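As a loose software analogy for what an assertion checker does (the paper targets programmable hardware fabric, not Python), the sketch below monitors a hypothetical one-hot state register each cycle and flags the invariant violation that a Trojan-triggered bit flip would cause.

```python
# Software analogy of a hardware assertion checker: verify a one-hot invariant
# on a state register every "cycle" and raise an alarm when a Trojan payload
# sets an extra bit. Purely illustrative; real checkers live in logic fabric.
def one_hot(value: int, width: int = 4) -> bool:
    return bin(value).count("1") == 1 and value < (1 << width)

def run_cycles(states):
    for cycle, state in enumerate(states):
        if not one_hot(state):
            print(f"cycle {cycle}: assertion FIRED, state={state:04b}")
            return False
    print("all cycles passed the assertion")
    return True

normal_trace = [0b0001, 0b0010, 0b0100, 0b1000]
trojan_trace = [0b0001, 0b0010, 0b0110, 0b1000]   # extra bit set by the Trojan
run_cycles(normal_trace)
run_cycles(trojan_trace)
```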

2015-05-06
Miyoung Jang, Min Yoon, Jae-Woo Chang.  2014.  A privacy-aware query authentication index for database outsourcing. Big Data and Smart Computing (BIGCOMP), 2014 International Conference on. :72-76.

Recently, cloud computing has been spotlighted as a new paradigm of database management. In this environment, databases are outsourced and deployed on a service provider in order to reduce the cost of data storage and maintenance. However, the service provider might be untrusted, so two data security issues, data confidentiality and query result integrity, become major concerns for users. Existing bucket-based data authentication methods have the problem that the original spatial data distribution can be disclosed from the data authentication index because of unsophisticated data grouping strategies. In addition, the transmission overhead of the verification object is high. In this paper, we propose a privacy-aware query authentication index that guarantees data confidentiality and query result integrity for users. A periodic function-based data grouping scheme is designed to privately partition a spatial database into small groups and generate a signature for each group. The group signature is used to check the correctness and completeness of the outsourced data when answering a range query. Performance evaluation shows that the proposed method outperforms the existing method, reducing range query processing time by up to a factor of 3.
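A rough sketch of the verification idea: the data owner groups records, publishes a per-group tag, and the client checks a range-query answer against those tags. HMAC stands in for real signatures, the grouping is plain bucketing rather than the paper's periodic-function scheme, and all names are illustrative assumptions.

```python
# Sketch of group-tag-based query-result verification (illustrative only):
# the data owner tags each group of records; a client recomputes the tag over
# the returned group to check correctness/completeness of a range answer.
import hashlib
import hmac

OWNER_KEY = b"owner-secret"             # shared with clients, not with the server

def group_tag(records):
    msg = b"|".join(f"{k}:{v}".encode() for k, v in sorted(records))
    return hmac.new(OWNER_KEY, msg, hashlib.sha256).hexdigest()

# Owner partitions the (toy, 1-D) data into groups and publishes the tags.
data = [(3, "a"), (7, "b"), (12, "c"), (18, "d"), (25, "e")]
groups = {0: [r for r in data if r[0] < 10],
          1: [r for r in data if 10 <= r[0] < 20],
          2: [r for r in data if r[0] >= 20]}
published_tags = {gid: group_tag(rs) for gid, rs in groups.items()}

# The untrusted server answers a range query [10, 20) by returning whole groups.
answer = {1: groups[1]}                 # try dropping a record to see verification fail
ok = all(group_tag(rs) == published_tags[gid] for gid, rs in answer.items())
print("verification", "passed" if ok else "FAILED")
```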

2015-05-05
Pal, S.K., Sardana, P., Sardana, A..  2014.  Efficient search on encrypted data using bloom filter. Computing for Sustainable Global Development (INDIACom), 2014 International Conference on. :412-416.

Efficient and secure search on encrypted data is an important problem in computer science. Users with large amounts of data or information in multiple documents face problems with their storage and security. Cloud services have also become popular due to the reduction in storage cost and flexibility of use, but there is a risk of data loss, misuse, and theft. Reliability and security of data stored in the cloud are a matter of concern, especially for critical applications and for those in which security and privacy of the data are important. Cryptographic techniques provide solutions for preserving the confidentiality of data but make the data unusable for many applications. In this paper we report a novel approach to securely store the data at a remote location and perform search in constant time without the need to decrypt the documents. We use Bloom filters to perform simple as well as advanced search operations such as case-sensitive search, sentence search, and approximate search.
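A minimal Bloom-filter sketch showing how keyword membership can be tested without storing (or revealing) the keywords themselves; the hash construction and parameters below are illustrative choices, not the paper's scheme.

```python
# Minimal Bloom filter: insert keywords, then test membership without keeping
# the plaintext keywords around. Parameters and hashing are illustrative choices.
import hashlib

class BloomFilter:
    def __init__(self, m_bits=1024, k_hashes=4):
        self.m, self.k = m_bits, k_hashes
        self.bits = bytearray(m_bits // 8)

    def _positions(self, word: str):
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{word}".encode()).digest()
            yield int.from_bytes(digest[:4], "big") % self.m

    def add(self, word: str):
        for pos in self._positions(word):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def probably_contains(self, word: str) -> bool:
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(word))

bf = BloomFilter()
for kw in ["encryption", "bloom", "search"]:
    bf.add(kw)
print(bf.probably_contains("bloom"), bf.probably_contains("plaintext"))  # True False (w.h.p.)
```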
 

2015-05-04
Shaobu Wang, Shuai Lu, Ning Zhou, Guang Lin, Elizondo, M., Pai, M.A..  2014.  Dynamic-Feature Extraction, Attribution, and Reconstruction (DEAR) Method for Power System Model Reduction. Power Systems, IEEE Transactions on. 29:2049-2059.

In interconnected power systems, dynamic model reduction can be applied to generators outside the area of interest (i.e., the study area) to reduce the computational cost associated with transient stability studies. This paper presents a method of deriving the reduced dynamic model of the external area based on dynamic response measurements. The method consists of three steps, namely dynamic-feature extraction, attribution, and reconstruction (DEAR). In this method, a feature extraction technique, such as singular value decomposition (SVD), is applied to the measured generator dynamics after a disturbance. Characteristic generators are then identified in the feature attribution step by matching the extracted dynamic features with the highest similarity, forming a suboptimal “basis” of system dynamics. In the reconstruction step, generator state variables such as rotor angles and voltage magnitudes are approximated with a linear combination of the characteristic generators, resulting in a quasi-nonlinear reduced model of the original system. The network model is unchanged in the DEAR method. Tests on several IEEE standard systems show that the proposed method yields a better reduction ratio and smaller response errors than traditional coherency-based reduction methods.
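The feature-extraction flavor of the method can be pictured with a small numpy SVD sketch over synthetic post-disturbance trajectories; the signals and the way "characteristic" generators are picked here (largest loading on the leading mode) are illustrative assumptions, not the paper's exact attribution criterion.

```python
# Toy sketch of the feature-extraction idea behind DEAR: stack measured
# generator responses, take an SVD, and approximate every trajectory from a
# few "characteristic" ones. Synthetic data and the selection rule are assumed.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 5, 200)
modes = np.vstack([np.exp(-0.3 * t) * np.sin(2 * np.pi * 0.8 * t),
                   np.exp(-0.1 * t) * np.sin(2 * np.pi * 0.3 * t)])
mixing = rng.normal(size=(10, 2))                        # 10 generators, 2 shared modes
X = mixing @ modes + 0.01 * rng.normal(size=(10, 200))   # rotor-angle-like traces

U, s, Vt = np.linalg.svd(X, full_matrices=False)   # dynamic-feature extraction
k = 2                                              # dominant features to keep
chars = np.argsort(-np.abs(U[:, 0]))[:k]           # "characteristic" generators (assumed rule)

# Reconstruct all trajectories as a linear combination of the characteristic ones.
coeffs, *_ = np.linalg.lstsq(X[chars].T, X.T, rcond=None)
X_hat = (X[chars].T @ coeffs).T
print("relative reconstruction error:", np.linalg.norm(X - X_hat) / np.linalg.norm(X))
```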
 

2015-04-30
Omote, K., Thao, T.P..  2014.  A New Efficient and Secure POR Scheme Based on Network Coding. Advanced Information Networking and Applications (AINA), 2014 IEEE 28th International Conference on. :98-105.

As the amount of information grows quickly, data owners tend to outsource their data to an external service provider, i.e., cloud computing. Using the cloud, clients can remotely store their data without the burden of local data storage and maintenance. However, such a service provider is untrusted; therefore there are challenges in data security: integrity, availability, and confidentiality. Since integrity and availability are prerequisites for the existence of a system, we mainly focus on them rather than on confidentiality. To ensure integrity and availability, researchers have proposed network-coding-based POR (Proof of Retrievability) schemes that enable the servers to demonstrate whether the data is retrievable or not. However, most network-coding-based POR schemes are inefficient in data checking and cannot prevent a common attack on POR: the small-corruption attack. In this paper, we propose a new network-coding-based POR scheme that uses a dispersal code in order to reduce the cost of the checking phase and to prevent the small-corruption attack.
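A toy picture of the redundancy that retrievability checks rely on: store data blocks plus an XOR parity block across servers, spot-check a block against the others, and rebuild it if a server corrupts it. This is plain parity, used only to illustrate the idea; it is not the paper's network-coding or dispersal-code construction.

```python
# Toy redundancy/recovery sketch (NOT the paper's dispersal code): k data
# blocks plus one XOR parity block let a client rebuild any single lost or
# corrupted block, which is the property a retrievability audit depends on.
from functools import reduce

def xor_blocks(blocks):
    return bytes(reduce(lambda a, b: a ^ b, byte_tuple) for byte_tuple in zip(*blocks))

data_blocks = [b"block-0!", b"block-1!", b"block-2!"]
parity = xor_blocks(data_blocks)
servers = data_blocks + [parity]          # block i stored on server i

# Audit: server 1 returns a corrupted block; cross-check against the others
# detects the corruption, and the parity allows the block to be repaired.
corrupted = bytearray(servers[1]); corrupted[0] ^= 0xFF
repaired = xor_blocks([servers[0], servers[2], servers[3]])
print("corruption detected:", bytes(corrupted) != repaired)
print("repaired block matches original:", repaired == data_blocks[1])
```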
