Biblio

Found 951 results

Filters: First Letter Of Last Name is E
2022-01-25
Sureshkumar, S, Agash, C P, Ramya, S, Kaviyaraj, R, Elanchezhiyan, S.  2021.  Augmented Reality with Internet of Things. 2021 International Conference on Artificial Intelligence and Smart Systems (ICAIS). :1426–1430.
Today, technological change is turning complex tasks into simple ones with greater accuracy in many areas, most visibly in the manufacturing industry. The Internet of Things contributes a major part of this automation, making life easier by monitoring conditions and alerting the relevant person within a fraction of a second. Continuous advances in computer vision, mobile computing, and tablet screens have led to a revived interest in Augmented Reality. AR simplifies complex automation by rendering realistic real-time animations for monitoring and control on the Internet of Things (e.g., temperature, time, object information, installation manuals, real-time testing), and it can identify and link augmented content such as object control of home and industrial appliances. The AR-IoT combination creates a much more comfortable environment and enhances the overall interactivity of the IoT environment. Augmented Reality applications use the myriad data generated by IoT devices and components, helping workers become more competitive and productive within a realistic view of the IoT environment. Together, Augmented Reality and the Internet of Things play a critical role in the development of next-generation technologies. This paper describes how Augmented Reality can be integrated with Industry 4.0 (AR-IoT) and how sensors are used to monitor objects/things continuously around the clock, converting real-time physical objects into smart things for the coming AR-IoT era.
2022-03-01
ElDiwany, Belal Essam, El-Sherif, Amr A., ElBatt, Tamer.  2021.  Network-Coded Wireless Powered Cellular Networks: Lifetime and Throughput Analysis. 2021 IEEE Wireless Communications and Networking Conference (WCNC). :1–6.
In this paper, we study a wireless powered cellular network (WPCN) supported with network coding capability. In particular, we consider a network consisting of k cellular users (CUs) served by a hybrid access point (HAP) that handles energy transfer to the users in addition to information transmission over both the uplink (UL) and downlink (DL). Each CU has k+1 states representing its communication behavior, which are collectively referred to as the user demand profile. Opportunistically, when the CUs have information to be exchanged through the HAP, it broadcasts this information in coded format to the exchanging pairs, saving time slots over the DL. These saved slots are then utilized by the HAP to prolong the network lifetime and enhance the network throughput. We quantify, analytically, the performance gain of our network-coded WPCN over a conventional one that does not employ network coding, in terms of network lifetime and throughput. We consider the two extreme cases of using all the saved slots either for energy boosting or for throughput enhancement. In addition, a lifetime/throughput optimization is carried out by the HAP to balance the assignment of saved slots in an optimized fashion, where the problem is formulated as a mixed-integer linear programming optimization problem. Numerical results exhibit the network performance gains from the lifetime and throughput perspectives for a uniform user demand profile across all CUs. Moreover, biasing the user demand profile of some CUs in the network reveals considerable improvement in the network performance gains.
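For intuition, here is a minimal Python sketch (not the authors' code; the pairing, slot accounting, and the alpha split are illustrative assumptions) of how XOR network coding saves one DL slot per exchanging pair and how the HAP might divide the saved slots between energy transfer and extra throughput:

def saved_dl_slots(exchanging_pairs):
    # one coded broadcast replaces two separate DL transmissions,
    # saving one DL slot per exchanging pair
    return len(exchanging_pairs)

def allocate_saved_slots(saved, alpha):
    # fraction alpha of the saved slots goes to energy transfer (lifetime),
    # the remainder to additional transmissions (throughput)
    energy_slots = int(alpha * saved)
    return energy_slots, saved - energy_slots

pairs = [(1, 4), (2, 5), (3, 6)]  # hypothetical CU pairs exchanging via the HAP
energy, throughput = allocate_saved_slots(saved_dl_slots(pairs), alpha=0.5)
print(f"slots for energy boosting: {energy}, slots for extra throughput: {throughput}")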
2022-08-12
Aumayr, Lukas, Maffei, Matteo, Ersoy, Oğuzhan, Erwig, Andreas, Faust, Sebastian, Riahi, Siavash, Hostáková, Kristina, Moreno-Sanchez, Pedro.  2021.  Bitcoin-Compatible Virtual Channels. 2021 IEEE Symposium on Security and Privacy (SP). :901–918.
Current permissionless cryptocurrencies such as Bitcoin suffer from a limited transaction rate and slow confirmation time, which hinders further adoption. Payment channels are one of the most promising solutions to address these problems, as they allow the parties of the channel to perform arbitrarily many payments in a peer-to-peer fashion while uploading only two transactions on the blockchain. This concept has been generalized into payment channel networks where a path of payment channels is used to settle the payment between two users that might not share a direct channel between them. However, this approach requires the active involvement of each user in the path, making the system less reliable (they might be offline), more expensive (they charge fees per payment), and slower (intermediaries need to be actively involved in the payment). To mitigate this issue, recent work has introduced the concept of virtual channels (IEEE S&P’19), which involve intermediaries only in the initial creation of a bridge between payer and payee, who can later on independently perform arbitrarily many off-chain transactions. Unfortunately, existing constructions are only available for Ethereum, as they rely on its account model and Turing-complete scripting language. The realization of virtual channels in other blockchain technologies with limited scripting capabilities, like Bitcoin, was so far considered an open challenge. In this work, we present the first virtual channel protocols that are built on the UTXO-model and require a scripting language supporting only a digital signature scheme and a timelock functionality, being thus backward compatible with virtually every cryptocurrency, including Bitcoin. We formalize the security properties of virtual channels as an ideal functionality in the Universal Composability framework and prove that our protocol constitutes a secure realization thereof. We have prototyped and evaluated our protocol on the Bitcoin blockchain, demonstrating its efficiency: for n sequential payments, they require an off-chain exchange of 9+2n transactions or a total of 3524+695n bytes, with no on-chain footprint in the optimistic case. This is a substantial improvement compared to routing payments in a payment channel network, which requires 8n transactions with a total of 3026n bytes to be exchanged.
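Using only the byte counts quoted in the abstract, a short Python check (illustrative, not from the paper) shows how quickly the virtual channel amortizes its fixed setup cost against routed payments:

def virtual_channel_bytes(n):
    return 3524 + 695 * n        # fixed setup plus per-payment cost

def routed_payment_bytes(n):
    return 3026 * n              # per-payment cost in a payment channel network

n = 1
while virtual_channel_bytes(n) >= routed_payment_bytes(n):
    n += 1
print(f"virtual channel is cheaper in off-chain bytes from n = {n} payments")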
2022-06-15
Kurt, Ahmet, Mercan, Suat, Erdin, Enes, Akkaya, Kemal.  2021.  Enabling Micro-payments on IoT Devices using Bitcoin Lightning Network. 2021 IEEE International Conference on Blockchain and Cryptocurrency (ICBC). :1–3.
Lightning Network (LN) addresses the scalability problem of Bitcoin by leveraging off-chain transactions. Nevertheless, it is not possible to run LN on resource-constrained IoT devices due to its storage, memory, and processing requirements. Therefore, in this paper, we propose an efficient and secure protocol that enables an IoT device to use LN's functions through a gateway LN node. The idea is to involve the IoT device in LN operations with its digital signature by replacing original 2-of-2 multisignature channels with 3-of-3 multisignature channels. Our protocol enforces the LN gateway to request the IoT device's cryptographic signature for all operations on the channel. We evaluated the proposed protocol by implementing it on a Raspberry Pi for a toll payment scenario and demonstrated its feasibility and security.
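The core policy change is easy to state in code. The following Python sketch (an assumed illustration, not the authors' implementation) captures the 3-of-3 rule: no channel operation proceeds unless the LN gateway, the IoT device, and the channel counterparty have all signed:

REQUIRED_SIGNERS = {"ln_gateway", "iot_device", "counterparty"}

def authorize_channel_operation(signatures):
    # signatures: dict mapping signer name -> True if its signature verified
    verified = {name for name, ok in signatures.items() if ok}
    return REQUIRED_SIGNERS <= verified   # all three must have signed

print(authorize_channel_operation(
    {"ln_gateway": True, "iot_device": True, "counterparty": True}))   # True
print(authorize_channel_operation(
    {"ln_gateway": True, "iot_device": False, "counterparty": True}))  # False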
2021-12-21
Davis, Brionna, Jennings, Grace, Pothast, Taylor, Gerostathopoulos, Ilias, Pournaras, Evangelos, Stern, Raphael.  2021.  Decentralized Optimization of Vehicle Route Planning - A Cross-City Comparative Study. IEEE Internet Computing. 25(3):34–42.

The introduction of connected and autonomous vehicles enables new possibilities in vehicle routing: knowing the origin and destination of each vehicle in the network allows for coordinated real-time routing of the vehicles to optimize network performance. However, this relies on individual vehicles being “altruistic,” i.e., willing to accept alternative, less-preferred routes. We conduct a study to compare different levels of agent altruism in decentralized vehicle coordination and their effect on network-level traffic performance. This work introduces novel load-balancing scenarios of traffic flow in real-world cities for varied levels of agent altruism. We show evidence that the new decentralized optimization router is more effective in networks under high load.

2022-04-25
El Rai, Marwa, Al-Saad, Mina, Darweesh, Muna, Al Mansoori, Saeed, Al Ahmad, Hussain, Mansoor, Wathiq.  2021.  Moving Objects Segmentation in Infrared Scene Videos. 2021 4th International Conference on Signal Processing and Information Security (ICSPIS). :17–20.
Nowadays, developing an intelligent system for segmenting moving objects from the background is an essential task for video surveillance applications. Recently, a deep learning segmentation algorithm called FgSegNet_S, composed of an encoder CNN, a Feature Pooling Module, and a decoder CNN, has been proposed; it can train a model using only a few training examples. However, FgSegNet_S relies only on spatial information, while temporal information is fundamental for distinguishing whether an object is moving. In this paper, an improved version known as T_FgSegNet_S is proposed, which uses the images obtained by subtracting the initial background as input. The proposed approach is trained and evaluated using two publicly available infrared datasets: remote-scene infrared videos captured by medium-wave infrared (MWIR) sensors and the Grayscale Thermal Foreground Detection (GTFD) dataset. The performance of the network is evaluated using precision, recall, and F-measure metrics. The experiments show improved results, especially when compared to other state-of-the-art methods.
2022-07-12
Farrukh, Yasir Ali, Ahmad, Zeeshan, Khan, Irfan, Elavarasan, Rajvikram Madurai.  2021.  A Sequential Supervised Machine Learning Approach for Cyber Attack Detection in a Smart Grid System. 2021 North American Power Symposium (NAPS). :1–6.
Modern smart grid systems are heavily dependent on information and communication technology, and this dependency makes them prone to cyberattacks. The occurrence of cyberattacks has increased in recent years, resulting in substantial damage to power systems. For reliable and stable operation, cyber protection, control, and detection techniques are becoming essential, and automated detection of cyberattacks with high accuracy is a challenge. To address this, we propose a two-layer hierarchical machine learning model, with an accuracy of 95.44%, to improve the detection of cyberattacks. The first layer of the model is used to distinguish between the two modes of operation: normal state or cyberattack. The second layer is used to classify the state into different types of cyberattacks. The layered approach allows the model to focus its training on the targeted task of each layer, improving model accuracy. To validate the effectiveness of the proposed model, we compared its performance against other recent cyberattack detection models proposed in the literature.
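The layered structure can be sketched in a few lines of scikit-learn (an assumed stand-in; the paper's actual classifiers, features, and data are not reproduced here): a first model separates normal traffic from attacks, and a second model, trained only on attack samples, names the attack type:

import numpy as np
from sklearn.ensemble import RandomForestClassifier

X = np.random.rand(1000, 20)              # placeholder measurement features
y_binary = np.random.randint(0, 2, 1000)  # 0 = normal, 1 = cyberattack
y_type = np.random.randint(0, 4, 1000)    # attack class labels

layer1 = RandomForestClassifier().fit(X, y_binary)
attacks = y_binary == 1
layer2 = RandomForestClassifier().fit(X[attacks], y_type[attacks])

def classify(sample):
    sample = sample.reshape(1, -1)
    if layer1.predict(sample)[0] == 0:
        return "normal"
    return f"cyberattack type {layer2.predict(sample)[0]}"

print(classify(X[0]))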
2022-04-13
Sulaga, D Tulasi, Maag, Angelika, Seher, Indra, Elchouemi, Amr.  2021.  Using Deep learning for network traffic prediction to secure Software networks against DDoS attacks. 2021 6th International Conference on Innovative Technology in Intelligent System and Industrial Applications (CITISIA). :1–10.
Deep learning (DL) is an emerging technology being used in many areas due to its effectiveness. One of its major applications is attack detection, including the prevention of backdoor attacks. Sampling-based measurement approaches in the software-defined network of an Internet of Things (IoT) deployment often result in low accuracy, high overhead, high memory consumption, and poor attack detection. This study aims to review and analyse papers on DL-based network traffic prediction techniques against Distributed Denial of Service (DDoS) attacks in a secure software network. Techniques and approaches that can effectively predict network traffic and detect DDoS attacks have been studied. Based on this review, the major components of each work are identified, from which an overall system architecture is suggested showing the basic processes needed. A major finding is that DL is more effective against DDoS attacks than other state-of-the-art approaches.
2022-04-26
AlQahtani, Ali Abdullah S., Alamleh, Hosam, El-Awadi, Zakaria.  2021.  Secure Digital Signature Validated by Ambient User’s Wi-Fi-enabled devices. 2021 IEEE 5th International Conference on Information Technology, Information Systems and Electrical Engineering (ICITISEE). :159–162.

In cyberspace, a digital signature is a mathematical technique that plays a significant role, especially in validating the authenticity of digital messages, emails, or documents. Furthermore, the digital signature mechanism allows the recipient to trust the authenticity of the received message that is coming from the said sender and that the message was not altered in transit. Moreover, a digital signature provides a solution to the problems of tampering and impersonation in digital communications. In a real-life example, it is equivalent to a handwritten signature or stamp seal, but it offers more security. This paper proposes a scheme to enable users to digitally sign their communications by validating their identity through users’ mobile devices. This is done by utilizing the user’s ambient Wi-Fi-enabled devices. Moreover, the proposed scheme depends on something that a user possesses (i.e., Wi-Fi-enabled devices), and something that is in the user’s environment (i.e., ambient Wi-Fi access points) where the validation process is implemented, in a way that requires no effort from users and removes the "weak link" from the validation process. The proposed scheme was experimentally examined.
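A plausible reading of the ambient-Wi-Fi check, sketched in Python (the scan format, the Jaccard overlap measure, and the threshold are assumptions for illustration): the verifier accepts a signature only if the access points observed by the user's device sufficiently overlap with its own observations:

def ap_overlap(user_scan, verifier_scan):
    user, verifier = set(user_scan), set(verifier_scan)
    return len(user & verifier) / max(len(user | verifier), 1)

user_aps = {"aa:bb:cc:01", "aa:bb:cc:02", "aa:bb:cc:03"}      # BSSIDs seen by the phone
verifier_aps = {"aa:bb:cc:01", "aa:bb:cc:02", "aa:bb:cc:04"}  # BSSIDs seen by the verifier

OVERLAP_THRESHOLD = 0.5   # hypothetical acceptance threshold
if ap_overlap(user_aps, verifier_aps) >= OVERLAP_THRESHOLD:
    print("ambient environment matches: signature validated")
else:
    print("environment mismatch: validation failed")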

2022-06-30
Ergün, Salih, Maden, Fatih.  2021.  An ADC Based Random Number Generator from a Discrete Time Chaotic Map. 2021 26th IEEE Asia-Pacific Conference on Communications (APCC). :79–82.
This paper introduces a robust random number generator based on the Bernoulli discrete chaotic map. An eight-bit SAR ADC is used with the discrete-time chaotic map to generate random bit sequences. Compared to RNGs that use continuous-time chaotic maps, sensitivity to process, voltage, and temperature (PVT) variations is reduced. Thanks to switched-capacitor circuits implementing the Bernoulli chaotic map equations, power consumption is decreased significantly. The proposed design, which has a throughput of 500 kbit/s, is implemented in TSMC 180 nm process technology. The generated bit sequences have successfully passed all four primary tests of the FIPS-140-2 test suite and all tests of the NIST 800-22 test suite without post-processing. Furthermore, the data rate can be increased at the cost of higher power consumption. Hence, the proposed architecture can be utilized in high-speed cryptography applications.
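For readers unfamiliar with the map, here is a tiny software model (purely illustrative; the paper's generator is a switched-capacitor circuit sampled by an eight-bit SAR ADC, not software). It iterates the Bernoulli shift x -> 2x mod 1 in exact rational arithmetic, since in binary floating point the orbit collapses to zero, and emits one bit per iteration:

from fractions import Fraction

def bernoulli_bits(x0, n):
    # exact rational arithmetic keeps the doubling map from collapsing,
    # which it would do in binary floating point
    x, bits = x0 % 1, []
    for _ in range(n):
        x = (2 * x) % 1                      # Bernoulli shift: x -> 2x mod 1
        bits.append(1 if x >= Fraction(1, 2) else 0)
    return bits

stream = bernoulli_bits(Fraction(2, 7919), 64)   # arbitrary rational seed
print("".join(map(str, stream)))                  # roughly balanced for generic seeds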
2022-04-18
Enireddy, Vamsidhar, Somasundaram, K., Mahesh M, P. C. Senthil, Ramkumar Prabhu, M., Babu, D. Vijendra, C, Karthikeyan.  2021.  Data Obfuscation Technique in Cloud Security. 2021 2nd International Conference on Smart Electronics and Communication (ICOSEC). :358–362.
Cloud storage, in general, is a collection of computing resources provided to consumers over the internet on a leased basis. It has several advantages, including simplicity, reliability, scalability, convergence, and cost savings. One of the most significant impediments to cloud computing's growth is security. This paper proposes a security approach for cloud storage. Cloud security now plays a critical part in everyone's life, and because of security concerns, data shared between cloud service providers and other users must be protected from unwanted access. To this end, a Security Service Algorithm (SSA) called MONcrypt is used to secure the information. The methodology is based on data obfuscation techniques, and MONcrypt is offered as Security as a Service (SaaS). When compared to current obfuscation strategies, the proposed methodology offers better efficiency and smarter protection. In contrast to existing methods, MONcrypt reduces the dimensions of the information uploaded to cloud storage: it not only preserves the data's secrecy but also decreases the size of the plaintext, whereas existing methods do not reduce the size of data before it is obfuscated. The findings show that MONcrypt offers optimal protection for data stored in the cloud within the shortest amount of time.
2022-10-20
Alexan, Wassim, Mamdouh, Eyad, Elkhateeb, Abdelrahman, Al-Seba'ey, Fahd, Amr, Ziad, Khalil, Hana.  2021.  Securing Sensitive Data Through Corner Filters, Chaotic Maps and LSB Embedding. 2021 3rd Novel Intelligent and Leading Emerging Sciences Conference (NILES). :359–364.
This paper proposes two multiple-layer message security schemes. Information security is carried out through the implementation of cryptography, steganography, and image processing techniques. In both schemes, the sensitive data is first encrypted by employing a chaotic function. In the first proposed scheme, LSB steganography is then applied to 2D slices of a 3D image. In the second proposed scheme, a corner detection filter is first applied to the 2D slices of a 3D image, and LSB embedding is then carried out in the corner-detected pixels. The number of neighboring pixels used for corner detection is varied and its effect is noted. Performance of the proposed schemes is numerically evaluated using a number of metrics, including the mean squared error (MSE), the peak signal-to-noise ratio (PSNR), the structural similarity index measure (SSIM), the normalized cross-correlation (NCC), the image fidelity (IF), and the image difference (ID). The proposed schemes exhibit superior payload capacity and security in comparison to their counterparts from the literature.
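As a point of reference for the embedding and evaluation steps, here is a compact Python sketch of plain LSB embedding plus the PSNR metric (the chaotic encryption and corner-detection filters of the proposed schemes are omitted; the array shapes are illustrative):

import numpy as np

def lsb_embed(cover, bits):
    # write one payload bit into the least significant bit of each pixel
    stego = cover.copy().ravel()
    stego[:len(bits)] = (stego[:len(bits)] & 0xFE) | bits
    return stego.reshape(cover.shape)

def psnr(cover, stego):
    mse = np.mean((cover.astype(float) - stego.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255 ** 2 / mse)

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in 2D slice
payload = np.random.randint(0, 2, 500, dtype=np.uint8)        # encrypted bits in the paper
stego = lsb_embed(cover, payload)
print(f"PSNR after embedding: {psnr(cover, stego):.2f} dB")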
2022-06-06
Böhm, Fabian, Englbrecht, Ludwig, Friedl, Sabrina, Pernul, Günther.  2021.  Visual Decision-Support for Live Digital Forensics. 2021 IEEE Symposium on Visualization for Cyber Security (VizSec). :58–67.

Performing a live digital forensics investigation on a running system is challenging due to the time pressure under which decisions have to be made. Newly proliferating and frequently applied types of malware (e.g., fileless malware) increase the need to conduct digital forensic investigations in real-time. In the course of these investigations, forensic experts are confronted with a wide range of different forensic tools. The decision, which of those are suitable for the current situation, is often based on the cyber forensics experts’ experience. Currently, there is no reliable automated solution to support this decision-making. Therefore, we derive requirements for visually supporting the decision-making process for live forensic investigations and introduce a research prototype that provides visual guidance for cyber forensic experts during a live digital forensics investigation. Our prototype collects relevant core information for live digital forensics and provides visual representations for connections between occurring events, developments over time, and detailed information on specific events. To show the applicability of our approach, we analyze an exemplary use case using the prototype and demonstrate the support through our approach.

2022-05-10
Hammad, Mohamed, Elmedany, Wael, Ismail, Yasser.  2021.  Design and Simulation of AES S-Box Towards Data Security in Video Surveillance Using IP Core Generator. 2021 International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies (3ICT). :469–476.
Broadcasting applications such as video surveillance systems use High Definition (HD) videos. The use of high-resolution videos significantly increases the data volume of video coding standards such as High-Efficiency Video Coding (HEVC) and Advanced Video Coding (AVC), which increases the challenge of storing, processing, encrypting, and transmitting these data over different communication channels. Video compression standards use state-of-the-art techniques to compress raw video sequences more efficiently; such techniques require high computational complexity and memory utilization. With the emergence of HEVC and video surveillance systems, many security risks arise, such as man-in-the-middle attacks and unauthorized disclosure. Such risks can be mitigated by encrypting the HEVC traffic. The most widely used encryption algorithm is the Advanced Encryption Standard (AES). Most of the computational complexity in hardware implementations of AES is due to the S-box, or sub-byte operation, because it needs many resources and has a non-linear structure. The proposed AES S-box ROM design targets the latest HEVC as used in homeland-security video surveillance systems. This paper presents different designs for an efficient VHDL ROM implementation of the AES S-box using the IP core generator, ROM components, and functions, all of which are supported by Xilinx. The IP core generator has a Block Memory Generator (BMG) component in its library, and the S-box IP core ROM is implemented using single-port block memory. The S-box lookup table is loaded into the ROM via the .coe file provided during initialization of the IP core ROM. The address is 8 bits wide to select among the 256 entries, and each entry is 8 bits wide, representing the data field of the ROM. The whole design is synthesized using Xilinx ISE Design Suite 14.7, while ModelSim (version 10.4a) is used for the simulation process. The proposed IP core ROM design shows better memory utilization than the non-IP core ROM design, making it more suitable for memory-intensive applications and for implementation using the FPGA ROM design. Hardware complexity, frequency, memory utilization, and delay are presented in this paper.
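As a software cross-check of the table such a ROM holds, the following Python snippet generates the standard FIPS-197 S-box (the GF(2^8) inverse followed by the affine transform) and performs the same 8-bit-address, 8-bit-data lookup the ROM implements; it is an illustration, not part of the VHDL design:

def rotl8(x, n):
    return ((x << n) | (x >> (8 - n))) & 0xFF

def build_sbox():
    sbox = [0] * 256
    p = q = 1
    while True:
        p = (p ^ (p << 1) ^ (0x1B if p & 0x80 else 0)) & 0xFF  # p <- p * 3 in GF(2^8)
        q ^= (q << 1) & 0xFF                                   # q <- q / 3, so q = p^-1
        q ^= (q << 2) & 0xFF
        q ^= (q << 4) & 0xFF
        if q & 0x80:
            q ^= 0x09
        # affine transform of the multiplicative inverse
        sbox[p] = q ^ rotl8(q, 1) ^ rotl8(q, 2) ^ rotl8(q, 3) ^ rotl8(q, 4) ^ 0x63
        if p == 1:
            break
    sbox[0] = 0x63   # zero has no inverse; only the affine constant remains
    return sbox

SBOX = build_sbox()
assert SBOX[0x00] == 0x63 and SBOX[0x53] == 0xED  # spot checks against FIPS-197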
2022-02-07
Elbahadır, Hamza, Erdem, Ebubekir.  2021.  Modeling Intrusion Detection System Using Machine Learning Algorithms in Wireless Sensor Networks. 2021 6th International Conference on Computer Science and Engineering (UBMK). :401–406.
Wireless sensor networks (WSNs) are used to sense data such as temperature, vibration, and pressure in the environment and to produce results; they are widely used, including in critical fields such as military, intelligence, and health. However, because WSNs have a different infrastructure and architecture than traditional networks, different security measures must be taken. In this study, an intrusion detection system (IDS) is modeled to ensure WSN security. Since signature-, misuse-, and anomaly-based detection methods are each insufficient to provide security alone, a hybrid model is proposed in which these methods are used together. In the hybrid model, anomaly rules were defined for attack detection, and the machine learning algorithms BayesNet, J48, and Random Forest were used to classify normal and abnormal traffic. Unlike the studies in the literature, CSE-CIC-IDS2018, the most up-to-date dataset, was used to create attack profiles. Considering both the hardware constraints and battery capacities of WSNs, the data was pre-processed in accordance with data mining principles. The results showed that the developed model has high accuracy and a low false alarm rate.
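The hybrid flow can be summarized in a short Python sketch (the rule, the features, and the scikit-learn Random Forest stand-in for the three classifiers are illustrative assumptions, not the paper's exact configuration): hand-written anomaly rules screen each flow first, and the trained classifier judges whatever the rules pass:

import numpy as np
from sklearn.ensemble import RandomForestClassifier

def anomaly_rules(flow):
    # example hand-written rule: an implausible packet rate for a sensor node
    return flow["pkts_per_s"] > 1000

X = np.random.rand(500, 10)            # placeholder CSE-CIC-IDS2018-style features
y = np.random.randint(0, 2, 500)       # 0 = normal, 1 = attack
clf = RandomForestClassifier().fit(X, y)

def detect(flow, features):
    if anomaly_rules(flow):
        return "alert (anomaly rule)"
    if clf.predict(features.reshape(1, -1))[0] == 1:
        return "alert (classifier)"
    return "normal"

print(detect({"pkts_per_s": 1500}, X[0]))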
Ben Abdel Ouahab, Ikram, Elaachak, Lotfi, Alluhaidan, Yasser A., Bouhorma, Mohammed.  2021.  A new approach to detect next generation of malware based on machine learning. 2021 International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies (3ICT). :230–235.
These days, malware attacks target different kinds of devices: IoT, mobiles, servers, even the cloud. They cause hardware damage and financial losses, especially for big companies, and represent a serious issue for cybersecurity specialists. In this paper, we propose a new approach to detect unknown malware families based on machine learning classification and a visualization technique. A malware binary is converted to a grayscale image, and a GIST descriptor of each image is used as input to the machine learning model. For the malware classification part we use three machine learning algorithms. These classifiers are efficient, with the highest precision reaching 98%. Once we have trained, tested, and evaluated the models, we move on to simulating two new malware families. We do not expect good predictions, since the models have not seen these families before; our goal is rather to analyze the behavior of our classifiers in the case of a new family. Finally, we propose an approach using a filter to determine whether a classification is normal or the sample is zero-day malware.
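The binary-to-image step is simple enough to show directly; this Python sketch (the file path and image width are hypothetical, and the GIST descriptor and classifiers are omitted) maps each byte of the binary to one grayscale pixel:

import numpy as np

def malware_to_grayscale(path, width=256):
    data = np.frombuffer(open(path, "rb").read(), dtype=np.uint8)
    rows = len(data) // width
    return data[:rows * width].reshape(rows, width)  # one byte = one pixel

# img = malware_to_grayscale("sample_malware.bin")   # hypothetical input file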
2022-08-12
Blanco, Geison, Perez, Juan, Monsalve, Jonathan, Marquez, Miguel, Esnaola, Iñaki, Arguello, Henry.  2021.  Single Snapshot System for Compressive Covariance Matrix Estimation for Hyperspectral Imaging via Lenslet Array. 2021 XXIII Symposium on Image, Signal Processing and Artificial Vision (STSIVA). :1–5.
Compressive Covariance Sampling (CCS) is a strategy used to recover the covariance matrix (CM) directly from compressive measurements. Several works have proven the advantages of CCS in Compressive Spectral Imaging (CSI), but most of these algorithms require multiple random projections of the scene to obtain good reconstructions. However, several low-resolution copies of the scene can be captured in a single snapshot through a lenslet array. For this reason, this paper proposes a sensing protocol and a single-snapshot CCS optical architecture using a lenslet array, based on the Dual Dispersive Aperture Spectral Imager (DD-CASSI), that allows recovery of the covariance matrix from a single snapshot. In this architecture, the lenslet array allows different projections of the scene to be obtained in one shot due to the special coded aperture. To validate the proposed approach, simulations evaluated the quality of the recovered CM and the performance in recovering the spectral signatures against traditional methods. Results show that image reconstructions using the CM have PSNR values of about 30 dB, and the reconstructed spectrum has a spectral angle mapper (SAM) error of less than 15° compared to the original spectral signatures.
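The SAM figure quoted above is the angle between reconstructed and reference spectral signatures; here is a minimal Python version of the metric (the vectors are illustrative stand-ins):

import numpy as np

def sam_degrees(reference, reconstructed):
    cos = np.dot(reference, reconstructed) / (
        np.linalg.norm(reference) * np.linalg.norm(reconstructed))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

ref = np.array([0.20, 0.50, 0.90, 0.40])   # stand-in spectral signature
rec = np.array([0.22, 0.48, 0.85, 0.43])   # stand-in reconstruction
print(f"SAM error: {sam_degrees(ref, rec):.2f} degrees")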
2021-01-25
Merouane, E. M., Escudero, C., Sicard, F., Zamai, E..  2020.  Aging Attacks against Electro-Mechanical Actuators from Control Signal Manipulation. 2020 IEEE International Conference on Industrial Technology (ICIT). :133–138.
The progress made in controller technologies, with the introduction of remote-access capability in digital controllers, has opened the door to new cybersecurity threats on Industrial Control Systems (ICSs). Among them, some aim at damaging the ICS's physical system. In this paper, a corrupted controller emitting a non-legitimate Pulse Width Modulation control signal to an Electro-Mechanical Actuator (EMA) is considered. The attacker's capabilities for accelerating the EMA's aging by inducing Partial Discharges (PDs) are investigated. A simplified model is considered for highlighting the influence of the carrier frequency of the control signal on the amplitude and repetition of the PDs involved in the EMA's aging.
2021-09-21
bin Asad, Ashub, Mansur, Raiyan, Zawad, Safir, Evan, Nahian, Hossain, Muhammad Iqbal.  2020.  Analysis of Malware Prediction Based on Infection Rate Using Machine Learning Techniques. 2020 IEEE Region 10 Symposium (TENSYMP). :706–709.
In this modern, technological age, the internet has been adopted by the masses, and with it, the danger of malicious attacks by cybercriminals has increased. These attacks are carried out via malware and have resulted in billions of dollars of financial damage, which makes the prevention of malicious attacks an essential part of the battle against cybercrime. In this paper, we apply machine learning algorithms to predict the malware infection rates of computers based on their features, using supervised machine learning and gradient boosting algorithms. We collected a publicly available dataset, which was divided into two parts: a training set and a testing set. After conducting four different experiments using the aforementioned algorithms, we found that LightGBM is the best model, with an AUC score of 0.73926.
2022-08-12
Chao, Wang, Qun, Li, XiaoHu, Wang, TianYu, Ren, JiaHan, Dong, GuangXin, Guo, EnJie, Shi.  2020.  An Android Application Vulnerability Mining Method Based On Static and Dynamic Analysis. 2020 IEEE 5th Information Technology and Mechatronics Engineering Conference (ITOEC). :599–603.
Given the respective advantages and limitations of static and dynamic analysis for vulnerability mining in Android applications, this paper proposes an Android application vulnerability mining method based on a combination of the two. First, static analysis is used to obtain baseline vulnerability analysis results for the application, and on this basis the input test cases for dynamic analysis are constructed. Fuzz testing is then carried out in a real-device environment, application security vulnerabilities are verified with taint analysis, and finally an application vulnerability report is produced. Experimental results show that, compared with static analysis alone, the method can significantly improve the accuracy of vulnerability mining.
2021-09-16
Almohri, Hussain M. J., Watson, Layne T., Evans, David.  2020.  An Attack-Resilient Architecture for the Internet of Things. IEEE Transactions on Information Forensics and Security. 15:3940–3954.
With current IoT architectures, once a single device in a network is compromised, it can be used to disrupt the behavior of other devices on the same network. Even though system administrators can secure critical devices in the network using best practices and state-of-the-art technology, a single vulnerable device can undermine the security of the entire network. The goal of this work is to limit the ability of an attacker to exploit a vulnerable device on an IoT network and fabricate deceitful messages to co-opt other devices. The approach is to limit attackers by using device proxies that are used to retransmit and control network communications. We present an architecture that prevents deceitful messages generated by compromised devices from affecting the rest of the network. The design assumes a centralized and trustworthy machine that can observe the behavior of all devices on the network. The central machine collects application layer data, as opposed to low-level network traffic, from each IoT device. The collected data is used to train models that capture the normal behavior of each individual IoT device. The normal behavioral data is then used to monitor the IoT devices and detect anomalous behavior. This paper reports on our experiments using both a binary classifier and a density-based clustering algorithm to model benign IoT device behavior with a realistic test-bed, designed to capture normal behavior in an IoT-monitored environment. Results from the IoT testbed show that both the classifier and the clustering algorithms are promising and encourage the use of application-level data for detecting compromised IoT devices.
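The anomaly test at the heart of this monitoring can be approximated with a density check in the spirit of DBSCAN's core-point rule; the sketch below is an assumed illustration (the features, radius, and neighbor count are placeholders), not the paper's trained models:

import numpy as np

benign = np.random.normal(0, 1, (200, 4))   # application-layer stats of one device

def is_anomalous(sample, benign_profile, radius=1.5, min_neighbors=5):
    # flag the sample if too few benign observations lie within the radius
    distances = np.linalg.norm(benign_profile - sample, axis=1)
    return int(np.sum(distances <= radius)) < min_neighbors

print(is_anomalous(np.zeros(4), benign))      # False: matches the benign profile
print(is_anomalous(np.full(4, 8.0), benign))  # True: off-profile behavior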
2021-06-28
Oualhaj, Omar Ait, Mohamed, Amr, Guizani, Mohsen, Erbad, Aiman.  2020.  Blockchain Based Decentralized Trust Management framework. 2020 International Wireless Communications and Mobile Computing (IWCMC). :2210–2215.
Blockchain is a technology for storing and transmitting information that is transparent, secure, and operates without central control. In this paper, we propose a new decentralized trust management and cooperation model in which data is shared via blockchain, and we explore the revenue distribution under different consensus schemes. To reduce the computational cost of the control mechanism, our proposal adopts Proof of Trust (PoT) and proof-of-stake-based trust in place of the proof-of-work (PoW) scheme to carry out the mining and storage of new data blocks. To detect nodes that behave maliciously by providing false system information, a trust updating algorithm is proposed.
2021-03-22
Kellogg, M., Schäf, M., Tasiran, S., Ernst, M. D..  2020.  Continuous Compliance. 2020 35th IEEE/ACM International Conference on Automated Software Engineering (ASE). :511–523.
Vendors who wish to provide software or services to large corporations and governments must often obtain numerous certificates of compliance. Each certificate asserts that the software satisfies a compliance regime, like SOC or the PCI DSS, to protect the privacy and security of sensitive data. The industry standard for obtaining a compliance certificate is an auditor manually auditing source code. This approach is expensive, error-prone, partial, and prone to regressions. We propose continuous compliance to guarantee that the codebase stays compliant on each code change using lightweight verification tools. Continuous compliance increases assurance and reduces costs. Continuous compliance is applicable to any source-code compliance requirement. To illustrate our approach, we built verification tools for five common audit controls related to data security: cryptographically unsafe algorithms must not be used, keys must be at least 256 bits long, credentials must not be hard-coded into program text, HTTPS must always be used instead of HTTP, and cloud data stores must not be world-readable. We evaluated our approach in three ways. (1) We applied our tools to over 5 million lines of open-source software. (2) We compared our tools to other publicly-available tools for detecting misuses of encryption on a previously-published benchmark, finding that only ours are suitable for continuous compliance. (3) We deployed a continuous compliance process at AWS, a large cloud-services company: we integrated verification tools into the compliance process (including auditors accepting their output as evidence) and ran them on over 68 million lines of code. Our tools and the data for the former two evaluations are publicly available.
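Two of the five controls lend themselves to a toy illustration; the regex checks below are deliberately crude stand-ins (far weaker than the verification tools the paper describes) that show the continuous-compliance shape: run the checks on every code change and fail the build on a finding:

import re

HTTP_URL = re.compile(r"http://[^\s\"']+")
HARDCODED_CRED = re.compile(r"(password|secret|api_key)\s*=\s*[\"'][^\"']+[\"']", re.I)

def compliance_findings(source: str):
    findings = []
    if HTTP_URL.search(source):
        findings.append("HTTP used where HTTPS is required")
    if HARDCODED_CRED.search(source):
        findings.append("credential hard-coded in program text")
    return findings

snippet = 'url = "http://example.com"\npassword = "hunter2"'
print(compliance_findings(snippet))   # both findings fire; a CI gate would fail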
2022-11-08
Wshah, Safwan, Shadid, Reem, Wu, Yuhao, Matar, Mustafa, Xu, Beilei, Wu, Wencheng, Lin, Lei, Elmoudi, Ramadan.  2020.  Deep Learning for Model Parameter Calibration in Power Systems. 2020 IEEE International Conference on Power Systems Technology (POWERCON). :1–6.
In power systems, having accurate device models is crucial for grid reliability, availability, and resiliency. Existing model calibration methods based on mathematical approaches often lead to multiple solutions due to the ill-posed nature of the problem, which would require further interventions from the field engineers in order to select the optimal solution. In this paper, we present a novel deep-learning-based approach for model parameter calibration in power systems. Our study focused on the generator model as an example. We studied several deep-learning-based approaches including 1-D Convolutional Neural Network (CNN), Long Short-Term Memory (LSTM), and Gated Recurrent Units (GRU), which were trained to estimate model parameters using simulated Phasor Measurement Unit (PMU) data. Quantitative evaluations showed that our proposed methods can achieve high accuracy in estimating the model parameters, i.e., achieved a 0.0079 MSE on the testing dataset. We consider these promising results to be the basis for further exploration and development of advanced tools for model validation and calibration.
2021-11-29
Jamieson, Laura, Moreno-Garcia, Carlos Francisco, Elyan, Eyad.  2020.  Deep Learning for Text Detection and Recognition in Complex Engineering Diagrams. 2020 International Joint Conference on Neural Networks (IJCNN). :1–7.
Engineering drawings such as Piping and Instrumentation Diagrams contain a vast amount of text data which is essential to identify shapes, pipeline activities, tags, amongst others. These diagrams are often stored in undigitised format, such as paper copy, meaning the information contained within the diagrams is not readily accessible to inspect and use for further data analytics. In this paper, we make use of recent deep learning advances by selecting models for both text detection and text recognition, and apply them to the digitisation of text within real-world complex engineering diagrams. Results show that 90% of text strings were detected, including vertical text strings; however, certain non-text diagram elements were also detected as text. Text strings were obtained by the text recognition method for 86% of detected text instances. The findings show that whilst the chosen deep learning methods were able to detect and recognise text in simple scenarios, more complex representations of text, including text strings located in close proximity to other drawing elements, remain a challenge.