Visible to the public Biblio

Found 273 results

Filters: Keyword is Predictive models
2023-09-20
Kumar Sahoo, Goutam, Kanike, Keerthana, Das, Santos Kumar, Singh, Poonam.  2022.  Machine Learning-Based Heart Disease Prediction: A Study for Home Personalized Care. 2022 IEEE 32nd International Workshop on Machine Learning for Signal Processing (MLSP). :01—06.
This study develops a framework for personalized care that tackles heart disease risk using an at-home system. Timely and efficient detection of heart disease plays an important role in health care: it is essential to detect cardiovascular disease (CVD) as early as possible, so that a specialist can be consulted and medication started before the disease becomes severe. The machine learning models used to predict heart disease are Logistic Regression, K-Nearest Neighbor, Support Vector Machine, Naive Bayes, Decision Tree, Random Forest, and XGBoost. The performance of the proposed models was assessed using the Cleveland Heart Disease dataset from the UCI Machine Learning Repository. Among all the evaluated algorithms, Random Forest shows the best performance, with an accuracy score of 90.16%. The best-performing model can assess patient fitness at home in place of routine hospital visits. The proposed work will reduce the burden on hospitals and help them focus on critical patients.
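The abstract does not publish code; a minimal sketch of this kind of model comparison, using scikit-learn and a synthetic stand-in for the Cleveland dataset (13 features, binary label — the dataset, split ratio, and hyperparameters here are assumptions), might look like:

```python
# Hedged sketch: compare several classifiers, as the study does, on synthetic
# data shaped like the Cleveland Heart Disease dataset (13 features).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=300, n_features=13, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "K-Nearest Neighbor": KNeighborsClassifier(),
    "Random Forest": RandomForestClassifier(random_state=0),
}
scores = {name: accuracy_score(y_te, m.fit(X_tr, y_tr).predict(X_te))
          for name, m in models.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 4))
```

On the real dataset the study reports Random Forest winning at 90.16%; on synthetic data the ranking may differ.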
Shi, Yong.  2022.  A Machine Learning Study on the Model Performance of Human Resources Predictive Algorithms. 2022 4th International Conference on Applied Machine Learning (ICAML). :405—409.
A good ecological environment is crucial to attracting, cultivating, and retaining talent and to making it fully effective. This study addresses the mainstream problem of how to anticipate the turnover of excellent employees, so as to promote a sustainable and harmonious human resources ecology in enterprises facing a shortage of talent. Using an open dataset, the study carries out data preprocessing, model construction, and model optimization, and describes a set of enterprise employee turnover prediction models based on RapidMiner workflows. Data preprocessing is completed with the help of the statistical analysis software IBM SPSS Statistics and RapidMiner. Statistical charts, scatter plots, and boxplots are generated for data visualization and analysis. Machine learning, model application, performance vectors, and cross-validation are realized through RapidMiner's operators and workflows. The model algorithms include support vector machines, naive Bayes, decision trees, and neural networks, whose performance is compared in terms of accuracy, precision, recall, and F1-score. The decision tree model achieves the highest performance, and the evaluation results confirm the model's effectiveness for sustainable employee turnover prediction in human resource management.
2023-09-08
Zhong, Luoyifan.  2022.  Optimization and Prediction of Intelligent Tourism Data. 2022 IEEE 8th Intl Conference on Big Data Security on Cloud (BigDataSecurity), IEEE Intl Conference on High Performance and Smart Computing, (HPSC) and IEEE Intl Conference on Intelligent Data and Security (IDS). :186–188.
Tourism is one of the main sources of income in Australia, and the number of tourists affects airlines, hotels, and other stakeholders. Predicting tourist arrivals allows these stakeholders to prepare fully for welcoming visitors. This paper selects Queensland tourism data as the intelligent data source, visualizes it, establishes a seasonal ARIMA model, identifies its characteristics, and makes predictions. To improve prediction accuracy, a 10-layer back-propagation neural network model is also built on the same Queensland tourism data and is shown to perform well on this prediction task.
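The abstract gives neither the SARIMA orders nor the data. As a hedged illustration of the seasonal-forecasting idea, the seasonal-naive baseline (forecast = value from the same period one season earlier), which any seasonal ARIMA model should beat, can be sketched in a few lines; the figures below are made up, not real Queensland data:

```python
def seasonal_naive(history, season=4, steps=4):
    """Forecast `steps` future points by repeating the last full season."""
    last_season = history[-season:]
    return [last_season[i % season] for i in range(steps)]

# Toy quarterly arrivals (thousands of visitors, Q1..Q4 of two years).
arrivals = [120, 95, 80, 140,
            130, 100, 85, 150]
forecast = seasonal_naive(arrivals)
print(forecast)  # → [130, 100, 85, 150]
```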
2023-08-25
Nagabhushana Babu, B, Gunasekaran, M.  2022.  An Analysis of Insider Attack Detection Using Machine Learning Algorithms. 2022 IEEE 2nd International Conference on Mobile Networks and Wireless Communications (ICMNWC). :1—7.
Insider threat is among the greatest obstacles in cybersecurity and a well-known, massive issue. This class of anomaly calls for specialized detection techniques and resources that support the accurate and rapid detection of a harmful insider. Numerous studies on identifying insider threats and related topics have been conducted to tackle this problem, and various works have sought to improve the conceptual understanding of insider risks. Nevertheless, significant drawbacks remain, including a dearth of real-world cases, bias in decision-making, a lack of self-optimization in learning, and the absence of an investigation that addresses the conceptual, technological, and numerical facets of insider threats and their identification from a wide range of perspectives. The intention of this paper is to afford a thorough exploration of the categories, levels, and methodologies of modern insider threat detection based on machine learning techniques. Further, the approaches and evaluation metrics for machine learning-based predictive models are discussed. The paper concludes by outlining the difficulties encountered and offering suggestions for efficient threat identification using machine learning.
2023-08-23
Liang, Chenjun, Deng, Li, Zhu, Jincan, Cao, Zhen, Li, Chao.  2022.  Cloud Storage I/O Load Prediction Based on XB-IOPS Feature Engineering. 2022 IEEE 8th Intl Conference on Big Data Security on Cloud (BigDataSecurity), IEEE Intl Conference on High Performance and Smart Computing, (HPSC) and IEEE Intl Conference on Intelligent Data and Security (IDS). :54—60.
With the popularization of cloud computing and the deepening of its application, more and more cloud block storage systems have been put into use. Performance optimization of these systems has become an important challenge, as unbalanced resource loads degrade system performance. Accurately predicting the I/O load of a cloud block storage system can effectively avoid this load-imbalance problem. However, such systems are characterized by frequent random reads and writes and a large number of I/O requests, which makes prediction difficult. We therefore propose a novel I/O load prediction method based on XB-IOPS feature engineering. The features are designed according to the I/O request pattern, I/O size, and I/O interference, and support predicting both the actual load value at a given future moment and the average load value over a continuous future time interval. Validated on a real dataset from the Alibaba Cloud block storage system, the XB-IOPS feature engineering prediction model shows better prediction performance and shorter prediction time than other models on Alibaba Cloud block storage devices, where random I/O and small I/O dominate.
2023-08-18
Shen, Wendi, Yang, Genke.  2022.  An error neighborhood-based detection mechanism to improve the performance of anomaly detection in industrial control systems. 2022 International Conference on Mechanical, Automation and Electrical Engineering (CMAEE). :25—29.
Anomaly detection for devices (e.g., sensors and actuators) plays a crucial role in the security protection of Industrial Control Systems (ICS). The typical deep learning-based anomaly detection framework combines a model that predicts or reconstructs device states with a detection mechanism that determines anomalies. Most anomaly detection methods use a fixed-threshold mechanism to flag anomalous points; however, the anomalies caused by cyberattacks in ICSs are usually continuous anomaly segments. In this paper, we propose a novel detection mechanism for continuous anomaly segments. Its core idea is to determine the start and end times of anomalies based on the continuity characteristics of anomalies and the dynamics of the prediction error. We conducted experiments on two real-world datasets using five baselines. The F1 score increased by 3.8% on average on the SWaT dataset and by 15.6% on the WADI dataset. The results show that the error neighborhood-based continuity detection mechanism significantly improves the performance of the baselines in a real-time manner.
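The abstract does not specify the mechanism exactly; one hedged reading of an error-neighborhood continuity rule is: flag points whose prediction error exceeds a threshold, then merge flagged points separated by small gaps into continuous anomaly segments with explicit start and end times:

```python
def detect_segments(errors, threshold, neighborhood=2):
    """Merge above-threshold error points into (start, end) anomaly segments.
    Flagged points separated by gaps <= `neighborhood` join the same segment.
    (Illustrative sketch, not the paper's exact algorithm.)"""
    flagged = [i for i, e in enumerate(errors) if e > threshold]
    segments = []
    for i in flagged:
        if segments and i - segments[-1][1] <= neighborhood:
            segments[-1][1] = i          # extend the current segment
        else:
            segments.append([i, i])      # start a new segment
    return [tuple(s) for s in segments]

errors = [0.1, 0.9, 0.8, 0.2, 0.7, 0.1, 0.1, 0.1, 0.9]
print(detect_segments(errors, threshold=0.5))  # → [(1, 4), (8, 8)]
```

A fixed-threshold detector would report four isolated points here; the continuity rule groups the first three into one segment spanning the brief dip at index 3.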
Li, Shijie, Liu, Junjiao, Pan, Zhiwen, Lv, Shichao, Si, Shuaizong, Sun, Limin.  2022.  Anomaly Detection based on Robust Spatial-temporal Modeling for Industrial Control Systems. 2022 IEEE 19th International Conference on Mobile Ad Hoc and Smart Systems (MASS). :355—363.
Industrial Control Systems (ICS) are increasingly facing the threat of False Data Injection (FDI) attacks. As an emerging intrusion detection scheme for ICS, process-based Intrusion Detection Systems (IDS) can effectively detect the anomalies caused by FDI attacks. Specifically, such an IDS establishes an anomaly detection model that describes the normal pattern of industrial processes and then performs real-time anomaly detection on industrial process data. However, this method suffers from low detection accuracy due to the complexity and instability of industrial processes: the process data inherently contain sophisticated nonlinear spatial-temporal correlations that are hard to describe explicitly in an anomaly detection model, and the noise and disturbance in the data prevent the IDS from distinguishing real anomaly events. In this paper, we propose an Anomaly Detection approach based on Robust Spatial-temporal Modeling (AD-RoSM). Concretely, to explicitly describe the spatial-temporal correlations within the process data, we propose a neural state estimation model that uses a 1D CNN for temporal modeling and a multi-head self-attention mechanism for spatial modeling. To perform robust anomaly detection in the presence of noise and disturbance, we design a composite anomaly discrimination model that analyzes the outputs of the state estimation model with a combination of a threshold strategy and an entropy-based strategy. Extensive experiments on two benchmark ICS security datasets demonstrate the effectiveness of our approach.
2023-07-28
Abu-Khadrah, Ahmed.  2022.  An Efficient Fuzzy Logic Modelling of TiN Coating Thickness. 2022 International Conference on Business Analytics for Technology and Security (ICBATS). :1—5.
In this paper, fuzzy logic is implemented as a proposed approach for modelling the thickness of a thin Titanium Nitride (TiN) film as an output response. The layer was deposited using a Physical Vapor Deposition (PVD) process that applies a sputtering technique to coat insert cutting tools with TiN. A central composite design (CCD) was used for designing the optimal points of the experiment, and the experimental data collected from the PVD process were used to develop the fuzzy rules. Triangular membership functions (trimf) were used to build the fuzzy prediction model, which was validated in terms of residual error (e) and prediction accuracy (A). The developed fuzzy model with triangular membership functions achieved a low and acceptable average residual error of 0.2 and a high prediction accuracy of 90.04%. The results reveal that a rule-based fuzzy logic model can be an efficient approach to predicting coating layer thickness in TiN.
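The triangular membership function and the residual-error check are standard pieces that can be sketched independently of the paper's PVD data; the numbers below are illustrative assumptions:

```python
def trimf(x, a, b, c):
    """Triangular membership: 0 at a, rising to 1 at peak b, falling to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def residual_error(actual, predicted):
    """Mean absolute residual between measured and fuzzy-predicted outputs."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

print(trimf(1.5, 1.0, 2.0, 3.0))                     # → 0.5
print(round(residual_error([2.0, 3.0], [1.8, 3.2]), 6))  # → 0.2
```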
2023-07-21
Eze, Emmanuel O., Keates, Simeon, Pedram, Kamran, Esfahani, Alireza, Odih, Uchenna.  2022.  A Context-Based Decision-Making Trust Scheme for Malicious Detection in Connected and Autonomous Vehicles. 2022 International Conference on Computing, Electronics & Communications Engineering (iCCECE). :31—36.
The fast-evolving Intelligent Transportation Systems (ITS) are crucial in the 21st century, promising answers to the congestion and accidents that trouble people worldwide. ITS applications such as Connected and Autonomous Vehicles (CAVs) update and broadcast road incident event messages, which requires significant data to be transmitted between vehicles for decisions to be made in real time. However, broadcasting trusted incident messages, such as accident alerts, poses a challenge for CAVs. Most existing trust solutions evaluate the trustworthiness of received messages using the vehicle's direct-interaction-based reputation or psychological approaches. This paper provides a scheme that uses both direct and indirect interactions to improve trust in received incident alert messages, so that malicious alerts between CAVs can be detected in real-time decision-making. The paper applies artificial intelligence and statistical data classification to decide on the received messages, with a model trained on the US Department of Transportation Safety Pilot Model Deployment (SPMD) data. An Autonomous Decision-making Trust Scheme (ADmTS) incorporating a machine learning algorithm and a local trust manager for decision-making has been developed. The experiment showed that the trained model can make correct predictions, achieving 98% accuracy with a 0.55% standard deviation in predicting false alerts on data containing 25% malicious messages.
2023-06-30
Bhuyan, Hemanta Kumar, Arun Sai, T., Charan, M., Vignesh Chowdary, K., Brahma, Biswajit.  2022.  Analysis of classification based predicted disease using machine learning and medical things model. 2022 Second International Conference on Advances in Electrical, Computing, Communication and Sustainable Technologies (ICAECT). :1–6.
Health diseases have become seriously harmful in human life, owing to unhealthy food and disturbances in the working environment. Precise prediction and diagnosis of disease have become more serious and challenging tasks for primary prevention, recognition, and treatment. Based on these challenges, we propose Medical Things (MT) and machine learning models to solve healthcare problems with appropriate services in disease supervision, forecasting, and diagnosis. We developed a prediction framework with machine learning approaches that yields different categories of classification for the predicted disease. The framework is designed around a fuzzy model with a decision tree to lessen the data complexity. We considered heart disease for the experiments, and the experimental evaluation determined the prediction for the categories of classification. The number of decision trees (M) with samples (MS), leaf nodes (ML), and learning rate (I) is determined as MS=20
Azghandi, Seif.  2022.  Deterrence of Cycles in Temporal Knowledge Graphs. 2022 IEEE Aerospace Conference (AERO). :01–09.
A Temporal Knowledge Graph Embedding (TKGE) is an extensible (continuous vector space), time-sensitive data structure (a tree) used to predict future events given historical events. An event consists of the current state of a knowledge item (subject) and a transition (predicate) that morphs the knowledge into its next state (object). Prediction is accomplished when the historical event data conform to the structural model of Temporal Point Processes (TPP) and are then processed by the behavioral model of a Conditional Intensity Function (CIF). The formidable challenge in constructing and maintaining a TKGE is ensuring the absence of cycles when historical event data are structured as logical paths. Variations of depth-first search (DFS) are used in constructing TKGEs, albeit with the challenge of keeping the structure cycle-free. This article presents a simple (tradeoff-based) design that creates and maintains a single-rooted, isolated-paths TKGE: ipTKGE. In ipTKGE, isolated paths have their own (local) roots. The local roots trigger the breakdown of the traditionally constructed TKGE into isolated (independent) paths, alleviating the need for DFS or its variants. This approach comes at the expense of subject/object and predicate redundancies in ipTKGE. Isolated paths allow for simpler algorithmic detection and avoidance of potential cycles in the TKGE.
ISSN: 1095-323X
Pan, Xiyu, Mohammadi, Neda, Taylor, John E..  2022.  Smart City Digital Twins for Public Safety: A Deep Learning and Simulation Based Method for Dynamic Sensing and Decision-Making. 2022 Winter Simulation Conference (WSC). :808–818.
Technological innovations are expanding rapidly in the public safety sector, providing opportunities for more targeted and comprehensive urban crime deterrence and detection. Yet the spatial dispersion of crimes may vary over time, so it is unclear whether and how sensors can optimally impact crime rates. We developed a Smart City Digital Twin-based method to dynamically place license plate reader (LPR) sensors and improve their detection and deterrence performance. Utilizing continuously updated crime records, a convolutional long short-term memory algorithm predicted the areas where crimes were most likely to occur; a Monte Carlo traffic simulation then modeled suspect vehicle movements to determine the most likely routes for fleeing crime scenes. Dynamic LPR placement predictions were made weekly, capturing the spatiotemporal variation in crimes and enhancing LPR performance relative to static placement. We tested the proposed method in Warner Robins, GA, and the results support the method's promise in detecting and deterring crime.
ISSN: 1558-4305
2023-06-23
Choi, Hankaram, Bae, Yongchul.  2022.  Prediction of encoding bitrate for each CRF value using video features and deep learning. 2022 Joint 12th International Conference on Soft Computing and Intelligent Systems and 23rd International Symposium on Advanced Intelligent Systems (SCIS&ISIS). :1–2.

In this paper, we quantify elements representing video features and propose a deep learning approach to predicting the bitrate of compressed video. In particular, we use deep learning to overcome the limitation that the bitrate of a video encoded with a given Constant Rate Factor (CRF) cannot be predicted in advance. We identify video features that are related to the resulting bitrate when the video is compressed, and we confirm through various deep learning techniques that this relationship can be learned.

Sun, Haoran, Zhu, Xiaolong, Zhou, Conghua.  2022.  Deep Reinforcement Learning for Video Summarization with Semantic Reward. 2022 IEEE 22nd International Conference on Software Quality, Reliability, and Security Companion (QRS-C). :754–755.

Video summarization aims to improve the efficiency of large-scale video browsing by producing concise summaries. It is popular in many scenarios such as video surveillance, video review, and data annotation. Traditional video summarization techniques focus on filtering in the image-feature or image-semantics dimension. However, such techniques can lose a large amount of potentially useful information, especially for videos with rich text semantics such as interviews and teaching videos, because only information relevant to the image dimension is retained. To solve this problem, this paper treats video summarization as a continuous multi-dimensional decision-making process. Specifically, the summarization model predicts a probability for each frame and its corresponding text, and we design reward methods for each of them. Finally, comprehensive summaries in two dimensions, images and semantics, are generated. This approach is not only unsupervised, relying on neither labels nor user interaction, but also decouples the semantic and image summarization models to provide more usable interfaces for subsequent engineering use.

ISSN: 2693-9371

Rajin, S M Ataul Karim, Murshed, Manzur, Paul, Manoranjan, Teng, Shyh Wei, Ma, Jiangang.  2022.  Human pose based video compression via forward-referencing using deep learning. 2022 IEEE International Conference on Visual Communications and Image Processing (VCIP). :1–5.

To exploit the high temporal correlations among video frames of the same scene, the current frame is predicted from already-encoded reference frames using block-based motion estimation and compensation techniques. While this approach can efficiently exploit the translational motion of moving objects, it struggles with other types of affine motion and with object occlusion/deocclusion. Recently, deep learning has been used to model the high-level structure of human pose in specific actions from short videos and then generate virtual future frames by predicting the pose with a generative adversarial network (GAN). Modelling the high-level structure of human pose therefore makes it possible to exploit semantic correlation by predicting human actions and determining their trajectories. Video surveillance applications will benefit, as stored "big" surveillance data can be compressed by estimating human pose trajectories and generating future frames through semantic correlation. This paper explores a new way of video coding that models human pose from the already-encoded frames and uses the frame generated for the current time as an additional forward-referencing frame. The proposed approach is expected to overcome the limitations of traditional backward-referencing frames by predicting the blocks containing moving objects with lower residuals. Our experimental results show that the proposed approach can achieve on average up to 2.83 dB PSNR gain and 25.93% bitrate savings for high-motion video sequences compared to standard video coding.
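The reported gains are measured in PSNR and bitrate savings; PSNR is computed from the mean squared error between original and reconstructed pixels. A minimal sketch for 8-bit samples (the toy "frames" below are flattened pixel lists, not real video data):

```python
import math

def psnr(original, reconstructed, max_val=255):
    """Peak signal-to-noise ratio in dB between two equal-length pixel lists."""
    mse = sum((o - r) ** 2 for o, r in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")  # identical frames: no distortion
    return 10 * math.log10(max_val ** 2 / mse)

orig = [52, 55, 61, 59]
recon = [54, 55, 60, 57]
print(round(psnr(orig, recon), 2))  # → 44.61
```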

ISSN: 2642-9357

2023-06-22
Zhao, Wanqi, Sun, Haoyue, Zhang, Dawei.  2022.  Research on DDoS Attack Detection Method Based on Deep Neural Network Model in SDN. 2022 International Conference on Networking and Network Applications (NaNA). :184–188.
This paper studies Distributed Denial of Service (DDoS) attack detection using a Deep Neural Network (DNN) model in Software-Defined Networking (SDN). We first deploy a flow collector module to gather flow table entries. To improve the detection efficiency of the DNN model, we also design some features manually, in addition to the features obtained automatically from the flow table. We then use the preprocessed data to train the DNN model and make predictions. The overall detection framework is deployed in the SDN controller. The experimental results illustrate that the DNN model identifies attack traffic more accurately than traditional machine learning algorithms, laying a foundation for defense against DDoS attacks.
Das, Soumyajit, Dayam, Zeeshaan, Chatterjee, Pinaki Sankar.  2022.  Application of Random Forest Classifier for Prevention and Detection of Distributed Denial of Service Attacks. 2022 OITS International Conference on Information Technology (OCIT). :380–384.
Spotting Distributed Denial of Service (DDoS) attacks is a classification problem in machine learning. A Denial of Service (DoS) assault is essentially a deliberate attack launched from a single source with the intent of rendering the target's application unavailable. Attackers typically aim to consume all available network bandwidth, which blocks authorized users from accessing system resources and denies them service. DDoS assaults, in contrast to DoS attacks, involve several sources from which the attacker launches the attack. DDoS attacks are most frequently observed at the network, transport, presentation, and application layers of the 7-layer OSI architecture. With the help of a well-known standard dataset and multiple regression analysis, we have created a machine learning model that can predict DDoS and bot assaults based on traffic.
2023-05-30
Wang, Binbin, Wu, Yi, Guo, Naiwang, Zhang, Lei, Liu, Chang.  2022.  A cross-layer attack path detection method for smart grid dynamics. 2022 5th International Conference on Advanced Electronic Materials, Computers and Software Engineering (AEMCSE). :142—146.
With the intelligent development of power systems, the double-layer structure of the smart grid and the propagation of failures across layers have changed attack paths significantly: from single-layer to multi-layer, and from static to dynamic. To address the shortcoming of traditional attack path identification methods, which consider only single-layer attack paths, this paper proposes the idea of cross-layer attacks and integrates the threat propagation mechanism of the information layer with the failure propagation mechanism of the physical layer to establish a forward-backward bi-directional detection model. The model predicts possible cross-layer attack paths and evaluates their generation probabilities, providing theoretical guidance and technical support for defenders. The experimental results show that the proposed method can effectively identify dynamic cross-layer attacks in the smart grid.
2023-05-19
Wu, Jingyi, Guo, Jinkang, Lv, Zhihan.  2022.  Deep Learning Driven Security in Digital Twins of Drone Network. ICC 2022 - IEEE International Conference on Communications. :1—6.
This study explores the security issues and computational intelligence of a drone information system based on deep learning. Targeting the security of the drone system under attack, the study adopts an improved long short-term memory (LSTM) network to analyze cyber-physical system (CPS) data, predicting the system's control signal data before an attack occurs. Differential privacy frequent subgraph (DPFS) mining is introduced to keep data private, digital twin technology is used to map the drone's operating environment in physical space, and an attack prediction model for the drone digital twin CPS is constructed based on the differential privacy-improved LSTM. Finally, the Tennessee Eastman (TE) process is used as a simulation platform to verify the model's performance, and the proposed model is compared with the Bidirectional LSTM (BiLSTM) and Attention-BiLSTM models proposed by other scholars. The root mean square error (RMSE) of the proposed model is the smallest (0.20) when the number of hidden-layer nodes is 26, and comparison with actual flow values shows that the proposed algorithm is more accurate and fits better. The constructed drone attack prediction model thus achieves higher prediction accuracy and noticeably better robustness, providing an experimental basis for the future security and intelligent development of drone systems.
2023-05-12
Ponce-de-Leon, Hernán, Kinder, Johannes.  2022.  Cats vs. Spectre: An Axiomatic Approach to Modeling Speculative Execution Attacks. 2022 IEEE Symposium on Security and Privacy (SP). :235–248.

The SPECTRE family of speculative execution attacks has required a rethinking of formal methods for security. Approaches based on operational speculative semantics have made initial inroads towards finding vulnerable code and validating defenses. However, with each new attack grows the amount of microarchitectural detail that has to be integrated into the underlying semantics. We propose an alternative, lightweight and axiomatic approach to specifying speculative semantics that relies on insights from memory models for concurrency. We use the CAT modeling language for memory consistency to specify execution models that capture speculative control flow, store-to-load forwarding, predictive store forwarding, and memory ordering machine clears. We present a bounded model checking framework parameterized by our speculative CAT models and evaluate its implementation against the state of the art. Due to the axiomatic approach, our models can be rapidly extended to allow our framework to detect new types of attacks and validate defenses against them.

ISSN: 2375-1207

Naseri, Amir Mohammad, Lucia, Walter, Youssef, Amr.  2022.  A Privacy Preserving Solution for Cloud-Enabled Set-Theoretic Model Predictive Control. 2022 European Control Conference (ECC). :894–899.
Cloud computing solutions enable Cyber-Physical Systems (CPSs) to utilize significant computational resources and implement sophisticated control algorithms even if limited computation capabilities are locally available for these systems. However, such a control architecture suffers from an important concern related to the privacy of sensor measurements and the computed control inputs within the cloud. This paper proposes a solution that allows implementing a set-theoretic model predictive controller on the cloud while preserving this privacy. This is achieved by exploiting the offline computations of the robust one-step controllable sets used by the controller and two affine transformations of the sensor measurements and control optimization problem. It is shown that the transformed and original control problems are equivalent (i.e., the optimal control input can be recovered from the transformed one) and that privacy is preserved if the control algorithm is executed on the cloud. Moreover, we show how the actuator can take advantage of the set-theoretic nature of the controller to verify, through simple set-membership tests, if the control input received from the cloud is admissible. The correctness of the proposed solution is verified by means of a simulation experiment involving a dual-tank water system.
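The paper's set-theoretic MPC transformation is more involved than an abstract can convey; the core idea of affine masking can be illustrated on a simple scalar state-feedback law (an assumption for illustration, not the paper's controller): the plant sends a transformed measurement to the cloud, the cloud computes with a correspondingly transformed gain, and the true control input is recovered exactly:

```python
# Toy sketch of privacy via affine masking: the cloud never sees the true
# state x or the true gain K, yet returns the correct control input u = K*x.
a, b = 3.0, 7.0        # secret affine transformation, known only to the plant
K = -0.8               # secret feedback gain of the control law u = K * x

def cloud_controller(x_masked, K_masked, offset):
    # The cloud operates on masked quantities only.
    return K_masked * x_masked + offset

x = 5.0                      # true sensor measurement (private)
x_masked = a * x + b         # what the cloud actually receives
K_masked = K / a             # transformed gain, shared once offline
offset = -K * b / a          # cancels the bias term b of the mask
u = cloud_controller(x_masked, K_masked, offset)
print(u, K * x)  # the recovered input equals the true u = K*x
```

Algebraically, (K/a)(a·x + b) − K·b/a = K·x, so the transformed and original problems are equivalent, mirroring the equivalence claim in the abstract.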
2023-04-28
Hao, Wei, Shen, Chuanbao, Yang, Xing, Wang, Chao.  2022.  Intelligent Penetration and Attack Simulation System Based on Attack Chain. 2022 15th International Symposium on Computational Intelligence and Design (ISCID). :204–207.
Vulnerability assessment is an important process for network security. However, most commonly used vulnerability assessment methods still rely on expert experience or rule-based automated scripts, which can hardly meet the security requirements of increasingly complex network environments. Although scientists and engineers have made great progress on artificial intelligence in both theory and practice in recent years, it remains challenging to build mature, high-quality intelligent products in the field of network security, especially for penetration testing-based vulnerability assessment in enterprises. Therefore, to realize intelligent penetration testing, Vul.AI, drawing on its years of experience in cyber attack and defense, has designed and developed an intelligent penetration and attack simulation system, Ai.Scan, based on attack chains, knowledge graphs, and related evaluation algorithms. This paper introduces the realization principle, main functions, and application scenarios of Ai.Scan in detail.
ISSN: 2473-3547
2023-04-14
Barakat, Ghena, Al-Duwairi, Basheer, Jarrah, Moath, Jaradat, Manar.  2022.  Modeling and Simulation of IoT Botnet Behaviors Using DEVS. 2022 13th International Conference on Information and Communication Systems (ICICS). :42–47.
The ubiquitous nature of Internet of Things (IoT) devices and their wide-scale deployment have remarkably attracted hackers, who exploit weakly configured and vulnerable devices to form large IoT botnets and launch unprecedented attacks. Modeling the behavior of IoT botnets leads to a better understanding of their spreading mechanisms and of the state of the network at different stages of an attack. In this paper, we propose a generic model to capture the behavior of IoT botnets. The proposed model uses Markov chains to study botnet behavior, and the Discrete Event System Specification (DEVS) environment is used to simulate the model.
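The abstract does not list the chain's states or probabilities; as a hedged sketch of the Markov-chain idea, a toy device life cycle (states and transition probabilities below are illustrative assumptions) can be simulated in a few lines:

```python
import random

# Toy Markov-chain model of one IoT device's botnet life cycle.
TRANSITIONS = {
    "vulnerable": [("vulnerable", 0.7), ("infected", 0.3)],
    "infected":   [("infected", 0.6), ("attacking", 0.3), ("patched", 0.1)],
    "attacking":  [("attacking", 0.8), ("patched", 0.2)],
    "patched":    [("patched", 1.0)],   # absorbing state: device stays patched
}

def step(state, rng):
    """Sample the next state from the current state's transition distribution."""
    r, cum = rng.random(), 0.0
    for nxt, p in TRANSITIONS[state]:
        cum += p
        if r < cum:
            return nxt
    return state

rng = random.Random(42)  # seeded for reproducibility
state, history = "vulnerable", ["vulnerable"]
for _ in range(20):
    state = step(state, rng)
    history.append(state)
print(history[-1], "steps spent attacking:", history.count("attacking"))
```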
ISSN: 2573-3346
2023-03-31
Khelifi, Hakima, Belouahri, Amani.  2022.  The Impact of Big Data Analytics on Traffic Prediction. 2022 International Conference on Advanced Aspects of Software Engineering (ICAASE). :1–6.
The Internet of Vehicles (IoV) is driving a rapid expansion of connected devices. This massive number of devices constantly generates a massive, near-real-time data stream for numerous applications, known as big data. Analyzing such big data to find, predict, and control decisions is a critical way for the IoV to enhance service quality and user experience. The main goal of this paper is therefore to study the impact of big data analytics on traffic prediction in the IoV. We apply big data analytics steps to predict traffic flow based on different deep neural models, namely LSTM, CNN-LSTM, and GRU, and validate the models using the evaluation metrics MAE, MSE, RMSE, and R2. A case study based on a real-world road is used to implement and test the efficiency of the traffic prediction models.
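The four validation metrics named in the abstract have standard definitions that can be sketched in a few lines (the traffic-flow values below are illustrative, not the paper's data):

```python
import math

def metrics(y_true, y_pred):
    """MAE, MSE, RMSE, and R2 between observed and predicted values."""
    n = len(y_true)
    mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
    rmse = math.sqrt(mse)
    mean = sum(y_true) / n
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    r2 = 1 - n * mse / ss_tot   # 1 - SS_res / SS_tot
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "R2": r2}

# Toy traffic-flow counts (vehicles per interval).
m = metrics([10.0, 12.0, 14.0, 16.0], [11.0, 12.0, 13.0, 17.0])
print({k: round(v, 3) for k, v in m.items()})
```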
Soderi, Mirco, Kamath, Vignesh, Breslin, John G..  2022.  A Demo of a Software Platform for Ubiquitous Big Data Engineering, Visualization, and Analytics, via Reconfigurable Micro-Services, in Smart Factories. 2022 IEEE International Conference on Smart Computing (SMARTCOMP). :1–3.
Intelligent, smart, Cloud-based, reconfigurable manufacturing and remote monitoring all intersect in modern industry and mark the path toward more efficient, effective, and sustainable factories. Many obstacles are found along this path, including legacy machinery and technologies, security issues, and software that is often hard, slow, and expensive to adapt to unforeseen challenges and needs in this fast-changing ecosystem. Lightweight, portable, loosely coupled, easily monitored, variegated software components, supporting Edge, Fog, and Cloud computing, that can be (re)created, (re)configured, and operated remotely through Web requests in a matter of milliseconds, and that rely on libraries of ready-to-use tasks also extendable remotely through sub-second Web requests, constitute a fertile technological ground on which fourth-generation industries can be built. This demo shows how, starting from a completely virgin Docker Engine, it is possible to build, configure, destroy, rebuild, and operate, exclusively from remote and exclusively via API calls, computation networks that can (i) raise alerts based on configured thresholds or trained ML models, (ii) transform Big Data streams, (iii) produce and persist Big Datasets on the Cloud, (iv) train and persist ML models on the Cloud, (v) use trained models for one-shot or stream predictions, and (vi) produce tabular visualizations, line plots, pie charts, and histograms in real time from Big Data streams. It also shows how easily such computation networks can be upgraded with new functionalities in real time, from remote, via API calls.
ISSN: 2693-8340