Optimal Machine Learning Algorithms for Cyber Threat Detection
Title | Optimal Machine Learning Algorithms for Cyber Threat Detection |
Publication Type | Conference Paper |
Year of Publication | 2018 |
Authors | Farooq, H. M., Otaibi, N. M. |
Conference Name | 2018 UKSim-AMSS 20th International Conference on Computer Modelling and Simulation (UKSim) |
Keywords | advanced targeted cyber threats, anomaly detection, automated cyber threat detection, classification, Classification algorithms, Clustering algorithms, composability, cyber security, cyber threat detection model, data mining, Data Science, Decision trees, Deep Learning, Dimensionality, efficient machine, Ensemble Learning, exponential hike, false detection rates, false-positive detections, global Security Operations Center environments, Kernel, learning (artificial intelligence), Logistics, machine learning, machine learning algorithms, minimisation, ML-based analytics, Numerical Clustering, optimal machine learning algorithm, prediction, privacy, pubcrawl, regression, resilience, Resiliency, security, security data, security log analytics, security logs, security machine data, security of data, SoC |
Abstract | With the exponential hike in cyber threats, organizations are striving for better data mining techniques to analyze the security logs received from their IT infrastructures and ensure effective, automated cyber threat detection. Machine Learning (ML) based analytics for security machine data is the next emerging trend in cyber security, aimed at mining security data to uncover advanced targeted cyber threat actors and at minimizing the operational overhead of maintaining static correlation rules. However, selecting the optimal machine learning algorithm for security log analytics remains an impediment to the success of data science in cyber security, owing to the risk of large numbers of false-positive detections, especially in large-scale or global Security Operations Center (SOC) environments. This creates a dire need for an efficient machine-learning-based cyber threat detection model capable of minimizing false detection rates. In this paper, we propose optimal machine learning algorithms, together with an implementation framework, based on analytical and empirical evaluation of gathered results using various prediction, classification and forecasting algorithms. |
URL | https://ieeexplore.ieee.org/document/8588174 |
DOI | 10.1109/UKSim.2018.00018 |
Citation Key | farooq_optimal_2018 |
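The abstract describes ML-based analytics over security logs aimed at minimizing false detections. As a purely illustrative sketch (not the paper's method, and with hypothetical host names and counts), a minimal statistical anomaly detector could flag hosts whose daily event counts deviate strongly from the fleet baseline:

```python
import statistics

def flag_anomalies(event_counts, threshold=2.0):
    """Flag hosts whose event counts lie more than `threshold`
    population standard deviations from the fleet mean (z-score)."""
    values = list(event_counts.values())
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:  # all hosts identical: nothing stands out
        return []
    return [host for host, n in event_counts.items()
            if abs(n - mean) / stdev > threshold]

# Hypothetical per-host login-failure counts from one day of security logs
counts = {"web01": 12, "web02": 15, "db01": 9, "db02": 11,
          "app01": 13, "app02": 10, "app03": 14, "jump01": 480}
print(flag_anomalies(counts))  # → ['jump01']
```

A z-score baseline like this is the simplest member of the anomaly-detection family the keywords reference; the paper's point is that richer classification, clustering and forecasting algorithms are needed to keep false-positive rates tolerable at SOC scale.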