Biblio
Research on Information Security Protection of Industrial Internet Oriented CNC System. 2022 IEEE 6th Information Technology and Mechatronics Engineering Conference (ITOEC). 6:1818–1822.
2022. The machine tool is known as the mother of industry, and the CNC machine tool is the embodiment of modern automatic-control productivity. In the context of the rapid development of the industrial Internet, a large number of devices and systems are interconnected through the industrial Internet, realizing flexible adaptation from the supply side to the demand side. As a typical core system of the industrial Internet, the CNC system faces the threat of industrial viruses and network attacks, and the problem of information security is becoming more and more prominent. This paper analyzes the security risks of existing CNC systems from the aspects of terminal security, data security, and network security. By comprehensively applying data encryption, identity authentication, digital signatures, access control, secure communication, and key management, the paper puts forward a targeted security protection and management scheme that effectively strengthens overall protection capability.
ISSN: 2693-289X
A virtualization-based security architecture for industrial control systems. 2022 7th IEEE International Conference on Data Science in Cyberspace (DSC). :94–101.
2022. The Industrial Internet expands the attack surface of industrial control systems (ICS), bringing cybersecurity threats to industrial controllers located in operational technology (OT) networks. Honeypot technology is an important means of detecting network attacks. However, existing honeypot systems cannot simulate business logic and have difficulty resisting highly concealed APT attacks. This paper proposes a high-fidelity ICS security defense framework based on virtualization technology. The framework uses virtualization to build twins for protected control systems. The architecture can infer the execution results of control instructions in advance based on actual production data, so as to discover hidden attack behaviors in time. This paper designs and implements a prototype system and demonstrates the effectiveness and potential of this architecture for ICS security.
10 Gigabit industrial thermal data acquisition and storage solution based on software-defined network. 2022 7th IEEE International Conference on Data Science in Cyberspace (DSC). :616–619.
2022. With the wide application of Internet technology in the industrial control field, industrial control networks are growing ever larger, the industrial data generated by industrial control systems is increasing dramatically, and the performance requirements of acquisition and storage systems are rising accordingly. Collecting and analyzing industrial equipment work logs and industrial time-series data enables comprehensive management and continuous monitoring of industrial control system status, as well as intrusion detection and energy-efficiency analysis based on traffic and data. Faced with increasingly large real-time industrial data, existing log collection systems and time-series data gateways suffer from packet loss and similar phenomena [1] and cannot preserve industrial control network thermal data completely. The emergence of software-defined networking provides a new way to realize massive thermal data collection in industrial control networks. This paper proposes a 10-gigabit industrial thermal data acquisition and storage scheme based on software-defined networking, which uses software-defined networking technology to solve the problem of insufficient performance of existing gateways.
Data traceability scheme of industrial control system based on digital watermark. 2022 7th IEEE International Conference on Data Science in Cyberspace (DSC). :322–325.
2022. The fourth industrial revolution has led to the rapid development of industrial control systems. While the large number of industrial system devices connected to the Internet provides convenience for production management, it also exposes industrial control systems to more attack surfaces. Under the influence of multiple attack surfaces, sensitive data leakage has a more serious and longer-lasting negative impact on industrial production systems. Quickly locating the source of an information leak plays a crucial role in reducing the loss from an attack, so there are new requirements for tracing sensitive data in industrial control information systems. In this paper, we propose a digital watermarking traceability scheme for sensitive data in industrial control systems to address the above problems. In this scheme, we enhance the granularity of traceability by classifying sensitive industrial control system data into text, image, and video data with differentiated processing; achieve accurate positioning of data sources by combining technologies such as national cryptographic standard (SM) asymmetric encryption and hash message authentication codes; and mitigate attacks against watermark traceability such as obfuscation attacks and copy attacks. At the same time, the scheme places a data-flow watermark monitoring module on the node downstream of the data source to monitor unauthorized access to sensitive data caused by other attacks.
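As a rough illustration of the hash message authentication code (HMAC) component the abstract mentions, the sketch below derives a per-recipient watermark tag so a leaked copy can be traced back to its recipient. All names and parameters here are illustrative assumptions, not the paper's actual scheme.

```python
import hmac
import hashlib

SECRET_KEY = b"tracing-authority-secret"  # held only by the tracing authority (assumed)

def watermark_tag(document_id, recipient_id):
    """Tag bound to both the document and the recipient via HMAC-SHA256."""
    msg = f"{document_id}:{recipient_id}".encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()[:16]

def trace_leak(document_id, leaked_tag, recipients):
    """Recompute tags for all known recipients and match the leaked one."""
    for r in recipients:
        if hmac.compare_digest(watermark_tag(document_id, r), leaked_tag):
            return r
    return None

tag = watermark_tag("plc-config-007", "operator-B")
print(trace_leak("plc-config-007", tag, ["operator-A", "operator-B"]))  # operator-B
```

Because the tag depends on a key the attacker does not hold, a copied or obfuscated tag from another document fails verification, which is the property a copy-attack mitigation relies on.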
Exploration of the principle of 6G communication technology and its development prospect. 2022 International Conference on Electronics and Devices, Computational Science (ICEDCS). :100–103.
2022. Nowadays, 5G is widely used in various fields, and people are starting to turn their attention to 6G. This paper first describes the principles and performance of 6G in detail and introduces its key technologies, cavity technology and THz technology. Based on the high performance targets of 6G, we then study the application changes 6G may bring; for example, 6G technology will make remote surgery and remote control possible, and 6G will speed up the interconnection of everything, allowing closer and faster connections between cars. Next, virtual reality is discussed: 6G technology will enable better development of virtual reality technology and enhance people's immersive experience. Finally, we present the issues that need to be addressed with 6G technology, such as cybersecurity and energy requirements, as well as the higher-level challenges facing 6G, such as connectivity and communication at a larger social scale.
6G toward Metaverse: Technologies, Applications, and Challenges. 2022 IEEE VTS Asia Pacific Wireless Communications Symposium (APWCS). :6–10.
2022. Metaverse opens up a new social networking paradigm where people can experience a real interactive feeling without physical space constraints. Social interactions are gradually evolving from text combined with pictures and videos to 3-dimensional virtual reality, making the social experience increasingly physical, implying that more metaverse applications with immersive experiences will be developed in the future. However, the increasing data dimensionality and volume for new metaverse applications present a significant challenge in data acquisition, security, and sharing. Furthermore, metaverse applications require high capacity and ultrareliability for the wireless system to guarantee the quality of user experience, which cannot be addressed in the current fifth-generation system. Therefore, reaching the metaverse is dependent on the revolution in the sixth-generation (6G) wireless communication, which is expected to provide low-latency, high-throughput, and secure services. This article provides a comprehensive view of metaverse applications and investigates the fundamental technologies for the 6G toward metaverse.
Data-Driven Digital Twins in Surgery utilizing Augmented Reality and Machine Learning. 2022 IEEE International Conference on Communications Workshops (ICC Workshops). :580–585.
2022. On the one hand, laparoscopic surgery as a state-of-the-art medical method is minimally invasive and thus less stressful for patients. On the other hand, laparoscopy places higher demands on physicians, such as mental load and preparation time, so appropriate technical support is essential for quality and success. Medical Digital Twins provide an integrated and virtual representation of patient and organ data, and thus a generic concept for making complex information accessible to surgeons. In this way, minimally invasive surgery could be improved significantly, but this requires a much more complex software system to achieve the various resulting requirements. The biggest challenges for these systems are the safe and precise mapping of the digital twin to reality, i.e., dealing with deformations, movement, and distortions, as well as balancing the competing requirements of intuitive, immersive user access and security. The case study ARAILIS is presented as a proof of concept for such a system and provides a starting point for further research. Based on the insights delivered by this prototype, a vision for future Medical Digital Twins in surgery is derived and discussed.
ISSN: 2694-2941
Cybersecurity Education in the Age of Artificial Intelligence: A Novel Proactive and Collaborative Learning Paradigm. 2022 IEEE Frontiers in Education Conference (FIE). :1–5.
2022. This Innovative Practice Work-in-Progress paper presents a virtual, proactive, and collaborative learning paradigm that can engage learners with different backgrounds and enable effective retention and transfer of multidisciplinary AI-cybersecurity knowledge. While progress has been made in understanding the trustworthiness and security of artificial intelligence (AI) techniques, little has been done to translate this knowledge into education and training. There is a critical need to foster a qualified cybersecurity workforce that understands the usefulness, limitations, and best practices of AI technologies in the cybersecurity domain. To address this important issue, our proposed learning paradigm leverages multidisciplinary expertise in cybersecurity, AI, and statistics to systematically investigate two cohesive research and education goals. First, we develop an immersive learning environment that motivates students to explore AI/machine learning (ML) development in the context of real-world cybersecurity scenarios by constructing learning models with tangible objects. Second, we design a proactive education paradigm that uses hackathon activities based on game-based learning, lifelong learning, and social constructivism. The proposed paradigm will benefit a wide range of learners, especially underrepresented students, and will also help the general public understand the security implications of AI. In this paper, we describe the proposed learning paradigm and present the current progress of this ongoing research. At the current stage, we focus on the first research and education goal and have been leveraging the cost-effective Minecraft platform to develop an immersive learning environment where learners can explore emerging AI/ML concepts by constructing related learning modules through interaction with tangible AI/ML building blocks.
ISSN: 2377-634X
Exploring the effects of segmentation when learning with Virtual Reality and 2D displays: a study with airport security officers. 2022 IEEE International Carnahan Conference on Security Technology (ICCST). :1–1.
2022. With novel 3D imaging technology based on computed tomography (CT) set to replace the current 2D X-ray systems, airports face the challenge of adequately preparing airport security officers (screeners) through knowledge building. Virtual reality (VR) bears the potential to greatly facilitate this process by allowing learners to experience and engage in immersive virtual scenarios as if they were real. However, while general aspects of immersion have been explored frequently, less is known about the benefits of immersive technology for instructional purposes in practical settings such as airport security. In the present study, we evaluated how different display technologies (2D vs. VR) and segmentation (system-paced vs. learner-paced) affected screeners' objective and subjective knowledge gain, cognitive load, and aspects of motivation and technology acceptance. Employing a 2 x 2 between-subjects design, four experimental groups experienced uniform learning material featuring information about 3D CT technology and its application in airport security: 2D system-paced, 2D learner-paced, VR system-paced, and VR learner-paced. The instructional material was presented as an 11-minute multimedia lesson featuring words (i.e., narration, onscreen text) and pictures in dynamic form (i.e., video, animation). Participants in the learner-paced groups were prompted to initialize the next section of the multimedia lesson by pressing a virtual button after short segments of information. Additionally, a control group experiencing no instructional content was included to evaluate the effectiveness of the instructional material. The data was collected at an international airport with screeners having no prior 3D CT experience (n=162). The results show a main effect of segmentation on objective learning outcomes (favoring system-paced) and a main effect of display technology on germane cognitive load (favoring 2D). These results contradict the expected benefits of VR and segmentation, respectively. Overall, the present study offers valuable insight into how to implement instructional material for a practical setting.
ISSN: 2153-0742
An OpenPLC-based Active Real-time Anomaly Detection Framework for Industrial Control Systems. 2022 China Automation Congress (CAC). :5899–5904.
2022. In recent years, the design of anomaly detectors has attracted a tremendous surge of interest due to security issues in industrial control systems (ICS). Restricted by hardware resources, most anomaly detectors can only be deployed at remote monitoring ends, far away from the control sites, which brings potential threats to anomaly detection. In this paper, we propose an active real-time anomaly detection framework deployed in the controller of OpenPLC, a standardized open-source PLC with high scalability. Specifically, we add adaptive active noise to the control signals, identify a linear dynamic system model of the plant offline, and implement it in the controller. Finally, we design two filters to process the estimated residuals based on the obtained model and use a χ² detector for anomaly detection. Extensive experiments conducted on an industrial control virtual platform show the effectiveness of the proposed detection framework.
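A hedged sketch of the χ² detection step the abstract describes (the threshold, covariance, and dimensions are illustrative assumptions, not the paper's design): the detector flags an anomaly when the normalized squared residual between measured and model-predicted output exceeds a chi-square critical value.

```python
import numpy as np

THRESHOLD = 5.991  # chi-square critical value, 2 degrees of freedom, alpha = 0.05

def chi2_statistic(residual, cov):
    """g = r^T * inv(Sigma) * r for residual r with covariance Sigma."""
    r = np.asarray(residual, dtype=float)
    return float(r @ np.linalg.inv(cov) @ r)

def is_anomalous(residual, cov, threshold=THRESHOLD):
    return chi2_statistic(residual, cov) > threshold

cov = np.diag([0.5, 0.5])  # assumed residual covariance from offline identification
print(is_anomalous([0.1, -0.2], cov))  # small residual: False
print(is_anomalous([3.0, 2.5], cov))   # large residual: True
```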
The Use of Blockchain for Digital Identity Management in Healthcare. 2022 10th International Conference on Cyber and IT Service Management (CITSM). :1–6.
2022. Digitalization has occurred in almost all industries, one of them being the health industry. Patients' medical records are now easier to access and manage, as all related data are stored in data storages or repositories. However, this system is still under development as the number of patients keeps increasing. A lack of standardization might lead to patients losing the right to control their own data. Therefore, implementing a private blockchain system with the Self-Sovereign Identity (SSI) concept for identity management in the health industry is a viable notion. With SSI, patients benefit from having control over their own medical records, stored under a stronger security protocol, while healthcare providers benefit in the Know Your Customer (KYC) process when they handle new patients who move from other healthcare providers: it eliminates and shortens the process of updating patients' medical records from previous providers. We therefore suggest several flows for implementing blockchain for digital identity in the healthcare industry to help overcome the lack of patient data control and the KYC issues in the current system. Nevertheless, implementing blockchain in the health industry requires full attention from the surrounding system and stakeholders to be realized.
Digital Forensic Analysis on Caller ID Spoofing Attack. 2022 7th International Workshop on Big Data and Information Security (IWBIS). :95–100.
2022. Misuse of caller ID spoofing combined with social engineering has the potential to serve as a means of committing other crimes, such as fraud, theft, leaking sensitive information, and spreading hoaxes. Appropriate forensic techniques must be applied to support the verification and collection of evidence related to these crimes. In this research, a digital forensic analysis was carried out on the BlueStacks emulator, a Redmi 5A smartphone, and a SIM card, the devices belonging to the victim and the attacker used to carry out caller ID spoofing attacks. The forensic analysis uses the NIST SP 800-101 R1 guide and the forensic tools FTK Imager, Oxygen Forensic Detective, and Paraben's E3. This research aims to determine the artifacts resulting from caller ID spoofing attacks to assist in mapping and finding digital evidence. The result is a list of digital evidence findings in the form of a history of outgoing calls, incoming calls, the caller ID of the call's source, the caller ID of the call's destination, the time the call started, the time the call ended, the duration of the call, IMSI, ICCID, ADN, and TMSI.
Odd-Even Hash Algorithm: A Improvement of Cuckoo Hash Algorithm. 2021 Ninth International Conference on Advanced Cloud and Big Data (CBD). :1–6.
2022. Hash-based data structures and algorithms are currently flourishing on the Internet. They are an effective way to store large amounts of information, especially for applications related to measurement, monitoring, and security. At present there are many hash table algorithms, such as Cuckoo Hash, Peacock Hash, Double Hash, Link Hash, and D-left Hash. However, these hash table algorithms still have problems, such as excessive memory use, long insertion and query operations, and insertion failures caused by infinite loops that require rehashing. This paper improves the kick-out mechanism of the Cuckoo Hash algorithm and proposes a new hash table structure, the Odd-Even Hash (OE Hash) algorithm. The experimental results show that the OE Hash algorithm is more efficient than the existing Link Hash, Linear Hash, Cuckoo Hash, and other algorithms. The OE Hash algorithm balances query time and insertion time while occupying the least space, and there is no insertion failure that leads to rehashing, making it suitable for massive data storage.
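For readers unfamiliar with the kick-out mechanism the paper improves upon, here is a toy two-table cuckoo hash in Python (table size, hash functions, and kick limit are arbitrary demo choices): on a collision, the resident key is evicted to its alternate table, possibly displacing another key in turn.

```python
import zlib

SIZE = 11
MAX_KICKS = 50
t1 = [None] * SIZE
t2 = [None] * SIZE

def h1(key):
    return zlib.crc32(key.encode()) % SIZE

def h2(key):
    return (zlib.crc32(key.encode()) // SIZE) % SIZE

def insert(key):
    for _ in range(MAX_KICKS):
        i = h1(key)
        key, t1[i] = t1[i], key      # place key in table 1, evicting any resident
        if key is None:              # slot was free: done
            return True
        j = h2(key)
        key, t2[j] = t2[j], key      # evicted key is kicked to its slot in table 2
        if key is None:
            return True
    return False                     # kick loop detected: a real table would rehash

def lookup(key):
    return t1[h1(key)] == key or t2[h2(key)] == key

for k in ["flow-a", "flow-b", "flow-c"]:
    insert(k)
print(all(lookup(k) for k in ["flow-a", "flow-b", "flow-c"]))  # True
```

The rehash-on-kick-loop case in `insert` is exactly the failure mode the OE Hash algorithm aims to eliminate.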
A Safe Approach to Sensitive Dropout Data Collection Systems by Utilizing Homomorphic Encryption. 2022 International Symposium on Information Technology and Digital Innovation (ISITDI). :168–171.
2022. The student's own fault is not the only cause of dropping out of school; dropout cases are often attributed only to overly general problems, while sensitive issues that could harm certain parties, such as an institution's reputation, are usually not made public. To overcome this, an in-depth analysis of these cases is needed for proper handling. Many risks are associated with creating a single repository for this sensitive information, so encryption is required to ensure data is not leaked. However, encryption at rest and in transit is insufficient, as data leakage is a considerable risk during processing. In addition, there is a risk of abuse of authority by insiders, so no single entity should have access to all data. Homomorphic encryption presents a viable solution to this challenge: data may be aggregated under its protection, making the data available for computation without being decrypted first and without the risk of a single plaintext repository.
DPP: Data Privacy-Preserving for Cloud Computing based on Homomorphic Encryption. 2022 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC). :29–32.
2022. Cloud computing has been widely used because of its low price, high reliability, and generality of services. However, considering that cloud computing transactions between users and service providers are usually asynchronous, data privacy issues involving users and service providers may lead to a crisis of trust, which in turn hinders the expansion of cloud computing applications. In this paper, we propose DPP, a data privacy-preserving cloud computing scheme based on homomorphic encryption, which achieves correctness, compatibility, and security. DPP implements data privacy preservation by introducing homomorphic encryption. To verify the security of DPP, we instantiate DPP based on the Paillier homomorphic encryption scheme and evaluate its performance. The experimental results show that the time consumed by the key steps of the DPP scheme is reasonable and acceptable.
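The additive homomorphism that Paillier-based schemes like DPP rely on can be shown with a textbook implementation using deliberately tiny parameters (NOT secure; real deployments use primes of 1024 bits or more): multiplying two ciphertexts modulo n² decrypts to the sum of the plaintexts.

```python
import math
import random

p, q = 17, 19                     # toy primes for illustration only
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)      # Carmichael function of n = p*q

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # modular inverse (Python 3.8+)

def encrypt(m):
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:    # r must be coprime to n
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(41), encrypt(7)
print(decrypt((c1 * c2) % n2))  # 48: the sum, computed without decrypting the inputs
```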
Multi-data Image Steganography using Generative Adversarial Networks. 2022 9th International Conference on Computing for Sustainable Global Development (INDIACom). :454–459.
2022. The success of deep learning based steganography has shifted the focus of researchers from traditional steganography approaches to deep learning based steganography. Various deep steganographic models have been developed for improved security, capacity, and invisibility. In this work, a multi-data deep learning steganography model has been developed using a well-known deep learning model called Generative Adversarial Networks (GAN), more specifically deep convolutional Generative Adversarial Networks (DCGAN). The model is capable of hiding two different messages, meant for two different receivers, inside a single cover image. The proposed model consists of four networks, namely the Generator, Steganalyzer, Extractor1, and Extractor2 networks. The Generator hides two secret messages inside one cover image, which are extracted using the two different extractors. The Steganalyzer network differentiates between the cover and stego images generated by the generator network. The experiment has been carried out on the CelebA dataset. Two commonly used distortion metrics, Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity Index Metric (SSIM), are used for measuring the distortion in the stego image. The results of experimentation show that the generated stego images have good imperceptibility and high extraction rates.
Test Case Filtering based on Generative Adversarial Networks. 2022 IEEE 23rd International Conference on High Performance Switching and Routing (HPSR). :65–69.
2022. Fuzzing is a popular technique for finding software vulnerabilities. Despite their success, state-of-the-art fuzzers will inevitably produce a large number of low-quality inputs. In recent years, Machine Learning (ML) based selection strategies have reported promising results. However, existing ML-based fuzzers are limited by the lack of training data: because the mutation strategy of fuzzing cannot effectively generate useful input, it is prohibitively expensive to collect enough inputs to train models. In this paper, we propose a generative adversarial network-based solution to generate a large number of inputs to solve the problem of insufficient data. We implement the proposal in the American Fuzzy Lop (AFL), and the experimental results show that it can find more crashes in the same amount of time compared with the original AFL.
ISSN: 2325-5609
Optimization of Encrypted Communication Model Based on Generative Adversarial Network. 2022 International Conference on Blockchain Technology and Information Security (ICBCTIS). :20–24.
2022. With the progress of cryptography and computer science, designing cryptographic algorithms using deep learning is a very innovative research direction. Google Brain designed a communication model using generative adversarial networks and explored encrypted communication algorithms based on machine learning. However, the encrypted communication model it designed lacks quantitative evaluation: when some plaintexts and keys are leaked at the same time, the security of communication cannot be guaranteed. We optimize this model to enhance its security by adjusting the optimizer, modifying the activation function, and adding batch normalization to improve the speed of optimization. Experiments were performed on 16-bit and 64-bit plaintext communication. With a plaintext and key leak rate of 0.75, the decryption error rate of the decryptor is 0.01 and the attacker cannot guess any valid information about the communication.
LRVP: Lightweight Real-Time Verification of Intradomain Forwarding Paths. IEEE Systems Journal. 16:6309–6320.
2022. The correctness of user traffic forwarding paths is an important goal of trusted transmission, and many network security issues are related to it, e.g., denial-of-service attacks and route hijacking. Current path-aware network architectures can effectively overcome this issue through path verification. At present, the main problems of path verification are high communication and high computation overhead. To this aim, this article proposes a lightweight real-time verification mechanism for intradomain forwarding paths in autonomous systems to achieve a path verification architecture with no communication overhead and low computing overhead. The problem situation is that a packet finally reaches the destination, but its forwarding path is inconsistent with the expected path. The expected path refers to the packet forwarding path determined by the interior gateway protocols; if the actual forwarding path differs from the expected one, it is regarded as incorrect. This article focuses on the most typical intradomain routing environment. A few routers are set as verification routers to block traffic with incorrect forwarding paths and raise alerts. Experiments show that the proposed mechanism effectively solves the path verification problem while avoiding high communication and computing overhead.
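The core comparison a verification router performs can be sketched as follows. This is a loose illustration under assumed details (a toy topology and Dijkstra standing in for the IGP's route computation), not the LRVP mechanism itself: the recorded path of a packet is checked against the path the routing protocol would have chosen.

```python
from heapq import heappush, heappop

LINKS = {  # assumed intradomain topology: router -> {neighbor: link cost}
    "R1": {"R2": 1, "R3": 4},
    "R2": {"R1": 1, "R3": 1, "R4": 5},
    "R3": {"R1": 4, "R2": 1, "R4": 1},
    "R4": {"R2": 5, "R3": 1},
}

def expected_path(src, dst):
    """Dijkstra shortest path, standing in for the IGP's route computation."""
    dist, seen, prev = {src: 0}, set(), {}
    heap = [(0, src)]
    while heap:
        d, u = heappop(heap)
        if u in seen:
            continue
        seen.add(u)
        if u == dst:
            break
        for v, w in LINKS[u].items():
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                prev[v] = u
                heappush(heap, (d + w, v))
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    path.append(src)
    return path[::-1]

def verify(recorded_path):
    """A verification router would drop the packet and alert when this is False."""
    return recorded_path == expected_path(recorded_path[0], recorded_path[-1])

print(verify(["R1", "R2", "R3", "R4"]))  # True: matches the IGP path
print(verify(["R1", "R3", "R4"]))        # False: deviates from the expected path
```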
A Novel Interleaving Scheme for Concatenated Codes on Burst-Error Channel. 2022 27th Asia Pacific Conference on Communications (APCC). :309–314.
2022. With the rapid development of Ethernet, RS (544, 514) (KP4 forward error correction), which has been widely used in high-speed Ethernet standards for its good performance-complexity trade-off, may not meet the demands of next-generation Ethernet for higher data transmission speed and better decoding performance. A concatenated code based on KP4-FEC has become a good solution because of its low complexity and excellent compatibility. For concatenated codes, aside from the selection of outer and inner codes, an efficient interleaving scheme is also critical for dealing with different channel conditions. Aiming at burst errors in wired communication, we propose a novel matrix interleaving scheme for concatenated codes that sets the outer code to KP4-FEC and the inner code to a Bose-Chaudhuri-Hocquenghem (BCH) code. In the proposed scheme, burst errors are distributed as evenly as possible across the BCH codes to improve their overall decoding efficiency. Meanwhile, the bit continuity within each symbol of the RS codeword is preserved during transmission, so the number of symbols affected by burst errors is minimized. Simulation results demonstrate that the proposed interleaving scheme achieves better decoding performance on burst-error channels than the original scheme. In some cases, the extra coding gain at a bit-error rate (BER) of 1 × 10⁻¹⁵ can even reach 1 dB.
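The burst-spreading idea behind matrix interleaving can be seen in a toy example (dimensions assumed for the demo, not the paper's parameters): symbols are written into a matrix row by row and transmitted column by column, so a burst of consecutive channel errors lands on different rows, leaving each inner codeword with few errors to correct.

```python
ROWS, COLS = 4, 5  # e.g., 4 inner codewords of 5 symbols each (assumed sizes)

def interleave(symbols):
    # write row by row, read column by column
    matrix = [symbols[r * COLS:(r + 1) * COLS] for r in range(ROWS)]
    return [matrix[r][c] for c in range(COLS) for r in range(ROWS)]

def deinterleave(symbols):
    # columns arrive contiguously; rebuild the row-major order
    cols = [symbols[c * ROWS:(c + 1) * ROWS] for c in range(COLS)]
    return [cols[c][r] for r in range(ROWS) for c in range(COLS)]

data = list(range(20))
tx = interleave(data)
tx[4:8] = ["X"] * 4               # burst of 4 consecutive symbol errors on the channel
rx = deinterleave(tx)
rows_hit = [[rx[r * COLS + c] for c in range(COLS)].count("X") for r in range(ROWS)]
print(rows_hit)  # [1, 1, 1, 1]: each codeword sees only one error
```

A length-4 burst that would overwhelm one 5-symbol codeword is spread so that each codeword sees a single error, which a modest inner code can correct.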
Sliding-Window Forward Error Correction Based on Reference Order for Real-Time Video Streaming. IEEE Access. 10:34288–34295.
2022. In real-time video streaming, data packets are transported over the network from a transmitter to a receiver. The quality of the received video fluctuates as the network conditions change, and it can degrade substantially when there is considerable packet loss. Forward error correction (FEC) techniques can be used to recover lost packets by incorporating redundant data. Conventional FEC schemes do not work well when scalable video coding (SVC) is adopted. In this paper, we propose a novel FEC scheme that overcomes the drawbacks of these schemes by considering the reference picture structure of SVC and weighting the reference pictures more when FEC redundancy is applied. The experimental results show that the proposed FEC scheme outperforms conventional FEC schemes.
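To make the underlying recovery idea concrete, here is a single-parity FEC sketch, a deliberately simplified stand-in rather than the paper's reference-order scheme: for each window of k data packets, the sender adds one parity packet equal to their bytewise XOR, so any single lost packet in the window can be rebuilt from the survivors.

```python
from functools import reduce

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def make_parity(packets):
    """Parity packet = bytewise XOR of all packets in the window."""
    return reduce(xor_bytes, packets)

def recover(received, parity):
    """received: the window with exactly one entry set to None (the lost packet)."""
    lost = received.index(None)
    survivors = [p for p in received if p is not None]
    rebuilt = reduce(xor_bytes, survivors, parity)  # parity XOR survivors = lost packet
    return received[:lost] + [rebuilt] + received[lost + 1:]

window = [b"fram", b"e-da", b"ta-x"]          # k = 3 equally sized packets (assumed)
parity = make_parity(window)
damaged = [window[0], None, window[2]]        # packet 1 lost on the network
print(recover(damaged, parity)[1])  # b'e-da'
```

The paper's contribution can be read as deciding where such redundancy is spent: pictures referenced by many others get more protection than pictures nothing depends on.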
Real-Time FPGA Investigation of Interplay Between Probabilistic Shaping and Forward Error Correction. Journal of Lightwave Technology. 40:1339–1345.
2022. In this work, we implement a complete probabilistic amplitude shaping (PAS) architecture on a field-programmable gate array (FPGA) platform to study the interplay between probabilistic shaping (PS) and forward error correction (FEC). Owing to its fully parallelized input-output interfaces based on look-up tables (LUT) and its low computational complexity without high-precision multiplication, hierarchical distribution matching (HiDM) is chosen as the solution for real-time probabilistic shaping. For FEC, we select two mainstream soft-decision forward error correction (SD-FEC) algorithms currently used in optical communication systems, namely Open FEC (OFEC) and soft-decision quasi-cyclic low-density parity-check (SD-QC-LDPC) codes. Through FPGA experiments, we study the impact of probabilistic shaping on OFEC and LDPC, respectively, based on PS-16QAM under moderate shaping, and also the impact of probabilistic shaping on the LDPC code based on PS-64QAM under weak/strong shaping. The FPGA experimental results show that if pre-FEC bit error rate (BER) is used as the predictor, moderate shaping induces no degradation of OFEC performance, while strong shaping slightly degrades the error correction performance of LDPC; nevertheless, there is no error floor when the output BER is around 10⁻¹⁵. However, if normalized generalized mutual information (NGMI) is selected as the predictor, the performance degradation of LDPC becomes insignificant, which means pre-FEC BER may not be a good predictor for LDPC in probabilistic shaping scenarios. We also study the impact of residual errors after FEC decoding on HiDM. The FPGA experimental results show that the increase in BER after HiDM decoding is within a factor of 10 of the post-FEC BER.
Investigation of Potential FEC Schemes for 800G-ZR Forward Error Correction. 2022 Optical Fiber Communications Conference and Exhibition (OFC). :1–3.
2022. With a record 400 Gbps 100-piece-FPGA implementation, we investigate the performance of potential FEC schemes for OIF-800GZR. By comparing power dissipation and correction thresholds at a BER of 10⁻¹⁵, we propose the simplified OFEC for the 800G-ZR FEC.
Low-complexity Forward Error Correction For 800G Unamplified Campus Link. 2022 20th International Conference on Optical Communications and Networks (ICOCN). :1–3.
2022. The discussion about forward error correction (FEC) for the 800G unamplified link (800LR) is ongoing. Aiming at two potential options for the FEC bit error ratio (BER) threshold, we propose two FEC schemes, based respectively on channel-polarized (CP) multilevel coding (MLC) and bit-interleaved coded modulation (BICM), with the same inner FEC code. Field-programmable gate array (FPGA) verification results indicate that with the same FEC overhead (OH), the proposed CP-MLC scheme outperforms the BICM scheme with lower resource and power consumption.
A Co-regularization Facial Emotion Recognition Based on Multi-Task Facial Action Unit Recognition. 2022 41st Chinese Control Conference (CCC). :6806–6810.
2022. Facial emotion recognition feeds the growth of future artificial intelligence through the recognition, learning, and analysis of emotions from different angles of the human face and head pose. The recent pandemic accelerated the deployment of facial recognition in a number of applications, while emotion recognition remains largely within experimental boundaries. A current challenge for facial emotion recognition (FER) is distinguishing emotions from background noise. Since robots will soon play a significant role in human perception, attention, memory, decision-making, and human-robot interaction (HRI), we merge head pose with FER to boost robustness in understanding emotions using convolutional neural networks (CNN). Stochastic gradient descent with a comprehensive model is adopted by applying multi-task learning, which is capable of implicit parallelism and is an inherently better global optimizer for finding better network weights. After executing the multi-task learning model on two independent datasets, the FER and head-pose learning multi-view co-regularization frameworks were merged and evaluated for validation accuracy.