Biblio
The rise of online video content demands advanced video compression: a strong compression method can convey video data effectively over constrained bandwidth. During the COVID-19 pandemic in particular, increased internet usage for video conferencing, online gaming, and education led to decreased video quality from Netflix, YouTube, and other streaming services in Europe and other regions. Standard video compression algorithms represent video as a succession of reference frames followed by residual frames, and these approaches are limited in their applicability. The introduction and recent advances of deep learning have the potential to overcome such problems. This study provides a deep learning-based video compression model that meets or exceeds the current H.264 standard.
Recently identified vulnerabilities in public key infrastructures (PKI) suggest that a semantic or cognitive definition of trust is essential for augmenting security through trust formulations. In this paper, we examine the meaning of trust in PKIs. Properly categorized trust can help in developing intelligent algorithms that adapt to the security and privacy requirements of clients. We delineate the different types of trust in a generic PKI model.
The integrity of image data plays an important role in data communication. Image data often contain confidential information, so it is very important to protect the data from intruders. When data are transmitted through a network, they may be lost or damaged. Existing systems do not provide all of the functionality needed to secure images during transmission, i.e., image compression, encryption, and user authentication. In this paper, a hybrid cryptosystem is proposed in which a biometric fingerprint is used for key generation, and the generated key is then used for encryption. The secret-fragment-visible mosaic image method is used for secure transmission of the image. To reduce the image size, a lossless compression technique is used, which leads to fast transmission of the image data through the channel. The biometric fingerprint is also used for authentication. Biometric authentication is a more secure method because it requires the physical presence of a human being and cannot easily be forged or replicated.
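To make the key-generation step concrete, the following is a minimal sketch, assuming the fingerprint has already been reduced to a reproducible byte-level feature template; feature extraction, the mosaic-image construction, and the compression stage are omitted, and all names here are hypothetical illustrations rather than the paper's implementation.

# Hedged sketch: derive a symmetric key from fingerprint features and
# encrypt image bytes with AES-GCM (cryptography package). Illustrative
# only; not the paper's exact pipeline.
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def derive_key(fingerprint_template: bytes, salt: bytes) -> bytes:
    # Stretch the biometric template into a 256-bit AES key.
    return hashlib.pbkdf2_hmac("sha256", fingerprint_template, salt, 100_000)

def encrypt_image(image_bytes: bytes, fingerprint_template: bytes):
    salt = os.urandom(16)
    nonce = os.urandom(12)
    key = derive_key(fingerprint_template, salt)
    ciphertext = AESGCM(key).encrypt(nonce, image_bytes, None)
    return salt, nonce, ciphertext  # all three are needed for decryption

A real biometric cryptosystem additionally needs a fuzzy extractor or similar scheme to obtain a stable template from noisy fingerprint scans; the fixed byte string above is a simplifying assumption.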
A significant amount of work is invested in human-machine teaming (HMT) across multiple fields. Accurately and effectively measuring the system performance of an HMT is crucial for moving the design of these systems forward. Metrics are the enabling tools for devising a benchmark in any system and serve as an evaluation platform for assessing a system's performance, along with its verification and validation. Currently, there is no agreed-upon set of benchmark metrics for developing HMT systems; therefore, identifying and classifying common metrics is imperative for creating a benchmark in the HMT field. The key focus of this review is a detailed survey aimed at identifying the metrics employed in different segments of HMT and determining the common metrics that can be used in the future to benchmark HMTs. We organize this review as follows: identification of the metrics used in HMTs to date, and classification based on functionality and measuring techniques. We also analyze all of the identified metrics in detail, classifying them as theoretical, applied, real-time, non-real-time, measurable, and observable. We conclude with a detailed analysis of the identified common metrics along with their usage to benchmark HMTs.
Cryptography and steganography are the two major fields available for data security. While cryptography is a technique in which the information is scrambled into unintelligible gibberish during transmission, steganography focuses on concealing the very existence of the information. Combining both domains gives a higher level of security: even if the use of the covert channel is revealed, the true information is not exposed. This paper focuses on concealing multiple secret images in a single 24-bit cover image using LSB-substitution-based image steganography. Each secret image is encrypted with the Arnold Transform before being hidden in the cover image. Results reveal that the proposed method successfully secures high-capacity data while keeping the visual quality of the transmitted image satisfactory.
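As an illustration of the embedding mechanism, here is a minimal sketch of 1-bit LSB substitution into a 24-bit cover image using NumPy; the Arnold Transform encryption and the packing of multiple secret images are omitted, so this shows only the generic substitution step, not the paper's full method.

# Hedged sketch: hide a bitstream in the least significant bits of a
# 24-bit (H x W x 3, uint8) cover image, and recover it.
import numpy as np

def embed_lsb(cover: np.ndarray, secret_bits: np.ndarray) -> np.ndarray:
    flat = cover.reshape(-1).copy()
    assert secret_bits.size <= flat.size, "secret too large for cover"
    # Clear each target pixel's LSB, then set it to the secret bit (0 or 1).
    flat[:secret_bits.size] = (flat[:secret_bits.size] & 0xFE) | secret_bits.astype(cover.dtype)
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bits: int) -> np.ndarray:
    return stego.reshape(-1)[:n_bits] & 1

Because only the lowest bit of each color channel changes, the per-pixel distortion is at most 1/255, which is why the visual quality of the stego image remains satisfactory.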
As multi-agent systems become ubiquitous, guaranteeing safety in these systems grows increasingly important. In applications ranging from automated cruise control to safety in robot swarms, barrier functions have emerged as a tool to provably meet safety constraints by guaranteeing forward invariance of a set. However, a single barrier function can rarely satisfy all safety aspects of a system, so there remains a need to address the degree to which multiple barrier functions may be composed through Boolean logic. Utilizing max and min operators represents one such method to accomplish Boolean composition for barrier functions. As such, the main contribution of this work extends previously established concepts for barrier functions to a class of nonsmooth barrier functions that operate on systems described by differential inclusions. To validate these results, a Boolean compositional barrier function is deployed onto a team of mobile robots.
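The max/min encoding referenced above is compact enough to state: given candidate barrier functions $h_1, \dots, h_k$ whose zero-superlevel sets encode individual safety constraints, the standard Boolean compositions (a known construction in this literature, stated here for orientation) are

\[ h_{\wedge}(x) = \min_{i} h_i(x), \qquad h_{\vee}(x) = \max_{i} h_i(x), \]

so that $h_{\wedge}(x) \ge 0$ exactly when every $h_i(x) \ge 0$ (logical AND) and $h_{\vee}(x) \ge 0$ exactly when at least one $h_i(x) \ge 0$ (logical OR). Since the min and max of smooth functions are in general nonsmooth, such compositions fall outside classical barrier function theory, which is what motivates the nonsmooth barrier functions and differential inclusions studied in the paper.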
The landscape of cyber security has been reshaped dramatically by the recently emerging Advanced Persistent Threat (APT). It is uniquely characterized by a stealthy, continuous, sophisticated, and well-funded attack process for long-term malicious gain, which renders current defense mechanisms inapplicable. A novel defense strategy, continuously combating APT over a long time span with imperfect/incomplete information on the attacker's actions, is urgently needed. The challenge escalates further when APT is coupled with the insider threat (a major threat in cyber security), where insiders could trade valuable information to the APT attacker for monetary gain. The interplay among the defender, the APT attacker, and the insiders should be judiciously studied to shed light on a more secure defense system. In this paper, we consider the joint threats from the APT attacker and the insiders, and characterize the aforementioned interplay as a two-layer game model, i.e., a defense/attack game between the defender and the APT attacker and an information-trading game among the insiders. Through rigorous analysis, we identify the best-response strategies for each player and prove the existence of a Nash Equilibrium for both games. An extensive numerical study further verifies our analytic results and examines the impact of different system configurations on the achievable security level.
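For reference, the equilibrium concept invoked here is the standard one: in a game with players $i$, strategy sets $S_i$, and utilities $u_i$, a profile $s^* = (s_1^*, \dots, s_n^*)$ is a Nash Equilibrium if

\[ u_i(s_i^*, s_{-i}^*) \ge u_i(s_i, s_{-i}^*) \quad \text{for all } s_i \in S_i \text{ and all } i, \]

i.e., no player (defender, APT attacker, or insider) can improve its payoff by unilaterally deviating. The paper establishes the existence of such profiles separately for the defense/attack layer and for the insider information-trading layer.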
In the new era of information and communication technology (ICT), everyone wants to store and share their data in online media such as cloud databases, mobile databases, grid databases, and drives. When data are stored in online media, the main problem that arises is privacy, because hackers, attackers, and crackers try to disclose private information publicly. Security is a continuous process of protecting data or information from attacks. To secure information against such unauthorized parties, we propose and implement a technique based on the data-modification concept, evaluated on the iris database using the Weka tool. The approach provides high privacy in distributed clustered database environments.
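The abstract does not specify the exact modification scheme, so as one common instance of the data-modification family in privacy-preserving data mining, here is a minimal sketch of additive noise perturbation on numeric attributes; this is an illustrative stand-in, not necessarily the technique implemented in the paper.

# Hedged sketch: perturb numeric columns with zero-mean Gaussian noise
# before release, so individual records are masked while aggregate
# statistics are approximately preserved. Illustrative only.
import numpy as np

def perturb(data: np.ndarray, noise_scale=0.1, seed=None) -> np.ndarray:
    rng = np.random.default_rng(seed)
    # Scale the noise per column by that column's standard deviation.
    sigma = noise_scale * data.std(axis=0)
    return data + rng.normal(0.0, sigma, size=data.shape)

Larger noise_scale values give stronger privacy at the cost of clustering accuracy, which is the usual privacy/utility trade-off in such schemes.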
This paper addresses the problem of asymptotic stabilization for linear time-invariant (LTI) systems using event-triggered control under finite data rate communication - both in the sense of finite precision data at each transmission and finite average data rate. Given a prescribed rate of convergence for asymptotic stability, we introduce an event-triggered control implementation that opportunistically determines the transmission instants and the finite precision data to be transmitted at each transmission. We show that our design exponentially stabilizes the origin while guaranteeing a positive lower bound on the inter-transmission times, ensuring that the number of bits transmitted at each transmission is upper bounded, and allowing for the possibility of transmitting fewer bits at any given time if more bits than prescribed were transmitted on a previous transmission. In our technical approach, we consider both the case of instantaneous and non-instantaneous transmissions. Several simulations illustrate the results.
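As an illustration of the mechanism (a standard Lyapunov-based trigger of the kind used in this literature, not necessarily the paper's exact rule), let $V$ be a Lyapunov function for the closed-loop system and $\beta > 0$ the prescribed convergence rate; the $(k+1)$-th transmission time can be chosen as

\[ t_{k+1} = \inf \{ t > t_k : V(x(t)) \ge V(x(t_0))\, e^{-\beta (t - t_0)} \}, \]

i.e., the controller communicates only when the prescribed performance envelope is about to be violated, and at each $t_k$ it sends a finitely quantized encoding of the state whose resolution is chosen so that the envelope is maintained until the next event. The technical work is then to show that $t_{k+1} - t_k$ admits a uniform positive lower bound while the number of bits per transmission stays bounded.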
This paper addresses the problem of event-triggered control of linear time-invariant systems over time-varying rate limited communication channels, including the possibility of channel blackouts, which are intervals of time when the communication channel is unavailable for feedback. In order to design an effective event-triggered controller that operates successfully even in the presence of blackouts, we analyze the channel data capacity, which is the total maximum number of bits that could be communicated over a given time interval. We provide an efficient real-time algorithm to estimate the channel capacity for a time-slotted model of channel evolution. Equipped with this algorithm we then propose an event-triggering scheme, which using prior knowledge of the channel information, guarantees exponential stabilization at a desired convergence rate despite intermittent channel blackouts. The contributions are the notion of channel blackouts, the effective control despite their occurrence, and the analysis and quantification of the data capacity for a class of time-varying continuous-time channels.
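A minimal sketch of the data-capacity computation for a time-slotted channel model: each slot carries an instantaneous rate in bits per second, blackouts are simply slots of zero rate, and the capacity over an interval is the integral of the rate. The slot representation below is an assumption for illustration, not the paper's algorithm.

# Hedged sketch: total data capacity (in bits) of a time-slotted,
# time-varying channel over [t0, t1], with blackouts as zero-rate slots.
def channel_capacity(slots, t0, t1):
    """slots: list of (start, end, rate_bits_per_sec) tuples."""
    total_bits = 0.0
    for start, end, rate in slots:
        overlap = min(end, t1) - max(start, t0)  # slot portion inside [t0, t1]
        if overlap > 0:
            total_bits += rate * overlap
    return total_bits

# Example: a 1 kbps channel with a blackout during [2, 3] seconds.
slots = [(0.0, 2.0, 1000.0), (2.0, 3.0, 0.0), (3.0, 5.0, 1000.0)]
print(channel_capacity(slots, 0.0, 5.0))  # 4000.0 bits

An event-triggering scheme with prior channel knowledge can then check, before scheduling a transmission, whether enough capacity remains ahead of an upcoming blackout.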
This paper addresses the problem of exponential practical stabilization of linear time-invariant systems with disturbances using event-triggered control and bounded communication bit rate. We consider both the case of instantaneous communication with finite precision data at each transmission and the case of non-instantaneous communication with bounded communication rate. Given a prescribed rate of convergence, the proposed event-triggered control implementations opportunistically determine the transmission instants and the finite precision data to be transmitted on each transmission. We show that our design exponentially practically stabilizes the origin while guaranteeing a uniform positive lower bound on the inter-transmission and inter-reception times, ensuring that the number of bits transmitted on each transmission is upper bounded uniformly in time, and allowing for the possibility of transmitting fewer bits at any given time if more bits than prescribed were transmitted earlier. We also characterize the necessary and sufficient average data rate for exponential practical stabilization. Several simulations illustrate the results.
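For context, the classical data-rate theorem gives the flavor of such characterizations: for an LTI system $\dot{x} = Ax + Bu$ with eigenvalues $\lambda_i(A)$, stabilization is possible only if the average bit rate satisfies

\[ R > \frac{1}{\ln 2} \sum_{i:\, \operatorname{Re}\lambda_i(A) > 0} \operatorname{Re}\lambda_i(A) \quad \text{bits per second}, \]

and for a prescribed exponential convergence rate the real parts are shifted by that rate before the sum is taken. The necessary and sufficient average data rate characterized in the paper is of this kind, adapted to practical stabilization under disturbances; the precise statement is in the paper itself.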