Bibliography
With the development of the information age, global networking continues to deepen, and cyberspace security has become an important support for today's social functions and activities. Web applications are the most direct interface for Internet activities, yet they carry many security risks, which is why they face a large number of network attacks. Interpretive dynamic programming languages are easy to learn and convenient to use, so they are widely used in the development of cross-platform web systems. Alongside these advantages, however, web systems built on such languages are hard to debug, and their complex system logic is difficult to maintain, which increases the risk of vulnerabilities and cyber threats. Defending systems built on interpretive dynamic programming languages has therefore attracted wide research attention. Since the advent of endogenous security technologies, research on web system security has made notable breakthroughs. Compared with traditional security defense technologies, these technologies protect the system through their uncertainty, randomness, and dynamism. Taking several common network attacks as reference points, this paper surveys and compares traditional security defense technologies and endogenous security technologies for web applications built on interpretive dynamic languages, and discusses possible research directions for these technologies.
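One common realization of the uncertainty, randomness, and dynamism mentioned in this abstract is, in the endogenous-security literature, dynamic heterogeneous redundancy (DHR): a scheduler randomly rotates each request across functionally equivalent but implementation-diverse executors and majority-votes their outputs, so a vulnerability in any single variant is unlikely to survive the vote. The sketch below is a minimal illustration under that assumption, not code from any surveyed system; the executor functions are hypothetical stand-ins for the same web handler built on diverse interpreter stacks.

```python
import random
from collections import Counter

# Hypothetical, functionally equivalent executors -- stand-ins for the
# same request handler implemented on diverse interpreter/OS stacks.
def executor_a(request: str) -> str:
    return request.strip().lower()

def executor_b(request: str) -> str:
    return request.lower().strip()

def executor_c(request: str) -> str:
    return " ".join(request.split()).lower()

def executor_d(request: str) -> str:
    return request.casefold().strip()

EXECUTOR_POOL = [executor_a, executor_b, executor_c, executor_d]

def dhr_handle(request: str, k: int = 3) -> str:
    """Randomly schedule k executors, run them all, and majority-vote.

    Dynamism/randomness: the active executor set changes per request.
    Heterogeneity: a bug or implant in one variant is outvoted.
    """
    active = random.sample(EXECUTOR_POOL, k=min(k, len(EXECUTOR_POOL)))
    outputs = [ex(request) for ex in active]
    winner, votes = Counter(outputs).most_common(1)[0]
    if votes <= len(outputs) // 2:
        raise RuntimeError("No majority -- possible compromised executor")
    return winner

print(dhr_handle("  GET /Index  "))
```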
The value and volume of information exchanged through dark-web pages are remarkable. Recently, many studies have shown the value of applying machine-learning methods to extract security-related knowledge from those dark-web pages. Within this scope, our goal in this research is to evaluate the best prediction models for analyzing traffic-level data coming from the dark web. Results and analysis showed that feature selection played an important role in identifying the best models; the right combination of features could increase a model's accuracy. For some feature-set and classifier combinations, Src Port and Dst Port both proved to be important features. When available, they were always selected over most other features; when absent, many other features were selected to compensate for the information they provided. The Protocol feature was never selected, regardless of whether Src Port and Dst Port were available.
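To make the feature-selection effect described above concrete, the sketch below runs a filter-style selector over hypothetical traffic-level columns and compares accuracy with and without the port features. The column names (Src Port, Dst Port, Protocol, Flow Duration, Pkt Len Mean) and the synthetic data are assumptions mirroring the abstract, not the paper's actual dataset or schema.

```python
# Minimal sketch: how port features can dominate selection on traffic data.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "Src Port": rng.integers(1024, 65535, n),
    "Dst Port": rng.choice([80, 443, 9050, 9150], n),  # 9050/9150: Tor defaults
    "Protocol": rng.choice([6, 17], n),                # TCP / UDP
    "Flow Duration": rng.exponential(1e5, n),
    "Pkt Len Mean": rng.normal(500, 150, n),
})
# Synthetic label loosely tied to the destination port (dark-web-style flows).
y = df["Dst Port"].isin([9050, 9150]).astype(int).to_numpy()

def score(features):
    """Select top-k features by mutual information, then cross-validate."""
    X = df[features]
    selector = SelectKBest(mutual_info_classif, k=min(3, len(features)))
    Xs = selector.fit_transform(X, y)
    kept = [f for f, keep in zip(features, selector.get_support()) if keep]
    acc = cross_val_score(
        RandomForestClassifier(n_estimators=100, random_state=0),
        Xs, y, cv=5).mean()
    return kept, acc

for feats in (list(df.columns),
              [c for c in df.columns if "Port" not in c]):
    kept, acc = score(feats)
    print(f"features={feats} -> selected={kept}, accuracy={acc:.3f}")
```

On this toy data the port-aware run selects Dst Port and scores near-perfectly, while the port-free run falls back to the remaining features and drops toward chance, echoing the compensation effect the abstract reports.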
For modern Automatic Test Equipment (ATE), one of the most daunting tasks is conducting Information Assurance (IA). In addition, there is a desire to network ATE to allow information sharing and software deployment. This is complicated by the fact that ATE are typically "unmanaged" systems: most are configured, deployed, and then largely left alone. The result is systems that are not patched with the latest Operating System updates and may in fact be running legacy Operating Systems that are no longer supported (Windows XP or Windows 7, for instance). Much of this has to do with the cost of keeping a system continuously updated and of regression testing the Test Program Sets (TPS) that run on it. Given that an Automated Test System can host thousands of Test Programs, complete regression testing of all of them can be extremely expensive and time-consuming. Beyond the Test Programs themselves, some rely on third-party and/or custom-developed software in order to run. Add the requirement to perform software steering through all Test Program paths, and the time required to validate a Test Program can be measured in months in some cases. If system updates are performed once a month, as some Operating System updates are, this could consume all the available time of the Test Station or require a fleet of Test Stations dedicated solely to the required regression testing. On the other side of the coin, a Test System running an old, unpatched Operating System is a prime target for any manner of virus or other IA issues. This paper discusses some of the pros and cons of a managed Test System and how it might be accomplished.
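The scale argument in this abstract is easy to make concrete. Under purely illustrative numbers (not figures from the paper), the back-of-the-envelope calculation below shows how monthly patching multiplied across thousands of Test Programs quickly exceeds one Test Station's capacity.

```python
# Back-of-the-envelope sketch of the regression-testing burden described
# above. All quantities are illustrative assumptions, not sourced figures.
TEST_PROGRAMS = 2000               # TPS hosted on one Automated Test System
HOURS_PER_TPS = 8                  # avg. revalidation incl. software steering
STATION_HOURS_PER_MONTH = 24 * 30  # one Test Station, fully dedicated

regression_hours = TEST_PROGRAMS * HOURS_PER_TPS
stations_needed = regression_hours / STATION_HOURS_PER_MONTH

print(f"Full regression per patch cycle: {regression_hours:,} station-hours")
print(f"Stations dedicated to monthly patching: {stations_needed:.1f}")
# 16,000 station-hours -> roughly 22 stations running around the clock,
# which is the availability/cost trade-off the paper weighs against the
# IA risk of leaving a legacy OS unpatched.
```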
Evolving, new-age cybersecurity threats have set the information security industry on high alert. Modern cyberattacks involve malware, phishing, and, increasingly, artificial intelligence, machine learning, and cryptocurrency. Our research highlights the importance and role of Software Quality Assurance in raising security standards so that systems are not only protected but also handle cyber-attacks better. In light of the recent series of cyber-attacks, we concluded through our research that implementing code review and penetration testing will protect our data's integrity, availability, and confidentiality. We gathered the user requirements of an application and gained a proper understanding of both its functional and non-functional requirements. We implemented conventional software quality assurance techniques successfully but found that the application software was still vulnerable to potential issues. We therefore proposed two additional stages in the software quality assurance process to address this problem. After implementing this framework, we found that most potential threats had already been fixed before the first release of the software.
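A minimal sketch of how the two proposed additional stages, a security-focused code-review gate and an automated penetration-test smoke check, might be wired into a QA pipeline. The tool choice (bandit for Python static analysis), the /admin route, and the paths are illustrative assumptions; the paper does not prescribe specific tools.

```python
# Sketch of the two proposed extra SQA stages, run after conventional QA.
import subprocess
import sys
import urllib.error
import urllib.request

def security_code_review(src_dir: str = "src") -> bool:
    """Stage 1: security-focused code review backed by static analysis.
    bandit exits nonzero when it finds issues at the chosen severity."""
    result = subprocess.run(["bandit", "-r", src_dir, "-ll"],
                            capture_output=True, text=True)
    print(result.stdout)
    return result.returncode == 0

def pentest_smoke_check(base_url: str = "http://localhost:8000") -> bool:
    """Stage 2: minimal penetration-test check -- a protected endpoint
    (hypothetical /admin route) must reject unauthenticated requests."""
    try:
        with urllib.request.urlopen(f"{base_url}/admin"):
            return False            # 2xx without credentials: gate fails
    except urllib.error.HTTPError as err:
        return err.code in (401, 403)

if __name__ == "__main__":
    ok = security_code_review() and pentest_smoke_check()
    sys.exit(0 if ok else 1)        # block the release on any failure
```

Running such a gate before each release approximates the paper's finding that most potential threats can be caught ahead of the first release rather than in production.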