Bibliography
Embedded systems integrate a large number of intellectual property (IP) blocks to shorten a chip's time to market, and many of these IPs are acquired from untrusted third-party suppliers. However, existing IP trust verification techniques cannot adequately assure that no hardware Trojan has been implanted inside an untrusted IP. Hardware Trojans in untrusted IPs may cause processor program execution to fail by tampering with instruction code and return addresses. This paper therefore presents a secure RISC-V embedded system that integrates a Security Monitoring Unit (SMU), which implements instruction integrity monitoring at the granularity of program basic blocks and function return address monitoring through a shadow stack. Testing and validation of the hardware-assisted SMU show that, while the CPU executes the CoreMark benchmark, the SMU provides instruction security monitoring without incurring significant performance overhead. The proposed RISC-V embedded system thus strikes a good balance between performance overhead and resource consumption.
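As a hedged illustration of the shadow-stack mechanism this abstract describes, the Python sketch below models how a monitor could mirror call and return events and flag a tampered return address. The class and method names are hypothetical and not taken from the paper; the actual SMU performs this check in hardware.

```python
class ShadowStack:
    """Minimal software model of a hardware shadow stack: it mirrors
    call/return events and flags any return address that was tampered with.
    (Illustrative sketch; names are not from the paper.)"""

    def __init__(self):
        self._expected = []

    def on_call(self, return_address: int) -> None:
        # On each call, record the return address the CPU pushed.
        self._expected.append(return_address)

    def on_return(self, observed_address: int) -> bool:
        # On each return, compare the address the CPU is about to jump to
        # against the recorded copy; a mismatch indicates tampering.
        return self._expected.pop() == observed_address


monitor = ShadowStack()
monitor.on_call(0x8000_0124)
assert monitor.on_return(0x8000_0124)       # legitimate return passes
monitor.on_call(0x8000_0200)
assert not monitor.on_return(0xDEAD_BEEF)   # Trojan-modified return is caught
```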
For modern Automatic Test Equipment (ATE), one of the most daunting tasks is conducting Information Assurance (IA). In addition, there is a desire to network ATE to allow for information sharing and deployment of software. This is complicated by the fact that ATE are typically “unmanaged” systems: most are configured, deployed, and then largely left alone. The result is systems that are not patched with the latest operating system updates and may in fact be running legacy operating systems that are no longer supported (such as Windows XP or Windows 7). Much of this has to do with the cost of keeping a system continuously updated and of regression testing the Test Program Sets (TPS) that run on it. Given that an automated test system can host thousands of Test Programs, complete regression testing of all of them can be extremely expensive and time-consuming. In addition, some Test Programs rely on third-party and/or custom-developed software in order to run. Add the requirement to perform software steering through all Test Program paths, and the time required to validate a Test Program can in some cases be measured in months. If system updates are performed once a month, as some operating system updates are, this could consume all the available time of the Test Station or require a fleet of Test Stations dedicated solely to the required regression testing. On the other side of the coin, a Test System running an old, unpatched operating system is a prime target for any manner of virus or other IA issues. This paper discusses some of the pros and cons of a managed Test System and how it might be accomplished.
The use of software to support the information infrastructure that governments, critical infrastructure providers, and businesses worldwide rely on for their daily operations and business processes is gradually becoming unavoidable. Commercial off-the-shelf software is widely and increasingly used by these organizations to automate processes with information technology. At the same time, cyber-attacks are becoming stealthier and more sophisticated, creating a complex and dynamic risk environment for IT-based operations that users are working to better understand and manage. Users have therefore become increasingly concerned about the integrity, security, and reliability of commercial software. To address these concerns and meet customer requirements, vendors have undertaken significant efforts to reduce vulnerabilities, improve resistance to attack, and protect the integrity of the products they sell. These efforts are often referred to as “software assurance.” Software assurance is becoming especially important for organizations critical to public safety and economic and national security. These users require a high level of confidence that commercial software is as secure as possible, something achieved only when software is created using best practices for secure software development. In this paper, we therefore explore the need for information assurance and its importance to both organizations and end users, along with methodologies and best practices for software security and information assurance; we also conducted a survey to understand end users' opinions of the methodologies examined in this paper and their impact.
Evolving, new-age cybersecurity threats have put the information security industry on high alert. Modern cyberattacks include malware, phishing, and attacks leveraging artificial intelligence, machine learning, and cryptocurrency. Our research highlights the importance and role of Software Quality Assurance in raising security standards, not only to protect the system but to handle cyber-attacks better. In light of this series of cyber-attacks, we conclude from our research that implementing code review and penetration testing protects data integrity, availability, and confidentiality. We gathered the user requirements of an application and developed a proper understanding of both its functional and non-functional requirements. We implemented conventional software quality assurance techniques successfully but found that the application software remained vulnerable to potential issues. To address this problem, we propose two additional stages in the software quality assurance process. After implementing this framework, we found that the majority of potential threats had already been fixed before the first release of the software.
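The abstract does not specify how the two added stages are implemented. As a loose sketch of one automated check a security-focused code-review stage might run before release, the hypothetical Python script below scans a source tree for hard-coded credentials; the patterns and the `src` directory are illustrative assumptions, not the paper's framework.

```python
import re
from pathlib import Path

# Patterns a security-focused code-review stage might flag; illustrative only.
SUSPICIOUS = [
    re.compile(r"password\s*=\s*['\"].+['\"]", re.IGNORECASE),
    re.compile(r"api[_-]?key\s*=\s*['\"].+['\"]", re.IGNORECASE),
]

def scan_tree(root: str) -> list[tuple[str, int, str]]:
    """Return (file, line number, offending line) for each suspicious match."""
    findings = []
    for path in Path(root).rglob("*.py"):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if any(p.search(line) for p in SUSPICIOUS):
                findings.append((str(path), lineno, line.strip()))
    return findings

if __name__ == "__main__":
    # "src" is a placeholder for the project's source directory.
    for file, lineno, line in scan_tree("src"):
        print(f"{file}:{lineno}: possible hard-coded secret: {line}")
```

In a pipeline like the one the abstract proposes, a nonzero finding count from a check of this kind would block the release until the code-review stage clears it.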
Cyber-physical systems (CPS) depend on cybersecurity to ensure functionality, data quality, resilience to cyberattack, and more. Known and unknown cyber threats and attacks pose significant risks, making information assurance and information security critical; many systems remain vulnerable to intelligence exploitation and cyberattacks. By investigating cybersecurity risks and formally representing CPS with spatiotemporal dynamic graphs and networks, this paper examines topics and solutions aimed at evaluating and empowering: (1) cybersecurity capabilities; (2) information assurance and system vulnerabilities; (3) detection of cyber threats and attacks; and (4) situational awareness, among others. We introduce statistically characterized dynamic graphs and novel entropy-centric algorithms and calculi that promise to ensure near-real-time capabilities.
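The abstract does not define its entropy-centric algorithms. As one hedged example of an entropy measure that could serve as a near-real-time signal on a dynamic graph, the sketch below computes the Shannon entropy of a snapshot's degree-value distribution, which jumps when graph structure changes abruptly between timesteps. The function and example graphs are illustrative, not taken from the paper.

```python
import math
from collections import Counter

def degree_entropy(edges):
    """Shannon entropy of the degree-value distribution of one graph snapshot.
    A regular graph scores 0; heterogeneous structure scores higher, so a
    sudden jump between timesteps can flag anomalous rewiring."""
    degrees = Counter()
    for u, v in edges:
        degrees[u] += 1
        degrees[v] += 1
    hist = Counter(degrees.values())   # degree value -> number of nodes
    n = sum(hist.values())
    return sum(-(c / n) * math.log2(c / n) for c in hist.values())

# Two snapshots of a dynamic graph: a 4-node ring, then a hub-and-spoke.
ring = [(0, 1), (1, 2), (2, 3), (3, 0)]
star = [(0, 1), (0, 2), (0, 3), (0, 4)]
print(degree_entropy(ring))  # all nodes have degree 2 -> 0.0 (regular graph)
print(degree_entropy(star))  # one hub, four leaves -> ~0.72
```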