Presented as part of the 2012 HCSS conference.
Abstract:
DO-178 ("Software Considerations in Airborne Systems") provides certification guidance for the onboard software in commercial aircraft. The technologies for developing and verifying software have changed significantly since the publication of DO-178 version B in 1992. After more than six years of work, a committee of industry and government experts has produced DO-178C, incorporating new guidance related to model-based and object-oriented software, as well as formal methods. This presentation will provide an overview of software certification in commercial aircraft and describe the changes incorporated in DO-178C with a focus on formal methods.
Biography:
Darren Cofer is a Principal Systems Engineer in the Trusted Systems group of the Rockwell Collins Advanced Technology Center (ATC). He earned his PhD in Electrical and Computer Engineering from The University of Texas at Austin. His principal area of expertise is developing and applying advanced analysis methods and tools for the verification and certification of high-integrity systems. His background includes work with formal methods for system and software analysis, the design of real-time embedded systems for safety-critical applications, and the development of nuclear propulsion systems in the U.S. Navy. He is currently the principal investigator on Rockwell Collins' META project with DARPA, and serves on RTCA committee SC-205, tasked with developing DO-178C and providing updated certification guidance for airborne software. He is an Associate Technical Editor for Control Systems Magazine and a Senior Member of the IEEE.
Presented as part of the 2012 HCSS conference.
Abstract:
Computer science researchers in the programming languages and formal verification communities, among others, have produced a variety of automated assistance and verification tools and techniques for formal reasoning: parsers, evaluators, proof-authoring systems, and software and hardware verifiers. These leading-edge advances remain largely underutilized by the large populations of potential users who might benefit from them. Among these are instructors and students engaged in the undergraduate-level instruction and study of mathematics.
Tools such as MATLAB and Mathematica are used for solving problems by both instructors and students in mathematics courses at the undergraduate level. Other existing tools, such as Isabelle/Isar [12], Mizar [11], and Coq, and even basic techniques such as monomorphic type checking, computation of congruence closures, and automated logical inference, provide an unambiguous way to represent and verify formal arguments, from chains of basic algebraic manipulations in arithmetic all the way up to graduate-level mathematics topics. What impediments cause these tools to be so seldom used in presenting and assembling even basic algebraic arguments and proofs in undergraduate mathematics courses? Impediments faced by instructors and students who may wish to adopt existing automated formal reasoning assistance and verification tools and techniques include (we cite examples of related work that shares our motivation in addressing each impediment listed): syntaxes that may be unfamiliar and entirely distinct from the syntax used in textbooks and lecture materials [7]; a lack of high-level domain-specific libraries that would make formal arguments as manageable as those found in a textbook (or those presented in the classroom) [1, 3, 10]; a potentially idiosyncratic semantics focused on low-level logical correctness rather than the practical high-level relationships typically explored in a mathematics course; a steep learning curve in employing a tool or technique [2, 5]; and logistical difficulties in setting up a working environment that lets instructors and students use the same tools without much supporting IT infrastructure [5, 6].
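To make one of the basic techniques mentioned above concrete, here is a minimal sketch of automated logical inference for propositional formulas via exhaustive truth-table evaluation. The formula representation (nested tuples with operator names) is invented for this illustration and is not the syntax of any of the cited tools.

```python
# Minimal propositional validity checker: a formula is a variable
# name (str) or a tuple ("not"/"and"/"or"/"implies", sub, ...).
from itertools import product

def variables(f, acc=None):
    acc = set() if acc is None else acc
    if isinstance(f, str):
        acc.add(f)
    else:
        for sub in f[1:]:
            variables(sub, acc)
    return acc

def evaluate(f, env):
    if isinstance(f, str):
        return env[f]
    op = f[0]
    if op == "not":
        return not evaluate(f[1], env)
    if op == "and":
        return evaluate(f[1], env) and evaluate(f[2], env)
    if op == "or":
        return evaluate(f[1], env) or evaluate(f[2], env)
    if op == "implies":
        return (not evaluate(f[1], env)) or evaluate(f[2], env)
    raise ValueError("unknown operator: %s" % op)

def is_valid(f):
    """True iff f holds under every assignment of its variables."""
    vs = sorted(variables(f))
    return all(evaluate(f, dict(zip(vs, bits)))
               for bits in product([False, True], repeat=len(vs)))

# Example: p -> (q -> p) is a tautology.
print(is_valid(("implies", "p", ("implies", "q", "p"))))  # True
```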
Building on earlier work in assembling and evaluating user-friendly and accessible formal verification tools for research and classroom instruction [4, 8, 9], we have recently attempted to address many of these issues while making more manageable the task of using existing formal reasoning assistance and verification techniques within the classroom. We have assembled a web-based integrated formal reasoning environment for classroom instruction that incorporates several standard techniques drawn from work done by the programming languages and formal verification communities over the last several decades. These techniques include basic evaluation of expressions, monomorphic type inference, basic verification of a subset of logical formulas in propositional and first-order logic, and an algorithm for computing congruence closures. This interactive web-based environment runs entirely within a standard web browser and can be seamlessly integrated with a web page of lecture materials and homework assignments. The lectures and assignments contain formal arguments (including both complete examples and problems to be completed) that can be evaluated and/or automatically verified interactively. The environment provides an explicit library of logical definitions and formulas that students can use to complete assignments; these formulas can be browsed and filtered by students within the environment. The environment provides instant, interactive verification feedback about logical validity, as well as an interactive, exhaustive list of all inferred properties for a given formal argument input.
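As an illustration of another technique in this list, the following is a minimal sketch of congruence closure over ground equations, using a union-find structure and a naive fixpoint. The term representation (nested tuples) and all names are our own invention for this example, not the environment's actual internals.

```python
# Decide whether a goal equation follows from ground equations by
# congruence closure: merge equated terms, then repeatedly merge
# applications of the same function symbol to equal arguments.
class UnionFind:
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, x, y):
        self.parent[self.find(x)] = self.find(y)

def subterms(t, acc):
    acc.add(t)
    if isinstance(t, tuple):
        for arg in t[1:]:
            subterms(arg, acc)

def congruent(s, t, uf):
    # Same function symbol, arguments already in the same classes.
    return (isinstance(s, tuple) and isinstance(t, tuple)
            and s[0] == t[0] and len(s) == len(t)
            and all(uf.find(a) == uf.find(b)
                    for a, b in zip(s[1:], t[1:])))

def congruence_closure(equations, goal):
    """True iff the goal pair of terms follows from the equations."""
    uf, terms = UnionFind(), set()
    for lhs, rhs in equations:
        subterms(lhs, terms)
        subterms(rhs, terms)
        uf.union(lhs, rhs)
    for t in goal:
        subterms(t, terms)
    changed = True
    while changed:                      # naive fixpoint over all pairs
        changed = False
        ts = list(terms)
        for i, s in enumerate(ts):
            for t in ts[i + 1:]:
                if uf.find(s) != uf.find(t) and congruent(s, t, uf):
                    uf.union(s, t)
                    changed = True
    return uf.find(goal[0]) == uf.find(goal[1])

# Example: from a = b, conclude f(a) = f(b).
print(congruence_closure([("a", "b")], (("f", "a"), ("f", "b"))))  # True
```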
The environment has been used within the classroom for an undergraduate-level course in linear algebra, both by the instructor in presenting examples during lectures and by students when completing homework assignments. The focus of this work is not on the expressiveness or performance of the underlying tools and techniques within the environment (most examples used in such a course are simple); rather, it is on the design and practical evaluation of an integrated environment of basic, existing tools and techniques that is accessible (logistically and semantically) to non-expert end-users. We found that even basic capabilities are helpful in teaching students what a formal argument is and how to assemble one by examining possible formulas and applying them while working toward a goal. Students were also able to transfer to an exam setting the techniques they used in assembling formal arguments within the environment.
Biography:
Andrei Lapets is a postdoctoral fellow at the Hariri Institute at Boston University. His research interests lie in finding ways to improve and measure the usability and accessibility of formal reasoning assistance tools and techniques. In 2011 he received a Ph.D. in Computer Science from Boston University, and he also holds S.M. and A.B. degrees in Computer Science and Mathematics from Harvard University.
References:
[1] A. Abel, B. Chang, and F. Pfenning. Human-Readable Machine-Verifiable Proofs for Teaching Constructive Logic. In U. Egly, A. Fiedler, H. Horacek, and S. Schmitt, editors, PTP '01: IJCAR Workshop on Proof Transformations, Proof Presentations and Complexity of Proofs, Siena, Italy, 2001.
[2] A. Asperti, C.S. Coen, E. Tassi, and S. Zacchiroli. User Interaction with the Matita Proof Assistant. Journal of Automated Reasoning, 39(2):109-139, 2007.
[3] C.E. Brown. Verifying and Invalidating Textbook Proofs Using Scunak. In MKM '06: Mathematical Knowledge Management, pages 110-123, Wokingham, England, 2006.
[4] V. Ishakian, A. Lapets, A. Bestavros, and A. Kfoury. Formal Verification of SLA Transformations. In Proceedings of CloudPerf 2011: IEEE International Workshop on Performance Aspects of Cloud and Service Virtualization, Washington, D.C., USA, July 2011.
[5] D. Jackson. Alloy: A Lightweight Object Modeling Notation. ACM Transactions on Software Engineering and Methodology, 11(2):256-290, 2002.
[6] C. Kaliszyk. Web Interfaces for Proof Assistants. Electronic Notes in Theoretical Computer Science, 174(2):49-61, 2007.
[7] F. Kamareddine and J.B. Wells. Computerizing Mathematical Text with MathLang. Electronic Notes in Theoretical Computer Science, 205:5-30, 2008.
[8] A. Lapets. User-Friendly Support for Common Concepts in a Lightweight Verifier. In Proceedings of VERIFY-2010: The 6th International Verification Workshop, Edinburgh, UK, July 2010.
[9] A. Lapets and A. Kfoury. A User-friendly Interface for a Lightweight Verification System. In Proceedings of UITP'10: 9th International Workshop on User Interfaces for Theorem Provers, Edinburgh, UK, July 2010.
[10] J.H. Siekmann, C. Benzmüller, A. Fiedler, A. Meier, and M. Pollet. Proof Development with OMEGA: √2 Is Irrational. In LPAR, pages 367-387, 2002.
[11] A. Trybulec and H. Blair. Computer Assisted Reasoning with MIZAR. In Proceedings of the 9th IJCAI, pages 26-28, Los Angeles, CA, USA, 1985.
[12] M. Wenzel. The Isabelle/Isar Reference Manual, January 2011.
Presented as part of the 2012 HCSS conference.
Abstract:
This talk will describe our attempt to integrate data and services, whether within a large organization or across the Internet. We have developed a theory, called Orc, in which we specify the sources of data and computations and how to orchestrate their executions (concurrently, one after the other, only when the majority of data is available, ...). The Orc programming language has a very small kernel, using external libraries of sites for support functions. The language starts with concurrency as the defining characteristic of programs. With its hierarchical and recursive combination facilities, it provides a model of structured concurrency. It can be used to orchestrate a small application or a giant one with many millions of threads, running in real time, from milliseconds to months. We have developed an experimental implementation of the language. The language, its documentation, and a web interface are available at www.orc.csres.utexas.edu, where programs can also be submitted for execution.
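As a rough illustration of this combinator style, the following Python sketch models two of Orc's combinators: parallel composition (f | g) and sequential composition (f >x> g). It is neither Orc syntax nor how the Orc implementation works; each "expression" is modeled, for illustration only, as a callable returning the list of values it publishes.

```python
from concurrent.futures import ThreadPoolExecutor

def par(*exprs):
    """f | g: evaluate all expressions concurrently and publish
    every value that any of them publishes."""
    with ThreadPoolExecutor(max_workers=len(exprs)) as pool:
        futures = [pool.submit(e) for e in exprs]
        return [v for f in futures for v in f.result()]

def seq(expr, then):
    """f >x> g: for each value x published by f, evaluate g(x) and
    publish everything it publishes."""
    return [v for x in expr() for v in then(x)]

# Example: query two (hypothetical) sites in parallel, then format
# each published value.
site_a = lambda: [42]
site_b = lambda: [7]
print(seq(lambda: par(site_a, site_b),
          lambda x: ["got %d" % x]))   # ['got 42', 'got 7']
```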
Motivation for the Work
The Internet promises to improve our lives by allowing us to connect to data and services that support our daily existence. It promises to improve business processes by integrating data from a variety of sources within specific applications. This vision has not yet been fully realized. Instead, our infrastructure is currently built from numerous closed-world systems, each created by a single individual or business entity, which enclose and protect data and computation while providing access to their services through limited, human-oriented interfaces. As a result, users are forced to manually mediate between these services, wasting time and effort.
To illustrate this situation, consider the following scenario. A patient receives an electronic prescription for a drug from a doctor. The patient compares prices at several nearby pharmacies and chooses the cheapest one to fill the prescription. He pays the pharmacy and receives an electronic invoice, which he sends to the insurance company for reimbursement with instructions to deposit the amount in his bank account. Eventually, he receives a confirmation from his bank. The entire computation is mediated by the patient, who acquires data from one source, does some minor computations, and sends data to other sources. The mediation is entirely manual.
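Here is a minimal sketch of the scenario above as an automated orchestration. Every service function (get_price, pay, submit_claim) is a hypothetical stand-in, not a real API; the point is only that the price queries can run concurrently while payment and reimbursement proceed in sequence.

```python
from concurrent.futures import ThreadPoolExecutor

PRICES = {"PharmacyA": 24.00, "PharmacyB": 19.50, "PharmacyC": 21.00}

def get_price(pharmacy):
    return pharmacy, PRICES[pharmacy]              # stub price query

def pay(pharmacy, amount):
    return "invoice:%s:%.2f" % (pharmacy, amount)  # stub payment

def submit_claim(invoice, account):
    return "deposit to %s confirmed for %s" % (account, invoice)  # stub

def fill_prescription(account):
    with ThreadPoolExecutor() as pool:             # concurrent queries
        quotes = list(pool.map(get_price, PRICES))
    pharmacy, price = min(quotes, key=lambda q: q[1])  # cheapest quote
    invoice = pay(pharmacy, price)                 # then pay ...
    return submit_claim(invoice, account)          # ... then claim

print(fill_prescription("checking-001"))
```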
This computing scenario is repeated millions of times a day in diverse areas such as business computing, e-commerce, health care, and logistics. In spite of the extraordinary advances in mobile computing, human participation is currently required in every major step of most applications. This is because the infrastructure for efficient mediation is largely absent, contributing to cost and delay in these applications. We believe that humans can largely be eliminated, or assigned a supporting role, in many applications. Doing so is not only beneficial in terms of efficiency, but also essential if we are to realize the full potential of the interconnectivity among machines, using the services and data available on the Internet or within a large organization. We would prefer that humans advise and report to the machines, rather than direct the machines in each step. Orc has proved successful in this effort because it has a small number of key concepts and a strong theoretical basis.
Biography:
Jayadev Misra is a professor and holder of the Schlumberger Centennial Chair in Computer Science at the University of Texas at Austin. He is known for his work in the area of concurrent programming, with emphasis on rigorous methods to improve the programming process. His work on the UNITY methodology, jointly with Chandy, has been influential in both academia and industry, and has spawned a large number of tools and research projects. He has recently developed a programming language, called "Orc", for concurrent orchestrations of interacting components. He is also spearheading an effort, jointly with Tony Hoare, to automate large-scale program verification. Misra is a fellow of the ACM and IEEE. He held a Guggenheim fellowship during 1988-1989, was the Strachey lecturer at Oxford University in 1996, and held the Belgian FNRS International Chair of Computer Science in 1990. He is a member of the Academy of Distinguished Teachers at the University of Texas at Austin. Misra is a past editor of several journals, including Computing Surveys, Journal of the ACM, Information Processing Letters, and Formal Aspects of Computing. He is the author of two books: Parallel Program Design: A Foundation (Addison-Wesley, 1988), co-authored with Mani Chandy, and A Discipline of Multiprogramming (Springer-Verlag, 2001).
Presented as part of the 2012 HCSS conference.
Abstract:
Network and embedded security devices, including cross-domain solutions, network guards, and message routers, have information flow policies that are complex and crucial to fulfilling device requirements. To be used in military and national infrastructure systems, such devices must often be certified to very stringent criteria such as Common Criteria EAL 6/7 and DCID 6/3. Past certification efforts have created models of software code in theorem provers and proved that the models of the code comply with security policies, leaving "trust gaps" between source code and models and inhibiting developers from proceeding with verification during the development process.
To remove these trust gaps and to make information flow specification and checking more accessible to developers, it is desirable to have frameworks that support specification of information flow policies directly at the source code level as part of developer workflows. SPARK, a subset of Ada for engineering safety- and security-critical systems, is one of the best commercially available frameworks for formal-methods-supported development of critical software. SPARK provides a source-level information flow specification and checking framework that has been used to support certification arguments on several different information assurance projects.
Our experience with SPARK is derived, in part, from our collaborations with Rockwell Collins, where SPARK is used in security-critical projects including the Janus high-speed cryptography engine and several other embedded information assurance devices. While SPARK provides information flow annotations and associated automated checking mechanisms, industrial experience has revealed that these annotations are not precise enough to specify many desired information flow policies. One key problem is that information assurance applications such as cross-domain solutions often contain information flow policies that are conditional, in the sense that data is allowed to flow between system components only when the system satisfies certain state predicates. However, existing information flow specification and verification environments, including SPARK, can only capture unconditional information flows. Another limitation of the SPARK information flow framework is that arrays are treated as indivisible entities: flows that involve only particular locations of an array have to be abstracted into flows on the whole array. This has substantial practical impact, since SPARK does not allow dynamic allocation of memory and hence makes heavy use of arrays to implement complex data structures.
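The following toy example (in Python, with invented names) illustrates both limitations. The flow from in_buf to out_buf happens only when a mode predicate holds, and only for a single array cell; an unconditional, whole-variable contract (as with SPARK's derives annotations) must nevertheless declare that out_buf is derived from in_buf in every state.

```python
# Hypothetical guard component: conditional, per-element information flow.
def guard_step(mode, in_buf, out_buf, i):
    if mode == "DECLASSIFIED":    # conditional flow: only in this state
        out_buf[i] = in_buf[i]    # flow touches one cell, not the array

# In any other mode, no information flows at all:
low = [0, 0, 0]
guard_step("RESTRICTED", [9, 9, 9], low, 1)
print(low)  # [0, 0, 0] -- unchanged, yet a coarse unconditional
            # contract would still report a flow from in_buf to out_buf
```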
In this talk, we will describe our experience in developing and applying a Hoare logic for information flow that enables precise compositional specification of conditional information flow policies in programs with arrays, and automated deduction algorithms for checking and inferring contracts in an enhanced SPARK information flow contract language that hides the details of the logic from the developer. To provide very high degrees of confidence, our algorithm for verifying code compliance to an information flow contract emits formal certificates of correctness that are checked in the Coq proof assistant. We demonstrate our techniques on a collection of simple information assurance examples.
Although the framework currently targets SPARK, the techniques can be applied to other languages used for high-assurance applications, such as safety-critical subsets of C.
Biography:
Dr. John Hatcliff is a University Distinguished Professor at Kansas State University in the Computing and Information Sciences Department. His research areas include software engineering, formal methods, static analysis, security, software certification, and medical device development and integration. He has been a principal investigator on a number of DoD-funded projects from agencies such as the Air Force Office of Scientific Research, Army Research Office, and DARPA. His research has also been supported by the National Science Foundation, Lockheed Martin, Rockwell Collins, Intel, and IBM.
Presented as part of the 2012 HCSS conference.
Biography:
Ms. Dawn C. Meyerriecks currently serves as the Assistant Director of National Intelligence for Acquisition, Technology & Facilities. In this role, she explores and delivers complex technologies underpinning national missions.
Prior to this assignment, she provided senior business and technology consulting direction to government and commercial clients. In addition to consulting, she served on a number of government and commercial advisory boards, including the STRATCOM C2 Advisory Group, the Defense Science Board, the NCTC Advisory Board, the National Academy of Sciences, the Unisys Federal Advisory Board, and the SunFed Advisory Board.
From 2004 to 2006, Ms. Meyerriecks served as the Senior Vice President for Product Technology at AOL. While at AOL, she was responsible for full lifecycle development and integration of all consumer-facing AOL products and services, including the re-launch of aol.com, AOL Instant Messenger, and the open client platform.
Prior to AOL, Ms. Meyerriecks worked for nearly ten years at the Defense Information Systems Agency (DISA), where she was the Chief Technology Officer and Technical Director for the Joint Interoperability and Engineering Organization (JIEO). Her last assignment was to charter and lead a new Global Information Grid (GIG) Enterprise Services organization. Ms. Meyerriecks worked at the Jet Propulsion Laboratory (JPL) as a senior engineer and product manager before her tenure at DISA.
In addition to being named the Government Computer News Department of Defense Person of the Year for 2004, Ms. Meyerriecks has been honored with numerous other awards, including InfoWorld CTO of the Year for 2002; the Federal Computer Week 2000 Top 100; InfoWorld CTO of the Year for the government sector in 2001; the Presidential Distinguished Service Award, November 2001; Senior Executive Service Exceptional Achievement Awards in 1998, 1999, and 2000; and recognition from the National Performance Review in August 1996. In November 2001, she was featured in Fortune magazine as one of the top 100 intellectual leaders in the world.
Ms. Meyerriecks earned a Bachelor of Science degree in Electrical Engineering with a double major in Business from Carnegie Mellon University, and a Master of Science in Computer Science from Loyola Marymount University.
Presented as part of the 2012 HCSS conference.
Abstract:
"Invent a better mousetrap and the world will beat a path to your door? Not!"
Great ideas and even great research do not sell themselves. Someone has to convince an organization to turn an idea into a workable application and then convince many people to use it. In a university or research lab, this process is called tech transfer, and it is rarely done well.
From its founding, Microsoft Research has placed equal weight on developing new ideas and seeing them put to use. We have not always been successful, but our research innovations appear in all of Microsoft's products. In this talk, I will share my experience building software development tools for Microsoft's use and my observations on how tech transfer can succeed.
Biography:
James Larus is a Principal Researcher in Microsoft Research. Larus has been an active contributor to the programming languages, compiler, software engineering, and computer architecture communities. He has published many papers, received 26 US patents, and served on numerous program committees and NSF and NRC panels. His book, Transactional Memory (Morgan & Claypool), appeared in 2007. Larus became an ACM Fellow in 2006.
Larus joined Microsoft Research as a Senior Researcher in 1998 to start and lead the Software Productivity Tools (SPT) group, which developed and applied a variety of innovative program analysis techniques to construct new tools for finding software defects. The group's research had considerable impact on the research community (receiving the 2011 SIGPLAN Most Influential Paper award and the 2011 CAV Award) and shipped in Microsoft products such as the Static Driver Verifier, FxCop, and other widely used internal software development tools. Larus became an MSR Research Area Manager for programming languages and tools and started the Singularity research project, which demonstrated how modern programming languages and software engineering techniques can fundamentally improve software architectures. Subsequently, he helped start XCG, an effort in MSR to develop hardware and software support for cloud computing. In XCG, Larus led groups developing the Orleans framework for cloud programming, as well as computer hardware projects.
Before joining Microsoft, Larus was an Assistant and Associate Professor of Computer Science at the University of Wisconsin-Madison, where he published approximately 60 research papers and co-led the Wisconsin Wind Tunnel (WWT) research project with Professors Mark Hill and David Wood. WWT was a DARPA- and NSF-funded project that investigated new approaches to simulating, building, and programming parallel shared-memory computers. Larus's research spanned a number of areas, including new and efficient techniques for measuring and recording executing programs' behavior, tools for analyzing and manipulating compiled and linked programs, programming languages for parallel computing, tools for verifying program correctness, and techniques for compiler analysis and optimization.
Larus received his MS and PhD in Computer Science from the University of California, Berkeley in 1989, and an AB in Applied Mathematics from Harvard in 1980. At Berkeley, Larus developed one of the first systems to analyze Lisp programs and determine how to best execute them on a parallel computer.
Presented as part of the 2012 HCSS conference.
Abstract:
This talk is intended to give a general overview of GTRI's VehicleForge.mil framework and how it was designed to meet the unique needs of the open hardware design community. Specifically, it will start with a summary of the VehicleForge.mil vision and outline several of the key challenges associated with the collaborative design of hardware. From these needs and challenges, the VehicleForge architecture will be summarized, with an emphasis on those elements that support the needs of a distributed, multi-domain community. The talk will conclude with an overview of our VehicleForge design process and how we use VehicleForge to design and build VehicleForge.
Biography:
Dr. Jack Zentner is a Senior Research Engineer at the Electronic Systems Laboratory (ELSYS) in the Georgia Tech Research Institute (GTRI). His primary area of research is the development of probabilistic design methods and design tools for large-scale, complex systems. In relation to design tool development, he is the Principal Investigator and Lead Systems Engineer for the DARPA VehicleForge.mil project. Over the past several years, he has also been developing and applying strategic decision-making methods for a range of government and industry customers to help them map their high-level goals and objectives to a prioritized set of actionable alternatives. In addition to conducting research, Dr. Zentner is a course developer and lecturer for Georgia Tech's Professional Masters in Applied Systems Engineering, and teaches several professional development courses in systems engineering as part of Georgia Tech's Defense Technology Certificate Program. These courses include Advanced Problem Solving Methods and The Fundamentals of Modern Systems Engineering. He earned a B.S. in Mechanical Engineering from Colorado State University in 1999, and an M.S. and Ph.D. in Aerospace Engineering from the Georgia Institute of Technology in 2001 and 2006, respectively.
Slide presentation not yet available.