|Version 1 (modified by sacerdot, 7 years ago)|
In spite of the many recent, astonishing achievements in proof checking (proofs of the four colour theorem [Mackenzie?, Gonthier?] and of the prime number theorem [Avigad et al.?], the ongoing proof of the Kepler conjecture [Hales?], etc.) and in program verification (see the long list of results mentioned in Section 1.2?), the field of computer-assisted reasoning, with its long-term vision of a fully dependable, formally checked information society, is still one of the most visionary fields of computer science. The impressive results obtained in this area are due only in part to advances in the tools for assisted reasoning, and above all to a growing confidence in their potential.
Compilers, which fill the gap between humans and machines from the programming point of view, are one of the main tools at the base of information technology. We cannot hope to build truly trustworthy, dependable and long-lasting systems over shaky foundations: the semantics of compilation must be better understood. There is some history of work on certified compilers [Dave?], and a series of workshop meetings on the topic (Compiler Optimization Meets Compiler Verification, 2003-2009). Only very recently has end-to-end certification of a realistic compiler been attempted [Leroy09?, Leroy?, Leroy et al.?].
This previous work focuses on denotational aspects of computation. Our proposal is new in addressing intensional aspects, such as preserving the space or time bounds of the source in the compiled code. This is still perceived by the compiler community as a highly visionary goal, one that would constitute a major milestone in the area.
Our proposal contributes to the long-term community goal of formally checked, reliable computer systems both directly, through our original work on a certified complexity-preserving compiler, and more generally through large-scale use of state-of-the-art tools and techniques for software certification. Our proposal takes such technologies past the proof-of-concept phase to a maturity level that could allow substantial technology transfer to industry.
The direct, long-term impact of the project on real-time systems (in particular reactive systems) will be to dramatically increase trust in the response times of those systems, while at the same time simplifying the code by removing some of the checks for approaching deadlines. Moreover, we envision a future where compilers (certified or not) will give back to the user guarantees and information on the intensional behaviour of the produced binaries (such as time, space and power consumption), to be exploited by semi-automated reasoning tools.
The certified compiler developed in the project is only a first step in this direction. Since it is the first example of a compiler that provides intensional guarantees, we do not expect to be able to exploit the provided cost annotations effectively right away. This requires an effective convergence of three communities: compiler implementation, proof assistance, and invariant generation. Moreover, further work is required to target processors for non-embedded systems (which implement many hardware optimizations) and to enlarge the class of inter-procedural and loop optimizations that the compiler can handle while tracking intensional properties. Together with the need to better understand the issues raised by multiple backends for different architectures, the time frame required to bring this technology to the mainstream can be estimated at a decade.
It would certainly be interesting to address advanced features of general-purpose processors, such as cache memory. However, we believe that such a task is not feasible within the time and resources allocated to our project.
Finally, we recall that Task 5.2 is dedicated to tools and techniques for automatic/computer-assisted inference of the global execution cost of C programs.