# Changeset 3627

Timestamp:
Mar 6, 2017, 7:27:15 PM
Message:

Abstract tweaks, compression, removing redundancies and ugly turns of phrase, etc.

File:
1 edited

Intensional properties of programs---time and space usage, for example---are an important component of the specification of a program, and therefore of overall program correctness. Intensional properties can be analysed \emph{asymptotically} or \emph{concretely}, with the latter analyses computing resource bounds---in terms of clock cycles, bits transmitted, bytes allocated, or other basal units of resource consumption---for a program execution. For many application domains, for instance libraries exporting cryptographic primitives that must be hardened against timing side-channel attacks, concrete complexity analysis is arguably more important than asymptotic analysis. Traditional static analysis tools for resource analysis suffer from a number of disadvantages. They are sophisticated, complex pieces of software that must be incorporated into the trusted codebase of an application if their analysis is to be believed. They also reason on the machine code produced by a compiler, rather than at the level of the source code that the application programmer is familiar with and understands. More ideal would be a mechanism to `lift' a cost model from the machine code generated by a compiler back to the source-code level, where analyses could be performed in terms of the source code, abstractions, and control-flow constructs written and understood by the programmer.
How one could incorporate the precision of traditional static analyses into such a high-level approach, and how to do this reliably, is not \emph{a priori} clear. In this paper, we describe the scientific achievements of the European Union's FET-Open project CerCo (`Certified Complexity'). CerCo's main achievement is the development of a technique for analysing intensional properties of programs at the source level, with little or no loss of accuracy and a small trusted code base. The core component of the project is a C compiler, verified in the Matita theorem prover, that produces an instrumented copy of the source code in addition to generating object code. This instrumentation exposes, and tracks precisely, the actual (non-asymptotic) computational cost of the input program at the source level. Untrusted invariant generators and trusted theorem provers may then be used to compute and certify the parametric execution time of the code.