Changeset 3631


Timestamp: Mar 7, 2017, 10:50:57 AM
Author: mulligan
Message: Rewrote abstract

File: 1 edited

Legend: unmodified lines carry both the r3627 and r3631 line numbers; removed lines carry only the r3627 line number and are marked with -; added lines carry only the r3631 line number and are marked with +.
  • Papers/jar-cerco-2017/cerco.tex

r3627   r3631
 68  68    the Seventh Framework Programme for Research of the European Commission, under
 69  69    FET-Open grant number: 243881}}
 70      -  \subtitle{}
     70  +  \subtitle{Certified resource analysis for a large fragment of C}
 71  71    \journalname{Journal of Automated Reasoning}
 72  72    \titlerunning{Certified Complexity}

 95  95
 96  96    \begin{abstract}
 97      -  Intensional properties of programs---time and space usage, for example---are an important component of the specification of a program, and therefore overall program correctness.
 98      -  Here, intensional properties can be analysed \emph{asymptotically}, or \emph{concretely}, with the latter analyses computing resource bounds in terms of clock cycles, bits transmitted, bytes allocated, or other basal units of resource consumption, for a program execution.
 99      -  For many application domains, for instance libraries exporting cryptographic primitives that must be impervious to side-channel attacks, concrete complexity analysis is arguably more important than asymptotic.
     97  +  Concrete non-functional properties of programs---for example, time and space usage as measured in basal units of measure such as milliseconds and bytes allocated---are important components of the specification of a program, and therefore overall program correctness.
     98  +  Indeed, for many application domains, concrete complexity analysis is arguably more important than any asymptotic complexity analysis.
     99  +  Libraries exporting cryptographic primitives that must be impervious to timing side-channel attacks, or real-time applications with hard timing limits on responsiveness, are examples.
100 100
101      -  Traditional static analysis tools for resource analysis suffer from a number of disadvantages.
102      -  They are sophisticated, complex pieces of software, that must be incorporated into the trusted codebase of an application if their analysis is to be believed.
103      -  They also reason on the machine code produced by a compiler, rather than at the level of the source-code that the application programmer is familiar with, and understands.
    101  +  Worst Case Execution Time tools, based on abstract interpretation, currently represent the state-of-the-art in determining concrete time bounds for a program execution.
    102  +  These tools suffer from a number of disadvantages, not least the fact that all analysis is performed on machine code, rather than high-level source code, making results hard to interpret by programmers.
    103  +  Further, these tools are often complex pieces of software, whose analysis is hard to trust.
104 104    More ideal would be a mechanism to `lift' a cost model from the machine code generated by a compiler, back to the source code level, where analyses could be performed in terms understood by the programmer.
105      -  How one could incorporate the precision of traditional static analyses into such a high-level approach, and how to do this reliably, is not \emph{a priori} clear.
    105  +  How one could incorporate the precision of traditional static analyses into such a high-level approach---and how this could be done reliably---is not \emph{a priori} clear.
106 106
107      -  In this paper, we describe the scientific achievements of the European Union's FET-Open Project CerCo (`Certified Complexity').
108      -  CerCo's main achievement is the development of a technique for analysing intensional properties of programs at the source level, with little or no loss of accuracy and a small trusted code base.
109      -  The core component of the project a C compiler, verified in the Matita theorem prover, that produces an instrumented copy of the source code in addition to generating object code.
110      -  This instrumentation exposes, and tracks precisely, the actual (non-asymptotic) computational cost of the input program at the source level.
    107  +  In this paper, we describe the scientific contributions of the European Union's FET-Open Project CerCo (`Certified Complexity').
    108  +  CerCo's main achievement is the development of a technique for analysing non-functional properties of programs at the source level, with little or no loss of accuracy, and a small trusted code base.
    109  +  The core component of the project is a compiler for a large fragment of the C programming language, verified in the Matita theorem prover, that produces an instrumented copy of the source code in addition to generating object code.
    110  +  This instrumentation exposes, and tracks precisely, the concrete (non-asymptotic) computational cost of the input program at the source level.
111 111    Untrusted invariant generators and trusted theorem provers may then be used to compute and certify the parametric execution time of the code.
112 112
113      -  We describe the architecture of our C compiler, its proof of correctness, the associated toolchain developed around the compiler, as well as a case study in applying this toolchain to the verification of concrete timing bounds on cryptographic code.
    113  +  We describe the architecture of our C compiler, its proof of correctness, and the associated toolchain developed around the compiler.
    114  +  Using our toolchain, we describe a case study in applying our technique to the verification of concrete timing bounds for cryptographic code.
114 115
115      -  \keywords{Verified compilation \and Complexity analysis \and CerCo (`Certified Complexity')}
    116  +  \keywords{Verified compilation \and Complexity analysis \and Worst Case Execution Time analysis \and CerCo (`Certified Complexity') \and Matita}
116 117    \end{abstract}
117 118
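As background for the abstract's central claim: the instrumented copy of the source code that the compiler emits tracks concrete costs via explicit counter updates at the source level. Below is a minimal sketch of what such instrumented output might look like; the __cost counter, the function names, and the cycle counts are illustrative assumptions for this note, not taken from the changeset or from CerCo's actual output.

    /* Original source function. */
    unsigned int sum(unsigned int *a, unsigned int n)
    {
        unsigned int i, s = 0;
        for (i = 0; i < n; i++)
            s += a[i];
        return s;
    }

    /* Hypothetical instrumented copy, emitted alongside the object code.
       __cost accumulates clock cycles; each increment is a constant
       derived from the machine code the compiler generated for the
       corresponding basic block. Names and constants are illustrative. */
    unsigned long __cost = 0;

    unsigned int sum_instrumented(unsigned int *a, unsigned int n)
    {
        unsigned int i, s = 0;
        __cost += 23;            /* prologue and loop initialisation */
        for (i = 0; i < n; i++) {
            __cost += 47;        /* one loop iteration, check included */
            s += a[i];
        }
        __cost += 11;            /* final loop test and epilogue */
        return s;
    }

On this sketch, an untrusted invariant generator could bound __cost by 34 + 47*n, and a theorem prover could then certify that bound against the instrumented program; this is one reading of the `parametric execution time' the abstract refers to.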