Changeset 3453 for Papers


Timestamp: Feb 20, 2014, 12:38:06 PM
Author: mulligan
Message: reworking up to Section 2

File: 1 edited

  • Papers/fopara2013/fopara13.tex

--- Papers/fopara2013/fopara13.tex (r3452)
+++ Papers/fopara2013/fopara13.tex (r3453)
@@ -98,5 +98,5 @@
 the development of a technique for analysing non-functional
 properties of programs (time, space) at the source level, with little or no loss of accuracy,
-and with a small trusted code base. We developed a certified compiler that translates C source code to object code, producing a copy of the source code instrumented with cost annotations as an additional side-effect. These instrumentations expose at
+and with a small trusted code base. We developed a certified compiler that translates C source code to object code, producing a copy of the source code embellished with cost annotations as an additional side-effect. These annotations expose at
 the source level---and track precisely---the actual (non-asymptotic)
 computational cost of the input program. Untrusted invariant generators and trusted theorem provers
@@ -124,12 +124,11 @@
 but these analyses may then need information
 about the high-level functional behaviour of the program that must then be reconstructed.
-
-The analysis of non-functional constraints on low-level object code presents several problems:
-\begin{enumerate}
+This analysis on low-level object code has several problems:
+\begin{itemize*}
 \item
-It can be hard to deduce the high-level structure of the program in the presence of compiler optimisations.
+It can be hard to deduce the high-level structure of the program after compiler optimisations.
 The object code produced by an optimising compiler may have radically different control flow to the original source code program.
 \item
-Techniques that operate on object code are not useful early in the development cycle of a program, yet problems with a program's design or implementation are cheaper to resolve earlier rather than later.
+Techniques that operate on object code are not useful early in the development process of a program, yet problems with a program's design or implementation are cheaper to resolve earlier in the process, rather than later.
 \item
 Parametric cost analysis is very hard: how can we reflect a cost that depends on the execution state, for example the
@@ -138,45 +137,40 @@
 \item
 Performing functional analyses on object code makes it hard for the programmer to provide information about the program and its expected execution, leading to a loss of precision in the resulting analyses.
-\end{enumerate}
-
+\end{itemize*}
 \paragraph{Vision and approach.}
- We want to reconcile functional and
+We wish to reconcile functional and
 non-functional analyses: to share information and perform both at the same time
-on source code.
+on high-level source code.
 %
 What has previously prevented this approach is the lack of a uniform and precise
-cost model for high-level code: 1) each statement occurrence is compiled
-differently and optimisations may change control flow; 2) the cost of an object
+cost model for high-level code as each statement occurrence is compiled
+differently, optimisations may change control flow, and the cost of an object
 code instruction may depend on the runtime state of hardware components like
 pipelines and caches, all of which are not visible in the source code.
 
-To solve the issue, we envision a new generation of compilers able to keep track
-of program structure through compilation and optimisation, and to exploit that
-information to define a cost model for source code that is precise, non-uniform,
-and which accounts for runtime state. With such a source-level cost model we can
+We envision a new generation of compilers that track program structure through compilation and optimisation and exploit this
+information to define a precise, non-uniform cost model for source code that accounts for runtime state. With such a cost model we can
 reduce non-functional verification to the functional case and exploit the state
 of the art in automated high-level verification~\cite{survey}. The techniques
 currently used by the Worst Case Execution Time (WCET) community, who perform analyses on object code,
-are still available but can now be coupled with an additional source-level
-analysis. Where the approach produces precise cost models too complex to reason
-about, safe approximations can be used to trade complexity with precision.
-Finally, source code analysis can be used during the early stages of development, when
-components have been specified but not implemented: source code modularity means
-that it is enough to specify the non-functional behaviour of unimplemented
+are still available but can be coupled with additional source-level
+analyses. Where our approach produces overly complex cost models, safe approximations can be used to trade complexity with precision.
+Finally, source code analysis can be used early in the development process, when
+components have been specified but not implemented, as modularity means
+that it is enough to specify the non-functional behaviour of missing
 components.
-
 \paragraph{Contributions.}
-We have developed what we refer to as \emph{the labelling approach} \cite{labelling}, a
+We have developed \emph{the labelling approach}~\cite{labelling}, a
 technique to implement compilers that induce cost models on source programs by
 very lightweight tracking of code changes through compilation. We have studied
-how to formally prove the correctness of compilers implementing this technique.
-We have implemented such a compiler from C to object binaries for the MCS-51
-micro-controller for predicting execution time and stack space usage,
-and verified it in an interactive theorem prover.  As we are targeting
-an embedded micro-controller we do not consider dynamic memory allocation.
+how to formally prove the correctness of compilers implementing this technique, and
+have implemented such a compiler from C to object binaries for the MCS-51
+microcontroller for predicting execution time and stack space usage,
+verifying it in an interactive theorem prover.  As we are targeting
+an embedded microcontroller we do not consider dynamic memory allocation.
 
 To demonstrate source-level verification of costs we have implemented
-a Frama-C plugin \cite{framac} that invokes the compiler on a source
-program and uses this to generate invariants on the high-level source
+a Frama-C plugin~\cite{framac} that invokes the compiler on a source
+program and uses it to generate invariants on the high-level source
 that correctly model low-level costs. The plugin certifies that the
 program respects these costs by calling automated theorem provers, a
@@ -187,7 +181,5 @@
 
 \section{Project context and approach}
-Formal methods for the verification of functional properties of programs have
-now reached a level of maturity and automation that is facilitating a slow but
-increasing adoption in production environments. For safety critical code, it is
+Formal methods for the verification of functional properties of programs are now sufficiently mature to find increasing application in production environments. For safety critical code, it is
 becoming commonplace to combine rigorous software engineering methodologies and testing
 with static analysis, taking the strong points of each approach and mitigating
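
For orientation, the cost-annotated source described in the text above can be pictured with a small C sketch. This is an assumption-laden illustration, not the changeset's or the compiler's actual output: the names __cost and __cost_incr and all cycle counts are hypothetical stand-ins for whatever instrumentation the certified compiler really emits, and the ACSL contract stands in for the kind of invariant the Frama-C plugin would generate.

/* Hypothetical instrumented copy of a source function.  The global
   counter and the increment helper model the emitted cost annotations. */
unsigned long __cost = 0;              /* running cycle counter (assumed name) */

void __cost_incr(unsigned long k)      /* increment helper (assumed name) */
{
    __cost = __cost + k;
}

/*@ ensures __cost <= \old(__cost) + 4 + 23 * n; */
unsigned char count_zeros(unsigned char *a, unsigned char n)
{
    unsigned char i, zeros = 0;
    __cost_incr(4);                    /* illustrative cost of the compiled prologue */
    for (i = 0; i < n; i++) {
        __cost_incr(23);               /* illustrative worst-case cost of one iteration */
        if (a[i] == 0)
            zeros++;
    }
    return zeros;
}

Discharging the ensures clause with an automated prover then reduces the non-functional question (how many cycles can this function cost?) to ordinary functional reasoning about the counter, which is the reduction the quoted text describes.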