Changeset 3611


Timestamp:
Mar 6, 2017, 2:23:47 PM
Author:
boender
Message:

Rewrote introduction

File:
1 edited

  • Papers/jar-cerco-2017/cerco.tex

    r3610 → r3611

      155 155   
      156 156   %\paragraph{Problem statement.}
    - 157       Programs can be specified with both
    - 158       functional constraints (what the program must do) and non-functional constraints (what time, space or other resources the program may use).  In the current
    +     157   Programs can be specified using both
    +     158   functional constraints (what the program must do) and non-functional constraints (what time, space or other resources the program can use).  At the current
      159 159   state of the art, functional properties are verified
      160 160   by combining user annotations---preconditions, invariants, and so on---with a
      …
      162 162   interpretation, theorem proving, and so on---on the program's high-level source code.
      163 163   By contrast, many non-functional properties
    - 164       are verified using analyses on low-level object code,
    +     164   are verified by analysing the low-level object code.
      165 165   %\footnote{A notable
      166 166   %  exception is the explicit allocation of heap space in languages like C, which
      167 167   %  can be handled at the source level.}
    - 168       but these analyses may then need information
    - 169       about the high-level functional behaviour of the program that must then be reconstructed.
    - 170       This analysis on low-level object code has several problems:
    +     168   
    +     169   Analysis on low-level object code has several problems, however:
      171 170   \begin{itemize*}
      172 171   \item
      …
      180 179   looking at the source code?
      181 180   \item
    - 182       Performing functional analyses on object code makes it hard for the programmer to provide information about the program and its expected execution, leading to a loss of precision in the resulting analyses.
    +     181   Performing functional analysis on object code makes it hard for the programmer to provide information about the program and its expected execution, leading to a loss of precision in the resulting analysis.
      183 182   \end{itemize*}
    +     183   
      184 184   \paragraph{Vision and approach.}
      185 185   We want to reconcile functional and
    - 186       non-functional analyses: to share information and perform both at the same time
    - 187       on high-level source code.
    +     186   non-functional analysis: to share information between them and perform both at
    +     187   the same time on high-level source code.
      188 188   %
    - 189       What has previously prevented this approach is the lack of a uniform and precise
    - 190       cost model for high-level code as each statement occurrence is compiled
    +     189   What has previously prevented this approach from being applied is the lack of a
    +     190   uniform and precise cost model for high-level code, since each statement
    +     191   occurrence is compiled
      191 192   differently, optimisations may change control flow, and the cost of an object
      192 193   code instruction may depend on the runtime state of hardware components like
    - 193       pipelines and caches, all of which are not visible in the source code.
    +     194   pipelines and caches, none of which are visible in the source code.
      194 195   
      195 196   We envision a new generation of compilers that track program structure through compilation and optimisation and exploit this
      …
      197 198   reduce non-functional verification to the functional case and exploit the state
      198 199   of the art in automated high-level verification~\cite{survey}. The techniques
    - 199       currently used by the Worst Case Execution Time (WCET) community, who perform analyses on object code,
    - 200       are still available but can be coupled with additional source-level
    - 201       analyses. Where our approach produces overly complex cost models, safe approximations can be used to trade complexity with precision.
    - 202       Finally, source code analysis can be used early in the development process, when
    - 203       components have been specified but not implemented, as modularity means
    +     200   currently used by the Worst Case Execution Time (WCET) community, which perform
    +     201   analysis on object code, are still available but can be coupled with additional source-level analysis. In cases where our approach produces overly complex cost
    +     202   models, safe approximations can be used to trade complexity for precision.
    +     203   Finally, source code analysis can be used early in the development process,
    +     204   when components have been specified but not implemented, as modularity means
      204 205   that it is enough to specify the non-functional behaviour of missing
      205 206   components.
    +     207   
      206 208   \paragraph{Contributions.}
      207 209   We have developed \emph{the labelling approach}~\cite{labelling}, a
      …
      210 212   how to formally prove the correctness of compilers implementing this technique, and
      211 213   have implemented such a compiler from C to object binaries for the 8051
    - 212       microcontroller for predicting execution time and stack space usage,
    +     214   microcontroller that predicts execution time and stack space usage,
      213 215   verifying it in an interactive theorem prover.  As we are targeting
      214 216   an embedded microcontroller we do not consider dynamic memory allocation.