Timestamp: Feb 20, 2014, 12:38:06 PM
Files: 1 edited
Legend: unmodified lines are unprefixed; added lines are prefixed with '+'; removed lines with '-'.
Papers/fopara2013/fopara13.tex
r3452 → r3453

@@ -98,5 +98,5 @@
 the development of a technique for analysing non-functional
 properties of programs (time, space) at the source level, with little or no loss of accuracy,
-and with a small trusted code base. We developed a certified compiler that translates C source code to object code, producing a copy of the source code instrumented with cost annotations as an additional side-effect. These instrumentations expose at
+and with a small trusted code base. We developed a certified compiler that translates C source code to object code, producing a copy of the source code embellished with cost annotations as an additional side-effect. These annotations expose at
 the source level---and track precisely---the actual (non-asymptotic)
 computational cost of the input program. Untrusted invariant generators and trusted theorem provers

@@ -124,12 +124,11 @@
 but these analyses may then need information
 about the high-level functional behaviour of the program that must then be reconstructed.
-
-The analysis of non-functional constraints on low-level object code presents several problems:
-\begin{enumerate}
+This analysis on low-level object code has several problems:
+\begin{itemize*}
 \item
-It can be hard to deduce the high-level structure of the program in the presence of compiler optimisations.
+It can be hard to deduce the high-level structure of the program after compiler optimisations.
 The object code produced by an optimising compiler may have radically different control flow to the original source code program.
 \item
-Techniques that operate on object code are not useful early in the development cycle of a program, yet problems with a program's design or implementation are cheaper to resolve earlier rather than later.
+Techniques that operate on object code are not useful early in the development process of a program, yet problems with a program's design or implementation are cheaper to resolve earlier in the process, rather than later.
 \item
 Parametric cost analysis is very hard: how can we reflect a cost that depends on the execution state, for example the

@@ -138,45 +137,40 @@
 \item
 Performing functional analyses on object code makes it hard for the programmer to provide information about the program and its expected execution, leading to a loss of precision in the resulting analyses.
-\end{enumerate}
-
+\end{itemize*}
 \paragraph{Vision and approach.}
-We want to reconcile functional and
+We wish to reconcile functional and
 non-functional analyses: to share information and perform both at the same time
-on source code.
+on high-level source code.
 %
 What has previously prevented this approach is the lack of a uniform and precise
-cost model for high-level code: 1) each statement occurrence is compiled
-differently and optimisations may change control flow; 2) the cost of an object
+cost model for high-level code as each statement occurrence is compiled
+differently, optimisations may change control flow, and the cost of an object
 code instruction may depend on the runtime state of hardware components like
 pipelines and caches, all of which are not visible in the source code.
 
-To solve the issue, we envision a new generation of compilers able to keep track
-of program structure through compilation and optimisation, and to exploit that
-information to define a cost model for source code that is precise, non-uniform,
-and which accounts for runtime state. With such a source-level cost model we can
+We envision a new generation of compilers that track program structure through compilation and optimisation and exploit this
+information to define a precise, non-uniform cost model for source code that accounts for runtime state. With such a cost model we can
 reduce non-functional verification to the functional case and exploit the state
 of the art in automated high-level verification~\cite{survey}. The techniques
 currently used by the Worst Case Execution Time (WCET) community, who perform analyses on object code,
-are still available but can now be coupled with an additional source-level
-analysis. Where the approach produces precise cost models too complex to reason
-about, safe approximations can be used to trade complexity with precision.
-Finally, source code analysis can be used during the early stages of development, when
-components have been specified but not implemented: source code modularity means
-that it is enough to specify the non-functional behaviour of unimplemented
+are still available but can be coupled with additional source-level
+analyses. Where our approach produces overly complex cost models, safe approximations can be used to trade complexity with precision.
+Finally, source code analysis can be used early in the development process, when
+components have been specified but not implemented, as modularity means
+that it is enough to specify the non-functional behaviour of missing
 components.
-
 \paragraph{Contributions.}
-We have developed what we refer to as \emph{the labelling approach}\cite{labelling}, a
+We have developed \emph{the labelling approach}~\cite{labelling}, a
 technique to implement compilers that induce cost models on source programs by
 very lightweight tracking of code changes through compilation. We have studied
-how to formally prove the correctness of compilers implementing this technique.
-We have implemented such a compiler from C to object binaries for the MCS-51
-micro-controller for predicting execution time and stack space usage,
-and verified it in an interactive theorem prover. As we are targeting
-an embedded micro-controller we do not consider dynamic memory allocation.
+how to formally prove the correctness of compilers implementing this technique, and
+have implemented such a compiler from C to object binaries for the MCS-51
+microcontroller for predicting execution time and stack space usage,
+verifying it in an interactive theorem prover. As we are targeting
+an embedded microcontroller we do not consider dynamic memory allocation.
 
 To demonstrate source-level verification of costs we have implemented
-a Frama-C plugin
-program and uses this to generate invariants on the high-level source
+a Frama-C plugin~\cite{framac} that invokes the compiler on a source
+program and uses it to generate invariants on the high-level source
 that correctly model low-level costs. The plugin certifies that the
 program respects these costs by calling automated theorem provers, a

@@ -187,7 +181,5 @@
 
 \section{Project context and approach}
-Formal methods for the verification of functional properties of programs have
-now reached a level of maturity and automation that is facilitating a slow but
-increasing adoption in production environments. For safety critical code, it is
+Formal methods for the verification of functional properties of programs are now sufficiently mature to find increasing application in production environments. For safety critical code, it is
 becoming commonplace to combine rigorous software engineering methodologies and testing
 with static analysis, taking the strong points of each approach and mitigating
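The changeset above describes the cost-annotated source copy and the Frama-C plugin only in prose. As a rough illustrative sketch (not taken from the paper or from the actual tool chain; the identifiers __cost and __cost_incr and all cycle counts are invented for exposition), an instrumented C function together with the kind of source-level cost invariant the plugin is said to generate could look like this:

#include <stdio.h>

/* Illustrative sketch only: names and the concrete numbers are invented. */
unsigned long __cost = 0;                 /* global cycle counter updated by the
                                             instrumentation added by the compiler */
static void __cost_incr(unsigned long k) { __cost += k; }

/*@ requires n >= 0;
  @ ensures  __cost <= \old(__cost) + 48 + 43 * n;
  @*/
int sum_upto(int n)
{
  int i, s = 0;
  __cost_incr(23);                        /* cost of the prologue and loop set-up,
                                             reflecting the generated object code   */
  /*@ loop invariant __cost <= \at(__cost, Pre) + 23 + 43 * i; */
  for (i = 0; i < n; i++) {
    __cost_incr(43);                      /* cost of one loop iteration */
    s += i;
  }
  __cost_incr(25);                        /* cost of the loop exit and return */
  return s;
}

int main(void)
{
  int r = sum_upto(10);
  printf("result = %d, predicted cycles = %lu\n", r, __cost);
  return 0;
}

In the work the changeset describes, the increments would be derived from the cost model that the verified compiler computes for each block of the emitted object code, and the ACSL-style contract and loop invariant would be produced by the Frama-C plugin and discharged by automated theorem provers; the specific constants above are placeholders.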