Changeset 3113 for Deliverables
Timestamp: Apr 9, 2013, 7:00:25 PM
Deliverables/D6.3/report.tex
\section{Middle and Long Term Improvements}
In order to identify the middle and long term improvements, we briefly review
here the premises and goals of the CerCo approach to resource analysis.
\begin{itemize}
\item There is a lot of recent and renewed activity in the formal methods
community to accommodate resource analysis using techniques derived from
functional analysis (type systems, logics, abstract interpretation,
amortized analysis applied to data structures, etc.).
\item Most of this work, which currently remains at the theoretical level,
is focused on high level languages and assumes the existence of correct
and compositional resource cost models.
\item High level languages are compiled to object code by compilers that
should respect the functional properties of the program. However, because
of optimizations and the inherently non compositional nature of compilation,
compilers do not respect compositional cost models that are imposed a priori
on the source language. By controlling the compiler and coupling it with a
WCET analyser, it is possible to choose the cost model in such a way that
the cost bounds are high enough to bound the cost of every piece of produced
code. This was attempted, for instance, in the EmBounded project with good
success. However, we believe that bounds obtained in this way have little
chance of being tight.
\item Therefore our approach consists in having the compiler generate the
cost model for the user by combining the tracking of basic blocks during
code transformations with a static resource analysis on the object code for
basic blocks.
We formally prove that the compiler respects the cost model induced on the
source level, based on very few assumptions: basically, the cost of a
sequence of instructions should be associative and commutative, and it
should not depend on the machine status, except for its program counter.
Commutativity can be relaxed at the price of introducing more cost updates
in the instrumented source code.
\item The cost model for basic blocks induced on the source language must
then be exploited to derive cost invariants and to prove them automatically.
In CerCo we have shown how even simple invariant generation techniques are
sufficient to enable the fully automatic proving of parametric WCET bounds
for simple C programs and for Lustre programs of arbitrary complexity.
\end{itemize}

Compared to traditional WCET techniques, our approach currently has many
similarities, some advantages and some limitations. Both techniques need to
perform data flow analysis on the control flow graph of the program, and
both techniques need to estimate the cost of basic blocks of instructions.

\subsection{Control flow analysis}

The first main difference is in the control flow analysis. Traditional WCET
starts from the object code and reconstructs the control flow graph from it.
Moreover, abstract interpretation is heavily employed to bound the number of
executions of loops. In order to improve the accuracy of the estimation,
control flow constraints are provided by the user, usually as systems of
(linear) inequalities. In order to do this, the user, helped by the system,
needs to relate the object code control flow graph with the source one,
because it is on the latter that the bounds can be figured out and
understood. This operation is untrusted and potentially error prone for
complex optimizations (like aggressive loop optimizations).
Efficient tools from linear programming are then used to solve the systems
of inequalities obtained from the abstract interpreter and from the user
constraints.

In CerCo, instead, we assume full control of the compiler, which is able to
relate, even in non trivial ways, the object code control flow graph to the
source code control flow graph. A clear disadvantage is the impossibility of
applying the tool to object code produced by third party compilers. On the
other hand, we eliminate the possibility of errors in the reconstruction of
the control flow graph and in the translation of high level constraints into
low level constraints. The second potentially important advantage is that,
once we are dealing with the source language, we can increase the precision
of our dataflow analysis by combining functional and non functional
invariants. This is what we attempted with the CerCo Cost Annotating
Frama-C plugin. The Frama-C architecture allows several plugins to perform
all kinds of static analysis on the source code, reusing results from other
plugins and augmenting the source code with their results. The techniques
are by no means limited to linear algebra and abstract interpretation, and
the most important plugins call domain specific and general purpose
automated theorem provers to close proof obligations of arbitrary shape and
complexity.

In principle, the extended flexibility of the analysis should give our
technique a major advantage in terms of precision, also considering that all
the analyses used in traditional WCET can still be implemented as plugins.
In particular, the targets we have in mind are systems that are both (hard)
real time and safety critical. Being safety critical, we can already expect
them to be fully or partially specified at the functional level.
Therefore we expect that the additional functional invariants should allow
us to increase the precision of the cost bounds, up to the point where the
parametric cost bound is fully precise. In practice, we have not yet had the
time to perform extensive comparisons on the kind of code used by industry
in production systems. The first middle term improvement of CerCo would then
consist in this kind of analysis, to support or disprove our expectations.
If our technique remains promising, the next long term improvement would
consist in integrating into the Frama-C plugin ad hoc analyses and
combinations of analyses that would improve the coverage and efficiency of
the cost estimation techniques.

\subsection{Static analysis of basic blocks}

\end{document}