Changeset 3113


Timestamp:
Apr 9, 2013, 7:00:25 PM
Author:
sacerdot
Message:

Some work on control flow analysis.

File:
1 edited

  • Deliverables/D6.3/report.tex

r3111 r3113

\section{Middle and Long Term Improvements}
In order to identify the middle and long term improvements, we briefly review
here the premises and goals of the CerCo approach to resource analysis.
\begin{itemize}
 \item There is a lot of recent and renewed activity in the formal methods
   community to accommodate resource analysis using techniques derived from
   functional analysis (type systems, logics, abstract interpretation,
   amortized analysis applied to data structures, etc.).
 \item Most of this work, which currently remains at a theoretical level,
   is focused on high level languages and assumes the existence of correct
   and compositional resource cost models.
 \item High level languages are compiled to object code by compilers that
   should respect the functional properties of the program. However, because
   of optimizations and the inherently non-compositional nature of compilation,
   compilers do not respect compositional cost models that are imposed a priori
   on the source language. By controlling the compiler and coupling it with a
   WCET analyser, it is actually possible to choose the cost model in such a
   way that the cost bounds are high enough to bound the cost of every piece
   of produced code. This was attempted, for instance, in the EmBounded
   project with good results. However, we believe that bounds obtained in this
   way are unlikely to be tight.
 \item Therefore our approach consists in having the compiler generate the
   cost model for the user by combining the tracking of basic blocks during
   code transformations with a static resource analysis on the object code
   for basic blocks. We formally prove that the compiler respects the cost
   model induced on the source level, based on very few assumptions:
   essentially, the cost of a sequence of instructions should be obtained by
   composing the costs of the individual instructions in an associative and
   commutative way, and it should not depend on the machine state, except for
   the program counter. Commutativity can be relaxed at the price of
   introducing more cost updates in the instrumented source code (a sketch of
   such an instrumented program is shown after this list).
 \item The cost model for basic blocks induced on the source language must then
   be exploited to derive cost invariants and to prove them automatically.
   In CerCo we have shown that even simple invariant generation techniques
   are sufficient to enable the fully automatic proving of parametric WCET
   bounds for simple C programs and for Lustre programs of arbitrary complexity.
\end{itemize}
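
To make the last two points concrete, the following minimal sketch shows the
shape of instrumented source code we have in mind. The identifier
\texttt{\_\_cost} and the cycle counts are purely illustrative and are not the
output of an actual compilation:

\begin{verbatim}
int __cost = 0;              /* global clock, in clock cycles            */

int sum(int *a, int n) {
  int i, s = 0;
  __cost += 23;              /* cost of the prologue and first block     */
  for (i = 0; i < n; i++) {
    __cost += 47;            /* cost of one loop iteration, as computed
                                on the corresponding object code blocks  */
    s += a[i];
  }
  __cost += 15;              /* cost of the loop exit and epilogue       */
  return s;
}
\end{verbatim}

On such a program, even a simple invariant generator can establish that on
exit \texttt{\_\_cost} has increased by at most $38 + 47n$ (with the
illustrative constants above), i.e.\ a parametric WCET bound.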

Compared to traditional WCET techniques, our approach currently has many
similarities, some advantages and some limitations. Both techniques need to
perform data flow analysis on the control flow graph of the program and both
techniques need to estimate the cost of basic blocks of instructions.

\subsection{Control flow analysis}

The first main difference is in the control flow analysis. Traditional WCET
analysis starts from the object code and reconstructs the control flow graph
from it. Moreover, abstract interpretation is heavily employed to bound the
number of loop iterations. In order to improve the accuracy of the estimation,
control flow constraints are provided by the user, usually as systems of
(linear) inequalities. In order to do this, the user, helped by the system,
needs to relate the object code control flow graph with the source level one,
because it is on the latter that the bounds can be worked out and understood.
This operation is untrusted and potentially error prone for complex
optimizations (like aggressive loop optimizations). Efficient tools for
(integer) linear programming are then used to solve the systems of
inequalities obtained from the abstract interpreter and from the user
constraints.
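
As a textbook-style illustration (not tied to any specific tool), the implicit
path enumeration technique reduces the estimation to an integer linear
program over the control flow graph: if $x_i$ counts the executions of basic
block $i$, $c_i$ is its cost and $y_e$ counts the traversals of edge $e$, one
maximises
\[
\sum_i c_i\, x_i
\qquad\mbox{subject to}\qquad
x_i \;=\; \sum_{e \in \mathrm{in}(i)} y_e \;=\; \sum_{e \in \mathrm{out}(i)} y_e ,
\]
together with user-provided constraints such as
$x_{\mathrm{body}} \leq N \cdot x_{\mathrm{header}}$ for a loop whose body is
known to execute at most $N$ times per entry of the loop header.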

In CerCo, instead, we assume full control over the compiler, which is able to
relate, even in non-trivial ways, the object code control flow graph to the
source code control flow graph. A clear disadvantage is the impossibility of
applying the tool to the object code produced by third-party compilers. On
the other hand, we eliminate the possibility of errors in the reconstruction
of the control flow graph and in the translation of high level constraints
into low level constraints. The second potentially important advantage is
that, once we are dealing with the source language, we can increase the
precision of our dataflow analysis by combining functional and non-functional
invariants. This is what we attempted with the CerCo Cost Annotating Frama-C
Plug-In. The Frama-C architecture allows several plug-ins to perform all
kinds of static analysis on the source code, reusing results from other
plug-ins and augmenting the source code with their results. The techniques
are by no means limited to linear algebra and abstract interpretation, and
the most important plug-ins call domain specific and general purpose
automated theorem provers to close proof obligations of arbitrary shape and
complexity.
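
The following fragment gives an idea, in ACSL-like syntax, of how a functional
precondition can sharpen a cost bound; the contract, the \texttt{\_\_cost}
variable and the constants are again illustrative and do not reproduce the
actual output of the plug-in:

\begin{verbatim}
/* Illustrative only: the bound 38 + 47*n comes from the instrumented
   code sketched above, and the functional precondition n <= 100 lets
   a prover also derive the concrete bound 4738 = 38 + 47*100.        */
/*@ requires 0 <= n <= 100;
  @ ensures __cost <= \old(__cost) + 38 + 47 * n;
  @ ensures __cost <= \old(__cost) + 4738;
  @*/
int sum(int *a, int n);
\end{verbatim}

Proof obligations of this shape can then be discharged by the automated
provers mentioned above.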

In principle, the extended flexibility of the analysis should give our
technique a major advantage in terms of precision, also considering that all
the analyses used in traditional WCET can still be implemented as plug-ins.
In particular, the targets we have in mind are systems that are both (hard)
real time and safety critical. Being safety critical, we can already expect
them to be fully or partially specified at the functional level. Therefore we
expect the additional functional invariants to make it possible to increase
the precision of the cost bounds, up to the point where the parametric cost
bound is fully precise. In practice, we have not had the time to perform
extensive comparisons on the kind of code used by industry in production
systems. The first middle term improvement of CerCo would then consist in
this kind of analysis, to support or disprove our expectations. Should our
technique remain promising, the next long term improvement would consist in
integrating into the Frama-C plug-in ad-hoc analyses, and combinations of
analyses, that would increase the coverage and the efficiency of the cost
estimation techniques.

\subsection{Static analysis of basic blocks}

\end{document}