Changeset 509


Timestamp:
Feb 14, 2011, 3:27:57 PM
Author:
sacerdot
Message:

Background.

File:
1 edited

  • Deliverables/D4.1/ITP-Paper/itp-2011.tex

    r508 r509  
    6363% SECTION                                                                      %
    6464%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
    65 \section{Introduction}
     65\section{Background}
    6666\label{sect.introduction}
    6767
    68 Compiler verification, as of late, is as of late a `hot topic'.
    69 This rapidly growing field is motivated by one simple question: `to what extent can you trust your compiler?'
    70 Existing verification efforts have broadly focussed on \emph{semantic correctness}, that is, creating a compiler that is guaranteed to preserve the semantics of a program during the compilation process.
    71 However, there is another important facet of correctness that has not garnered much attention, that is, correctness with respect to some intensional properties of the program to be compiled.
     68Formal methods aim to increase our confidence in our software and
     69hardware artifacts. Ideally, in the future we would like every artifact to be
     70equipped with a formal specification and a proof of correctness of the
     71implementation. Nowadays practically all programs are written in high-level
     72languages and then compiled into low-level ones. Thus the specifications are
     73also given at the high level, and correctness is proved by reasoning,
     74automatically or interactively, on the source code. The code that is actually
     75run, however, is the object code generated by the compiler.
     76
     77A few simple questions then arise:
     78What properties are preserved during compilation?
     79What properties are affected by the compilation strategy?
     80To what extent can you trust your compiler to preserve those properties?
     81
     82Compiler verification has lately become a `hot topic' in computer science
     83research. So far, however, it has focused on the first and last questions only.
     84In particular, attention has been restricted to extensional properties,
     85which are easily preserved during compilation:
     86it suffices to completely preserve the denotational semantics of the
     87input program.
     88
     89The situation is considerably more complex when we also take into account
     90intensional properties of programs, such as the space, time, or energy spent on
     91computation and transmission of data. To express these properties and to
     92reason about them we are forced to adopt a cost model that assigns a cost to
     93single instructions or to blocks of instructions. Ideally, we would like to
     94have a compositional cost model that assigns the same cost to all occurrences
     95of an instruction. However, compiler optimizations are inherently
     96non-compositional: each occurrence of a high-level instruction is usually
     97compiled differently, depending on the surrounding instructions. Hence the cost
     98model is affected by compilation, and thus so are all intensional
     99specifications.
     100
     101In the EU project CerCo (Certified Complexity) we approach the problem
     102by developing a compiler that induces the cost model on the source code,
     103assigning to each block of high-level instructions the cost associated with
     104the object code obtained from it. The cost model is thus inherently
     105non-compositional, but it can be extremely \emph{precise}, capturing the
     106realistic cost. In particular, we already have a prototype in which the cost
     107is computed with no approximation at all. The natural applications of our
     108approach are therefore in the domain of hard real-time programs, where the user
     109can certify that all deadlines are met while fully exploiting the available
     110computation time, which would otherwise be wasted by over-estimation of the costs.
     111
     112Other applications of our approach lie in the domain of compiler verification
     113itself. For instance, an extensional specification of an optimization is of
     114little use, since it guarantees preservation of the semantics without stating
     115that the cost (in space or time) of the optimized code should be lower. Another
     116example is the completeness and correctness of the compilation process in the
     117presence of space constraints: the compiler could refuse a source
     118program for an embedded system when the size of the compiled code exceeds the
     119available ROM size. Moreover, preservation of the semantics need only be
     120required for those programs that do not exhaust their stack or heap space.
     121Hence the statement of completeness of the compiler must take the realistic
     122cost model into account.
     123
     124In the methodology proposed in CerCo we assume that we are able to compute,
     125on the object code, exact and realistic costs for sequential blocks of
     126instructions. With modern processors this is possible~\cite{??,??,??}, but
     127difficult: computing exact costs, or even reasonably approximating them, is
     128hard because the execution of the program itself influences the speed of
     129processing. This is mainly due to caching and memory effects in the processor,
     130exploited, for instance, to perform branch prediction. For this reason, at the
     131current stage of CerCo we have decided to focus on 8-bit microprocessors, which
     132are still widely used in embedded systems and whose cost model is easily predictable.
     133
     134In particular, we have fully formalized an executable formal semantics of
     135the family of 8-bit Freescale microprocessors~\cite{oliboni}, and a similar
     136one for the MCS-51 microprocessors. The latter is the one described in this
     137paper. The main focus of the formalization has been on capturing the
     138intensional behaviour of the processor. The main problems we have faced,
     139however, stem from the extreme non-orthogonality of the memory model
     140and instruction set of the MCS-51 microprocessors. To cope with this
     141non-orthogonality while retaining executability, we have exploited the
     142dependent type system of the interactive theorem prover Matita.
     143
     144%Compiler verification, as of late, is a `hot topic' in computer science research.
     145%This rapidly growing field is motivated by one simple question: `to what extent can you trust your compiler?'
     146%Existing verification efforts have broadly focussed on \emph{semantic correctness}, that is, creating a compiler that is guaranteed to preserve the semantics of a program during the compilation process.
     147%However, there is another important facet of correctness that has not garnered much attention, that is, correctness with respect to some intensional properties of the program to be compiled.
    72148
    73149\subsection{The 8051/8052}