Implicit computational complexity and compilers

Research output: Book/Report › Ph.D. thesis › Research

  • Thomas Rubiano
Complexity theory helps us predict and control the resources, usually time and space, consumed by programs.
Static analysis based on specific syntactic criteria allows us to categorize some programs. A common approach
is to observe the behavior of a program's data. For instance, the detection of non-size-increasing programs is
based on a simple principle: counting memory allocations and deallocations, particularly in loops. This way,
we can detect programs that compute within a constant amount of space. This method can easily be expressed
as a property of control flow graphs. Because analyses of data behavior are syntactic, they can be performed
at compile time.
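As a rough illustration of the counting idea (the function name and buffer size below are invented for the
example, not taken from the thesis), consider a C loop in which every allocation is matched by a
deallocation on the same control-flow cycle, so the cycle's allocation count balances to zero:

    #include <stdlib.h>
    #include <string.h>

    /* Hypothetical example: each iteration allocates one buffer and frees
     * it before the next iteration. An analysis that counts allocations
     * against deallocations along every cycle of the control flow graph
     * can classify this loop as non-size-increasing: it runs within a
     * constant amount of heap space regardless of n. */
    void process(size_t n) {
        for (size_t i = 0; i < n; i++) {
            char *buf = malloc(64);  /* +1 allocation on this cycle */
            if (buf == NULL)
                return;
            memset(buf, 0, 64);      /* work on the temporary buffer */
            free(buf);               /* -1: the cycle balances to zero */
        }
    }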
Because they are purely static, these analyses are not always computable, or not easily so, and approximations
are needed. The “Size-Change Principle” of C. S. Lee, N. D. Jones, and A. M. Ben-Amram presented a method
to predict termination by observing the evolution of resources, and a great deal of research has grown out of
this theory.
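To give the flavor of the principle, here is a C rendering of Ackermann's function, a standard example in
the size-change literature (this particular code is a sketch, not taken from the thesis):

    /* In every recursive call at least one argument strictly decreases:
     * m in the first two calls, or m stays fixed while n decreases in the
     * inner call. Since unsigned values cannot descend forever, tracking
     * these size changes across calls is enough to conclude that the
     * recursion terminates, without reasoning about what is computed. */
    unsigned ack(unsigned m, unsigned n) {
        if (m == 0) return n + 1;
        if (n == 0) return ack(m - 1, 1);    /* m strictly decreases   */
        return ack(m - 1, ack(m, n - 1));    /* m decreases, or m is
                                                unchanged and n decreases */
    }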
Until now, these implicit complexity theories have essentially been applied to more or less toy languages.
This thesis applies implicit computational complexity methods to “real life” programs by manipulating the
intermediate representation languages of compilers. This gives an accurate idea of the actual expressivity of
these analyses and shows that the implicit computational complexity and compiler communities can fuel each
other fruitfully. As we show in this thesis, the methods developed are quite general and open the way to
several new applications.
Original language: English
Publisher: Department of Computer Science, Faculty of Science, University of Copenhagen
Number of pages: 98
Publication status: Published - 2017
