# Research

## Theses

This thesis is devoted to the resource analysis of imperative programs. Resource analysis falls within the wide spectrum of static program analysis, which is concerned with automatic methods for inferring reliable approximate information about the dynamic behaviour of computer programs. In resource analysis, we are concerned with approximations of quantitative properties of program executions, such as the maximal number of execution steps or the amount of memory needed. This topic is an active area of research, and several resource analysis tools have been developed in recent years. In this work, we provide the following contributions.

First, we are concerned with the resource analysis of imperative programs in which states are formed over a finite set of integer-valued variables. We consider a standard abstract model of computation, so-called constraint transition systems. By syntactically restricting the precise notion of the program representation, one can study and implement dedicated techniques for resource analysis. In this work, we give an overview of theoretical properties and practical aspects of relevant abstract program representations that are known from the literature and that are used in modern resource analysis tools.

Second, we are concerned with the resource analysis of imperative programs with heap-allocated data structures. Inspired by recent developments in automating runtime analysis for term rewriting, we present a term abstraction of programs with heap-allocated data. This abstraction safely approximates the runtime of the target program, so that we can make use of existing tools for the resource analysis. In this work, we outline the term abstraction and compare it to known approaches.

Third, we are concerned with the resource analysis of probabilistic programs. We consider a standard imperative programming language that is endowed with probabilistic primitives for sampling and probabilistic choice. In this work, we present a novel modular approach to automating the resource analysis of probabilistic programs.

Finally, we present a framework for automation. We provide a Haskell library dedicated to automating resource analysis. In this work, we outline the software architecture of this library and demonstrate several case studies in which the framework has been applied successfully.

In this thesis we revisit known transformations from object-oriented bytecode programs to rewrite systems from the viewpoint of runtime complexity. We analyse Jinja bytecode (JBC for short), which exhibits core features of well-known object-oriented programming languages but provides a formal semantics. Using standard techniques from static program analysis, we define two alternative representations of JBC executions. First, we provide a graph-based abstraction of bytecode states and establish a Galois insertion between the abstract domain and sets of bytecode states. We define an abstract semantics from which we obtain a finite representation of JBC executions as computation graphs. We show that this representation is correct and that the computation graph of a program is always computable and finite. Second, we provide a term-based abstraction of bytecode states that we obtain from the nodes of the computation graph. From the edge relation of the computation graph we obtain a finite representation of JBC executions as constrained term rewrite systems. We show that the transformation is complexity preserving. That is, an upper bound on the runtime complexity of the resulting constrained term rewrite system implies an upper bound on the runtime complexity of the bytecode program. Moreover, we show that the transformation is non-termination preserving. We restrict to non-recursive methods and make use of existing heap shape analysis to approximate acyclicity and reachability. The transformation has been implemented in the prototype Jat.

## Proceedings & Journals

We present a novel methodology for the automated resource analysis of non-deterministic, probabilistic imperative programs, based on three ingredients: (1) a refinement of the ert-calculus of Kaminski et al. to reason about expected costs; (2) an integration of an expected value analysis within this calculus; and (3) a constraint solver over iteratively refinable cost functions, facilitated by off-the-shelf SMT solvers.

This gives a unique modular approach to the problem at hand. Program fragments can be analysed in full independence. Further, it allows us to incorporate sampling from dynamic distributions, making our analysis more realistic.

All of this has been implemented in the tool eco-imp. Our experiments show that our tool runs on average orders of magnitude faster than comparable tools: execution times of several seconds become milliseconds. It is also the first tool that can provide a fully automated analysis of the Coupon Collector’s problem, whose formalisation requires sampling from dynamic distributions.
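To make the flavour of ert-style reasoning concrete, the following Haskell sketch computes the expected runtime of a single coin-flipping loop both in closed form and by iterative refinement from below. This is our own toy illustration under simplified assumptions; the function names are invented and bear no relation to eco-imp's actual interface.

```haskell
-- Toy expected-runtime reasoning in the spirit of the ert-calculus.
-- All names are illustrative; this is not eco-imp's actual API.

-- For the loop "while (flip p) { tick }", the expected runtime
-- satisfies the recurrence  ert = 1 + p * ert,  hence ert = 1 / (1 - p).
ertClosed :: Double -> Double
ertClosed p = 1 / (1 - p)

-- The same bound approached by iterative refinement from below,
-- mirroring the fixed-point view:  ert_{n+1} = 1 + p * ert_n.
ertIter :: Double -> Int -> Double
ertIter p n = iterate (\e -> 1 + p * e) 0 !! n

main :: IO ()
main = do
  print (ertClosed 0.5)   -- 2.0 expected iterations for a fair coin
  print (ertIter 0.5 60)  -- converges towards the same bound
```

The iterative view is what makes a modular, solver-driven analysis possible: each refinement step only needs the bound computed for the loop body, not a global fixed point over the whole program.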

In this paper we show how the runtime complexity of imperative programs can be analysed fully automatically by a transformation to term rewrite systems, the complexity of which can then be automatically verified by existing complexity tools. We restrict to well-formed Jinja bytecode programs that only make use of non-recursive methods. The analysis can handle programs with cyclic data only if the termination behaviour is independent thereof.

We exploit a term-based abstraction of programs within the abstract interpretation framework. The proposed transformation encompasses two stages. In the first stage we perform a combined control and data flow analysis by evaluating program states symbolically, which is shown to yield a finite representation of all execution paths of the given program through a graph, dubbed the computation graph. In the second stage we encode the (finite) computation graph as a term rewrite system. This is done while carefully analysing complexity preservation and reflection of the employed transformations, such that the complexity of the obtained term rewrite system reflects the complexity of the initial program. Finally, we show how the approach can be automated and provide ample experimental evidence of the advantages of the proposed analysis.
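The idea of encoding loop behaviour as rewrite rules can be illustrated on a toy scale. The following Haskell sketch is our own illustration, not Jat's output format: it encodes a simple counting loop as a single rewrite rule and measures the length of the rewrite sequence, which matches the loop's linear runtime.

```haskell
-- A minimal first-order rewriting sketch (our own toy, not Jat's output).
data Term = Var String | Fun String [Term] deriving (Eq, Show)

-- The loop "while (i > 0) i--" is captured, roughly, by the rule
--   eval(s(i)) -> eval(i)
-- so the derivation height from eval(s^n(0)) is exactly n: linear runtime.
step :: Term -> Maybe Term
step (Fun "eval" [Fun "s" [i]]) = Just (Fun "eval" [i])
step _                          = Nothing

-- Length of the rewrite sequence to normal form.
steps :: Term -> Int
steps t = maybe 0 ((1 +) . steps) (step t)

-- Peano encoding of a natural number: s(s(...(0)...)).
nat :: Int -> Term
nat 0 = Fun "0" []
nat n = Fun "s" [nat (n - 1)]

main :: IO ()
main = print (steps (Fun "eval" [nat 7]))  -- 7
```

Complexity preservation in the paper's sense means exactly this correspondence: a bound on the derivation height of the rewrite system yields a bound on the runtime of the original program.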

In this extended abstract we present the GUBS Upper Bound Solver. GUBS is a dedicated constraint solver over the naturals for inequalities formed over uninterpreted function symbols and standard arithmetic operations. GUBS now forms the backbone of HoSA, a tool for automatically analysing the space and time complexity of higher-order functional programs. We give insights into the implementation and report on different case studies.
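The shape of such constraint problems can be sketched as follows. This is a simplified toy model of our own devising, not GUBS's actual input language or solving procedure: it merely checks one candidate interpretation of an uninterpreted symbol against an inequality at sample points, whereas GUBS synthesises such interpretations.

```haskell
import Data.Maybe (fromMaybe)

-- A toy model of inequalities over uninterpreted symbols and arithmetic
-- (illustrative only; not GUBS's real input format or algorithm).
data Exp = Var String | Sym String [Exp] | Nat Integer
         | Add Exp Exp | Mul Exp Exp

-- A candidate model assigns each uninterpreted symbol a function
-- over the naturals.
type Model = String -> [Integer] -> Integer

eval :: Model -> [(String, Integer)] -> Exp -> Integer
eval _ env (Var x)    = fromMaybe 0 (lookup x env)
eval m env (Sym f es) = m f (map (eval m env) es)
eval _ _   (Nat n)    = n
eval m env (Add a b)  = eval m env a + eval m env b
eval m env (Mul a b)  = eval m env a * eval m env b

-- Candidate interpretation: f(x) = x + 1.
model :: Model
model "f" [x] = x + 1
model _   _   = 0

-- Check the inequality l >= r at sample points 0..100 for variable "x".
holds :: Exp -> Exp -> Bool
holds l r =
  all (\x -> eval model [("x", x)] l >= eval model [("x", x)] r) [0 .. 100]

main :: IO ()
main = print (holds (Sym "f" [Var "x"]) (Add (Var "x") (Nat 1)))  -- True
```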

In this paper we present TcT v3.0, the latest version of our fully automated complexity analyser. TcT implements our framework for automated complexity analysis and focuses on extensibility and automation. TcT is open with respect to the input problem under investigation and the resource metric in question. It is the most powerful tool in the realm of automated complexity analysis of term rewrite systems. Moreover, it provides an expressive problem-independent strategy language that facilitates proof search. We give insights into design choices and the implementation of the framework, and report on different case studies in which TcT has been applied successfully.

## Informal Publications

We are concerned with the average case runtime complexity analysis of a prototypical imperative language pWhile in the spirit of Dijkstra's Guarded Command Language. This language is endowed with primitives for sampling and probabilistic choice so that randomized algorithms can be expressed. Complexity analysis in this setting is particularly appealing as efficiency is one striking reason why randomized algorithms have been introduced and studied: in many cases, the most efficient algorithm for the problem at hand is randomized.
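A pWhile-like abstract syntax with sampling and probabilistic choice might be sketched in Haskell as follows. The constructor names are our own invention and do not reflect the thesis's formal grammar; the example program is the classic geometric coin-flipping loop.

```haskell
-- A sketch of an abstract syntax for a pWhile-like language
-- (constructor names are our own, not the thesis's formal grammar).
data AExp = Lit Int | AVar String | AddE AExp AExp  deriving Show
data BExp = Leq AExp AExp                           deriving Show
data Dist = Bernoulli Rational | Uniform Int Int    deriving Show

data Stmt
  = Skip
  | Assign String AExp
  | Sample String Dist        -- x := sample(d)
  | Choice Rational Stmt Stmt -- {s1} [p] {s2}
  | Seq Stmt Stmt
  | If BExp Stmt Stmt
  | While BExp Stmt
  deriving Show

-- A geometric loop: keep flipping a fair coin while it shows 1.
geo :: Stmt
geo = Seq (Assign "c" (Lit 1))
          (While (Leq (Lit 1) (AVar "c"))
                 (Sample "c" (Bernoulli (1 / 2))))

-- Number of statement nodes, e.g. as a crude proxy for program size.
size :: Stmt -> Int
size (Seq a b)      = 1 + size a + size b
size (If _ a b)     = 1 + size a + size b
size (Choice _ a b) = 1 + size a + size b
size (While _ a)    = 1 + size a
size _              = 1

main :: IO ()
main = print (size geo)  -- 4
```

Randomised programs of this shape are exactly where average-case analysis pays off: the worst-case runtime of `geo` is unbounded, while its expected runtime is a small constant.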

In this extended abstract we present paicc, a transformation from guarded control-flow programs to unstructured loop programs. We make use of recent results on the decidability of polynomial growth-rate of unstructured loop programs to develop a new method for assessing polynomial runtime of guarded control-flow programs.

## Software

## Affiliations

University of Innsbruck

Department of Computer Science

Computational Logic

Computation with Bounded Resources