Christoph Matheja

Separation Logic with inductive predicate definitions (SL) and hyperedge replacement grammars (HRGs) are established formalisms for describing the abstract shape of data structures maintained by heap-manipulating programs. Fragments of both formalisms are known to coincide, and neither the entailment problem for SL nor its counterpart for HRGs, the inclusion problem, is decidable in general. We introduce tree-like grammars (TLGs), a fragment of HRGs with a decidable inclusion problem. By the correspondence between HRGs and SL, we simultaneously obtain an equivalent SL fragment (SLTL) featuring several notable properties, including a decidable entailment problem.

Ronald de Haan

The model checking problem for temporal logics is an important problem with applications in key areas of computer science. Bounded model checking is indispensable to the state of the art in solving this problem in large-scale settings. We investigate the theoretical possibilities of this technique using parameterized complexity. In particular, we provide a complete parameterized complexity classification of the model checking problem for symbolically represented Kripke structures for various fragments of the temporal logics LTL, CTL, and CTL*. We argue that a known result from the literature for a restricted fragment of LTL can be seen as an fpt-time encoding into SAT, and we show that such encodings are not possible for any of the other fragments of the temporal logics that we consider.

Sayan Mitra

In this talk we will discuss an approach for automating correctness proofs of distributed systems using small model properties and automatic theorem provers. Small model theorems provide bounds on the size of the models that need to be checked to ascertain the validity of logical formulas. The common theme of this talk is the observation that inductive invariance and progress proof rules, which are arguably the most commonly used techniques for checking correctness of distributed systems, can be expressed in a form that has small model properties. This makes it possible to check only finite instances of the system and to infer correctness of arbitrarily large instances. The approach extends to system models with skewed clocks, communication patterns that are regular graphs, as well as models with partial synchrony.
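As a toy illustration of the "check only finite instances" principle (an invented example, not one of the talk's actual system models or proof rules), the sketch below brute-forces the inductiveness of a mutual-exclusion invariant for token rings of every size up to a hypothetical small-model cutoff; a small model theorem would then justify concluding that the invariant is inductive for rings of arbitrary size:

```python
from itertools import product

def successors(state):
    """Token-ring transitions: any token holder may pass the token to its right neighbour."""
    n = len(state)
    for i in range(n):
        if state[i]:
            nxt = list(state)
            nxt[i] = False
            nxt[(i + 1) % n] = True
            yield tuple(nxt)

def invariant(state):
    """Mutual exclusion: exactly one process holds the token."""
    return sum(state) == 1

def inductive_up_to(cutoff):
    """Check that every invariant state of every instance of size <= cutoff
    has only invariant successors."""
    for n in range(1, cutoff + 1):
        for state in product([False, True], repeat=n):
            if invariant(state) and not all(invariant(s) for s in successors(state)):
                return False
    return True

print(inductive_up_to(4))  # prints True: the invariant is preserved in every small instance
```

The hypothetical cutoff 4 stands in for the bound a small model theorem would supply for the actual proof rule.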

Antonia Lechner

We consider the complexity of deciding the truth of first-order existential sentences of Presburger arithmetic with divisibility. We show that if such a sentence is satisfiable then the smallest satisfying assignment has size at most exponential in the size of the formula, which implies that the decision problem is in NEXPTIME. Establishing this upper bound requires subtle adaptations to an existing decidability proof of Lipshitz.
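As a hypothetical illustration (this particular formula is my own, not one from the talk), an existential sentence of Presburger arithmetic with divisibility combines linear constraints over the integers with divisibility atoms between terms:

```latex
\exists x \, \exists y \; \bigl( 2x + 3y = 12 \;\wedge\; (x + 1) \mid (y + 5) \;\wedge\; x \geq 0 \bigr)
```

The size bound above says that if such a sentence is satisfiable, then some assignment of at most exponential size witnesses it, so a nondeterministic machine can guess a witness and verify it in exponential time, yielding the NEXPTIME upper bound.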

A more general family of subsets of Presburger arithmetic with divisibility has been known to be decidable by reduction to the purely existential case. We use one of these subsets to show decidability of the LTL synthesis problem for one-counter automata with integer-valued parameters, where counter values range over the nonnegative integers and counter updates are encoded in binary. This problem asks whether, for a given parametric one-counter automaton and LTL formula, there exist values for the parameters such that all computations from the initial configuration satisfy the formula. The proof of decidability is by reduction to an intermediate synthesis problem which can be encoded as a formula in a decidable subset of Presburger arithmetic with divisibility. We obtain new upper bounds for all of these problems using the NEXPTIME upper bound for the purely existential subset of Presburger arithmetic with divisibility.

Christian Herrera

Quasi-equal clock reduction for networks of timed automata replaces clocks in equivalence classes by representative clocks. An existing approach which reduces quasi-equal clocks and does not constrain the support of properties of networks yields significant reductions of the overall verification time of properties. However, this approach requires strong assumptions on networks in order to soundly apply the reduction of clocks. In this work we propose a transformation which requires no assumptions on networks and does not constrain the support of properties of networks. We demonstrate that the cost of verifying properties is much lower in transformed networks than in their original counterparts with quasi-equal clocks.

Bertrand Meyer

There are many theories of programming; this one seeks to describe programs, programming languages and programming in a simple mathematical framework, based on a small set of high-school-level concepts of elementary set theory (sets and binary relations). It uses only three starting operations (union, restriction and composition) and covers the core concepts of programming as well as concurrency, non-determinism, control structures, and correctness. From the theory's definitions it is possible to deduce, as theorems, the axioms of classic theories such as those of Hoare and Kahn.

Bertrand Meyer is an academic, author, project manager and consultant in software engineering.
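As a hedged miniature rendering of this relational view (my own sketch, not Meyer's actual formal definitions), programs over a finite state space can be modeled as sets of input/output pairs, with the three starting operations giving sequencing, choice, and guards:

```python
def compose(r, s):
    """Sequential composition r;s of two relations (sets of state pairs)."""
    return {(a, c) for (a, b) in r for (b2, c) in s if b == b2}

def restrict(cond, r):
    """Domain restriction: keep the pairs whose input state satisfies cond."""
    return {(a, b) for (a, b) in r if cond(a)}

def graph(f, states):
    """The relation (graph) of a total function on a finite state space."""
    return {(a, f(a)) for a in states}

states = range(-5, 6)
incr = graph(lambda x: x + 1, states)   # the program  x := x + 1
neg  = graph(lambda x: -x, states)      # the program  x := -x

# A conditional is a union of two restrictions:
#   if x >= 0 then x := x + 1 else x := -x
prog = restrict(lambda x: x >= 0, incr) | restrict(lambda x: x < 0, neg)
assert (3, 4) in prog and (-3, 3) in prog

# Sequencing is composition: (x := x + 1); (x := -x)
assert (2, -3) in compose(incr, neg)
```

In this picture non-determinism is simply the union of overlapping relations, and correctness claims become set-inclusion statements about the resulting relations.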

Marko Horvat

We apply state-of-the-art model checkers and theorem provers for information and
network security to break real-world security protocols. When we find potential
vulnerabilities, we gradually harden the protocols against powerful attackers.
We present our work that influenced an update of the ISO/IEC 11770 standard for
key management techniques, and joint work with Mozilla that confirms the existence
of a conjectured flaw in the not-yet-finalised TLS 1.3 protocol.

Rupak Majumdar

Population protocols are a model of parameterized systems in which a set of identical, anonymous, finite-state processes interact pairwise through rendezvous synchronization. In each step, the pair of interacting processes is chosen by a random scheduler. Angluin et al. (PODC 2004) studied population protocols as a distributed computation model. They characterized the computational power in the limit (the semi-linear predicates) of a subclass of protocols, the well-specified ones. However, the modeling power of protocols goes beyond the computation of semi-linear predicates: they can be used to study a wide range of distributed protocols, such as asynchronous leader election or consensus, stochastic evolutionary processes, or chemical reaction networks. Correspondingly, one is interested in checking specifications on these protocols that go beyond the well-specified computation of predicates.
We characterize the decidability frontier for the model checking problem for population protocols against probabilistic linear-time specifications. We show that the model checking problem is decidable for qualitative objectives, but as hard as the reachability problem for Petri nets; for quantitative properties it is undecidable. Our decidability proof uses techniques from Petri net theory and the characterization of limit behaviors of population protocols. In particular, our results show decidability of the following basic verification problems: Is a given protocol well-specified (i.e., does it compute a predicate in the sense of Angluin et al.)? Does a protocol compute a given predicate?
[Joint work with Javier Esparza, Pierre Ganty, and Jerome Leroux]
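To make the model concrete, here is a toy simulation (my own illustration, not an example from the talk) of the classic "flock of birds" protocol, which computes the semi-linear predicate "at least C agents started in state 1": when the random scheduler picks two agents, they pool their tokens, and an agent that accumulates C tokens floods a permanent "yes" verdict through the population:

```python
import random

C = 3  # threshold: the protocol decides the predicate "#agents starting in 1 >= C"

def step(states):
    """One scheduler step: a uniformly random pair of agents interacts."""
    i, j = random.sample(range(len(states)), 2)
    a, b = states[i], states[j]
    if a == C or b == C:
        states[i] = states[j] = C                # flood the "yes" verdict
    else:
        states[i], states[j] = min(a + b, C), 0  # pool tokens, capped at C

def run(n_agents, n_ones, steps=20000):
    """Simulate and report whether all agents converge to output 'yes'."""
    random.seed(0)  # fixed seed for reproducibility
    states = [1] * n_ones + [0] * (n_agents - n_ones)
    for _ in range(steps):
        step(states)
    return all(s == C for s in states)
```

With 5 of 20 agents initially in state 1, the tokens pool until some agent reaches the threshold, so the run converges to a "yes" consensus with overwhelming probability; with only 2 tokens no agent can ever reach C, so the output stays "no". The qualitative model checking problem from the talk asks such convergence questions for all initial populations at once.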

Alessandro Abate

Abstract:

This talk looks at the development of abstraction techniques based on quantitative approximations, in order to formally verify complex systems and to provide computable approaches for the correct-by-design synthesis of control architectures. The approach employs techniques and concepts from the formal verification area, such as that of (approximate probabilistic) bisimulation, over models and problems from the field of systems and control. While emphasising the generality of the approach over a diverse set of model classes, this talk zooms in on stochastic hybrid systems, which are probabilistic models with heterogeneous dynamics (continuous/discrete, i.e. hybrid, as well as nonlinear). A case study in energy networks, dealing with the problem of demand response, is employed to clarify concepts and techniques. Theory is complemented by algorithms, all packaged in software tools (such as FAUST^2 for stochastic models) that are freely available to users.

Bio: 
Alessandro Abate received the Laurea degree in electrical engineering in October 2002 from the University of Padova, Padova, Italy, the M.S. degree in May 2004, and the Ph.D. degree in December 2007, both in electrical engineering and computer sciences, from the University of California, Berkeley, CA, USA.
He is an Associate Professor in the Department of Computer Science at the University of Oxford, Oxford, U.K. He has been an International Fellow in the CS Lab at SRI International in Menlo Park, CA, USA, and a Postdoctoral Researcher in the Department of Aeronautics and Astronautics at Stanford University, Stanford, CA, USA. From June 2009 to mid-2013 he was an Assistant Professor at the Delft Center for Systems and Control, TU Delft – Delft University of Technology, The Netherlands.
His research interests are in the analysis, verification, and control of probabilistic and hybrid systems, and in their general application over a number of domains, particularly in energy and in systems biology. 

Christian Schilling

Continuous-time Markov chain (CTMC) models have become a central tool for understanding the dynamics of complex reaction networks and the importance of stochasticity in the underlying biochemical processes. When such models are employed to answer questions in applications, it is of vital importance that the model parameters are inferred from real measured data, in order to ensure that the model provides a sufficiently accurate representation of the real system. This, however, is often a formidable task, and all of the existing methods fail in one case or the other, usually because the underlying CTMC model is high-dimensional and computationally difficult to analyze. The parameter inference methods that tend to scale best in the dimension of the CTMC are based on so-called moment closure approximations. However, there is a large number of different moment closure approximations, and it is typically hard to say a priori which of them is the most suitable for the inference procedure. Here, we propose a moment-based parameter inference method that automatically chooses the most appropriate moment closure method; contrary to existing methods, the user is not required to be experienced in moment closure techniques. In addition, our method adaptively changes the approximation during the parameter inference to ensure that the best approximation is always used, even in cases where different approximations are best in different regions of the parameter space.
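As a minimal sketch of what a moment closure approximation looks like (an invented toy model with a normal closure, not the method proposed in the talk), consider a species X with a birth reaction at rate k and a pairwise annihilation reaction 2X -> 0 with mass-action propensity (g/2)*x*(x-1). The ODE for the second moment involves the third moment, which the closure replaces by its Gaussian expression, making the system solvable:

```python
def moment_odes(m1, m2, k, g):
    """Closed moment equations for birth at rate k and pairwise annihilation
    with propensity (g/2)*x*(x-1).  The third moment m3 is eliminated with
    the normal (Gaussian) closure m3 = 3*m1*m2 - 2*m1**3."""
    m3 = 3.0 * m1 * m2 - 2.0 * m1 ** 3                           # closure step
    dm1 = k - g * (m2 - m1)                                      # d<x>/dt
    dm2 = k * (2.0 * m1 + 1.0) - 2.0 * g * (m3 - 2.0 * m2 + m1)  # d<x^2>/dt
    return dm1, dm2

def closed_moments(k=10.0, g=0.1, t_end=100.0, dt=5e-4):
    """Euler-integrate the closed moment system to (near) steady state;
    returns the approximate mean and variance of X."""
    m1, m2 = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        dm1, dm2 = moment_odes(m1, m2, k, g)
        m1 += dt * dm1
        m2 += dt * dm2
    return m1, m2 - m1 ** 2
```

Moment-based inference then adjusts parameters such as k and g so that moments computed this way match the empirical moments of the measured data; the talk's contribution is choosing and adapting the closure step automatically instead of fixing one such approximation by hand.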