We consider distributed timed systems that implement leader election
protocols which are at the heart of clock synchronization protocols. We
develop abstraction techniques for parameterized model checking of such
protocols under arbitrary network topologies, where nodes have independently
evolving clocks. We apply our technique to model checking the root election
part of the flooding time synchronization protocol (FTSP), and obtain
improved results compared to previous work. We model check the protocol for
all topologies in which the distance to the node to be elected leader is
bounded by a given parameter.
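The flooding-based root election idea can be sketched in a few lines. The following is a hypothetical synchronous abstraction (in the real protocol, clocks evolve independently, which is exactly what makes verification hard): each node repeatedly adopts the smallest node ID it has heard from a neighbour, so the node with the minimum ID is eventually elected everywhere.

```python
# Minimal sketch (hypothetical abstraction, not the verified FTSP model):
# flooding-based root election in which every node adopts the smallest
# node ID it has heard. Rounds are synchronous here for simplicity.

def elect_root(adjacency):
    """adjacency: dict mapping node_id -> list of neighbour node_ids."""
    # Initially each node believes it is the root.
    believed_root = {n: n for n in adjacency}
    changed, rounds = True, 0
    while changed:
        changed = False
        rounds += 1
        snapshot = dict(believed_root)  # everyone broadcasts last round's belief
        for node, neighbours in adjacency.items():
            best = min([snapshot[m] for m in neighbours] + [believed_root[node]])
            if best < believed_root[node]:
                believed_root[node] = best
                changed = True
    return believed_root, rounds

# A line topology 3 - 1 - 4 - 2: node 1 wins everywhere, and the number of
# rounds is governed by the distance to the eventual root (here, node 2).
line = {3: [1], 1: [3, 4], 4: [1, 2], 2: [4]}
roots, rounds = elect_root(line)
```

The bounded-distance parameter in the abstract corresponds to how far information about the eventual root has to flood, which directly bounds the number of rounds in this toy model.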
Tool prototyping is an essential step in developing novel software verification algorithms and techniques. However, implementing a verifier prototype that can handle real-world programs is a huge endeavor, which hinders researchers by forcing them to spend more time engineering tools and less time innovating. In this talk, I will present the SMACK software verification toolchain. The toolchain provides a modular and extensible software verification ecosystem that decouples the front-end source language details from back-end verification algorithms. It achieves this by translating from the LLVM compiler intermediate representation into the Boogie intermediate verification language. SMACK benefits the software verification community in several ways: (i) it can be used as an off-the-shelf software verifier in an applied software verification project, (ii) it enables researchers to rapidly develop and release new verification algorithms, and (iii) it allows for adding support for new languages in its front-end. We have used SMACK to verify numerous C/C++ programs, including industry examples, showing it is mature and competitive. Likewise, SMACK is already being used in several existing verification research prototypes.
The talk will have three parts, in which I will discuss results obtained with my colleagues on nondeterministic, deterministic, and semi-deterministic omega-automata. For nondeterministic automata, I will show which aspects of property automata can influence the performance of explicit model checking and what improvements can be made in LTL-to-automata translation if we have some knowledge about the verified system. Then I will present our efficient translation of a fragment of LTL to deterministic automata. Finally, I will explore the jungle of Büchi automata classes that lie between deterministic and nondeterministic automata. I will present how to efficiently complement semi-deterministic automata and how to obtain them from nondeterministic generalized Büchi automata.
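As a small illustration of the semi-deterministic class mentioned above, here is a sketch (with a hypothetical automaton encoding, not the talk's algorithms) that checks the defining property: every state reachable from an accepting state behaves deterministically.

```python
from collections import deque

# Hypothetical encoding: transitions[state][letter] -> set of successor
# states. An automaton is semi-deterministic if the part reachable from
# accepting states is deterministic (at most one successor per letter).

def is_semi_deterministic(transitions, accepting):
    # Collect every state reachable from an accepting state.
    reachable, queue = set(accepting), deque(accepting)
    while queue:
        s = queue.popleft()
        for succs in transitions.get(s, {}).values():
            for t in succs:
                if t not in reachable:
                    reachable.add(t)
                    queue.append(t)
    # Check determinism within that part of the automaton.
    return all(len(succs) <= 1
               for s in reachable
               for succs in transitions.get(s, {}).values())

# Nondeterministic on 'a' in the initial part (state 0), but deterministic
# in everything reachable from state 1.
aut = {0: {"a": {0, 1}}, 1: {"a": {1}, "b": {2}}, 2: {"a": {2}}}
```

With accepting state 1 this automaton is semi-deterministic; declaring state 0 accepting instead would break the property, since state 0 has two successors on 'a'.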
It is known that one can extract Craig interpolants from so-called local derivations. An interpolant extracted from such a derivation is a Boolean combination of formulas occurring in the derivation. However, standard complete proof systems for theories having the interpolation property, such as superposition, are not necessarily complete for local proofs.
In this paper we investigate interpolant extraction from non-local proofs in the superposition calculus and prove a number of general results about interpolant extraction and the complexity of extracted interpolants. In particular, we prove that the number of quantifier alternations in first-order interpolants of formulas without quantifier alternations is unbounded. This result has far-reaching consequences for using local proofs as a foundation for interpolating proof systems: any such proof system should deal with formulas of arbitrary quantifier complexity.
In search of alternatives for interpolating proof systems, we consider several variations on interpolation and local proofs. Namely, we give an algorithm for building interpolants from resolution refutations in logic without equality and discuss additional constraints under which this approach can also be used for logic with equality. We finally propose a new direction related to interpolation via local proofs in first-order theories.
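To make the defining property of a Craig interpolant concrete, here is a toy propositional check (the talk's actual setting, first-order superposition proofs, is far richer); the formulas and variable names are invented for illustration. Given an unsatisfiable conjunction A & B, an interpolant I uses only their shared vocabulary, follows from A, and is inconsistent with B.

```python
from itertools import product

# A: p and q        (vocabulary {p, q})
# B: (not q) and r  (vocabulary {q, r}); A and B together are unsatisfiable
# I: q              (uses only the shared symbol q)
A = lambda v: v["p"] and v["q"]
B = lambda v: (not v["q"]) and v["r"]
I = lambda v: v["q"]

def valid(prop, names=("p", "q", "r")):
    """True if prop holds under every truth assignment to the variables."""
    return all(prop(dict(zip(names, bits)))
               for bits in product([False, True], repeat=len(names)))

a_implies_i = valid(lambda v: (not A(v)) or I(v))   # A -> I is valid
i_refutes_b = valid(lambda v: not (I(v) and B(v)))  # I & B is unsatisfiable
```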
Every day, we make sequences of decisions to accomplish our goals, and we attempt to achieve these goals in the most favorable way. In many aspects of life, such as security or economics, the optimality of these decisions is critical, and computational support for decision making is thus needed. Sequential decision making is, however, computationally challenging in the presence of uncertainty (partially observable Markov decision processes) or even adversaries (partially observable stochastic games). We provide a game-theoretic model of one-sided partially observable stochastic games which is motivated by problems arising mainly in security. It takes into account both the uncertainty of the decision maker and the adversarial nature of the problem. Our framework assumes two players, one of whom is given the advantage of having perfect information. We show that we can solve such games in a fashion similar to how one solves POMDPs using the value iteration algorithm, including its more practical approximate variants based on point-based updates of the value function.
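For intuition, the following sketch shows value iteration in its simplest, fully observable special case (a tiny MDP); extending this style of iteration to one-sided partially observable stochastic games, with point-based updates over value functions on beliefs, is the subject of the talk. The example model is made up.

```python
# Value iteration on a two-state MDP (the fully observable special case
# of the models discussed in the talk; the concrete model is invented).

def value_iteration(states, actions, P, R, gamma=0.9, eps=1e-6):
    """P[s][a]: list of (probability, next_state); R[s][a]: immediate reward."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            # Bellman optimality backup for state s.
            best = max(R[s][a] + gamma * sum(p * V[t] for p, t in P[s][a])
                       for a in actions)
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:
            return V

# In "s0", "stay" earns 1 safely; "go" earns 2 but risks the absorbing sink.
states, actions = ["s0", "sink"], ["stay", "go"]
P = {"s0":   {"stay": [(1.0, "s0")], "go": [(0.5, "s0"), (0.5, "sink")]},
     "sink": {"stay": [(1.0, "sink")], "go": [(1.0, "sink")]}}
R = {"s0": {"stay": 1.0, "go": 2.0}, "sink": {"stay": 0.0, "go": 0.0}}
V = value_iteration(states, actions, P, R)  # V["s0"] converges to 1/(1-0.9) = 10
```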
Languages K and L are separable by a language S if K is included in S and L is disjoint from S.
I will present recent results on the decidability of the problem of whether two languages of
one-counter automata are separable by some regular language.
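A concrete instance of the definition, with an invented example: K = { aⁿbⁿ } is a one-counter (even context-free) language, L is the set of words containing "ba", and the regular language S = a*b* separates them. The sketch below only spot-checks the two conditions on a handful of words.

```python
import re

def in_K(w):
    """K = { a^n b^n }: a non-regular one-counter language."""
    n = len(w) // 2
    return w == "a" * n + "b" * n

def in_L(w):
    """L: words in which a 'b' is immediately followed by an 'a'."""
    return "ba" in w

S = re.compile(r"a*b*")  # candidate regular separator

samples = ["", "ab", "aabb", "aaabbb", "ba", "abab", "bba", "aab"]
k_included = all(S.fullmatch(w) for w in samples if in_K(w))      # K included in S
l_disjoint = not any(S.fullmatch(w) for w in samples if in_L(w))  # L disjoint from S
```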
Virtually all real-valued computations are carried out using floating-point data types and operations. Since system development often emphasizes computational efficiency, developers as well as compilers increasingly attempt to optimize floating-point routines. Reasoning about the correctness of these optimizations is complicated and requires error analysis procedures with different characteristics and trade-offs. In my talk, I will motivate the need for such analyses. Then, I will present both a dynamic and a rigorous static analysis we developed for estimating errors of floating-point routines. Finally, I will describe how we extended our rigorous static analysis into a procedure for mixed-precision tuning of floating-point routines.
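One common dynamic approach in this spirit (the concrete example below is invented, not the talk's tool) runs the computation once in double precision and once in a higher-precision "shadow" execution, and reports the difference as an estimate of the accumulated rounding error:

```python
from decimal import Decimal, getcontext

getcontext().prec = 50  # 50 significant digits for the shadow run

def harmonic_double(n):
    """Sum 1/1 + 1/2 + ... + 1/n in ordinary double precision."""
    s = 0.0
    for k in range(1, n + 1):
        s += 1.0 / k
    return s

def harmonic_shadow(n):
    """The same computation in 50-digit decimal arithmetic."""
    s = Decimal(0)
    for k in range(1, n + 1):
        s += Decimal(1) / Decimal(k)
    return s

n = 10_000
# The difference estimates the rounding error of the double-precision run.
err = abs(Decimal(harmonic_double(n)) - harmonic_shadow(n))
```

Unlike the rigorous static analysis the talk describes, such a dynamic estimate gives no guarantee: it only reports the error observed on this particular run.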
Zvonimir Rakamaric is an assistant professor in the School of Computing at the University of Utah. Prior to this, he was a postdoctoral fellow at Carnegie Mellon University in Silicon Valley, where he worked closely with researchers from the Robust Software Engineering Group at NASA Ames Research Center to improve the coverage of testing of NASA’s flight critical systems. Zvonimir received his bachelor’s degree in Computer Science from the University of Zagreb, Croatia; he obtained his M.Sc. and Ph.D. from the Department of Computer Science at the University of British Columbia, Canada.
Zvonimir's research mission is to improve the reliability and resilience of complex software systems by empowering developers with practical tools and techniques for analysis of their artifacts. He is a recipient of the NSF CAREER Award 2016, Microsoft Research Software Engineering Innovation Foundation (SEIF) Award 2012, Microsoft Research Graduate Fellowship 2008-2010, Silver Medal in the ACM Student Research Competition at the 32nd International Conference on Software Engineering (ICSE) 2010, and the Outstanding Student Paper Award at the 13th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS) 2007.
Analyzing and reasoning about safety properties of software systems
becomes an especially challenging task for programs with complex flow
and, in particular, with loops or recursion. For such programs one needs
additional information, for example in the form of loop invariants,
expressing properties that hold at intermediate program points. We study
program loops with non-trivial arithmetic, implementing addition and
multiplication over numeric program variables. In this talk, we present
a new approach for automatically generating all polynomial invariants of
a class of such programs, based on techniques from computer algebra,
which will be explained thoroughly and intuitively.
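For a flavour of what such invariants look like (a standard textbook-style example, not the talk's algorithm): for the loop x, y := x + y, y + 2 starting from x = 0, y = 1, the polynomial identity 4x = (y - 1)² holds at every iteration. The sketch below merely confirms the identity along one execution; the algebraic techniques in the talk derive such identities symbolically.

```python
# Confirming a polynomial loop invariant along an execution
# (illustrative check only; the talk's techniques compute invariants
# symbolically via computer algebra).

def run_loop(steps):
    """Loop body: x, y := x + y, y + 2, starting from x = 0, y = 1."""
    x, y = 0, 1
    for _ in range(steps):
        assert 4 * x == (y - 1) ** 2  # the polynomial invariant
        x, y = x + y, y + 2
    assert 4 * x == (y - 1) ** 2      # still holds on exit
    return x, y

x, y = run_loop(100)  # x = 1 + 3 + ... + 199 = 100**2, y = 201
```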
Probabilistic programming is in vogue. It is used to describe
complex Bayesian networks, quantum programs, security protocols and
biological systems. Programming languages like C, C#, Java, Prolog,
Scala, etc. all have their probabilistic versions. Key features are
random sampling and means to adjust distributions based on information
obtained from measurements and system observations. We show some
semantic intricacies, argue that termination is more involved than the
halting problem, and discuss recursion as well as run-time analysis.
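A minimal example of these two key features (plain Python, invented for illustration): sample two fair coins, condition on the observation that at least one came up heads, and estimate the posterior probability that both did. Rejection sampling implements the conditioning ("observe") construct.

```python
import random

# A tiny probabilistic program: two coin flips conditioned on an
# observation. Rejection sampling discards runs that violate the
# observation, which adjusts the distribution of the remaining runs.

def posterior_both_heads(samples=200_000, rng=random.Random(0)):
    accepted = both = 0
    for _ in range(samples):
        c1, c2 = rng.random() < 0.5, rng.random() < 0.5
        if not (c1 or c2):
            continue          # observe(c1 or c2): reject this run
        accepted += 1
        both += c1 and c2
    return both / accepted    # exact answer: 1/3

p = posterior_both_heads()
```

Note that rejection sampling may loop forever if the observation has probability zero, a small hint at why termination of probabilistic programs is subtler than ordinary halting.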