Computational Logic
Department of Computer Science - Universität Innsbruck
Publications 2016

Workshop on Hammers for Type Theory 2016

Jasmin Blanchette, Cezary Kaliszyk (eds)
Proceedings First Workshop on Hammers for Type Theory (HaTT 2016), EPTCS 210, 2016.

doi:10.4204/EPTCS.210

 

BibTeX

@proceedings{jbck-hatt16,
editor = {Jasmin Blanchette and Cezary Kaliszyk},
title = {Proceedings First Workshop on Hammers for Type Theory,
HaTT 2016},
booktitle = {Proceedings First Workshop on Hammers for Type Theory,
HaTT 2016},
series = {{EPTCS}},
volume = {210},
year = {2016},
url = {http://dx.doi.org/10.4204/EPTCS.210},
doi = {10.4204/EPTCS.210},
}

 

Details
Category: Publications 2016
Published: 28 April 2025

Improving Automation in Interactive Theorem Provers by Efficient Encoding of Lambda-Abstractions

Łukasz Czajka
Proceedings of the 5th ACM SIGPLAN Conference on Certified Programs and Proofs (CPP 2016), pp. 49 – 57, 2016.

doi:10.1145/2854065.2854069

 

Abstract

Hammers are tools for employing external automated theorem provers (ATPs) to improve automation in formal proof assistants. Strong automation can greatly ease the task of developing formal proofs. An essential component of any hammer is the translation of the logic of a proof assistant to the format of an ATP. Formalisms of state-of-the-art ATPs are usually first-order, so some method for eliminating lambda-abstractions is needed. We present an experimental comparison of several combinatory abstraction algorithms for HOL(y)Hammer – a hammer for HOL Light. The algorithms are compared on problems involving non-trivial lambda-abstractions selected from the HOL Light core library and a library for multivariate analysis. We succeeded in developing algorithms which outperform both lambda-lifting and the simple Schönfinkel’s algorithm used in Sledgehammer for Isabelle/HOL. This increases the ATPs’ success rate on translated problems, thus enhancing automation in proof assistants.
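The combinatory abstraction baseline mentioned above can be sketched in a few lines. The following is the textbook S/K/I bracket-abstraction algorithm attributed to Schönfinkel, not the authors' code; the term representation and function names are invented for illustration.

```python
# Sketch of Schönfinkel's S/K/I bracket abstraction for eliminating
# lambda-abstractions; terms are ('var', x), ('app', f, a), ('lam', x, b).

def Var(x): return ('var', x)
def App(f, a): return ('app', f, a)
def Lam(x, b): return ('lam', x, b)

def free_in(x, t):
    tag = t[0]
    if tag == 'var': return t[1] == x
    if tag == 'app': return free_in(x, t[1]) or free_in(x, t[2])
    return t[1] != x and free_in(x, t[2])  # lambda case

def abstract(x, t):
    """Compute [x]t: [x]x = I; [x]M = K M if x not free in M;
    [x](M N) = S ([x]M) ([x]N)."""
    if t == Var(x):
        return Var('I')
    if not free_in(x, t):
        return App(Var('K'), t)
    if t[0] == 'app':
        return App(App(Var('S'), abstract(x, t[1])), abstract(x, t[2]))
    # t is an inner lambda with x free: eliminate it first, then abstract
    return abstract(x, compile_term(t))

def compile_term(t):
    """Eliminate all lambda-abstractions, innermost first."""
    tag = t[0]
    if tag == 'var': return t
    if tag == 'app': return App(compile_term(t[1]), compile_term(t[2]))
    return abstract(t[1], compile_term(t[2]))

def show(t):
    if t[0] == 'var': return t[1]
    return '(%s %s)' % (show(t[1]), show(t[2]))
```

For example, the K-combinator λx.λy.x compiles to ((S (K K)) I), whose size already hints at why the paper compares more economical abstraction algorithms.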

 

BibTeX

@inproceedings{LC-CPP16,
author = "{\L}ukasz Czajka",
title = "Improving Automation in Interactive Theorem Provers by Efficient Encoding
of Lambda-Abstractions",
booktitle = "Proceedings of the 5th {ACM} {SIGPLAN} Conference on Certified Programs
and Proofs (CPP 2016)",
pages = "49--57",
year = 2016,
doi = "10.1145/2854065.2854069"
}


Formalizing Jordan Normal Forms in Isabelle/HOL

René Thiemann and Akihisa Yamada
Proceedings of the 5th ACM SIGPLAN Conference on Certified Programs and Proofs (CPP 2016), pp. 88 – 99, 2016.

doi:10.1145/2854065.2854073

 

Abstract

In automated complexity analysis of term rewriting, estimating the growth rate of the values in the k-th power of a matrix A – for fixed A and increasing k – is of fundamental interest. This growth rate can be exactly characterized via A’s Jordan normal form (JNF). We formalize this result in our library IsaFoR and our certifier CeTA, and thereby improve the support for certifying polynomial bounds derived by (untrusted) complexity analysis tools. To this end, we develop a new library for matrices that allows us to conveniently work with block matrices. Besides the mentioned complexity result, we formalize Gram-Schmidt’s orthogonalization algorithm and the Schur decomposition in order to prove existence of JNFs. We also provide a uniqueness result for JNFs which allows us to compute Jordan blocks for individual eigenvalues. In order to determine eigenvalues automatically, we moreover formalize Yun’s square-free factorization algorithm.
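The growth-rate characterization via the JNF can be observed on a concrete Jordan block: for eigenvalue 1 and block size m, the entries of the k-th power are binomial coefficients, so they grow polynomially with degree at most m - 1. This is an illustrative calculation, not part of the paper's formalization.

```python
# Powers of a Jordan block with eigenvalue 1: entry (i, j) of J^k is
# C(k, j - i), polynomial in k of degree at most (block size - 1).
from math import comb

def jordan_block(lam, m):
    return [[lam if i == j else 1 if j == i + 1 else 0
             for j in range(m)] for i in range(m)]

def mat_mul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(a, k):
    n = len(a)
    r = [[int(i == j) for j in range(n)] for i in range(n)]
    for _ in range(k):
        r = mat_mul(r, a)
    return r

J = jordan_block(1, 3)
for k in (2, 5, 10):
    # top-right entry of J^k equals C(k, 2): quadratic growth in k
    assert mat_pow(J, k)[0][2] == comb(k, 2)
```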

 

BibTeX

@inproceedings{RTAY-CPP2016,
author = "Ren{\'e} Thiemann and Akihisa Yamada",
title = "Formalizing {J}ordan Normal Forms in {I}sabelle/{HOL}",
booktitle = "Proceedings of the 5th {ACM} {SIGPLAN} Conference on Certified Programs
and Proofs ({CPP} 2016)",
pages = "88--99",
year = 2016,
doi = "10.1145/2854065.2854073"
}


Towards a Mizar Environment for Isabelle: Foundations and Language

Cezary Kaliszyk, Karol Pąk, and Josef Urban
Proceedings of the 5th ACM SIGPLAN Conference on Certified Programs and Proofs (CPP 2016), pp. 58 – 65, 2016.

doi:10.1145/2854065.2854070

 

Abstract

In this paper we explore the possibility of emulating the Mizar environment as closely as possible inside the Isabelle logical framework. We introduce adaptations to the Isabelle/FOL object logic that correspond to the logic of Mizar, as well as Isar inner syntax notations that correspond to those of the Mizar language. We show how Isabelle types can be used to differentiate between the syntactic categories of the Mizar language, such as sets and Mizar types including modes and attributes, and show how they interact with the basic constructs of Tarski-Grothendieck set theory. We discuss Mizar definitions and provide simple abbreviations that allow the introduction of Mizar predicates, functions, attributes and modes using the Isabelle/Pure language elements for introducing definitions and theorems. We finally consider the definite and indefinite description operators in Mizar and their use to introduce definitions by "means" and "equals". We demonstrate the usability of the environment on a sample Mizar-style formalization, with cluster inferences and "by" steps performed manually.

 

BibTeX

@inproceedings{CKKPJU-CPP2016,
author = "Cezary Kaliszyk and Karol P{\k{a}}k and Josef Urban",
title = "Towards a {M}izar Environment for {I}sabelle: Foundations and Language",
booktitle = "Proceedings of the 5th {ACM} {SIGPLAN} Conference on Certified Programs
and Proofs (CPP 2016)",
pages = "58--65",
year = 2016,
doi = "10.1145/2854065.2854070"
}


The Complexity of Interaction

Stéphane Gimenez and Georg Moser
Proceedings of the 43rd Annual ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages (POPL 2016), pp. 243 – 255, 2016.

doi:10.1145/2837614.2837646

 

Abstract

In this paper, we analyze the complexity of functional programs written in the interaction-net computation model, an asynchronous, parallel and confluent model that generalizes linear-logic proof nets. Employing user-defined sized and scheduled types, we certify concrete time, space and space-time complexity bounds for both sequential and parallel reductions of interaction-net programs by suitably assigning complexity potentials to typed nodes. The relevance of this approach is illustrated on archetypal programming examples. The provided analysis is precise, compositional and is, in theory, not restricted to particular complexity classes.

 

BibTeX

@inproceedings{SGGM-POPL2016,
author = "St{\'e}phane Gimenez and Georg Moser",
title = "The Complexity of Interaction",
booktitle = "Proceedings of the 43rd Annual {ACM} {SIGPLAN-SIGACT} Symposium on
Principles of Programming Languages ({POPL} 2016)",
pages = "243--255",
year = 2016,
doi = "10.1145/2837614.2837646"
}


Polynomial Interpolation

René Thiemann and Akihisa Yamada
Archive of Formal Proofs 2016.

AFP Entry

 

Abstract

We formalized three algorithms for polynomial interpolation over arbitrary fields: Lagrange’s explicit expression, the recursive algorithm of Neville and Aitken, and the Newton interpolation in combination with an efficient implementation of divided differences. Variants of these algorithms for integer polynomials are also available, where sometimes the interpolation can fail; e.g., there is no linear integer polynomial p such that p(0) = 0 and p(2) = 1. Moreover, for the Newton interpolation for integer polynomials, we proved that all intermediate results that are computed during the algorithm must be integers. This admits an early failure detection in the implementation. Finally, we proved the uniqueness of polynomial interpolation.
The development also contains improved code equations to speed up the division of integers in target languages.
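Lagrange's explicit expression, the first of the three formalized algorithms, can be sketched over the rationals; `Fraction` stands in for the arbitrary fields of the formalization, and the helper names are illustrative. The abstract's failure case for integer interpolation also falls out: the unique polynomial through (0, 0) and (2, 1) has the non-integer coefficient 1/2.

```python
# Sketch of Lagrange interpolation over the rationals: build each basis
# polynomial l_i (l_i(x_i) = 1, l_i(x_j) = 0 for j != i) and sum y_i * l_i.
from fractions import Fraction

def lagrange(points):
    """Return coefficients (lowest degree first) of the polynomial p
    with p(x_i) = y_i for all (x_i, y_i) in points."""
    coeffs = [Fraction(0)] * len(points)
    for i, (xi, yi) in enumerate(points):
        basis = [Fraction(1)]
        denom = Fraction(1)
        for j, (xj, _) in enumerate(points):
            if j != i:
                # multiply basis by (x - x_j)
                basis = [Fraction(0)] + basis
                for k in range(len(basis) - 1):
                    basis[k] -= xj * basis[k + 1]
                denom *= xi - xj
        for k in range(len(basis)):
            coeffs[k] += yi * basis[k] / denom
    return coeffs

# The failure case from the abstract: interpolating (0, 0) and (2, 1)
# yields p(x) = x/2, which is not an integer polynomial.
assert lagrange([(0, 0), (2, 1)]) == [Fraction(0), Fraction(1, 2)]
```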

 

BibTeX

@article{Polynomial_Interpolation-AFP,
author = "Ren{\'e} Thiemann and Akihisa Yamada",
title = "Polynomial Interpolation",
journal = "Archive of Formal Proofs",
year = 2016,
note = "\url{https://www.isa-afp.org/entries/Polynomial_Interpolation.html},
Formal proof development"
}


Polynomial Factorization

René Thiemann and Akihisa Yamada
Archive of Formal Proofs 2016.

AFP entry

 

Abstract

Based on existing libraries for polynomial interpolation and matrices, we formalized several factorization algorithms for polynomials, including Kronecker’s algorithm for integer polynomials, Yun’s square-free factorization algorithm for field polynomials, and Berlekamp’s algorithm for polynomials over finite fields. By combining the last one with Hensel’s lifting, we derive an efficient factorization algorithm for the integer polynomials, which is then lifted for rational polynomials by mechanizing Gauss’ lemma. Finally, we assembled a combined factorization algorithm for rational polynomials, which combines all the mentioned algorithms and additionally uses the explicit formula for roots of quadratic polynomials and a rational root test.
As side products, we developed division algorithms for polynomials over integral domains, as well as primality-testing and prime-factorization algorithms for integers.
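The rational root test used in the combined algorithm rests on a classical fact: any rational root p/q (in lowest terms) of an integer polynomial has p dividing the constant term and q dividing the leading coefficient. A minimal sketch, with invented helper names and assuming nonzero constant and leading coefficients:

```python
# Rational root test: enumerate the finitely many candidates +-p/q and
# keep those that are actual roots.
from fractions import Fraction

def divisors(n):
    n = abs(n)
    return [d for d in range(1, n + 1) if n % d == 0]

def rational_roots(coeffs):
    """coeffs: integer coefficients, lowest degree first; assumes the
    constant term and leading coefficient are nonzero."""
    def eval_at(x):
        return sum(c * x ** i for i, c in enumerate(coeffs))
    cands = {Fraction(s * p, q)
             for p in divisors(coeffs[0])
             for q in divisors(coeffs[-1])
             for s in (1, -1)}
    return sorted(r for r in cands if eval_at(r) == 0)

# 2x^2 - 3x - 2 = (2x + 1)(x - 2): roots -1/2 and 2
assert rational_roots([-2, -3, 2]) == [Fraction(-1, 2), Fraction(2)]
```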

 

BibTeX

@article{Polynomial_Factorization-AFP,
author = "Ren{\'e} Thiemann and Akihisa Yamada",
title = "Polynomial Factorization",
journal = "Archive of Formal Proofs",
year = 2016,
note = "{\url{https://www.isa-afp.org/entries/Polynomial_Factorization.html},
Formal proof development}"
}


Hammering towards QED

Jasmin C. Blanchette, Cezary Kaliszyk, Lawrence C. Paulson, and Josef Urban
Journal of Formalized Reasoning 9(1), pp. 101 – 148, 2016.

doi:10.6092/issn.1972-5787/4593

 

Abstract

This paper surveys the emerging methods to automate reasoning over large libraries developed with formal proof assistants. We call these methods hammers. They give the authors of formal proofs a strong “one-stroke” tool for discharging difficult lemmas without the need for careful and detailed manual programming of proof search.
The main ingredients underlying this approach are efficient automatic theorem provers that can cope with hundreds of axioms, suitable translations of richer logics to their formalisms, heuristic and learning methods that select relevant facts from large libraries, and methods that reconstruct the automatically found proofs inside the proof assistants. We outline the history of these methods, explain the main issues and techniques, and show their strength on several large benchmarks. We also discuss the relation of this technology to the QED Manifesto and consider its implications for QED-style efforts.

 

BibTeX

@article{JBCKLPJU-JFR2016,
author = "Jasmin C. Blanchette and Cezary Kaliszyk and Lawrence C. Paulson and
Josef Urban",
title = "Hammering towards {QED}",
journal = "Journal of Formalized Reasoning",
volume = 9,
number = 1,
pages = "101--148",
year = 2016,
doi = "10.6092/issn.1972-5787/4593"
}


A Learning-Based Fact Selector for Isabelle/HOL

Jasmin Christian Blanchette, David Greenaway, Cezary Kaliszyk, Daniel Kühlwein, and Josef Urban
Journal of Automated Reasoning 57(3), pp. 219 – 244, 2016.

pdf, doi:10.1007/s10817-016-9362-8

© Springer Netherlands

 


 

BibTeX

@article{DBLP:journals/jar/BlanchetteGKKU16,
author = {Jasmin Christian Blanchette and
David Greenaway and
Cezary Kaliszyk and
Daniel K{\"{u}}hlwein and
Josef Urban},
title = {A Learning-Based Fact Selector for {Isabelle/HOL}},
journal = {Journal of Automated Reasoning},
volume = {57},
number = {3},
pages = {219--244},
year = {2016},
doi = {10.1007/s10817-016-9362-8}
}


AC-KBO Revisited

Akihisa Yamada, Sarah Winkler, Nao Hirokawa, and Aart Middeldorp
Theory and Practice of Logic Programming 16(2), pp. 163 – 188, 2016.

pdf, doi:10.1017/S1471068415000083

© Cambridge University Press 2015

 

Abstract

Equational theories that contain axioms expressing associativity and commutativity (AC) of certain operators are ubiquitous. Theorem proving methods in such theories rely on well-founded orders that are compatible with the AC axioms. In this paper we consider various definitions of AC-compatible Knuth-Bendix orders. The orders of Steinbach and of Korovin and Voronkov are revisited. The former is enhanced to a more powerful version, and we modify the latter to amend its lack of monotonicity on non-ground terms. We further present new complexity results. An extension reflecting the recent proposal of subterm coefficients in standard Knuth-Bendix orders is also given. The various orders are compared on problems in termination and completion.

 

BibTeX

@article{AYSWNHAM-TPLP16,
author = "Akihisa Yamada and Sarah Winkler and Nao Hirokawa and
Aart Middeldorp",
title = "{AC-KBO} Revisited",
journal = "Theory and Practice of Logic Programming",
volume = 16,
number = 2,
pages = "163--188",
year = 2016,
doi = "10.1017/S1471068415000083"
}


TcT: Tyrolean Complexity Tool

Martin Avanzini, Georg Moser and Michael Schaper
Proceedings of the 22nd International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS 2016), Lecture Notes in Computer Science 9636, pp. 407 – 423, 2016.

pdf, doi:10.1007/978-3-662-49674-9_24

 

Abstract

In this paper we present TcT v3.0, the latest version of our fully automated complexity analyser. TcT implements our framework for automated complexity analysis and focuses on extensibility and automation. TcT is open with respect to the input problem under investigation and the resource metric in question. It is the most powerful tool in the realm of automated complexity analysis of term rewrite systems. Moreover it provides an expressive problem-independent strategy language that facilitates proof search. We give insights about design choices, the implementation of the framework and report different case studies where we have applied TcT successfully.

 

BibTeX

@inproceedings{AMS:2016,
author = {Martin Avanzini and Georg Moser and Michael Schaper},
title = "{TcT}: {T}yrolean {C}omplexity {T}ool",
booktitle = "Proceedings of the 22nd International Conference on Tools and
Algorithms for the Construction and Analysis of Systems
(TACAS 2016)",
series = "Lecture Notes in Computer Science",
volume = 9636,
pages = "407--423",
year = 2016,
doi = "10.1007/978-3-662-49674-9_24"
}


Perron-Frobenius Theorem for Spectral Radius Analysis

Jose Divasón and Ondřej Kunčar and René Thiemann and Akihisa Yamada
Archive of Formal Proofs 2016.

AFP entry

 

Abstract

The spectral radius of a matrix A is the maximum norm of all eigenvalues of A. In previous work we already formalized that for a complex matrix A, the values in A^n grow polynomially in n if and only if the spectral radius is at most one. One problem with the above characterization is the determination of all complex eigenvalues. In case A contains only non-negative real values, a simplification is possible with the help of the Perron-Frobenius theorem, which tells us that it suffices to consider only the real eigenvalues of A, i.e., applying Sturm’s method can decide the polynomial growth of A^n.

We formalize the Perron-Frobenius theorem based on a proof via Brouwer’s fixpoint theorem, which is available in the HOL multivariate analysis (HMA) library. Since the results on the spectral radius is based on matrices in the Jordan normal form (JNF) library, we further develop a connection which allows us to easily transfer theorems between HMA and JNF. With this connection we derive the combined result: if A is a non-negative real matrix, and no real eigenvalue of A is strictly larger than one, then A^n is polynomially bounded in n.
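The largest real eigenvalue that the combined result refers to can be approximated numerically by power iteration, which for non-negative matrices converges to the Perron eigenvalue (assuming it is dominant). This is purely an illustration; the paper formalizes the theorem rather than computing eigenvalues this way.

```python
# Power iteration on non-negative matrices: the iterate converges to the
# Perron eigenvalue, whose comparison with 1 decides whether A^n stays
# polynomially bounded (per the combined result above).

def power_iteration(a, steps=200):
    n = len(a)
    v = [1.0] * n
    lam = 0.0
    for _ in range(steps):
        w = [sum(a[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)  # assumes the iterate never vanishes
        v = [x / lam for x in w]
    return lam

# spectral radius 1: powers of A stay polynomially bounded
assert abs(power_iteration([[0.5, 0.5], [0.5, 0.5]]) - 1.0) < 1e-9
# spectral radius 2: powers of A grow exponentially
assert abs(power_iteration([[1.0, 1.0], [1.0, 1.0]]) - 2.0) < 1e-9
```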

 

BibTeX

@article{Perron_Frobenius-AFP,
author = {Jose Divas{\'o}n and Ond{\v{r}}ej Kun{\v{c}}ar and Ren{\'e} Thiemann and Akihisa Yamada},
title = {Perron-{F}robenius Theorem for Spectral Radius Analysis},
journal = {Archive of Formal Proofs},
month = may,
year = 2016,
note = {\url{http://isa-afp.org/entries/Perron_Frobenius.shtml}, Formal proof development},
ISSN = {2150-914x},
}


A combination framework for complexity

Martin Avanzini and Georg Moser
Information and Computation 248, pp. 22 – 55, 2016.

pdf, doi:10.1016/j.ic.2015.12.007

 

Abstract

In this paper we present a combination framework for the automated polynomial complexity analysis of term rewrite systems. The framework covers both derivational and runtime complexity analysis, and is employed as theoretical foundation in the automated complexity tool TCT. We present generalisations of powerful complexity techniques, notably a generalisation of complexity pairs and (weak) dependency pairs. Finally, we also present a novel technique, called dependency graph decomposition, that in the dependency pair setting greatly increases modularity.

 

BibTeX

@article{MAGM-IC2016,
title = "A combination framework for complexity",
author = "Martin Avanzini and Georg Moser",
journal = "Information and Computation",
volume = 248,
pages = "22--55",
year = 2016,
doi = "10.1016/j.ic.2015.12.007"
}


Automating the First-Order Theory of Left-Linear Right-Ground Term Rewrite Systems

Franziska Rapp and Aart Middeldorp
Proceedings of the 1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016), Leibniz International Proceedings in Informatics 52, pp. 36:1 – 36:12, 2016.

pdf, doi:10.4230/LIPIcs.FSCD.2016.36

 

Abstract

The first-order theory of rewriting is decidable for finite left-linear right-ground rewrite systems. We present a new tool that implements the decision procedure for this theory. It is based on tree automata techniques. The tool offers the possibility to synthesize rewrite systems that satisfy properties that are expressible in the first-order theory of rewriting.
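The basic mechanism behind the tree automata techniques the tool builds on is bottom-up state computation over terms. A toy membership check for a deterministic bottom-up tree automaton (the even/odd automaton below is an invented example, not from the paper):

```python
# Deterministic bottom-up tree automaton: run a term to a state, then
# check acceptance against the final states.

def run(term, delta):
    """term = (symbol, child_terms...); returns the state reached,
    or None if some transition is missing."""
    sym, *kids = term
    states = tuple(run(k, delta) for k in kids)
    if None in states:
        return None
    return delta.get((sym, states))

# Automaton over {0, s} accepting the ground terms denoting even numbers:
delta = {
    ('0', ()): 'even',
    ('s', ('even',)): 'odd',
    ('s', ('odd',)): 'even',
}
final = {'even'}

two = ('s', ('s', ('0',)))
three = ('s', two)
assert run(two, delta) in final
assert run(three, delta) not in final
```

Decision procedures for the first-order theory of rewriting compose such automata (via products, projections, and complements) following the structure of the formula; the sketch shows only the innermost membership step.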

 

BibTeX

@inproceedings{FRAM-FSCD16,
author = "Franziska Rapp and Aart Middeldorp",
title = "Automating the First-Order Theory of Left-Linear
Right-Ground Term Rewrite Systems",
booktitle = "Proceedings of the 1st International Conference on Formal
Structures for Computation and Deduction (FSCD 2016)",
editor = "Delia Kesner and Brigitte Pientka",
series = "Leibniz International Proceedings in Informatics",
volume = 52,
pages = "36:1--36:12",
year = 2016,
doi = "10.4230/LIPIcs.FSCD.2016.36"
}


Complexity of Acyclic Term Graph Rewriting

Martin Avanzini and Georg Moser
Proceedings of the 1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016), Leibniz International Proceedings in Informatics 52, pp. 10:1 – 10:18, 2016.

pdf, doi:10.4230/LIPIcs.FSCD.2016.10

 

Abstract

Term rewriting has been used as a formal model to reason about the complexity of logic, functional, and imperative programs. In contrast to term rewriting, term graph rewriting permits sharing of common sub-expressions, and consequently is able to capture more closely reasonable implementations of rule based languages. However, the automated complexity analysis of term graph rewriting has received little to no attention.

With this work, we provide first steps towards overcoming this situation. We present adaptions of two prominent complexity techniques from term rewriting, viz, the interpretation method and dependency tuples. Our adaptions are non-trivial, in the sense that they can observe not only term but also graph structures, i.e. take sharing into account. In turn, the developed methods allow us to more precisely estimate the runtime complexity of programs where sharing of sub-expressions is essential.
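The payoff of sharing that the abstract describes can be made concrete with the classic doubling chain: unfolding n nested applications of a duplicating rule d(t) = pair(t, t) gives a tree of exponential size, while a term graph with maximal sharing stays linear. This contrast is an invented illustration, not an example from the paper.

```python
# Tree size vs. dag size of the n-fold doubling chain d(d(...d(x)...)):
# without sharing the unfolded term is exponential, with sharing linear.

def tree_size(n):
    # fully unfolded term: each d-layer duplicates the subterm below it
    return 1 if n == 0 else 1 + 2 * tree_size(n - 1)

def dag_size(n):
    # with maximal sharing each d-layer is stored exactly once
    return n + 1

assert tree_size(10) == 2 ** 11 - 1
assert dag_size(10) == 11
```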

 

BibTeX

@inproceedings{MAGM-FSCD2016,
author = "Martin Avanzini and Georg Moser",
title = "Complexity of Acyclic Term Graph Rewriting",
booktitle = "Proceedings of the 1st International Conference on Formal
Structures for Computation and Deduction (FSCD 2016)",
series = "Leibniz International Proceedings in Informatics",
volume = 52,
pages = "10:1--10:18",
year = 2016,
doi = "10.4230/LIPIcs.FSCD.2016.10"
}


Goal Translation for a Hammer for Coq

Łukasz Czajka and Cezary Kaliszyk
Proceedings of the 1st International Workshop on Hammers for Type Theories (HaTT 2016), EPTCS 210, pp. 13 – 20, 2016.

pdf, doi:10.4204/EPTCS.210.4

 

Abstract

Hammers are tools that provide general purpose automation for formal proof assistants. Despite the growing popularity of the more advanced versions of type theory, there are no hammers for such systems. We present an extension of the various hammer components to type theory: (i) a translation of a significant part of the Coq logic into the format of automated proof systems; (ii) a proof reconstruction mechanism based on a Ben-Yelles-type algorithm combined with limited rewriting, congruence closure and a first-order generalization of the left rules of Dyckhoff’s system LJT.

 

BibTeX

@inproceedings{lcck-hatt16,
author = {\L{}ukasz Czajka and
Cezary Kaliszyk},
title = {Goal Translation for a Hammer for Coq (Extended Abstract)},
booktitle = {Proceedings First International Workshop on Hammers for
Type Theories (HaTT 2016)},
pages = {13--20},
year = {2016},
doi = {10.4204/EPTCS.210.4},
editor = {Jasmin Blanchette and
Cezary Kaliszyk},
series = {{EPTCS}},
volume = {210},
}


Certifying Confluence of Almost Orthogonal CTRSs via Exact Tree Automata Completion

Christian Sternagel and Thomas Sternagel
Proceedings of the 1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016), Leibniz International Proceedings in Informatics 52, pp. 29:1 – 29:16, 2016.

pdf, doi:10.4230/LIPIcs.FSCD.2016.29

 

Abstract

Suzuki et al. showed that properly oriented, right-stable, orthogonal, and oriented conditional term rewrite systems with extra variables in right-hand sides are confluent. We present our Isabelle/HOL formalization of this result, including two generalizations. On the one hand, we relax proper orientedness and orthogonality to extended proper orientedness and almost orthogonality modulo infeasibility, as suggested by Suzuki et al. On the other hand, we further loosen the requirements of the latter, enabling more powerful methods for proving infeasibility of conditional critical pairs. Furthermore, we formalized a construction by Jacquemard that employs exact tree automata completion for non-reachability analysis and apply it to certify infeasibility of conditional critical pairs. Combining these two results and extending the conditional confluence checker ConCon accordingly, we are able to automatically prove and certify confluence of an important class of conditional term rewrite systems.

 

BibTeX

@inproceedings{CSTS-FSCD16,
author = "Christian Sternagel and Thomas Sternagel",
title = "Certifying Confluence of Almost Orthogonal {CTRSs} via
Exact Tree Automata Completion",
booktitle = "Proceedings of the 1st International Conference on Formal
Structures for Computation and Deduction (FSCD 2016)",
editor = "Delia Kesner and Brigitte Pientka",
series = "Leibniz International Proceedings in Informatics",
volume = 52,
pages = "29:1--29:16",
year = 2016,
doi = "10.4230/LIPIcs.FSCD.2016.29",
}


Interaction Automata and the ia2d Interpreter

Stéphane Gimenez and David Obwaller
Proceedings of the 1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016), Leibniz International Proceedings in Informatics 52, pp. 35:1 – 35:11, 2016.

doi:10.4230/LIPIcs.FSCD.2016.35

 

Abstract

We introduce interaction automata as a topological model of computation and present the conceptual plane interpreter ia2d. Interaction automata form a refinement of both interaction nets and cellular automata models that combine data deployment, memory management and structured computation mechanisms. Their local structure is inspired from pointer machines and allows an asynchronous spatial distribution of the computation. Our tool can be considered as a proof-of-concept piece of abstract hardware on which functional programs can be run in parallel.

 

BibTeX

@inproceedings{SGDO-FSCD2016,
author = "St{\'e}phane Gimenez and David Obwaller",
title = "Interaction Automata and the {ia2d} Interpreter",
booktitle = "Proceedings of the 1st International Conference on Formal
Structures for Computation and Deduction (FSCD 2016)",
series = "Leibniz International Proceedings in Informatics (LIPIcs)",
volume = 52,
pages = "35:1--35:11",
year = 2016,
doi = "10.4230/LIPIcs.FSCD.2016.35"
}


Normalisation by Random Descent

Vincent van Oostrom and Yoshihito Toyama
Proceedings of the 1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016), Leibniz International Proceedings in Informatics 52, pp. 32:1 – 32:18, 2016.

pdf, doi:10.4230/LIPIcs.FSCD.2016.32

© Creative Commons License CC-BY

 

Abstract

We present abstract hyper-normalisation results for strategies. These results are then applied to term rewriting systems, both first and higher-order. For example, we show hyper-normalisation of the left-outer strategy for, what we call, left-outer pattern rewrite systems, a class comprising both Combinatory Logic and the lambda-calculus but also systems with critical pairs. Our results apply to strategies that need not be deterministic but do have Newman’s random descent property: all reductions to normal form have the same length, with Huet and Lévy’s external strategy being an example. Technically, we base our development on supplementing the usual notion of commutation diagram with a notion of order, expressing that the measure of its right leg does not exceed that of its left leg, where measure is an abstraction of the usual notion of length. We give an exact characterisation of such global commutation diagrams, for pairs of reductions, by means of local ones, for pairs of steps, which we dub Dyck diagrams.
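The random descent property is easy to check by brute force on a small abstract rewrite system: enumerate every maximal reduction sequence and compare lengths. The system below is an invented example, not one from the paper.

```python
# Brute-force check of random descent on a finite, terminating abstract
# rewrite system given as a successor map.

def reduction_lengths(obj, steps):
    """Lengths of all maximal reduction sequences starting at obj."""
    succs = steps.get(obj, [])
    if not succs:
        return {0}  # obj is a normal form
    return {1 + n for s in succs for n in reduction_lengths(s, steps)}

# a -> b, a -> c, b -> d, c -> d: the strategy is non-deterministic, yet
# every reduction from a to the normal form d takes exactly two steps
steps = {'a': ['b', 'c'], 'b': ['d'], 'c': ['d']}
assert reduction_lengths('a', steps) == {2}
```

Random descent holds here because the set of lengths from every object is a singleton; adding a rule a -> d would break it, since {1, 2} would then be reachable from a.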

 

BibTeX

@inproceedings{VvOYT-FSCD16,
author = "Vincent van Oostrom and Yoshihito Toyama",
title = "Normalisation by Random Descent",
booktitle = "Proceedings of the 1st International Conference on Formal
Structures for Computation and Deduction (FSCD 2016)",
pages = "32:1--32:18",
series = "Leibniz International Proceedings in Informatics (LIPIcs)",
volume = 52,
editor = "Delia Kesner and Brigitte Pientka",
publisher = "Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik",
year = 2016,
doi = "10.4230/LIPIcs.FSCD.2016.32"
}


Gödel’s Functional Interpretation and the Concept of Learning

Thomas Powell
Proceedings of the 31st Annual ACM/IEEE Symposium on Logic in Computer Science (LICS 2016), pp. 136 – 145, 2016.

pdf, doi:10.1145/2933575.2933605

 

Abstract

In this article we study Gödel’s functional interpretation from the perspective of learning. We define the notion of a learning algorithm, and show that intuitive realizers of the functional interpretation of both induction and various comprehension schemas can be given in terms of these algorithms. In the case of arithmetical comprehension, we clarify how our learning realizers compare to those obtained traditionally using bar recursion, demonstrating that bar recursive interpretations of comprehension correspond to ‘forgetful’ learning algorithms. The main purpose of this work is to gain a deeper insight into the semantics of programs extracted using the functional interpretation. However, in doing so we also aim to better understand how it relates to other interpretations of classical logic for which the notion of learning is inbuilt, such as Hilbert’s epsilon calculus or the more recent learning-based realizability interpretations of Aschieri and Berardi.

 

BibTeX

@inproceedings{TP-LICS2016,
author = "Thomas Powell",
title = "G{\"o}del's Functional Interpretation and the Concept of
Learning",
booktitle = "Proceedings of the 31st Annual ACM/IEEE Symposium on Logic in
Computer Science (LICS 2016)",
pages = "136--145",
year = 2016,
doi = "10.1145/2933575.2933605"
}


TH1: The TPTP Typed Higher-Order Form with Rank-1 Polymorphism

Cezary Kaliszyk, Geoff Sutcliffe, Florian Rabe
5th Workshop on Practical Aspects of Automated Reasoning (PAAR 2016), CEUR-WS 1635, pp. 41 – 55, 2016.

pdf

 

Abstract

The TPTP world is a well established infrastructure that supports research, development, and deployment of Automated Theorem Proving (ATP) systems for classical logics. The TPTP language is one of the keys to the success of the TPTP world. Originally the TPTP world supported only first-order clause normal form (CNF). Over the years support for full first-order form (FOF), monomorphic typed first-order form (TF0), rank-1 polymorphic typed first-order form (TF1), and monomorphic typed higher-order form (TH0) have been added. The TF0, TF1, and TH0 languages also include constructs for arithmetic. This paper introduces the TH1 form, an extension of TH0 with TF1-style rank-1 polymorphism. TH1 is designed to be easy to process by existing reasoning tools that support ML-style polymorphism. The hope is that TH1 will be implemented in many popular ATP systems for typed higher-order logic.

 

BibTeX

@inproceedings{ckgsfr-paar16,
author = {Cezary Kaliszyk and
Geoff Sutcliffe and
Florian Rabe},
title = {{TH1:} The {TPTP} Typed Higher-Order Form with Rank-1
Polymorphism},
pages = {41--55},
booktitle = {Proc. 5th Workshop on Practical Aspects of Automated
Reasoning (PAAR 2016)},
year = {2016},
editor = {Pascal Fontaine and
Stephan Schulz and
Josef Urban},
series = {{CEUR}},
volume = {1635},
publisher = {CEUR-WS.org},
}


© 2025 Computational Logic
