Machine Learning Guidance for Connection Tableaux

Michael Färber¹ · Cezary Kaliszyk¹ · Josef Urban²

¹ University of Innsbruck, Innsbruck, Austria
² Czech Technical University in Prague, Prague, Czech Republic

Received: 10 April 2018 / Accepted: 14 July 2020
© The Author(s) 2020

Abstract

Connection calculi allow for very compact implementations of goal-directed proof search. We give an overview of our work related to connection tableaux calculi: first, we show optimised functional implementations of connection tableaux proof search, including a consistent Skolemisation procedure for machine learning. Then, we show two guidance methods based on machine learning, namely reordering of proof steps with Naive Bayesian probabilities, and expansion of a proof search tree with Monte Carlo Tree Search.

Keywords Connection tableaux · Internal guidance · Monte Carlo

1 Introduction

Connection calculi enable goal-directed proof search in a variety of logics. Connections have been considered, among others, for classical first-order logic [49], for higher-order logic [3], and for linear logic [23]. An important family of connection provers for first-order logic is derived from leanCoP [57,62]. leanCoP was inspired by leanTAP [5], a prover based on free-variable semantic tableaux. leanTAP popularised lean theorem proving, which uses Prolog to maximise efficiency while minimising code. The compact Prolog implementation of lean theorem provers made them attractive for experiments both with the calculus and with the implementation. For example, leanCoP has been adapted for intuitionistic (ileanCoP [56]), modal (MleanCoP [60]), and nonclausal first-order logic (nanoCoP [61]). The intuitionistic version of leanCoP [56] became the state-of-the-art prover for first-order problems in intuitionistic logic [67]. A variant of leanCoP with interpreted linear arithmetic (leanCoP-Ω) won the TFA division of CASC-J5 [77].
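
To make the shape of goal-directed connection proof search concrete, the following OCaml sketch implements a much-simplified propositional connection tableau search with reduction and extension steps. It is only an illustration of the calculus, not the leanCoP implementation: all identifiers are invented for this sketch, there is no unification and no regularity check, and termination is ensured only by a crude depth bound where real provers use iterative deepening.

    (* A much-simplified propositional sketch of connection tableau search.
       Identifiers are invented for this illustration; leanCoP additionally
       works with first-order literals, unification, and iterative deepening. *)
    type lit = { name : string; sign : bool }   (* {name = "p"; sign = false} is the literal not-p *)
    type clause = lit list
    type matrix = clause list                   (* the problem in clausal form *)

    let negate l = { l with sign = not l.sign }

    (* [prove depth matrix path goals] tries to close every literal in [goals].
       A literal is closed by a reduction step (its complement is on the active
       path) or by an extension step (its complement occurs in a matrix clause,
       whose remaining literals become new goals on an extended path). *)
    let rec prove depth matrix path goals =
      match goals with
      | [] -> true
      | lit :: rest ->
          let closes l = l = negate lit in
          let reduction () = List.exists closes path in
          let extension () =
            List.exists
              (fun cl ->
                List.exists closes cl
                && prove (depth - 1) matrix (lit :: path)
                     (List.filter (fun l -> not (closes l)) cl))
              matrix
          in
          depth > 0 && (reduction () || extension ()) && prove depth matrix path rest

    (* Example: the matrix {p}, {not-p, q}, {not-q} is unsatisfiable, so the
       search starting from the clause {p} closes the tableau. *)
    let () =
      let p = { name = "p"; sign = true } and q = { name = "q"; sign = true } in
      let m = [ [ p ]; [ negate p; q ]; [ negate q ] ] in
      assert (prove 5 m [] [ p ])

The point of the sketch is the control structure: at every open goal the prover chooses between a reduction and several possible extensions, and it is this choice point that the guidance methods discussed later reorder or evaluate.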

Various implementation modifications can be performed very elegantly, such as search strategies, scheduling, restricted backtracking [58], randomization of the order of proof search steps [66], and internal guidance [38,83]. We have used connection provers from the leanCoP family as a basis for experiments with machine learning (see Sect. 5) and proof certification [41]. For these applications, we implemented connection provers in functional instead of logic programming languages. There are several reasons: First, a large number of interactive theorem provers (ITPs), such as HOL Light [28], HOL4 [74], Isabelle [86], Coq [8], and Agda [16], are written in functional programming languages, lending themselves well to the integration of functional proof search tactics. Second, several machine learning algorithms such as Naive Bayes and k-NN have been implemented efficiently for ITPs in functional languages [15,39]. Third, we achieve better performance with functional-style implementation
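
As an illustration of the first kind of guidance mentioned in the abstract, the sketch below shows how Naive-Bayes-style scores could be used in OCaml to reorder candidate proof steps. It is a hedged sketch, not the code of any of our provers: the feature representation, the smoothing penalty, and all identifiers are assumptions made for this example.

    (* Hedged sketch: score candidate proof steps by naive-Bayes-style
       statistics collected from previous proofs, then try the best-scoring
       steps first. All names and constants are illustrative. *)
    module SMap = Map.Make (String)

    type stats = {
      uses : float;          (* how often the step was used in earlier proofs *)
      cooc : float SMap.t;   (* feature -> number of co-occurrences with a use *)
    }

    (* Log-space relevance of a step with statistics [s] for a goal described
       by [goal_features] (e.g. the symbols occurring in the goal). Features
       never seen together with a use of the step incur a fixed penalty. *)
    let score (s : stats) (goal_features : string list) : float =
      let penalty = -5.0 in
      List.fold_left
        (fun acc f ->
          match SMap.find_opt f s.cooc with
          | Some c when c > 0.0 -> acc +. log (c /. s.uses)
          | _ -> acc +. penalty)
        (log s.uses) goal_features

    (* Reorder proof steps: candidates are pairs of a step and its statistics;
       the highest-scoring steps come first. *)
    let reorder goal_features candidates =
      List.stable_sort
        (fun (_, s1) (_, s2) ->
          compare (score s2 goal_features) (score s1 goal_features))
        candidates

In this sketch, working in log space avoids floating-point underflow when many features are combined, and the stable sort keeps the prover's original step order on ties, so that disabling the guidance degrades gracefully to the unguided strategy.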