Sat 13 Jan 2018 15:00 - 15:30 at Crocker - Session 2 Chair(s): William E. Byrd

Inspired by the recent advances of deep neural networks (DNNs) in machine learning, we explore their application to problems that often appear in programming-languages research (PL problems), such as program synthesis and automated theorem proving. As a first step towards this goal, we report preliminary experimental results on applying DNNs to proof synthesis for intuitionistic propositional logic. The idea is to view the proof-synthesis problem as translation from a proposition to its proof. We train a seq2seq model, a popular model in neural machine translation, to generate a proof term from a given proposition. We empirically observe that, although the generated proof is correct for only approximately half of the propositions in the test data, a generated proof term is close to a correct one in terms of the tree edit distance between their abstract syntax trees (ASTs). Based on this observation, we implemented a proof-search procedure that uses the generated proof term as a hint and outperforms a brute-force one. We discuss future research directions and the gaps to be filled between DNNs and PL problems.
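The "proposition-to-proof translation" framing rests on the Curry-Howard correspondence: a proof term for an intuitionistic proposition is a lambda term whose type is that proposition, so a generated term can be checked against the proposition it is supposed to prove. The sketch below (hypothetical, not from the paper; all names are ours) checks candidate terms against the implication-only fragment:

```python
# Hypothetical sketch of the Curry-Howard view behind "proof synthesis
# as translation": a seq2seq model would map the proposition string to
# a proof-term string, and a checker like this one decides whether the
# generated term actually proves the proposition.

def check(term, prop, env=None):
    """Return True if lambda `term` proves `prop` (implications only).

    Terms: ("var", name) | ("lam", name, body)
    Props: a string atom, or ("->", antecedent, consequent)
    """
    env = env or {}
    if term[0] == "var":
        # A variable proves exactly the hypothesis it was bound to.
        return env.get(term[1]) == prop
    if term[0] == "lam" and isinstance(prop, tuple) and prop[0] == "->":
        # A lambda proves A -> B if its body proves B under hypothesis A.
        _, name, body = term
        _, ante, cons = prop
        return check(body, cons, {**env, name: ante})
    return False

# P -> (Q -> P) is proved by the term  fun p => fun q => p
prop = ("->", "P", ("->", "Q", "P"))
good = ("lam", "p", ("lam", "q", ("var", "p")))
bad  = ("lam", "p", ("lam", "q", ("var", "q")))
print(check(good, prop))  # True
print(check(bad, prop))   # False
```

A checker of this kind is also what makes the hint-guided search in the abstract plausible: a nearly correct generated term fails the check at a small number of subterms, which a search procedure can then repair.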

Abstract (obt18-paper8.pdf, 546 KiB)

Sat 13 Jan (times are displayed in time zone GMT-07:00, Tijuana, Baja California)

13:30 - 15:30: Off the Beaten Track 2018 - Session 2 at Crocker
Chair(s): William E. Byrd (University of Alabama at Birmingham, USA)

13:30 - 14:00
14:00 - 14:30: Annie Cherkaev (University of Utah), Sebastian Musslick (Princeton University), Jonathan Cohen (Princeton University), Vivek Srikumar (University of Utah), Matthew Flatt (University of Utah)
14:30 - 15:00: Max Willsey (University of Washington), Jared Roesch (University of Washington, USA)
15:00 - 15:30: Taro Sekiyama (IBM Research, Japan), Akifumi Imanishi (Kyoto University), Kohei Suenaga (Graduate School of Informatics, Kyoto University)