Synthesizing Datalog programs using numerical relaxation

Abstract
The problem of learning logical rules from examples arises in diverse fields, including program synthesis, logic programming, and machine learning. Existing approaches involve either solving computationally difficult combinatorial problems or performing parameter estimation in complex statistical models. In this paper, we present DIFFLOG, a technique that extends the logic programming language Datalog to the continuous setting. By attaching real-valued weights to the individual rules of a Datalog program, we naturally associate numerical values with the individual conclusions of the program. Analogous to the strategy of numerical relaxation in optimization problems, we can then first determine the rule weights that produce the best agreement between the training labels and the induced values of the output tuples, and subsequently recover the classical discrete-valued target program from the continuous optimum. We evaluate DIFFLOG on a suite of 34 benchmark problems from recent literature in knowledge discovery, formal verification, and database query-by-example, and demonstrate significant improvements in learning complex programs with recursive rules, invented predicates, and relations of arbitrary arity.
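
To make the relaxation concrete, the following small Python sketch mimics the idea on a toy reachability task. The candidate rules, the max-product value given to derived tuples, and the coordinate search over weights are assumptions made purely for illustration; they are not the paper's actual semiring, optimizer, or benchmarks.

# Sketch of numerically relaxed rule selection for a toy reachability task.
# Assumptions for illustration only: a max-product value for derived tuples
# and a crude coordinate search over rule weights.
import itertools

# EDB facts: a tiny directed graph a -> b -> c.
EDGES = {("a", "b"), ("b", "c")}
NODES = {"a", "b", "c"}

# Candidate rules for path(X, Y), each carrying a weight in [0, 1]:
#   r_base : path(X, Y) :- edge(X, Y).              (intended)
#   r_rec  : path(X, Y) :- path(X, Z), edge(Z, Y).  (intended, recursive)
#   r_flip : path(X, Y) :- edge(Y, X).              (spurious)
RULES = ["r_base", "r_rec", "r_flip"]

def evaluate(weights):
    """Weighted forward chaining: a derived tuple's value is the maximum,
    over its derivations, of the product of the rule weights used."""
    val = {}
    for _ in range(len(NODES) + 1):  # enough rounds to reach the fixpoint here
        new = dict(val)
        for x, y in itertools.product(NODES, NODES):
            candidates = [new.get((x, y), 0.0)]
            if (x, y) in EDGES:                       # r_base
                candidates.append(weights["r_base"])
            if (y, x) in EDGES:                       # r_flip
                candidates.append(weights["r_flip"])
            for z in NODES:                           # r_rec
                if (z, y) in EDGES:
                    candidates.append(weights["r_rec"] * val.get((x, z), 0.0))
            new[(x, y)] = max(candidates)
        val = new
    return val

# Training labels: desired output tuples get 1.0, undesired ones get 0.0.
LABELS = {("a", "b"): 1.0, ("b", "c"): 1.0, ("a", "c"): 1.0,
          ("b", "a"): 0.0, ("c", "b"): 0.0, ("c", "a"): 0.0}

def loss(weights):
    val = evaluate(weights)
    return sum((val.get(t, 0.0) - y) ** 2 for t, y in LABELS.items())

# Continuous phase: search for weights that best agree with the labels.
weights = {r: 0.5 for r in RULES}
grid = [i / 20 for i in range(21)]
for _ in range(5):
    for r in RULES:
        weights[r] = min(grid, key=lambda w: loss({**weights, r: w}))

# Discrete phase: recover a classical program by thresholding the weights.
program = [r for r in RULES if weights[r] > 0.5]
print("learned weights:", weights)
print("selected rules :", program)  # expected: r_base and r_rec, not r_flip

On this example the search drives the weight of the spurious reversed-edge rule toward 0 and the weights of the two intended rules toward 1, so thresholding the weights recovers a discrete program, mirroring the recovery step described in the abstract.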
Publisher
International Joint Conferences on Artificial Intelligence
Issue Date
2019-08-10
Language
English
Citation
28th International Joint Conference on Artificial Intelligence (IJCAI 2019), pp. 6117-6124
ISSN
1045-0823
DOI
10.24963/ijcai.2019/847
URI
http://hdl.handle.net/10203/277248
Appears in Collection
CS-Conference Papers (Conference Papers)