{{short description|Programming paradigm based on formal logic}}
'''Logic programming''' is a programming, database and knowledge representation paradigm based on formal logic. A logic program is a set of sentences in logical form, representing knowledge about some problem domain. Computation is performed by applying logical reasoning to that knowledge, to solve problems in the domain. Major logic programming language families include Prolog, Answer Set Programming (ASP) and Datalog. In all of these languages, rules are written in the form of ''clauses'':
:<code>A :- B<sub>1</sub>, ..., B<sub>n</sub>.</code>
and are read as declarative sentences in logical form:
:<code>A if B<sub>1</sub> and ... and B<sub>n</sub>.</code>
<code>A</code> is called the ''head'' of the rule, <code>B<sub>1</sub></code>, ..., <code>B<sub>n</sub></code> is called the ''body'', and the <code>B<sub>i</sub></code> are called ''literals'' or conditions. When n = 0, the rule is called a ''fact'' and is written in the simplified form:
:<code>A.</code>
Queries (or goals) have the same syntax as the bodies of rules and are commonly written in the form:
:<code>?- B<sub>1</sub>, ..., B<sub>n</sub>.</code>
In the simplest case of Horn clauses (or "definite" clauses), all of the A, B<sub>1</sub>, ..., B<sub>n</sub> are atomic formulae of the form p(t<sub>1</sub>, ..., t<sub>m</sub>), where p is a predicate symbol naming a relation, like "motherhood", and the t<sub>i</sub> are terms naming objects (or individuals). Terms include both constant symbols, like "charles", and variables, such as X, which start with an upper case letter.
Consider, for example, the following Horn clause program:
<syntaxhighlight lang="prolog">
mother_child(elizabeth, charles).

father_child(charles, william).
father_child(charles, harry).

parent_child(X, Y) :-
    mother_child(X, Y).
parent_child(X, Y) :-
    father_child(X, Y).

grandparent_child(X, Y) :-
    parent_child(X, Z),
    parent_child(Z, Y).
</syntaxhighlight>
Given a query, the program produces answers.
For instance, for the query <code>?- parent_child(X, william)</code>, the single answer is
<syntaxhighlight lang="prolog">
X = charles
</syntaxhighlight>
Various queries can be asked. For instance,
the program can be queried both to generate grandparents and to generate grandchildren. It can even be used to generate all pairs of grandchildren and grandparents, or simply to check whether a given pair stands in the relation:
<syntaxhighlight lang="prolog">
?- grandparent_child(X, william).
X = elizabeth

?- grandparent_child(elizabeth, Y).
Y = william;
Y = harry.

?- grandparent_child(X, Y).
X = elizabeth
Y = william;
X = elizabeth
Y = harry.

?- grandparent_child(william, harry).
no

?- grandparent_child(elizabeth, harry).
yes
</syntaxhighlight>
Although Horn clause logic programs are Turing complete,<ref>{{cite journal |last=Tärnlund |first=S.Å. |date=1977 |title=Horn clause computability |journal=BIT |volume=17 |issue=2 |pages=215–226 |doi=10.1007/BF01932293|s2cid=32577496 }}</ref><ref>{{cite journal |last1=Andréka |first1=H. |last2=Németi |first2=I. |date=1978 |title=The generalised completeness of Horn predicate-logic as a programming language |url=https://cyber.bibl.u-szeged.hu/index.php/actcybern/article/view/3160 |journal=Acta Cybernetica |volume=4 |issue=1 |pages=3–10}}</ref> for most practical applications, Horn clause programs need to be extended to "normal" logic programs with negative conditions. For example, the definition of sibling uses a negative condition, where the relation = is defined by the clause <code>X = X</code>:
<syntaxhighlight lang="prolog">
sibling(X, Y) :-
    parent_child(Z, X),
    parent_child(Z, Y),
    not(X = Y).
</syntaxhighlight>
Logic programming languages that include negative conditions have the knowledge representation capabilities of a non-monotonic logic.
In ASP and Datalog, logic programs have only a declarative reading, and their execution is performed by means of a proof procedure or model generator whose behaviour is not meant to be controlled by the programmer. However, in the Prolog family of languages, logic programs also have a procedural interpretation as goal-reduction procedures. From this point of view, clause A :- B<sub>1</sub>,...,B<sub>n</sub> is understood as:
:to solve <code>A</code>, solve <code>B<sub>1</sub></code>, and ... and solve <code>B<sub>n</sub></code>.
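For the family program above, this procedural reading turns a query into a sequence of goal reductions. Here is a sketch of one successful branch (the comments are explanatory; actual tracing output differs between Prolog systems):

```prolog
?- grandparent_child(X, william).
% The grandparent_child clause reduces the goal to:
% ?- parent_child(X, Z), parent_child(Z, william).
% The first parent_child clause and the fact mother_child(elizabeth, charles)
% solve the first subgoal with X = elizabeth, Z = charles.
% The second parent_child clause and the fact father_child(charles, william)
% then solve the remaining subgoal parent_child(charles, william).
X = elizabeth
```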
Negative conditions in the bodies of clauses also have a procedural interpretation, known as ''negation as failure'': a negative literal <code>not B</code> is deemed to hold if and only if the positive literal <code>B</code> fails to hold.
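With the sibling clause above and the family facts from the introduction, negation as failure can be seen in action (answer formatting varies between Prolog systems):

```prolog
?- sibling(william, harry).
yes
% parent_child(charles, william) and parent_child(charles, harry) both hold,
% and the subgoal william = harry fails, so not(william = harry) succeeds.

?- sibling(william, william).
no
% Here not(william = william) fails, because william = william succeeds.
```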
Much of the research in the field of logic programming has been concerned with trying to develop a logical semantics for negation as failure and with developing other semantics and other implementations for negation. These developments have been important, in turn, for supporting the development of formal methods for logic-based program verification and program transformation.
==History==
The use of mathematical logic to represent and execute computer programs is also a feature of the lambda calculus, developed by Alonzo Church in the 1930s. However, the first proposal to use the clausal form of logic for representing computer programs was made by Cordell Green.<ref>{{cite conference|first=Cordell|last=Green|url=https://www.ijcai.org/Proceedings/69/Papers/023.pdf|title=Application of Theorem Proving to Problem Solving|conference=IJCAI 1969}}</ref> This used an axiomatization of a subset of LISP, together with a representation of an input-output relation, to compute the relation by simulating the execution of the program in LISP. Foster and Elcock's ABSYS, on the other hand, employed a combination of equations and lambda calculus in an assertional programming language that places no constraints on the order in which operations are performed.<ref>{{cite conference|first1=J.M.|last1=Foster|first2=E.W.|last2=Elcock|title=ABSYS 1: An Incremental Compiler for Assertions: an Introduction|conference=Fourth Annual Machine Intelligence Workshop|series=Machine Intelligence|volume=4|place=Edinburgh, UK|publisher=Edinburgh University Press|date=1969|pages=423–429}}</ref>
Logic programming, with its current syntax of facts and rules, can be traced back to debates in the late 1960s and early 1970s about declarative versus procedural representations of knowledge in Artificial Intelligence. Advocates of declarative representations were notably working at Stanford, associated with John McCarthy, Bertram Raphael and Cordell Green, and in Edinburgh, with John Alan Robinson (an academic visitor from Syracuse University), Pat Hayes, and Robert Kowalski. Advocates of procedural representations were mainly centered at MIT, under the leadership of Marvin Minsky and Seymour Papert.<ref>{{Cite journal| doi = 10.1145/35043.35046 | last1 = Kowalski | first1 = R. A. | title = The early years of logic programming | journal = Communications of the ACM | volume = 31 | pages = 38–43 | year = 1988 | s2cid = 12259230 |url=http://www.doc.ic.ac.uk/~rak/papers/the%20early%20years.pdf}}</ref>
Although it was based on the proof methods of logic, Planner, developed by Carl Hewitt at MIT, was the first language to emerge within this proceduralist paradigm.<ref>{{cite conference|first=Carl|last=Hewitt|author-link=Carl Hewitt|title=Planner: A Language for Proving Theorems in Robots|url=https://www.ijcai.org/Proceedings/69/Papers/030.pdf|conference=IJCAI 1969}}</ref> Planner featured pattern-directed invocation of procedural plans from goals (i.e. goal-reduction or backward chaining) and from assertions (i.e. forward chaining). The most influential implementation of Planner was the subset of Planner, called Micro-Planner, implemented by Gerry Sussman, Eugene Charniak and Terry Winograd. Winograd used Micro-Planner to implement the landmark natural-language understanding program SHRDLU.<ref name ="Winograd">{{cite journal|first=Terry|last=Winograd|author-link=Terry Winograd|title=Understanding natural language|journal=Cognitive Psychology|volume=3| issue = 1|date=1972|pages=1–191|doi=10.1016/0010-0285(72)90002-3}}</ref> For the sake of efficiency, Planner used a backtracking control structure so that only one possible computation path had to be stored at a time. Planner gave rise to the programming languages QA4,<ref name=Rulifson>{{cite tech report
Logic programming in the narrower sense can be traced back to debates in the late 1960s and early 1970s about declarative versus procedural representations of knowledge in Artificial Intelligence. Advocates of declarative representations were notably working at ], associated with ], Bertram Raphael and Cordell Green, and in ], with ] (an academic visitor from ]), ], and ]. Advocates of procedural representations were mainly centered at ], under the leadership of ] and ]. | |||
| author =Jeff Rulifson | |||
| author-link =Jeff Rulifson | |||
|author2=Jan Derksen |author3=Richard Waldinger | |||
| title =QA4, A Procedural Calculus for Intuitive Reasoning | |||
| id =SRI AI Center Technical Note 73 | |||
| date =November 1973 | |||
| url =https://apps.dtic.mil/sti/pdfs/ADA052440.pdf | |||
}} | |||
</ref> Popler,<ref>Davies, J.M., 1971. POPLER: a POP-2 planner. Edinburgh University, Department of Machine Intelligence and Perception.</ref> Conniver,<ref>{{cite tech report |last1=McDermott |first1=D.V. |author-link1=Drew McDermott |last2=Sussman |first2=G.J. |author-link2=Gerald Jay Sussman |date=May 1972 |title=The Conniver reference manual |id=Artificial Intelligence Memo No. 259 |url=https://www.researchgate.net/publication/37597046_The_Conniver_Reference_Manual}}</ref> QLISP,<ref>{{cite tech report |last1=Reboh |first1=R. |last2=Sacerdoti |first2=E.D. |date=August 1973 |title=A preliminary QLISP manual |publisher=Artificial Intelligence Center, SRI International |url=https://www.sri.com/publication/cyber-formal-methods-pubs/preliminary-qlisp-manual/}}</ref> and the concurrent language Ether.<ref>{{cite journal |last1=Kornfeld |first1=W.A. |last2=Hewitt |first2=C.E. |author-link2=Carl Hewitt |date=1981 |title=The scientific community metaphor |journal=IEEE Transactions on Systems, Man, and Cybernetics |volume=11 |issue=1 |pages=24–33|doi=10.1109/TSMC.1981.4308575 |hdl=1721.1/5693 |s2cid=1322857 |hdl-access=free }}</ref>
Hayes and Kowalski in Edinburgh tried to reconcile the logic-based declarative approach to knowledge representation with Planner's procedural approach. Hayes (1973) developed an equational language, Golux, in which different procedures could be obtained by altering the behavior of the theorem prover.<ref>{{cite conference|first=Pat|last=Hayes|title=Computation and Deduction|book-title=Proceedings of the 2nd MFCS Symposium|publisher=Czechoslovak Academy of Sciences|date=1973|pages=105–118}}</ref>
In the meanwhile, Alain Colmerauer in Marseille was working on natural-language understanding, using logic to represent semantics and using resolution for question-answering. During the summer of 1971, Colmerauer invited Kowalski to Marseille, and together they discovered that the clausal form of logic could be used to represent formal grammars and that resolution theorem provers could be used for parsing. They observed that some theorem provers, like hyper-resolution,<ref>{{cite journal |last=Robinson |first=J. |date=1965 |title=Automatic deduction with hyper-resolution |journal=International Journal of Computer Mathematics |volume=1 |issue=3 |pages=227–234 |doi=10.2307/2272384|jstor=2272384 }}</ref> behave as bottom-up parsers, and others, like SL resolution (1971),<ref>{{cite journal|first1=Robert|last1=Kowalski|first2=Donald|last2=Kuehner|url=http://www.doc.ic.ac.uk/~rak/papers/sl.pdf|title=Linear Resolution with Selection Function|journal=Artificial Intelligence|volume=2|issue=3–4|date=1971|pages=227–260|doi=10.1016/0004-3702(71)90012-9}}</ref> behave as top-down parsers.
It was in the following summer of 1972 that Kowalski, again working with Colmerauer, developed the procedural interpretation of implications in clausal form. It also became clear that such clauses could be restricted to definite clauses or Horn clauses, and that SL-resolution could be restricted (and generalised) to SLD resolution. Kowalski's procedural interpretation and SLD were described in a 1973 memo, published in 1974.<ref name = "Kowalski">{{cite web|first=Robert|last=Kowalski|url=https://www.doc.ic.ac.uk/~rak/papers/IFIP%2074.pdf|title=Predicate Logic as a Programming Language|id=Memo 70|publisher=Department of Artificial Intelligence, University of Edinburgh|date=1973}} Also in Proceedings IFIP Congress, Stockholm, North Holland Publishing Co., 1974, pp. 569–574.</ref>
==Prolog==
{{main|Prolog}}
Colmerauer, with Philippe Roussel, used the procedural interpretation as the basis of Prolog, which was implemented in the summer and autumn of 1972. The first Prolog program, also written in 1972 and implemented in Marseille, was a French question-answering system. The use of Prolog as a practical programming language was given great momentum by the development of a compiler by David H. D. Warren in Edinburgh in 1977. Experiments demonstrated that Edinburgh Prolog could compete with the processing speed of other symbolic programming languages such as Lisp.<ref>{{cite journal |last1=Warren |first1=D.H. |last2=Pereira |first2=L.M. |last3=Pereira |first3=F. |date=1977 |title=Prolog-the language and its implementation compared with Lisp |journal=ACM SIGPLAN Notices |volume=12 |issue=8 |pages=109–115|doi=10.1145/872734.806939 }}</ref> Edinburgh Prolog became the ''de facto'' standard and strongly influenced the definition of ISO standard Prolog.
Logic programming gained international attention during the 1980s, when it was chosen by the Japanese Ministry of International Trade and Industry to develop the software for the Fifth Generation Computer Systems (FGCS) project. The FGCS project aimed to use logic programming to develop advanced Artificial Intelligence applications on massively parallel computers. Although the project initially explored the use of Prolog, it later adopted the use of concurrent logic programming, because it was closer to the FGCS computer architecture.
However, the committed choice feature of concurrent logic programming interfered with the language's logical semantics<ref>Ueda, K., 2018. Logic/constraint programming and concurrency: The hard-won lessons of the fifth generation computer project. Science of Computer Programming, 164, pp.3-17.</ref> and with its suitability for knowledge representation and problem solving applications. Moreover, the parallel computer systems developed in the project failed to compete with advances taking place in the development of more conventional, general-purpose computers. Together these two issues resulted in the FGCS project failing to meet its objectives. Interest in both logic programming and AI fell into world-wide decline.<ref>H.P. Newquist, 2020. The Brain Makers: The History Of Artificial Intelligence. The Relayer Group.</ref>
In the meanwhile, more declarative logic programming approaches, including those based on the use of Prolog, continued to make progress independently of the FGCS project. In particular, although Prolog was developed to combine declarative and procedural representations of knowledge, the purely declarative interpretation of logic programs became the focus for applications in the field of deductive databases. Work in this field became prominent around 1977, when Hervé Gallaire and Jack Minker organized a workshop on logic and databases in Toulouse.<ref>{{Citation | editor1-first = Hervé | editor1-last = Gallaire | editor2-first = John 'Jack' | editor2-last = Minker | contribution = Logic and Data Bases, Symposium on Logic and Data Bases, Centre d'études et de recherches de Toulouse, 1977 | title = Advances in Data Base Theory | publisher = Plenum Press | place = New York | year = 1978 | isbn = 978-0-306-40060-5 | url-access = registration | url = https://archive.org/details/logicdatabases0000symp }}.</ref> The field was eventually renamed as ''Datalog''.
This focus on the logical, declarative reading of logic programs was given further impetus by the development of constraint logic programming in the 1980s and Answer Set Programming in the 1990s. It is also receiving renewed emphasis in recent applications of Prolog.<ref name="Prolog Book">{{cite book |last1=Warren |first1=D.S. |editor-last1=Warren |editor-first1=D.S. |editor-last2=Dahl |editor-first2=V. |editor-last3=Eiter |editor-first3=T. |editor-last4=Hermenegildo |editor-first4=M.V. |editor-last5=Kowalski |editor-first5=R. |editor-last6=Rossi |editor-first6=F. |chapter=Introduction to Prolog |title=Prolog: The Next 50 Years |series=Lecture Notes in Computer Science |date=2023 |volume=13900 |publisher=Springer, Cham |doi=10.1007/978-3-031-35254-6_1 |pages=3–19|isbn=978-3-031-35253-9 }}</ref>
The Association for Logic Programming (ALP) was founded in 1986 to promote Logic Programming. Its official journal until 2000 was ''The Journal of Logic Programming''. Its founding editor-in-chief was J. Alan Robinson.<ref>{{cite journal |first=J. Alan |last=Robinson |year=2001 |title=Invited Editorial |journal=Theory and Practice of Logic Programming |volume=1 |issue=1 |pages=1 |publisher=Cambridge University Press |doi=10.1017/s1471068400000028|doi-broken-date=1 November 2024 |doi-access=free }}</ref> In 2001, the journal was renamed ''The Journal of Logic and Algebraic Programming'', and the official journal of ALP became ''Theory and Practice of Logic Programming'', published by Cambridge University Press.
==Concepts==
Logic programs enjoy a rich variety of semantics and problem solving methods, as well as a wide range of applications in programming, databases, knowledge representation and problem solving.
===Algorithm = Logic + Control===
The procedural interpretation of logic programs, which uses backward reasoning to reduce goals to subgoals, is a special case of the use of a problem-solving strategy to '''control''' the use of a declarative, '''logical''' representation of knowledge to obtain the behaviour of an '''algorithm'''. More generally, different problem-solving strategies can be applied to the same logical representation to obtain different algorithms. Alternatively, different algorithms can be obtained with a given problem-solving strategy by using different logical representations.<ref>{{cite journal|author=R.A. Kowalski|title=Algorithm = Logic + Control|journal=Communications of the ACM|volume=22| issue = 7|date=July 1979|pages=424–436|doi=10.1145/359131.359136|s2cid = 2509896|doi-access=free}}</ref>
The two main problem-solving strategies are backward reasoning (goal reduction) and forward reasoning, also known as top-down and bottom-up reasoning, respectively.
In the simple case of a propositional Horn clause program and a top-level atomic goal, backward reasoning determines an and-or tree, which constitutes the search space for solving the goal. The top-level goal is the root of the tree. Given any node in the tree and any clause whose head matches the node, there exists a set of child nodes corresponding to the sub-goals in the body of the clause. These child nodes are grouped together by an "and". The alternative sets of children corresponding to alternative ways of solving the node are grouped together by an "or".
Any search strategy can be used to search this space. Prolog uses a sequential, last-in-first-out, backtracking strategy, in which only one alternative and one sub-goal are considered at a time. Other search strategies are also possible: subgoals can be solved in parallel, and clauses can also be tried in parallel. The first strategy is called '''{{Visible anchor|and-parallel}}''' and the second strategy is called '''{{Visible anchor|or-parallel}}'''. More sophisticated search strategies, such as intelligent backtracking,<ref>{{cite book |last1=Bruynooghe |first1=M. |last2=Pereira |first2=L.M. |date=1984 |chapter=Deduction revision by intelligent backtracking |pages=194–215 |title=Implementations of Prolog |publisher=Ellis Horwood |location=Chichester, England}}</ref> or best-first search to find an optimal solution,<ref>{{cite conference |last=Nakamura |first=K. |date=July 1985 |title=Heuristic Prolog: logic program execution by heuristic search |conference=Conference on Logic Programming |pages=148–155 |location=Berlin, Heidelberg |publisher=Springer Berlin Heidelberg}}</ref> can also be used.
In the more general, non-propositional case, where sub-goals can share variables, other strategies can be used, such as choosing the subgoal that is most highly instantiated or that is sufficiently instantiated so that only one procedure applies.<ref>{{cite journal |last1=Genesereth |first1=M.R. |last2=Ginsberg |first2=M.L. |date=1985 |title=Logic programming |journal=Communications of the ACM |volume=28 |issue=9 |pages=933–941|doi=10.1145/4284.4287 |s2cid=15527861 |doi-access=free }}</ref> Such strategies are used, for example, in concurrent logic programming.
In most cases, backward reasoning from a query or goal is more efficient than forward reasoning. But sometimes with Datalog and Answer Set Programming, there may be no query that is separate from the set of clauses as a whole, and then generating all the facts that can be derived from the clauses is a sensible problem-solving strategy. Here is another example, where forward reasoning beats backward reasoning in a more conventional computation task, where the goal <code>?- fibonacci(n, Result)</code> is to find the n<sup>th</sup> Fibonacci number:
<syntaxhighlight lang="prolog">
fibonacci(0, 0).
fibonacci(1, 1).

fibonacci(N, Result) :-
    N > 1,
    N1 is N - 1,
    N2 is N - 2,
    fibonacci(N1, F1),
    fibonacci(N2, F2),
    Result is F1 + F2.
</syntaxhighlight>
Here the relation <code>fibonacci(N, M)</code> stands for the function <code>fibonacci(N) = M</code>, and <code>N is Expression</code> is Prolog notation for the predicate that instantiates the variable <code>N</code> to the value of <code>Expression</code>.
Given the goal of computing the Fibonacci number of <code>n</code>, backward reasoning reduces the goal to the two subgoals of computing the Fibonacci numbers of n-1 and n-2. It reduces the subgoal of computing the Fibonacci number of n-1 to the two subgoals of computing the Fibonacci numbers of n-2 and n-3, redundantly computing the Fibonacci number of n-2. This process of reducing one Fibonacci subgoal to two Fibonacci subgoals continues until it reaches the numbers 0 and 1. Its complexity is of the order 2<sup>n</sup>. In contrast, forward reasoning generates the sequence of Fibonacci numbers, starting from 0 and 1, without any recomputation, and its complexity is linear with respect to n.
Prolog cannot perform forward reasoning directly. But it can achieve the effect of forward reasoning within the context of backward reasoning by means of tabling: Subgoals are maintained in a table, along with their solutions. If a subgoal is re-encountered, it is solved directly by using the solutions already in the table, instead of re-solving the subgoals redundantly.<ref>{{cite journal |last1=Swift |first1=T. |last2=Warren |first2=D.S. |date=January 2012 |title=XSB: Extending Prolog with tabled logic programming |journal=Theory and Practice of Logic Programming |volume=12 |issue=1–2 |pages=157–187|doi=10.1017/S1471068411000500 |arxiv=1012.5123 |s2cid=6153112 }}</ref>
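In Prolog systems that support tabling, such as XSB and SWI-Prolog, it is enough to declare the predicate as tabled; the program itself is unchanged (a sketch using the directive syntax shared by both systems):

```prolog
:- table fibonacci/2.

fibonacci(0, 0).
fibonacci(1, 1).
fibonacci(N, Result) :-
    N > 1,
    N1 is N - 1,
    N2 is N - 2,
    fibonacci(N1, F1),
    fibonacci(N2, F2),
    Result is F1 + F2.
```

With tabling, each subgoal <code>fibonacci(K, F)</code> is solved at most once, so a query such as <code>?- fibonacci(30, Result).</code> takes time linear in N, mirroring the behaviour of forward reasoning.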
=== Relationship with functional programming ===
{{See also|Functional programming#Comparison to logic programming}}
Logic programming can be viewed as a generalisation of functional programming, in which functions are a special case of relations.<ref name=dis>{{cite book|author1=Daniel Friedman|author2=William Byrd|author3=Oleg Kiselyov|author4=Jason Hemann|title=The Reasoned Schemer, Second Edition|year=2018|publisher=The MIT Press}}</ref>
For example, the function mother(X) = Y (every X has only one mother Y) can be represented by the relation mother(X, Y). In this respect, logic programs are similar to relational databases, which also represent functions as relations.
Compared with relational syntax, functional syntax is more compact for nested functions. For example, in functional syntax the definition of maternal grandmother can be written in the nested form:
<syntaxhighlight lang="prolog">
maternal_grandmother(X) = mother(mother(X)).
</syntaxhighlight>
The same definition in relational notation needs to be written in the unnested, flattened form:
<syntaxhighlight lang="prolog">
maternal_grandmother(X, Y) :- mother(X, Z), mother(Z, Y).
</syntaxhighlight>
However, nested syntax can be regarded as syntactic sugar for unnested syntax. Ciao Prolog, for example, transforms functional syntax into relational form and executes the resulting logic program using the standard Prolog execution strategy.<ref>A. Casas, D. Cabeza, M. V. Hermenegildo. A Syntactic Approach to Combining Functional Notation, Lazy Evaluation and Higher-Order in LP Systems. The 8th International Symposium on Functional and Logic Programming (FLOPS'06), pages 142-162, April 2006.</ref> Moreover, the same transformation can be used to execute nested relations that are not functional. For example:
<syntaxhighlight lang="prolog">
grandparent(X) := parent(parent(X)).

parent(X) := mother(X).
parent(X) := father(X).

mother(charles) := elizabeth.
father(charles) := phillip.
mother(harry) := diana.
father(harry) := charles.

?- grandparent(X,Y).
X = harry,
Y = elizabeth.
X = harry,
Y = phillip.
</syntaxhighlight>
=== Relationship with relational programming ===
The term ''relational programming'' has been used to cover a variety of programming languages that treat functions as a special case of relations. Some of these languages, such as miniKanren<ref name=dis /> and relational linear programming,<ref>Kersting, K., Mladenov, M. and Tokmakov, P., 2017. Relational linear programming. Artificial Intelligence, 244, pp.188-216.</ref> are logic programming languages in the sense of this article.
However, the relational language RML is an imperative programming language<ref>Beyer, D., 2006, May. Relational programming with CrocoPat. In Proceedings of the 28th International Conference on Software engineering (pp. 807-810).</ref> whose core construct is a relational expression, similar to an expression in first-order predicate logic.
Other relational programming languages are based on the relational calculus<ref>MacLennan, B.J., 1983. Overview of relational programming. ACM SIGPLAN Notices, 18(3), pp.36-45.</ref> or relational algebra.<ref>Behnke, R., Berghammer, R., Meyer, E. and Schneider, P., 1998. RELVIEW—A system for calculating with relations and relational programming. In Fundamental Approaches to Software Engineering: First International Conference, FASE'98 Held as Part of the Joint European Conferences on Theory and Practice of Software, ETAPS'98 Lisbon, Portugal, March 28–April 4, 1998 Proceedings 1 (pp. 318-321). Springer Berlin Heidelberg.</ref>
=== Semantics of Horn clause programs === | |||
{{main|Syntax and semantics of logic programming}} | |||
Viewed in purely logical terms, there are two approaches to the declarative semantics of Horn clause logic programs: One approach is the original ''] semantics'', which understands solving a goal as showing that the goal is a theorem that is true in all ] of the program. | |||
In this approach, computation is ] in ]; and both ], as in SLD resolution, and ], as in hyper-resolution, are correct and complete theorem-proving methods. Sometimes such theorem-proving methods are also regarded as providing a separate ] for logic programs. But from a logical point of view, they are proof methods, rather than semantics. | |||
The other approach to the declarative semantics of Horn clause programs is the ''] semantics'', which understands solving a goal as showing that the goal is true (or satisfied) in some ] of the program. For Horn clause programs, there always exists such a standard model: It is the unique ''minimal model'' of the program. | |||
Informally speaking, a minimal model is a model that, when it is viewed as the set of all (variable-free) facts that are true in the model, contains no smaller set of facts that is also a model of the program. | |||
For example, the following facts represent the minimal model of the family relationships example in the introduction of this article. All other variable-free facts are false in the model: | |||
<syntaxhighlight lang="prolog"> | |||
mother_child(elizabeth, charles). | |||
father_child(charles, william). | |||
father_child(charles, harry). | |||
parent_child(elizabeth, charles). | |||
parent_child(charles, william). | |||
parent_child(charles, harry). | |||
grandparent_child(elizabeth, william). | |||
grandparent_child(elizabeth, harry). | |||
</syntaxhighlight> | |||
The satisfiability semantics also has an alternative, more mathematical characterisation as the ] of the function that uses the rules in the program to derive new facts from existing facts in one step of inference. | |||
Remarkably, the same problem-solving methods of forward and backward reasoning, which were originally developed for the logical consequence semantics, are equally applicable to the satisfiability semantics: Forward reasoning generates the minimal model of a Horn clause program, by deriving new facts from existing facts, until no new additional facts can be generated. Backward reasoning, which succeeds by reducing a goal to subgoals, until all subgoals are solved by facts, ensures that the goal is true in the minimal model, without generating the model explicitly.<ref>{{cite journal|last1=Van Emden|first1=M.H.|last2=Kowalski|first2=R.A.|date=October 1976|title=The semantics of predicate logic as a programming language|journal=]|volume=23|issue=4|pages=733–742|doi=10.1145/321978.321991 |s2cid=11048276 |doi-access=free}}</ref> | |||
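For the family relationships example, forward reasoning can be sketched as repeatedly applying the program's rules to the current set of facts until a fixed point is reached; the resulting set is exactly the minimal model listed above. A minimal Python sketch, with the two rules hard-coded for illustration:

```python
def forward_chain(facts):
    """Naive forward reasoning: apply all rules until no new facts emerge."""
    while True:
        new = set(facts)
        # parent_child(X, Y) :- mother_child(X, Y).  (similarly for father_child)
        for (pred, x, y) in facts:
            if pred in ("mother_child", "father_child"):
                new.add(("parent_child", x, y))
        # grandparent_child(X, Z) :- parent_child(X, Y), parent_child(Y, Z).
        parents = {(x, y) for (p, x, y) in facts if p == "parent_child"}
        for (x, y) in parents:
            for (y2, z) in parents:
                if y == y2:
                    new.add(("grandparent_child", x, z))
        if new == facts:          # fixed point: the minimal model
            return facts
        facts = new

model = forward_chain({("mother_child", "elizabeth", "charles"),
                       ("father_child", "charles", "william"),
                       ("father_child", "charles", "harry")})
```

The computed fixed point contains the eight facts of the minimal model and nothing else.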
The difference between the two declarative semantics can be seen with the definitions of addition and multiplication in ], which represents the natural numbers <code>0, 1, 2, ...</code> as a sequence of terms of the form <code>0, s(0), s(s(0)), ...</code>. In general, the term <code>s(X)</code> represents the successor of <code>X,</code> namely <code>X + 1.</code> Here are the standard definitions of addition and multiplication in functional notation: | |||
<pre> | |||
X + 0 = X. | |||
X + s(Y) = s(X + Y). | |||
i.e. X + (Y + 1) = (X + Y) + 1 | |||
X × 0 = 0. | |||
X × s(Y) = X + (X × Y). | |||
i.e. X × (Y + 1) = X + (X × Y). | |||
</pre> | |||
Here are the same definitions as a logic program, using <code>add(X, Y, Z)</code> to represent <code>X + Y = Z,</code> and <code>multiply(X, Y, Z)</code> to represent <code>X × Y = Z</code>: | |||
<syntaxhighlight lang="prolog"> | |||
add(X, 0, X). | |||
add(X, s(Y), s(Z)) :- add(X, Y, Z). | |||
multiply(X, 0, 0). | |||
multiply(X, s(Y), W) :- multiply(X, Y, Z), add(X, Z, W). | |||
</syntaxhighlight> | |||
The two declarative semantics both give the same answers for the same existentially quantified conjunctions of addition and multiplication goals. For example, <code>2 × 2 = X</code> has the solution <code>X = 4</code>; and <code>X × X = X + X</code> has two solutions <code>X = 0</code> and <code>X = 2</code>:
<syntaxhighlight lang="prolog"> | |||
?- multiply(s(s(0)), s(s(0)), X). | |||
X = s(s(s(s(0)))). | |||
?- multiply(X, X, Y), add(X, X, Y). | |||
X = 0, Y = 0. | |||
X = s(s(0)), Y = s(s(s(s(0)))). | |||
</syntaxhighlight> | |||
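These recursive definitions can be mirrored in Python by representing the numeral <code>s(X)</code> as a nested tuple, and the relational query <code>X × X = X + X</code> can be answered by enumerating small numerals. This is an illustrative sketch, not part of the article's formalism:

```python
ZERO = "0"
def s(n): return ("s", n)   # the successor term s(N)

def add(x, y):
    # add(X, 0, X).  add(X, s(Y), s(Z)) :- add(X, Y, Z).
    return x if y == ZERO else s(add(x, y[1]))

def mul(x, y):
    # multiply(X, 0, 0).  multiply(X, s(Y), W) :- multiply(X, Y, Z), add(X, Z, W).
    return ZERO if y == ZERO else add(x, mul(x, y[1]))

TWO = s(s(ZERO))
assert mul(TWO, TWO) == s(s(s(s(ZERO))))          # 2 x 2 = 4

# answer the relational query X * X = X + X by searching small numerals
nums = [ZERO, s(ZERO), TWO, s(TWO), s(s(TWO))]    # 0 .. 4
sols = [n for n in nums if mul(n, n) == add(n, n)]
assert sols == [ZERO, TWO]                        # X = 0 and X = 2, as above
```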
However, with the logical-consequence semantics, there are non-standard models of the program, in which, for example, <code>add(s(s(0)), s(s(0)), s(s(s(s(s(0)))))),</code> i.e. <code>2 + 2 = 5</code> is true. But with the satisfiability semantics, there is only one model, namely the standard model of arithmetic, in which <code>2 + 2 = 5</code> is false. | |||
In both semantics, the goal <syntaxhighlight lang="prolog" inline>?- add(s(s(0)), s(s(0)), s(s(s(s(s(0))))))</syntaxhighlight> fails. In the satisfiability semantics, the failure of the goal means that the truth value of the goal is false. But in the logical consequence semantics, the failure means that the truth value of the goal is unknown. | |||
===Negation as failure===
{{Main|Negation as failure}} | |||
] (NAF), as a way of concluding that a negative condition <code>not p</code> holds by showing that the positive condition <code>p</code> fails to hold, was already a feature of early Prolog systems. The resulting extension of ] is called ]. A similar construct, called "thnot", also existed in ]. | |||
The logical semantics of NAF was unresolved until ]<ref>{{cite book |last=Clark |first=K.L. |title=Logic and Data Bases |chapter=Negation as Failure |author-link=Keith Clark (computer scientist) |date=1977 |pages=293–322 |doi=10.1007/978-1-4684-3384-5_11 |location=Boston, MA |publisher=Springer US|isbn=978-1-4684-3386-9 }}</ref> showed that, under certain natural conditions, NAF is an efficient, correct (and sometimes complete) way of reasoning with the logical consequence semantics using the ] of a logic program in first-order logic. | |||
Completion amounts roughly to regarding the set of all the program clauses with the same predicate in the head, say: | |||
:<code>A :- Body<sub>1</sub>.</code>
:<code> ...</code>
:<code>A :- Body<sub>k</sub>.</code>
as a definition of the predicate:
:<code>A iff (Body<sub>1</sub> or ... or Body<sub>k</sub>)</code>
where <code>iff</code> means "if and only if". The completion also includes axioms of equality, which correspond to ]. Clark showed that proofs generated by SLDNF are structurally similar to proofs generated by a natural deduction style of reasoning with the completion of the program.
Consider, for example, the following program: | |||
<syntaxhighlight lang="prolog"> | |||
should_receive_sanction(X, punishment) :-
    is_a_thief(X),
    not should_receive_sanction(X, rehabilitation).
should_receive_sanction(X, rehabilitation) :-
    is_a_thief(X),
    is_a_minor(X),
    not is_violent(X).
is_a_thief(tom). | |||
</syntaxhighlight> | |||
Given the goal of determining whether tom should receive a sanction, the first rule succeeds in showing that tom should be punished: | |||
<syntaxhighlight lang="prolog"> | |||
?- should_receive_sanction(tom, Sanction). | |||
Sanction = punishment. | |||
</syntaxhighlight> | |||
This is because tom is a thief, and it cannot be shown that tom should be rehabilitated. It cannot be shown that tom should be rehabilitated, because it cannot be shown that tom is a minor. | |||
If, however, we receive new information that tom is indeed a minor, the previous conclusion that tom should be punished is replaced by the new conclusion that tom should be rehabilitated:
<syntaxhighlight lang="prolog">
is_a_minor(tom).
?- should_receive_sanction(tom, Sanction).
Sanction = rehabilitation.
</syntaxhighlight>
This property of withdrawing a conclusion when new information is added, is called non-monotonicity, and it makes logic programming a ]. | |||
But, if we are now told that tom is violent, the conclusion that tom should be punished will be reinstated:
<syntaxhighlight lang="prolog">
is_violent(tom).
?- should_receive_sanction(tom, Sanction).
Sanction = punishment.
</syntaxhighlight>
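The punish/rehabilitate/punish progression can be imitated in a few lines of Python, treating negation as failure as an explicit absence check over a set of facts. A hedged sketch (the predicate names follow the sanction program above):

```python
def rehabilitation(facts, x):
    # should_receive_sanction(X, rehabilitation) :-
    #     is_a_thief(X), is_a_minor(X), not is_violent(X).
    return (("is_a_thief", x) in facts and ("is_a_minor", x) in facts
            and ("is_violent", x) not in facts)    # negation as failure

def punishment(facts, x):
    # should_receive_sanction(X, punishment) :-
    #     is_a_thief(X), not should_receive_sanction(X, rehabilitation).
    return ("is_a_thief", x) in facts and not rehabilitation(facts, x)

facts = {("is_a_thief", "tom")}
assert punishment(facts, "tom")            # tom should be punished
facts.add(("is_a_minor", "tom"))           # new information withdraws that conclusion
assert rehabilitation(facts, "tom") and not punishment(facts, "tom")
facts.add(("is_violent", "tom"))           # further information reinstates it
assert punishment(facts, "tom")
```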
The completion of this program is:
<syntaxhighlight lang="prolog">
should_receive_sanction(X, Sanction) iff
    Sanction = punishment, is_a_thief(X),
    not should_receive_sanction(X, rehabilitation)
    or Sanction = rehabilitation, is_a_thief(X), is_a_minor(X),
    not is_violent(X).
is_a_thief(X) iff X = tom.
is_a_minor(X) iff X = tom.
is_violent(X) iff X = tom.
</syntaxhighlight>
The notion of completion is closely related to ] ] semantics for default reasoning,<ref>{{cite journal | last1=Gelfond |first1=M. |last2=Przymusinska |first2=H. |last3=Przymusinski |first3=T. |date=1989 |title=On the relationship between circumscription and negation as failure |journal=] |volume=38 |issue=1 |pages=75–94|doi=10.1016/0004-3702(89)90068-4 }}</ref> and to ] ].<ref>{{cite journal |last=Shepherdson |first=J.C. |date=1984 |title=Negation as failure: a comparison of Clark's completed data base and Reiter's closed world assumption |journal=] |volume=1 |issue=1 |pages=51–79|doi=10.1016/0743-1066(84)90023-2 }}</ref> | |||
The completion semantics for negation is a logical consequence semantics, for which SLDNF provides a proof-theoretic implementation. However, in the 1980s, the satisfiability semantics became more popular for logic programs with negation. In the satisfiability semantics, negation is interpreted according to the classical definition of truth in an intended or standard model of the logic program. | |||
In the case of logic programs with negative conditions, there are two main variants of the satisfiability semantics: In the ], the intended model of a logic program is a unique, three-valued, minimal model, which always exists. The well-founded semantics generalises the notion of ] in mathematical logic.<ref>{{cite journal |last1=Denecker |first1=M. |last2=Ternovska |first2=E. |title=A logic of nonmonotone inductive definitions |journal=] |volume=9 |issue=2 |pages=14:1–14:52 |date=2008|doi=10.1145/1342991.1342998 |s2cid=13156469 |url=https://lirias.kuleuven.be/handle/123456789/222628 |arxiv=cs/0501025 }}</ref> ]<ref>{{cite conference |last1=Rao |first1=P. |last2=Sagonas |first2=K. |last3=Swift |first3=T. |last4=Warren |first4=D.S. |last5=Freire |first5=J. |title=XSB: A system for efficiently computing well-founded semantics |conference=Logic Programming And Nonmonotonic Reasoning: 4th International Conference, LPNMR'97 |location=Dagstuhl Castle, Germany |date=July 28–31, 1997 |pages=430–440 |publisher=Springer Berlin Heidelberg |doi=10.1007/3-540-63255-7_33}}</ref> implements the well-founded semantics using SLG resolution.<ref>{{cite journal |author1=W. Chen |author2=D. S. Warren |title=Tabled Evaluation with Delaying for General Logic Programs | |||
|journal=] |volume=43 |issue=1 |pages=20–74 |date=January 1996 |doi=10.1145/227595.227597|s2cid=7041379 |doi-access=free }}</ref> | |||
In the alternative ], there may be no intended models or several intended models, all of which are minimal and two-valued. The stable model semantics underpins ] (ASP). | |||
Both the well-founded and stable model semantics apply to arbitrary logic programs with negation. However, both semantics coincide for ] logic programs. For example, the program for sanctioning thieves is (locally) stratified, and all three semantics for the program determine the same intended model: | |||
<syntaxhighlight lang="prolog"> | |||
should_receive_sanction(tom, punishment). | |||
is_a_thief(tom). | |||
is_a_minor(tom). | |||
is_violent(tom). | |||
</syntaxhighlight> | |||
Attempts to understand negation in logic programming have also contributed to the development of ].<ref>{{cite journal |author=Phan Minh Dung |title=On the acceptability of arguments and its fundamental role in nonmonotonic reasoning, logic programming, and n–person games |journal=Artificial Intelligence |volume=77 |issue= 2|pages=321–357 |year=1995 |doi=10.1016/0004-3702(94)00041-X|doi-access=free }}</ref> In an argumentation interpretation of negation, the initial argument that tom should be punished because he is a thief, is attacked by the argument that he should be rehabilitated because he is a minor. But the fact that tom is violent undermines the argument that tom should be rehabilitated and reinstates the argument that tom should be punished. | |||
===Metalogic programming=== | |||
], in which programs are treated as data, was already a feature of early Prolog implementations.<ref>Colmerauer, A. and Roussel, P., 1996. The birth of Prolog. In History of programming languages---II (pp. 331-367).</ref><ref name="Warren">Warren, D.H., Pereira, L.M. and Pereira, F., 1977. Prolog-the language and its implementation compared with Lisp. ACM SIGPLAN Notices, 12(8), pp.109-115.</ref> For example, the Edinburgh DEC10 implementation of Prolog included "an interpreter and a compiler, both written in Prolog itself".<ref name="Warren"/> The simplest metaprogram is the so-called "]" meta-interpreter: | |||
<syntaxhighlight lang="prolog"> | |||
solve(true).
solve((B, C)) :- solve(B), solve(C).
solve(A) :- clause(A, B), solve(B).
</syntaxhighlight> | |||
where true represents an empty conjunction, and (B,C) is a composite term representing the conjunction of B and C. The predicate clause(A,B) means that there is a clause of the form A :- B. | |||
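Ignoring unification, the behaviour of this vanilla meta-interpreter can be approximated for propositional programs in Python, with <code>clause(A, B)</code> represented as a mapping from each head to its alternative bodies. The example program is a hypothetical illustration:

```python
# head -> list of alternative bodies; an empty body means the clause is a fact
program = {
    "light_on": [["power_on", "switch_closed"]],
    "power_on": [[]],
    "switch_closed": [[]],
}

def solve(goal):
    # solve(A) :- clause(A, B), solve(B).
    # Try each clause whose head matches the goal; a body (conjunction)
    # succeeds if every subgoal in it succeeds, so an empty body succeeds
    # immediately (the analogue of solve(true)).
    return any(all(solve(g) for g in body) for body in program.get(goal, []))
```

For instance, <code>solve("light_on")</code> succeeds, while a goal with no matching clause, such as <code>solve("door_open")</code>, fails.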
Metaprogramming is an application of the more general use of a '']'' or '']'' to describe and reason about another language, called the ''object language''. | |||
Metalogic programming allows object-level and metalevel representations to be combined, as in natural language. For example, in the following program, the atomic formula <code>attends(Person, Meeting)</code> occurs both as an object-level formula, and as an argument of the metapredicates <code>prohibited</code> and <code>approved.</code> | |||
<syntaxhighlight lang="prolog"> | |||
prohibited(attends(Person, Meeting)) :-
    not(approved(attends(Person, Meeting))).
should_receive_sanction(Person, scolding) :-
    attends(Person, Meeting),
    lofty(Person),
    prohibited(attends(Person, Meeting)).
should_receive_sanction(Person, banishment) :-
    attends(Person, Meeting),
    lowly(Person),
    prohibited(attends(Person, Meeting)).
approved(attends(alice, tea_party)). | |||
attends(mad_hatter, tea_party). | |||
attends(dormouse, tea_party). | |||
lofty(mad_hatter). | |||
lowly(dormouse). | |||
?- should_receive_sanction(Person, Sanction).
Person = mad_hatter, | |||
Sanction = scolding. | |||
Person = dormouse, | |||
Sanction = banishment. | |||
</syntaxhighlight> | |||
===Relationship with the ]=== | |||
In his popular Introduction to Cognitive Science,<ref>{{Cite book|title=Mind: Introduction to Cognitive Science|last=Thagard|first=Paul|publisher=The MIT Press|year=2005|isbn=9780262701099|pages=11}}https://www.google.co.uk/books/edition/Mind_second_edition/gjcR1U2HT7kC?hl=en&gbpv=1&pg=PP11&printsec=frontcover</ref> ] includes logic and ] as alternative approaches to modelling human thinking. He argues that rules, which have the form ''IF condition THEN action'', are "very similar" to logical conditionals, but they are simpler and have greater psychological plausibility (page 51). Among other differences between logic and rules, he argues that logic uses deduction, but rules use search (page 45) and can be used to reason either forward or backward (page 47). Sentences in logic "have to be interpreted as ''universally true''", but rules can be ''defaults'', which admit exceptions (page 44).
He states that "unlike logic, rule-based systems can also easily represent strategic information | |||
about what to do" (page 45). For example, "IF you want to go home for the weekend, and you have bus fare, THEN | |||
you can catch a bus". He does not observe that the same strategy of reducing a goal to subgoals can be interpreted, in the manner of logic programming, as applying backward reasoning to a logical conditional: | |||
<syntaxhighlight lang="prolog"> | |||
can_go(you, home) :- have(you, bus_fare), catch(you, bus). | |||
</syntaxhighlight> | |||
All of these characteristics of rule-based systems - search, forward and backward reasoning, default reasoning, and goal-reduction - are also defining characteristics of logic programming. This suggests that Thagard's conclusion (page 56) that: | |||
<blockquote> | |||
Much of human knowledge is naturally described in terms of rules, and many kinds of thinking such as planning can be modeled by rule-based systems. | |||
</blockquote> | |||
also applies to logic programming. | |||
Other arguments showing how logic programming can be used to model aspects of human thinking are presented by ] and ] in their book, | |||
Human Reasoning and Cognitive Science.<ref>{{cite book | |||
| last = Stenning | |||
| first = Keith | |||
|author2=van Lambalgen, Michiel | |||
| title = Human reasoning and cognitive science | |||
| url = https://archive.org/details/humanreasoni_sten_2008_000_10735669 | |||
| url-access = registration | |||
| publisher = ] | |||
| year = 2008| isbn = 978-0-262-19583-6 | |||
}}https://philpapers.org/archive/STEHRA-5.pdf</ref> They show how the non-monotonic character of logic programs can be used to explain human performance on a variety of psychological tasks. They also show (page 237) that "closed–world reasoning in its guise as logic programming has an appealing neural implementation, unlike classical logic." | |||
In The Proper Treatment of Events,<ref>Van Lambalgen, M. and Hamm, F., 2008. The proper treatment of events. John Wiley & Sons. | |||
https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=3126320bb6e37ca3727fed404828b53fc56ff063</ref> | |||
Michiel van Lambalgen and Fritz Hamm investigate the use of constraint logic programming to code "temporal notions in natural language by looking at the way human beings construct time". | |||
===Knowledge representation=== | |||
The use of logic to represent procedural knowledge and strategic information was one of the main goals contributing to the early development of logic programming. Moreover, it continues to be an important feature of the Prolog family of logic programming languages today. However, many applications of logic programming, including Prolog applications, increasingly focus on the use of logic to represent purely declarative knowledge. These applications include both the representation of general ] knowledge and the representation of domain specific ]. | |||
Commonsense includes knowledge about cause and effect, as formalised, for example, in the ], ] and ]s. Here is a simplified example, which illustrates the main features of such formalisms. The first clause states that a fact holds immediately after an event initiates (or causes) the fact. The second clause is a '']'', which states that a fact that holds at a time continues to hold at the next time unless it is terminated by an event that happens at the time. This formulation allows more than one event to occur at the same time: | |||
<syntaxhighlight lang="prolog"> | |||
holds(Fact, Time2) :- | |||
happens(Event, Time1), | |||
Time2 is Time1 + 1, | |||
initiates(Event, Fact). | |||
holds(Fact, Time2) :- | |||
happens(Event, Time1), | |||
Time2 is Time1 + 1, | |||
holds(Fact, Time1), | |||
not(terminated(Fact, Time1)). | |||
terminated(Fact, Time) :- | |||
happens(Event, Time), | |||
terminates(Event, Fact). | |||
</syntaxhighlight> | |||
Here <code>holds</code> is a meta-predicate, similar to <code>solve</code> above. However, whereas <code>solve</code> has only one argument, which applies to general clauses, the first argument of <code>holds</code> is a fact and the second argument is a time (or state). The atomic formula <code>holds(Fact, Time)</code> expresses that the <code>Fact</code> holds at the <code>Time</code>. Such time-varying facts are also called ]. The atomic formula <code>happens(Event, Time)</code> expresses that the <code>Event</code> happens at the <code>Time</code>.
The following example illustrates how these clauses can be used to reason about causality in a toy ]. Here, in the initial state at time 0, a green block is on a table and a red block is stacked on the green block (like a traffic light). At time 0, the red block is moved to the table. At time 1, the green block is moved onto the red block. Moving an object onto a place terminates the fact that the object is on any place, and initiates the fact that the object is on the place to which it is moved: | |||
<syntaxhighlight lang="prolog"> | |||
holds(on(green_block, table), 0). | |||
holds(on(red_block, green_block), 0). | |||
happens(move(red_block, table), 0). | |||
happens(move(green_block, red_block), 1). | |||
initiates(move(Object, Place), on(Object, Place)). | |||
terminates(move(Object, Place2), on(Object, Place1)). | |||
?- holds(Fact, Time). | |||
Fact = on(green_block,table), | |||
Time = 0. | |||
Fact = on(red_block,green_block), | |||
Time = 0. | |||
Fact = on(green_block,table), | |||
Time = 1. | |||
Fact = on(red_block,table), | |||
Time = 1. | |||
Fact = on(green_block,red_block), | |||
Time = 2. | |||
Fact = on(red_block,table), | |||
Time = 2. | |||
</syntaxhighlight> | |||
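Forward reasoning over these clauses amounts to progressing the set of fluents from one time to the next: each move initiates a new <code>on</code> fact, and terminates the moved object's previous location, while unterminated fluents persist by inertia. A Python sketch of this progression, specialised to the <code>move</code> action above (fluents as tuples):

```python
def progress(fluents, moves):
    """Advance one time step, given a set of move(Object, Place) events."""
    # initiates(move(Object, Place), on(Object, Place)).
    initiated = {("on", obj, place) for (obj, place) in moves}
    # terminates(move(Object, Place2), on(Object, Place1)).
    moved = {obj for (obj, _) in moves}
    # frame axiom: a fluent persists unless its object was just moved
    persisting = {f for f in fluents if f[1] not in moved}
    return initiated | persisting

t0 = {("on", "green_block", "table"), ("on", "red_block", "green_block")}
t1 = progress(t0, {("red_block", "table")})        # happens(move(red_block, table), 0)
t2 = progress(t1, {("green_block", "red_block")})  # happens(move(green_block, red_block), 1)
```

The states <code>t1</code> and <code>t2</code> reproduce the fluents listed for times 1 and 2 in the query answers above.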
Forward reasoning and backward reasoning generate the same answers to the goal <code>holds(Fact, Time)</code>. But forward reasoning generates fluents ''progressively'' in temporal order, and backward reasoning generates fluents ''regressively'', as in the domain-specific use of ] in the ].<ref>Reiter, R., 1991. The frame problem in the situation calculus: A simple solution (sometimes) and a completeness result for goal regression. Artificial and Mathematical Theory of Computation, 3.</ref> | |||
Logic programming has also proved to be useful for representing domain-specific expertise in ]s.<ref>Merritt, D., 2012. Building expert systems in Prolog. Springer Science & Business Media. https://ds.amu.edu.et/xmlui/bitstream/handle/123456789/4434/%28Text%20Book%29%20Building%20Expert%20Systems%20in%20Prolog.pdf?sequence=1&isAllowed=y</ref> But human expertise, like general-purpose commonsense, is mostly implicit and ], and it is often difficult to represent such implicit knowledge in explicit rules. This difficulty does not arise, however, when logic programs are used to represent the existing, explicit rules of a business organisation or legal authority. | |||
For example, here is a representation of a simplified version of the first sentence of the British Nationality Act, which states that a person who is born in the UK becomes a British citizen at the time of birth if a parent of the person is a British citizen at the time of birth: | |||
<syntaxhighlight lang="prolog"> | |||
initiates(birth(Person), citizen(Person, uk)) :-
    time_of(birth(Person), Time),
    place_of(birth(Person), uk),
    parent_child(Another_Person, Person),
    holds(citizen(Another_Person, uk), Time).
</syntaxhighlight> | |||
Historically, the representation of a large portion of the British Nationality Act as a logic program in the 1980s<ref>{{cite journal|last1=Sergot|first1=M.J.|last2=Sadri|first2=F.|last3=Kowalski|first3=R.A.|last4=Kriwaczek|first4=F.|last5=Hammond|first5=P|last6=Cory|first6=H.T.|date=1986|url=http://www.doc.ic.ac.uk/~rak/papers/British%20Nationality%20Act.pdf|title=The British Nationality Act as a logic program|journal=]|volume=29|issue=5|pages=370–386|doi=10.1145/5689.5920 |s2cid=5665107 }}</ref> was "hugely influential for the development of computational representations of legislation, showing how logic programming enables intuitively appealing representations that can be directly deployed to generate automatic inferences".<ref>{{cite journal|last1=Prakken|first1=H.|last2=Sartor|first2=G.|date=October 2015|url=https://www.sciencedirect.com/science/article/pii/S0004370215000910/pdfft?md5=4dc0cf07e5c2a6926d285431b987a859&pid=1-s2.0-S0004370215000910-main.pdf|title=Law and logic: a review from an argumentation perspective|journal=]|volume=227|pages=214–245|doi=10.1016/j.artint.2015.06.005|s2cid=4261497 }}</ref> | |||
More recently, the PROLEG system,<ref>Satoh, K., 2023. PROLEG: Practical legal reasoning system. In Prolog: The Next 50 Years (pp. 277-283). Cham: Springer Nature Switzerland.</ref> initiated in 2009 and consisting of approximately 2500 rules and exceptions of civil code and supreme court case rules in Japan, has become possibly the largest legal rule base in the world.<ref name=":02">{{Cite journal |last1=Körner |first1=Philipp |last2=Leuschel |first2=Michael |last3=Barbosa |first3=João |last4=Costa |first4=Vítor Santos |last5=Dahl |first5=Verónica |last6=Hermenegildo |first6=Manuel V. |last7=Morales |first7=Jose F. |last8=Wielemaker |first8=Jan |last9=Diaz |first9=Daniel |last10=Abreu |first10=Salvador |last11=Ciatto |first11=Giovanni |date=November 2022 |title=Fifty Years of Prolog and Beyond |journal=Theory and Practice of Logic Programming |language=en |volume=22 |issue=6 |pages=776–858 |doi=10.1017/S1471068422000102 |issn=1471-0684|doi-access=free |arxiv=2201.10816 }}</ref> | |||
==Variants and extensions== | |||
===Prolog=== | |||
{{Main|Prolog}} | |||
The SLD resolution rule of inference is neutral about the order in which subgoals in the bodies of clauses can be ''selected'' for solution. For the sake of efficiency, Prolog restricts this order to the order in which the subgoals are written. SLD is also neutral about the strategy for searching the space of SLD proofs. | |||
Prolog searches this space, top-down, depth-first, trying different clauses for solving the same (sub)goal in the order in which the clauses are written. | |||
This search strategy has the advantage that the current branch of the tree can be represented efficiently by a ]. When a goal clause at the top of the stack is reduced to a new goal clause, the new goal clause is pushed onto the top of the stack. When the selected subgoal in the goal clause at the top of the stack cannot be solved, the search strategy '']'', removing the goal clause from the top of the stack, and retrying the attempted solution of the selected subgoal in the previous goal clause using the next clause that matches the selected subgoal. | |||
Backtracking can be restricted by using a subgoal, called '']'', written as !, which always succeeds but cannot be backtracked. Cut can be used to improve efficiency, but can also interfere with the logical meaning of clauses. In many cases, the use of cut can be replaced by negation as failure. In fact, negation as failure can be defined in Prolog, by using cut, together with any literal, say ''fail'', that unifies with the head of no clause: | |||
<syntaxhighlight lang="prolog"> | |||
not(P) :- P, !, fail. | |||
not(P). | |||
</syntaxhighlight> | |||
Prolog provides other features, in addition to cut, that do not have a logical interpretation. These include the built-in predicates ''assert'' and ''retract'' for destructively updating the state of the program during program execution. | |||
For example, the ] can be implemented without frame axioms using destructive change of state: | |||
<syntaxhighlight lang="prolog"> | |||
on(green_block, table). | |||
on(red_block, green_block). | |||
move(Object, Place2) :-
    retract(on(Object, Place1)),
    assert(on(Object, Place2)).
</syntaxhighlight> | |||
The sequence of move events and the resulting locations of the blocks can be computed by executing the query: | |||
<syntaxhighlight lang="prolog"> | |||
?- move(red_block, table), move(green_block, red_block), on(Object, Place). | |||
Object = red_block, | |||
Place = table. | |||
Object = green_block, | |||
Place = red_block. | |||
</syntaxhighlight> | |||
Various extensions of logic programming have been developed to provide a logical framework for such destructive change of state.<ref name="TL">Bonner, A.J. and Kifer, M., 1993, February. Transaction Logic Programming. In ICLP (Vol. 93, pp. 257-279).</ref><ref>Genesereth, M., 2023. Dynamic logic programming. In Prolog: The Next 50 Years (pp. 197-209). Cham: Springer Nature Switzerland.</ref><ref>Kowalski, R., Sadri, F., Calejo, M. and Dávila, J., 2023. Combining logic programming and imperative programming in LPS. In Prolog: The Next 50 Years (pp. 210-223). Cham: Springer Nature Switzerland.</ref> | |||
The broad range of Prolog applications, both in isolation and in combination with other languages, is highlighted in the Year of Prolog Book,<ref name="Prolog Book"/> celebrating the 50th anniversary of Prolog in 2022.
Prolog has also contributed to the development of other programming languages, including ], ], ], ], ], ], ], ], and ]. | |||
===Constraint logic programming=== | |||
{{Main|Constraint logic programming}} | |||
] (CLP) combines Horn clause logic programming with ]. It extends Horn clauses by allowing some predicates, declared as constraint predicates, to occur as literals in the body of a clause. Constraint predicates are not defined by the facts and rules in the program, but are predefined by some domain-specific model-theoretic structure or theory. | |||
Procedurally, subgoals whose predicates are defined by the program are solved by goal-reduction, as in ordinary logic programming, but constraints are simplified and checked for satisfiability by a domain-specific constraint-solver, which implements the semantics of the constraint predicates. An initial problem is solved by reducing it to a satisfiable conjunction of constraints. | |||
The first version of Prolog already included a constraint predicate dif(term1, term2), from Philippe Roussel's 1972 PhD thesis, which succeeds if both of its arguments are different terms, but which is delayed if either of the terms contains a variable.<ref name=":02"/>
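For example, in Prolog systems that provide <code>dif/2</code>, such as SWI-Prolog, the constraint succeeds or fails only once its arguments are sufficiently instantiated:
<syntaxhighlight lang="prolog">
% dif/2 delays until its arguments are sufficiently instantiated.
?- dif(X, b), X = a.
X = a.

?- dif(X, b), X = b.
false.
</syntaxhighlight>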
The following constraint logic program represents a toy temporal database of <code>john's</code> history as a teacher: | |||
<syntaxhighlight lang="prolog"> | |||
teaches(john, hardware, T) :- 1990 ≤ T, T < 1999. | |||
teaches(john, software, T) :- 1999 ≤ T, T < 2005. | |||
teaches(john, logic, T) :- 2005 ≤ T, T ≤ 2012. | |||
rank(john, instructor, T) :- 1990 ≤ T, T < 2010. | |||
rank(john, professor, T) :- 2010 ≤ T, T < 2014. | |||
</syntaxhighlight> | |||
Here <code>≤</code> and <code><</code> are constraint predicates, with their usual intended semantics. The following goal clause queries the database to find out when <code>john</code> both taught <code>logic</code> and was a <code>professor</code>: | |||
<syntaxhighlight lang="prolog"> | |||
?- teaches(john, logic, T), rank(john, professor, T). | |||
</syntaxhighlight> | |||
The solution | |||
<code> | |||
2010 ≤ T, T ≤ 2012 | |||
</code> | |||
results from simplifying the constraints | |||
<code> | |||
2005 ≤ T, T ≤ 2012, 2010 ≤ T, T < 2014. | |||
</code> | |||
Constraint logic programming has been used to solve problems in such fields as ], ], ] verification, ], ], and finance. It is closely related to ]. | |||
=== Datalog === | |||
{{Main|Datalog}} | |||
Datalog is a database definition language, which combines a relational view of data, as in ]s, with a logical view, as in logic programming. | |||
Relational databases use a relational calculus or relational algebra, with ], such as ''union'', ''intersection'', ''set difference'' and ''cartesian product'' to specify queries, which access a database. Datalog uses logical connectives, such as ''or'', ''and'' and ''not'' in the bodies of rules to define relations as part of the database itself. | |||
It was recognized early in the development of relational databases that recursive queries cannot be expressed in either relational algebra or relational calculus, and that this deficiency can be remedied by introducing a least-fixed-point operator.<ref>Aho, A.V. and Ullman, J.D., 1979, January. Universality of data retrieval languages. In Proceedings of the 6th ACM SIGACT-SIGPLAN symposium on Principles of programming languages (pp. 110-119).</ref><ref>Maier, D., Tekle, K.T., Kifer, M. and Warren, D.S., 2018. Datalog: concepts, history, and outlook. In Declarative Logic Programming: Theory, Systems, and Applications (pp. 3-100).</ref> In contrast, recursive relations can be defined naturally by rules in logic programs, without the need for any new logical connectives or operators.
Datalog differs from more general logic programming by having only constants and variables as terms. Moreover, all facts are variable-free, and rules are restricted, so that if they are executed bottom-up, then the derived facts are also variable-free. | |||
For example, consider the family database: | |||
<syntaxhighlight lang="prolog"> | |||
mother_child(elizabeth, charles). | |||
father_child(charles, william). | |||
father_child(charles, harry). | |||
parent_child(X, Y) :- | |||
mother_child(X, Y). | |||
parent_child(X, Y) :- | |||
father_child(X, Y). | |||
ancestor_descendant(X, Y) :- | |||
    parent_child(X, Y).
ancestor_descendant(X, Y) :- | |||
ancestor_descendant(X, Z), | |||
ancestor_descendant(Z, Y). | |||
</syntaxhighlight> | |||
Bottom-up execution derives the following set of additional facts and terminates: | |||
<syntaxhighlight lang="prolog"> | |||
parent_child(elizabeth, charles). | |||
parent_child(charles, william). | |||
parent_child(charles, harry). | |||
ancestor_descendant(elizabeth, charles). | |||
ancestor_descendant(charles, william). | |||
ancestor_descendant(charles, harry). | |||
ancestor_descendant(elizabeth, william). | |||
ancestor_descendant(elizabeth, harry). | |||
</syntaxhighlight> | |||
Top-down execution derives the same answers to the query: | |||
<syntaxhighlight lang="prolog"> | |||
?- ancestor_descendant(X, Y). | |||
</syntaxhighlight> | |||
But then it goes into an infinite loop. However, top-down execution with ] gives the same answers and terminates without looping. | |||
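For example, in Prolog systems that support tabling, such as XSB and SWI-Prolog, adding a table declaration for the recursive predicate is sufficient to make top-down execution of the program above terminate:
<syntaxhighlight lang="prolog">
% With tabling, repeated subgoals are suspended and resolved
% against previously derived answers, instead of being re-executed.
:- table ancestor_descendant/2.
</syntaxhighlight>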
=== Answer set programming === | |||
{{Main|Answer Set Programming}} | |||
Like Datalog, Answer Set Programming (ASP) is not Turing-complete. Moreover, instead of separating goals (or queries) from the program to be used in solving the goals, ASP treats the whole program as a goal, and solves the goal by generating a stable model that makes the goal true. For this purpose, it uses the ], according to which a logic program can have zero, one or more intended models. For example, the following program represents a degenerate variant of the map colouring problem of colouring two countries red or green:
<syntaxhighlight lang="prolog"> | |||
country(oz). | |||
country(iz). | |||
adjacent(oz, iz). | |||
colour(C, red) :- country(C), not(colour(C, green)). | |||
colour(C, green) :- country(C), not(colour(C, red)). | |||
</syntaxhighlight> | |||
The problem has four solutions represented by four stable models: | |||
<syntaxhighlight lang="prolog"> | |||
country(oz). country(iz). adjacent(oz, iz). colour(oz, red). colour(iz, red). | |||
country(oz). country(iz). adjacent(oz, iz). colour(oz, green). colour(iz, green). | |||
country(oz). country(iz). adjacent(oz, iz). colour(oz, red). colour(iz, green). | |||
country(oz). country(iz). adjacent(oz, iz). colour(oz, green). colour(iz, red). | |||
</syntaxhighlight> | |||
To represent the standard version of the map colouring problem, we need to add a constraint that two adjacent countries cannot be coloured the same colour. In ASP, this constraint can be written as a clause of the form: | |||
<syntaxhighlight lang="prolog"> | |||
:- country(C1), country(C2), adjacent(C1, C2), colour(C1, X), colour(C2, X). | |||
</syntaxhighlight> | |||
With the addition of this constraint, the problem now has only two solutions: | |||
<syntaxhighlight lang="prolog"> | |||
country(oz). country(iz). adjacent(oz, iz). colour(oz, red). colour(iz, green). | |||
country(oz). country(iz). adjacent(oz, iz). colour(oz, green). colour(iz, red). | |||
</syntaxhighlight> | |||
The addition of constraints of the form <code>:- Body.</code> eliminates models in which <code>Body</code> is true. | |||
Confusingly, ''constraints in ASP'' are different from ''constraints in CLP''. Constraints in CLP are predicates that qualify answers to queries (and solutions of goals). Constraints in ASP are clauses that eliminate models that would otherwise satisfy goals. Constraints in ASP are like integrity constraints in databases. | |||
This combination of ordinary logic programming clauses and constraint clauses illustrates the generate-and-test methodology of problem solving in ASP: The ordinary clauses define a search space of possible solutions, and the constraints filter out unwanted solutions.<ref>Eiter, T., Ianni, G. and Krennwallner, T., 2009. Answer Set Programming: A Primer. In Reasoning Web. Semantic Technologies for Information Systems: 5th International Summer School 2009, Brixen-Bressanone, Italy, August 30-September 4, 2009, Tutorial Lectures (pp. 40-110).</ref> | |||
Most implementations of ASP proceed in two steps: First they instantiate the program in all possible ways, reducing it to a propositional logic program (known as ''grounding''). Then they apply a propositional logic problem solver, such as the ] or a ]. However, some implementations, such as s(CASP)<ref>{{cite journal |first1=J. |last1=Arias |first2=M. |last2=Carro |first3=E. |last3=Salazar |first4=K. |last4=Marple |first5=G. |last5=Gupta |title=Constraint Answer Set Programming without Grounding |journal=Theory and Practice of Logic Programming |volume=18 |issue=3–4 |pages=337–354 |date=2018|doi=10.1017/S1471068418000285 |s2cid=13754645 |doi-access=free |arxiv=1804.11162 }}</ref> use a goal-directed, top-down, SLD resolution-like procedure without | |||
grounding. | |||
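For example, grounding the two-country colouring program above replaces the variable <code>C</code> by each country constant, producing a propositional program:
<syntaxhighlight lang="prolog">
colour(oz, red)   :- country(oz), not(colour(oz, green)).
colour(oz, green) :- country(oz), not(colour(oz, red)).
colour(iz, red)   :- country(iz), not(colour(iz, green)).
colour(iz, green) :- country(iz), not(colour(iz, red)).
</syntaxhighlight>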
===Abductive logic programming=== | |||
{{Main|Abductive logic programming}} | |||
]<ref>{{cite journal |first1=M. |last1=Denecker |first2=A.C. |last2=Kakas |title=Special issue: abductive logic programming |journal=Journal of Logic Programming |volume=44 |issue=1–3 |pages=1–4 |date=July 2000 |doi=10.1016/S0743-1066(99)00078-3 |doi-access=free }}</ref> (ALP), like CLP, extends normal logic programming by allowing the bodies of clauses to contain literals whose predicates are not defined by clauses. In ALP, these predicates are declared as ''abducible'' (or ''assumable''), and are used as in ] to explain observations, or more generally to add new facts to the program (as assumptions) to solve goals. | |||
For example, suppose we are given an initial state in which a red block is on a green block on a table at time 0: | |||
<syntaxhighlight lang="prolog"> | |||
holds(on(green_block, table), 0). | |||
holds(on(red_block, green_block), 0). | |||
</syntaxhighlight> | |||
Suppose we are also given the goal: | |||
<syntaxhighlight lang="prolog"> | |||
?- holds(on(green_block,red_block), 3), holds(on(red_block,table), 3). | |||
</syntaxhighlight> | |||
The goal can represent an observation, in which case a solution is an explanation of the observation. Or the goal can represent a desired future state of affairs, in which case a solution is a plan for achieving the goal.<ref>Eshghi, K., 1988, August. Abductive Planning with Event Calculus. In ICLP/SLP (pp. 562-579).</ref> | |||
We can use the rules for cause and effect presented earlier to solve the goal, by treating the <code>happens</code> predicate as abducible: | |||
<syntaxhighlight lang="prolog"> | |||
holds(Fact, Time2) :- | |||
happens(Event, Time1), | |||
Time2 is Time1 + 1, | |||
initiates(Event, Fact). | |||
holds(Fact, Time2) :- | |||
happens(Event, Time1), | |||
Time2 is Time1 + 1, | |||
holds(Fact, Time1), | |||
not(terminated(Fact, Time1)). | |||
terminated(Fact, Time) :- | |||
happens(Event, Time), | |||
terminates(Event, Fact). | |||
initiates(move(Object, Place), on(Object, Place)). | |||
terminates(move(Object, Place2), on(Object, Place1)). | |||
</syntaxhighlight> | |||
ALP solves the goal by reasoning backwards and adding assumptions to the program, to solve abducible subgoals. In this case there are many alternative solutions, including: | |||
<syntaxhighlight lang="prolog"> | |||
happens(move(red_block, table), 0). | |||
happens(tick, 1). | |||
happens(move(green_block, red_block), 2). | |||
</syntaxhighlight> | |||
<syntaxhighlight lang="prolog"> | |||
happens(tick,0). | |||
happens(move(red_block, table), 1). | |||
happens(move(green_block, red_block), 2). | |||
</syntaxhighlight> | |||
<syntaxhighlight lang="prolog"> | |||
happens(move(red_block, table), 0). | |||
happens(move(green_block, red_block), 1). | |||
happens(tick, 2). | |||
</syntaxhighlight> | |||
Here <syntaxhighlight inline lang="prolog">tick</syntaxhighlight> is an event that marks the passage of time without initiating or terminating any fluents. | |||
There are also solutions in which the two <code>move</code> events happen at the same time. For example: | |||
<syntaxhighlight lang="prolog"> | |||
happens(move(red_block, table), 0). | |||
happens(move(green_block, red_block), 0). | |||
happens(tick, 1). | |||
happens(tick, 2). | |||
</syntaxhighlight> | |||
Such solutions, if not desired, can be removed by adding an integrity constraint, which is like a constraint clause in ASP: | |||
<syntaxhighlight lang="prolog"> | |||
:- happens(move(Block1, Place), Time), happens(move(Block2, Block1), Time). | |||
</syntaxhighlight> | |||
Abductive logic programming has been used for fault diagnosis, planning, natural language processing and machine learning. It has also been used to interpret negation as failure as a form of abductive reasoning.<ref>Eshghi, K. and Kowalski, R.A., 1989, June. Abduction Compared with Negation by Failure. In ICLP (Vol. 89, pp. 234-255).</ref> | |||
===Inductive logic programming=== | |||
{{Main|Inductive logic programming}} | |||
Inductive logic programming (ILP) is an approach to ] that ] logic programs as hypothetical generalisations of positive and negative examples. Given a logic program representing background knowledge and positive examples together with constraints representing negative examples, an ILP system induces a logic program that generalises the positive examples while excluding the negative examples. | |||
ILP is similar to ALP, in that both can be viewed as generating hypotheses to explain observations, and as employing constraints to exclude undesirable hypotheses. But in ALP the hypotheses are variable-free facts, and in ILP the hypotheses are general rules.<ref>{{Cite book |last1=Nienhuys-Cheng |first1=Shan-hwei |title=Foundations of inductive logic programming |last2=Wolf |first2=Ronald de |date=1997 |publisher=Springer |isbn=978-3-540-62927-6 |series=Lecture notes in computer science Lecture notes in artificial intelligence |location=Berlin Heidelberg |page=173}}</ref><ref>Flach, P.A. and Kakas, A.C., 2000. On the relation between abduction and inductive learning. In Abductive Reasoning and Learning (pp. 1-33). Dordrecht: Springer Netherlands.</ref> | |||
For example, given only background knowledge of the mother_child and father_child relations, and suitable examples of the grandparent_child relation, current ILP systems can generate the definition of grandparent_child, inventing an auxiliary predicate, which can be interpreted as the parent_child relation:<ref>Cropper, A. and Dumančić, S., 2022. Inductive logic programming at 30: a new introduction. Journal of Artificial Intelligence Research, 74, pp.765-850.</ref> | |||
<syntaxhighlight lang="prolog"> | |||
grandparent_child(X, Y):- auxiliary(X, Z), auxiliary(Z, Y). | |||
auxiliary(X, Y):- mother_child(X, Y). | |||
auxiliary(X, Y):- father_child(X, Y). | |||
</syntaxhighlight> | |||
Stuart Russell<ref>Russell, S., 2019. Human compatible: Artificial intelligence and the problem of control. Penguin.</ref> has referred to such invention of new concepts as the most important step needed for reaching human-level AI. | |||
Recent work in ILP, combining logic programming, learning and probability, has given rise to the fields of ] and ]. | |||
===Concurrent logic programming=== | |||
{{Main|Concurrent logic programming}} | |||
Concurrent logic programming integrates concepts of logic programming with ]. Its development was given a big impetus in the 1980s by its adoption as the systems programming language of the ].<ref>Shunichi Uchida and Kazuhiro Fuchi. ''Proceedings of the FGCS Project Evaluation Workshop''. Institute for New Generation Computer Technology (ICOT). 1992.</ref>
A concurrent logic program is a set of guarded ] of the form: | |||
::<code>H :- G<sub>1</sub>, ..., G<sub>n</sub> | B<sub>1</sub>, ..., B<sub>n</sub>.</code> | |||
The conjunction <code>G<sub>1</sub>, ... , G<sub>n</sub></code> is called the ] of the clause, and {{char|{{!}}}} is the commitment operator. Declaratively, guarded Horn clauses are read as ordinary logical implications: | |||
::<code>H if G<sub>1</sub> and ... and G<sub>n</sub> and B<sub>1</sub> and ... and B<sub>n</sub>.</code> | |||
However, procedurally, when there are several clauses whose heads <code>H</code> match a given goal, then all of the clauses are executed in parallel, checking whether their guards <code>G<sub>1</sub>, ... , G<sub>n</sub></code> hold. If the guards of more than one clause hold, then a committed choice is made to one of the clauses, and execution proceeds with the subgoals <code>B<sub>1</sub>, ..., B<sub>n</sub></code> of the chosen clause. These subgoals can also be executed in parallel. Thus concurrent logic programming implements a form of "don't care nondeterminism", rather than "don't know nondeterminism". | |||
For example, the following concurrent logic program defines a predicate <code>shuffle(Left, Right, Merge)</code>, which can be used to shuffle two lists <code>Left</code> and <code>Right</code>, combining them into a single list <code>Merge</code> that preserves the ordering of the two lists <code>Left</code> and <code>Right</code>: | |||
<syntaxhighlight lang="prolog"> | |||
shuffle([], [], []).
shuffle(Left, Right, Merge) :-
    Left = [First | Rest] |
    Merge = [First | ShortMerge],
    shuffle(Rest, Right, ShortMerge).
shuffle(Left, Right, Merge) :-
    Right = [First | Rest] |
    Merge = [First | ShortMerge],
    shuffle(Left, Rest, ShortMerge).
</syntaxhighlight> | |||
Here, <code>[]</code> represents the empty list, and <code>[Head | Tail]</code> represents a list with first element <code>Head</code> followed by list <code>Tail</code>, as in Prolog. (Notice that the first occurrence of {{char|{{!}}}} in the second and third clauses is the list constructor, whereas the second occurrence of {{char|{{!}}}} is the commitment operator.) The program can be used, for example, to shuffle the lists <code>[ace, queen, king]</code> and <code>[1, 4, 2]</code> by invoking the goal clause:
<syntaxhighlight lang="prolog"> | |||
shuffle([ace, queen, king], [1, 4, 2], Merge).
</syntaxhighlight> | |||
The program will non-deterministically generate a single solution, for example <code>Merge = [ace, queen, 1, king, 4, 2]</code>.
] has argued<ref name="Hewitt">{{cite web | url=https://hal.archives-ouvertes.fr/hal-01148496v6/document | title=Inconsistency Robustness for Logic Programs | publisher=Hal Archives | date=27 April 2016 | access-date=7 November 2016 | author=Hewitt, Carl | pages=21–26}}</ref> that, because of the ], concurrent logic programming cannot implement general concurrency. However, according to the logical semantics, any result of a computation of a concurrent logic program is a logical consequence of the program, even though not all logical consequences can be derived. | |||
===Concurrent constraint logic programming=== | |||
{{Main|Concurrent constraint logic programming}} | |||
]<ref>Saraswat, V.A. and Rinard, M., 1989, December. Concurrent constraint programming. In Proceedings of the 17th ACM SIGPLAN-SIGACT symposium on Principles of programming languages (pp. 232-245).</ref> combines concurrent logic programming and ], using constraints to control concurrency. A clause can contain a guard, which is a set of constraints that may block the applicability of the clause. When the guards of several clauses are satisfied, concurrent constraint logic programming makes a committed choice to use only one. | |||
===Higher-order logic programming=== | |||
Several researchers have extended logic programming with ] features derived from ], such as predicate variables. Such languages include the Prolog extensions ]<ref name="hilog-jlp">{{cite journal |last1=Chen |first1=Weidong |last2=Kifer |first2=Michael |last3=Warren |first3=David S. |date=February 1993 |title=HiLog: A foundation for higher-order logic programming |journal=] |volume=15 |issue=3 |pages=187–230 |doi=10.1016/0743-1066(93)90039-J|doi-access=free }}</ref> and ].<ref>Miller, D.A. and Nadathur, G., 1986, July. Higher-order logic programming. In International Conference on Logic Programming (pp. 448-462). Berlin, Heidelberg: Springer Berlin Heidelberg.</ref> | |||
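For example, a higher-order map relation applies a predicate variable <code>P</code> to each list element. In standard Prolog a similar effect is obtained with <code>call/N</code> (a sketch; <code>map</code> and <code>double</code> are illustrative names):
<syntaxhighlight lang="prolog">
% map(P, Xs, Ys): Ys results from applying the binary predicate P
% to each element of Xs.
map(_, [], []).
map(P, [X|Xs], [Y|Ys]) :- call(P, X, Y), map(P, Xs, Ys).

double(X, Y) :- Y is 2 * X.

% ?- map(double, [1, 2, 3], Ys).
% Ys = [2, 4, 6].
</syntaxhighlight>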
===Linear logic programming=== | |||
Basing logic programming within ] has resulted in the design of logic programming languages that are considerably more expressive than those based on classical logic. Horn clause programs can only represent state change by the change in arguments to predicates. In linear logic programming, one can use the ambient linear logic to support state change. Some early designs of logic programming languages based on linear logic include LO,<ref>{{cite journal|first=Jean-Marc|last=Andreoli|doi=10.1093/logcom/2.3.297|title=Logic Programming with Focusing Proofs in Linear Logic|journal=]|date=1 June 1992|volume=2|issue=3|pages=297–347}}</ref> Lolli,<ref>{{cite journal|first1=Joshua|last1=Hodas|first2=Dale|last2=Miller|url=http://repository.upenn.edu/cgi/viewcontent.cgi?article=1540&context=cis_reports|title=Logic Programming in a Fragment of Intuitionistic Linear Logic|journal=]|date=1994|volume=110|issue=2|pages=327–365|doi=10.1006/inco.1994.1036 |doi-access=free}}</ref> ACL,<ref>{{cite conference|first1=Naoki|last1=Kobayashi|first2= Akinori|last2=Yonezawa|author-link2=Akinori Yonezawa|title=Asynchronous communication model based on linear logic|conference=US/Japan Workshop on Parallel Symbolic Computing|date=1994|pages=279–294|citeseerx=10.1.1.42.8749 }}</ref> and Forum.<ref>{{cite journal|first=Dale|last=Miller|title=Forum: A Multiple-Conclusion Specification Logic|journal=]|date=30 September 1996|volume=165|issue=1|pages=201–232|doi=10.1016/0304-3975(96)00045-X|doi-access=free}}</ref> Forum provides a goal-directed interpretation of all linear logic. | |||
===Object-oriented logic programming=== | |||
]<ref>Kifer, M. and Lausen, G., 1989, June. F-logic: a higher-order language for reasoning about objects, inheritance, and scheme. In Proceedings of the 1989 ACM SIGMOD international conference on Management of data (pp. 134-146).</ref> extends logic programming with objects and the frame syntax. | |||
]<ref>de Moura, P.J.L., 2003. Design of an Object-Oriented Logic Programming Language (Doctoral dissertation, Universidade da Beira Interior).</ref> extends the Prolog programming language with support for objects, protocols, and other OOP concepts. It supports most standard-compliant Prolog systems as backend compilers. | |||
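For example, a minimal Logtalk object encapsulating a list predicate might look as follows (a sketch with illustrative names):
<syntaxhighlight lang="logtalk">
:- object(list_utils).

    :- public(last/2).

    % last(List, X): X is the last element of List.
    last([X], X).
    last([_| Tail], X) :-
        last(Tail, X).

:- end_object.
</syntaxhighlight>
The predicate is then called by sending a message to the object: <code>?- list_utils::last([a, b, c], X).</code>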
===Transaction logic programming=== | |||
]<ref name="TL"/> is an extension of logic programming with a logical theory of state-modifying updates. It has both a model-theoretic semantics and a procedural one. An implementation of a subset of Transaction logic is available in the ]<ref>Yang, G. and Kifer, M., 2000, July. FLORA: Implementing an efficient DOOD system using a tabling logic engine. In International Conference on Computational Logic (pp. 1078-1093). Berlin, Heidelberg: Springer Berlin Heidelberg.</ref> system. Other prototypes are also ]. | |||
==See also== | ==See also== | ||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
* ] (includes ]) | |||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
==Citations== | |||
{{reflist}} | |||
==Sources== | ||
===General introductions=== | |||
<!--These sources are consider good introductions to the subject as a whole and can be used to verify most of the points made in the article--> | |||
* {{Cite journal | doi = 10.1016/0743-1066(94)90025-6| title = Logic programming and knowledge representation| journal = The Journal of Logic Programming| volume = 19–20| pages = 73–148| year = 1994| last1 = Baral | first1 = C. | last2 = Gelfond | first2 = M. | url=http://redwood.cs.ttu.edu/~mgelfond/PAPERS/survey.pdf| doi-access = free}} | |||
* {{Cite journal| doi = 10.1145/35043.35046 | last1 = Kowalski | first1 = R. A. | title = The early years of logic programming | journal = Communications of the ACM | volume = 31 | pages = 38–43 | year = 1988 | s2cid = 12259230 |url=http://www.doc.ic.ac.uk/~rak/papers/the%20early%20years.pdf}} | |||
* {{Cite book | title = Foundations of Logic Programming | last1 = Lloyd | first1 = J. W. | year = 1987 | publisher = Springer-Verlag | edition = 2nd}} | |||
===Other sources=== | |||
<!--These sources are referred to specifically in the text of the article and can be used to verify specific points made in the article--> | |||
* John McCarthy. "Programs with Common Sense". ''Symposium on Mechanization of Thought Processes''. National Physical Laboratory. Teddington, England. 1958.
* {{cite journal|title= Uniform proofs as a foundation for logic programming|journal= Annals of Pure and Applied Logic|volume= 51|issue= 1–2|pages= 125–157|doi= 10.1016/0168-0072(91)90068-W|year= 1991|last1= Miller|first1= Dale|last2= Nadathur|first2= Gopalan|last3= Pfenning|first3= Frank|last4= Scedrov|first4= Andre|url= https://repository.upenn.edu/cis_reports/711|doi-access= free}} | |||
* Ehud Shapiro (Editor). ''Concurrent Prolog''. MIT Press. 1987. | |||
* James Slagle. "Experiments with a Deductive Question-Answering Program". CACM. December 1965.
* Gabbay, Dov M.; Hogger, Christopher John; Robinson, J.A., eds. (1993–1998). ''Handbook of Logic in Artificial Intelligence and Logic Programming''. Vols. 1–5, Oxford University Press.
==Further reading== | ||
<!--These are other sources on the subject--> | |||
* Carl Hewitt. "Procedural Embedding of Knowledge in Planner". IJCAI 1971.
* Carl Hewitt. "The Repeated Demise of Logic Programming and Why It Will Be Reincarnated". ''AAAI Spring Symposium: What Went Wrong and Why: Lessons from AI Research and Applications'' 2006: 2–9.
* Evgeny Dantsin, Thomas {{Not a typo|Eiter}}, Georg Gottlob, Andrei Voronkov: Complexity and Expressive Power of Logic Programming. ACM Comput. Surv. 33(3): 374–425 (2001)
* Ulf Nilsson and Jan Maluszynski. ''Logic, Programming and Prolog'' (2nd ed.), 1995.
==External links== | |||
{{Commons category}} | |||
* {{Webarchive|url=https://web.archive.org/web/20081204113711/http://liinwww.ira.uka.de/bibliography/LogicProgramming/ |date=2008-12-04 }} | |||
* {{Webarchive|url=https://web.archive.org/web/20110903022204/http://www.mozart-oz.org/documentation/tutorial/node12.html |date=2011-09-03 }} in ] | |||
{{Programming paradigms navbox}} | |||
{{Types of programming languages}} | |||
{{Authority control}} | |||
Latest revision as of 08:48, 24 November 2024
Logic programming is a programming, database and knowledge representation paradigm based on formal logic. A logic program is a set of sentences in logical form, representing knowledge about some problem domain. Computation is performed by applying logical reasoning to that knowledge, to solve problems in the domain. Major logic programming language families include Prolog, Answer Set Programming (ASP) and Datalog. In all of these languages, rules are written in the form of clauses:
A :- B1, ..., Bn.
and are read as declarative sentences in logical form:
A if B1 and ... and Bn.
A is called the head of the rule, B1, ..., Bn is called the body, and the Bi are called literals or conditions. When n = 0, the rule is called a fact and is written in the simplified form:
A.
Queries (or goals) have the same syntax as the bodies of rules and are commonly written in the form:
?- B1, ..., Bn.
In the simplest case of Horn clauses (or "definite" clauses), all of the A, B1, ..., Bn are atomic formulae of the form p(t1 ,..., tm), where p is a predicate symbol naming a relation, like "motherhood", and the ti are terms naming objects (or individuals). Terms include both constant symbols, like "charles", and variables, such as X, which start with an upper case letter.
Consider, for example, the following Horn clause program:
mother_child(elizabeth, charles). father_child(charles, william). father_child(charles, harry). parent_child(X, Y) :- mother_child(X, Y). parent_child(X, Y) :- father_child(X, Y). grandparent_child(X, Y) :- parent_child(X, Z), parent_child(Z, Y).
Given a query, the program produces answers.
For instance, for the query ?- parent_child(X, william), the single answer is
X = charles
Various queries can be asked. For instance the program can be queried both to generate grandparents and to generate grandchildren. It can even be used to generate all pairs of grandchildren and grandparents, or simply to check if a given pair is such a pair:
?- grandparent_child(X, william). X = elizabeth ?- grandparent_child(elizabeth, Y). Y = william; Y = harry. ?- grandparent_child(X, Y). X = elizabeth Y = william; X = elizabeth Y = harry. ?- grandparent_child(william, harry). no ?- grandparent_child(elizabeth, harry). yes
Although Horn clause logic programs are Turing complete, for most practical applications, Horn clause programs need to be extended to "normal" logic programs with negative conditions. For example, the definition of sibling uses a negative condition, where the predicate = is defined by the clause X = X:
sibling(X, Y) :- parent_child(Z, X), parent_child(Z, Y), not(X = Y).
Logic programming languages that include negative conditions have the knowledge representation capabilities of a non-monotonic logic.
In ASP and Datalog, logic programs have only a declarative reading, and their execution is performed by means of a proof procedure or model generator whose behaviour is not meant to be controlled by the programmer. However, in the Prolog family of languages, logic programs also have a procedural interpretation as goal-reduction procedures. From this point of view, clause A :- B1,...,Bn is understood as:
to solve A, solve B1, and ... and solve Bn.
Negative conditions in the bodies of clauses also have a procedural interpretation, known as negation as failure: a negative literal not B is deemed to hold if and only if the positive literal B fails to hold.
Much of the research in the field of logic programming has been concerned with trying to develop a logical semantics for negation as failure and with developing other semantics and other implementations for negation. These developments have been important, in turn, for supporting the development of formal methods for logic-based program verification and program transformation.
History
The use of mathematical logic to represent and execute computer programs is also a feature of the lambda calculus, developed by Alonzo Church in the 1930s. However, the first proposal to use the clausal form of logic for representing computer programs was made by Cordell Green. This used an axiomatization of a subset of LISP, together with a representation of an input-output relation, to compute the relation by simulating the execution of the program in LISP. Foster and Elcock's Absys, on the other hand, employed a combination of equations and lambda calculus in an assertional programming language that places no constraints on the order in which operations are performed.
Logic programming, with its current syntax of facts and rules, can be traced back to debates in the late 1960s and early 1970s about declarative versus procedural representations of knowledge in artificial intelligence. Advocates of declarative representations were notably working at Stanford, associated with John McCarthy, Bertram Raphael and Cordell Green, and in Edinburgh, with John Alan Robinson (an academic visitor from Syracuse University), Pat Hayes, and Robert Kowalski. Advocates of procedural representations were mainly centered at MIT, under the leadership of Marvin Minsky and Seymour Papert.
Although it was based on the proof methods of logic, Planner, developed by Carl Hewitt at MIT, was the first language to emerge within this proceduralist paradigm. Planner featured pattern-directed invocation of procedural plans from goals (i.e. goal-reduction or backward chaining) and from assertions (i.e. forward chaining). The most influential implementation of Planner was the subset of Planner, called Micro-Planner, implemented by Gerry Sussman, Eugene Charniak and Terry Winograd. Winograd used Micro-Planner to implement the landmark, natural-language understanding program SHRDLU. For the sake of efficiency, Planner used a backtracking control structure so that only one possible computation path had to be stored at a time. Planner gave rise to the programming languages QA4, Popler, Conniver, QLISP, and the concurrent language Ether.
Hayes and Kowalski in Edinburgh tried to reconcile the logic-based declarative approach to knowledge representation with Planner's procedural approach. Hayes (1973) developed an equational language, Golux, in which different procedures could be obtained by altering the behavior of the theorem prover.
In the meanwhile, Alain Colmerauer in Marseille was working on natural-language understanding, using logic to represent semantics and using resolution for question-answering. During the summer of 1971, Colmerauer invited Kowalski to Marseille, and together they discovered that the clausal form of logic could be used to represent formal grammars and that resolution theorem provers could be used for parsing. They observed that some theorem provers, like hyper-resolution, behave as bottom-up parsers and others, like SL resolution (1971) behave as top-down parsers.
It was in the following summer of 1972, that Kowalski, again working with Colmerauer, developed the procedural interpretation of implications in clausal form. It also became clear that such clauses could be restricted to definite clauses or Horn clauses, and that SL-resolution could be restricted (and generalised) to SLD resolution. Kowalski's procedural interpretation and SLD were described in a 1973 memo, published in 1974.
Colmerauer, with Philippe Roussel, used the procedural interpretation as the basis of Prolog, which was implemented in the summer and autumn of 1972. The first Prolog program, also written in 1972 and implemented in Marseille, was a French question-answering system. The use of Prolog as a practical programming language was given great momentum by the development of a compiler by David H. D. Warren in Edinburgh in 1977. Experiments demonstrated that Edinburgh Prolog could compete with the processing speed of other symbolic programming languages such as Lisp. Edinburgh Prolog became the de facto standard and strongly influenced the definition of ISO standard Prolog.
Logic programming gained international attention during the 1980s, when it was chosen by the Japanese Ministry of International Trade and Industry to develop the software for the Fifth Generation Computer Systems (FGCS) project. The FGCS project aimed to use logic programming to develop advanced Artificial Intelligence applications on massively parallel computers. Although the project initially explored the use of Prolog, it later adopted the use of concurrent logic programming, because it was closer to the FGCS computer architecture.
However, the committed choice feature of concurrent logic programming interfered with the language's logical semantics and with its suitability for knowledge representation and problem solving applications. Moreover, the parallel computer systems developed in the project failed to compete with advances taking place in the development of more conventional, general-purpose computers. Together these two issues resulted in the FGCS project failing to meet its objectives. Interest in both logic programming and AI fell into world-wide decline.
In the meanwhile, more declarative logic programming approaches, including those based on the use of Prolog, continued to make progress independently of the FGCS project. In particular, although Prolog was developed to combine declarative and procedural representations of knowledge, the purely declarative interpretation of logic programs became the focus for applications in the field of deductive databases. Work in this field became prominent around 1977, when Hervé Gallaire and Jack Minker organized a workshop on logic and databases in Toulouse. The field was eventually renamed as Datalog.
This focus on the logical, declarative reading of logic programs was given further impetus by the development of constraint logic programming in the 1980s and Answer Set Programming in the 1990s. It is also receiving renewed emphasis in recent applications of Prolog.
The Association for Logic Programming (ALP) was founded in 1986 to promote Logic Programming. Its official journal until 2000, was The Journal of Logic Programming. Its founding editor-in-chief was J. Alan Robinson. In 2001, the journal was renamed The Journal of Logic and Algebraic Programming, and the official journal of ALP became Theory and Practice of Logic Programming, published by Cambridge University Press.
Concepts
Logic programs enjoy a rich variety of semantics and problem solving methods, as well as a wide range of applications in programming, databases, knowledge representation and problem solving.
Algorithm = Logic + Control
The procedural interpretation of logic programs, which uses backward reasoning to reduce goals to subgoals, is a special case of the use of a problem-solving strategy to control the use of a declarative, logical representation of knowledge to obtain the behaviour of an algorithm. More generally, different problem-solving strategies can be applied to the same logical representation to obtain different algorithms. Alternatively, different algorithms can be obtained with a given problem-solving strategy by using different logical representations.
The two main problem-solving strategies are backward reasoning (goal reduction) and forward reasoning, also known as top-down and bottom-up reasoning, respectively.
In the simple case of a propositional Horn clause program and a top-level atomic goal, backward reasoning determines an and-or tree, which constitutes the search space for solving the goal. The top-level goal is the root of the tree. Given any node in the tree and any clause whose head matches the node, there exists a set of child nodes corresponding to the sub-goals in the body of the clause. These child nodes are grouped together by an "and". The alternative sets of children corresponding to alternative ways of solving the node are grouped together by an "or".
Any search strategy can be used to search this space. Prolog uses a sequential, last-in-first-out, backtracking strategy, in which only one alternative and one sub-goal are considered at a time. However, other search strategies are also possible. For example, subgoals can be solved in parallel, and clauses can also be tried in parallel. The first strategy is called and-parallel and the second strategy is called or-parallel. Other search strategies, such as intelligent backtracking, or best-first search to find an optimal solution, are also possible.
In the more general, non-propositional case, where sub-goals can share variables, other strategies can be used, such as choosing the subgoal that is most highly instantiated or that is sufficiently instantiated so that only one procedure applies. Such strategies are used, for example, in concurrent logic programming.
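The backward, goal-reduction search described above can be sketched in Python for the propositional case (an illustrative analogy with hypothetical goal names, not an actual Prolog engine). Each head maps to its alternative clause bodies (the "or" branches of the and-or tree); each body is a list of subgoals solved left to right (the "and" branches), with Prolog's depth-first backtracking between alternatives:

```python
# Propositional Horn clause program: head -> list of alternative bodies.
# A fact is a clause with an empty body; a goal with no clauses fails.
# (Hypothetical toy program, not taken from the article's examples.)
program = {
    "light_on": [["power_ok", "switch_on"]],
    "power_ok": [["mains_ok"], ["generator_on"]],  # two alternative clauses
    "mains_ok": [],        # no clause for this goal: it cannot be solved
    "switch_on": [[]],     # fact: clause with empty body
    "generator_on": [[]],  # fact
}

def solve(goals):
    """Solve a conjunction of subgoals; True if all reduce to facts."""
    if not goals:
        return True
    first, rest = goals[0], goals[1:]
    # Try each alternative clause for the first subgoal (the "or" choice),
    # backtracking to the next alternative on failure.
    for body in program.get(first, []):
        if solve(body + rest):
            return True
    return False

print(solve(["light_on"]))  # mains_ok fails, but the generator_on alternative succeeds
```

The `for` loop over alternative bodies is the backtracking "or" choice; the list concatenation `body + rest` is the goal reduction that replaces a subgoal by the body of a matching clause.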
In most cases, backward reasoning from a query or goal is more efficient than forward reasoning. But sometimes with Datalog and Answer Set Programming, there may be no query that is separate from the set of clauses as a whole, and then generating all the facts that can be derived from the clauses is a sensible problem-solving strategy. Here is another example, where forward reasoning beats backward reasoning in a more conventional computation task, where the goal ?- fibonacci(n, Result) is to find the nth Fibonacci number:
fibonacci(0, 0). fibonacci(1, 1). fibonacci(N, Result) :- N > 1, N1 is N - 1, N2 is N - 2, fibonacci(N1, F1), fibonacci(N2, F2), Result is F1 + F2.
Here the relation fibonacci(N, M) stands for the function fibonacci(N) = M, and the predicate N is Expression is Prolog notation for the predicate that instantiates the variable N to the value of Expression.
Given the goal of computing the Fibonacci number of n, backward reasoning reduces the goal to the two subgoals of computing the Fibonacci numbers of n-1 and n-2. It reduces the subgoal of computing the Fibonacci number of n-1 to the two subgoals of computing the Fibonacci numbers of n-2 and n-3, redundantly computing the Fibonacci number of n-2. This process of reducing one Fibonacci subgoal to two Fibonacci subgoals continues until it reaches the numbers 0 and 1. Its complexity is of the order 2^n. In contrast, forward reasoning generates the sequence of Fibonacci numbers, starting from 0 and 1, without any recomputation, and its complexity is linear with respect to n.
Prolog cannot perform forward reasoning directly. But it can achieve the effect of forward reasoning within the context of backward reasoning by means of tabling: Subgoals are maintained in a table, along with their solutions. If a subgoal is re-encountered, it is solved directly by using the solutions already in the table, instead of re-solving the subgoals redundantly.
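The effect of tabling can be illustrated with a Python analogy (not Prolog's actual implementation): keeping solved subgoals in a table and reusing their solutions turns the exponential goal-reduction tree into a linear computation.

```python
from functools import lru_cache

# Naive goal reduction: each call spawns two recursive subgoals and
# re-solves fib(n-2) redundantly, giving on the order of 2^n calls.
def fib_naive(n):
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

# "Tabled" version: solved subgoals are stored in a table (here, a cache)
# and looked up instead of being re-solved, so only n+1 distinct
# subgoals are ever computed.
@lru_cache(maxsize=None)
def fib_tabled(n):
    if n < 2:
        return n
    return fib_tabled(n - 1) + fib_tabled(n - 2)

print(fib_tabled(30))  # 832040
```

Calling `fib_naive(30)` also returns 832040, but only after roughly a million redundant calls; the cache plays the role of Prolog's subgoal table.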
Relationship with functional programming
See also: Functional programming § Comparison to logic programming
Logic programming can be viewed as a generalisation of functional programming, in which functions are a special case of relations. For example, the function mother(X) = Y (every X has only one mother Y) can be represented by the relation mother(X, Y). In this respect, logic programs are similar to relational databases, which also represent functions as relations.
Compared with relational syntax, functional syntax is more compact for nested functions. For example, in functional syntax the definition of maternal grandmother can be written in the nested form:
maternal_grandmother(X) = mother(mother(X)).
The same definition in relational notation needs to be written in the unnested, flattened form:
maternal_grandmother(X, Y) :- mother(X, Z), mother(Z, Y).
However, nested syntax can be regarded as syntactic sugar for unnested syntax. Ciao Prolog, for example, transforms functional syntax into relational form and executes the resulting logic program using the standard Prolog execution strategy. Moreover, the same transformation can be used to execute nested relations that are not functional. For example:
grandparent(X) := parent(parent(X)). parent(X) := mother(X). parent(X) := father(X). mother(charles) := elizabeth. father(charles) := phillip. mother(harry) := diana. father(harry) := charles. ?- grandparent(X,Y). X = harry, Y = elizabeth. X = harry, Y = phillip.
Relationship with relational programming
The term relational programming has been used to cover a variety of programming languages that treat functions as a special case of relations. Some of these languages, such as miniKanren and relational linear programming are logic programming languages in the sense of this article.
However, the relational language RML is an imperative programming language whose core construct is a relational expression, which is similar to an expression in first-order predicate logic.
Other relational programming languages are based on the relational calculus or relational algebra.
Semantics of Horn clause programs
Main article: Syntax and semantics of logic programming
Viewed in purely logical terms, there are two approaches to the declarative semantics of Horn clause logic programs: One approach is the original logical consequence semantics, which understands solving a goal as showing that the goal is a theorem that is true in all models of the program.
In this approach, computation is theorem-proving in first-order logic; and both backward reasoning, as in SLD resolution, and forward reasoning, as in hyper-resolution, are correct and complete theorem-proving methods. Sometimes such theorem-proving methods are also regarded as providing a separate proof-theoretic (or operational) semantics for logic programs. But from a logical point of view, they are proof methods, rather than semantics.
The other approach to the declarative semantics of Horn clause programs is the satisfiability semantics, which understands solving a goal as showing that the goal is true (or satisfied) in some intended (or standard) model of the program. For Horn clause programs, there always exists such a standard model: It is the unique minimal model of the program.
Informally speaking, a minimal model is a model that, when it is viewed as the set of all (variable-free) facts that are true in the model, contains no smaller set of facts that is also a model of the program.
For example, the following facts represent the minimal model of the family relationships example in the introduction of this article. All other variable-free facts are false in the model:
mother_child(elizabeth, charles). father_child(charles, william). father_child(charles, harry). parent_child(elizabeth, charles). parent_child(charles, william). parent_child(charles, harry). grandparent_child(elizabeth, william). grandparent_child(elizabeth, harry).
The satisfiability semantics also has an alternative, more mathematical characterisation as the least fixed point of the function that uses the rules in the program to derive new facts from existing facts in one step of inference.
Remarkably, the same problem-solving methods of forward and backward reasoning, which were originally developed for the logical consequence semantics, are equally applicable to the satisfiability semantics: Forward reasoning generates the minimal model of a Horn clause program, by deriving new facts from existing facts, until no new additional facts can be generated. Backward reasoning, which succeeds by reducing a goal to subgoals, until all subgoals are solved by facts, ensures that the goal is true in the minimal model, without generating the model explicitly.
The difference between the two declarative semantics can be seen with the definitions of addition and multiplication in successor arithmetic, which represents the natural numbers 0, 1, 2, ... as a sequence of terms of the form 0, s(0), s(s(0)), .... In general, the term s(X) represents the successor of X, namely X + 1.
Here are the standard definitions of addition and multiplication in functional notation:
X + 0 = X. X + s(Y) = s(X + Y). i.e. X + (Y + 1) = (X + Y) + 1 X × 0 = 0. X × s(Y) = X + (X × Y). i.e. X × (Y + 1) = X + (X × Y).
Here are the same definitions as a logic program, using add(X, Y, Z) to represent X + Y = Z, and multiply(X, Y, Z) to represent X × Y = Z:
add(X, 0, X). add(X, s(Y), s(Z)) :- add(X, Y, Z). multiply(X, 0, 0). multiply(X, s(Y), W) :- multiply(X, Y, Z), add(X, Z, W).
The two declarative semantics both give the same answers for the same existentially quantified conjunctions of addition and multiplication goals. For example, 2 × 2 = X has the solution X = 4; and X × X = X + X has two solutions, X = 0 and X = 2:
?- multiply(s(s(0)), s(s(0)), X). X = s(s(s(s(0)))). ?- multiply(X, X, Y), add(X, X, Y). X = 0, Y = 0. X = s(s(0)), Y = s(s(s(s(0)))).
However, with the logical-consequence semantics, there are non-standard models of the program, in which, for example, add(s(s(0)), s(s(0)), s(s(s(s(s(0)))))), i.e. 2 + 2 = 5, is true. But with the satisfiability semantics, there is only one model, namely the standard model of arithmetic, in which 2 + 2 = 5 is false.
In both semantics, the goal ?- add(s(s(0)), s(s(0)), s(s(s(s(s(0)))))) fails. In the satisfiability semantics, the failure of the goal means that the truth value of the goal is false. But in the logical consequence semantics, the failure means that the truth value of the goal is unknown.
Negation as failure
Main article: Negation as failure
Negation as failure (NAF), as a way of concluding that a negative condition not p holds by showing that the positive condition p fails to hold, was already a feature of early Prolog systems. The resulting extension of SLD resolution is called SLDNF. A similar construct, called "thnot", also existed in Micro-Planner.
The logical semantics of NAF was unresolved until Keith Clark showed that, under certain natural conditions, NAF is an efficient, correct (and sometimes complete) way of reasoning with the logical consequence semantics using the completion of a logic program in first-order logic.
Completion amounts roughly to regarding the set of all the program clauses with the same predicate in the head, say:
A :- Body1.
...
A :- Bodyk.
as a definition of the predicate:
A iff (Body1 or ... or Bodyk)
where iff means "if and only if". The completion also includes axioms of equality, which correspond to unification. Clark showed that proofs generated by SLDNF are structurally similar to proofs generated by a natural deduction style of reasoning with the completion of the program.
Consider, for example, the following program:
should_receive_sanction(X, punishment) :- is_a_thief(X), not should_receive_sanction(X, rehabilitation). should_receive_sanction(X, rehabilitation) :- is_a_thief(X), is_a_minor(X), not is_violent(X). is_a_thief(tom).
Given the goal of determining whether tom should receive a sanction, the first rule succeeds in showing that tom should be punished:
?- should_receive_sanction(tom, Sanction). Sanction = punishment.
This is because tom is a thief, and it cannot be shown that tom should be rehabilitated. It cannot be shown that tom should be rehabilitated, because it cannot be shown that tom is a minor.
If, however, we receive new information that tom is indeed a minor, the previous conclusion that tom should be punished is replaced by the new conclusion that tom should be rehabilitated:
is_a_minor(tom). ?- should_receive_sanction(tom, Sanction). Sanction = rehabilitation.
This property of withdrawing a conclusion when new information is added, is called non-monotonicity, and it makes logic programming a non-monotonic logic.
But, if we are now told that tom is violent, the conclusion that tom should be punished will be reinstated:
is_violent(tom). ?- should_receive_sanction(tom, Sanction). Sanction = punishment.
The completion of this program is:
should_receive_sanction(X, Sanction) iff Sanction = punishment, is_a_thief(X), not should_receive_sanction(X, rehabilitation) or Sanction = rehabilitation, is_a_thief(X), is_a_minor(X), not is_violent(X). is_a_thief(X) iff X = tom. is_a_minor(X) iff X = tom. is_violent(X) iff X = tom.
The notion of completion is closely related to John McCarthy's circumscription semantics for default reasoning, and to Ray Reiter's closed world assumption.
The completion semantics for negation is a logical consequence semantics, for which SLDNF provides a proof-theoretic implementation. However, in the 1980s, the satisfiability semantics became more popular for logic programs with negation. In the satisfiability semantics, negation is interpreted according to the classical definition of truth in an intended or standard model of the logic program.
In the case of logic programs with negative conditions, there are two main variants of the satisfiability semantics: In the well-founded semantics, the intended model of a logic program is a unique, three-valued, minimal model, which always exists. The well-founded semantics generalises the notion of inductive definition in mathematical logic. XSB Prolog implements the well-founded semantics using SLG resolution.
In the alternative stable model semantics, there may be no intended models or several intended models, all of which are minimal and two-valued. The stable model semantics underpins answer set programming (ASP).
Both the well-founded and stable model semantics apply to arbitrary logic programs with negation. However, both semantics coincide for stratified logic programs. For example, the program for sanctioning thieves is (locally) stratified, and all three semantics for the program determine the same intended model:
should_receive_sanction(tom, punishment). is_a_thief(tom). is_a_minor(tom). is_violent(tom).
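The non-monotonic behaviour of the sanctions example can be sketched in Python (an illustrative propositional analogy with shortened hypothetical names, not a Prolog engine): a negative literal not B succeeds exactly when the attempt to solve B fails, so adding facts can withdraw and reinstate conclusions.

```python
# Ground sanctions program: "punish" / "rehab" abbreviate the two
# should_receive_sanction conclusions; ("not", g) is a negative literal.
program = {
    "punish": [["thief", ("not", "rehab")]],
    "rehab":  [["thief", "minor", ("not", "violent")]],
    "thief":  [[]],   # fact: tom is a thief
    # "minor" and "violent" are initially absent from the program.
}

def solve(goals):
    if not goals:
        return True
    first, rest = goals[0], goals[1:]
    if isinstance(first, tuple) and first[0] == "not":
        # Negation as failure: not B holds iff solving B fails.
        return (not solve([first[1]])) and solve(rest)
    for body in program.get(first, []):
        if solve(body + rest):
            return True
    return False

print(solve(["punish"]))   # True: rehab fails because minor cannot be shown
program["minor"] = [[]]    # new information: tom is a minor
print(solve(["punish"]))   # False: the punishment conclusion is withdrawn
program["violent"] = [[]]  # new information: tom is violent
print(solve(["punish"]))   # True: the punishment conclusion is reinstated
```

Each added fact changes which negative literals fail, reproducing the withdrawal and reinstatement of conclusions described above.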
Attempts to understand negation in logic programming have also contributed to the development of abstract argumentation frameworks. In an argumentation interpretation of negation, the initial argument that tom should be punished because he is a thief, is attacked by the argument that he should be rehabilitated because he is a minor. But the fact that tom is violent undermines the argument that tom should be rehabilitated and reinstates the argument that tom should be punished.
Metalogic programming
Metaprogramming, in which programs are treated as data, was already a feature of early Prolog implementations. For example, the Edinburgh DEC10 implementation of Prolog included "an interpreter and a compiler, both written in Prolog itself". The simplest metaprogram is the so-called "vanilla" meta-interpreter:
solve(true). solve((B,C)):- solve(B),solve(C). solve(A):- clause(A,B),solve(B).
where true represents an empty conjunction, and (B,C) is a composite term representing the conjunction of B and C. The predicate clause(A,B) means that there is a clause of the form A :- B.
Metaprogramming is an application of the more general use of a metalogic or metalanguage to describe and reason about another language, called the object language.
Metalogic programming allows object-level and metalevel representations to be combined, as in natural language. For example, in the following program, the atomic formula attends(Person, Meeting) occurs both as an object-level formula, and as an argument of the metapredicates prohibited and approved.
prohibited(attends(Person, Meeting)) :- not(approved(attends(Person, Meeting))). should_receive_sanction(Person, scolding) :- attends(Person, Meeting), lofty(Person), prohibited(attends(Person, Meeting)). should_receive_sanction(Person, banishment) :- attends(Person, Meeting), lowly(Person), prohibited(attends(Person, Meeting)). approved(attends(alice, tea_party)). attends(mad_hatter, tea_party). attends(dormouse, tea_party). lofty(mad_hatter). lowly(dormouse). ?- should_receive_sanction(Person, Sanction). Person = mad_hatter, Sanction = scolding. Person = dormouse, Sanction = banishment.
Relationship with the Computational-representational understanding of mind
In his popular Introduction to Cognitive Science, Paul Thagard includes logic and rules as alternative approaches to modelling human thinking. He argues that rules, which have the form IF condition THEN action, are "very similar" to logical conditionals, but they are simpler and have greater psychological plausibility (page 51). Among other differences between logic and rules, he argues that logic uses deduction, but rules use search (page 45) and can be used to reason either forward or backward (page 47). Sentences in logic "have to be interpreted as universally true", but rules can be defaults, which admit exceptions (page 44).
He states that "unlike logic, rule-based systems can also easily represent strategic information about what to do" (page 45). For example, "IF you want to go home for the weekend, and you have bus fare, THEN you can catch a bus". He does not observe that the same strategy of reducing a goal to subgoals can be interpreted, in the manner of logic programming, as applying backward reasoning to a logical conditional:
can_go(you, home) :- have(you, bus_fare), catch(you, bus).
All of these characteristics of rule-based systems - search, forward and backward reasoning, default reasoning, and goal-reduction - are also defining characteristics of logic programming. This suggests that Thagard's conclusion (page 56) that:
Much of human knowledge is naturally described in terms of rules, and many kinds of thinking such as planning can be modeled by rule-based systems.
also applies to logic programming.
Other arguments showing how logic programming can be used to model aspects of human thinking are presented by Keith Stenning and Michiel van Lambalgen in their book, Human Reasoning and Cognitive Science. They show how the non-monotonic character of logic programs can be used to explain human performance on a variety of psychological tasks. They also show (page 237) that "closed–world reasoning in its guise as logic programming has an appealing neural implementation, unlike classical logic."
In The Proper Treatment of Events, Michiel van Lambalgen and Fritz Hamm investigate the use of constraint logic programming to code "temporal notions in natural language by looking at the way human beings construct time".
Knowledge representation
The use of logic to represent procedural knowledge and strategic information was one of the main goals contributing to the early development of logic programming. Moreover, it continues to be an important feature of the Prolog family of logic programming languages today. However, many applications of logic programming, including Prolog applications, increasingly focus on the use of logic to represent purely declarative knowledge. These applications include both the representation of general commonsense knowledge and the representation of domain specific expertise.
Commonsense includes knowledge about cause and effect, as formalised, for example, in the situation calculus, event calculus and action languages. Here is a simplified example, which illustrates the main features of such formalisms. The first clause states that a fact holds immediately after an event initiates (or causes) the fact. The second clause is a frame axiom, which states that a fact that holds at a time continues to hold at the next time unless it is terminated by an event that happens at the time. This formulation allows more than one event to occur at the same time:
holds(Fact, Time2) :- happens(Event, Time1), Time2 is Time1 + 1, initiates(Event, Fact). holds(Fact, Time2) :- happens(Event, Time1), Time2 is Time1 + 1, holds(Fact, Time1), not(terminated(Fact, Time1)). terminated(Fact, Time) :- happens(Event, Time), terminates(Event, Fact).
Here holds is a meta-predicate, similar to solve above. However, whereas solve has only one argument, which applies to general clauses, the first argument of holds is a fact and the second argument is a time (or state). The atomic formula holds(Fact, Time) expresses that the Fact holds at the Time. Such time-varying facts are also called fluents. The atomic formula happens(Event, Time) expresses that the Event happens at the Time.
The following example illustrates how these clauses can be used to reason about causality in a toy blocks world. Here, in the initial state at time 0, a green block is on a table and a red block is stacked on the green block (like a traffic light). At time 0, the red block is moved to the table. At time 1, the green block is moved onto the red block. Moving an object onto a place terminates the fact that the object is on any place, and initiates the fact that the object is on the place to which it is moved:
holds(on(green_block, table), 0).
holds(on(red_block, green_block), 0).
happens(move(red_block, table), 0).
happens(move(green_block, red_block), 1).
initiates(move(Object, Place), on(Object, Place)).
terminates(move(Object, Place2), on(Object, Place1)).

?- holds(Fact, Time).
Fact = on(green_block, table), Time = 0.
Fact = on(red_block, green_block), Time = 0.
Fact = on(green_block, table), Time = 1.
Fact = on(red_block, table), Time = 1.
Fact = on(green_block, red_block), Time = 2.
Fact = on(red_block, table), Time = 2.
Forward reasoning and backward reasoning generate the same answers to the goal holds(Fact, Time). But forward reasoning generates fluents progressively in temporal order, and backward reasoning generates fluents regressively, as in the domain-specific use of regression in the situation calculus.
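The forward (progressive) computation of fluents can also be sketched in a general-purpose language. The following Python sketch is illustrative only: the tuple encodings of events and fluents, and the helper names initiated, terminates and progress, are ad hoc assumptions, not part of the formalism above.

```python
# Forward (progressive) reasoning for the simplified event calculus:
# a fluent holds at T+1 if an event at T initiates it, or if it held
# at T and no event happening at T terminates it (the frame axiom).

def initiated(event):
    # move(Object, Place) initiates on(Object, Place)
    return {("on", event[1], event[2])} if event[0] == "move" else set()

def terminates(event, fluent):
    # move(Object, Place2) terminates on(Object, Place1) for any Place1
    return event[0] == "move" and fluent[0] == "on" and fluent[1] == event[1]

def progress(initial, happens, end):
    # happens: list of (event, time) pairs; returns holds[t] for t = 0..end
    holds = {0: set(initial)}
    for t in range(end):
        events = [e for (e, te) in happens if te == t]
        frame = {f for f in holds[t]
                 if not any(terminates(e, f) for e in events)}
        holds[t + 1] = frame | set().union(*map(initiated, events))
    return holds

initial = {("on", "green_block", "table"), ("on", "red_block", "green_block")}
narrative = [(("move", "red_block", "table"), 0),
             (("move", "green_block", "red_block"), 1)]
result = progress(initial, narrative, 2)
```

Running the sketch on the blocks-world narrative reproduces, in temporal order, the same fluents as the Prolog query above.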
Logic programming has also proved to be useful for representing domain-specific expertise in expert systems. But human expertise, like general-purpose commonsense, is mostly implicit and tacit, and it is often difficult to represent such implicit knowledge in explicit rules. This difficulty does not arise, however, when logic programs are used to represent the existing, explicit rules of a business organisation or legal authority.
For example, here is a representation of a simplified version of the first sentence of the British Nationality Act, which states that a person who is born in the UK becomes a British citizen at the time of birth if a parent of the person is a British citizen at the time of birth:
initiates(birth(Person), citizen(Person, uk)) :-
    time_of(birth(Person), Time),
    place_of(birth(Person), uk),
    parent_child(Another_Person, Person),
    holds(citizen(Another_Person, uk), Time).
Historically, the representation of a large portion of the British Nationality Act as a logic program in the 1980s was "hugely influential for the development of computational representations of legislation, showing how logic programming enables intuitively appealing representations that can be directly deployed to generate automatic inferences".
More recently, the PROLEG system, initiated in 2009 and consisting of approximately 2500 rules and exceptions of civil code and supreme court case rules in Japan, has become possibly the largest legal rule base in the world.
Variants and extensions
Prolog
Main article: Prolog
The SLD resolution rule of inference is neutral about the order in which subgoals in the bodies of clauses can be selected for solution. For the sake of efficiency, Prolog restricts this order to the order in which the subgoals are written. SLD is also neutral about the strategy for searching the space of SLD proofs. Prolog searches this space, top-down, depth-first, trying different clauses for solving the same (sub)goal in the order in which the clauses are written.
This search strategy has the advantage that the current branch of the tree can be represented efficiently by a stack. When a goal clause at the top of the stack is reduced to a new goal clause, the new goal clause is pushed onto the top of the stack. When the selected subgoal in the goal clause at the top of the stack cannot be solved, the search strategy backtracks, removing the goal clause from the top of the stack, and retrying the attempted solution of the selected subgoal in the previous goal clause using the next clause that matches the selected subgoal.
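This stack-based, depth-first behaviour can be sketched for the simple propositional (variable-free) case, where no unification is needed. The following Python sketch is an illustration, not a real Prolog implementation; the interpreter's call stack plays the role of the stack of goal clauses, and returning from a failed call corresponds to backtracking:

```python
# Depth-first SLD search over a propositional logic program.

def solve(program, goals):
    # program maps each atom to the list of clause bodies defining it,
    # in the order the clauses are written
    if not goals:
        return True                       # empty goal clause: success
    first, rest = goals[0], goals[1:]
    for body in program.get(first, []):   # try clauses top-down
        if solve(program, body + rest):   # goal reduction; the call stack
            return True                   # plays the role of Prolog's stack
    return False                          # all clauses failed: backtrack

program = {
    "a": [["b", "c"], ["d"]],  # a :- b, c.    a :- d.
    "b": [[]],                 # b.
    "d": [[]],                 # d.
}                              # no clause defines c
```

Solving the goal a first tries the clause a :- b, c; the subgoal c fails, the search backtracks, and the clause a :- d succeeds.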
Backtracking can be restricted by using a subgoal, called cut, written as !, which always succeeds but cannot be backtracked. Cut can be used to improve efficiency, but can also interfere with the logical meaning of clauses. In many cases, the use of cut can be replaced by negation as failure. In fact, negation as failure can be defined in Prolog, by using cut, together with any literal, say fail, that unifies with the head of no clause:
not(P) :- P, !, fail.
not(P).
Prolog provides other features, in addition to cut, that do not have a logical interpretation. These include the built-in predicates assert and retract for destructively updating the state of the program during program execution.
For example, the toy blocks world example above can be implemented without frame axioms using destructive change of state:
on(green_block, table).
on(red_block, green_block).

move(Object, Place2) :-
    retract(on(Object, Place1)),
    assert(on(Object, Place2)).
The sequence of move events and the resulting locations of the blocks can be computed by executing the query:
?- move(red_block, table), move(green_block, red_block), on(Object, Place).
Object = red_block, Place = table.
Object = green_block, Place = red_block.
Various extensions of logic programming have been developed to provide a logical framework for such destructive change of state.
The broad range of Prolog applications, both in isolation and in combination with other languages, is highlighted in the Year of Prolog Book, celebrating the 50th anniversary of Prolog in 2022.
Prolog has also contributed to the development of other programming languages, including ALF, Fril, Gödel, Mercury, Oz, Ciao, Visual Prolog, XSB, and λProlog.
Constraint logic programming
Main article: Constraint logic programming
Constraint logic programming (CLP) combines Horn clause logic programming with constraint solving. It extends Horn clauses by allowing some predicates, declared as constraint predicates, to occur as literals in the body of a clause. Constraint predicates are not defined by the facts and rules in the program, but are predefined by some domain-specific model-theoretic structure or theory.
Procedurally, subgoals whose predicates are defined by the program are solved by goal-reduction, as in ordinary logic programming, but constraints are simplified and checked for satisfiability by a domain-specific constraint-solver, which implements the semantics of the constraint predicates. An initial problem is solved by reducing it to a satisfiable conjunction of constraints.
The first version of Prolog already included a constraint predicate dif(term1, term2), from Philippe Roussel's 1972 PhD thesis, which succeeds if its two arguments are different terms, but which is delayed if either term contains a variable.
The following constraint logic program represents a toy temporal database of john's history as a teacher:
teaches(john, hardware, T) :- 1990 ≤ T, T < 1999.
teaches(john, software, T) :- 1999 ≤ T, T < 2005.
teaches(john, logic, T) :- 2005 ≤ T, T ≤ 2012.
rank(john, instructor, T) :- 1990 ≤ T, T < 2010.
rank(john, professor, T) :- 2010 ≤ T, T < 2014.
Here ≤ and < are constraint predicates, with their usual intended semantics. The following goal clause queries the database to find out when john both taught logic and was a professor:
?- teaches(john, logic, T), rank(john, professor, T).
The solution 2010 ≤ T, T ≤ 2012 results from simplifying the constraints 2005 ≤ T, T ≤ 2012, 2010 ≤ T, T < 2014.
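For bounds constraints on a single integer variable, this kind of simplification amounts to intersecting intervals. The following Python sketch makes that concrete; the closed-interval encoding (with T < 2014 rendered as T ≤ 2013 over the integers) is an assumption of the sketch, not how a real CLP solver represents constraints:

```python
# Simplifying a conjunction of bounds constraints on an integer variable T:
# each constraint corresponds to a closed interval (lo, hi), and the
# conjunction corresponds to the intersection of the intervals.

def intersect(a, b):
    # a, b: closed integer intervals (lo, hi); None marks unsatisfiability
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return None if lo > hi else (lo, hi)

# teaches(john, logic, T):   2005 <= T, T <= 2012  ->  (2005, 2012)
# rank(john, professor, T):  2010 <= T, T < 2014   ->  (2010, 2013)
answer = intersect((2005, 2012), (2010, 2013))
```

The intersection (2010, 2012) corresponds to the simplified answer 2010 ≤ T, T ≤ 2012; intersecting disjoint intervals, such as john's hardware-teaching years with his professorship, yields None, i.e. an unsatisfiable conjunction.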
Constraint logic programming has been used to solve problems in such fields as civil engineering, mechanical engineering, digital circuit verification, automated timetabling, air traffic control, and finance. It is closely related to abductive logic programming.
Datalog
Main article: Datalog
Datalog is a database definition language, which combines a relational view of data, as in relational databases, with a logical view, as in logic programming.
Relational databases use a relational calculus or relational algebra, with relational operations such as union, intersection, set difference and cartesian product, to specify queries, which access a database. Datalog uses logical connectives, such as or, and, and not, in the bodies of rules to define relations as part of the database itself.
It was recognized early in the development of relational databases that recursive queries cannot be expressed in either relational algebra or relational calculus, and that this deficiency can be remedied by introducing a least-fixed-point operator. In contrast, recursive relations can be defined naturally by rules in logic programs, without the need for any new logical connectives or operators.
Datalog differs from more general logic programming by having only constants and variables as terms. Moreover, all facts are variable-free, and rules are restricted, so that if they are executed bottom-up, then the derived facts are also variable-free.
For example, consider the family database:
mother_child(elizabeth, charles).
father_child(charles, william).
father_child(charles, harry).

parent_child(X, Y) :- mother_child(X, Y).
parent_child(X, Y) :- father_child(X, Y).

ancestor_descendant(X, Y) :- parent_child(X, Y).
ancestor_descendant(X, Y) :- ancestor_descendant(X, Z), ancestor_descendant(Z, Y).
Bottom-up execution derives the following set of additional facts and terminates:
parent_child(elizabeth, charles).
parent_child(charles, william).
parent_child(charles, harry).
ancestor_descendant(elizabeth, charles).
ancestor_descendant(charles, william).
ancestor_descendant(charles, harry).
ancestor_descendant(elizabeth, william).
ancestor_descendant(elizabeth, harry).
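Bottom-up execution can be sketched as a naive fixpoint computation: apply every rule to the facts derived so far, and repeat until nothing new is derived. The following Python sketch hard-codes the family database's rules, with ad hoc tuple encodings of the facts; it is an illustration of the method, not of how Datalog engines are implemented:

```python
# Naive bottom-up (fixpoint) evaluation of the family Datalog program.

facts = {("mother_child", "elizabeth", "charles"),
         ("father_child", "charles", "william"),
         ("father_child", "charles", "harry")}

def step(db):
    new = set(db)
    for (rel, x, y) in db:
        # parent_child(X, Y) :- mother_child(X, Y).  (and father_child)
        if rel in ("mother_child", "father_child"):
            new.add(("parent_child", x, y))
        # ancestor_descendant(X, Y) :- parent_child(X, Y).
        if rel == "parent_child":
            new.add(("ancestor_descendant", x, y))
    # ancestor_descendant(X, Y) :-
    #     ancestor_descendant(X, Z), ancestor_descendant(Z, Y).
    for (r1, x, z) in db:
        if r1 == "ancestor_descendant":
            for (r2, z2, y) in db:
                if r2 == "ancestor_descendant" and z2 == z:
                    new.add(("ancestor_descendant", x, y))
    return new

db = facts
while True:                 # iterate to the least fixed point
    nxt = step(db)
    if nxt == db:
        break
    db = nxt
```

The loop terminates because all derived facts are variable-free and drawn from a finite set of constants, mirroring the guaranteed termination of bottom-up Datalog execution.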
Top-down execution derives the same answers to the query:
?- ancestor_descendant(X, Y).
But then it goes into an infinite loop. However, top-down execution with tabling gives the same answers and terminates without looping.
Answer set programming
Main article: Answer set programming
Like Datalog, answer set programming (ASP) is not Turing-complete. Moreover, instead of separating goals (or queries) from the program to be used in solving the goals, ASP treats the whole program as a goal, and solves the goal by generating a stable model that makes the goal true. For this purpose, it uses the stable model semantics, according to which a logic program can have zero, one or more intended models. For example, the following program represents a degenerate variant of the map colouring problem of colouring two countries red or green:
country(oz).
country(iz).
adjacent(oz, iz).
colour(C, red) :- country(C), not(colour(C, green)).
colour(C, green) :- country(C), not(colour(C, red)).
The problem has four solutions represented by four stable models:
country(oz). country(iz). adjacent(oz, iz). colour(oz, red). colour(iz, red).
country(oz). country(iz). adjacent(oz, iz). colour(oz, green). colour(iz, green).
country(oz). country(iz). adjacent(oz, iz). colour(oz, red). colour(iz, green).
country(oz). country(iz). adjacent(oz, iz). colour(oz, green). colour(iz, red).
To represent the standard version of the map colouring problem, we need to add a constraint that two adjacent countries cannot be coloured the same colour. In ASP, this constraint can be written as a clause of the form:
:- country(C1), country(C2), adjacent(C1, C2), colour(C1, X), colour(C2, X).
With the addition of this constraint, the problem now has only two solutions:
country(oz). country(iz). adjacent(oz, iz). colour(oz, red). colour(iz, green).
country(oz). country(iz). adjacent(oz, iz). colour(oz, green). colour(iz, red).
The addition of constraints of the form :- Body. eliminates models in which Body is true.
Constraints in ASP have a different meaning from constraints in CLP. Constraints in CLP are predicates that qualify answers to queries (and solutions of goals). Constraints in ASP are clauses that eliminate models that would otherwise satisfy goals. In this respect, constraints in ASP are like integrity constraints in databases.
This combination of ordinary logic programming clauses and constraint clauses illustrates the generate-and-test methodology of problem solving in ASP: The ordinary clauses define a search space of possible solutions, and the constraints filter out unwanted solutions.
Most implementations of ASP proceed in two steps: First they instantiate the program in all possible ways, reducing it to a propositional logic program (known as grounding). Then they apply a propositional logic problem solver, such as the DPLL algorithm or a Boolean SAT solver. However, some implementations, such as s(CASP), use a goal-directed, top-down, SLD-resolution-like procedure without grounding.
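The generate-and-test methodology can be sketched directly: enumerate every candidate colouring and discard those violating the constraint. This Python sketch illustrates the methodology on the two-country example, not how ASP solvers actually search:

```python
# Generate-and-test reading of the ASP map-colouring program:
# the choice rules generate every way of colouring each country red or
# green; the constraint eliminates models that colour adjacent countries
# the same.

from itertools import product

countries = ["oz", "iz"]
adjacent = [("oz", "iz")]

def models():
    for colours in product(["red", "green"], repeat=len(countries)):
        assignment = dict(zip(countries, colours))
        # constraint: :- adjacent(C1, C2), colour(C1, X), colour(C2, X).
        if any(assignment[c1] == assignment[c2] for (c1, c2) in adjacent):
            continue
        yield assignment

solutions = list(models())
```

The two surviving assignments correspond to the two stable models shown above.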
Abductive logic programming
Main article: Abductive logic programming
Abductive logic programming (ALP), like CLP, extends normal logic programming by allowing the bodies of clauses to contain literals whose predicates are not defined by clauses. In ALP, these predicates are declared as abducible (or assumable), and are used as in abductive reasoning to explain observations, or more generally to add new facts to the program (as assumptions) to solve goals.
For example, suppose we are given an initial state in which a red block is on a green block on a table at time 0:
holds(on(green_block, table), 0).
holds(on(red_block, green_block), 0).
Suppose we are also given the goal:
?- holds(on(green_block,red_block), 3), holds(on(red_block,table), 3).
The goal can represent an observation, in which case a solution is an explanation of the observation. Or the goal can represent a desired future state of affairs, in which case a solution is a plan for achieving the goal.
We can use the rules for cause and effect presented earlier to solve the goal, by treating the happens predicate as abducible:
holds(Fact, Time2) :-
    happens(Event, Time1),
    Time2 is Time1 + 1,
    initiates(Event, Fact).
holds(Fact, Time2) :-
    happens(Event, Time1),
    Time2 is Time1 + 1,
    holds(Fact, Time1),
    not(terminated(Fact, Time1)).
terminated(Fact, Time) :-
    happens(Event, Time),
    terminates(Event, Fact).
initiates(move(Object, Place), on(Object, Place)).
terminates(move(Object, Place2), on(Object, Place1)).
ALP solves the goal by reasoning backwards and adding assumptions to the program, to solve abducible subgoals. In this case there are many alternative solutions, including:
happens(move(red_block, table), 0). happens(tick, 1). happens(move(green_block, red_block), 2).
happens(tick,0). happens(move(red_block, table), 1). happens(move(green_block, red_block), 2).
happens(move(red_block, table), 0). happens(move(green_block, red_block), 1). happens(tick, 2).
Here tick is an event that marks the passage of time without initiating or terminating any fluents.
There are also solutions in which the two move events happen at the same time. For example:
happens(move(red_block, table), 0). happens(move(green_block, red_block), 0). happens(tick, 1). happens(tick, 2).
Such solutions, if not desired, can be removed by adding an integrity constraint, which is like a constraint clause in ASP:
:- happens(move(Block1, Place), Time), happens(move(Block2, Block1), Time).
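Abduction of this kind can be sketched as generate-and-test over candidate narratives. In the following Python sketch, the restriction to one abduced event per time step, and the simplified apply helper, are assumptions made for brevity; a real ALP system interleaves assumption-making with backward reasoning instead of enumerating:

```python
# Abduction as generate-and-test: candidate sets of happens/2 assumptions
# (one event per time step, for simplicity) are generated, and those under
# which the goal holds at time 3 are kept as explanations or plans.

from itertools import product

EVENTS = [("tick",),
          ("move", "red_block", "table"),
          ("move", "green_block", "red_block")]

def apply(state, event):
    # move(Object, Place) terminates on(Object, _) and initiates
    # on(Object, Place); tick changes nothing
    if event[0] != "move":
        return state
    obj, place = event[1], event[2]
    return {f for f in state if f[1] != obj} | {("on", obj, place)}

initial = {("on", "green_block", "table"), ("on", "red_block", "green_block")}
goal = {("on", "green_block", "red_block"), ("on", "red_block", "table")}

plans = []
for narrative in product(EVENTS, repeat=3):  # one abduced event per time 0, 1, 2
    state = initial
    for event in narrative:
        state = apply(state, event)
    if goal <= state:                        # test: does the goal hold at time 3?
        plans.append(narrative)
```

The surviving narratives include the three alternative solutions listed above, in which the tick event occupies the otherwise unused time step.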
Abductive logic programming has been used for fault diagnosis, planning, natural language processing and machine learning. It has also been used to interpret negation as failure as a form of abductive reasoning.
Inductive logic programming
Main article: Inductive logic programming
Inductive logic programming (ILP) is an approach to machine learning that induces logic programs as hypothetical generalisations of positive and negative examples. Given a logic program representing background knowledge and positive examples together with constraints representing negative examples, an ILP system induces a logic program that generalises the positive examples while excluding the negative examples.
ILP is similar to ALP, in that both can be viewed as generating hypotheses to explain observations, and as employing constraints to exclude undesirable hypotheses. But in ALP the hypotheses are variable-free facts, and in ILP the hypotheses are general rules.
For example, given only background knowledge of the mother_child and father_child relations, and suitable examples of the grandparent_child relation, current ILP systems can generate the definition of grandparent_child, inventing an auxiliary predicate, which can be interpreted as the parent_child relation:
grandparent_child(X, Y) :- auxiliary(X, Z), auxiliary(Z, Y).
auxiliary(X, Y) :- mother_child(X, Y).
auxiliary(X, Y) :- father_child(X, Y).
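The coverage test underlying this kind of induction can be sketched directly: the hypothesis, together with the background knowledge, must entail every positive example and no negative one. The following Python sketch encodes the induced hypothesis over the family database; the particular positive and negative example pairs are illustrative assumptions:

```python
# Coverage check of the kind an ILP system performs: the induced
# hypothesis (with its invented auxiliary predicate) must entail every
# positive example of grandparent_child and no negative example.

mother_child = {("elizabeth", "charles")}
father_child = {("charles", "william"), ("charles", "harry")}

def auxiliary(x, y):
    # invented predicate, equivalent to the parent_child relation
    return (x, y) in mother_child or (x, y) in father_child

def grandparent_child(x, y):
    # grandparent_child(X, Y) :- auxiliary(X, Z), auxiliary(Z, Y).
    people = {p for pair in mother_child | father_child for p in pair}
    return any(auxiliary(x, z) and auxiliary(z, y) for z in people)

positives = [("elizabeth", "william"), ("elizabeth", "harry")]
negatives = [("charles", "william"), ("william", "elizabeth")]
```

An ILP system searches a space of such candidate programs, keeping only those that pass this test.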
Stuart Russell has referred to such invention of new concepts as the most important step needed for reaching human-level AI.
Recent work in ILP, combining logic programming, learning and probability, has given rise to the fields of statistical relational learning and probabilistic inductive logic programming.
Concurrent logic programming
Main article: Concurrent logic programming
Concurrent logic programming integrates concepts of logic programming with concurrent programming. Its development was given a big impetus in the 1980s by its choice as the systems programming language of the Japanese Fifth Generation Computer Systems project (FGCS).
A concurrent logic program is a set of guarded Horn clauses of the form:
H :- G1, ..., Gn | B1, ..., Bn.
The conjunction G1, ..., Gn is called the guard of the clause, and | is the commitment operator. Declaratively, guarded Horn clauses are read as ordinary logical implications:
H if G1 and ... and Gn and B1 and ... and Bn.
However, procedurally, when there are several clauses whose heads H match a given goal, then all of the clauses are executed in parallel, checking whether their guards G1, ..., Gn hold. If the guards of more than one clause hold, then a committed choice is made to one of the clauses, and execution proceeds with the subgoals B1, ..., Bn of the chosen clause. These subgoals can also be executed in parallel. Thus concurrent logic programming implements a form of "don't care nondeterminism", rather than "don't know nondeterminism".
For example, the following concurrent logic program defines a predicate shuffle(Left, Right, Merge), which can be used to shuffle two lists Left and Right, combining them into a single list Merge that preserves the ordering of the two lists Left and Right:
shuffle([], [], []).
shuffle(Left, Right, Merge) :-
    Left = [First | Rest] |
    Merge = [First | ShortMerge],
    shuffle(Rest, Right, ShortMerge).
shuffle(Left, Right, Merge) :-
    Right = [First | Rest] |
    Merge = [First | ShortMerge],
    shuffle(Left, Rest, ShortMerge).
Here, [] represents the empty list, and [Head | Tail] represents a list with first element Head followed by the list Tail, as in Prolog. (Notice that the first occurrence of | in the second and third clauses is the list constructor, whereas the second occurrence of | is the commitment operator.) The program can be used, for example, to shuffle the lists [a, b, c] and [1, 2, 3] by invoking the goal clause:
shuffle([a, b, c], [1, 2, 3], Merge).
The program will non-deterministically generate a single solution, for example Merge = [a, 1, b, 2, c, 3].
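The committed-choice behaviour can be sketched with two sample lists: whenever both guards hold (both input lists are non-empty), one applicable clause is chosen, here at random, and never reconsidered. This Python sketch illustrates don't-care nondeterminism only; it is sequential, not a concurrent implementation:

```python
# Committed-choice merge of two lists, preserving the order of each:
# any run yields one valid interleaving, with no backtracking.

import random

def shuffle(left, right):
    if not left and not right:
        return []                      # shuffle([], [], []).
    choices = []
    if left:                           # guard of the second clause holds
        choices.append("left")
    if right:                          # guard of the third clause holds
        choices.append("right")
    pick = random.choice(choices)      # committed choice: never reconsidered
    if pick == "left":
        return [left[0]] + shuffle(left[1:], right)
    return [right[0]] + shuffle(left, right[1:])

merge = shuffle(["a", "b", "c"], [1, 2, 3])
```

Whatever choices are made, the result is always a single interleaving in which the elements of each input list appear in their original order.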
Carl Hewitt has argued that, because of the indeterminacy of concurrent computation, concurrent logic programming cannot implement general concurrency. However, according to the logical semantics, any result of a computation of a concurrent logic program is a logical consequence of the program, even though not all logical consequences can be derived.
Concurrent constraint logic programming
Main article: Concurrent constraint logic programming
Concurrent constraint logic programming combines concurrent logic programming and constraint logic programming, using constraints to control concurrency. A clause can contain a guard, which is a set of constraints that may block the applicability of the clause. When the guards of several clauses are satisfied, concurrent constraint logic programming makes a committed choice to use only one.
Higher-order logic programming
Several researchers have extended logic programming with higher-order programming features derived from higher-order logic, such as predicate variables. Such languages include the Prolog extensions HiLog and λProlog.
Linear logic programming
Basing logic programming within linear logic has resulted in the design of logic programming languages that are considerably more expressive than those based on classical logic. Horn clause programs can only represent state change by the change in arguments to predicates. In linear logic programming, one can use the ambient linear logic to support state change. Some early designs of logic programming languages based on linear logic include LO, Lolli, ACL, and Forum. Forum provides a goal-directed interpretation of all linear logic.
Object-oriented logic programming
F-logic extends logic programming with objects and the frame syntax.
Logtalk extends the Prolog programming language with support for objects, protocols, and other OOP concepts. It supports most standard-compliant Prolog systems as backend compilers.
Transaction logic programming
Transaction logic is an extension of logic programming with a logical theory of state-modifying updates. It has both a model-theoretic semantics and a procedural one. An implementation of a subset of Transaction logic is available in the Flora-2 system. Other prototypes are also available.
See also
- Automated theorem proving
- Boolean satisfiability problem
- Constraint logic programming
- Control theory
- Datalog
- Fril
- Functional programming
- Fuzzy logic
- Inductive logic programming
- Linear logic
- Logic in computer science (includes Formal methods)
- Logic programming languages
- Programmable logic controller
- R++
- Reasoning system
- Rule-based machine learning
- Satisfiability
- Syntax and semantics of logic programming
Citations
- Tärnlund, S.Å. (1977). "Horn clause computability". BIT Numerical Mathematics. 17 (2): 215–226. doi:10.1007/BF01932293. S2CID 32577496.
- Andréka, H.; Németi, I. (1978). "The generalised completeness of Horn predicate-logic as a programming language". Acta Cybernetica. 4 (1): 3–10.
- Green, Cordell. Application of Theorem Proving to Problem Solving (PDF). IJCAI 1969.
- Foster, J.M.; Elcock, E.W. (1969). ABSYS 1: An Incremental Compiler for Assertions: an Introduction. Fourth Annual Machine Intelligence Workshop. Machine Intelligence. Vol. 4. Edinburgh, UK: Edinburgh University Press. pp. 423–429.
- Kowalski, R. A. (1988). "The early years of logic programming" (PDF). Communications of the ACM. 31: 38–43. doi:10.1145/35043.35046. S2CID 12259230.
- Hewitt, Carl. Planner: A Language for Proving Theorems in Robots (PDF). IJCAI 1969.
- Winograd, Terry (1972). "Understanding natural language". Cognitive Psychology. 3 (1): 1–191. doi:10.1016/0010-0285(72)90002-3.
- Jeff Rulifson; Jan Derksen; Richard Waldinger (November 1973). QA4, A Procedural Calculus for Intuitive Reasoning (PDF) (Technical report). SRI AI Center Technical Note 73.
- Davies, J.M., 1971. POPLER: a POP-2 planner. Edinburgh University, Department of Machine Intelligence and Perception.
- McDermott, D.V.; Sussman, G.J. (May 1972). The Conniver reference manual (Technical report). Artificial Intelligence Memo No. 259.
- Reboh, R.; Sacerdoti, E.D. (August 1973). A preliminary QLISP manual (Technical report). Artificial Intelligence Center, SRI International.
- Kornfeld, W.A.; Hewitt, C.E. (1981). "The scientific community metaphor". IEEE Transactions on Systems, Man, and Cybernetics. 11 (1): 24–33. doi:10.1109/TSMC.1981.4308575. hdl:1721.1/5693. S2CID 1322857.
- Hayes, Pat (1973). "Computation and Deduction". Proceedings of the 2nd MFCS Symposium. Czechoslovak Academy of Sciences. pp. 105–118.
- Robinson, J. (1965). "Automatic deduction with hyper-resolution". International Journal of Computer Mathematics. 1 (3): 227–234. doi:10.2307/2272384. JSTOR 2272384.
- Kowalski, Robert; Kuehner, Donald (1971). "Linear Resolution with Selection Function" (PDF). Artificial Intelligence. 2 (3–4): 227–260. doi:10.1016/0004-3702(71)90012-9.
- Kowalski, Robert (1973). "Predicate Logic as a Programming Language" (PDF). Department of Artificial Intelligence, Edinburgh University. Memo 70. Also in Proceedings IFIP Congress, Stockholm, North Holland Publishing Co., 1974, pp. 569–574.
- Warren, D.H.; Pereira, L.M.; Pereira, F. (1977). "Prolog-the language and its implementation compared with Lisp". ACM SIGPLAN Notices. 12 (8): 109–115. doi:10.1145/872734.806939.
- Ueda, K., 2018. Logic/constraint programming and concurrency: The hard-won lessons of the fifth generation computer project. Science of Computer Programming, 164, pp.3-17.
- H.P. Newquist, 2020. The Brain Makers: The History Of Artificial Intelligence. The Relayer Group.
- Gallaire, Hervé; Minker, John 'Jack', eds. (1978), "Logic and Data Bases, Symposium on Logic and Data Bases, Centre d'études et de recherches de Toulouse, 1977", Advances in Data Base Theory, New York: Plenum Press, ISBN 978-0-306-40060-5.
- ^ Warren, D.S. (2023). "Introduction to Prolog". In Warren, D.S.; Dahl, V.; Eiter, T.; Hermenegildo, M.V.; Kowalski, R.; Rossi, F. (eds.). Prolog: The Next 50 Years. Lecture Notes in Computer Science. Vol. 13900. Springer, Cham. pp. 3–19. doi:10.1007/978-3-031-35254-6_1. ISBN 978-3-031-35253-9.
- Robinson, J. Alan (2001). "Invited Editorial". Theory and Practice of Logic Programming. 1 (1). Cambridge University Press: 1. doi:10.1017/s1471068400000028.
- R.A. Kowalski (July 1979). "Algorithm = Logic + Control". Communications of the ACM. 22 (7): 424–436. doi:10.1145/359131.359136. S2CID 2509896.
- Bruynooghe, M.; Pereira, L.M. (1984). "Deduction revision by intelligent backtracking". Implementations of Prolog. Chichester, England: Ellis Horwood. pp. 194–215.
- Nakamura, K. (July 1985). Heuristic Prolog: logic program execution by heuristic search. Conference on Logic Programming. Berlin, Heidelberg: Springer Berlin Heidelberg. pp. 148–155.
- Genesereth, M.R.; Ginsberg, M.L. (1985). "Logic programming". Communications of the ACM. 28 (9): 933–941. doi:10.1145/4284.4287. S2CID 15527861.
- Swift, T.; Warren, D.S. (January 2012). "XSB: Extending Prolog with tabled logic programming". Theory and Practice of Logic Programming. 12 (1–2): 157–187. arXiv:1012.5123. doi:10.1017/S1471068411000500. S2CID 6153112.
- ^ Daniel Friedman; William Byrd; Oleg Kiselyov; Jason Hemann (2018). The Reasoned Schemer, Second Edition. The MIT Press.
- A. Casas, D. Cabeza, M. V. Hermenegildo. A Syntactic Approach to Combining Functional Notation, Lazy Evaluation and Higher-Order in LP Systems. The 8th International Symposium on Functional and Logic Programming (FLOPS'06), pages 142-162, April 2006.
- Kersting, K., Mladenov, M. and Tokmakov, P., 2017. Relational linear programming. Artificial Intelligence, 244, pp.188-216.
- Beyer, D., 2006, May. Relational programming with CrocoPat. In Proceedings of the 28th International Conference on Software engineering (pp. 807-810).
- MacLennan, B.J., 1983. Overview of relational programming. ACM SIGPLAN Notices, 18(3), pp.36-45.
- Behnke, R., Berghammer, R., Meyer, E. and Schneider, P., 1998. RELVIEW—A system for calculating with relations and relational programming. In Fundamental Approaches to Software Engineering: First International Conference, FASE'98 Held as Part of the Joint European Conferences on Theory and Practice of Software, ETAPS'98 Lisbon, Portugal, March 28–April 4, 1998 Proceedings 1 (pp. 318-321). Springer Berlin Heidelberg.
- Van Emden, M.H.; Kowalski, R.A. (October 1976). "The semantics of predicate logic as a programming language". Journal of the ACM. 23 (4): 733–742. doi:10.1145/321978.321991. S2CID 11048276.
- Clark, K.L. (1977). "Negation as Failure". Logic and Data Bases. Boston, MA: Springer US. pp. 293–322. doi:10.1007/978-1-4684-3384-5_11. ISBN 978-1-4684-3386-9.
- Gelfond, M.; Przymusinska, H.; Przymusinski, T. (1989). "On the relationship between circumscription and negation as failure". Artificial Intelligence. 38 (1): 75–94. doi:10.1016/0004-3702(89)90068-4.
- Shepherdson, J.C. (1984). "Negation as failure: a comparison of Clark's completed data base and Reiter's closed world assumption". The Journal of Logic Programming. 1 (1): 51–79. doi:10.1016/0743-1066(84)90023-2.
- Denecker, M.; Ternovska, E. (2008). "A logic of nonmonotone inductive definitions". ACM Transactions on Computational Logic. 9 (2): 14:1–14:52. arXiv:cs/0501025. doi:10.1145/1342991.1342998. S2CID 13156469.
- Rao, P.; Sagonas, K.; Swift, T.; Warren, D.S.; Freire, J. (July 28–31, 1997). XSB: A system for efficiently computing well-founded semantics. Logic Programming And Nonmonotonic Reasoning: 4th International Conference, LPNMR'97. Dagstuhl Castle, Germany: Springer Berlin Heidelberg. pp. 430–440. doi:10.1007/3-540-63255-7_33.
- W. Chen; D. S. Warren (January 1996). "Tabled Evaluation with Delaying for General Logic Programs". Journal of the ACM. 43 (1): 20–74. doi:10.1145/227595.227597. S2CID 7041379.
- Phan Minh Dung (1995). "On the acceptability of arguments and its fundamental role in nonmonotonic reasoning, logic programming, and n–person games". Artificial Intelligence. 77 (2): 321–357. doi:10.1016/0004-3702(94)00041-X.
- Colmerauer, A. and Roussel, P., 1996. The birth of Prolog. In History of programming languages---II (pp. 331-367).
- ^ Warren, D.H., Pereira, L.M. and Pereira, F., 1977. Prolog-the language and its implementation compared with Lisp. ACM SIGPLAN Notices, 12(8), pp.109-115.
- Thagard, Paul (2005). Mind: Introduction to Cognitive Science. The MIT Press. p. 11. ISBN 9780262701099. https://www.google.co.uk/books/edition/Mind_second_edition/gjcR1U2HT7kC?hl=en&gbpv=1&pg=PP11&printsec=frontcover
- Stenning, Keith; van Lambalgen, Michiel (2008). Human reasoning and cognitive science. MIT Press. ISBN 978-0-262-19583-6. https://philpapers.org/archive/STEHRA-5.pdf
- Van Lambalgen, M. and Hamm, F., 2008. The proper treatment of events. John Wiley & Sons. https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=3126320bb6e37ca3727fed404828b53fc56ff063
- Reiter, R., 1991. The frame problem in the situation calculus: A simple solution (sometimes) and a completeness result for goal regression. Artificial and Mathematical Theory of Computation, 3.
- Merritt, D., 2012. Building expert systems in Prolog. Springer Science & Business Media. https://ds.amu.edu.et/xmlui/bitstream/handle/123456789/4434/%28Text%20Book%29%20Building%20Expert%20Systems%20in%20Prolog.pdf?sequence=1&isAllowed=y
- Sergot, M.J.; Sadri, F.; Kowalski, R.A.; Kriwaczek, F.; Hammond, P; Cory, H.T. (1986). "The British Nationality Act as a logic program" (PDF). Communications of the ACM. 29 (5): 370–386. doi:10.1145/5689.5920. S2CID 5665107.
- Prakken, H.; Sartor, G. (October 2015). "Law and logic: a review from an argumentation perspective" (PDF). Artificial Intelligence. 227: 214–245. doi:10.1016/j.artint.2015.06.005. S2CID 4261497.
- Satoh, K., 2023. PROLEG: Practical legal reasoning system. In Prolog: The Next 50 Years (pp. 277-283). Cham: Springer Nature Switzerland.
- ^ Körner, Philipp; Leuschel, Michael; Barbosa, João; Costa, Vítor Santos; Dahl, Verónica; Hermenegildo, Manuel V.; Morales, Jose F.; Wielemaker, Jan; Diaz, Daniel; Abreu, Salvador; Ciatto, Giovanni (November 2022). "Fifty Years of Prolog and Beyond". Theory and Practice of Logic Programming. 22 (6): 776–858. arXiv:2201.10816. doi:10.1017/S1471068422000102. ISSN 1471-0684.
- ^ Bonner, A.J. and Kifer, M., 1993, February. Transaction Logic Programming. In ICLP (Vol. 93, pp. 257-279).
- Genesereth, M., 2023. Dynamic logic programming. In Prolog: The Next 50 Years (pp. 197-209). Cham: Springer Nature Switzerland.
- Kowalski, R., Sadri, F., Calejo, M. and Dávila, J., 2023. Combining logic programming and imperative programming in LPS. In Prolog: The Next 50 Years (pp. 210-223). Cham: Springer Nature Switzerland.
- Aho, A.V. and Ullman, J.D., 1979, January. Universality of data retrieval languages. In Proceedings of the 6th ACM SIGACT-SIGPLAN symposium on Principles of programming languages (pp. 110-119).
- Maier, D., Tekle, K.T., Kifer, M. and Warren, D.S., 2018. Datalog: concepts, history, and outlook. In Declarative Logic Programming: Theory, Systems, and Applications (pp. 3-100).
- Eiter, T., Ianni, G. and Krennwallner, T., 2009. Answer Set Programming: A Primer. In Reasoning Web. Semantic Technologies for Information Systems: 5th International Summer School 2009, Brixen-Bressanone, Italy, August 30-September 4, 2009, Tutorial Lectures (pp. 40-110).
- Arias, J.; Carro, M.; Salazar, E.; Marple, K.; Gupta, G. (2018). "Constraint Answer Set Programming without Grounding". Theory and Practice of Logic Programming. 18 (3–4): 337–354. arXiv:1804.11162. doi:10.1017/S1471068418000285. S2CID 13754645.
- Denecker, M.; Kakas, A.C. (July 2000). "Special issue: abductive logic programming". Journal of Logic Programming. 44 (1–3): 1–4. doi:10.1016/S0743-1066(99)00078-3.
- Eshghi, K., 1988, August. Abductive Planning with Event Calculus. In ICLP/SLP (pp. 562-579).
- Eshghi, K. and Kowalski, R.A., 1989, June. Abduction Compared with Negation by Failure. In ICLP (Vol. 89, pp. 234-255).
- Nienhuys-Cheng, Shan-hwei; Wolf, Ronald de (1997). Foundations of inductive logic programming. Lecture Notes in Computer Science / Lecture Notes in Artificial Intelligence. Berlin, Heidelberg: Springer. p. 173. ISBN 978-3-540-62927-6.
- Flach, P.A. and Kakas, A.C., 2000. On the relation between abduction and inductive learning. In Abductive Reasoning and Learning (pp. 1-33). Dordrecht: Springer Netherlands.
- Cropper, A. and Dumančić, S., 2022. Inductive logic programming at 30: a new introduction. Journal of Artificial Intelligence Research, 74, pp.765-850.
- Russell, S., 2019. Human compatible: Artificial intelligence and the problem of control. Penguin.
- Shunichi Uchida and Kazuhiro Fuchi. Proceedings of the FGCS Project Evaluation Workshop. Institute for New Generation Computer Technology (ICOT). 1992.
- Hewitt, Carl (27 April 2016). "Inconsistency Robustness for Logic Programs". Hal Archives. pp. 21–26. Retrieved 7 November 2016.
- Saraswat, V.A. and Rinard, M., 1989, December. Concurrent constraint programming. In Proceedings of the 17th ACM SIGPLAN-SIGACT symposium on Principles of programming languages (pp. 232-245).
- Chen, Weidong; Kifer, Michael; Warren, David S. (February 1993). "HiLog: A foundation for higher-order logic programming". Journal of Logic Programming. 15 (3): 187–230. doi:10.1016/0743-1066(93)90039-J.
- Miller, D.A. and Nadathur, G., 1986, July. Higher-order logic programming. In International Conference on Logic Programming (pp. 448-462). Berlin, Heidelberg: Springer Berlin Heidelberg.
- Andreoli, Jean-Marc (1 June 1992). "Logic Programming with Focusing Proofs in Linear Logic". Journal of Logic and Computation. 2 (3): 297–347. doi:10.1093/logcom/2.3.297.
- Hodas, Joshua; Miller, Dale (1994). "Logic Programming in a Fragment of Intuitionistic Linear Logic". Information and Computation. 110 (2): 327–365. doi:10.1006/inco.1994.1036.
- Kobayashi, Naoki; Yonezawa, Akinori (1994). Asynchronous communication model based on linear logic. US/Japan Workshop on Parallel Symbolic Computing. pp. 279–294. CiteSeerX 10.1.1.42.8749.
- Miller, Dale (30 September 1996). "Forum: A Multiple-Conclusion Specification Logic". Theoretical Computer Science. 165 (1): 201–232. doi:10.1016/0304-3975(96)00045-X.
- Kifer, M. and Lausen, G., 1989, June. F-logic: a higher-order language for reasoning about objects, inheritance, and scheme. In Proceedings of the 1989 ACM SIGMOD international conference on Management of data (pp. 134-146).
- de Moura, P.J.L., 2003. Design of an Object-Oriented Logic Programming Language (Doctoral dissertation, Universidade da Beira Interior).
- Yang, G. and Kifer, M., 2000, July. FLORA: Implementing an efficient DOOD system using a tabling logic engine. In International Conference on Computational Logic (pp. 1078-1093). Berlin, Heidelberg: Springer Berlin Heidelberg.
Sources
General introductions
- Baral, C.; Gelfond, M. (1994). "Logic programming and knowledge representation" (PDF). The Journal of Logic Programming. 19–20: 73–148. doi:10.1016/0743-1066(94)90025-6.
- Kowalski, R. A. (1988). "The early years of logic programming" (PDF). Communications of the ACM. 31: 38–43. doi:10.1145/35043.35046. S2CID 12259230.
- Lloyd, J. W. (1987). Foundations of Logic Programming (2nd ed.). Springer-Verlag.
Other sources
- John McCarthy. "Programs with common sense". Symposium on Mechanization of Thought Processes. National Physical Laboratory. Teddington, England. 1958.
- Miller, Dale; Nadathur, Gopalan; Pfenning, Frank; Scedrov, Andre (1991). "Uniform proofs as a foundation for logic programming". Annals of Pure and Applied Logic. 51 (1–2): 125–157. doi:10.1016/0168-0072(91)90068-W.
- Ehud Shapiro (Editor). Concurrent Prolog. MIT Press. 1987.
- James Slagle. "Experiments with a Deductive Question-Answering Program". CACM. December 1965.
- Gabbay, Dov M.; Hogger, Christopher John; Robinson, J.A., eds. (1993–1998). Handbook of Logic in Artificial Intelligence and Logic Programming. Vols. 1–5, Oxford University Press.
Further reading
- Carl Hewitt. "Procedural Embedding of Knowledge in Planner". IJCAI 1971.
- Carl Hewitt. "The Repeated Demise of Logic Programming and Why It Will Be Reincarnated". AAAI Spring Symposium: What Went Wrong and Why: Lessons from AI Research and Applications 2006: 2–9.
- Evgeny Dantsin, Thomas Eiter, Georg Gottlob, Andrei Voronkov: Complexity and expressive power of logic programming. ACM Comput. Surv. 33(3): 374–425 (2001)
- Ulf Nilsson and Jan Maluszynski, Logic, Programming and Prolog
External links
- Logic Programming Virtual Library entry
- Bibliographies on Logic Programming Archived 2008-12-04 at the Wayback Machine
- Association for Logic Programming (ALP)
- Theory and Practice of Logic Programming (journal)
- Logic programming in C++ with Castor
- Logic programming Archived 2011-09-03 at the Wayback Machine in Oz
- Prolog Development Center
- Racklog: Logic Programming in Racket