LTN: Logic Tensor Networks

Paulo Shakarian, Chitta Baral, Gerardo I. Simari, Bowen Xi, Lahari Pokala

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

In this chapter, we provide an overview of Logic Tensor Networks (LTNs, for short), a formalism that makes use of tensor embeddings (n-dimensional vector representations) of elements tied to a logical syntax, and which has gained traction in the neuro-symbolic reasoning (NSR) literature in the past few years. After briefly recalling Real Logic, the underlying language of LTNs, we discuss the representation of different kinds of knowledge in the formalism and the three main tasks that can be addressed with it (learning, reasoning, and query answering), and finally describe several use cases that have shown the usefulness of LTNs in many tasks that are central to the construction of intelligent systems.
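To make the grounding idea concrete, the following is a minimal sketch (not the chapter's own code, and not the API of any official LTN library) of how Real Logic grounds a predicate as a differentiable function from tensor embeddings to truth degrees in [0, 1], with fuzzy connectives combining those degrees so that satisfaction of a formula can be maximized by gradient descent. All names (SmoothPredicate, and_prod, implies_reichenbach, forall_mean, the toy embeddings) are illustrative assumptions.

```python
import torch
import torch.nn as nn

# A predicate is grounded as a neural network mapping an n-dimensional
# embedding to a truth degree in [0, 1] (sketch; names are illustrative).
class SmoothPredicate(nn.Module):
    def __init__(self, embed_dim: int, hidden: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(embed_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)  # one truth degree per input embedding

# Fuzzy connectives over truth degrees (product t-norm / Reichenbach implication).
def and_prod(a, b):             # A AND B
    return a * b

def implies_reichenbach(a, b):  # A IMPLIES B
    return 1.0 - a + a * b

def forall_mean(truths):        # universal quantifier as a smooth aggregator
    return truths.mean()

# Toy use: encourage "forall x: Cat(x) -> Animal(x)" over random embeddings.
embed_dim = 8
cat, animal = SmoothPredicate(embed_dim), SmoothPredicate(embed_dim)
x = torch.randn(32, embed_dim)                       # embeddings of 32 individuals
sat = forall_mean(implies_reichenbach(cat(x), animal(x)))
loss = 1.0 - sat                                     # maximize satisfaction
loss.backward()                                      # gradients flow into both predicates
```

Under this reading, learning amounts to adjusting the predicate (and embedding) parameters to increase the satisfaction of a knowledge base of such formulas, while query answering evaluates the truth degree of a formula under the learned grounding.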

Original language: English (US)
Title of host publication: SpringerBriefs in Computer Science
Publisher: Springer
Pages: 33-41
Number of pages: 9
DOIs
State: Published - 2023

Publication series

Name: SpringerBriefs in Computer Science
Volume: Part F1425
ISSN (Print): 2191-5768
ISSN (Electronic): 2191-5776

ASJC Scopus subject areas

  • General Computer Science
