
A groundbreaking theoretical linguistic framework
The meaning-text theory (MTT) is the linguistic framework behind our approach to computing semantics with lexical functions. We’ve implemented software based on the way MTT models natural language, from the lexicon up to semantics, and this has led our linguistic team to create detailed, specific descriptions of the lexical units of a number of different languages.
The two men behind the MTT model
First put forward in Moscow in the 1960s by Aleksandr Žolkovskij and Igor Mel’čuk, MTT is a theoretical linguistic framework for constructing models of natural language. It provides a large and elaborate basis for linguistic description and, thanks to its formal character, lends itself particularly well to computer AI applications.

The power of lexical functions
One important discovery of meaning–text linguistics was the recognition that the elements in the lexicon of a language (its lexical units) can be related to one another in an abstract semantic sense. These relations are represented in MTT as lexical functions (LFs). The description of the lexicon is therefore a crucial aspect of our software.
Lexical functions are a tool designed to formally represent the relations between lexical units. They allow us to formalize and describe — in a relatively simple manner — the complex network of lexical relationships within a language and to assign a corresponding semantic weight to each element in a sentence. Most importantly, they allow us to relate analogous meanings, regardless of the form in which they are expressed.
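To make the idea of a formal representation concrete, here is a minimal sketch in Python. The relation names follow the MTT literature (Syn for synonymy, Anti for antonymy), but the table layout, the apply_lf helper, and the entries are our own illustrative assumptions — not MTT’s actual lexicon format, and not our production code.

```python
# Minimal, illustrative sketch: a lexical function pictured as a named
# relation over lexical units. Syn and Anti are standard MTT function
# names; the table format and entries here are our own assumptions.
LEXICON = {
    ("Syn", "buy"): {"purchase"},  # synonymy: buying and purchasing coincide
    ("Anti", "big"): {"small"},    # antonymy
}

def apply_lf(lf: str, unit: str) -> set[str]:
    """Return the value(s) of lexical function `lf` for lexical unit `unit`."""
    return LEXICON.get((lf, unit), set())

print(apply_lf("Syn", "buy"))  # {'purchase'}
```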

The meaning in MTT
Natural languages are more restrictive than they may seem at first glance. Sooner or later, we run into fixed expressions. Although these show varying degrees of rigidity, they are ultimately conventionalized and must be recorded in the lexical description, for example:
✓ Obtain a result
✓ Do a favor
✓ Ask / pose a question
✓ Raise a building
All of these examples show that it’s the lexicon that imposes selection restrictions: we would hardly find “do a question” or “raise a favor” in a text. The most important point when analyzing these phrases is that, from a meaning point of view, the elements don’t carry the same semantic value. As the examples above illustrate, the first element provides little information; all of the meaning, or semantic weight, comes from the second element. Crucially, the semantic relationship between the first and second element is exactly the same in every example: roughly, what we’re saying is “make X” (a result, a favor, a question, a building). This type of relation is represented by the “Oper” lexical function.
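As a minimal sketch, the Oper relation can be pictured as a lookup from the meaning-bearing noun to the light verb the lexicon requires for it. The dictionary below simply restates the four examples above; in the MTT literature this function is usually written Oper1, and real lexicon entries carry far more detail.

```python
# Illustrative sketch only: the Oper relation as a lookup from the
# meaning-bearing noun to the light verb the lexicon demands for it.
OPER1 = {
    "result": "obtain",
    "favor": "do",
    "question": "ask",  # "pose" is an alternative value
    "building": "raise",
}

# The relation itself is constant (roughly "make X"); only the verb
# changes, which is why "do a question" or "raise a favor" never occur.
for noun, verb in OPER1.items():
    print(f"Oper1({noun}) = {verb}  ->  '{verb} a {noun}'")
```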

Complexity is easy for our linguists
MTT defines around 60 different types of lexical functions. This allows, among other things, the description of relations such as synonymy (buying and purchasing are identical actions), hypernymy and hyponymy (a dog is a type of animal), and other relations among lexical units at the sentence level. These include the Oper function mentioned above, as well as functions expressing the concept “a lot”: if you smoke a lot you are a heavy smoker, but if you sleep a lot, you are not a “heavy sleeper”. All we can say is that you sleep like a log.
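In the MTT literature this “a lot” relation is written Magn. Continuing the hedged sketch above, the point is that the value of Magn depends on the keyword, not on the meaning alone; the two entries below just encode the smoker/sleeper contrast from the paragraph above.

```python
# Sketch of the intensifier function, written Magn in the MTT
# literature: its value depends on the keyword, not just on the
# meaning "a lot". Both entries restate the examples in the text.
MAGN = {
    "smoker": "heavy",      # a heavy smoker
    "sleep": "like a log",  # you sleep like a log ("heavy sleeper" means something else)
}

for keyword, intensifier in MAGN.items():
    print(f"Magn({keyword}) = {intensifier!r}")
```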
Our linguists apply the principles of meaning-text theory when describing each supported language. User questions may be completely different on the surface, but when their underlying meaning is the same, our semantics-based search understands them correctly as the same question. The upshot is that users get fast, accurate results from their queries.

Another example of MTT in action
Let’s take these user questions:
“Purchasing a ticket for a child”.
“I want to buy a ticket for my son”.
Even though the words are different, the meaning conveyed is the same in both cases, so both will get the same answer from a virtual assistant. At Inbenta, our semantic search engine is built on a rich and complex network of lexical relations, so it understands what users mean by their queries, regardless of the exact words they use to pose their questions.
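The toy sketch below, which is not Inbenta’s actual engine, shows the flavor of this: if differently worded queries are mapped onto shared lexical units (here with a hypothetical synonym table and stopword list), they reduce to the same canonical representation and can therefore receive the same answer.

```python
# Toy illustration only: map surface words to canonical lexical units
# so that differently worded queries compare equal. The synonym table
# and stopword list are hypothetical stand-ins for a real lexicon.
SYNONYMS = {"purchasing": "buy", "purchase": "buy", "son": "child"}
STOPWORDS = {"a", "for", "i", "my", "to", "want"}

def normalize(query: str) -> frozenset[str]:
    """Reduce a query to the set of canonical content words it contains."""
    words = query.lower().split()
    return frozenset(SYNONYMS.get(w, w) for w in words if w not in STOPWORDS)

q1 = normalize("Purchasing a ticket for a child")
q2 = normalize("I want to buy a ticket for my son")
print(q1 == q2)  # True: both reduce to {'buy', 'ticket', 'child'}
```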

Integrating Inbenta with your business
Request a demo to see live examples and results produced by Inbenta’s patented NLP.