
Lexicon-grammar


Lexicon-Grammar is a method and a practice for the formalized description of human languages. It has been developed by Maurice Gross since the late 1960s.

Its theoretical basis is Zellig S. Harris's[1][2] distributionalism, and notably the notion of transformation. The notational conventions are meant to be as clear and comprehensible as possible.

The method of Lexicon-Grammar is inspired by the experimental sciences. It focuses on the collection of facts, and hence on the real use of language, from both a quantitative and a qualitative point of view.

Lexicon-grammar also requires formalization. The results of the description must be sufficiently formal to allow applications to parsing, in particular through the construction of syntactic analyzers. In the formal model, the results of the description take the form of double-entry tables, also called matrices: Lexicon-grammar tables code lexical entries together with their syntactico-semantic properties, and thus formalize syntactico-semantic information.
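
As an illustration, such a double-entry table can be thought of as a matrix whose rows are lexical entries and whose columns are syntactico-semantic properties, each cell holding "+" (accepted) or "-" (rejected). The following minimal sketch in Python uses invented entries, simplified property labels and codes chosen for the example only, not an excerpt from any published Lexicon-grammar table:

    # Hypothetical sketch of a lexicon-grammar table encoded as a matrix.
    # Rows are lexical entries; columns are syntactico-semantic properties;
    # each cell is "+" (construction accepted) or "-" (rejected).
    # The verbs, property labels and codes below are simplified illustrations.

    PROPERTIES = ["N0 =: Nhum", "N0 V N1", "N0 V Qu P", "Passive"]

    TABLE = {
        # entry        N0 =: Nhum  N0 V N1  N0 V Qu P  Passive
        "admirer":    ["+",        "+",     "-",       "+"],
        "penser":     ["+",        "-",     "+",       "-"],
        "concerner":  ["-",        "+",     "-",       "+"],
    }

    def accepts(entry: str, prop: str) -> bool:
        """Return True if the table codes the property as '+' for the entry."""
        return TABLE[entry][PROPERTIES.index(prop)] == "+"

    print(accepts("admirer", "Passive"))  # True in this toy table
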

Theoretical basis


The theoretical basis of Lexicon-grammar is Zellig Harris's distributionalism,[2] and in particular the notion of transformation in Harris's sense; Maurice Gross was in fact a student of Harris. The conventions for presenting grammatical information are intended to be as simple and transparent as possible. This concern comes from Harris, whose theory is oriented towards the directly observable "surface"; in this respect it differs from generative grammar, which typically relies on abstract structures such as deep structures.

Fact collection


The lexicon-grammatical method is inspired by experimental science. It emphasizes the collection of facts, and thus the confrontation with the reality of language use, from both a quantitative and a qualitative point of view.[3]

Quantitatively: a lexicon-grammar includes a program of systematic description of the lexicon. This involves large-scale work, carried out by teams rather than by individual specialists. The exclusive search for general rules of syntax, independent of the lexical material they handle, is denounced as a dead end. This differs from generative grammar, which values the notion of generalization.

Qualitatively: methodological precautions are applied to ensure good reproducibility of the observations, and in particular to guard against the risks associated with constructed examples. One of these precautions is to take the basic sentence as the minimal unit of meaning. Indeed, a word acquires a precise meaning only in context; moreover, by inserting a word into a sentence, one manipulates a sequence that can be judged acceptable or unacceptable. Only under these conditions can syntactico-semantic properties be considered defined precisely enough for comparisons across the whole lexicon to be meaningful. These precautions have evolved with needs and with the appearance of new technical means. Thus, from the beginning of the 1990s, contributors to lexicon-grammar have been able to draw more and more easily on attested examples from corpora. This new precaution has simply been added to the previous ones, making the lexicon-grammatical method one that belongs both to introspective linguistics and to corpus linguistics, much as advocated by Fillmore. The American projects FrameNet and VerbNet show a relative convergence towards objectives close to those of Lexicon-grammar.

Formalization


Lexicon-grammar also requires formalization. The results of the description must be sufficiently formal to allow for:

- verification by comparison with the reality of language use;

- application to natural language processing, and more particularly to deep linguistic processing, notably through the development of syntactic analyzers by computer scientists.
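
As an illustrative sketch of how such formalized descriptions could feed a syntactic analyzer, the following Python fragment filters candidate constructions for a given entry by consulting lexicon-grammar-style codes; the table fragment, entries and labels are invented for the example, not drawn from an actual Lexicon-grammar resource:

    # Hypothetical sketch of a parsing step that uses lexicon-grammar-style
    # codes to filter candidate analyses. The table fragment and labels are
    # invented for the example.

    TABLE = {
        "penser":  {"N0 V N1": "-", "N0 V Qu P": "+"},
        "admirer": {"N0 V N1": "+", "N0 V Qu P": "-"},
    }

    def licensed(entry, construction):
        """A candidate construction is kept only if the table codes it '+'."""
        return TABLE.get(entry, {}).get(construction) == "+"

    def filter_candidates(entry, candidates):
        """Return the candidate constructions licensed for the entry."""
        return [c for c in candidates if licensed(entry, c)]

    print(filter_candidates("penser", ["N0 V N1", "N0 V Qu P"]))
    # -> ['N0 V Qu P'] with this toy table
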

References

  1. ^ Harris, Zellig (1964). "Transformations in Linguistic Structure". Proceedings of the American Philosophical Society. 108 (5).
  2. ^ a b Harris, Zellig (1976). Notes du cours de syntaxe. Paris: Seuil.
  3. ^ Laporte, Eric (2015). "The Science of Linguistics". Inference: International Review of Science. 1 (2). doi:10.37282/991819.15.4.

See also


Selected bibliography
