The following abbreviations are used: 1 = first person; 2 = second person; 3 = third person; AD = agent disambiguator; ABS = absolutive; AUX = auxiliary; DAT = dative; DECL = declarative; DEM = demonstrative; ERG = ergative; FUT = future; IO = indirect object; IPFV = imperfective; IRR = irrealis; L = locative; N = non-; OBJ = object; PFV = perfective; PL = plural; PNCT = punctual; POSS = possessive; PRS = present; PST = past; Q = question marker; SBJ = subject; SF = stem formative; SG = singular; STAT = stative; TNS = tense.
In the analysis phase a series of modules (tokenizer, morphological analyzer, part-of-speech tagger, chunker, named entity recognizer, parser, and word sense disambiguator) generates an intermediate representation that is easy to transfer (syntax transfer, lexical transfer, and some transliteration).
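The staged pipeline described above can be sketched as a simple composition of analysis functions. This is a minimal illustration, not the actual system's API: the stage implementations and data shapes below are invented placeholders.

```python
# A toy sketch of an analysis pipeline in which each module consumes the
# previous module's output. Stage bodies are illustrative assumptions only.

def tokenize(text):
    # Naive whitespace tokenizer; a real tokenizer also handles
    # punctuation, clitics, and multiword units.
    return text.split()

def pos_tag(tokens):
    # Placeholder tagger: labels every token as a noun.
    return [(tok, "NOUN") for tok in tokens]

def build_pipeline(*stages):
    # Compose the stages left to right into a single callable,
    # mirroring tokenizer -> tagger -> ... -> disambiguator.
    def run(data):
        for stage in stages:
            data = stage(data)
        return data
    return run

analyze = build_pipeline(tokenize, pos_tag)
print(analyze("the cat sat"))  # [('the', 'NOUN'), ('cat', 'NOUN'), ('sat', 'NOUN')]
```

Further modules (chunker, parser, disambiguator) would be appended to the same composition, each enriching the intermediate representation.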
For the present study we have made use of a sentence builder, syllabifier, morphological analyzer, disambiguator
In the following subsections we give details about the three components of the framework most relevant to the disambiguation procedure: the linguistic analyzer, the ontologies processor, and the semantic disambiguator.
However, TRADUNED is a prototype of a knowledge-driven word sense disambiguator which relies on the idea that all the information necessary for discriminating either homonymous or polysemous senses in an input text can be found in the Collins COBUILD English Language Dictionary entries.
In their research on establishing a lower-bound baseline for measuring the significance of a disambiguator's accuracy, Gale et al.
The SPIRAL procedure cycles information through Guo's seed senses, Slator's LDOCE parser, the Genus Disambiguator, and Plate and McDonald's distributional network so as to yield a sense-tagging of the definition words in frames.
Though (13a) is the only pp-occurrence without any formal disambiguator in SUC and hence could easily have been excluded from the data, sentences showing formal markers may also have been affected by discourse-induced information.
Two examples were presented in earlier sections: the Necker cube (Figures 1, 3, and 4) and Cottrell's disambiguator.
These are quite potent disambiguators, but it is easy to construct examples that remain ambiguous even if these factors are all taken into account.
We know from the performance of disambiguators for the standard POS that the baseline performance is 90 percent or better.
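The roughly 90-percent figure corresponds to the most-frequent-tag baseline: tag each word with the tag it received most often in training data. A minimal sketch follows; the tiny "training" corpus and the NOUN default for unseen words are invented for illustration, not drawn from any real tagged corpus.

```python
from collections import Counter, defaultdict

# Toy most-frequent-tag baseline for POS tagging.
# The miniature training set below is invented for demonstration only.
train = [("the", "DET"), ("dog", "NOUN"), ("barks", "VERB"),
         ("the", "DET"), ("bark", "NOUN"), ("dog", "NOUN")]

counts = defaultdict(Counter)
for word, tag in train:
    counts[word][tag] += 1

def baseline_tag(word):
    # Assign each word its most frequent training tag;
    # unseen words default to NOUN (a common heuristic assumption).
    if word in counts:
        return counts[word].most_common(1)[0][0]
    return "NOUN"

print([baseline_tag(w) for w in ["the", "dog", "bark", "meows"]])
# ['DET', 'NOUN', 'NOUN', 'NOUN']
```

On real corpora this strategy already tags around nine words in ten correctly, which is why it serves as the lower bound that trained disambiguators must beat.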