
CG hybrid tagging


Tagging

The tagger is more robust against missing ambiguity sets. If it encounters an unseen ambiguity set, it picks a known one that is a) smallest and b) most frequent, in that order. This "nearest" ambiguity set is used in other places too.
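The selection above can be sketched as follows. This is a minimal illustration, not the actual Apertium implementation: the function name is hypothetical, and the assumption that candidate sets must cover the unseen set's tags goes beyond what the text states (it only specifies "smallest, then most frequent").

```python
def nearest_ambiguity_set(unseen, known_sets, freq):
    """Pick the 'nearest' known ambiguity set for an unseen one.

    unseen: frozenset of tags not present in the model.
    known_sets: list of frozensets seen during training.
    freq: dict mapping each known set to its training frequency.
    """
    # Assumption: prefer known sets that cover the unseen tags;
    # fall back to all known sets if none does.
    candidates = [s for s in known_sets if unseen <= s]
    if not candidates:
        candidates = known_sets
    # a) smallest size, then b) highest frequency (hence -freq).
    return min(candidates, key=lambda s: (len(s), -freq[s]))
```

For example, an unseen `{"vblex"}` would map to the smallest known set containing `vblex`, with ties broken by frequency.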

Apart from feeding in the ambiguity sets as-is after CG, which was the common practice before this work, the tagger can also work on a mix of untagged and CG output, discarding the CG analysis in favour of the untagged analysis whenever the CG output is still ambiguous.
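The mixing rule can be sketched in a few lines. This is an assumption-laden illustration (the function name and list-of-readings representation are hypothetical), not Apertium's actual code:

```python
def mixed_analysis(untagged_readings, cg_readings):
    """Combine untagged and CG output for one token.

    Keep the CG analysis only when CG has fully disambiguated the
    token (a single reading remains); otherwise discard it and use
    the full untagged analysis, as described in the text.
    """
    if len(cg_readings) == 1:
        return cg_readings
    return untagged_readings
```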


Tagger training

Both supervised and unsupervised:

Model part                | 0          | 1         | 2                   | 3          | 4
--------------------------|------------|-----------|---------------------|------------|---------------------
Ambiguity classes         | Dictionary | CG tagged | Dictionary          | Dictionary | CG tagged + trimming
Ambiguity class frequency | Untagged   | CG tagged | Untagged            | Untagged   | CG tagged
Corpus                    | Untagged   | CG tagged | CG tagged (nearest) | Mix        | CG tagged (nearest)

Note that in the case of supervised training, the corpus listed above is used in conjunction with the tagged corpus.

Results

For comparison, see Comparison of part-of-speech tagging systems.
