Tatar and Russian

From Apertium

Revision as of 22:03, 18 May 2014
This is a language pair translating from Tatar to Russian. The pair is currently located in nursery (https://svn.code.sf.net/p/apertium/svn/nursery/apertium-tat-rus/).

Current state

TODO: add a stats table here, in the manner it was done on the pages for monolingual modules. Essential things to track (following the Turkic-to-Turkic translator page):

  • testvoc (clean or not)
  • trimmed coverage
  • number of stems in bidix
  • WER on the development corpus
  • WER on unseen text(s)
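Both WER figures above (on the development corpus and on unseen texts) are word-level edit distances between the translator output and its post-edited reference, divided by the reference length. A minimal sketch of the computation (whitespace tokenisation is a simplifying assumption; in practice a scoring tool such as apertium-eval-translator would be used):

```python
def wer(reference, hypothesis):
    """Word error rate: word-level edit distance between a post-edited
    reference sentence and the MT hypothesis, divided by the number of
    reference words."""
    ref = reference.split()
    hyp = hypothesis.split()
    # Standard Levenshtein dynamic programme over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)
```

For example, wer("he saw it", "he saw that") is 1/3: one substitution over three reference words.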

Workplan (GSoC 2014)

This is the workplan for development of the Tatar to Russian translator in Google Summer of Code 2014.

Major goals

  • Clean testvoc
  • 10000 top stems in bidix and at least 80% trimmed coverage
  • A constraint grammar for Tatar containing at least 1000 rules, making 90–95% of all words unambiguous, with at least 95% of them retaining the correct analysis
  • Average WER on unseen texts below 50%

Terminology

  • Trimmed coverage means the coverage of the morphological analyser after it has been trimmed according to the bilingual dictionary of the pair, that is, so that it only contains stems which are also in the bilingual dictionary.
  • Testvoc-lite for a category means taking one word from each sub-category and making the full paradigm of that word pass through the translator without debug symbols in the output. E.g. "testvoc-lite for nouns clean" means that if we leave only one word in each of the N1, N-COMPOUND-PX etc. lexicons, they will pass through the translator without [@#] errors.
  • Evaluation means taking texts and measuring the post-edition word error rate (WER); the output for those texts should be clean.
Week | Dates       | Goal
-----+-------------+-----------------------------
   1 | 19/05–25/05 | Testvoc-lite for nouns clean