Kazakh and Tatar/Work plan


This is the work plan for development of the Kazakh and Tatar translator in Google Summer of Code 2012.

  • Trimmed coverage means the coverage of both morphological analysers after they have been trimmed according to the bilingual dictionary of the pair, that is, so that they only contain stems which are also in the bilingual dictionary.
  • Testvoc for a category means that the category is testvoc clean in both translation directions.
  • Evaluation means taking n words and performing an evaluation for the post-edition word error rate (WER); a minimal WER sketch follows this list. The output for those n words should be clean.
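
A post-edition WER of this kind is essentially a word-level edit distance between the raw machine translation and its post-edited version, divided by the length of the post-edited reference. The following minimal Python sketch illustrates the computation; the file names and the plain whitespace tokenisation are assumptions for illustration, not the exact behaviour of apertium-eval-translator.

<pre>
# Word-level WER sketch: Levenshtein distance over tokens, divided by
# the number of words in the post-edited reference.  File names and the
# whitespace tokenisation are illustrative assumptions only.
from pathlib import Path

def wer(hypothesis: str, reference: str) -> float:
    hyp, ref = hypothesis.split(), reference.split()
    # Dynamic-programming edit distance over word tokens.
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        curr = [i] + [0] * len(hyp)
        for j, h in enumerate(hyp, start=1):
            cost = 0 if r == h else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # substitution
        prev = curr
    return prev[-1] / max(len(ref), 1)

if __name__ == "__main__":
    mt = Path("texts/story.kaz-tat.txt").read_text(encoding="utf-8")
    postedited = Path("texts/story.kaz-tat.postedit.txt").read_text(encoding="utf-8")
    print(f"WER: {100 * wer(mt, postedited):.2f}%")
</pre>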

("news" is tat.news.2005-2011_300K-sentences.txt.bz2, "wp" - kaz.fixed.wikipedia.2010_100k-sentences.txt.bz2 [1])

{|class=wikitable
! Week !! Dates !! B-trimmed coverage goal !! Reached (news, wp) !! Testvoc !! Completed !! Evaluation !! WER !! Notes
|-
| 0 || 23/04—21/05 || 45% || ??, ?? || <postadv> <ij> || || 500 words || 13.39% || Preliminary evaluation. Translate the story with total coverage and without diagnostics (in both directions). Get a baseline WER. Work on disambiguation; the morphological ambiguities in the story should be resolved.
|-
| 1 || 21/05—27/05 || 50% || ??, ?? || <num> <post> || || - || || Basic numerals and postpositions should be clean.
|-
| 2 || 28/05—03/06 || 53% || 56.7%, 57.4% || <cnjcoo> <cnjadv> <cnjsub> || || - || ||
|-
| 3 || 04/06—10/06 || 59% || 59.3%, 59.8% || <adv> <abbr> || || 200 words || 29.51% || Work on disambiguation.
|-
| 4 || 11/06—17/06 || 63% || 59.3%, 59.9% || <prn> <det> || No || - || ||
|-
| 5 || 18/06—24/06 || 68% || 67.1%, 64.3%[2] || <adj> <adj><advl> || || - || || Adjectives with attributive and adverbial function should be clean. Numerals + affixes should have their phonology solved.
|-
| 6 || 25/06—01/07 || 70% || 73.4%, 70.3% || <n> <num><subst> <np> <adj><subst> || || 500 words || 23.79%[3] || Midterm evaluation. Work on disambiguation.
|-
| 7 || 02/07—08/07 || 73% || 73.5%, 70.4% || || || - || || Worked on translation of kaz.crp.txt
|-
| 8 || 09/07—15/07 || 75% || 76.2%, 72.7% || <prn> || No || - || || Pronouns should be testvoc clean.
|-
| 9 || 16/07—22/07 || 77% || 77.7%, 75.5% || - || || 200 words || || Work on disambiguation.
|-
| 10 || 23/07—29/07 || 80% || 79.1%, 76.8% || <v> || No || - || || Verbs should be testvoc clean.
|-
| 11 || 30/07—05/08 || 80% || 79.9%, 77.4% || - || || - || || Work on disambiguation.
|-
| 12 || 06/08—12/08 || 80% || || all categories clean || || 500 words || || Final evaluation. Tidying up, releasing.
|}
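
In Apertium output, coverage and testvoc problems are usually read off the debug marks left in the stream: tokens unknown to the analyser are prefixed with '*', words missing from the bilingual dictionary with '@', and generation errors with '#'. The sketch below is a minimal counter over a corpus translated with those marks left in; the default file name is a placeholder and the token-level counting is a simplification, not the project's exact measurement procedure.

<pre>
# Rough coverage / cleanliness counter for Apertium-style output.
# Assumes the corpus was run through the pipeline with debug marks
# kept: '*' = unknown word, '@' = missing in the bilingual dictionary,
# '#' = generation error.  The default file name is a placeholder.
import re
import sys

MARKS = ("*", "@", "#")

def count_marks(path: str) -> None:
    with open(path, encoding="utf-8") as f:
        tokens = re.findall(r"\S+", f.read())
    total = len(tokens)
    flagged = {mark: sum(tok.startswith(mark) for tok in tokens) for mark in MARKS}
    coverage = 100 * (total - flagged["*"]) / max(total, 1)
    print(f"{total} tokens, naive coverage {coverage:.1f}% (tokens without '*')")
    print(f"testvoc-style problems: {flagged['@']} '@', {flagged['#']} '#'")

if __name__ == "__main__":
    count_marks(sys.argv[1] if len(sys.argv) > 1 else "kaz.wikipedia.txt")
</pre>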

Notes

  1. Although both lexicons have the same number of stems, the transducer for Tatar usually has lower coverage. This is probably because some forms aren't implemented yet, e.g. some verbal forms. Here I give numbers for both directions; note that the domains are different.
  2. As of Tuesday, 26 June 2012.
  3. apertium-kaz-tat$ perl ../../trunk/apertium-eval-translator/apertium-eval-translator.pl -t texts/3story.kaz-tat.12072012.txt -r texts/3story.kaz-tat.postedit.txt