Apertium-uzb-kaa
Revision as of 23:51, 3 June 2019
Work plan
Week | Dates | Stems goal | Achieved? | Stems | Cov uzb | Cov kaa | Corpus testvoc uzb→kaa (no *, no */@, no */@/# errors) | Corpus testvoc kaa→uzb (no *, no */@, no */@/# errors) | WER,PER uzb→kaa | WER,PER kaa→uzb | Evaluation | Notes
---|---|---|---|---|---|---|---|---|---|---|---|---
0 | 20/05—26/05 | -- | -- | (uzb) 4083, (uzb-kaa) 412 | jam(75.22), udhr(34.21), bible(78.80) | | jam(17.7, 17.7, 14.8) | | jam(106.15, 102.91), bible() | | |
1 | 27/05—02/06 | 5000 from rus-{uzb,kaa} to each monodix | ✘ | (kaa) 5757, (uzb) 5934, (uzb-kaa) 412 | jam(79.61%), udhr(34.21%), bible(79.58%) | | | | | | | uzb: 6c3887d
2 | 03/06—09/06 | 5000 from rus-{uzb,kaa} to each monodix | | | | | | | | | |
3 | 10/06—16/06 | 5000 from rus-{uzb,kaa} to each monodix | | | | | | | | | |
4 | 17/06—23/06 | 5000 from rus-{uzb,kaa} to each monodix | | | | | | | | | |
Comments
- Bidix: number of stems in the bilingual dictionary.
- The number of stems is taken from the header that "apertium-dixtools format-1line 10 55 apertium-uzb-kaa.uzb-kaa.dix" produces.
- Corpus testvoc: cat <txt> | apertium-uzb-kaa/testvoc/corpus/trimmed-coverage.sh {uzb-kaa,kaa-uzb}. Corpora can be found here.
- WER is calculated as shown below. Currently the WER numbers are only a crude approximation, since the input text and the -ref text are probably independent translations (not parallel).
- Evaluation consists of taking words and measuring the post-edition word error rate (WER) on them. The output for those words should be clean.
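The WER and PER reported by apertium-eval-translator.pl can be sketched in a few lines. The following is a minimal illustration assuming the standard definitions (word-level edit distance over reference length for WER; position-independent matched words for PER), not the actual Perl script; the sample sentences are invented.

```python
from collections import Counter

def edit_distance(ref, hyp):
    """Word-level Levenshtein distance via dynamic programming."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (r != h)))  # substitution
        prev = curr
    return prev[-1]

def wer(ref, hyp):
    """Word error rate: edit distance divided by reference length."""
    return edit_distance(ref, hyp) / len(ref)

def per(ref, hyp):
    """Position-independent error rate: 1 - matched words / ref length."""
    matched = sum((Counter(ref) & Counter(hyp)).values())
    return 1 - matched / len(ref)

# Invented example: one substituted word out of three.
ref = "men mektepke baraman".split()
hyp = "men mektep baraman".split()
print(round(wer(ref, hyp), 3))  # 0.333
```

In practice the reference and hypothesis would come from the -ref file and the apertium pipeline output, tokenized on whitespace.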
Related terms:
- Trimmed coverage means the coverage of both morphological analysers after they have been trimmed according to the pair's bilingual dictionary, i.e. so that they contain only stems that are also in the bilingual dictionary.
- Testvoc for a category means that the category is testvoc clean, in both translation directions.
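Naive coverage, as in the trimmed-coverage figures above, can be sketched as the share of tokens the (trimmed) analyser recognizes. Apertium marks unknown surface forms with a leading "*", so a rough approximation is simply the fraction of output tokens without that mark. This is an assumption-laden sketch; the actual trimmed-coverage.sh may differ in tokenization and counting details.

```python
def coverage(tokens):
    """Fraction of tokens not marked unknown (no leading '*')."""
    known = sum(1 for t in tokens if not t.startswith("*"))
    return known / len(tokens)

# Invented example tokens, as if from:
#   cat corpus.txt | apertium -d . uzb-kaa | tr ' ' '\n'
tokens = ["men", "*mekteb", "baraman", "*kitob"]
print(coverage(tokens))  # 0.5
```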
apertium-uzb-kaa$ perl ../../../../../../src/sourceforge-apertium/trunk/apertium-eval-translator/apertium-eval-translator.pl -test <(cat ../../../data4apertium/corpora/jam/uzb.txt | apertium -d . uzb-kaa) -ref ../../../data4apertium/corpora/jam/kaa.txt
Test file: '/dev/fd/63'
Reference file '../../../data4apertium/corpora/jam/kaa.txt'

Statistics about input files
-------------------------------------------------------
Number of words in reference: 341
Number of words in test: 309
Number of unknown words (marked with a star) in test: 284
Percentage of unknown words: 91.91 %

Results when removing unknown-word marks (stars)
-------------------------------------------------------
Edit distance: 325
Word error rate (WER): 95.31 %
Number of position-independent correct words: 26
Position-independent word error rate (PER): 92.38 %

Results when unknown-word marks (stars) are not removed
-------------------------------------------------------
Edit distance: 328
Word Error Rate (WER): 96.19 %
Number of position-independent correct words: 21
Position-independent word error rate (PER): 93.84 %

Statistics about the translation of unknown words
-------------------------------------------------------
Number of unknown words which were free rides: 3
Percentage of unknown words that were free rides: 1.06 %
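The percentages in the transcript follow directly from the raw counts it reports. A quick re-derivation, assuming WER = edit distance / reference words and PER = (reference words - position-independent correct words) / reference words:

```python
# Raw counts from the apertium-eval-translator run above.
ref_words = 341   # words in reference
unknown = 284     # unknown (starred) words in test

wer_no_stars = 325 / ref_words            # edit distance with stars removed
per_no_stars = (ref_words - 26) / ref_words
wer_with_stars = 328 / ref_words          # edit distance with stars kept
free_rides = 3 / unknown                  # unknown words that were free rides

print(f"{wer_no_stars:.2%}")    # 95.31%
print(f"{per_no_stars:.2%}")    # 92.38%
print(f"{wer_with_stars:.2%}")  # 96.19%
print(f"{free_rides:.2%}")      # 1.06%
```

The recomputed figures match the script's output, which is a useful sanity check when comparing runs across weeks.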