Translation quality statistics
Revision as of 13:26, 4 September 2014

This page aims to give an overview of the quality of various translators available in the Apertium platform.

  • Word Error Rate (WER) and Position-independent Word Error Rate (PWER) are measures of post-edition effort. The number gives the expected number of words needed to be corrected in 100 words of running text. So, a WER of 4.7% indicates that in a given 100 words of text, 4.7 of them will need to be corrected by the post-editor – for WER, lower is better.
  • Bilingual Evaluation Understudy (BLEU) varies from 0 (bad) to 1 (perfect), so for BLEU, higher is better.
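As a rough illustration of how these scores are computed, here is a minimal Python sketch of WER, PWER, and single-reference sentence-level BLEU. This is not the script used for any of the evaluations on this page; in particular, the PWER normalisation and the lack of BLEU smoothing are simplifying assumptions, and conventions vary between implementations.

```python
import math
from collections import Counter

def wer(reference: str, hypothesis: str) -> float:
    """Word Error Rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming table for edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

def pwer(reference: str, hypothesis: str) -> float:
    """Position-independent WER: word order is ignored, so only the
    bag-of-words mismatch counts (normalisation conventions vary)."""
    ref, hyp = Counter(reference.split()), Counter(hypothesis.split())
    matches = sum((ref & hyp).values())
    ref_len, hyp_len = sum(ref.values()), sum(hyp.values())
    return (max(ref_len, hyp_len) - matches) / ref_len

def bleu(reference: str, hypothesis: str, max_n: int = 4) -> float:
    """Unsmoothed sentence-level BLEU against a single reference."""
    ref, hyp = reference.split(), hypothesis.split()
    log_precision = 0.0
    for n in range(1, max_n + 1):
        ref_ngrams = Counter(tuple(ref[i:i + n]) for i in range(len(ref) - n + 1))
        hyp_ngrams = Counter(tuple(hyp[i:i + n]) for i in range(len(hyp) - n + 1))
        matches = sum((ref_ngrams & hyp_ngrams).values())
        if matches == 0:
            return 0.0  # any zero n-gram precision zeroes the unsmoothed score
        log_precision += math.log(matches / sum(hyp_ngrams.values()))
    # Brevity penalty: punish hypotheses shorter than the reference.
    bp = 1.0 if len(hyp) >= len(ref) else math.exp(1 - len(ref) / len(hyp))
    return bp * math.exp(log_precision / max_n)

# Reordered words: WER counts them as errors, PWER does not.
print(wer("the cat sat down", "the cat down sat"))   # 0.5
print(pwer("the cat sat down", "the cat down sat"))  # 0.0: same bag of words
```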

Precise numbers may vary due to differences in how sentences are selected for evaluation. In some pairs, unknown words may be taken into account; in others, not. Evaluations where unknown words are allowed will likely give more accurate numbers for post-edition effort, provided the corpus on which the evaluation was made resembles the corpus on which further translations will be made. Evaluations not allowing unknown words will give a better indication of the "best-case" performance of the transfer rules.

{| class="wikitable sortable"
! Translator !! Date !! Version !! Direction !! Unknown words !! WER !! PWER !! BLEU !! Reference / Notes
|-
| apertium-eo-fr || 11th February 2011 || - || fr → eo || Yes || 22.4 % || 20.6 % || - || French_and_Esperanto/Quality_tests
|-
| apertium-eo-fr || 11th February 2011 || - || eo → fr || Yes || - || - || - ||
|-
| apertium-mk-en || 19th September 2010 || 0.1.0 || mk → en || No || 43.96 % || 31.22 % || - || Percentage is average of 1,000 words from SETimes and 1,000 from Wikipedia
|-
| apertium-mk-en || 19th September 2010 || 0.1.0 || en → mk || No || - || - || - ||
|-
| apertium-mk-bg || 31st August 2010 || 0.1.0 || mk → bg || Yes || 26.67 % || 25.39 % || - || -
|-
| apertium-mk-bg || 31st August 2010 || 0.1.0 || bg → mk || Yes || - || - || - ||
|-
| apertium-nn-nb || 12th October 2009 || 0.6.1 || nn → nb || Yes || - || - || - || Unhammer and Trosterud, 2009 (two reference translations)
|-
| apertium-nn-nb || 12th October 2009 || 0.6.1 || nb → nn || Yes || 32.5 %, 17.7 % || - || 0.74 || Unhammer and Trosterud, 2009 (two reference translations)
|-
| apertium-br-fr || March 2010 || 0.2.0 || br → fr || No || 38 % || 22 % || - || Tyers, 2010
|-
| apertium-br-fr || March 2010 || 0.2.0 || fr → br || No || - || - || - ||
|-
| apertium-sv-da || 12th October 2009 || 0.5.0 || sv → da || Yes || 30.3 % || 27.7 % || - || Swedish_and_Danish/Evaluation
|-
| apertium-sv-da || 12th October 2009 || 0.5.0 || da → sv || Yes || - || - || - ||
|-
| apertium-eu-es || 2nd September 2009 || - || eu → es || Unknown || 72.4 % || 39.8 % || - || Ginestí-Rosell et al., 2009
|-
| apertium-eu-es || 2nd September 2009 || - || es → eu || Unknown || - || - || - ||
|-
| apertium-cy-en || 2nd January 2009 || - || cy → en || Unknown || 55.7 % || 30.5 % || - || Tyers and Donnelly, 2009
|-
| apertium-cy-en || 2nd January 2009 || - || en → cy || Unknown || - || - || - ||
|-
| apertium-eo-en || 8th May 2009 || 0.9.0 || en → eo || Unknown || 21.0 % || 19.0 % || - || English_and_Esperanto/Evaluation
|-
| apertium-eo-en || 8th May 2009 || 0.9.0 || eo → en || Unknown || - || - || - ||
|-
| apertium-es-pt || 15th May 2006 || - || es → pt || Unknown || 4.7 % || - || - || Armentano et al., 2006
|-
| apertium-es-pt || 15th May 2006 || - || pt → es || Unknown || 11.3 % || - || - ||
|-
| apertium-oc-ca || 10th May 2006 || - || oc → ca || Unknown || 9.6 % || - || - || Armentano and Forcada, 2006
|-
| apertium-oc-ca || 10th May 2006 || - || ca → oc || Unknown || - || - || - ||
|-
| apertium-pt-ca || 28th July 2008 || - || pt → ca || Unknown || 16.6 % || - || - || Armentano and Forcada, 2008
|-
| apertium-pt-ca || 28th July 2008 || - || ca → pt || Unknown || 14.1 % || - || - ||
|-
| apertium-en-es || May 2009 || - || en → es || Unknown || - || - || 0.1851 || Sánchez-Martínez, 2009
|-
| apertium-en-es || May 2009 || - || es → en || Unknown || - || - || 0.1881 ||
|}


References