Task ideas for Google Code-in/Evaluation of translation of an existing pair
Revision as of 15:14, 4 December 2019
Given a language pair, evaluate the translation quality on a corpus of 100 sentences.
You must evaluate each sentence in three ways:
- Overall translation quality (0-5)
- Fluency (0-5): how well-formed the target sentence is in the target language.
- Adequacy/Fidelity (0-5): how well the translation captures the meaning of the original sentence.
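One convenient way to keep track of the three scores and summarise them across the corpus is a short script. The sketch below is a hypothetical illustration, not part of the task itself: it assumes each evaluation is stored as an (overall, fluency, adequacy) triple, and the sample numbers are placeholders rather than real judgements.

```python
from statistics import mean

# Each evaluation: (overall, fluency, adequacy), each on a 0-5 scale.
# The scores below are illustrative placeholders, not real evaluations.
evaluations = [
    (4, 5, 3),
    (2, 3, 2),
    (5, 5, 4),
]

def summarise(evals):
    """Return the mean of each of the three 0-5 scores over the corpus."""
    overall, fluency, adequacy = zip(*evals)
    return {
        "overall": mean(overall),
        "fluency": mean(fluency),
        "adequacy": mean(adequacy),
    }

print(summarise(evaluations))
```

For the full task you would collect one triple per sentence (100 in total) and report the three averages alongside your per-sentence scores.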
Ask for a corpus if you claim this task.