Task ideas for Google Code-in/Evaluation of translation of an existing pair
Latest revision as of 19:56, 12 April 2021
Given a language pair, evaluate the translation quality on a corpus of 50 sentences.
You must evaluate each sentence in three ways:
- Overall translation quality (0-5)
- Fluency (0-5): how well-formed the target sentence is in the target language.
- Adequacy/Fidelity (0-5): how well the translation captures the meaning of the original sentence.
Ask for a corpus if you claim this task.
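If it helps to keep the three scores organized while working through the corpus, the scoring could be recorded and summarized as in this minimal sketch. The field names and corpus layout here are illustrative assumptions, not part of the task description:

```python
def validate_score(value):
    """Check that a score is an integer in the task's 0-5 range."""
    if not isinstance(value, int) or not 0 <= value <= 5:
        raise ValueError(f"score must be an integer from 0 to 5, got {value!r}")
    return value

def summarize(evaluations):
    """Average each criterion over all evaluated sentences.

    `evaluations` is a list of dicts, one per sentence, with the three
    criteria named in the task: overall, fluency, and adequacy (0-5 each).
    """
    criteria = ("overall", "fluency", "adequacy")
    totals = {c: 0 for c in criteria}
    for row in evaluations:
        for c in criteria:
            totals[c] += validate_score(row[c])
    n = len(evaluations)
    return {c: totals[c] / n for c in criteria}

# Example: two evaluated sentences out of the 50-sentence corpus.
scores = [
    {"overall": 4, "fluency": 5, "adequacy": 3},
    {"overall": 2, "fluency": 3, "adequacy": 2},
]
print(summarize(scores))  # per-criterion averages over the corpus
```

Averaging per criterion at the end makes it easy to see, for example, whether a pair is fluent but unfaithful (high fluency, low adequacy) or vice versa.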