User:Francis Tyers/Sandbox
Lexical selection
Information
- Surface form -- tud etc.
- Lemma -- den etc.
- Category -- n.f etc.
- Syntax -- @SUBJ etc.
Ideas
For some things linguistic knowledge is better, or easier; it is also better for hacking. For other things statistics are better, giving wider coverage more cheaply. The lexical selection module(s) should allow both the use of rules and of statistics: rules for things we "know", statistics for those we don't.
Inferring rules from collocations
Rules as described below are already used in apertium-cy-en, apertium-br-fr and apertium-sme-smj. This stage would be the first pass of lexical selection.
- The bilingual dictionary has several translations for each ambiguous word.
- Rules are created to select between them based on context.
- For each word in the bilingual dictionary, collocations (n-grams) are extracted from a source language corpus.
+ in, skyldi ég þá á munúð hyggja, þar sem bóndi minn er einnig gamall?``
+ ,Drottinn hefir séð raunir mínar. Nú mun bóndi minn elska mig.``
þunguð og ól son. Þá sagði hún: ,,Nú mun bóndi minn loks hænast að mér, því að é
,,Guð hefir gefið mér góða gjöf. Nú mun bóndi minn búa við mig, því að ég hefi
af, þá haldi hann bótum uppi, slíkum sem bóndi konunnar kveður á hann, og greiði
l niður fyrir húsdyrum mannsins, þar sem bóndi hennar var inni, og lá þar, uns b
- 27 En er bóndi hennar reis um morguninn og lauk
+ kubúinn hafi soltið til þess að franskur bóndi þurfi ekki
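A minimal sketch of this extraction step. The window size, the plain-text corpus format and the script name are assumptions for illustration, not part of the description above.

# Sketch: extract n-gram contexts around ambiguous source-language words.
# The window size and the plain-text, one-sentence-per-line corpus format are assumed.
import sys
from collections import defaultdict

WINDOW = 3  # words of context on each side (assumed; see the question on window width below)

def extract_contexts(corpus_path, ambiguous_words):
    contexts = defaultdict(list)
    with open(corpus_path, encoding='utf-8') as corpus:
        for line in corpus:
            tokens = line.split()
            for i, token in enumerate(tokens):
                if token in ambiguous_words:
                    left = tokens[max(0, i - WINDOW):i]
                    right = tokens[i + 1:i + 1 + WINDOW]
                    contexts[token].append(' '.join(left + [token] + right))
    return contexts

if __name__ == '__main__':
    # e.g.  python extract_contexts.py is.corpus bóndi
    for word, ngrams in extract_contexts(sys.argv[1], set(sys.argv[2:])).items():
        for ngram in ngrams:
            print(word + '\t' + ngram)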
- For each ambiguous word, these collocations are run through the translator with each of the entries in the bilingual dictionary.
- not the translator but just the bilingual dictionary? --Mlforcada 10:30, 10 October 2009 (UTC)
- how wide is the window around the problem word? is it symmetrical? --Mlforcada 10:30, 10 October 2009 (UTC)
as my farmer is
Now #remember my farmer love
Now #remember my farmer the lid
Now #remember my farmer live
*slíkum as the woman's farmer composes
as her farmer was
But is her farmer rose
to French farmer need not
as my husband is
Now #remember my husband love
Now #remember my husband the lid
Now #remember my husband live
*slíkum as the woman's husband composes
as her husband was
But is her husband rose
to French husband need not
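One simple way of producing the kind of alternatives shown above is to translate each context once and then substitute every candidate target lemma from the bilingual dictionary in place of the ambiguous word's translation; whether this goes through the full translator or just the bilingual dictionary is exactly the question raised in the comments. A sketch, where the '{AMBIG}' placeholder convention and the candidate list are assumptions:

# Sketch: expand one translated context into one variant per candidate translation.
# The '{AMBIG}' placeholder and the CANDIDATES table are assumptions for illustration.
CANDIDATES = {'bóndi': ['farmer', 'husband']}

def expand(context_template, sl_word):
    # context_template, e.g. 'as my {AMBIG} is', is the translated context
    # with the ambiguous word's translation held out
    return [(cand, context_template.replace('{AMBIG}', cand))
            for cand in CANDIDATES[sl_word]]

for cand, variant in expand('as my {AMBIG} is', 'bóndi'):
    print(cand + '\t' + variant)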
- Translations are scored on a target language corpus. -- The target language model training corpora would need to be preprocessed in some cases, for example to give the word in its POS or syntactic context.
n _farmer_ prn.pos, n _husband_ prn.pos
etc. The number of target words would be limited to the number of correspondences in the bilingual dictionary.
- What do you mean by the number of target words? --Mlforcada 10:30, 10 October 2009 (UTC)
- Wouldn't it be similar to do this as in Sánchez-Martínez et al. (2008), that is, run all "disambiguations" through the dictionary and score the translations themselves? --Mlforcada 10:30, 10 October 2009 (UTC)
Vector Element0 : -6.13119,as my farmer Vector Element1 : -1.5997,as my husband Vector Element0 : -5.93468,Now remember my farmer Vector Element1 : -3.19992,Now remember my husband Vector Element0 : -6.13119,slíkum as my farmer Vector Element1 : -1.5997,slíkum as my husband Vector Element0 : -5.55918,as her farmer Vector Element1 : -2.81087,as her husband Vector Element0 : -5.58205,But is her farmer Vector Element1 : -2.83373,But is her husband Vector Element0 : -4.54752,to French farmer Vector Element1 : -5.27222,to French husband
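The scores above look like log-probabilities from a target-language model. A minimal stand-in for such a model is sketched below: a bigram model with add-one smoothing trained on a plain-text target corpus. The real setup (model order, smoothing, corpus name) may well differ.

# Sketch: score candidate strings with a simple bigram language model.
# Add-one smoothing and the 'en.corpus' filename are simplifying assumptions.
import math
from collections import Counter

class BigramLM:
    def __init__(self, corpus_path):
        self.unigrams = Counter()
        self.bigrams = Counter()
        with open(corpus_path, encoding='utf-8') as corpus:
            for line in corpus:
                tokens = ['<s>'] + line.split() + ['</s>']
                self.unigrams.update(tokens)
                self.bigrams.update(zip(tokens, tokens[1:]))
        self.vocab = len(self.unigrams)

    def logprob(self, text):
        tokens = ['<s>'] + text.split() + ['</s>']
        score = 0.0
        for prev, cur in zip(tokens, tokens[1:]):
            score += math.log((self.bigrams[(prev, cur)] + 1) /
                              (self.unigrams[prev] + self.vocab))
        return score

lm = BigramLM('en.corpus')
for variant in ['as my farmer is', 'as my husband is']:
    print(lm.logprob(variant), variant)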
- Where the difference in score between one translation and another reaches a threshold, a rule is created in the form of:
MAP (husband) ("bóndi") IF (1 ("minn"));
- Morphology or syntax could also be included.
MAP (husband) ("bóndi") IF (1 PrnPos);
MAP (husband) ("bóndi") IF (-1 Genitive);
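A sketch of the rule-generation step: when the score of one translation beats the others by a threshold, emit a constraint-grammar MAP rule keyed on a context word. The threshold value and the use of the immediately following word as the trigger are assumptions.

# Sketch: turn scored translation alternatives into CG MAP rules.
# Each entry pairs a candidate translation with the word immediately
# following the ambiguous source word (position +1) and its LM log-probability.
THRESHOLD = 2.0  # assumed log-probability margin

def make_rules(sl_lemma, scored_variants):
    # scored_variants: [(translation, following_word, logprob), ...]
    rules = set()
    by_context = {}
    for translation, following, score in scored_variants:
        by_context.setdefault(following, []).append((score, translation))
    for following, scored in by_context.items():
        scored.sort(reverse=True)
        if len(scored) > 1 and scored[0][0] - scored[1][0] >= THRESHOLD:
            winner = scored[0][1]
            rules.add('MAP (%s) ("%s") IF (1 ("%s"));' % (winner, sl_lemma, following))
    return sorted(rules)

example = [('husband', 'minn', -1.6), ('farmer', 'minn', -6.1)]
for rule in make_rules('bóndi', example):
    print(rule)   # -> MAP (husband) ("bóndi") IF (1 ("minn"));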
- It would be interesting to see if rules can be learnt which use different discriminators (e.g. surface form, syntax).
- To select the winner, one could use a maximum-entropy approach in which the absence or presence of particular trigger words in the context would be treated as a feature. Then the winner would be chosen maximizing the probability. There is the work by Márquez et al. and also Armando Suárez's DLSI thesis. However, these fall quite far from being applicable in Apertium, so some engineering would be needed. --10:30, 10 October 2009 (UTC)
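A rough sketch of the maximum-entropy idea in the comment above, using a logistic-regression classifier over bag-of-context-words (trigger word) features; scikit-learn is just a convenient stand-in here and the training data is invented for illustration, so this is nowhere near Apertium-ready.

# Sketch: maximum-entropy lexical selection with trigger-word features.
# The contexts and labels below are invented for illustration only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

contexts = ['bóndi minn elska', 'bóndi minn búa', 'franskur bóndi þurfi']
labels = ['husband', 'husband', 'farmer']

vectorizer = CountVectorizer()        # presence/count of context words as features
X = vectorizer.fit_transform(contexts)
model = LogisticRegression()
model.fit(X, labels)

print(model.predict(vectorizer.transform(['bóndi minn er gamall'])))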
- Another interesting question is: instead of rules, could you detect (in some cases) clear multiwords that would go directly into dictionaries? --Mlforcada 10:30, 10 October 2009 (UTC)
- Advantages
- Fairly straightforward -- the rules can be created automatically in constraint grammar.
- Human readable / editable.
- Doesn't require parallel corpus -- although might work better with one.
- Unsupervised.
- Disadvantages
- Many rules will be slow.
- That is why probably it is a good idea to move as much inferred stuff as possible to the dictionary --Mlforcada 10:30, 10 October 2009 (UTC)
- Might not work very well.
- Relevant prior work
- Jin Yang (1999) "Towards the Automatic Acquisition of Lexical Selection Rules"
- Eckhard Bick (2005) "Dan2eng: Wide-Coverage Danish-English Machine Translation"
- Examples
Pediñ can translate as 'prier' or 'inviter'. If it is used transitively it means 'inviter'; intransitively it means 'prier'.
- o huñvreal muioc'h eget o pediñ .
- Leur *huñvreal plus que en train de prier .
- Koulskoude e tiviz Francis pediñ e zaou vreur d'ober ...
- Pourtant il décide Francis prier ses deux frères à faire ...
- O fal a zo pediñ arzourien a bep seurt evel kizellerien
- Leur objectif il est inviter des artistes de toute sorte comme les sculpteurs
- ... bleunioù ha peadra da yac'haat o zreid hag o pediñ evito ...
- ... de fleurs et des moyens à guérir leurs pieds et en train de prier pour eux ...
- ha tu a oa bet d'al labourerien pediñ o familhoù hag o mignoned
- ... et il y avait moyen été aux travailleurs prier leurs familles et leurs amis ...
- Raktresoù all a zo ivez : pediñ skrivagnerien a-benn eskemm ganto
- ... de Projets autres il est aussi : inviter des écrivains pour échanger avec eux ...
- Sharon Stone eo bet an hini gwellañ evit pediñ an embregerien da zisammañ
- *Sharon *Stone il a été les ceux le plus mieux pour prier les entrepreneurs à décharger ...
The current rule says: SUBSTITUTE (vblex) (vblex tv) ("pediñ" vblex) (1C NC); that is, "choose 'inviter' if the next word can only be a common noun". Obviously, this fails in the case of definite NPs, o familhoù 'their families'.
- To read
- Hinrich Schütze "Automatic Word Sense Discrimination"
- Hang Li and Cong Li "Word Translation Disambiguation Using Bilingual Bootstrapping"
- E. Crestan "Which length for a Multi-level view of content for WSD"
- Noah Coccaro "Towards better integration of semantic prediction in statistical language modelling"
- David Vickrey "Word-sense disambiguation for machine translation"
- Lucia Specia "Multilingual versus monolingual WSD"
- Lucia Specia "Mining rules for WSD"
- SUPERTAGS.
Pipeline
You need a tagged source language corpus:
^L'/El<det><def><mf><sg>$ ^origen/origen<n><m><sg>$ ^de/de<pr>$ ^l'/el<det><def><mf><sg>$ ^àbac/àbac<n><m><sg>$ ^està/estar<vblex><pri><p3><sg>$ ^literalment/literalment<adv>$ ^perdut/perdre<vblex><pp><m><sg>$ ^en/en<pr>$ ^el/el<det><def><m><sg>$ ^temps/temps<n><m><sp>$
A list of ambiguities extracted from your bilingual dictionary,
time<n>:<:temps<n><:0>
weather<n>:<:temps<n><:1>
language<n>:<:llengua<n><:0>
tongue<n>:<:llengua<n><:1>
history<n>:<:història<n><:0>
story<n>:<:història<n><:1>
station<n>:<:estació<n><:0>
season<n>:<:estació<n><:1>
Only the first tag is taken into account.
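A sketch of how the ambiguity list might be parsed; the ':<:' separator and the trailing '<:N>' index interpretation follow the format shown above, and only the first tag of the source entry is kept, as noted. The filename is an assumption.

# Sketch: parse the ambiguity list into {source entry: number of alternatives}.
# 'ambiguities.txt' is an assumed filename for the list shown above.
import re
from collections import defaultdict

def parse_ambiguities(path):
    alternatives = defaultdict(set)
    with open(path, encoding='utf-8') as ambig:
        for entry in ambig.read().split():
            tl, sl = entry.split(':<:')           # e.g. 'time<n>' and 'temps<n><:0>'
            match = re.match(r'(.+?<[^<>]+>).*<:([0-9]+)>$', sl)
            sl_key, index = match.group(1), int(match.group(2))  # keep only the first tag
            alternatives[sl_key].add(index)
    return {key: len(indices) for key, indices in alternatives.items()}

print(parse_ambiguities('ambiguities.txt'))  # e.g. {'temps<n>': 2, 'llengua<n>': 2, ...}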
The script generate_sl_ambig_corpus.py generates the possible paths in the test corpus by replacing the tag with the tag plus the translation-equivalent number, and numbers the sentences for later recombination.
[1 ].[] ^L'/El<det><def><mf><sg>$ ^origen/origen<n><m><sg>$ ^de/de<pr>$ ^l'/el<det><def><mf><sg>$ ^àbac/àbac<n><m><sg>$ ^està/estar<vblex><pri><p3><sg>$ ^literalment/literalment<adv>$ ^perdut/perdre<vblex><pp><m><sg>$ ^en/en<pr>$ ^el/el<det><def><m><sg>$ ^temps/temps<n><:1><m><sp>$
[1 ].[] ^L'/El<det><def><mf><sg>$ ^origen/origen<n><m><sg>$ ^de/de<pr>$ ^l'/el<det><def><mf><sg>$ ^àbac/àbac<n><m><sg>$ ^està/estar<vblex><pri><p3><sg>$ ^literalment/literalment<adv>$ ^perdut/perdre<vblex><pp><m><sg>$ ^en/en<pr>$ ^el/el<det><def><m><sg>$ ^temps/temps<n><:0><m><sp>$
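The script itself is not reproduced here, so the following is only a sketch of the transformation it is described as performing: each sentence is copied once per translation alternative, a '<:N>' index is spliced in after the first tag of the ambiguous lexical unit, and the sentence number is prefixed for later recombination.

# Sketch of the path generation described above (not the actual
# generate_sl_ambig_corpus.py): one copy of the sentence per alternative,
# with '<:N>' inserted after the first tag of the ambiguous lexical unit.
import re

AMBIGUITIES = {'temps<n>': 2}   # from the ambiguity list, e.g. via parse_ambiguities()

def generate_paths(sentence, number):
    # split the sentence into lexical units and expand each ambiguous one
    paths = ['']
    for chunk in re.split(r'(\^[^$]+\$)', sentence):
        lemma = re.search(r'/([^<]+<[^<>]+>)', chunk) if chunk.startswith('^') else None
        if lemma and lemma.group(1) in AMBIGUITIES:
            variants = [chunk.replace(lemma.group(1), lemma.group(1) + '<:%d>' % i)
                        for i in range(AMBIGUITIES[lemma.group(1)])]
        else:
            variants = [chunk]
        paths = [p + v for p in paths for v in variants]
    return ['[%d ].[] %s' % (number, p) for p in paths]

for path in generate_paths('^el/el<det><def><m><sg>$ ^temps/temps<n><m><sp>$', 1):
    print(path)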
These are then translated with the rest of the Apertium pipeline:
[2 ].[] The origin of the abacus is literally lost in the weather
[2 ].[] The origin of the abacus is literally lost in the time
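The "later recombination" step is not spelled out above; one plausible sketch is to group the translated variants by their sentence number, score each with a target-language model, and keep the best-scoring one per sentence. The lm object is assumed to be something like the bigram model sketched earlier.

# Sketch: recombine translated variants by keeping the best-scoring one
# per sentence number. lm.logprob() is assumed (e.g. the bigram model above).
import re

def recombine(translated_lines, lm):
    best = {}
    for line in translated_lines:
        match = re.match(r'\[([0-9]+) \]\.\[\]\s*(.*)', line)
        number, text = int(match.group(1)), match.group(2)
        score = lm.logprob(text)
        if number not in best or score > best[number][0]:
            best[number] = (score, text)
    return [text for _, (score, text) in sorted(best.items())]

variants = ['[2 ].[] The origin of the abacus is literally lost in the weather',
            '[2 ].[] The origin of the abacus is literally lost in the time']
# print(recombine(variants, lm))   # lm: a target-language model with a logprob() method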