Apertium-neural

Apertium was originally developed to offer a free/open-source framework for building rule-based machine translation (RBMT) systems. It was modelled on existing systems, but targeted at related languages, trying to do one thing well.

What might an Apertium neural machine translation (NMT) system for lesser-resourced and marginalised languages look like?
Thoughts:
* Trains without a GPU or large amounts of compute
* Optimised for small corpora (under 100k parallel sentences)
* Includes linguistic tricks
* Written in C++, built with autotools
* Works with the existing Apertium tools (formatters, APY, etc.)
Pipeline(?):

<pre>
apertium-destxt | apertium-preprocess | apertium-encode | apertium-decode | apertium-retxt
</pre>
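
Of these, apertium-destxt and apertium-retxt are the existing deformatter and reformatter; apertium-preprocess, apertium-encode and apertium-decode would be new. As a rough illustration only, here is a minimal sketch of apertium-encode as a Unix filter; the one-token-per-line vocabulary format and the reserved id 0 for unknown tokens are assumptions for the sketch, not an existing spec:

<pre>
// Hypothetical apertium-encode: maps whitespace-separated tokens on
// stdin to integer ids on stdout, one sentence per line, so that a
// decoder stage can consume a purely numeric stream.
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <unordered_map>

int main(int argc, char** argv) {
  if (argc != 2) {
    std::cerr << "usage: apertium-encode VOCAB_FILE" << std::endl;
    return 1;
  }
  // Assumed vocabulary format: one token per line; line number = id.
  std::unordered_map<std::string, int> vocab;
  std::ifstream vf(argv[1]);
  std::string token;
  int next_id = 1;  // id 0 is reserved for unknown tokens (<unk>)
  while (std::getline(vf, token)) {
    vocab.emplace(token, next_id++);
  }
  std::string line;
  while (std::getline(std::cin, line)) {
    std::istringstream ss(line);
    bool first = true;
    while (ss >> token) {
      auto it = vocab.find(token);
      std::cout << (first ? "" : " ") << (it == vocab.end() ? 0 : it->second);
      first = false;
    }
    std::cout << '\n';
  }
  return 0;
}
</pre>

It would slot into the pipeline between apertium-preprocess and apertium-decode, keeping every stage a plain stdin-to-stdout filter in the usual Apertium style.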
Backend:

* Own-built(?)
* DyNet (forked?)
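
DyNet would plausibly fit the constraints above: it is a C++ library with dynamic computation graphs that performs well on CPU. A minimal sketch of CPU-only training with DyNet's C++ API (a toy regression step on one example, not a translation model; purely illustrative):

<pre>
// Toy DyNet training loop: fits a 2->8->1 network to a single example
// on CPU. Shows the dynamic-graph style: the graph is rebuilt per step.
#include <dynet/dynet.h>
#include <dynet/expr.h>
#include <dynet/training.h>
#include <iostream>
#include <vector>

int main(int argc, char** argv) {
  dynet::initialize(argc, argv);  // runs on CPU by default
  dynet::ParameterCollection model;
  dynet::SimpleSGDTrainer trainer(model);
  dynet::Parameter p_W = model.add_parameters({8, 2});
  dynet::Parameter p_b = model.add_parameters({8});
  dynet::Parameter p_V = model.add_parameters({1, 8});

  std::vector<dynet::real> x_vals = {1.f, -1.f};
  dynet::real y_val = 1.f;

  for (unsigned iter = 0; iter < 30; ++iter) {
    dynet::ComputationGraph cg;  // a fresh graph every iteration
    dynet::Expression W = dynet::parameter(cg, p_W);
    dynet::Expression b = dynet::parameter(cg, p_b);
    dynet::Expression V = dynet::parameter(cg, p_V);
    dynet::Expression x = dynet::input(cg, {2}, x_vals);
    dynet::Expression y = dynet::input(cg, y_val);
    dynet::Expression h = dynet::tanh(W * x + b);
    dynet::Expression loss = dynet::squared_distance(V * h, y);
    std::cout << "loss = " << dynet::as_scalar(cg.forward(loss)) << std::endl;
    cg.backward(loss);
    trainer.update();
  }
  return 0;
}
</pre>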