For sites that want the latest word additions to translators, but still don't want to update to a possibly-breaking SVN release, you need some kind of Continuous Integration (CI) system.
We currently have partial CI with Tino's nightly packages, which check the following criteria:
- the package and all its dependencies compile and install correctly
- "make test" runs – but only on a few select pairs. It is on Tino's ToDo to add for all released pairs, along with a set of varied sentences where translation output should be stable, but small enough that the test runs fast (a few minutes at most).
Idea for Wikimedia
However, even with the above criteria, it's quite possible that translation quality overall worsens.
One possible idea for Wikimedia's Content Translation installation would be to add the criteria that lexical coverage does not decrease, and that WER does not increase (too much), on their corpus of previous translations. So:
- take the latest nightly build (which passes Tino's existing tests),
- check on the CT dump that coverage is at least as good as with the previously used build,
- check that WER is at most 1 % (or some such threshold) worse than with the previously used build.
If these criteria pass, we move forward in SVN revisions to that build; otherwise we stay on the build we're on.
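The promotion decision above can be sketched in a few lines. This is a minimal illustration, not an actual implementation; the function name, arguments and the 1 % slack value are all assumptions for the sake of the example:

```python
# Hypothetical promotion gate for a nightly build, assuming coverage
# and WER have already been measured on the CT dump for both the
# currently deployed build and the candidate build (as fractions 0..1).
WER_SLACK = 0.01  # allow WER to be up to 1 percentage point worse


def should_promote(cur_coverage, cur_wer, cand_coverage, cand_wer,
                   wer_slack=WER_SLACK):
    """Return True if the candidate nightly may replace the current build."""
    if cand_coverage < cur_coverage:
        return False  # lexical coverage must not decrease
    if cand_wer > cur_wer + wer_slack:
        return False  # WER must not get (much) worse
    return True
```

If the gate returns False, the site simply stays on its current build and retries with the next nightly.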
An increased WER doesn't necessarily mean worse translations, but since we're only going forwards in SVN revisions, a failure to improve WER at one commit might be counteracted by a later commit. And if the developers make a real release, we always upgrade to that.
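For concreteness, WER here is word-level edit distance divided by reference length (in practice Apertium's apertium-eval-translator does this job); a minimal sketch:

```python
# Word Error Rate: word-level Levenshtein distance between a reference
# translation and a hypothesis, divided by the reference length.
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + sub)  # substitution
    return dp[len(ref)][len(hyp)] / len(ref)
```

With previous CT translations as the reference, a build whose output drifts further from them gets a higher WER, which is exactly what the 1 % check above guards against.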