Task ideas for Google Code-in/Apy pipedebug

==APY endpoint==

To start on this work, first look at what [[Apertium-viewer]] does.

Now have a look at the /stats endpoint, to see a very simple example of an APY endpoint.
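
For orientation, a very simple handler in the Tornado framework that APY is built on could look roughly like the sketch below. The class name and the numbers it returns are made up for illustration, not taken from APY's code:

<pre>
import tornado.ioloop
import tornado.web

class SimpleStatsHandler(tornado.web.RequestHandler):
    """A trivial GET endpoint in the spirit of /stats: no pipelines,
    just a JSON document written straight back to the client."""
    def get(self):
        # Tornado serialises a dict passed to write() as JSON.
        self.write({
            "responseData": {"uptimeSeconds": 1234, "requestsServed": 42},
            "responseDetails": None,
            "responseStatus": 200,
        })

if __name__ == "__main__":
    application = tornado.web.Application([(r"/stats", SimpleStatsHandler)])
    application.listen(2737)
    tornado.ioloop.IOLoop.current().start()
</pre>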

We already have an APY endpoint /translate that does full translation, but that part of the code is rather complex since it has to keep pipelines open between requests; our new /pipedebug endpoint should ''not'' reuse those pipelines, but open its own on every request.
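
To make the shape of the task concrete, here is a rough sketch of what such a handler could look like. Everything in it is hypothetical: PipeDebugHandler, lookup_pipeline_commands() and run_pipeline_steps() are made-up names (the last one is sketched at the end of this section), and error handling is left out.

<pre>
import tornado.web

class PipeDebugHandler(tornado.web.RequestHandler):
    """Hypothetical handler for /pipedebug: everything it needs is
    spawned fresh for this one request and discarded afterwards."""
    def get(self):
        l1, l2 = self.get_argument("langpair").split("|")
        to_translate = self.get_argument("q")
        # Look up the pipeline commands for the pair (e.g. with the help of
        # translate.py's parseModeFile()); lookup_pipeline_commands() is a
        # made-up helper standing in for that step.
        commands = lookup_pipeline_commands(l1, l2)
        # run_pipeline_steps() (sketched at the end of this section) runs
        # each command in its own short-lived process and records every
        # intermediate output.
        outputs = run_pipeline_steps(commands, to_translate)
        self.write({
            "responseData": {"output": outputs, "pipeline": commands},
            "responseDetails": None,
            "responseStatus": 200,
        })
</pre>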

Example call, where we've written a \0 in the q to signify a NUL:

<pre>
curl 'http://localhost:2737/pipedebug?langpair=isl|eng&q=te\0ost'
</pre>

Example output:

<pre>
{
    "responseDetails": null,
    "responseStatus": 200,
    "responseData": {
        "output": [
            "^te/te<n><m>/te<vblex><inf>$^./.<sent><clb>$[][\n]\0^ost/ose<vblex><pp>/ost<n><m>$^./.<sent><clb>$[][\n]\0",
            "^te<n><m>$^./.<sent><clb>$[][\n]\0^ost<n><m>$^./.<sent><clb>$[][\n]\0\0",
            "^tea<n>$^./.<sent><clb>$[][\n]\0^cheese<n>$^./.<sent><clb>$[][\n]\0\0\0",
            "^Tea<n>$^./.<sent><clb>$[][\n]\0^Cheese<n>$^./.<sent><clb>$[][\n]\0\0\0\0",
            "Tea.[][\n]\0^Cheese.[][\n]\0\0\0\0\0"
        ],
        "pipeline": [
            "lt-proc -z isl-eng.automorf.bin",
            "apertium-tagger -z -g isl-eng.prob",
            "lt-proc -z isl-eng.autobil.bin",
            "apertium-transfer -z apertium-isl-eng.t1x isl-eng.t1x.bin",
            "lt-proc -z -g isl-eng.autogen.bin"
        ]
    }
}
</pre>

You can use translate.py's parseModeFile() to grab the command lines, but you can't use startPipeline(), since we want to keep track of the output between each step.
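
For instance (just a sketch, and only one way to do it: the exact return shape of parseModeFile() may differ, and run_pipeline_steps() is the made-up helper name used in the handler sketch above), the stepwise running could be done with plain subprocess calls, one fresh short-lived process per command:

<pre>
import shlex
import subprocess

def run_pipeline_steps(commands, text):
    """Run each pipeline command as its own fresh process, feeding it the
    previous step's output and recording every intermediate result.

    `commands` is a list of command lines such as the "pipeline" list in
    the example output above, e.g. obtained via translate.py's
    parseModeFile()."""
    data = text.encode("utf-8")
    if not data.endswith(b"\0"):
        # The commands are run with -z, so a NUL makes them flush output.
        data += b"\0"
    outputs = []
    for cmd in commands:
        proc = subprocess.Popen(shlex.split(cmd),
                                stdin=subprocess.PIPE,
                                stdout=subprocess.PIPE)
        # communicate() writes the input, closes stdin and waits for the
        # process to exit, so nothing is kept open between requests.
        data, _ = proc.communicate(data)
        outputs.append(data.decode("utf-8"))
    return outputs
</pre>

Each element of the returned list then corresponds to one entry of the "output" array in the example above, and the command lines themselves can be returned as the "pipeline" field.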

==apertium-viewer.html==
[[Category:Tasks for Google Code-in|Apy pipedebug]] |