Hi,

there has been some work on this.

See Section 2.4 in
http://www.aclweb.org/anthology/W16-2347
for some references.

The main idea is to look for properties of machine translation: that it
tends to be very literal and may exhibit certain systematic errors.
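
A crude variant of the idea from the original mail could be sketched like
this: flag a submission when its n-gram overlap with what an MT system
produces for the same source is suspiciously high. This is just an
illustrative sketch in pure Python; `mt_output` is assumed to come from
whatever MT system you query, and the 0.8 threshold is an arbitrary
placeholder, not a tuned value:

```python
from collections import Counter

def ngram_counts(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def overlap_score(candidate, mt_output, max_n=4):
    """Average modified n-gram precision of `candidate` against `mt_output`.

    A score near 1.0 means the submission is nearly identical to the MT
    system's output for the same source, which is suspicious."""
    cand = candidate.lower().split()
    ref = mt_output.lower().split()
    precisions = []
    for n in range(1, max_n + 1):
        c = ngram_counts(cand, n)
        r = ngram_counts(ref, n)
        total = sum(c.values())
        if total == 0:
            continue  # candidate shorter than n tokens
        matched = sum(min(count, r[gram]) for gram, count in c.items())
        precisions.append(matched / total)
    return sum(precisions) / len(precisions) if precisions else 0.0

def flag_as_machine(candidate, mt_output, threshold=0.8):
    """Flag submissions that overlap too heavily with the MT output."""
    return overlap_score(candidate, mt_output) >= threshold
```

In practice you would want a proper BLEU implementation, multiple MT
systems, and a threshold chosen on labeled examples, but the shape of the
check is the same.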

This is a hard problem, though.

-phi

On Thu, Apr 5, 2018 at 10:14 PM, Ryan Coughlin <rya...@sis.edu> wrote:

> Hi all,
>
>   Hope you all had a happy Easter and are having a good spring (or autumn).
>
>   One of the admins from Open Subtitles asked me if I knew of a way to
> detect whether a translation was machine or human translation. The
> website seems to have a lot of submissions that are simply Google
> translated, and they're looking for a script to flag these submissions. Do
> you have any ideas how they might go about this?
>
>   I was thinking of running the source text through a few of the popular
> SMT systems and seeing whether the BLEU scores came out too similar, but
> that method seems too unreliable and computationally heavy. I'd appreciate
> any thoughts.
>
> thank you,
> Ryan
>
> _______________________________________________
> Moses-support mailing list
> Moses-support@mit.edu
> http://mailman.mit.edu/mailman/listinfo/moses-support
>
>