Meaningful Pose-Based Sign Language Evaluation

Zifan Jiang, Colin Leong, Amit Moryossef, Anne Göhring, Annette Rios, Oliver Cory, Maksym Ivashechkin, Neha Tarigopula, Biao Zhang, Rico Sennrich, Sarah Ebling

arXiv.org Artificial Intelligence

We present a comprehensive study on meaningfully evaluating sign language utterances represented as human skeletal poses. The study covers keypoint distance-based, embedding-based, and back-translation-based metrics. We show trade-offs between the metrics in different scenarios through an automatic meta-evaluation of sign-level retrieval and a human correlation study of text-to-pose translation across several sign languages. Our findings and the open-source pose-evaluation toolkit provide a practical and reproducible way to develop and evaluate sign language translation and generation systems.
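As a rough illustration of the simplest metric family mentioned above, a keypoint distance-based score can be computed as the mean Euclidean distance between corresponding keypoints of two pose sequences. The sketch below is an assumption for illustration only, not the toolkit's actual API; the function name `mean_keypoint_distance` and the fixed-length alignment are hypothetical simplifications (real systems typically need temporal alignment, e.g. via dynamic time warping, and handling of missing keypoints).

```python
import numpy as np

def mean_keypoint_distance(hyp: np.ndarray, ref: np.ndarray) -> float:
    """Mean Euclidean distance between corresponding keypoints.

    hyp, ref: arrays of shape (frames, keypoints, dims), assumed
    already temporally aligned. Lower is better; 0 means identical.
    """
    assert hyp.shape == ref.shape, "sequences must be aligned to equal length"
    # Norm over the coordinate axis gives per-keypoint distances,
    # then average over all frames and keypoints.
    return float(np.linalg.norm(hyp - ref, axis=-1).mean())

# Toy example: two 2-frame, 3-keypoint, 2-D pose sequences,
# with every hypothesis keypoint offset by (1, 1) from the reference.
ref = np.zeros((2, 3, 2))
hyp = np.ones((2, 3, 2))
print(mean_keypoint_distance(hyp, ref))  # sqrt(2) per keypoint on average
```

Embedding-based and back-translation-based metrics replace the raw coordinate comparison with distances in a learned embedding space, or with text-level scores on a back-translated transcript, respectively.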