TrInk: Ink Generation with Transformer Network
Zezhong Jin, Shubhang Desai, Xu Chen, Biyi Fang, Zhuoyi Huang, Zhe Li, Chong-Xin Gan, Xiao Tu, Man-Wai Mak, Yan Lu, Shujie Liu
arXiv.org Artificial Intelligence
In this paper, we propose TrInk, a Transformer-based model for ink generation, which effectively captures global dependencies. To better facilitate the alignment between the input text and the generated stroke points, we introduce scaled positional embeddings and a Gaussian memory mask in the cross-attention module. Additionally, we design both subjective and objective evaluation pipelines to comprehensively assess the legibility and style consistency of the generated handwriting. Experiments demonstrate that our Transformer-based model achieves a 35.56% reduction in character error rate (CER) and a 29.66% reduction in word error rate (WER) on the IAM-OnDB dataset compared to previous methods. We provide a demo page with handwriting samples from TrInk and baseline models at: https://akahello-a11y.github.io/trink-demo/
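The abstract does not give the exact formulation of the Gaussian memory mask, but the general idea of biasing cross-attention so that each generated stroke step focuses on text positions near its proportional location can be sketched as follows. The linear alignment of query steps to key positions and the `sigma` width are illustrative assumptions, not the paper's stated method:

```python
import numpy as np

def gaussian_memory_mask(num_queries, num_keys, sigma=2.0):
    # Hypothetical sketch: additive log-space bias for cross-attention.
    # Each stroke (query) step t is softly aligned to a text (key)
    # position via a simple linear mapping; sigma controls how sharply
    # attention concentrates around that position. Both choices are
    # illustrative assumptions, not details from the abstract.
    q = np.arange(num_queries)[:, None]   # stroke (query) steps
    k = np.arange(num_keys)[None, :]      # text (key) positions
    center = q * (num_keys - 1) / max(num_queries - 1, 1)
    # 0 at the aligned position, increasingly negative away from it.
    return -((k - center) ** 2) / (2.0 * sigma ** 2)

def masked_attention(scores, mask):
    # Add the Gaussian bias to raw attention scores, then softmax
    # over key positions.
    logits = scores + mask
    logits -= logits.max(axis=-1, keepdims=True)  # numerical stability
    w = np.exp(logits)
    return w / w.sum(axis=-1, keepdims=True)

# Example: 6 stroke steps attending over 4 text tokens.
mask = gaussian_memory_mask(6, 4, sigma=1.0)
weights = masked_attention(np.zeros((6, 4)), mask)
```

With uniform raw scores, the first stroke step attends mostly to the first token and the last step to the last token, giving the roughly monotonic text-to-ink alignment the mask is meant to encourage.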
Sep-1-2025