TGEA 2.0 Supplementary Materials: Appendix A
Neural Information Processing Systems
Table 2: The number of erroneous texts generated with different decoding strategies.

Figure 2: The distribution of MiSEWs over the number of tokens contained in each MiSEW.

We fine-tuned several commonly used Chinese PLMs as baselines. All models have 12 attention heads and a hidden size of 768. We trained these models on 8 Tesla P100 GPUs, each with 16 GB of memory.
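The baseline configuration above (12 attention heads, hidden size 768, i.e. a BERT-base-scale model) can be captured in a small config object. This is a minimal illustrative sketch, not the authors' code; the class and field names are assumptions:

```python
from dataclasses import dataclass


@dataclass
class BaselineConfig:
    """Shared architecture of the fine-tuned Chinese PLM baselines
    (BERT-base scale), as stated in the appendix."""
    num_attention_heads: int = 12
    hidden_size: int = 768
    num_gpus: int = 8        # Tesla P100
    gpu_memory_gb: int = 16  # per GPU

    @property
    def head_dim(self) -> int:
        # Each attention head operates on hidden_size / num_heads dimensions.
        return self.hidden_size // self.num_attention_heads


cfg = BaselineConfig()
print(cfg.head_dim)  # 768 / 12 = 64
```

With hidden size 768 and 12 heads, each head attends over a 64-dimensional subspace, the standard BERT-base split.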