Improving Neural Diarization through Speaker Attribute Attractors and Local Dependency Modeling
David Palzer, Matthew Maciejewski, Eric Fosler-Lussier
–arXiv.org Artificial Intelligence
ABSTRACT In recent years, end-to-end approaches have made notable progress in addressing the challenge of speaker diarization, which involves segmenting and identifying speakers in multi-talker recordings. One such approach, Encoder-Decoder Attractors (EDA), has been proposed to handle variable speaker counts and to better guide the network during training. In this study, we extend the attractor paradigm by moving beyond direct speaker modeling and instead focus on representing more detailed 'speaker attributes' through a multistage process of intermediate representations. Additionally, we enhance the architecture by replacing transformers with conformers, a convolution-augmented transformer, to model local dependencies. Experiments demonstrate improved diarization performance on the CALLHOME dataset.
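The abstract's key architectural change is swapping transformer blocks for conformer blocks, which pair self-attention (global context) with a convolution module (local context). The sketch below is a minimal, NumPy-only illustration of that idea, not the authors' implementation: the block, the single-head attention, and the depthwise convolution (including its kernel shape) are hypothetical simplifications, with learned projections and normalization layers omitted.

```python
import numpy as np

def depthwise_conv1d(x, kernel):
    """Depthwise 1-D convolution over time with 'same' padding.
    x: (T, D) frame features; kernel: (K, D), one filter per channel.
    This is the conformer's convolution module, which captures
    local (neighboring-frame) dependencies."""
    T, D = x.shape
    K = kernel.shape[0]
    pad = K // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    out = np.zeros_like(x)
    for t in range(T):
        out[t] = np.sum(xp[t:t + K] * kernel, axis=0)
    return out

def self_attention(x):
    """Single-head scaled dot-product self-attention with the query/key/value
    projections omitted for brevity; models global dependencies."""
    d = x.shape[1]
    scores = x @ x.T / np.sqrt(d)
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ x

def conformer_block(x, kernel):
    """Simplified conformer block: residual attention for global context,
    then a residual depthwise convolution for local context."""
    x = x + self_attention(x)            # global dependencies
    x = x + depthwise_conv1d(x, kernel)  # local dependencies
    return x
```

A plain transformer block would stop after the attention step; the added convolution is what lets the model attend to fine-grained local structure in the frame sequence.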
Jun-9-2025