Multilingual Machine Translation with Quantum Encoder Decoder Attention-based Convolutional Variational Circuits

Subrit Dikshit, Ritu Tiwari, Priyank Jain

arXiv.org Artificial Intelligence 

In the 2000s, artificial intelligence and deep learning-based systems became prevalent and took the world by storm. Many modern multilingual state-of-the-art [1] networks and cloud-based translation services, such as Google Translate, Microsoft Translator, ChatGPT [2], and DeepSeek [3], emerged and became available during this era. These multilingual large language networks are architected around Gated Recurrent Units (GRU) [4], Long Short-Term Memory (LSTM) [5], Bidirectional Encoder Representations from Transformers (BERT) [6], Generative Pre-trained Transformer (GPT) [7], Text-to-Text Transfer Transformer (T5) [8], and similar attention-based transformer [9] networks with finer and improved architectural amendments. While most academicians, researchers, and organisations focused on these classical computing aspects, less emphasis was placed on multilingual machine translation in the quantum computing realm. Some practitioners and scholars who emphasised quantum computing for machine translation, and their associated works, are discussed in the Related Works section later. However, these studies under-utilized simulation and execution on quantum computing hardware, and under-exploited the novel concepts of quantum convolution [10], quantum pooling [11], quantum variational circuits [12], and quantum attention [13] as quantum-based software amendments; these shortcomings are studied, demonstrated, and addressed in the QEDACVC system.
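To make these quantum building blocks concrete, the following is a minimal sketch, assuming PennyLane's default.qubit simulator, of a variational quantum circuit combining a convolution-like layer of parameterised two-qubit blocks with a pooling-like reduction that measures only a subset of wires. The wiring pattern, parameter counts, and embedding choice are illustrative assumptions, not the QEDACVC architecture itself.

```python
import pennylane as qml
import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def conv_pool_circuit(inputs, weights):
    # Angle-encode classical features onto the qubits (illustrative embedding)
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    # Convolution-like layer: parameterised rotations plus entangling gates
    # applied to neighbouring wire pairs
    for i in range(0, n_qubits - 1, 2):
        qml.RY(weights[i], wires=i)
        qml.RY(weights[i + 1], wires=i + 1)
        qml.CNOT(wires=[i, i + 1])
    # Pooling-like step: entangle each pair, then measure only one wire per pair,
    # halving the number of output values
    qml.CNOT(wires=[1, 0])
    qml.CNOT(wires=[3, 2])
    return [qml.expval(qml.PauliZ(w)) for w in (0, 2)]

# Example evaluation with random features and trainable parameters
inputs = np.random.uniform(0, np.pi, n_qubits)
weights = np.random.uniform(0, 2 * np.pi, n_qubits)
print(conv_pool_circuit(inputs, weights))
```

In a full variational setting, the `weights` would be optimised against a task loss; here the circuit is evaluated once only to show the convolution-then-pooling structure the cited works build upon.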