
Collaborating Authors

 Dao, Dung


Multi-Agent Collaboration Mechanisms: A Survey of LLMs

arXiv.org Artificial Intelligence

With recent advances in Large Language Models (LLMs), Agentic AI has gained significant traction in real-world applications, moving toward systems of multiple LLM-based agents that perceive, learn, reason, and act collaboratively. These LLM-based Multi-Agent Systems (MASs) enable groups of intelligent agents to coordinate and solve complex tasks collectively at scale, transitioning from isolated models to collaboration-centric approaches. This work provides an extensive survey of the collaborative aspect of MASs and introduces an extensible framework to guide future research. Our framework characterizes collaboration mechanisms along key dimensions: actors (agents involved), types (e.g., cooperation, competition, or coopetition), structures (e.g., peer-to-peer, centralized, or distributed), strategies (e.g., role-based or model-based), and coordination protocols. Through a review of existing methodologies, our findings serve as a foundation for demystifying and advancing LLM-based MASs toward more intelligent and collaborative solutions for complex, real-world use cases. In addition, various applications of MASs across diverse domains, including 5G/6G networks, Industry 5.0, question answering, and social and cultural settings, are investigated, demonstrating their wider adoption and broader impact. Finally, we identify key lessons learned, open challenges, and potential research directions for MASs toward artificial collective intelligence.
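The framework's dimensions (actors, types, structures, strategies, and coordination protocols) lend themselves to a simple data model. The sketch below is an illustrative Python rendering of those dimensions, not the paper's formal taxonomy; all class, field, and example names are assumptions made here for clarity.

```python
from dataclasses import dataclass
from enum import Enum
from typing import List


class CollaborationType(Enum):
    # Collaboration "types" named in the survey abstract.
    COOPERATION = "cooperation"
    COMPETITION = "competition"
    COOPETITION = "coopetition"


class Structure(Enum):
    # Collaboration "structures" named in the survey abstract.
    PEER_TO_PEER = "peer-to-peer"
    CENTRALIZED = "centralized"
    DISTRIBUTED = "distributed"


@dataclass
class Agent:
    name: str
    role: str    # e.g., "planner", "critic", "executor"
    model: str   # backing LLM, e.g., "llama-2-7b"


@dataclass
class CollaborationMechanism:
    """One collaboration mechanism characterized along the survey's dimensions."""
    actors: List[Agent]              # agents involved
    type: CollaborationType          # cooperation / competition / coopetition
    structure: Structure             # peer-to-peer / centralized / distributed
    strategy: str                    # e.g., "role-based" or "model-based"
    coordination_protocol: str       # e.g., "round-robin debate", "blackboard"


# Hypothetical example: a centralized, role-based question-answering team.
qa_team = CollaborationMechanism(
    actors=[
        Agent("orchestrator", "planner", "gpt-4"),
        Agent("retriever", "researcher", "llama-2-7b"),
        Agent("writer", "answer-composer", "llama-2-7b"),
    ],
    type=CollaborationType.COOPERATION,
    structure=Structure.CENTRALIZED,
    strategy="role-based",
    coordination_protocol="planner delegates subtasks and aggregates results",
)
```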


VinaLLaMA: LLaMA-based Vietnamese Foundation Model

arXiv.org Artificial Intelligence

The surge in Large Language Models (LLMs) such as ChatGPT and GPT-4 has significantly advanced the field of artificial intelligence (AI), particularly in language processing. In 2023, Vietnam's AI sector witnessed a notable development with the introduction of several Vietnamese-centric LLMs, including BLOOMZ's Vietcuna, URA-LLaMA, PhoGPT, and dama-2. Amidst this progression, we introduce VinaLLaMA, a foundational LLM designed specifically for the Vietnamese language. VinaLLaMA, built on top of LLaMA-2, represents a vital stride towards linguistic inclusivity in AI, adeptly addressing the syntactic and semantic intricacies of Vietnamese. Embracing the spirit of collaboration and open innovation, we are pleased to announce VinaLLaMA, an open-weight Foundation Language Model, together with its chat variant. These models are now accessible on HuggingFace, ensuring compatibility with all libraries that support the 'transformers' framework. This endeavor not only contributes to the global AI research landscape but also provides a specialized tool for exploring and enhancing Vietnamese language processing, encouraging wider engagement and application in AI-driven NLP research.
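Since the abstract states that the weights are released on HuggingFace and are compatible with the 'transformers' framework, loading the model would follow the standard transformers pattern. The sketch below assumes a placeholder repository id (`vilm/vinallama-7b-chat`), which should be replaced with the actual model id published by the authors.

```python
# Minimal sketch for loading an open-weight model with the `transformers` library.
# The repository id below is an assumed placeholder -- verify the actual
# VinaLLaMA repo id on HuggingFace before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "vilm/vinallama-7b-chat"  # assumed repo id, not confirmed by the abstract

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# "Hello! Can you introduce Vietnam?"
prompt = "Xin chào! Bạn có thể giới thiệu về Việt Nam không?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```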