From Tools to Teammates: Evaluating LLMs in Multi-Session Coding Interactions

Nathanaël Carraz Rakotonirina, Mohammed Hamdy, Jon Ander Campos, Lucas Weber, Alberto Testoni, Marzieh Fadaee, Sandro Pezzelle, Marco Del Tredici

arXiv.org Artificial Intelligence

Large Language Models (LLMs) are increasingly used in working environments for a wide range of tasks, excelling at solving individual problems in isolation. However, are they also able to effectively collaborate over long-term interactions? To investigate this, we introduce MemoryCode, a synthetic multi-session dataset designed to test LLMs' ability to track and execute simple coding instructions amid irrelevant information, simulating a realistic setting. While all the models we tested handle isolated instructions well, even the performance of state-of-the-art models like GPT-4o deteriorates when instructions are spread across sessions. Our analysis suggests this is due to their failure to retrieve and integrate information over long instruction chains. Our results highlight a fundamental limitation of current LLMs, restricting their ability to collaborate effectively in long interactions.
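To make the kind of evaluation the abstract describes more concrete, here is a minimal Python sketch of a multi-session instruction-tracking check. Everything in it is illustrative: the session contents, the prefix-style instructions, the compliance checker, and the call_model stub are assumptions for the sake of the example, not the actual MemoryCode schema or evaluation harness.

import re

# Toy multi-session transcript in the spirit of the paper's setup:
# simple coding instructions interleaved with irrelevant chatter,
# where a later instruction supersedes an earlier one.
SESSIONS = [
    # Session 1: a relevant instruction buried among small talk.
    ["Morning! How was the team offsite?",
     "From now on, every function you write must start with 'db_'."],
    # Session 2: purely irrelevant filler.
    ["Can you forward me the slides from yesterday's demo?"],
    # Session 3: an update that overrides the Session 1 instruction.
    ["Quick update: prefix functions with 'api_' instead of 'db_'."],
]

FINAL_TASK = "Write a function that returns the square of a number."

def build_prompt(sessions, task):
    """Flatten the session history into a single prompt string."""
    parts = []
    for i, session in enumerate(sessions, start=1):
        parts.append(f"--- Session {i} ---")
        parts.extend(session)
    parts.append(f"--- Final session ---\n{task}")
    return "\n".join(parts)

def follows_latest_instruction(code: str) -> bool:
    """Check that all generated function names obey the most recent rule."""
    names = re.findall(r"def\s+(\w+)", code)
    return bool(names) and all(n.startswith("api_") for n in names)

def call_model(prompt: str) -> str:
    # Stand-in for whatever LLM API you use; returns a canned answer here.
    return "def api_square(x):\n    return x * x\n"

output = call_model(build_prompt(SESSIONS, FINAL_TASK))
print("follows latest instruction:", follows_latest_instruction(output))

The interesting failure mode the paper reports would show up here as a model that applies the stale 'db_' prefix, or no prefix at all, once enough sessions separate the instruction from the final task.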


Deep Learning with R for Beginners: Design neural network models in R 3.5 using TensorFlow, Keras, and MXNet, by Mark Hodnett, Joshua F. Wiley, Yuxi (Hayden) Liu, and Pablo Maldonado (Amazon.com, ISBN 9781838642709)

#artificialintelligence

Yuxi (Hayden) Liu is a machine learning software engineer at Google. He previously worked as a machine learning scientist in a variety of data-driven domains, applying his ML expertise to computational advertising, marketing, and cybersecurity. He now develops and improves machine learning models and systems for ads optimization on the world's largest search engine. He is the author of a series of machine learning books and an education enthusiast. His first book, the first edition of Python Machine Learning by Example, was a #1 bestseller on Amazon in 2017 and 2018 and has been translated into many languages.


Researchers show how to make a 'computer' out of liquid crystals

#artificialintelligence

Researchers at the University of Chicago Pritzker School of Molecular Engineering have shown for the first time how to design the basic elements needed for logic operations using a kind of material called a liquid crystal, paving the way for a completely novel way of performing computations. The results, published Feb. 23 in Science Advances, are unlikely to yield transistors or computers right away, but the technique could point the way toward devices with new functions in sensing, computing, and robotics.

"We showed you can create the elementary building blocks of a circuit -- gates, amplifiers, and conductors -- which means you should be able to assemble them into arrangements capable of performing more complex operations," said Juan de Pablo, the Liew Family Professor in Molecular Engineering, senior scientist at Argonne National Laboratory, and senior corresponding author on the paper.

The research took a closer look at liquid crystals. The molecules in a liquid crystal tend to be elongated, and when packed together they adopt a structure with some order, like the straight rows of atoms in a diamond crystal. But instead of being stuck in place as in a solid, this structure can also shift around, as a liquid does.