
The Good Robot podcast: what makes a drone "good"? with Beryl Pong

AIHub

Hosted by Eleanor Drage and Kerry McInerney, The Good Robot is a podcast that explores the many complex intersections between gender, feminism, and technology. What makes a drone "good"? In this episode, we talk to Beryl Pong, UKRI Future Leaders Fellow at the University of Cambridge, where she leads the Centre for Drones and Culture. Beryl reflects on what it means to think of drones as "good" or "ethical" technologies, and how this can be assessed through their socio-political context.



Chihuahua, boxer, and 10 other dog breeds at risk of breathing troubles

Popular Science

The new study of almost 900 dogs aims to help owners pinpoint breathing issues. Despite their popularity, short-skulled (or brachycephalic) dogs like the French bulldog, prized for their seemingly helpless-looking eyes and flat faces, often have serious difficulty breathing. A study published today found that in 12 breeds, a flat face, collapsing nostrils, and a rounded physique put dogs at higher risk of developing common breathing conditions. Pekingese and Japanese chins were found to be at the highest risk.






SLM: A Smoothed First-Order Lagrangian Method for Structured Constrained Nonconvex Optimization

Neural Information Processing Systems

Functional constrained optimization (FCO) has emerged as a powerful tool for solving various machine learning problems. However, with the rapid increase in applications of neural networks in recent years, it has become apparent that both the objective and constraints often involve nonconvex functions, which poses significant challenges in obtaining high-quality solutions. In this work, we focus on a class of nonconvex FCO problems with nonconvex constraints, where the two optimization variables are nonlinearly coupled in the inequality constraint. Leveraging the primal-dual optimization framework, we propose a smoothed first-order Lagrangian method (SLM) for solving this class of problems. We establish theoretical convergence guarantees for SLM to Karush-Kuhn-Tucker (KKT) solutions by quantifying dual error bounds. By establishing connections between this structured FCO and equilibrium-constrained nonconvex problems (also known as bilevel optimization), we apply the proposed SLM to bilevel optimization problems where the lower-level problem is nonconvex. Numerical results on both toy examples and hyper-data cleaning problems demonstrate the superiority of SLM over benchmark methods.
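To make the primal-dual framework concrete, the sketch below runs a generic smoothed first-order Lagrangian loop on a toy inequality-constrained problem. This is not the paper's SLM algorithm; it is a hypothetical illustration of the common ingredients such methods share: a primal gradient step on a Lagrangian augmented with a penalty and a proximal smoothing term, a projected dual ascent step on the multiplier, and a slowly averaged smoothing anchor. All names, step sizes, and the toy problem itself are assumptions for illustration.

```python
import numpy as np

# Toy problem (illustrative, not from the paper):
#   minimize  f(x) = ||x - a||^2   subject to  c(x) = ||x||^2 - 1 <= 0
# The constrained optimum is the unit-ball point closest to a = (2, 0),
# i.e. x* = (1, 0) with multiplier lambda* = 1.
a = np.array([2.0, 0.0])

def f(x): return np.sum((x - a) ** 2)
def grad_f(x): return 2.0 * (x - a)
def c(x): return np.sum(x ** 2) - 1.0
def grad_c(x): return 2.0 * x

x = np.zeros(2)        # primal variable
lam = 0.0              # dual multiplier for the inequality constraint
z = x.copy()           # smoothing (proximal anchor) variable
eta = 0.05             # primal/dual step size (assumed)
rho = 1.0              # augmented-penalty weight (assumed)
p = 1.0                # proximal smoothing weight (assumed)
beta = 0.9             # averaging rate for the smoothing variable (assumed)

for _ in range(2000):
    # Gradient in x of the smoothed augmented Lagrangian
    #   L(x, lam; z) = f(x) + lam*c(x) + (rho/2)*max(c(x), 0)^2 + (p/2)*||x - z||^2
    viol = max(c(x), 0.0)
    gx = grad_f(x) + (lam + rho * viol) * grad_c(x) + p * (x - z)
    x = x - eta * gx
    # Projected dual ascent: keep the multiplier nonnegative
    lam = max(lam + eta * c(x), 0.0)
    # Slow averaging of the smoothing anchor toward the current iterate
    z = beta * z + (1.0 - beta) * x

# x should approach the KKT point (1, 0) of the toy problem
```

The proximal term `p/2 * ||x - z||^2` with a slowly moving anchor `z` is what "smoothed" refers to in this family of methods: it damps the primal iterates so that single gradient steps on a nonconvex Lagrangian remain stable, which is the structural difficulty the abstract highlights when both objective and constraints are nonconvex.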



Data Quality in Imitation Learning

Neural Information Processing Systems

In supervised learning, the question of data quality and curation has been overshadowed in recent years by increasingly powerful and expressive models that can ingest internet-scale data.