

A DOGE AI Tool Called SweetREX Is Coming to Slash US Government Regulation

WIRED

Efforts to gut regulation across the US government using AI are well underway. On Wednesday, the Office of the Chief Information Officer at the Office of Management and Budget hosted a video call to discuss an AI tool being used to cut federal regulations, which the office called SweetREX Deregulation AI. The tool, which is still being developed, is built to identify sections of regulations that aren't required by statute, then expedite the process for adopting updated regulations. The development and rollout of what is being formally called the SweetREX Deregulation AI Plan Builder, or SweetREX DAIP, is meant to help achieve the goals laid out in President Donald Trump's "Unleashing Prosperity Through Deregulation" executive order, which aims to "promote prudent financial management and alleviate unnecessary regulatory burdens." Industrial-scale deregulation is a core aim laid out in Project 2025, the document that has served as a playbook for the second Trump administration.


DOGE Put a College Student in Charge of Using AI to Rewrite Regulations

WIRED

A young man with no government experience who has yet to complete his undergraduate degree is working for Elon Musk's so-called Department of Government Efficiency (DOGE) at the Department of Housing and Urban Development (HUD) and has been tasked with using artificial intelligence to rewrite the agency's rules and regulations. Christopher Sweet was introduced to HUD employees as being originally from San Francisco and most recently a third-year at the University of Chicago, where he was studying economics and data science, in an email sent to staffers earlier this month. "I'd like to share with you that Chris Sweet has joined the HUD DOGE team with the title of special assistant, although a better title might be 'AI computer programming quant analyst,'" Scott Langmack, a DOGE staffer and chief operating officer of an AI real estate company, wrote in an email widely shared within the agency and reviewed by WIRED. "With family roots from Brazil, Chris speaks Portuguese fluently. Please join me in welcoming Chris to HUD!" Sweet's primary role appears to be leading an effort to leverage artificial intelligence to review HUD's regulations, compare them to the laws on which they are based, and identify areas where rules can be relaxed or removed altogether.


Mind the Uncertainty in Human Disagreement: Evaluating Discrepancies between Model Predictions and Human Responses in VQA

Lan, Jian, Frassinelli, Diego, Plank, Barbara

arXiv.org Artificial Intelligence

Large vision-language models frequently struggle to accurately predict responses provided by multiple human annotators, particularly when those responses exhibit human uncertainty. In this study, we focus on the Visual Question Answering (VQA) task, and we comprehensively evaluate how well the state-of-the-art vision-language models correlate with the distribution of human responses. To do so, we categorize our samples based on their levels (low, medium, high) of human uncertainty in disagreement (HUD) and employ not only accuracy but also three new human-correlated metrics in VQA, to investigate the impact of HUD. To better align models with humans, we also verify the effect of common calibration and human calibration. Our results show that even BEiT3, currently the best model for this task, struggles to capture the multi-label distribution inherent in diverse human responses. Additionally, we observe that the commonly used accuracy-oriented calibration technique adversely affects BEiT3's ability to capture HUD, further widening the gap between model predictions and human distributions. In contrast, we show the benefits of calibrating models towards human distributions for VQA, better aligning model confidence with human uncertainty. Our findings highlight that for VQA, the consistent alignment between human responses and model predictions is understudied and should become the next crucial target of future studies.


The Road Ahead for Augmented Reality

Communications of the ACM

Automotive head-up displays (HUDs), systems that transparently project critical vehicle information into the driver's field of vision, were developed originally for military aviation use, with the origin of the name stemming from a pilot being able to view information with his or her head positioned "up" and looking forward, rather than positioned "down" to look at the cockpit gauges and instruments. The HUD projects and superimposes data in the pilot's natural field of view (FOV), providing the added benefit of eliminating the pilot's need to refocus when switching between the outside view and the instruments, which can impact reaction time, efficiency, and safety, particularly in combat situations. In cars, the main concern is distracted driving, or the act of taking the driver's attention away from the road. According to the National Highway Transportation Safety Administration, distracted driving claimed 3,142 lives in 2019, the most recent year for which statistics have been published. Looking away from the road for even five seconds at a speed of 55 mph is the equivalent of driving the length of a football field with one's eyes closed.


Building Trust in Autonomous Vehicles: Role of Virtual Reality Driving Simulators in HMI Design

Morra, Lia, Lamberti, Fabrizio, Pratticó, F. Gabriele, La Rosa, Salvatore, Montuschi, Paolo

arXiv.org Artificial Intelligence

The investigation of the factors that lead humans to trust Autonomous Vehicles (AVs) will play a fundamental role in the adoption of such technology. The user's ability to form a mental model of the AV, which is crucial to establishing trust, depends on effective user-vehicle communication; thus, the importance of Human-Machine Interaction (HMI) is poised to increase. In this work, we propose a methodology to validate the user experience in AVs based on continuous, objective information gathered from physiological signals, while the user is immersed in a Virtual Reality-based driving simulation. We applied this methodology to the design of a head-up display interface delivering visual cues about the vehicle's sensory and planning systems. Through this approach, we obtained qualitative and quantitative evidence that a complete picture of the vehicle's surroundings, despite the higher cognitive load, is conducive to a less stressful experience. Moreover, after having been exposed to a more informative interface, users involved in the study were also more willing to test a real AV. The proposed methodology could be extended by adjusting the simulation environment, the HMI, and/or the vehicle's Artificial Intelligence modules to explore other aspects of the user experience.


Researchers bring gaming to autonomous vehicles

#artificialintelligence

Researchers have designed multiplayer games that occupants of autonomous vehicles can play with other players in nearby self-driving cars. A new study, led by researchers from the University of Waterloo, details three games created for level-three and higher semi-autonomous vehicles. The researchers also made suggestions for many exciting types of in-car games for future exploration. Level-three and higher semi-autonomous vehicles are those that have, at minimum, environmental detection capabilities and can make informed decisions for themselves. "As autonomous vehicles start to replace conventional vehicles, occupants will have much more free time than they used to," said Matthew Lakier, a PhD student in Waterloo's School of Computer Science.


The Dawn of Life in a $5 Toaster Oven - Issue 68: Context

Nautilus

God might just as well have begun with a toaster oven. A few years ago at a yard sale, Nicholas Hud spotted a good candidate: A vintage General Electric model, chrome-plated with wood-grain panels, nestled in an old yellowed box, practically unused. The perfect appliance for cooking up the chemical precursors of life, he thought. He bought it for $5. At home in his basement, with the help of his college-age son, he cut a rectangular hole in the oven's backside, through which an automated sliding table (recycled from an old document scanner) could move a tray of experiments in and out. He then attached a syringe pump to some inkjet printer parts, and rigged the system to periodically drip water onto the tray.