
Viome Full Body Intelligence Test Review: Little Clarity, Pricey Supplements

WIRED

Virtually every aspect of your health can be traced back to your microbiome, but some tests are better than others. I admit it: I'm a sucker for metrics. Fitness trackers that keep tabs on my steps and sleep? A DEXA scan to give me too much information about my body composition? At least some of the recipes look tasty.


Best Portable Blenders of 2026: Ninja, Nutribullet, Beast

WIRED

A cordless, portable blender was barely possible a few years ago, but battery tech keeps getting better, and two cordless blenders are now ahead of the pack. The best portable blender I've tested, the Ninja Blast Max ($100), is fully able to make a six-pack of crushed-ice margaritas at your next picnic or blend up a berry-filled protein shake at the gym without breaking much of a sweat. Meanwhile, the ingeniously designed Nutribullet Flip ($115) offers more torque than previous-generation blenders, plus enough insulation to keep ice frozen until it's time for lunch (or even dinner).


Samsung Bespoke Fridge with AI review: All the bells and whistles

Engadget

While Samsung's AI Vision and food tracking are a work in progress, they can still be genuinely useful. At their core, refrigerators are relatively simple devices. If you're the type of person to view every extra feature as a component that could potentially go wrong, basic iceboxes are probably the kind you go for. But for those on the other end of the spectrum, Samsung's latest Bespoke Refrigerators with AI inside have more bells and whistles than you might think possible -- including an optional 32-inch screen. The model we tested for this review came out in the second half of 2025 and will continue to be on sale throughout 2026. The hardware will remain the same; the only changes will come in the form of an OTA software update slated for later this year that will add support for Google Gemini, improved food recognition and labeling, and more.


Why is okra so slimy? Blame the mucilage.

Popular Science

Okra gets its slime from a substance called mucilage, and the plant's signature goo helps it thrive in the heat. Okra is one of those vegetables with a polarizing reputation.


Chef 'not embarrassed' by one-star hygiene rating at Michelin-starred restaurant

BBC News

The chef behind Wales' only two-Michelin-star restaurant has said he is not embarrassed after it was awarded a one-star hygiene rating. Ynyshir Restaurant and Rooms, near Machynlleth in Ceredigion, which charges nearly £500 per head, received the rating after a visit by food safety officers on 5 November. According to the Food Standards Agency (FSA), a score of one out of five means major improvement is necessary. But chef patron Gareth Ward, a contestant on MasterChef The Professionals, said the restaurant was working at the highest standard in the world and doing something different with how it approaches raw ingredients and techniques. Ynyshir offers a high-end dining experience starting at £468 per person, including a 30-course tasting menu and an in-house DJ.


Toffee Crisp and Blue Riband can't be called chocolate any more

BBC News

Toffee Crisp and Blue Riband bars can no longer be called chocolate after maker Nestle changed their recipes. To be described as milk chocolate in the UK, a product needs to contain at least 20% cocoa solids and 20% milk solids, a threshold each product fell below once a higher amount of cheaper vegetable fat was used. Nestle said its reformulations were needed due to higher input costs, but were carefully developed and sensory tested, and that there were no plans to alter the recipes of other chocolate products. As the cost of many ingredients, such as cocoa and butter, has increased, food companies have altered recipes to use less of the expensive ingredients, as well as shrinking serving sizes. Nestle now describes the treats as being encased in a "smooth milk chocolate flavour coating" rather than being covered in milk chocolate.


CookAnything: A Framework for Flexible and Consistent Multi-Step Recipe Image Generation

Zhang, Ruoxuan, Wen, Bin, Xie, Hongxia, Yao, Yi, Zuo, Songhan, Jiang-Lin, Jian-Yu, Shuai, Hong-Han, Cheng, Wen-Huang

arXiv.org Artificial Intelligence

Cooking is a sequential and visually grounded activity, where each step such as chopping, mixing, or frying carries both procedural logic and visual semantics. While recent diffusion models have shown strong capabilities in text-to-image generation, they struggle to handle structured multi-step scenarios like recipe illustration. Additionally, current recipe illustration methods are unable to adjust to the natural variability in recipe length, generating a fixed number of images regardless of the actual instruction structure. To address these limitations, we present CookAnything, a flexible and consistent diffusion-based framework that generates coherent, semantically distinct image sequences from textual cooking instructions of arbitrary length. The framework introduces three key components: (1) Step-wise Regional Control (SRC), which aligns textual steps with corresponding image regions within a single denoising process; (2) Flexible RoPE, a step-aware positional encoding mechanism that enhances both temporal coherence and spatial diversity; and (3) Cross-Step Consistency Control (CSCC), which maintains fine-grained ingredient consistency across steps. Experimental results on recipe illustration benchmarks show that CookAnything outperforms existing methods in both training-based and training-free settings. The proposed framework supports scalable, high-quality visual synthesis of complex multi-step instructions and holds significant potential for broad applications in instructional media and procedural content creation.


ELR-1000: A Community-Generated Dataset for Endangered Indic Indigenous Languages

Joshi, Neha, Gogoi, Pamir, Mirza, Aasim, Jansari, Aayush, Yadavalli, Aditya, Pandey, Ayushi, Shukla, Arunima, Sudharsan, Deepthi, Bali, Kalika, Seshadri, Vivek

arXiv.org Artificial Intelligence

We present a culturally-grounded multimodal dataset of 1,060 traditional recipes crowdsourced from rural communities across remote regions of Eastern India, spanning 10 endangered languages. These recipes, rich in linguistic and cultural nuance, were collected using a mobile interface designed for contributors with low digital literacy. The dataset -- Endangered Language Recipes (ELR-1000) -- captures not only culinary practices but also the socio-cultural context embedded in indigenous food traditions. We evaluate the performance of several state-of-the-art large language models (LLMs) on translating these recipes into English and find that, despite the models' capabilities, they struggle with low-resource, culturally-specific language. However, we observe that providing targeted context -- including background information about the languages, translation examples, and guidelines for cultural preservation -- leads to significant improvements in translation quality. Our results underscore the need for benchmarks that cater to underrepresented languages and domains to advance equitable and culturally-aware language technologies. As part of this work, we release the ELR-1000 dataset to the NLP community, hoping it motivates the development of language technologies for endangered languages.


On Defining Neural Averaging

Lee, Su Hyeong, Ngo, Richard

arXiv.org Artificial Intelligence

What does it even mean to average neural networks? We investigate the problem of synthesizing a single neural network from a collection of pretrained models, each trained on disjoint data shards, using only their final weights and no access to training data. In forming a definition of neural averaging, we take insight from model soup, which appears to aggregate multiple models into a single model while enhancing generalization performance. In this work, we reinterpret model souping as a special case of a broader framework: Amortized Model Ensembling (AME) for neural averaging, a data-free meta-optimization approach that treats model differences as pseudogradients to guide neural weight updates. We show that this perspective not only recovers model soup but enables more expressive and adaptive ensembling strategies. Empirically, AME produces averaged neural solutions that outperform both individual experts and model soup baselines, especially in out-of-distribution settings. Our results suggest a principled and generalizable notion of data-free model weight aggregation and define, in one sense, how to perform neural averaging.
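The pseudogradient framing in the abstract can be sketched concretely: model soup is a plain uniform average of expert weights, while an AME-style procedure repeatedly steps the current weights along the mean difference toward the experts using a meta-optimizer. The sketch below uses NumPy dicts as stand-in "weights" and SGD with momentum as the meta-optimizer; all function names and hyperparameters are illustrative, not the paper's actual implementation.

```python
import numpy as np

def model_soup(experts):
    """Uniform weight average: the classic model-soup baseline."""
    return {k: np.mean([e[k] for e in experts], axis=0) for k in experts[0]}

def ame_sketch(experts, steps=50, lr=0.5, momentum=0.5):
    """AME-style sketch: treat (current - expert) differences as
    pseudogradients and feed them to a meta-optimizer (here SGD with
    momentum). A plain averaging step would recover model soup."""
    w = {k: v.copy() for k, v in experts[0].items()}
    buf = {k: np.zeros_like(v) for k, v in w.items()}
    for _ in range(steps):
        for k in w:
            # Mean difference from the experts acts as the pseudogradient;
            # its fixed point is the uniform average of expert weights.
            g = np.mean([w[k] - e[k] for e in experts], axis=0)
            buf[k] = momentum * buf[k] + g
            w[k] -= lr * buf[k]
    return w

experts = [{"layer": np.array([0.0, 1.0])},
           {"layer": np.array([2.0, 3.0])}]
print(model_soup(experts)["layer"])  # → [1. 2.]
```

With this particular pseudogradient, the meta-optimization converges to the soup solution; the point of the framework is that richer difference signals or optimizers yield ensembles that plain averaging cannot express.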


From Raw Features to Effective Embeddings: A Three-Stage Approach for Multimodal Recipe Recommendation

Shin, Jeeho, Kim, Kyungho, Shin, Kijung

arXiv.org Artificial Intelligence

Recipe recommendation has become an essential task in web-based food platforms. A central challenge is effectively leveraging rich multimodal features beyond user-recipe interactions. Our analysis shows that even simple uses of multimodal signals yield competitive performance, suggesting that systematic enhancement of these signals is highly promising. We propose TESMR, a 3-stage framework for recipe recommendation that progressively refines raw multimodal features into effective embeddings through: (1) content-based enhancement using foundation models with multimodal comprehension, (2) relation-based enhancement via message propagation over user-recipe interactions, and (3) learning-based enhancement through contrastive learning with learnable embeddings. Experiments on two real-world datasets show that TESMR outperforms existing methods, achieving 7-15% higher Recall@10.
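Stage (2) of the pipeline, relation-based enhancement via message propagation over user-recipe interactions, can be illustrated with a simple mean-aggregation scheme on the bipartite interaction graph (in the style of LightGCN-like propagation). This is a hypothetical sketch, not TESMR's actual propagation rule; the function and variable names are assumptions.

```python
import numpy as np

def propagate(user_emb, recipe_emb, interactions, hops=2):
    """Mean-aggregation message passing on a user-recipe bipartite graph.
    `interactions` is a binary (num_users x num_recipes) matrix."""
    u_deg = interactions.sum(axis=1, keepdims=True).clip(min=1)  # per-user degree
    r_deg = interactions.sum(axis=0, keepdims=True).clip(min=1)  # per-recipe degree
    u, r = user_emb, recipe_emb
    for _ in range(hops):
        u_next = (interactions / u_deg) @ r    # users average their recipes' embeddings
        r_next = (interactions / r_deg).T @ u  # recipes average their users' embeddings
        u, r = u_next, r_next
    return u, r

rng = np.random.default_rng(0)
users = rng.normal(size=(4, 8))     # 4 users, 8-dim embeddings
recipes = rng.normal(size=(5, 8))   # 5 recipes
A = np.array([[1, 0, 1, 0, 0],
              [0, 1, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 0, 1, 1]], dtype=float)
u2, r2 = propagate(users, recipes, A)
```

Each hop mixes content-derived embeddings (stage 1) along observed interactions, so connected users and recipes drift toward each other before the contrastive stage (3) refines them further.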