inspection
- Europe > Norway > Eastern Norway > Oslo (0.04)
- Europe > Norway > Central Norway > Trøndelag > Trondheim (0.04)
- North America > United States > Washington (0.04)
- (3 more...)
- North America > United States > California > Santa Clara County > Palo Alto (0.04)
- North America > Canada > Ontario > Toronto (0.04)
A Dataset for Efforts Towards Achieving the Sustainable Development Goal of Safe Working Environments
Among the United Nations' 17 Sustainable Development Goals (SDGs), we highlight SDG 8 on Decent Work and Economic Growth. Specifically, we consider how to achieve subgoal 8.8, protect labour rights and promote safe working environments for all workers [...], in light of poor health, safety and environment (HSE) conditions being a widespread problem at workplaces. In the EU alone, more than 4,000 deaths are estimated to occur each year due to poor working conditions. To address this problem and achieve SDG 8, governmental agencies conduct labour inspections, and it is therefore essential that these are carried out efficiently. Current research suggests that machine learning (ML) can be used to improve labour inspections, for instance by selecting organisations for inspection more effectively.
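The idea of ML-assisted targeting can be sketched as a risk-ranking step: score each organisation and send inspectors to the highest-risk workplaces first. The feature names and weights below are purely illustrative, not from the dataset described here.

```python
# Illustrative sketch: rank organisations for labour inspection by a
# predicted HSE risk score. Features and weights are hypothetical.

def risk_score(org, weights):
    """Linear risk score: higher means more likely to have HSE violations."""
    return sum(weights[f] * org.get(f, 0.0) for f in weights)

def select_for_inspection(orgs, weights, k):
    """Return the k organisations with the highest predicted risk."""
    return sorted(orgs, key=lambda o: risk_score(o, weights), reverse=True)[:k]

weights = {"prior_violations": 2.0, "injury_rate": 1.5, "years_since_inspection": 0.5}
orgs = [
    {"name": "A", "prior_violations": 3, "injury_rate": 0.2, "years_since_inspection": 4},
    {"name": "B", "prior_violations": 0, "injury_rate": 0.1, "years_since_inspection": 1},
    {"name": "C", "prior_violations": 1, "injury_rate": 0.6, "years_since_inspection": 6},
]
top = select_for_inspection(orgs, weights, k=2)
```

In practice the linear scorer would be replaced by a model trained on past inspection outcomes; the ranking-then-inspect structure stays the same.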
Inside the labs where glasses are redesigned for a hyper-visual world
I went to EssilorLuxottica's Paris facilities to learn how the digital age is reshaping eyes and redefining eyewear. We may earn revenue from the products available on this page and participate in affiliate programs. Restaurants are surprisingly good age tests. When the menu lands, do you squint at the tiny fonts, tilt the page toward some inadequate candle, or blast it with your phone flashlight just to read it? Do you ask a friend to tell you the options because you refuse to wear the readers you know, in your heart, you probably need? And when did restaurants get so loud?
- Information Technology > Hardware (0.70)
- Information Technology > Communications > Mobile (0.47)
- Information Technology > Artificial Intelligence > Robots (0.47)
A Comprehensive Framework for Automated Quality Control in the Automotive Industry
Moraiti, Panagiota, Giannikos, Panagiotis, Mastrogeorgiou, Athanasios, Mavridis, Panagiotis, Zhou, Linghao, Chatzakos, Panagiotis
Abstract-- This paper presents a cutting-edge robotic inspection solution (Figure 1) designed to automate quality control in automotive manufacturing. The system integrates a pair of collaborative robots, each equipped with a high-resolution camera-based vision system to accurately detect and localize surface and thread defects in aluminum high-pressure die casting (HPDC) automotive components. In addition, specialized lenses and optimized lighting configurations are employed to ensure consistent and high-quality image acquisition. The YOLO11n deep learning model is utilized, incorporating additional enhancements such as image slicing, ensemble learning, and bounding-box merging to significantly improve performance and minimize false detections. Furthermore, image processing techniques are applied to estimate the extent of the detected defects. Experimental results demonstrate real-time performance with high accuracy across a wide variety of defects, while minimizing false detections. The proposed solution is promising and highly scalable, providing the flexibility to adapt to various production environments and meet the evolving demands of the automotive industry.

Quality control plays a crucial role in automotive manufacturing. Even minor defects introduced during production can result in significant performance issues and safety risks, emphasizing the importance of stringent quality inspections [1]. Traditionally, quality control processes in automotive production have been heavily dependent on skilled human operators to inspect components visually. This approach is not only costly and time-intensive but also susceptible to inconsistencies arising from operator fatigue and subjective decision-making [2].
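The image-slicing and bounding-box-merging steps mentioned in the abstract can be sketched as follows. This is not the paper's code: `detect_tile` is a hypothetical stand-in for the YOLO11n model, and the greedy union-merge is one simple merging policy among several.

```python
# Illustrative sketch of tile-based ("sliced") inference with bounding-box
# merging: run a detector on overlapping tiles, shift each tile's boxes into
# full-image coordinates, then merge duplicates that overlap above an IoU
# threshold. `detect_tile` is a hypothetical detector stub.

def iou(a, b):
    """Intersection-over-union of two boxes (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def merge_boxes(boxes, thresh=0.5):
    """Greedily merge boxes whose IoU exceeds `thresh` into their union."""
    merged = []
    for box in sorted(boxes):
        for i, m in enumerate(merged):
            if iou(box, m) > thresh:
                merged[i] = (min(m[0], box[0]), min(m[1], box[1]),
                             max(m[2], box[2]), max(m[3], box[3]))
                break
        else:
            merged.append(box)
    return merged

def sliced_detect(image_size, tile=640, overlap=0.2, detect_tile=None):
    """Run `detect_tile(x0, y0)` on each overlapping tile and merge results."""
    w, h = image_size
    step = int(tile * (1 - overlap))
    boxes = []
    for y0 in range(0, max(h - tile, 0) + 1, step):
        for x0 in range(0, max(w - tile, 0) + 1, step):
            for (x1, y1, x2, y2) in detect_tile(x0, y0):
                boxes.append((x0 + x1, y0 + y1, x0 + x2, y0 + y2))
    return merge_boxes(boxes)

def _demo_detector(x0, y0):
    # Hypothetical detector: the same defect at global x 550-600 is
    # reported by two adjacent tiles in their local coordinates.
    return [(550 - x0, 100, 600 - x0, 150)]

merged = sliced_detect((1152, 640), detect_tile=_demo_detector)
```

Slicing lets a small model see fine-grained defects at native resolution; the merge step is what removes the duplicate detections that overlapping tiles inevitably produce.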
- North America > United States > Hawaii > Honolulu County > Honolulu (0.04)
- North America > Mexico > Mexico City > Mexico City (0.04)
- Europe > United Kingdom (0.04)
- (9 more...)
IAEA flags damage to Chornobyl nuclear plant's protective shield in Ukraine
A drone strike has damaged a protective shield at the Chornobyl nuclear plant in Ukraine, rendering it unable to contain the radioactive material from the 1986 explosion of the plant, the United Nations nuclear watchdog said. The International Atomic Energy Agency (IAEA) said on Friday that the shield can no longer perform its main safety function, following an inspection of the steel structure last week.
- Europe > Ukraine > Kyiv Oblast > Chernobyl (0.85)
- Asia > Russia (0.84)
- Europe > Ukraine > Kyiv Oblast > Kyiv (0.06)
- (6 more...)
- Information Technology > Artificial Intelligence > Robots > Autonomous Vehicles > Drones (0.36)
- Information Technology > Communications (0.32)
High-Speed Event Vision-Based Tactile Roller Sensor for Large Surface Measurements
Khairi, Akram, Sajwani, Hussain, Alkilany, Abdallah Mohammad, AbuAssi, Laith, Halwani, Mohamad, Zaid, Islam Mohamed, Awadalla, Ahmed, Swart, Dewald, Ayyad, Abdulla, Zweiri, Yahya
Abstract-- Inspecting large-scale industrial surfaces like aircraft fuselages for quality control requires precise, high-resolution 3D geometry. Vision-based tactile sensors (VBTSs) offer high local resolution but require slow 'press-and-lift' measurements for large areas. Sliding or roller/belt VBTS designs provide continuous measurement but face significant challenges: sliding suffers from friction/wear, while both are speed-limited by camera frame rates and motion blur. Thus, a rapid, continuous, high-resolution method is needed. We introduce a novel neuromorphic tactile roller sensor. It uses a modified event-based multi-view stereo algorithm for 3D reconstruction, leveraging high temporal resolution and motion blur robustness. This reconstruction is most effective for surfaces with distinct edges or sharp features, which are often the most critical for defect detection in industrial inspection tasks. We demonstrate 0.5 m/s scanning speeds with MAE below 100 µm (11x faster than prior methods). A multi-reference Bayesian fusion strategy reduces MAE by 25.2% (vs. ...).

Surface metrology and surface inspection are crucial elements in quality assurance across diverse industries, particularly aerospace and automotive manufacturing. Precise inspection is required to identify characteristics like paint quality, coating integrity, and subtle defects such as cracks, nicks, and dents [1], [2], [3]. Often, achieving a resolution of 0.1 mm or lower is necessary to accurately classify these features and ensure component integrity and safety [4]. Traditional contact-based methods, including high-precision profilometers [5], [6] or microscopic techniques [7], [8], [9], offer high resolution locally but become exceedingly time-consuming when applied to large surface areas due to their sequential, point-by-point or small-patch measurement nature.
Non-contact optical methods, such as cameras, laser scanners, or structured light systems [2], [10], [11], [12], [13], [14], can significantly accelerate inspection by capturing data over wider areas. However, these methods often lack robustness; their performance can be compromised by variations in ambient lighting, motion blur when attempting high-speed scanning, or challenging surface optical properties like high reflectivity or transparency [15].
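The multi-reference Bayesian fusion the abstract mentions can be illustrated with the standard inverse-variance weighting scheme. This is an assumed formulation for illustration; the paper's exact fusion rule may differ.

```python
# Minimal sketch of multi-reference fusion, assumed here to be
# inverse-variance (Bayesian) weighting: several reconstructions of the same
# surface point, each with an estimated variance, are combined into one
# estimate whose variance is lower than any single reference's.

def fuse(estimates):
    """estimates: list of (depth_mm, variance). Returns (fused_depth, fused_var)."""
    inv_vars = [1.0 / v for _, v in estimates]
    total = sum(inv_vars)
    depth = sum(d / v for d, v in estimates) / total
    return depth, 1.0 / total

# Three references of the same point; the more confident one (lower
# variance) pulls the fused estimate toward its value.
fused_depth, fused_var = fuse([(10.2, 0.04), (10.0, 0.04), (10.1, 0.02)])
```

The key property is that the fused variance `1 / sum(1/v_i)` is strictly smaller than the best individual variance, which is consistent with fusion reducing MAE relative to a single reference.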
- North America > United States > Kansas > Sheridan County (0.04)
- Asia > Middle East > UAE > Abu Dhabi Emirate > Abu Dhabi (0.04)
Hybrid Synthetic Data Generation with Domain Randomization Enables Zero-Shot Vision-Based Part Inspection Under Extreme Class Imbalance
Mei, Ruo-Syuan, Jia, Sixian, Li, Guangze, Lee, Soo Yeon, Musser, Brian, Keller, William, Zakula, Sreten, Arinez, Jorge, Shao, Chenhui
Machine learning, particularly deep learning, is transforming industrial quality inspection. Yet, training robust machine learning models typically requires large volumes of high-quality labeled data, which are expensive, time-consuming, and labor-intensive to obtain in manufacturing. Moreover, defective samples are intrinsically rare, leading to severe class imbalance that degrades model performance. These data constraints hinder the widespread adoption of machine learning-based quality inspection methods in real production environments. Synthetic data generation (SDG) offers a promising solution by enabling the creation of large, balanced, and fully annotated datasets in an efficient, cost-effective, and scalable manner. This paper presents a hybrid SDG framework that integrates simulation-based rendering, domain randomization, and real background compositing to enable zero-shot learning for computer vision-based industrial part inspection without manual annotation. The SDG pipeline generates 12,960 labeled images in one hour by varying part geometry, lighting, and surface properties, and then compositing synthetic parts onto real image backgrounds. A two-stage architecture utilizing a YOLOv8n backbone for object detection and MobileNetV3-small for quality classification is trained exclusively on synthetic data and evaluated on 300 real industrial parts. The proposed approach achieves an mAP@0.5 of 0.995 for detection, 96% classification accuracy, and 90.1% balanced accuracy. Comparative evaluation against few-shot real-data baseline approaches demonstrates significant improvement. The proposed SDG-based approach achieves 90-91% balanced accuracy under severe class imbalance, while the baselines reach only 50% accuracy. These results demonstrate that the proposed method enables annotation-free, scalable, and robust quality inspection for real-world manufacturing applications.
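The domain-randomization step described above can be sketched as sampling wide scene-parameter ranges per synthetic image, with class balance enforced by construction. The parameter names, ranges, and background count below are illustrative placeholders, not the paper's actual pipeline settings.

```python
# Hypothetical sketch of the domain-randomization step: each synthetic image
# gets scene parameters drawn from wide ranges, so a model trained only on
# the renders can transfer zero-shot to real images. Defective/non-defective
# labels are balanced by design, sidestepping the real-data class imbalance.

import random

def sample_scene(rng):
    return {
        "light_intensity": rng.uniform(0.3, 1.5),   # dim to overexposed
        "light_angle_deg": rng.uniform(0, 360),
        "part_rotation_deg": rng.uniform(0, 360),
        "surface_roughness": rng.uniform(0.1, 0.9),
        "background_id": rng.randrange(50),          # index into real backgrounds
        "is_defective": rng.random() < 0.5,          # balanced classes by design
    }

rng = random.Random(0)
scenes = [sample_scene(rng) for _ in range(12960)]  # batch size from the paper
defect_rate = sum(s["is_defective"] for s in scenes) / len(scenes)
```

Each sampled dictionary would drive one render, which is then composited onto a real background photo; the label comes for free from `is_defective`, which is what makes the dataset fully annotated at no labeling cost.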
- Research Report > New Finding (0.34)
- Research Report > Promising Solution (0.34)
- Automobiles & Trucks (0.68)
- Information Technology (0.46)
Evaluating Magic Leap 2 Tool Tracking for AR Sensor Guidance in Industrial Inspections
Masuhr, Christian, Koch, Julian, Schüppstuhl, Thorsten
Rigorous evaluation of commercial Augmented Reality (AR) hardware is crucial, yet public benchmarks for tool tracking on modern Head-Mounted Displays (HMDs) are limited. This paper addresses this gap by systematically assessing the Magic Leap 2 (ML2) controller's tracking performance. Using a robotic arm for repeatable motion (EN ISO 9283) and an optical tracking system as ground truth, our protocol evaluates static and dynamic performance under various conditions, including realistic paths from a hydrogen leak inspection use case. The results provide a quantitative baseline of the ML2 controller's accuracy and repeatability and present a robust, transferable evaluation methodology. The findings provide a basis to assess the controller's suitability for the inspection use case and similar industrial sensor-based AR guidance tasks.
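The accuracy/repeatability metrics referenced via EN ISO 9283 can be sketched for the positional case: accuracy is the distance from the mean attained position to the commanded one, and repeatability bounds the spread of repeated attainments around their mean. This simplified sketch covers positions only, not orientation.

```python
# Minimal sketch of ISO 9283-style static pose metrics (positions only).
# Pose accuracy: distance from the mean attained point to the commanded one.
# Pose repeatability: RP = l_bar + 3*s_l, where l is each run's distance
# from the mean attained point (the formulation assumed here).

import math

def pose_metrics(commanded, attained):
    """commanded: (x, y, z); attained: list of (x, y, z) from repeated runs."""
    n = len(attained)
    mean = tuple(sum(p[i] for p in attained) / n for i in range(3))
    accuracy = math.dist(mean, commanded)
    spreads = [math.dist(p, mean) for p in attained]
    mean_spread = sum(spreads) / n
    std_spread = math.sqrt(sum((s - mean_spread) ** 2 for s in spreads) / (n - 1))
    repeatability = mean_spread + 3 * std_spread
    return accuracy, repeatability

# Degenerate demo: three identical attainments 1 unit off target give
# accuracy 1 and repeatability 0.
acc, rep = pose_metrics((0.0, 0.0, 0.0), [(1.0, 0.0, 0.0)] * 3)
```

In the benchmark setting, `commanded` would come from the robot's programmed path (verified by the optical ground-truth system) and `attained` from the ML2 controller's reported poses.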
- North America > United States (0.04)
- Europe > Switzerland (0.04)
- Europe > Spain > Galicia > Madrid (0.04)
- (2 more...)
- Energy > Renewable (0.68)
- Health & Medicine > Health Care Technology (0.67)