LumiX: Structured and Coherent Text-to-Intrinsic Generation

Han, Xu, Zhang, Biao, Tang, Xiangjun, Li, Xianzhi, Wonka, Peter

arXiv.org Artificial Intelligence

We present LumiX, a structured diffusion framework for coherent text-to-intrinsic generation. Conditioned on text prompts, LumiX jointly generates a comprehensive set of intrinsic maps (e.g., albedo, irradiance, normal, depth, and final color), providing a structured and physically consistent description of an underlying scene. This is enabled by two key contributions: 1) Query-Broadcast Attention, a mechanism that ensures structural consistency by sharing queries across all maps in each self-attention block. 2) Tensor LoRA, a tensor-based adaptation that parameter-efficiently models cross-map relations for efficient joint training. Together, these designs enable stable joint diffusion training and unified generation of multiple intrinsic properties. Experiments show that LumiX produces coherent and physically meaningful results, achieving 23% higher alignment and a better preference score (0.19 vs. -0.41) compared to the state of the art, and it can also perform image-conditioned intrinsic decomposition within the same framework.


Quantum Fourier Transform Based Kernel for Solar Irradiance Forecasting

Mechiche-Alami, Nawfel, Rodriguez, Eduardo, Cardemil, Jose M., Droguett, Enrique Lopez

arXiv.org Machine Learning

This study proposes a Quantum Fourier Transform (QFT)-enhanced quantum kernel for short-term time-series forecasting. Exogenous predictors are incorporated by convexly fusing feature-specific kernels. For both quantum and classical models, the only tuned quantities are the feature-mixing weights and the KRR ridge α; classical hyperparameters (γ, r, d) are fixed, with the same validation set size for all models. Experiments are conducted on a noiseless simulator (5 qubits; window length L=32). Limitations and ablations are discussed, and paths toward NISQ execution are outlined.

Introduction. Quantum Machine Learning (QML) is an emerging discipline that combines the principles of quantum physics with traditional machine learning (ML) to exploit the distinctive characteristics of quantum systems, including superposition and entanglement [1]. These properties enable the faster execution of certain tasks [2], such as classification and dimensionality reduction, where QML has demonstrated significant acceleration [3]. QML applications have extended to time-series data, leveraging quantum phenomena to model complex temporal dependencies. The goal is to enhance the results of traditional tasks by performing computations on qubits, which can process data more efficiently than classical bits [4, 5]. For example, Thakkar et al. [6] demonstrated that quantum machine-learning methods can enhance financial forecasting by improving both churn prediction and credit-risk assessment. Likewise, Kea et al. [7] developed a hybrid quantum-classical Long Short-Term Memory (QLSTM) model to improve stock-price forecasting by leveraging quantum data encoding and high-dimensional quantum representations.
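The convex fusion of feature-specific kernels followed by kernel ridge regression can be sketched as below. This is a minimal illustrative sketch, not the paper's implementation: a classical RBF kernel stands in for the QFT-based quantum kernel, and the function and parameter names (`fused_krr_predict`, `feature_groups`) are hypothetical.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Classical stand-in for a feature-specific (e.g., quantum) kernel matrix.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fused_krr_predict(X_train, y_train, X_test, feature_groups, weights, alpha=1e-2):
    """Convexly fuse per-feature-group kernels, then solve kernel ridge regression."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # enforce convex combination: w_i >= 0, sum(w) = 1
    # Fused Gram matrices over training and test points
    K = sum(wi * rbf_kernel(X_train[:, g], X_train[:, g])
            for wi, g in zip(w, feature_groups))
    K_star = sum(wi * rbf_kernel(X_test[:, g], X_train[:, g])
                 for wi, g in zip(w, feature_groups))
    # KRR dual coefficients with ridge alpha
    coef = np.linalg.solve(K + alpha * np.eye(len(X_train)), y_train)
    return K_star @ coef
```

In this scheme the feature-mixing weights `w` and the ridge `alpha` are the only tuned quantities, matching the tuning protocol the abstract describes.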


Multi-Horizon Time Series Forecasting of non-parametric CDFs with Deep Lattice Networks

Erdmann, Niklas, Bentsen, Lars, Stenbro, Roy, Riise, Heine Nygard, Warakagoda, Narada Dilp, Engelstad, Paal E.

arXiv.org Artificial Intelligence

Probabilistic forecasting not only adds information to a prediction of the future; it also addresses weaknesses of point prediction. Sudden changes in a time series can still be captured by a cumulative distribution function (CDF), while a point prediction is likely to miss them entirely. The modeling of CDFs within forecasts has historically been limited to parametric approaches, but due to recent advances this no longer has to be the case. We aim to advance the fields of probabilistic forecasting and monotonic networks by connecting them, and propose an approach that permits the forecasting of implicit, complete, and nonparametric CDFs. For this purpose, we adapt deep lattice networks (DLNs) for monotonically constrained simultaneous/implicit quantile regression in time series forecasting. Quantile regression usually produces quantile crossovers, which must be prevented to obtain a legitimate CDF. By leveraging long short-term memory (LSTM) units as the embedding layer, and spreading quantile inputs to all sub-lattices of a DLN with an extended output size, we can produce a multi-horizon forecast of an implicit CDF, since the monotonic constraints of DLNs prevent quantile crossovers. We compare and evaluate our approach's performance against relevant state of the art on a highly relevant time-series forecasting application: day-ahead, hourly forecasts of solar irradiance observations. Our experiments show that the adapted DLN performs as well as or better than an unconstrained approach. A further comparison of the adapted DLN against a scalable monotonic neural network shows that our approach performs better. With this adaptation of DLNs, we intend to spark more interest and crossover investigations in monotonic neural networks and probabilistic forecasting.
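The crossover-prevention idea can be made concrete with a toy quantile head. This sketch is not a deep lattice network: instead of lattice monotonicity constraints, it makes the output monotone in the quantile level by construction, accumulating non-negative increments over the quantile grid. All names (`monotone_quantile_head`, `W_base`, `W_inc`) are hypothetical.

```python
import numpy as np

def monotone_quantile_head(features, taus, W_base, W_inc):
    """Crossover-free implicit quantile output: the forecast at level tau_j is a
    base value plus a cumulative sum of non-negative increments, so the implied
    CDF is monotone in tau by construction (DLNs achieve this property instead
    through monotonically constrained lattices)."""
    base = features @ W_base                        # (n,) location term
    raw = features @ W_inc                          # (n, k) raw increment logits
    inc = np.log1p(np.exp(raw))                     # softplus -> non-negative slopes
    dtau = np.diff(np.concatenate(([0.0], taus)))   # positive spacings of the tau grid
    # quantile forecasts: base + cumulative increments, non-decreasing in tau
    return base[:, None] + np.cumsum(inc * dtau, axis=1)
```

Because every increment is non-negative and the quantile grid is increasing, adjacent quantile outputs can never cross, which is exactly the legitimacy condition on the forecast CDF that the abstract discusses.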


Set Phasers to Stun: Beaming Power and Control to Mobile Robots with Laser Light

Carver, Charles J., Schwartz, Hadleigh, Itagaki, Toma, Englhardt, Zachary, Liu, Kechen, Manik, Megan Graciela Nauli, Chang, Chun-Cheng, Iyer, Vikram, Plancher, Brian, Zhou, Xia

arXiv.org Artificial Intelligence

We present Phaser, a flexible system that directs narrow-beam laser light to moving robots for concurrent wireless power delivery and communication. We design a semiautomatic calibration procedure to enable fusion of stereo-vision-based 3D robot tracking with high-power beam steering, and a low-power optical communication scheme that reuses the laser light as a data channel. We fabricate a Phaser prototype using off-the-shelf hardware and evaluate its performance with battery-free autonomous robots. We demonstrate Phaser fully powering gram-scale battery-free robots to nearly 2x higher speeds than prior work while simultaneously controlling them to navigate around obstacles and along paths. Code, an open-source design guide, and a demonstration video of Phaser are available at https://mobilex.cs.columbia.edu/phaser/. Mobile, autonomous robots play an increasingly important role in today's world, with the potential to perform tasks in warehouses, factories, and homes and to conduct advanced environmental exploration [1]. However, the significant power needed for locomotion, on-board computation, and communication presents a key barrier to the broader deployment of such robots. Given the energy density of current batteries [2], most autonomous robots today either remain tethered by charging wires or must routinely return to charging stations, reducing deployment time. This problem is exacerbated in miniaturized robots, which cannot support the 100s of milligrams of battery payload [3]-[7] needed for extended operation, even on their milliwatt power budgets.


Reconstruction of Solar EUV Irradiance Using CaII K Images and SOHO/SEM Data with Bayesian Deep Learning and Uncertainty Quantification

Jiang, Haodi, Li, Qin, Wang, Jason T. L., Wang, Haimin, Criscuoli, Serena

arXiv.org Artificial Intelligence

Solar extreme ultraviolet (EUV) irradiance plays a crucial role in heating the Earth's ionosphere, thermosphere, and mesosphere, affecting atmospheric dynamics over varying time scales. Although significant effort has been spent studying short-term EUV variations from solar transient events, there is little work to explore the long-term evolution of the EUV flux over multiple solar cycles. Continuous EUV flux measurements have only been available since 1995, leaving significant gaps in earlier data. In this study, we propose a Bayesian deep learning model, named SEMNet, to fill the gaps. We validate our approach by applying SEMNet to construct SOHO/SEM EUV flux measurements in the period between 1998 and 2014 using CaII K images from the Precision Solar Photometric Telescope. We then extend SEMNet through transfer learning to reconstruct solar EUV irradiance in the period between 1950 and 1960 using CaII K images from the Kodaikanal Solar Observatory. Experimental results show that SEMNet provides reliable predictions along with uncertainty bounds, demonstrating the feasibility of CaII K images as a robust proxy for long-term EUV fluxes. These findings contribute to a better understanding of solar influences on Earth's climate over extended periods.
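One common way a deep model yields predictions with uncertainty bounds, as SEMNet does, is Monte Carlo dropout: keep dropout active at inference and aggregate many stochastic forward passes. The sketch below is purely illustrative of that general technique; SEMNet's actual Bayesian formulation may differ, and all names and shapes here are hypothetical.

```python
import numpy as np

def mc_dropout_predict(x, weights, n_samples=200, p_drop=0.1, seed=0):
    """Monte Carlo dropout sketch: repeated stochastic forward passes through a
    small ReLU network give a predictive mean and an uncertainty estimate."""
    rng = np.random.default_rng(seed)
    W1, b1, W2, b2 = weights
    outs = []
    for _ in range(n_samples):
        h = np.maximum(0.0, x @ W1 + b1)        # ReLU hidden layer
        mask = rng.random(h.shape) > p_drop     # dropout stays active at test time
        h = h * mask / (1.0 - p_drop)           # inverted-dropout scaling
        outs.append(h @ W2 + b2)
    outs = np.array(outs)
    return outs.mean(axis=0), outs.std(axis=0)  # point estimate + uncertainty band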


Machine Learning Risk Intelligence for Green Hydrogen Investment: Insights for Duqm R3 Auction

Nwafor, Obumneme, Hooti, Mohammed Abdul Majeed Al

arXiv.org Artificial Intelligence

As green hydrogen emerges as a major component of global decarbonisation, Oman has positioned itself strategically through national auctions and international partnerships. Following two successful green hydrogen project rounds, the country launched its third auction (R3) in the Duqm region. While this area exhibits relative geospatial homogeneity, it is still vulnerable to environmental fluctuations that pose inherent risks to productivity. Despite growing global investment in green hydrogen, operational data remains scarce, with major projects like Saudi Arabia's NEOM facility not expected to commence production until 2026, and Oman's ACME Duqm project scheduled for 2028. This absence of historical maintenance and performance data from large-scale hydrogen facilities in desert environments creates a major knowledge gap for accurate risk assessment in infrastructure planning and auction decisions. Given this data void, environmental conditions emerge as an accessible and reliable proxy for predicting infrastructure maintenance pressures, because harsh desert conditions such as dust storms, extreme temperatures, and humidity fluctuations are well-documented drivers of equipment degradation in renewable energy systems. To address this challenge, this paper proposes an Artificial Intelligence decision support system that leverages publicly available meteorological data to develop a predictive Maintenance Pressure Index (MPI), which predicts risk levels and future maintenance demands on hydrogen infrastructure. This tool strengthens regulatory foresight and operational decision-making by enabling temporal benchmarking to assess and validate performance claims over time. It can be used to incorporate temporal risk intelligence into auction evaluation criteria despite the absence of historical operational benchmarks.


Artificial Intelligence for Green Hydrogen Yield Prediction and Site Suitability using SHAP-Based Composite Index: Focus on Oman

Nwafor, Obumneme Zimuzor, Hooti, Mohammed Abdul Majeed Al

arXiv.org Artificial Intelligence

As nations seek sustainable alternatives to fossil fuels, green hydrogen has emerged as a promising strategic pathway toward decarbonisation, particularly in solar-rich arid regions. However, identifying optimal locations for hydrogen production requires the integration of complex environmental, atmospheric, and infrastructural factors, often compounded by limited availability of direct hydrogen yield data. This study presents a novel Artificial Intelligence (AI) framework for computing a green hydrogen yield and site suitability index using mean absolute SHAP (SHapley Additive exPlanations) values. The framework consists of a multi-stage pipeline of unsupervised multi-variable clustering, a supervised machine learning classifier, and the SHAP algorithm. The pipeline trains on an integrated meteorological, topographic, and temporal dataset, and the results revealed distinct spatial patterns of suitability and of the relative influence of the variables. With a model predictive accuracy of 98%, the results also showed that water proximity, elevation, and seasonal variation are the most influential factors determining green hydrogen site suitability in Oman, with mean absolute SHAP values of 2.470891, 2.376296, and 1.273216 respectively. Given the limited availability or absence of ground-truth yield data in many countries that have green hydrogen prospects and ambitions, this study offers an objective and reproducible alternative to subjective expert weightings, thus allowing the data to speak for itself and potentially discovering novel latent groupings without pre-imposed assumptions. This study offers industry stakeholders and policymakers a replicable and scalable tool for green hydrogen infrastructure planning and other decision making in data-scarce regions.
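A SHAP-weighted composite index of the kind described can be sketched as follows. This is a hedged illustration, not the paper's pipeline: the min-max normalization and the function name `shap_composite_index` are assumptions, and the weights are simply the normalized mean absolute SHAP values.

```python
import numpy as np

def shap_composite_index(X, mean_abs_shap):
    """Composite suitability index: min-max normalize each feature column,
    then weight by normalized mean |SHAP| values and sum."""
    w = np.asarray(mean_abs_shap, dtype=float)
    w = w / w.sum()                                          # |SHAP| -> convex weights
    Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0) + 1e-12)
    return Xn @ w                                            # index in [0, 1]
```

Using the model's own attribution magnitudes as weights is what replaces the subjective expert weightings mentioned in the abstract: the relative importance of water proximity, elevation, and seasonal variation is read off the trained classifier rather than assigned by hand.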


A Feed-Forward Artificial Intelligence Pipeline for Sustainable Desalination under Climate Uncertainties: UAE Insights

Nwafor, Obumneme, Nwafor, Chioma, Zakaria, Amro, Nwankwo, Nkechi

arXiv.org Artificial Intelligence

The United Arab Emirates (UAE) relies heavily on seawater desalination to meet over 90% of its drinking water needs. Desalination processes are highly energy intensive and account for approximately 15% of the UAE's electricity consumption, contributing to over 22% of the country's energy-related CO2 emissions. Moreover, these processes face significant sustainability challenges in the face of climate uncertainties such as rising seawater temperatures, salinity, and aerosol optical depth (AOD). AOD greatly affects the operational and economic performance of solar-powered desalination systems through photovoltaic soiling, membrane fouling, and water turbidity cycles. This study proposes a novel pipelined two-stage predictive modelling architecture: the first stage forecasts AOD using satellite-derived time series and meteorological data; the second stage uses the predicted AOD and other meteorological factors to predict desalination performance efficiency losses. The framework achieved 98% accuracy, and SHAP (SHapley Additive exPlanations) was used to reveal key drivers of system degradation. Furthermore, this study proposes a dust-aware rule-based control logic for desalination systems based on predicted values of AOD and solar efficiency. This control logic is used to adjust the desalination plant feed water pressure, adapt maintenance scheduling, and regulate energy source switching. To enhance the practical utility of the research findings, the predictive models and rule-based controls were packaged into an interactive dashboard for scenario and predictive analytics. This provides a management decision-support system for climate-adaptive planning.


On the Importance of Clearsky Model in Short-Term Solar Radiation Forecasting

Voyant, Cyril, Despotovic, Milan, Notton, Gilles, Saint-Drenan, Yves-Marie, Asloune, Mohammed, Garcia-Gutierrez, Luis

arXiv.org Artificial Intelligence

Clearsky models are widely used in solar energy for many applications such as quality control, resource assessment, satellite-based irradiance estimation, and forecasting. However, their use in forecasting and nowcasting is associated with a number of challenges. Synchronization errors, reliance on the Clearsky index (the ratio of the global horizontal irradiance to its cloud-free counterpart), and the high sensitivity of the clearsky model to errors in aerosol optical depth at low solar elevation limit their added value in real-time applications. This paper explores the feasibility of short-term forecasting without relying on a clearsky model. We propose a Clearsky-Free forecasting approach using Extreme Learning Machine (ELM) models. ELM learns daily periodicity and local variability directly from raw Global Horizontal Irradiance (GHI) data. It eliminates the need for Clearsky normalization, simplifying the forecasting process and improving scalability. Our approach is a non-linear adaptive statistical method that implicitly learns the irradiance in cloud-free conditions, removing the need for a clearsky model and the related operational issues. Deterministic and probabilistic results are compared to traditional benchmarks, including ARMA with McClear-generated Clearsky data and quantile regression for probabilistic forecasts. ELM matches or outperforms these methods, providing accurate predictions and robust uncertainty quantification. This approach offers a simple, efficient solution for real-time solar forecasting. By overcoming the limitations of the usual multiplicative Clearsky-based stationarization scheme, it provides a flexible and reliable framework for modern energy systems.
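The clearsky-free idea can be illustrated with a minimal ELM trained on raw GHI lags. This is a generic ELM sketch under assumed settings (hidden size, ridge value, a toy diurnal signal), not the authors' model: a random hidden layer is fixed and only the output weights are solved in closed form, with no Clearsky normalization kc = GHI / GHI_clearsky anywhere in the pipeline.

```python
import numpy as np

def elm_fit(X, y, n_hidden=64, ridge=1e-3, seed=0):
    """Extreme Learning Machine: random fixed hidden layer, ridge-solved output."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # random nonlinear features
    beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, model):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta
```

Because the daily cycle is present in the raw lag window, the ELM can absorb the periodicity that a Clearsky index would otherwise factor out, which is the operational simplification the abstract argues for.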


Research on visual simultaneous localization and mapping technology based on near infrared light

Ma, Rui, Liu, Mengfang, Li, Boliang, Li, Xinghui

arXiv.org Artificial Intelligence

SLAM originated from the probabilistic SLAM problem introduced at the 1986 IEEE Robotics and Automation Conference in San Francisco [2], and has passed through three stages: initial theoretical exploration (1986-2004), algorithmic framework development (2004-2015), and system robustness improvement (2015-present) [3]. Classified by sensor, SLAM technology can be divided into laser SLAM, visual SLAM, and multi-sensor fusion SLAM. Laser SLAM scans the environment with lidar; it is well suited to indoor environments but yields inaccurate positioning in repetitive, feature-poor environments [4-6]. Visual SLAM captures images with a camera, recovering poses and maps from image pixels and features, and is suited to texture-rich scenes. In addition, visual SLAM has the advantages of low cost and small size, and provides intuitive visual input [7-9].