Collaborating Authors: O'Neill, Kevin


RGB to Hyperspectral: Spectral Reconstruction for Enhanced Surgical Imaging

arXiv.org Artificial Intelligence

This study investigates the reconstruction of hyperspectral signatures from RGB data to enhance surgical imaging, utilizing the publicly available HeiPorSPECTRAL dataset from porcine surgery and an in-house neurosurgery dataset. Various architectures based on convolutional neural networks (CNNs) and transformer models are evaluated using comprehensive metrics. Transformer models exhibit superior performance in terms of RMSE, SAM, PSNR, and SSIM by effectively integrating spatial information to predict accurate spectral profiles, encompassing both visible and extended spectral ranges. Qualitative assessments demonstrate the capability to predict spectral profiles critical for informed surgical decision-making during procedures. Challenges associated with capturing both the visible and extended hyperspectral ranges are highlighted using MAE, emphasizing the complexities involved. The findings open a new research direction: hyperspectral reconstruction for surgical applications and clinical use cases in real-time surgical environments.
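Since the abstract compares architectures via RMSE, SAM, PSNR, and SSIM, a minimal sketch of the first three metrics may help fix their definitions. This is not code from the paper; the function names, the assumed cube shape (height, width, bands), and the [0, 1] intensity range are illustrative assumptions.

```python
import numpy as np

def rmse(pred, gt):
    """Root-mean-square error over all pixels and spectral bands."""
    return float(np.sqrt(np.mean((pred - gt) ** 2)))

def psnr(pred, gt, max_val=1.0):
    """Peak signal-to-noise ratio in dB, assuming intensities in [0, max_val]."""
    mse = np.mean((pred - gt) ** 2)
    return float(20 * np.log10(max_val) - 10 * np.log10(mse))

def sam(pred, gt, eps=1e-8):
    """Mean spectral angle (in radians) between per-pixel spectra."""
    p = pred.reshape(-1, pred.shape[-1])
    g = gt.reshape(-1, gt.shape[-1])
    cos = np.sum(p * g, axis=1) / (np.linalg.norm(p, axis=1) * np.linalg.norm(g, axis=1) + eps)
    return float(np.mean(np.arccos(np.clip(cos, -1.0, 1.0))))

# Toy usage with random hyperspectral cubes of shape (height, width, bands).
rng = np.random.default_rng(0)
gt = rng.random((64, 64, 100))
pred = np.clip(gt + 0.01 * rng.normal(size=gt.shape), 0.0, 1.0)
print(rmse(pred, gt), psnr(pred, gt), sam(pred, gt))
```

SSIM is omitted here because in practice it is usually taken from a library implementation (e.g., scikit-image) rather than re-derived.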


Sketching the Heat Kernel: Using Gaussian Processes to Embed Data

arXiv.org Machine Learning

This paper introduces a novel, non-deterministic method for embedding data in low-dimensional Euclidean space based on computing realizations of a Gaussian process that depends on the geometry of the data. This type of embedding first appeared in (Adler et al., 2018) as a theoretical model for a generic manifold in high dimensions. In particular, we take the covariance function of the Gaussian process to be the heat kernel, and computing the embedding amounts to sketching a matrix representing the heat kernel. The Karhunen-Loève expansion reveals that straight-line distances in the embedding approximate the diffusion distance in a probabilistic sense, avoiding the need for sharp cutoffs and maintaining some of the smaller-scale structure. A further advantage of our method is its robustness to outliers. We justify the approach with both theory and experiments.
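As a rough, hedged sketch of the general recipe described above (not the authors' implementation), one can build a graph Laplacian on the data, use the heat kernel exp(-tL) as the covariance of a Gaussian process, and take a few independent realizations of that process as embedding coordinates. The neighborhood size, diffusion time, and embedding dimension below are illustrative choices, and the dense expm call stands in for the matrix sketching the paper actually uses.

```python
import numpy as np
from scipy.linalg import expm
from scipy.spatial.distance import cdist

def heat_kernel_embedding(X, t=1.0, k=10, dim=3, seed=None):
    """Embed the rows of X by sampling `dim` realizations of a Gaussian
    process whose covariance is the heat kernel exp(-t * L) of a
    k-nearest-neighbour graph Laplacian L built on the data."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    D = cdist(X, X)
    # Gaussian edge weights, truncated to the k nearest neighbours of each point.
    W = np.exp(-(D / np.median(D)) ** 2)
    far = np.argsort(D, axis=1)[:, k + 1:]        # everything beyond the k nearest
    for i in range(n):
        W[i, far[i]] = 0.0
    np.fill_diagonal(W, 0.0)
    W = np.maximum(W, W.T)                        # symmetrize
    L = np.diag(W.sum(axis=1)) - W                # combinatorial graph Laplacian
    K = expm(-t * L)                              # heat kernel = GP covariance
    K = (K + K.T) / 2.0                           # guard against numerical asymmetry
    # Each independent realization of the GP supplies one embedding coordinate.
    return rng.multivariate_normal(np.zeros(n), K, size=dim).T   # shape (n, dim)

# Toy usage: a noisy circle in the plane, embedded into R^3.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
circle = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * np.random.default_rng(1).normal(size=(200, 2))
Y = heat_kernel_embedding(circle, t=0.5, k=8, dim=3, seed=0)
print(Y.shape)   # (200, 3)
```

Because each coordinate is an independent centered Gaussian draw with covariance K, the expected squared distance between two embedded points is proportional to K(x,x) + K(y,y) - 2K(x,y), which is how straight-line distances in the embedding come to approximate diffusion distances in expectation.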


CA-PCA: Manifold Dimension Estimation, Adapted for Curvature

arXiv.org Machine Learning

Much of modern data analysis in high dimensions relies on the premise that data, while embedded in a high-dimensional space, lie on or near a submanifold of lower dimension. This allows one to embed the data in a space of lower dimension while preserving much of the essential structure, with benefits including faster computation and data visualization. This lower dimension, hereafter referred to as the intrinsic dimension (ID) of the underlying manifold, often enters as a parameter of the dimension-reduction scheme. For instance, in each of the Johnson-Lindenstrauss-type results for manifolds by [13] and [4], the target dimension depends on the ID. Furthermore, the ID is a parameter of popular dimension-reduction methods such as t-SNE [28] and multidimensional scaling [12, 16]. Therefore, it may be beneficial to estimate the ID before running further analysis: compressing the data too much may destroy the underlying structure, and re-running algorithms with a new dimension parameter may be computationally expensive, if such an error is even detectable.
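To make the abstract's motivation concrete, below is a hedged sketch of a plain local-PCA intrinsic-dimension estimator, a standard baseline and not the curvature-adapted CA-PCA estimator the title refers to. The neighborhood size k, the explained-variance threshold, and the toy swiss-roll-style data are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import NearestNeighbors

def local_pca_id(X, k=20, var_threshold=0.95):
    """Estimate intrinsic dimension as the median number of principal
    components needed to explain `var_threshold` of the local variance."""
    _, idx = NearestNeighbors(n_neighbors=k).fit(X).kneighbors(X)
    estimates = []
    for neighborhood in idx:
        ratios = PCA().fit(X[neighborhood]).explained_variance_ratio_
        # Smallest number of components whose cumulative variance reaches the threshold.
        estimates.append(int(np.searchsorted(np.cumsum(ratios), var_threshold) + 1))
    return int(np.median(estimates))

# Toy check: a 2-D swiss-roll-like surface embedded in R^3 should give an ID of about 2.
rng = np.random.default_rng(0)
t = 3 * np.pi * rng.random(2000)
X = np.c_[t * np.cos(t), 10 * rng.random(2000), t * np.sin(t)]
print(local_pca_id(X))
```

Curvature is precisely what such a flat-PCA baseline ignores: on a curved neighborhood some variance leaks into the normal directions, which can bias the estimate, and adapting PCA for curvature, as the title suggests, targets that effect.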