seafloor

How listening to light waves could prevent subsea cable sabotage

Popular Science

The lifeblood of global communication flows through more than 807,800 miles of garden-hose-wide cables woven across the seafloor. These cables, which reportedly carry over $10 trillion worth of financial transactions every day, are vulnerable to extreme weather, decay, and, if recent reports are to be believed, acts of sabotage. The Associated Press estimates that at least 11 cables have been damaged in the Baltic Sea alone since October 2023. Finnish and German authorities traced several of those incidents back to dragged anchors, which they allege may have been deliberately deployed to cause damage for political ends.


A Landmark-Aided Navigation Approach Using Side-Scan Sonar

Davenport, Ellen, Nguyen, Khoa, Jang, Junsu, Ma, Clair, Fish, Sean, Lenain, Luc, Meyer, Florian

arXiv.org Artificial Intelligence

Cost-effective localization methods for Autonomous Underwater Vehicle (AUV) navigation are key for ocean monitoring and data collection at high resolution in time and space. Algorithmic solutions suitable for real-time processing that handle nonlinear measurement models and different forms of measurement uncertainty will accelerate the development of field-ready technology. This paper details a Bayesian estimation method for landmark-aided navigation using a Side-scan Sonar (SSS) sensor. The method bounds navigation filter error in the GPS-denied undersea environment and captures the highly nonlinear nature of slant range measurements while remaining computationally tractable. Combining a novel measurement model with the chosen statistical framework facilitates the efficient use of SSS data and, in the future, could be used in real time. The proposed filter has two primary steps: a prediction step using an unscented transform and an update step utilizing particles. The update step performs probabilistic association of sonar detections with known landmarks. We evaluate algorithm performance and tractability using synthetic data and real data collected in field experiments. Field experiments were performed using two different marine robotic platforms with two different SSS sensors and at two different sites. Finally, we discuss the computational requirements of the proposed method and how it extends to real-time applications.
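The two-step structure the abstract describes, an unscented prediction followed by a particle-based update that associates sonar detections with landmarks, can be sketched in miniature. This is not the authors' filter: the 2D state, noise levels, single-beam slant-range model, and max-over-landmarks weighting below are all illustrative assumptions.

```python
import numpy as np

def sigma_points(mean, cov, kappa=1.0):
    """Generate 2n+1 sigma points for the unscented transform."""
    n = len(mean)
    S = np.linalg.cholesky((n + kappa) * cov)
    pts = [mean] + [mean + S[:, i] for i in range(n)] + [mean - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def predict(mean, cov, motion, Q, kappa=1.0):
    """Unscented prediction: propagate sigma points through the motion model."""
    pts, w = sigma_points(mean, cov, kappa)
    prop = np.array([motion(p) for p in pts])
    m = w @ prop
    d = prop - m
    P = d.T @ (d * w[:, None]) + Q
    return m, P

def slant_range(pos, landmark, altitude=10.0):
    """Slant range from vehicle position (x, y) to a seafloor landmark."""
    dx = pos[:2] - landmark
    return np.sqrt(dx @ dx + altitude ** 2)

def particle_update(mean, cov, z, landmarks, sigma_r=0.5, n_particles=500, seed=0):
    """Particle update: sample from the prediction, reweight each particle by
    the best-matching landmark's slant-range likelihood (a crude stand-in
    for probabilistic data association)."""
    rng = np.random.default_rng(seed)
    parts = rng.multivariate_normal(mean, cov, size=n_particles)
    liks = np.array([
        max(np.exp(-0.5 * ((z - slant_range(p, lm)) / sigma_r) ** 2)
            for lm in landmarks)
        for p in parts
    ])
    w = liks / liks.sum()
    m = w @ parts
    d = parts - m
    P = d.T @ (d * w[:, None])
    return m, P
```

A real SSS filter would gate associations, handle clutter and missed detections, and run per ping; the max-over-landmarks weighting here only hints at the probabilistic association the paper performs.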


Watch a huge 'No Boys Allowed' shark slumber party

Popular Science

It appears that no boy sharks were invited to this gathering of sleeping female Port Jackson sharks (Heterodontus portusjacksoni) in Australia. The fish were spotted snuggled up along the seafloor at Beagle Marine Park in the central Bass Strait. "There were thousands of sharks tightly packed like a carpet spread across the seafloor," voyage leader and University of Tasmania quantitative marine spatial ecologist Jacquomo Monk said in a statement. "Port Jackson sharks grow to 1.65 meters [5.4 feet] in length and are found across southern Australia." Scientists supported by Australia's National Environmental Science Program were operating an underwater robot from the South Australian Research and Development Institute's research vessel MRV Ngerin when they spotted and recorded the gathering.


A 'history-changing' discovery: 3,000-year-old ship containing wine jugs found 56 miles off the Israeli coast by underwater robots shows ancient seafarers were more daring than previously thought

Daily Mail - Science & tech

An ancient ship containing hundreds of stunningly preserved wine jugs has been found on the floor of the Mediterranean. The 40-foot vessel, found 1 mile deep on the seafloor 56 miles from Israel's coast, dates back 3,300 years to the late Bronze Age, experts say. It's thought to be the oldest ship found this deep in the Med, as previous shipwrecks from this era never ventured this far from land. This suggests ancient seafarers were more capable of navigating the deep seas than historians previously thought. The ship likely sank either in a storm or after coming under attack by pirates, the discoverers believe.


NeuRSS: Enhancing AUV Localization and Bathymetric Mapping with Neural Rendering for Sidescan SLAM

Xie, Yiping, Zhang, Jun, Bore, Nils, Folkesson, John

arXiv.org Artificial Intelligence

Implicit neural representations and neural rendering have gained increasing attention for bathymetry estimation from sidescan sonar (SSS). These methods incorporate multiple observations of the same place from SSS data to constrain the elevation estimate, converging to a globally consistent bathymetric model. However, the quality and precision of the bathymetric estimate are limited by the positioning accuracy of the autonomous underwater vehicle (AUV) equipped with the sonar. The global positioning estimate of the AUV relying on dead reckoning (DR) has an unbounded error due to the absence of a geo-reference system like GPS underwater. To address this challenge, we propose in this letter a modern and scalable framework, NeuRSS, for SSS SLAM based on DR and loop closures (LCs) over large timescales, with an elevation prior provided by the bathymetric estimate using neural rendering from SSS. This framework is an iterative procedure that improves localization and bathymetric mapping. Initially, the bathymetry estimated from SSS using the DR estimate, though crude, can provide an important elevation prior in the nonlinear least-squares (NLS) optimization that estimates the relative pose between two loop-closure vertices in a pose graph. Subsequently, the global pose estimate from the SLAM component improves the positioning estimate of the vehicle, thus improving the bathymetry estimation. We validate our localization and mapping approach on two large surveys collected with a surface vessel and an AUV, respectively. We evaluate the localization results against the ground truth and compare the bathymetry estimation against data collected with multibeam echo sounders (MBES).


Terrain characterisation for online adaptability of automated sonar processing: Lessons learnt from operationally applying ATR to sidescan sonar in MCM applications

Guerneve, Thomas, Loizou, Stephanos, Munafo, Andrea, Mignotte, Pierre-Yves

arXiv.org Artificial Intelligence

The performance of Automated Target Recognition (ATR) algorithms on side-scan sonar imagery has been shown to degrade rapidly when deployed in non-benign environments. Complex seafloors and acoustic artefacts constitute distractors in the form of strong textural patterns, creating false detections or preventing detections of true objects. This paper presents two online seafloor characterisation techniques to improve explainability during Autonomous Underwater Vehicle (AUV) missions. Importantly, and as opposed to previous work in the domain, these techniques are not model-based and require limited input from human operators, making them suitable for real-time onboard processing. Both techniques rely on an unsupervised machine learning approach to extract terrain features that relate to the human understanding of terrain complexity. The first technique provides a quantitative, application-driven terrain characterisation metric based on the performance of an ATR algorithm. The second provides a way to incorporate subject-matter expertise, enabling contextualisation and explainability in support of scenario-dependent, subjective terrain characterisation. The terrain complexity matches the expectations of seasoned users, making these tools desirable and trustworthy in comparison to traditional unsupervised approaches. We finally detail an application of these techniques to repair a Mine Countermeasures (MCM) mission carried out with SeeByte's autonomy framework, Neptune.
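Unsupervised terrain characterisation of the kind described can be illustrated with a toy pipeline: simple per-patch texture statistics clustered without labels. This hypothetical minimal version (invented features, k-means with farthest-point initialisation) is not the paper's method, just a sketch of the idea:

```python
import numpy as np

def patch_features(img, size=8):
    """Simple texture features (mean, std, gradient energy) per square patch,
    scanned row-major across the image."""
    h, w = img.shape
    feats = []
    for i in range(0, h - size + 1, size):
        for j in range(0, w - size + 1, size):
            p = img[i:i + size, j:j + size].astype(float)
            gy, gx = np.gradient(p)
            feats.append([p.mean(), p.std(), np.mean(gx ** 2 + gy ** 2)])
    return np.array(feats)

def kmeans(X, k=2, iters=20):
    """k-means with farthest-point initialisation (deterministic)."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min(((X[:, None] - np.array(centers)[None]) ** 2).sum(-1), axis=1)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = X[labels == c].mean(0)
    return labels, centers
```

Since the three feature scales differ by orders of magnitude, the features should be standardised (zero mean, unit variance per column) before clustering; the resulting cluster labels then serve as a crude "benign vs. complex" terrain map.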


Bridging the Preference Gap between Retrievers and LLMs

Ke, Zixuan, Kong, Weize, Li, Cheng, Zhang, Mingyang, Mei, Qiaozhu, Bendersky, Michael

arXiv.org Artificial Intelligence

Large Language Models (LLMs) have demonstrated superior results across a wide range of tasks, while retrieval has long been established as an effective means of obtaining task-relevant information for humans. Retrieval-Augmented Generation (RAG) is known for its effectiveness in knowledge-intensive tasks, locating relevant information and placing it within the context window of the LLM. However, the relationship between retrievers and LLMs is still under-investigated. Most existing work treats the retriever and the LLM as independent components, leaving a gap between retrieving human-friendly information and assembling an LLM-friendly context. In this work, we examine a novel bridge model, validate the ranking and selection assumptions of retrievers in the context of RAG, and propose a training framework that chains together supervised and reinforcement learning to learn a bridge model. Empirical results demonstrate the effectiveness of our method in both question-answering and personalized generation tasks.


A Fully-automatic Side-scan Sonar SLAM Framework

Zhang, Jun, Xie, Yiping, Ling, Li, Folkesson, John

arXiv.org Artificial Intelligence

Side-scan sonar (SSS) is a lightweight acoustic sensor that is frequently deployed on autonomous underwater vehicles (AUVs) to provide high-resolution seafloor images. However, using side-scan images to perform simultaneous localization and mapping (SLAM) remains a challenge when there is a lack of 3D bathymetric information and discriminant features in the side-scan images. To tackle this, we propose a feature-based SLAM framework using side-scan sonar, which is able to automatically detect and robustly match keypoints between paired side-scan images. We then use the detected correspondences as constraints to optimize the AUV pose trajectory. The proposed method is evaluated on real data collected by a Hugin AUV, using as a ground-truth reference both manually annotated keypoints and a 3D bathymetry mesh from a multibeam echosounder (MBES). Experimental results demonstrate that our approach is able to reduce drift from the dead-reckoning system. The framework is made publicly available for the benefit of the community.
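The core of the pipeline described, keypoint correspondences between side-scan images used as constraints on the pose trajectory, reduces in its simplest form to pose-graph least squares. The sketch below is not the paper's framework: it optimizes 2D positions only (no heading), and the odometry, loop-closure offsets, and weights are invented for illustration.

```python
import numpy as np

def optimize_trajectory(odometry, loop_closures, odom_weight=1.0, lc_weight=10.0):
    """Least-squares pose-graph optimisation over 2D positions.

    odometry:      list of (dx, dy) relative motions (n edges -> n+1 poses)
    loop_closures: list of (i, j, (dx, dy)) measured offsets pose_j - pose_i,
                   e.g. derived from matched side-scan keypoints
    Pose 0 is anchored at the origin."""
    n = len(odometry) + 1
    rows, rhs = [], []

    def add_edge(i, j, d, wgt):
        # weighted residual: wgt * ((x_j - x_i) - d) for each coordinate
        for k in range(2):
            r = np.zeros(2 * n)
            r[2 * j + k], r[2 * i + k] = wgt, -wgt
            rows.append(r)
            rhs.append(wgt * d[k])

    for i, d in enumerate(odometry):
        add_edge(i, i + 1, d, odom_weight)
    for i, j, d in loop_closures:
        add_edge(i, j, d, lc_weight)
    for k in range(2):  # anchor pose 0 at the origin
        r = np.zeros(2 * n)
        r[k] = 100.0
        rows.append(r)
        rhs.append(0.0)
    x, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return x.reshape(n, 2)
```

Given a drifted square trajectory and a single loop closure saying the vehicle returned to its start, the solver redistributes the dead-reckoning drift along the whole trajectory rather than leaving it concentrated at the end, which is the effect the paper exploits at full scale.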


Seafloor Classification based on an AUV Based Sub-bottom Acoustic Probe Data for Mn-crust survey

Neettiyath, Umesh, Sugimatsu, Harumi, Thornton, Blair

arXiv.org Artificial Intelligence

The possibility of automatically classifying high-frequency sub-bottom acoustic reflections collected by an autonomous underwater robot is investigated in this paper. In field surveys of cobalt-rich manganese crusts (Mn-crusts), existing methods rely on visual confirmation of the seafloor from images and thickness measurements using the sub-bottom probe. Using these visual classification results as ground truth, an autoencoder is trained to extract latent features from bundled acoustic reflections. A Support Vector Machine classifier is then trained on the latent space to identify seafloor classes. Results from data collected at 1500 m deep Mn-crust regions showed an accuracy of about 70%.
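The described pipeline, an autoencoder for latent features followed by an SVM on the latent space, can be illustrated with a toy version: a linear autoencoder trained by gradient descent and a hinge-loss linear SVM, both in plain NumPy. The data, dimensions, and hyperparameters are invented; the paper's actual networks and acoustic features are not reproduced here.

```python
import numpy as np

def train_autoencoder(X, latent_dim=2, lr=0.2, iters=1000, seed=0):
    """Linear autoencoder trained by full-batch gradient descent on MSE.
    Returns the encoder matrix (latent_dim x input_dim)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    We = rng.normal(scale=0.1, size=(latent_dim, d))  # encoder
    Wd = rng.normal(scale=0.1, size=(d, latent_dim))  # decoder
    for _ in range(iters):
        Z = X @ We.T       # latent codes
        E = Z @ Wd.T - X   # reconstruction error
        gWd = (E.T @ Z) / n
        gWe = ((E @ Wd).T @ X) / n
        Wd -= lr * gWd
        We -= lr * gWe
    return We

def train_linear_svm(Z, y, lam=0.01, lr=0.01, epochs=100, seed=0):
    """Linear SVM fit by stochastic sub-gradient descent on the hinge loss.
    Labels y must be +1 / -1."""
    rng = np.random.default_rng(seed)
    w = np.zeros(Z.shape[1])
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(Z)):
            if y[i] * (Z[i] @ w + b) < 1:  # margin violated
                w += lr * (y[i] * Z[i] - lam * w)
                b += lr * y[i]
            else:
                w -= lr * lam * w
    return w, b
```

On synthetic two-class "reflection" vectors, the encoder compresses the input to a low-dimensional code and the SVM separates the classes in that latent space, mirroring the structure (though not the scale or accuracy) of the paper's experiment.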


CUREE: A Curious Underwater Robot for Ecosystem Exploration

Girdhar, Yogesh, McGuire, Nathan, Cai, Levi, Jamieson, Stewart, McCammon, Seth, Claus, Brian, Soucie, John E. San, Todd, Jessica E., Mooney, T. Aran

arXiv.org Artificial Intelligence

The current approach to exploring and monitoring complex underwater ecosystems, such as coral reefs, is to conduct surveys using diver-held or static cameras, or to deploy sensor buoys. These approaches often fail to capture the full variation and complexity of interactions between different reef organisms and their habitat. The CUREE platform presented in this paper provides a unique set of capabilities in the form of robot behaviors and perception algorithms that enable scientists to explore different aspects of an ecosystem. Examples of these capabilities include low-altitude visual surveys, soundscape surveys, habitat characterization, and animal following. We demonstrate these capabilities by describing two field deployments on coral reefs in the US Virgin Islands. In the first deployment, we show that CUREE can identify the preferred habitat type of snapping shrimp in a reef through a combination of a visual survey, habitat characterization, and a soundscape survey. In the second deployment, we demonstrate CUREE's ability to follow arbitrary animals by separately following a barracuda and a stingray for several minutes each, in midwater and benthic environments, respectively.