Red Sea


Flexible and efficient spatial extremes emulation via variational autoencoders

arXiv.org Machine Learning

Many real-world processes have complex tail dependence structures that cannot be characterized using classical Gaussian processes. More flexible spatial extremes models exhibit appealing extremal dependence properties but are often prohibitively expensive to fit and simulate from in high dimensions. In this paper, we develop a new spatial extremes model that has flexible and non-stationary dependence properties, and we integrate it in the encoding-decoding structure of a variational autoencoder (XVAE), whose parameters are estimated via variational Bayes combined with deep learning. The XVAE can be used as a spatio-temporal emulator that characterizes the distribution of potential mechanistic model output states and produces outputs that have the same statistical properties as the inputs, especially in the tail. As an aside, our approach also provides a novel way of making fast inference with complex extreme-value processes. Through extensive simulation studies, we show that our XVAE is substantially more time-efficient than traditional Bayesian inference while also outperforming many spatial extremes models with a stationary dependence structure. To further demonstrate the computational power of the XVAE, we analyze a high-resolution satellite-derived dataset of sea surface temperature in the Red Sea, which includes 30 years of daily measurements at 16,703 grid cells. We find that the extremal dependence strength is weaker in the interior of the Red Sea and that it has decreased slightly over time.
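The variational-Bayes machinery the abstract describes rests on the standard VAE objective: an encoder maps data to an approximate posterior over latents, a sample is drawn via the reparameterization trick, and a decoder reconstructs the input, with training maximizing the ELBO (reconstruction term minus a KL penalty). The following is a minimal NumPy sketch of one such forward pass with a Gaussian likelihood and linear encoder/decoder; the toy data, weight matrices, and dimensions are all hypothetical stand-ins, not the paper's XVAE architecture or its extreme-value likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "spatial" field: one observation at n_sites locations (hypothetical data).
n_sites, latent_dim = 16, 4
x = rng.standard_normal(n_sites)

# Hypothetical linear encoder/decoder weights (a real XVAE uses deep networks).
W_mu = rng.standard_normal((latent_dim, n_sites)) * 0.1
W_logvar = rng.standard_normal((latent_dim, n_sites)) * 0.1
W_dec = rng.standard_normal((n_sites, latent_dim)) * 0.1

# Encoder: amortized variational posterior q(z|x) = N(mu, diag(exp(logvar))).
mu = W_mu @ x
logvar = W_logvar @ x

# Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I),
# so gradients can flow through the sampling step during training.
eps = rng.standard_normal(latent_dim)
z = mu + np.exp(0.5 * logvar) * eps

# Decoder: reconstruct the field from the latent sample.
x_hat = W_dec @ z

# ELBO = E_q[log p(x|z)] - KL(q(z|x) || p(z)), with prior p(z) = N(0, I)
# and a Gaussian likelihood (constants dropped).
recon = -0.5 * np.sum((x - x_hat) ** 2)
kl = 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)
elbo = recon - kl
print(f"reconstruction={recon:.3f}  KL={kl:.3f}  ELBO={elbo:.3f}")
```

Once trained, sampling the prior and pushing draws through the decoder is what turns such a model into the fast emulator the abstract describes.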


A.R. Rahman, Shekhar Kapur Talk Metaverse, VR, AI at Goa Festival – Variety

#artificialintelligence

That machines can never replace human creativity, and that technology should be in mankind's service, were the biggest takeaways from a heavyweight panel looking to the future of content at the International Film Festival of India, Goa, on Sunday. The panel was devised and led by eminent filmmaker Shekhar Kapur (Red Sea Film Festival opener "What's Love Got to Do with It?"). Participants included Oscar-winning "Slumdog Millionaire" composer A.R. Rahman and Ronald Menzel, co-founder and chief strategy officer at Dreamscape Immersive, with tech maven Pranav Mistry, formerly CEO and president of Samsung Technology and Advanced Research, joining via video link. The panelists discussed the concept of the metaverse, which is still in its nascency. Mistry envisaged a future powered by VR, AR and AI where the audience participates in an MCU movie and solves world problems.


A robotic avatar for deep-sea exploration

VideoLectures.NET

The promise of oceanic discovery has intrigued scientists and explorers, whether to study underwater ecology and climate change, or to uncover natural resources and historic secrets buried deep at archaeological sites. To meet the challenge of accessing oceanic depths, Stanford University, working with KAUST's Red Sea Research Center and MEKA Robotics, developed Ocean One, a bimanual force-controlled humanoid robot that affords immediate and intuitive haptic interaction in oceanic environments.


Casetext raises $12 million for legal research assistant CARA

#artificialintelligence

Legal research company Casetext has raised $12 million in a new round of funding. The money will be used to expand its software platform that offers insights into cases cited in legal documents and further develop CARA (Case Analysis Research Assistant), an AI-powered assistant for lawyers. Using natural language understanding, Casetext scans the text of legal briefs to locate and analyze case citations. The company also offers access to 10 million court cases and statutes annotated by a community of litigators. The $12 million funding round was led by Canvas Ventures, with participation from Union Square Ventures, 8VC, and Red Sea Ventures.


'Squishy Finger' Soft Robot Hands Allow Sampling of Delicate Corals

National Geographic

Their squishy robotic hands can gather coral samples more delicately than conventional rigid robot grippers, and in places humans can't reach. Developed with support from a National Geographic Innovation Challenge Grant, the hands were first tested in tanks in March 2015 and then taken to the Red Sea in May. After a successful expedition, Wood and Gruber hope the technology may have even broader applications.


Stanford's humanoid robot diver explores its first shipwreck

Engadget

Stanford's five-foot "virtual diver" was originally built for studying coral reefs in the Red Sea, where a delicate touch is necessary, but its depth rating goes well beyond the range of human divers. The "tail" section contains the merbot's onboard batteries, computers and array of eight thrusters, but it is the front half that looks distinctly humanoid, with two eyes for stereoscopic vision and two nimble, articulated arms. Those arms are what make OceanOne ideal for fragile reef environments or priceless shipwrecks like La Lune, which sank off the coast of France over 350 years ago and hadn't been touched until now. Force sensors in each wrist transmit haptic feedback to the pilot, allowing them to feel an object's weight while staying high and dry on a dive ship. The robot's "brain" works with the tactile sensors to ensure the hands don't crush fragile objects, while the navigation system can automatically keep the body steady in turbulent seas.