If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Accurately predicting weather patterns has always been a challenge for meteorologists. The difficulty lies in observing and processing vast amounts of data: most weather stations could never collect, process, and store so much information on their own. With traditional models, systems have to read huge data sets from several weather stations, which can take many hours to produce an accurate forecast. The good news is that artificial intelligence (AI) has been able to outperform these traditional methods, streamlining the weather forecasting process while improving the accuracy and reliability of weather reports. The amount of data received from global sensors, weather stations, satellites, and radar is staggering: estimates often run well beyond trillions of data points, and the volume is expected to continue to grow.
Relying on last year's weather to predict this year's power outages is an increasingly risky proposition. Climate change is shifting weather patterns in every region, increasing the frequency and severity of storms, wind, and drought. For example, in the wake of the recent tropical storm Isaias, Con Edison suffered its second-largest outage ever, mainly due to damage from trees in high winds. According to Con Ed: "The storm's gusting winds shoved trees and branches onto power lines, bringing those lines and other equipment down and leaving 257,000 customers out of power. The destruction surpassed Hurricane Irene, which caused 204,000 customer outages in August 2011."
Insurance is an industry that thrives on predictability. The more certain the outcome, the more insurance firms can be sure to offer fair rates and generate value for customers and shareholders alike. As such, it's an industry that has been slow to adopt new technologies and adapt to global change. Today, however, change is here, and more is on the way. Global megatrends, from the imminent arrival of the self-driving car to accelerating climate change, threaten to disrupt the insurance sector in a way that's never been seen before.
Rice University engineers have created a deep learning computer system that taught itself to accurately predict extreme weather events, like heat waves, up to five days in advance using minimal information about current weather conditions. Ironically, Rice's self-learning "capsule neural network" uses an analog method of weather forecasting that computers made obsolete in the 1950s. During training, it examines hundreds of pairs of maps. Each map shows surface temperatures and air pressures at an altitude of five kilometers, and each pair shows those conditions several days apart. The training includes scenarios that produced extreme weather -- extended hot and cold spells that can lead to deadly heat waves and winter storms.
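The classical analog method that this work revives can be sketched as a nearest-neighbor search over historical weather maps. The following is a minimal toy illustration, not Rice's capsule network: the flat-list map layout, the choice of k, and the majority vote are all assumptions made for the example.

```python
def analog_forecast(current_map, past_maps, outcomes, k=5):
    """Toy analog forecast of extreme weather.

    current_map: flat list of grid values (e.g. surface-temperature and
                 upper-air-pressure fields concatenated) -- a hypothetical layout.
    past_maps:   list of historical maps in the same flat layout.
    outcomes:    list of 0/1 flags -- did extreme weather follow that map?

    Returns True if a majority of the k most similar historical "analogs"
    were followed by an extreme event.
    """
    def sq_dist(a, b):
        # Squared Euclidean distance between two flattened maps
        return sum((x - y) ** 2 for x, y in zip(a, b))

    # Rank the historical maps by similarity to today's conditions
    ranked = sorted(range(len(past_maps)),
                    key=lambda i: sq_dist(current_map, past_maps[i]))
    nearest = ranked[:k]
    # Majority vote among the k closest analogs
    return sum(outcomes[i] for i in nearest) * 2 > k
```

The deep-learning version learns a far richer similarity notion than raw Euclidean distance, which is what lets it generalize from only hundreds of map pairs.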
Deep-learning software may help scientists predict extreme weather patterns more accurately than relying on today's weather prediction models alone. Simulations involving complex differential equations are run on supercomputers to predict the weather. The accuracy of forecasts using this approach has improved over time, though it is still tricky to pinpoint extreme events like cold spells or heat waves. "It may be that we need faster supercomputers to solve the governing equations of the numerical weather prediction models at higher resolutions," Pedram Hassanzadeh, an assistant professor in Rice University's Department of Mechanical Engineering in the United States, said on Tuesday. "But because we don't fully understand the physics and precursor conditions of extreme-causing weather patterns, it's also possible that the equations aren't fully accurate, and they won't produce better forecasts, no matter how much computing power we put in." Here's where AI may come in handy.
Google is throwing the power of its AI and machine-learning algorithms behind developing faster and more accurate weather forecasts. In a blog post, Google describes a new model developed by the company, called 'nowcasting', which it says has shown initial success in accurately predicting weather patterns with 'nearly instantaneous' results. According to a new paper, the method is able to produce forecasts for up to six hours in advance in only five to 10 minutes - figures that it says outperform traditional models even at this early stage. While some traditional forecasts generate massive amounts of data, they can also take hours to complete. 'A significant advantage of machine learning is that inference is computationally cheap given an already-trained model, allowing forecasts that are nearly instantaneous and in the native high resolution of the input data,' Google writes.
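Google's learned model is far more sophisticated, but the basic shape of nowcasting can be illustrated with a classical advection baseline: estimate how the precipitation field moved between two radar frames, then extrapolate that motion forward. The sketch below is an illustrative assumption, not Google's method; the grid layout, search range, and shift-and-compare motion estimate are all simplifications.

```python
def shift(grid, dy, dx, fill=0.0):
    """Translate a 2D grid by (dy, dx), padding exposed cells with `fill`."""
    h, w = len(grid), len(grid[0])
    return [[grid[y - dy][x - dx] if 0 <= y - dy < h and 0 <= x - dx < w else fill
             for x in range(w)] for y in range(h)]

def estimate_motion(prev, now, max_shift=2):
    """Find the integer (dy, dx) that best maps the previous radar frame
    onto the current one (exhaustive search over a small window)."""
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            moved = shift(prev, dy, dx)
            err = sum((a - b) ** 2
                      for row_m, row_n in zip(moved, now)
                      for a, b in zip(row_m, row_n))
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

def nowcast(prev, now, steps=1):
    """Extrapolate the latest frame forward `steps` time steps by advection."""
    dy, dx = estimate_motion(prev, now)
    forecast = now
    for _ in range(steps):
        forecast = shift(forecast, dy, dx)
    return forecast
```

Even this crude baseline runs in milliseconds once the motion is estimated, which is the point Google makes about inference cost: the expensive work happens once, during training, not at forecast time.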
Launched in January this year, the Maha Agri Tech project seeks to use technology to address various cultivation risks ranging from poor rains to pest attacks, to accurately predict crop-wise and area-wise yield, and eventually to use this data to inform policy decisions including pricing, warehousing and crop insurance. When farmers in six districts of Maharashtra begin sowing for the coming rabi season, the project will enter its second phase, in which artificial intelligence and satellite imagery will be used to mitigate risks. The fields of participating farmers will be monitored via satellite images at every stage, right up until the harvest. In its first phase, the Maha Agri Tech project used satellite images and analysis from the Maharashtra Remote Sensing Application Centre (MRSAC) and the National Remote Sensing Centre (NRSC) in Hyderabad to assess the acreage and condition of select crops in select talukas. In its second phase, various data sets from diverse providers will be combined to build yield models and a geospatial database of soil nutrients, rainfall, moisture stress and other parameters, facilitating location-specific advisories to farmers.
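In its simplest possible form, a location-specific advisory built on such a combined geospatial database could be a set of threshold rules over per-field records. The field names and thresholds below are entirely hypothetical, chosen only to illustrate the idea; they are not the project's actual parameters.

```python
def advisory(record):
    """Issue advisory notes for one field's combined data record.

    `record` is a dict with hypothetical keys:
      'rainfall_mm'     -- recent rainfall for the location
      'soil_n'          -- soil nitrogen reading
      'moisture_stress' -- stress index in [0, 1]
    """
    notes = []
    if record["rainfall_mm"] < 50:          # illustrative threshold
        notes.append("delay sowing: rainfall below threshold")
    if record["moisture_stress"] > 0.7:     # illustrative threshold
        notes.append("irrigate: high moisture stress")
    if record["soil_n"] < 20:               # illustrative threshold
        notes.append("apply nitrogen fertilizer")
    return notes or ["conditions normal"]
```

A production system would replace these hand-set thresholds with models trained on the satellite and ground data the project is assembling, but the output reaching the farmer is still a short, location-specific message of this kind.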
Scientists have extracted a 400-year record of El Niño events using coral reef cores drilled from the Pacific Ocean, revealing crucial new insight into how these weather patterns have changed. And the data so far suggest something 'unusual' has been happening in recent decades. According to the new research, El Niño events appear to be cropping up more frequently in the central Pacific than they have in past centuries, while eastern El Niños may be getting stronger. El Niño is caused by a shift in the distribution of warm water in the Pacific Ocean around the equator. Usually the wind blows strongly from east to west, due to the rotation of the Earth, causing water to pile up in the western part of the Pacific.
Weather forecasting is one of the marvels of human innovation, but artificial intelligence (AI) now offers us tough competition. The days of speculating about rain and sunshine may soon fade, given AI's capability to predict conditions with considerable precision. Such forecasting is one of the basic aspects of precision agriculture (PA), promoted even by the government to boost productivity and, in turn, farmers' income. AI-based sowing advisories have led to 30 per cent higher yields: Microsoft, in collaboration with ICRISAT, developed an AI Sowing App powered by the Microsoft Cortana Intelligence Suite, including Machine Learning and Power BI. The app sends participating farmers advisories on the optimal date to sow, without requiring them to install any sensors in their fields or bear any additional cost; all they need is a phone capable of receiving text messages.
Along America's west coast, the world's most valuable companies are racing to make artificial intelligence smarter. Google and Facebook have boasted of experiments using billions of photos and thousands of high-powered processors. But late last year, a project in eastern Tennessee quietly exceeded the scale of any corporate AI lab. It was run by the US government. The record-setting project involved the world's most powerful supercomputer, Summit, at Oak Ridge National Lab.