Data Science


Lockheed Martin and NEC to Enhance Satellites, Space Travel with Artificial Intelligence

#artificialintelligence

TOKYO, Dec 14, 2017 - (JCN Newswire) - Lockheed Martin and NEC Corporation (TSE: 6701) today announced that Lockheed Martin will use NEC's System Invariant Analysis Technology (SIAT) in the space domain. SIAT's advanced analytics engine uses data collected from sensors to learn the behavior of systems, including computer systems, power plants, factories and buildings, enabling the system itself to automatically detect inconsistencies and prescribe resolutions. NEC's advanced Artificial Intelligence (AI) capabilities and Lockheed Martin's space domain expertise offer new opportunities in developing enhanced integrated satellite and spacecraft operations with uniquely developed prescriptive analytics. These include rapid assessments of changes in performance and the space environment, such as the potential influence of space weather on electronics. With this information, operators can improve product performance and lifecycle efficiency.
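
To make the idea concrete, here is a minimal sketch of the invariant-analysis approach suggested by the technology's name: learn stable pairwise relationships between sensor streams from healthy training data, then flag samples that violate them. This is a toy illustration, not NEC's SIAT implementation; the function names, R-squared cutoff and residual threshold are all illustrative assumptions.

```python
import numpy as np

def fit_invariants(train, r2_min=0.95):
    """Learn pairwise linear 'invariants' y ~= a*x + b between sensor
    columns of healthy training data; keep only strongly correlated pairs."""
    n_sensors = train.shape[1]
    invariants = {}
    for i in range(n_sensors):
        for j in range(n_sensors):
            if i == j:
                continue
            a, b = np.polyfit(train[:, i], train[:, j], 1)
            resid = train[:, j] - (a * train[:, i] + b)
            ss_tot = np.sum((train[:, j] - train[:, j].mean()) ** 2)
            if ss_tot == 0:
                continue  # constant sensor, no usable relationship
            r2 = 1.0 - np.sum(resid ** 2) / ss_tot
            if r2 >= r2_min:
                invariants[(i, j)] = (a, b, resid.std())
    return invariants

def broken_invariants(sample, invariants, k=4.0):
    """Report sensor pairs whose learned relationship a new sample violates."""
    return [(i, j) for (i, j), (a, b, s) in invariants.items()
            if abs(sample[j] - (a * sample[i] + b)) > k * s]
```

A spike in broken invariants points an operator at the subsystem whose behavior has drifted, which is the kind of rapid assessment the announcement describes.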


Lockheed selects NEC artificial intelligence software to study space data - SpaceNews.com

#artificialintelligence

The United States' largest military contractor, Lockheed Martin Corp., announced Wednesday that it will start using artificial intelligence software from NEC to analyze data collected by sensors in space. Intelligent machines are taking the technology world by storm and have started to move into outer space. One of the most advantageous uses of smart software is analyzing data, and governments and industries increasingly see AI as the answer to the big-data deluge, much of it coming from space. "AI can revolutionize how we use information from space, both in orbit and on deep space missions, including crewed missions to Mars and beyond," said Carl Marchetto, vice president of new ventures at Lockheed Martin Space, based in Denver, Colorado. NEC Corporation is a global information technology firm headquartered in Tokyo, Japan.


AWS Announces Five New Machine Learning Services and the World's First Deep Learning-Enabled Video Camera for Developers

#artificialintelligence

Amazon SageMaker is a fully managed service that lets developers and data scientists quickly build, train, deploy, and manage their own machine learning models. AWS also introduced AWS DeepLens, a deep learning-enabled wireless video camera that can run real-time computer vision models to give developers hands-on experience with machine learning. In addition, AWS announced four new application services that allow developers to build applications that emulate human-like cognition: Amazon Transcribe for converting speech to text; Amazon Translate for translating text between languages; Amazon Comprehend for understanding natural language; and Amazon Rekognition Video, a new computer vision service for analyzing video in batch and in real time. Today, implementing machine learning is complex, involves a great deal of trial and error, and requires specialized skills. Developers and data scientists must first visualize, transform, and pre-process data to get it into a format an algorithm can use to train a model.
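
For a sense of how these application services are consumed, here is a short sketch using the boto3 SDK to call Amazon Comprehend and Amazon Translate. It assumes AWS credentials are already configured and that both services are available in the chosen region; the sample text is invented.

```python
import boto3

comprehend = boto3.client("comprehend", region_name="us-east-1")
translate = boto3.client("translate", region_name="us-east-1")

text = "The new camera exceeded our expectations."

# Amazon Comprehend: detect the sentiment of English text.
sentiment = comprehend.detect_sentiment(Text=text, LanguageCode="en")
print(sentiment["Sentiment"], sentiment["SentimentScore"])

# Amazon Translate: translate the same text into Japanese.
result = translate.translate_text(
    Text=text, SourceLanguageCode="en", TargetLanguageCode="ja"
)
print(result["TranslatedText"])
```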


Alteryx Helps Accelerate Deployment and Management of Machine Learning Models on Amazon Web Services

#artificialintelligence

Alteryx, Inc. (NYSE: AYX), a leader in self-service data analytics, today announced that customers can take advantage of the one-click, self-service predictive model deployment capabilities of the Alteryx solution on Amazon Web Services (AWS). Alteryx has also achieved AWS Machine Learning (ML) Competency status. The designation recognizes Alteryx for providing business analysts, data scientists and ML practitioners with automated, cutting-edge tools to create and deploy predictive models on AWS. Deployment of predictive models remains a major challenge for many companies: only 13% of data scientists surveyed by Rexer Analytics said their models always get deployed. Alteryx Promote was designed to help customers address the labor-intensive process of getting models into production by providing an end-to-end data science system for developing, deploying and managing predictive models and real-time decision APIs.
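
Products like Alteryx Promote automate the step of wrapping a trained model in a real-time decision API. As a rough illustration of what that pattern looks like when hand-rolled (this is not Alteryx's API; the endpoint, model file and framework choice are our assumptions), a minimal Flask service might be:

```python
import pickle
from flask import Flask, jsonify, request

app = Flask(__name__)

# Load a previously trained scikit-learn model; the path is illustrative.
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    # Expect JSON like {"features": [[5.1, 3.5, 1.4, 0.2]]}.
    features = request.get_json()["features"]
    return jsonify(predictions=model.predict(features).tolist())

if __name__ == "__main__":
    app.run(port=8080)
```

A client POSTs JSON feature vectors to /predict and gets predictions back; deployment products add the versioning, monitoring and scaling that this bare-bones pattern lacks.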


Open Data Science Conference 2018: The Future of AI is Here

#artificialintelligence

ODSC brings together some of the brightest minds shaping the future of AI and data science at conferences around the world, including in Boston, London, and San Francisco. Artificial intelligence and data science are poised to dominate the technology market over the next few years. Data scientists and AI experts are leveraging these technologies to make autonomous machines, conversational AI, machine vision and many related capabilities ubiquitous. Companies that move quickly to exploit the competitive advantages these technologies make possible are well positioned to lead their industries. However, unlike some technologies of the past, much of AI and data science does not lend itself easily to turnkey solutions.


Signals Marketplace Connects Traders With Data Scientists and Machine Learning Strategies - Bitsonline

@machinelearnbot

Bitcoin Press Release: Signals Network provides sophisticated machine learning algorithms to help cryptotraders build their investment strategies. November 22, 2017, Prague, Czech Republic -- Crypto trading strategies are about to become a lot smarter. Signals, a Prague-based startup, is building a platform to connect traders with data scientists. Signals will offer an interface where traders can assemble a machine learning-powered trading strategy with a few clicks. Signals aims to offer sophisticated machine learning algorithms to anyone, and its team wants to achieve that by building a network open to cryptotraders and data science developers.
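
Signals has not published its algorithms, but the basic shape of a machine-learning trading strategy is straightforward to sketch: engineer features from past price action and train a classifier to predict the next move. The example below is a generic illustration on synthetic data, not Signals' platform; the features, model choice and data are all assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic price series stands in for real market data.
rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 500)))
returns = np.diff(np.log(prices))

# Features: the five previous returns; label: sign of the next return.
X = np.lib.stride_tricks.sliding_window_view(returns[:-1], 5)
y = (returns[5:] > 0).astype(int)

split = int(0.8 * len(X))
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X[:split], y[:split])
print("directional accuracy:", model.score(X[split:], y[split:]))
```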


Domino Data Lab Achieves AWS Machine Learning Competency Status

#artificialintelligence

SAN FRANCISCO--(BUSINESS WIRE)--Domino Data Lab (Domino), a leading solution for data science acceleration, announced today that it has achieved Amazon Web Services (AWS) Machine Learning (ML) Competency status. This designation recognizes Domino for providing business analysts, data scientists and ML practitioners with automated, cutting-edge tools to create and deploy predictive models on AWS. Domino was already an Advanced Technology Partner in the AWS Partner Network (APN); the AWS ML Competency further differentiates it as an APN member that has built solutions to help organizations solve their data challenges, enable machine learning and data science workflows, or offer API-based capabilities that enhance end applications with machine intelligence. Attaining the AWS ML Competency demonstrates to customers that Domino has proven expertise in AI and ML on AWS. "We've found that data science workloads are perfectly suited for the cloud, and many mature data science organizations run Domino on AWS," said Scott Armstrong, head of business development at Domino.


H2O.ai Raises $40 Million to Democratize Artificial Intelligence for the Enterprise

#artificialintelligence

H2O.ai, the leading company bringing AI to enterprises, today announced it has completed a $40 million Series C round of funding led by Wells Fargo and NVIDIA, with participation from New York Life, Crane Venture Partners, Nexus Venture Partners and Transamerica Ventures, the corporate venture capital fund of Transamerica and Aegon Group. The Series C round brings H2O.ai's total funding to $75 million. The new investment will be used to further democratize advanced machine learning and to fund global expansion and innovation of Driverless AI, an automated machine learning and pipelining platform that uses "AI to do AI." H2O.ai continued its rapid growth in 2017, as evidenced by new platforms and partnerships. The company launched Driverless AI, a product that automates AI for non-technical users and introduces visualization and interpretability features that explain data modeling results in plain English, fostering further adoption of, and trust in, artificial intelligence.
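
Driverless AI itself is a commercial product, but H2O.ai's open-source h2o package exposes the same automated-ML idea through its H2OAutoML class, which searches algorithms and ensembles within a user-set budget. A minimal sketch follows; the file name and column names are illustrative, and this is the open-source API rather than Driverless AI.

```python
import h2o
from h2o.automl import H2OAutoML

h2o.init()  # starts a local H2O cluster (requires Java)

# File and column names are illustrative.
frame = h2o.import_file("train.csv")
target = "label"
features = [c for c in frame.columns if c != target]
frame[target] = frame[target].asfactor()  # classification target

# Search models and ensembles within a five-minute budget.
aml = H2OAutoML(max_runtime_secs=300, seed=1)
aml.train(x=features, y=target, training_frame=frame)

print(aml.leaderboard.head())
```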


Danske Bank and Teradata Implement AI Engine that Monitors Fraud in Real Time

#artificialintelligence

Teradata announced today that Danske Bank, a financial services leader in the Nordics, has worked with Think Big Analytics, a Teradata company, to create and launch a state-of-the-art, AI-driven fraud detection platform that is expected to achieve 100 percent ROI in its first year of production. The engine uses machine learning to analyze tens of thousands of latent features, scoring millions of online banking transactions in real time to provide actionable insight into both true and false fraudulent activity. By significantly reducing the cost of investigating false positives, Danske Bank increases its overall efficiency and is now poised for substantial savings. "Application fraud is a critical, top-of-the-agenda issue for banks, and there is evidence that criminals are becoming savvier by the day, employing sophisticated machine learning techniques to attack, so it's critical to use advanced techniques, such as machine learning, to catch them," said Nadeem Gulzar, Head of Advanced Analytics, Danske Bank. "The bank understands that fraud is set to get worse in the near- and long-term future due to the increased digitization of banking and the prevalence of mobile banking applications."
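
The article gives no implementation details, but the core pattern of such a platform, scoring each transaction with a learned fraud probability and triaging alerts by threshold to cut false-positive investigations, can be sketched as follows. The features, model and threshold here are synthetic illustrations, not Danske Bank's engine.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic stand-in for labeled historical transactions:
# eight illustrative features, roughly 1% fraud rate.
rng = np.random.default_rng(42)
X_train = rng.normal(size=(10_000, 8))
y_train = (rng.random(10_000) < 0.01).astype(int)

model = GradientBoostingClassifier().fit(X_train, y_train)

def score_transaction(features, investigate_above=0.9):
    """Return a fraud probability and a triage decision for one transaction;
    raising the threshold trades missed fraud for fewer false positives."""
    p = model.predict_proba(np.asarray(features).reshape(1, -1))[0, 1]
    return p, ("investigate" if p >= investigate_above else "pass")

print(score_transaction(rng.normal(size=8)))
```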


Deep learning and big data: Wall Street and the new data paradigm

#artificialintelligence

Wall Street is big business, and it is about to become even bigger with the rise of big data. Every investor dreams of knowing the direction of the market before it moves, which is why financial investment firms are driven to mine for data rather than gold in the information economy. Traditionally, investors have based their decisions on fundamentals, intuition, and analysis drawn from traditional data sources such as quarterly earnings reports, financial statement filings with the U.S. Securities and Exchange Commission (SEC), historical market data, institutional research reports and, sometimes, the so-called "expert networks." The new data-driven paradigm, fueled by alternative data sources, high-performance computing and predictive analytics, offers a more robust framework for generating data-driven investment theses. Data – from satellite images of areas of interest, automated drones, people-counting sensors, container ships' positions, credit card transaction data, jobs and layoffs reports, cell phones, social media, news articles, tweets, and online search queries – is now the most valuable commodity on Wall Street.