Big Data


Enterprise AI Goes Mainstream, but Maturity Must Wait - InformationWeek

#artificialintelligence

Artificial intelligence's emergence into the mainstream of enterprise computing raises significant issues -- strategic, cultural, and operational -- for businesses everywhere. What's clear is that enterprises have crossed a tipping point in their adoption of AI. A recent O'Reilly survey shows that AI is well on the road to ubiquity in businesses throughout the world. The key finding from the study was that there are now more AI-using enterprises -- in other words, those that have AI in production, revenue-generating apps -- than organizations that are simply evaluating AI. Taken together, organizations that have AI in production or in evaluation constitute 85% of companies surveyed.


Global Big Data Conference

#artificialintelligence

Job seekers interact more with advancing tech than they realize as more companies turn to automated tools in talent acquisition. The hiring process has come a long way from the days of paper resumés and cold calls via landline. Online job sites are now staples in talent acquisition, but artificial intelligence (AI) and machine learning are elevating the recruiting and hiring landscape. When asked about the current status of AI and machine learning in hiring, Mark Brandau, principal analyst on Forrester's CIO team, said, "All vendors are moving in that direction without question." The power of AI lies in its ability to process high volumes of data at fast speeds, improving efficiency and productivity for organizations. Those same features and benefits can also be applied to the hiring process. "As organizations look to AI and machine learning to enhance their practices, there are two goals in mind," said Lauren Smith, vice president of Gartner's HR practice. "The first is how do we drive more efficiency in the process?"


Making IoT Data Meaningful with AI-Powered Cognitive Computing

#artificialintelligence

Today, the world is all about Industry 4.0 and the technologies it has brought in. From Artificial Intelligence (AI) to Big Data Analytics, these technologies are transforming one industry or another in some way. AI-powered Cognitive Computing is one such technology that provides high-scale automation with ubiquitous connectivity. More so, it is redefining how IoT technology operates. The need for Cognitive Computing in the IoT emerges from the significance of information in present-day business.


Interview: How artificial intelligence will change medicine

#artificialintelligence

Question: You lead the "Scientific Data Management" research group at TIB – Leibniz Information Centre for Science and Technology. You focus your research on how big data technologies can be used in the health sector to improve health care. What exactly are you researching? The amount of available big data has grown drastically in the last decade, and an even faster growth rate is expected in the coming years. Specifically, in the biomedical domain, there is a wide variety of methods, e.g.


ThetaRay, Provider of Big Data and AI-enhanced Analytics Tools, Joins Microsoft's Partner Program to Offer AML Solution

#artificialintelligence

ThetaRay, a provider of Big Data and artificial intelligence (AI)-enhanced analytics tools, has joined Microsoft's (NASDAQ:MSFT) partner program, One Commercial Partner, which provides various cloud-powered solutions. ThetaRay's anti-money laundering (AML) solution for correspondent banking can be accessed through Microsoft's Azure Marketplace. A large US bank has reportedly signed an agreement to use the solution. "We are proud to join the One Commercial Partner program and offer Microsoft Azure customers access to our industry-leading AML for Correspondent Banking solution." "Global banks are increasingly de-risking or abandoning their correspondent banking relationships due to a lack of transparency and fears of money laundering and regulatory fines. Our solution provides banks with the … ability to reverse the trend and grow their business by allowing full visibility into all links of the cross-border payment chain, from originator to beneficiary."


Global Big Data Conference

#artificialintelligence

A challenge on the data science community site Kaggle is asking great minds to apply machine learning to battle the COVID-19 coronavirus pandemic. As COVID-19 continues to spread uncontrolled around the world, shops and restaurants have closed their doors, information workers have moved home, other businesses have shut down entirely, and people are social distancing and self-isolating to "flatten the curve." It's only been a few weeks, but it feels like forever. If you listen to the scientists, we have a way to go still before we can consider reopening and reconnecting. The worst is yet to come for many areas.


Global Big Data Conference

#artificialintelligence

The explosion of breakthroughs, investments, and entrepreneurial activity around artificial intelligence over the last decade has been driven exclusively by deep learning, a sophisticated statistical analysis technique for finding hidden patterns in large quantities of data. Artificial intelligence, a term coined in 1955, was applied (or mis-applied) to deep learning, a more advanced version of machine learning, an approach to training computers to perform certain tasks that got its name in 1959. The recent success of deep learning is the result of the increased availability of lots of data (big data) and the advent of Graphics Processing Units (GPUs), which significantly increased the breadth and depth of the data used for training computers and reduced the time required to train deep learning algorithms. The term "big data" first appeared in the computer science literature in an October 1997 article by Michael Cox and David Ellsworth, "Application-controlled demand paging for out-of-core visualization," published in the Proceedings of the IEEE 8th Conference on Visualization. They wrote that "Visualization provides an interesting challenge for computer systems: data sets are generally quite large, taxing the capacities of main memory, local disk, and even remote disk. We call this the problem of big data. When data sets do not fit in main memory (in core), or when they do not fit even on local disk, the most common solution is to acquire more resources."


Aarki Named Winner in 2020 Artificial Intelligence Excellence Awards

#artificialintelligence

The Business Intelligence Group announced that Aarki was named a winner in its Artificial Intelligence Excellence Awards program. Aarki is a leading AI-enabled mobile marketing platform that helps companies grow and re-engage their mobile users using machine learning (ML), big data, and engaging creative. Aarki is recognized for its advancement in prediction capabilities in mobile advertising. With its Pointwise Mutual Information (PMI) model, Aarki is able to effectively model Aarki-specific user conversion funnels while pre-training on non-attributed omnichannel event data. This allows for better calculation of users' purchase probability, improving return on investment (ROI) predictions, and of install probability, helping to decrease the cost per install (CPI).
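The excerpt does not describe how Aarki actually builds or trains its model, but pointwise mutual information itself is a standard measure of how much more often two events co-occur than chance would predict. The sketch below shows only that underlying formula, applied to hypothetical ad-event counts; the event names and numbers are illustrative assumptions, not Aarki data.

import math

def pmi(joint_count, count_x, count_y, total):
    """Pointwise mutual information: PMI(x, y) = log2(P(x, y) / (P(x) * P(y))).
    Positive values mean the two events co-occur more often than chance."""
    p_xy = joint_count / total
    p_x = count_x / total
    p_y = count_y / total
    return math.log2(p_xy / (p_x * p_y))

# Hypothetical counts (not Aarki data): out of 1,000,000 impressions,
# 50,000 users engaged with a creative, 8,000 installed the app, and 1,200 did both.
score = pmi(joint_count=1_200, count_x=50_000, count_y=8_000, total=1_000_000)
print(f"PMI(engaged, installed) = {score:.2f}")  # ~1.58, i.e. about 3x more likely than chance

In a conversion-funnel setting, event pairs with high PMI (for example, a creative engagement followed by an install) are the ones a probability model would weight most heavily; how Aarki combines such scores with pre-training on omnichannel data is not detailed in the excerpt.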


Primer: Demystifying Data Science - The New Stack

#artificialintelligence

This is the first part of a series by Levon Paradzhanyan that demystifies data science, machine learning, deep learning, and artificial intelligence while explaining how they all tie into one another. Artificial intelligence entered our lives many years ago, first as science fiction and today embedded in real products. It has since been followed by newer buzzwords such as data science, machine learning, and deep learning. Yet there are many misconceptions related to these terms.


When Artificial Intelligence Meets Big Data

#artificialintelligence

"Gone are the days of data engineers manually copying data around again and again, delivering datasets weeks after a data scientist requests it"-these are Steven Mih's words about the revolution that artificial intelligence is bringing about, in the scary world of big data. By the time the term "big data" was coined, data had already accumulated massively with no means of handling it properly. In 1880, the US Census Bureau estimated that it would take eight years to process the data it received in that year's census. The government body also predicted that it would take more than 10 years to process the data it would receive in the following decade. Fortunately, in 1881, Herman Hollerith created the Hollerith Tabulating Machine, inspired by a train conductor's punch card.