Big Data


Using Artificial Intelligence in Big Data

#artificialintelligence

Simply put, Artificial Intelligence (AI) is intelligence exhibited by machines, as opposed to the natural intelligence exhibited by human beings and animals; for this reason it is sometimes referred to as Machine Intelligence. Once taught, a machine can effectively perceive its environment and take actions that improve its chances of achieving its goals. How can a machine be taught? At its root, machine learning involves writing code or commands, in a programming language the machine understands, that let it learn from examples.
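To make "teaching" a machine concrete, here is a minimal sketch of one of the oldest learning rules, the perceptron, trained to reproduce the logical AND function from labelled examples. It is purely illustrative; the function names and data are made up for this example and do not come from any particular library.

```python
# Illustrative sketch: a perceptron "taught" the AND function
# from labelled (inputs, label) examples. Names are hypothetical.

def train_perceptron(samples, epochs=10, lr=1.0):
    """Adjust two weights and a bias whenever a prediction is wrong."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred          # 0 when correct, +/-1 when wrong
            w[0] += lr * err * x1       # nudge weights toward the answer
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Labelled examples of AND: output is 1 only when both inputs are 1.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
```

After training, `predict` reproduces the AND table it was shown: the machine was never told the rule explicitly, only corrected on examples.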


NLP: Some useful notes about Text Processing

#artificialintelligence

We often hear about machine learning, but it is important to know that other pipelines come before machine learning and play a significant role in the study of Big Data. Examples include ETL (extract, transform, and load) and NLP (natural language processing). Nowadays the NLP pipeline in particular is taking up more and more space in Data Science. So, what is Natural Language Processing? Natural Language Processing is a process that lets a Data Scientist or Data Analyst extract important information from human language. For example, with NLP it is possible to find important patterns by studying the text of posts or comments on a social network.
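The first steps of such a text-processing pipeline can be sketched with the standard library alone: normalise the text, tokenise it, and count word frequencies to surface patterns. The sample posts below are invented for illustration; real pipelines would add stop-word removal, stemming, and more.

```python
# Minimal text-processing sketch: normalise, tokenise, count.
# The sample posts are made up for illustration only.
import re
from collections import Counter

def tokenize(text):
    """Lowercase the text and keep only alphabetic word tokens."""
    return re.findall(r"[a-z']+", text.lower())

posts = [
    "Great delivery, arrived on time!",
    "Delivery was late again. Not great.",
    "On-time delivery and friendly support.",
]

tokens = [tok for post in posts for tok in tokenize(post)]
freq = Counter(tokens)

# The most frequent tokens hint at what users talk about most.
top = freq.most_common(2)
```

Here the word "delivery" dominates the counts, which is exactly the kind of simple pattern an analyst might then investigate further.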


How to make your data team efficient for times of crisis

#artificialintelligence

Times have changed and caught most of us unprepared. It has always been part of Bolt's culture to move quickly and adapt -- and the crisis unfolding due to the pandemic definitely requires significant adaptation. This is a look from inside Bolt's data team -- data analysts, data engineers, data scientists -- as we share our experience and advice for times of crisis with all the similar teams out there. Most resources are thrown into surviving and, for some, even into seizing new opportunities. Data teams definitely have a role to play in this.


Success in the next normal starts with employee education and reskilling

ZDNet

The first time I met Adam Rauh was earlier this year at the 2020 National Retail Federation (NRF) conference in New York. Adam is a lead solutions engineer at Tableau, a Salesforce company, and was at NRF to demonstrate some of the most advanced analytics use cases in the retail industry. Rauh and I did a quick video demonstration of how retailers can use analytics and machine learning to improve revenue, profitability, and customer experience. Our video had nearly 10K views at the conference. When data is visualized, everyone can take action. (@tableau)


Why businesses must prepare for hyper automation now

#artificialintelligence

Automation has been used for decades in a wide range of industries to boost efficiency and productivity, reduce waste, and ensure quality and safety. Emerging technologies such as Artificial Intelligence (AI), Natural Language Processing (NLP), and big data analytics are now being combined with automation to deal with more complex problems and bring further improvements to business processes. This convergence of automation and intelligence is known as hyper automation. Also known as cognitive or smart automation, hyper automation is at the forefront of the 4th Industrial Revolution and is gradually making its way into every aspect of business, delivering unprecedented results. A number of factors are driving the adoption of hyper automation among enterprises, including the ability to improve operational and service performance.


An Essential Component In Any Insurtech Solution Tech-stack - Suyati Technologies

#artificialintelligence

The insurance industry is well past the time when a timely response and a balanced price-quality relationship were enough to define customer experience. The advent of Artificial Intelligence, Machine Learning, and Advanced Analytics has disrupted the insurance industry and reshaped the way it operates. Insurtech firms these days are using their AI and ML capabilities to drive high-quality customer experiences, increase loyalty, and generate new revenue while simultaneously reducing costs. The vision of insurance firms, today and for the future, is one where customers and the customer experience come first. The combination of AI and ML models built on top of a Customer Data Platform leads to improved customer experience through hyper-personalization.


Global Big Data Conference

#artificialintelligence

A major marketing firm has turned to IBM Watson Studio, and its data, to create an interactive platform that predicts the risk, readiness, and recovery periods for counties hit by the coronavirus. Global digital marketing firm Wunderman Thompson launched its Risk, Readiness and Recovery map, an interactive platform that helps enterprises and governments make market-level decisions amid the coronavirus pandemic. The platform, released May 21, uses Wunderman Thompson's data, as well as machine learning technology from IBM Watson, to predict state and local government COVID-19 preparedness and estimated economic recovery timetables for businesses and governments. The idea for the Risk, Readiness and Recovery map, a free version of which is available on Wunderman Thompson's website, originated two months ago as the global pandemic accelerated, said Adam Woods, CTO at Wunderman Thompson Data. "We were looking at some of the visualizations that were coming in around COVID-19, and we were inspired to really say, let's look at the insight that we have and see if that can make a difference," Woods said.


Global Big Data Conference

#artificialintelligence

B2B software sales and marketing teams love hearing the term "artificial intelligence" (AI). AI has a smoke-and-mirrors effect: when we say "AI is doing this," our buyers often know so little about AI that they don't ask the hard questions. In industries like the DevTools space, it is crucial that buyers understand both what products do and what their limitations are, to ensure that these products meet their needs. If the purpose of AI is to make good decisions for humans, then to accept that "AI is doing this" is to accept that we don't really know how the product works or whether it is making good decisions for us.


Global Big Data Conference

#artificialintelligence

Last Tuesday, Google shared a blog post highlighting the perspectives of three women of color employees on fairness and machine learning. I suppose the comms team saw trouble coming: The next day NBC News broke the news that diversity initiatives at Google are being scrapped over concern about conservative backlash, according to eight current and former employees speaking on condition of anonymity. The news led members of the House Tech Accountability Caucus to send a letter to CEO Sundar Pichai on Monday. Citing Google's role as a leader in the U.S. tech community, the group of 10 Democrats questioned why, despite corporate commitments over years, Google diversity still lags behind the diversity of the population of the United States. The 10-member caucus specifically questioned whether Google employees working with AI receive additional bias training.


Announcing the First ODSC Europe 2020 Virtual Conference Speakers

#artificialintelligence

ODSC's first virtual conference is a wrap, and now we've started planning for our next one, the ODSC Europe 2020 Virtual Conference, from September 17th to the 19th. We're thrilled to announce the first group of expert speakers to join. During the event, speakers will cover topics such as NLP, machine learning, quant finance, deep learning, data visualization, data science for good, image classification, transfer learning, recommendation systems, and much, much more. Dr. Jiahong Zhong is the Head of Data Science at Zopa LTD, which facilitates peer-to-peer lending and is one of the United Kingdom's earliest fintech companies. Before joining Zopa, Zhong worked as a researcher on the Large Hadron Collider Project at CERN, focusing on statistics, distributed computing, and data analysis.