Data and advanced analytics lie at the core of every financial institution's efforts to build stronger engagement capabilities. Unfortunately, many organizations still struggle to apply data in ways that improve the customer journey, or to move from reactive to proactive communication. To build a successful consumer engagement strategy, banks and credit unions must better understand -- and in real time -- the consumer opportunities and threats that data reveals. The challenge: most organizations have cumbersome data and analytics back offices and outdated data policies, and they lack sufficient talent to make the application of insights timely and reliable.
Big data, machine learning and artificial intelligence attract a great deal of attention, and these enabling technologies are having a significant impact on businesses across the globe. However, some people remain resistant to change and find it difficult to integrate these methodologies and processes into their day-to-day work. As a result, businesses are often confronted with a range of objections to these technologies that need to be dispelled if the organisations affected are to thrive in this data-led world. We've broken down some of the most commonly encountered prejudices into four statements frequently heard from business leaders. The first: "Why would I need to change if my processes are working just fine?" One of the most common responses heard when discussing the need for analytics is that it simply isn't needed.
AI has become markedly more advanced over the years. AI algorithms can now generate text that fools people, which could provide a way to mass-produce fake news, bogus reviews, and even fake social media accounts. Fortunately, AI can also be used to identify machine-generated text. You can try it out for yourself via this link. Read more about the research over at Technology Review.
A particularly exciting subject is machine learning. The idea of cognitive artificial intelligence sounds simple yet revolutionary. However, its application at Bayer is proving more difficult than expected and is not yet fully implemented. Do you have an idea for a new use of machine learning in everyday work? Could machine learning perhaps improve data quality, automated processes, or even flexible pricing strategies?
The NVIDIA Jetson Nano is a low-cost AI computer designed for learners and developers. The SparkFun JetBot AI Robot is a kit powered by the Jetson Nano that comes with everything you need to learn practical AI applications such as object tracking and collision avoidance. The robot is compatible with popular AI frameworks such as TensorFlow, PyTorch, Caffe, and MXNet.
This story started in the mid-1960s. Scientists and engineers kept encountering problems so complicated that traditional algorithms could not handle them: it was simply not possible to write a program that covered every case. Imagine an object described by 21 properties (the input); based on these properties, you need to classify it into group A or group B. How would you solve the problem? Would you analyze all 21 attributes and, for every possible combination, state whether your object belongs to group A or B? That would mean writing on the order of 5.1090942 × 10^19 IF statements -- which is simply not feasible.
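This is exactly the kind of problem where a learning algorithm replaces hand-written rules: instead of enumerating combinations, a model infers the decision rule from labelled examples. Below is a minimal, self-contained sketch using a classic perceptron on 21 binary properties. The "true" classification rule and the generated data are invented purely for illustration, not taken from the story above.

```python
# A minimal sketch: rather than hand-writing IF statements for every
# combination of 21 properties, a simple perceptron learns the rule
# from labelled examples. The target rule and data are hypothetical.
import random

random.seed(0)
N_PROPS = 21

def target(x):
    # Hypothetical "true" rule: the object belongs to group A (label 1)
    # when more than half of its 21 properties are present.
    return 1 if sum(x) > N_PROPS // 2 else 0

# Generate labelled training examples (random binary property vectors).
data = [[random.randint(0, 1) for _ in range(N_PROPS)] for _ in range(500)]
labels = [target(x) for x in data]

# Perceptron training: adjust weights whenever a prediction is wrong.
w = [0.0] * N_PROPS
b = 0.0
lr = 0.1
for _ in range(20):  # epochs
    for x, y in zip(data, labels):
        pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
        err = y - pred
        b += lr * err
        w = [wi + lr * err * xi for wi, xi in zip(w, x)]

# Evaluate on fresh examples the model has never seen.
test = [[random.randint(0, 1) for _ in range(N_PROPS)] for _ in range(200)]
correct = sum(
    (1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0) == target(x)
    for x in test
)
print(f"accuracy: {correct / len(test):.2f}")
```

The point is the contrast in scale: the learned rule is a single weight vector of 21 numbers, trained from a few hundred examples, rather than an astronomically large table of explicit IF statements.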
AI isn't science fiction or a future technology we're waiting to adopt. It is, right now, affecting every aspect of our daily lives, including how we develop applications, products, and services. Every few years, a new buzzword technology drives mass hype by promising to disrupt the status quo: software, mobile, IoT, 3D printing, virtual reality, blockchain. In 2016, every company desperately wanted to latch on to artificial intelligence (AI). So while the earliest innovators (think Alan Turing) were studying how computers could mimic humans back in the 1950s, we have only recently witnessed a hype cycle triggered by AI's potential to cause the next generational shift in computing.
The Fourth Industrial Revolution and emerging technologies--such as the Internet of Things, artificial intelligence, robotics and additive manufacturing--are spurring the development of new production techniques and business models that will fundamentally transform production. Both the speed and the scope of technological change, combined with the emergence of other trends, add a layer of complexity to the already challenging task of developing and implementing industrial strategies that promote productivity and inclusive growth. Further, recent changes put at risk the competitiveness paradigm of low-cost manufacturing exports as a means for growth and development. Countries need to decide how best to respond to this new production paradigm in light of their national strategies and their ambition to leverage production as a national capability. This requires countries to first understand the factors and conditions that have the greatest impact on the transformation of their production systems, and then assess their readiness for the future.