Combating Insurance Fraud With Machine Learning - Fintech Finance

#artificialintelligence

Most insurance companies depend on human expertise and business rules-based software to protect themselves from fraud, but the drive for digital transformation and process automation means data and scenarios change faster than the rules can be updated. Machine learning has the potential to allow insurers to move from the current state of "detect and react" to "predict and prevent." It excels at automating the process of taking large volumes of data, analysing multiple fraud indicators in parallel – which taken individually may often be quite normal – and finding potential fraud. Generally, there are two ways to teach or train a machine learning algorithm, depending on the available data: supervised and unsupervised learning.
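
As an illustration of the two training approaches mentioned above, here is a minimal sketch contrasting a supervised classifier (trained on claims already labeled as fraud) with an unsupervised anomaly detector (no labels needed). The claim features, data, and model choices are hypothetical, not taken from the article.

```python
# Hypothetical sketch: supervised vs. unsupervised fraud detection with scikit-learn.
# The claim features and labels below are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, IsolationForest

rng = np.random.default_rng(0)

# Each row is a claim: [claim_amount, days_since_policy_start, prior_claims]
claims = rng.normal(loc=[2000, 400, 1], scale=[800, 200, 1], size=(1000, 3))
labels = (rng.random(1000) < 0.02).astype(int)      # 1 = known fraud (rare)

# Supervised: needs historical claims already labeled as fraud / not fraud.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(claims, labels)
fraud_probability = clf.predict_proba(claims)[:, 1]

# Unsupervised: no labels; flags claims that look unlike the rest.
iso = IsolationForest(contamination=0.02, random_state=0)
anomaly_flag = iso.fit_predict(claims)              # -1 = anomalous, 1 = normal

print("claims flagged by the supervised model:", int((fraud_probability > 0.5).sum()))
print("claims flagged as anomalies:           ", int((anomaly_flag == -1).sum()))
```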


Big Data Quotes of the Week - Dec. 18, 2019

#artificialintelligence

"Leaders don't understand what adoption of AI means. Many companies feel pressured to adopt AI by any means necessary -- without thinking through the why and how ... The metaphor that comes to mind is a fish lured to the next shiny bauble, only to realize too late that the hook will be its last meal." "People with disabilities constitute an untapped pool of critically skilled talent. AI, augmented reality (AR), virtual reality (VR) and other emerging technologies have made work more accessible for employees with disabilities." "The emergence of AI has prompted a fierce debate in English education over whether a knowledge-based curriculum is still appropriate for children who will potentially have to compete with robots and other AI technology in a future jobs market."


How to integrate robotic process automation in big data projects

#artificialintelligence

Information Services Group (ISG) reported in 2018 that 92% of companies were aiming to adopt robotic process automation (RPA) by 2020 because they wanted to increase operational efficiencies. This large number reflects how eager companies are to automate routine business processes. One of the easiest places to employ RPA is in very simple, highly repetitive business processes that rely on transactional data arriving in fixed record lengths, with data fields always in the same locations. This data is highly predictable, and automation tools like RPA that depend on recognizing repetitive data patterns are well positioned to excel. However, even the most routine business processes involve unstructured and semi-structured big data, as well as the more traditional fixed-record data.
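
To make the fixed-record-length point concrete, here is a minimal sketch of the kind of parsing that pattern-based automation relies on; the record layout, field names, and sample records are invented for illustration.

```python
# Hypothetical fixed-width transaction records: every field starts at the same
# column in every record, which is what makes pattern-based automation easy.
records = [
    "20191218ACME CORP           000149.95USD",
    "20191218GLOBEX INC          001200.00USD",
]

# (field name, start column, end column) -- an invented layout for illustration.
LAYOUT = [("date", 0, 8), ("vendor", 8, 28), ("amount", 28, 37), ("currency", 37, 40)]

def parse_record(line: str) -> dict:
    """Slice one fixed-width record into named fields."""
    row = {name: line[start:end].strip() for name, start, end in LAYOUT}
    row["amount"] = float(row["amount"])
    return row

for line in records:
    print(parse_record(line))

# Unstructured or semi-structured inputs (emails, scanned invoices, free text)
# break the fixed-position assumption, which is where RPA alone starts to struggle.
```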


Beyond microservices: Software architecture driven by machine learning

#artificialintelligence

It's not a question of if, but of when and how AI and machine learning will change our programming and software development paradigms. Today's coding models are based on data storage, business logic, services, UX, and presentation. A full stack developer elects to build a three-tiered web architecture using an MVC framework. An IoT application calls for an event-driven architecture with services processing events and broadcasting state changes. These two architecture paradigms converge in microservice architectures, where user interfaces are just one type of interaction node, fulfilling high-level functions by interfacing with many services.
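
As a rough illustration of the event-driven side of that contrast, the sketch below shows handlers that process events and broadcast state changes to subscribers. The in-process event bus, event names, and payloads are stand-ins for whatever messaging infrastructure a real system would use.

```python
# Minimal in-process sketch of an event-driven style: handlers subscribe to
# event types, process events, and broadcast resulting state changes.
from collections import defaultdict
from typing import Callable

class EventBus:
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[event_type].append(handler)

    def publish(self, event_type: str, payload: dict) -> None:
        for handler in self._subscribers[event_type]:
            handler(payload)

bus = EventBus()

# A "device service" reacts to raw sensor events and broadcasts a state change.
def on_sensor_reading(payload: dict) -> None:
    if payload["temperature"] > 30:
        bus.publish("state.changed", {"device": payload["device"], "state": "overheating"})

# A "notification service" cares only about state changes, not raw readings.
bus.subscribe("sensor.reading", on_sensor_reading)
bus.subscribe("state.changed", lambda p: print(f"ALERT: {p['device']} is {p['state']}"))

bus.publish("sensor.reading", {"device": "pump-1", "temperature": 42})
```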


Things I Have Learned About Data Science - KDnuggets

#artificialintelligence

If you think your data is clean, perhaps you have not looked into it yet; if you think your data is messy, it's even messier. Nobody cares how you did it; just do it correctly. People do not care how much you know until they know how much you care (about them and their business). In 2-3 years, nobody will talk about Big Data anymore. It always pays off to be damn good at numbers, Excel, and PowerPoint (and yes, presentation skills); Tableau is a big plus. Downloading some code and data and running them does not make you a data scientist. The same is true for doing data science courses. Participating in Kaggle competitions does not make you a data scientist, although it can help you learn from others. Winning Kaggle competitions does not necessarily make you a good data scientist. ETL is always needed - be good at it and learn a good tool for it (Talend is a good one). Also, learn scripting languages for ETL. Deep learning is cool, but it is just as cool not to use it when you don't need it, and in 99% of cases you don't need it. Algorithms are commodities, your data is not. Ideas are commodities, execution is not. Deep learning expertise will soon become a commodity; problem-solving skills won't.
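
On the ETL point in particular, a scripted extract-transform-load step can be as small as the sketch below; the file names and columns are hypothetical, and pandas stands in for whichever scripting tool you prefer.

```python
# Hypothetical minimal ETL script: extract from a CSV, transform, load the result.
import pandas as pd

# Extract: read raw order data (file name and columns are invented).
orders = pd.read_csv("raw_orders.csv", parse_dates=["order_date"])

# Transform: clean and enrich.
orders = orders.dropna(subset=["customer_id"])
orders["customer_id"] = orders["customer_id"].astype(int)
orders["revenue"] = orders["quantity"] * orders["unit_price"]
monthly = (
    orders.groupby([orders["order_date"].dt.to_period("M"), "customer_id"])["revenue"]
    .sum()
    .reset_index()
)

# Load: write the cleaned, aggregated result for downstream use.
monthly.to_csv("monthly_revenue_by_customer.csv", index=False)
```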


Machine learning: What every risk and compliance professional needs to know

#artificialintelligence

Machine learning is a subset of artificial intelligence (AI) that utilizes algorithms and computer power to sharpen the judgments organizations make about voluminous and disparate data. Simply put, it allows machines to learn how to perform certain tasks without being explicitly programmed to do so. Machine learning already permeates virtually every facet of our lives--it is used every day for image and speech recognition, digital assistants, cyber protection, consumer marketing, medical diagnoses, ferreting out proscribed content on social media platforms, ride-sharing apps, law enforcement, and in countless other applications. Machines could learn how to detect fraud simply by being provided with examples of previously seen fraud cases, and without the need for manually coded business rules. Machine learning helps optimize the mix between humans and machines in an intelligent, accretive process that "learns" as it goes along.
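
To illustrate the "learning from examples rather than hand-coded rules" point, the sketch below trains a small decision tree on hypothetical labeled cases and prints the rule-like structure it induced on its own; the feature names, thresholds, and data are invented.

```python
# Hypothetical sketch: a model induces its own "rules" from labeled examples,
# instead of a developer writing them by hand.
from sklearn.tree import DecisionTreeClassifier, export_text

# Invented features: [transaction_amount, hours_since_last_login, new_payee]
X = [
    [50, 2, 0], [75, 5, 0], [40, 1, 0], [60, 3, 0],      # legitimate
    [5000, 0.1, 1], [7500, 0.2, 1], [6200, 0.05, 1],     # fraudulent
]
y = [0, 0, 0, 0, 1, 1, 1]

model = DecisionTreeClassifier(max_depth=2, random_state=0)
model.fit(X, y)

# The learned thresholds play the role that hand-written rules used to.
print(export_text(model, feature_names=["amount", "hours_since_login", "new_payee"]))
print(model.predict([[4800, 0.3, 1]]))   # flagged as fraud: [1]
```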


Smart contracts and business rules: The keys to revolutionary blockchain use cases - The developerWorks Blog

#artificialintelligence

The blockchain wave is gathering strength. More and more enterprises are embarking on concrete initiatives, either alone or in collaboration with their peers, their business partners, their clients, or their suppliers. Most of the use cases I've seen so far, apart from a couple of exceptions, are exploratory and mostly focus on using blockchain as a shared and trusted database of assets and transactions. The really interesting use cases that realize the exponential and disruptive benefit of the blockchain are yet to come. The most visionary players have revolutionary use cases in their roadmaps, but right now, enterprises are taking foundational steps necessary to pave the way for innovative use cases that might disrupt entire industries.


4 Design Principles for Data Processing

#artificialintelligence

The practice of Design Patterns is most popular in Object-Oriented Programming (OOP) and has been effectively explained and summarized in the classic book "Design Patterns: Elements of Reusable Object-Oriented Software" by Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides. "A software design pattern is a general, reusable solution to a commonly occurring problem within a given context in software design. It is not a finished design that can be transformed directly into source or machine code. It is a description or template for how to solve a problem that can be used in many different situations. Design patterns are formalized best practices that the programmer can use to solve common problems when designing an application or system." For data science, many people may have asked the same question: does data science programming have design patterns?
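
One possible answer is that familiar pattern ideas carry over directly: the sketch below composes small, interchangeable processing steps into a pipeline, in the spirit of the Strategy and Chain of Responsibility patterns. The step names and sample rows are invented for illustration.

```python
# Minimal sketch: a pipeline of small, reusable processing steps -- one way a
# classic design-pattern idea (composing interchangeable behaviours) shows up
# in data-processing code.
from typing import Callable, Iterable

Step = Callable[[list[dict]], list[dict]]

def drop_missing(field: str) -> Step:
    """Return a step that removes rows where `field` is missing."""
    return lambda rows: [r for r in rows if r.get(field) is not None]

def rename(old: str, new: str) -> Step:
    """Return a step that renames a column in every row."""
    return lambda rows: [{**{k: v for k, v in r.items() if k != old}, new: r[old]} for r in rows]

def run_pipeline(rows: list[dict], steps: Iterable[Step]) -> list[dict]:
    for step in steps:
        rows = step(rows)
    return rows

raw = [{"usr": "alice", "age": 34}, {"usr": "bob", "age": None}]
print(run_pipeline(raw, [drop_missing("age"), rename("usr", "user")]))
# -> [{'age': 34, 'user': 'alice'}]
```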

