The use of formal statistical methods to analyse quantitative data in data science has increased considerably over the last few years. One such approach, Bayesian Decision Theory (BDT), closely related to Bayesian hypothesis testing and Bayesian inference, is a fundamental statistical approach that quantifies the tradeoffs between various decisions using the probability distributions and costs that accompany those decisions. In pattern recognition it is used for designing classifiers under the assumption that the problem is posed in probabilistic terms and that all of the relevant probability values are known. In practice we rarely have such perfect information, but it is a good place to start when studying machine learning, statistical inference, and detection theory in signal processing. BDT also has many applications in science, engineering, and medicine.
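To make the "known probabilities" setting concrete, here is a minimal sketch of the Bayes decision rule for a two-class problem. All the specifics are assumptions for illustration: the class names, the priors, and the Gaussian class-conditional likelihoods are made up, and `decide` simply picks the action that minimizes expected loss (the Bayes risk) given a user-supplied loss table.

```python
import math

# Hypothetical two-class problem with known priors and Gaussian
# class-conditional likelihoods (illustrative values, not from data).
PRIORS = {"c1": 0.6, "c2": 0.4}
PARAMS = {"c1": (0.0, 1.0), "c2": (2.0, 1.0)}  # (mean, std) per class

def gaussian_pdf(x, mean, std):
    """Likelihood p(x | class) under a normal density."""
    z = (x - mean) / std
    return math.exp(-0.5 * z * z) / (std * math.sqrt(2 * math.pi))

def posterior(x):
    """Posterior P(class | x) via Bayes' rule: prior * likelihood / evidence."""
    joint = {c: PRIORS[c] * gaussian_pdf(x, *PARAMS[c]) for c in PRIORS}
    evidence = sum(joint.values())
    return {c: j / evidence for c, j in joint.items()}

def decide(x, loss):
    """Choose the action minimizing expected loss (Bayes risk).

    loss[a][c] is the cost of deciding a when the true class is c.
    """
    post = posterior(x)
    risk = {a: sum(loss[a][c] * post[c] for c in post) for a in loss}
    return min(risk, key=risk.get)

# Under zero-one loss, the rule reduces to "pick the largest posterior".
ZERO_ONE = {"c1": {"c1": 0.0, "c2": 1.0},
            "c2": {"c1": 1.0, "c2": 0.0}}
```

With an asymmetric loss table (e.g. making a "c2" mistake ten times costlier), the same `decide` function shifts the decision boundary toward the cheaper error, which is exactly the tradeoff between decisions and costs that BDT quantifies.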
In simple terms, that "most important role" is the cycle of observation, followed by critical thinking, followed by action. It's important to bear in mind that the proper goal of Machine Learning (ML) is not the abdication of human responsibility for decision-making. Rather, it's improving our individual and collective ability to make better decisions by leveraging greater speed, accuracy, and reduced bias. Our context here is supply chain planning and execution, but there is no reason to limit the scope of Machine Learning to that domain. When it comes to designing and creating technology solutions for supply chain analytics and business intelligence, this is not a throw-away idea buried in a long-forgotten PowerPoint presentation.
The decision tree in the figure is just one of many decision tree structures you could create to solve the marketing problem. The task of finding the optimal decision tree is an intractable problem. For those of you who have taken an analysis of algorithms course, you no doubt recognize this term. For those of you who haven't had this pleasure (he says, gritting his teeth), essentially what this means is that as the amount of training data used to build the decision tree grows, the time it takes to find the optimal tree grows as well--exponentially. While it may be practically impossible to find the smallest (or more fittingly, the shallowest) decision tree in a respectable amount of time, it is possible to find a decision tree that is "small enough" using greedy heuristics, such as choosing each split by information gain.
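The greedy heuristic mentioned above can be sketched in a few lines: instead of searching all possible trees, pick the single attribute whose split most reduces label entropy, then recurse on each branch. The toy rows, attribute names, and labels below are hypothetical; this shows only the split-selection step, not a full tree builder.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction from splitting the rows on one attribute."""
    n = len(labels)
    after = 0.0
    for value in set(row[attr] for row in rows):
        subset = [lab for row, lab in zip(rows, labels) if row[attr] == value]
        after += (len(subset) / n) * entropy(subset)
    return entropy(labels) - after

def best_split(rows, labels):
    """Greedy choice: the attribute with the highest information gain."""
    return max(rows[0], key=lambda a: information_gain(rows, labels, a))

# Toy marketing-style data (hypothetical attributes and labels).
rows = [
    {"region": "north", "segment": "new"},
    {"region": "north", "segment": "returning"},
    {"region": "south", "segment": "new"},
    {"region": "south", "segment": "returning"},
]
labels = ["buy", "no", "buy", "no"]
```

Here `segment` separates the labels perfectly (gain of 1 bit) while `region` tells us nothing (gain of 0), so the greedy rule splits on `segment` first. Each such choice is locally optimal but never revisited, which is why the resulting tree is only "small enough" rather than provably smallest.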
Welcome back for another edition of Under the Decision Tree. This week brought The Data Science Conference in Seattle, along with interesting articles that include teaching AI to be sarcastic, predictions of what AI will look like in 2030, and much more. Please send any suggestions to: Decision Tree. We would love to hear from you.