Over the same period, but especially since the 1990s, there has been an increasing disconnect between the traditional Fisher-Neyman-Pearson (FNP) math statistics course and the demands for complex analysis in many application areas. The failure of classical maximum likelihood methods to deal effectively with complex models, together with the success of MCMC-based methods, has produced a similar situation: the undergraduate FNP course does not prepare students for these models, and Bayesian MCMC retraining courses are needed to prepare graduates for these applications.
A more dramatic and ruinous example of a failure to appreciate this statistical concept is the NASA Shuttle Challenger disaster of 1986, when engineers assumed that the lack of evidence of O-ring failures during cold-weather launches was equivalent to evidence that there would be no O-ring failure during a cold-weather launch (12). In this case, the consequences of faulty statistical reasoning were catastrophic. That is an extreme case, but neglect of statistical truism #4 remains an example of fallacious reasoning in the era of big data that we should avoid. Where does all of this take us? It takes us to a case where correlation may well come with causation: as we venture out into the space age of big data and analytics applications, the use of those applications might correlate with, and might indeed cause, a lack (or misapplication) of statistical thinking precisely on the home planet of the big data universe: statistics!