AI and the bottom line: 15 examples of artificial intelligence in finance

#artificialintelligence

Artificial intelligence has given the world of banking and the financial industry as a whole a way to meet the demands of customers who want smarter, more convenient and safer ways to access, spend, save and invest their money. We've put together a rundown of how AI is being used in finance and the companies leading the way. A recent study found that 77% of consumers preferred paying with a debit or credit card, compared with only 12% who favored cash. But easier payment options aren't the only reason credit matters to consumers. Good credit helps with securing favorable financing, landing a job and renting an apartment, to name a few examples.


Dynamic Modeling and Equilibria in Fair Decision Making

arXiv.org Machine Learning

Recent studies on fairness in automated decision-making systems have both investigated the potential future impact of these decisions on the population at large and emphasized that imposing "typical" fairness constraints such as demographic parity or equality of opportunity does not guarantee a benefit to disadvantaged groups. However, these previous studies have focused on either simple one-step cost/benefit criteria or discrete underlying state spaces. In this work, we first propose a natural continuous representation of population state, governed by the Beta distribution, using a loan-granting setting as a running example. Next, we apply a model of population dynamics under lending decisions and show that, when conditional payback probabilities are estimated correctly, (1) "optimal" behavior by lenders can lead to "Matthew Effect" bifurcations (i.e., "the rich get richer and the poor get poorer"), but (2) many common fairness constraints on the allowable policies cause groups to converge to the same equilibrium point. Finally, we contrast our results in the case of misspecified conditional probability estimates with prior work, and show that for this model, differing levels of misestimation across groups guarantee that even fair policies lead to bifurcations. We illustrate some of the modeling conclusions on real data from credit scoring.
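
To make the setup concrete, here is a minimal simulation sketch of this kind of dynamic, assuming a two-group population with Beta-distributed repayment probabilities and a single threshold lender. The update rule, threshold, lift and penalty parameters are hypothetical illustrations, not the paper's exact model.

```python
# Toy lending dynamic (illustrative only): each group's repayment probability is
# Beta-distributed, both groups face the same profit-maximizing threshold, and
# repayments/defaults shift the group mean up/down. All parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def lending_round(alpha, beta, threshold, n=20_000, lift=0.05, penalty=2.0):
    """One round of lending for a group whose scores follow Beta(alpha, beta)."""
    scores = rng.beta(alpha, beta, size=n)        # individual repayment probabilities
    approved = scores >= threshold                # lender's threshold policy
    repaid = approved & (rng.random(n) < scores)  # approved borrowers who repay
    defaulted = approved & ~repaid                # approved borrowers who default
    # Net shift of the group mean: repayments help, defaults hurt more.
    delta = lift * (repaid.mean() - penalty * defaulted.mean())
    mean = np.clip(alpha / (alpha + beta) + delta, 0.05, 0.95)
    conc = alpha + beta                           # keep concentration fixed
    return mean * conc, (1.0 - mean) * conc

# Group A starts advantaged (mean score 0.8), group B disadvantaged (mean 0.4).
group_a, group_b = (8.0, 2.0), (4.0, 6.0)
for _ in range(60):
    group_a = lending_round(*group_a, threshold=0.5)
    group_b = lending_round(*group_b, threshold=0.5)

print("final mean scores:",
      group_a[0] / sum(group_a), group_b[0] / sum(group_b))
# In this toy setup the shared threshold tends to push group A up and group B
# down, a crude stand-in for the "Matthew Effect" bifurcation described above.
```

Swapping the shared threshold for group-specific thresholds that equalize approval rates is one simple way to mimic the parity-style constraints the abstract refers to, under which the two groups would converge toward a common equilibrium.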


Deutsche Bank to pay $7.2 billion in subprime mortgage probe settlement

Los Angeles Times

Two big European banks will give billions of dollars in loan relief to U.S. borrowers to settle civil claims over risky securities that helped spark the 2008 financial crisis. Deutsche Bank said Friday it has agreed on a $7.2-billion settlement with the U.S. Justice Department in connection with its dealings in issuing mortgage-backed bonds. Under the deal, which isn't yet final, Germany's biggest bank will pay $3.1 billion in fines and $4.1 billion in compensation through measures such as easing loan repayment terms for homeowners and borrowers. Credit Suisse said it had agreed to a similar settlement under which it would pay $5.3 billion. The settlements, which focus on activities from 2005 through 2007, revisit an ugly chapter of the global financial crisis, in which banks bundled mortgages from people with shaky credit into bonds whose risks many investors did not understand.


Optimal Experimental Design for Staggered Rollouts

arXiv.org Machine Learning

Experimentation has become an increasingly prevalent tool for guiding policy choices, firm decisions, and product innovation. A common hurdle in designing experiments is a lack of statistical power. In this paper, we study optimal multi-period experimental design under the constraint that the treatment cannot be easily removed once implemented; for example, a government or firm might roll out a treatment across different geographies at different times, where it cannot be easily withdrawn due to practical constraints. The design problem is to select which units to treat at which time, with the aim of testing hypotheses about the treatment's effect. When the potential outcome is a linear function of a unit effect, a time effect, and observed discrete covariates, we provide an analytically feasible solution to the design problem in which the variance of the treatment-effect estimator is at most 1 + O(1/N^2) times the variance under the optimal design, where N is the number of units. This solution assigns units in a staggered treatment adoption pattern, where the proportion treated is a linear function of time. In the general setting where outcomes depend on latent covariates, we show that historical data can be used in the optimal design. We propose a data-driven local search algorithm with a minimax decision criterion to assign units to treatment times. We demonstrate that our approach improves upon benchmark experimental designs through synthetic experiments on real-world data sets from several domains, including healthcare, finance, and retail. Finally, we consider the case where the treatment effect changes with the time of treatment, showing that the optimal design treats a smaller fraction of units at the beginning and a greater share at the end.
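
As a rough illustration of the staggered adoption pattern described above (not the paper's algorithm), the sketch below assigns units to start times so that the treated proportion grows linearly over time, simulates outcomes with unit and time effects, and recovers the treatment effect with a two-way fixed-effects regression. The data-generating process and all parameter values are made up for the example.

```python
# Staggered rollout sketch (illustrative): linear-in-time treated proportion,
# absorbing treatment, and a two-way fixed-effects estimate of the effect.
import numpy as np

rng = np.random.default_rng(1)
N, T, tau = 50, 10, 2.0                  # units, periods, true treatment effect

# Staggered assignment: N/T units start treatment at each period, so the share
# treated at time t is t/T; once treated, a unit stays treated.
start_times = np.repeat(np.arange(1, T + 1), N // T)
treated = (np.arange(1, T + 1)[None, :] >= start_times[:, None]).astype(float)

unit_fe = rng.normal(0.0, 1.0, size=N)   # latent unit effects
time_fe = rng.normal(0.0, 1.0, size=T)   # latent time effects
noise = rng.normal(0.0, 0.5, size=(N, T))
y = unit_fe[:, None] + time_fe[None, :] + tau * treated + noise

# Two-way fixed-effects regression: y_it on treatment, unit dummies, time dummies.
unit_d = np.kron(np.eye(N), np.ones((T, 1)))            # (N*T, N) unit dummies
time_d = np.kron(np.ones((N, 1)), np.eye(T))            # (N*T, T) time dummies
X = np.hstack([treated.reshape(-1, 1), unit_d, time_d[:, 1:]])  # drop one time dummy
coef, *_ = np.linalg.lstsq(X, y.reshape(-1), rcond=None)
print("estimated treatment effect:", coef[0])           # should be close to 2.0
```

Because treatment here is absorbing, each unit stays treated from its start time onward, matching the constraint that the treatment cannot be removed once implemented.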


Big data means small margins in the mortgage industry of the future

#artificialintelligence

The big story in mortgages today is the rise in mortgage loan rates. For the first time in years, we're seeing 30-year fixed mortgage rates consistently above 4%, and a 5% rate is in sight. Higher rates make sense if you look at it one way: the economy is strong, inflation is climbing, and it's safe to expect Federal Reserve hikes in 2018 and 2019. Industry veterans might be sighing with relief.