It’s common knowledge that data-driven decisions are critical to any organization, and that “data is the new oil.” So why are so many institutions giving this valuable asset away?
In an effort to reduce overhead costs and improve efficiency, banking institutions have outsourced critical functions over the past decade. While they may have achieved the goal of cutting costs, the loss of control over their data can become a liability as the economic climate shifts. In today’s environment, lacking differentiated insight into your own business can have serious consequences.
The opportunity to leverage your data as a competitive advantage is lost when you cede control of it to a third party. In the best case, institutions merely put themselves on par with their peers; in the worst case, the entire peer group makes suboptimal decisions based on the same vanilla set of algorithms, parameters, and inputs. With the drastic reduction in storage and compute costs, now is the time for organizations to put this valuable asset to work.
For an example of how even a simple dataset can generate valuable insight, check out our recorded webinar on Loan Risk Analysis.
The video reveals three key insights. The first is that metrics matter. Predictive models are commonly evaluated with measures such as accuracy, precision, recall, or AUC. We show, however, that the best evaluator for loan risk models is expected profitability. Traditional measures fall short because they do not weigh the large downside of a default against the modest upside of a repayment. Our cost-benefit analysis also accounts for the fact that loans vary in amount, so each loan affects profit differently.
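To make the idea concrete, here is a minimal sketch of a profit-based evaluator. The labels, the flat interest-gain rate, and the assumption that a defaulted loan loses its full amount are all illustrative placeholders, not the webinar’s actual cost model:

```python
import numpy as np

def expected_profit(y_true, y_pred, loan_amount, gain_rate=0.20):
    """Profit-based evaluation of a loan approval model.

    Assumes label 1 = default, label 0 = repaid; loans predicted to be
    repaid are approved. A repaid loan earns gain_rate * amount (a
    hypothetical average interest gain); an approved loan that defaults
    loses its full amount (a simplifying assumption)."""
    approved = (y_pred == 0)
    gains = loan_amount[approved & (y_true == 0)] * gain_rate
    losses = loan_amount[approved & (y_true == 1)]
    return gains.sum() - losses.sum()
```

Unlike accuracy, this metric penalizes a single wrongly approved large loan far more than it rewards a correctly approved small one, which is exactly the asymmetry a lender cares about.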
The second key insight is that the type of predictive model matters. We compare a traditional Logistic Regression model against a modern Gradient Boosted Tree (GBT) model. Not only does the GBT perform better, we estimate it would have increased profits from $27 million to $34 million. It is remarkable how much a lender’s bottom line can improve simply through a better algorithm.
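A head-to-head comparison like this can be sketched in a few lines with scikit-learn. The synthetic, imbalanced dataset below stands in for real loan data, and accuracy stands in for the profit metric, purely to show the shape of the experiment:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for loan data: ~15% of samples labeled as defaults.
X, y = make_classification(n_samples=2000, n_features=10,
                           weights=[0.85], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

scores = {}
for model in (LogisticRegression(max_iter=1000),
              GradientBoostingClassifier(random_state=0)):
    model.fit(X_tr, y_tr)
    scores[type(model).__name__] = model.score(X_te, y_te)
print(scores)
```

In practice, you would replace `model.score` with a profit-based evaluator so the models compete on the metric that actually matters to the business.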
The final insight is how to interpret the model. Using feature importance, we can crack open the “black box” of machine learning and see which variables influence the selected model. Although the video only shows feature importance at the global level (which variables matter to the model as a whole), it is also possible to compute it at the local level (which variables caused a specific loan to be rejected).
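Global feature importance comes for free with tree-based models in scikit-learn. The sketch below uses synthetic data and made-up feature names; local, per-prediction explanations would typically come from a library such as SHAP instead:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

feature_names = [f"feature_{i}" for i in range(5)]  # hypothetical names
X, y = make_classification(n_samples=500, n_features=5,
                           n_informative=3, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X, y)

# Global importances: one non-negative score per feature, summing to 1.
ranked = sorted(zip(feature_names, model.feature_importances_),
                key=lambda pair: -pair[1])
for name, score in ranked:
    print(f"{name}: {score:.3f}")
```

Ranking the scores like this is usually the first step in an interpretability review: it tells you where to focus before drilling into individual predictions.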
This is just a small taste of how we can help you unlock the value of your data. Contact us to learn more!