--by Andrew Gulledge

Where does Knowledge Based Authentication fit into my decisioning strategy?

Knowledge Based Authentication can fit into various parts of your authentication process. Some folks choose to put every consumer through KBA, while others only send their riskier transactions through the out-of-wallet questions. Some people use Knowledge Based Authentication to feed a manual review process, while others treat a KBA failure as a hard decline. Uses for KBA are as sundry and varied as the questions themselves.

Decision Matrix
As discussed by prior bloggers, a well-engineered fraud score can provide considerable lift to any fraud risk strategy. When possible, it is a good idea to combine both score and questions in the decisioning process. This can be done with a matrixed approach, where you are more lenient on the questions if the applicant has a good fraud score, and more lenient on the score if the applicant did well on the questions. In a decision matrix, a set decision code is placed within each cell, based on fraud risk (a simple sketch of such a matrix follows below).

Decision Overrides
These provide a nice complement to your standard fraud decisioning strategy. Different fraud solution vendors provide different indicators or flags with which decisioning rules can be created. For example, you might decide to fail a consumer who provides a Social Security number that is recorded as deceased. These rules can provide additional lift to the standard decisioning strategy, whether it sits alongside Knowledge Based Authentication questions alone, questions and score, or some other combination. The overrides can work in both directions: auto-pass and auto-fail.
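To make the matrixed approach and the overrides concrete, here is a minimal Python sketch. The score bands, question thresholds, decision codes and the deceased-SSN flag are illustrative assumptions, not prescribed values.

# Illustrative decision matrix combining a fraud score band with KBA performance.
# All cut-offs and decision codes below are hypothetical examples.

def score_band(fraud_score):
    """Bucket a fraud score into low/medium/high risk (cut-offs are assumptions)."""
    if fraud_score >= 700:
        return "low"
    if fraud_score >= 500:
        return "medium"
    return "high"

def kba_band(questions_correct, questions_asked):
    """Bucket KBA performance by the share of out-of-wallet questions answered correctly."""
    pct = questions_correct / questions_asked
    if pct >= 0.75:
        return "pass"
    if pct >= 0.50:
        return "review"
    return "fail"

# Each cell holds a decision code: more lenient on the questions when the score
# is strong, and more lenient on the score when KBA performance is strong.
DECISION_MATRIX = {
    ("low", "pass"): "ACCEPT",       ("low", "review"): "ACCEPT",       ("low", "fail"): "MANUAL_REVIEW",
    ("medium", "pass"): "ACCEPT",    ("medium", "review"): "MANUAL_REVIEW", ("medium", "fail"): "DECLINE",
    ("high", "pass"): "MANUAL_REVIEW", ("high", "review"): "DECLINE",   ("high", "fail"): "DECLINE",
}

def decide(fraud_score, questions_correct, questions_asked, deceased_ssn=False):
    """Apply overrides first, then fall back to the matrix."""
    if deceased_ssn:  # auto-fail override, e.g. SSN reported as deceased
        return "DECLINE"
    return DECISION_MATRIX[(score_band(fraud_score), kba_band(questions_correct, questions_asked))]

print(decide(720, 3, 4))  # ACCEPT
print(decide(480, 2, 4))  # DECLINE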
By: Wendy Greenawalt

In my last blog on optimization we discussed how optimized strategies can improve collections. In this blog, I would like to discuss how optimization can bring value to decisions related to mortgage delinquency and modification.

Over the last few years mortgage lenders have seen a sharp increase in the number of mortgage account delinquencies and a dramatic change in consumer mortgage payment trends. Specifically, lenders have seen a shift away from consumers paying their mortgage obligation first: increasingly, borrowers let the mortgage go delinquent while keeping other debts current. This shift in borrower behavior appears unlikely to change anytime soon, and therefore lenders must make smarter account management decisions for mortgage accounts. Adding to this issue, property values continue to decline in many areas, and lenders must now identify whether a consumer is a strategic defaulter, a candidate for loan modification, or a consumer affected by the economic downturn. Many loans that were modified at the beginning of the mortgage crisis have since become delinquent and have ultimately been foreclosed upon by the lender.

Making collection decisions for mortgage accounts is increasingly complex, but optimization can assist lenders in identifying the ideal consumer collection treatment while balancing organizational goals such as minimizing losses, making the best use of internal resources, and retaining the most valuable consumers. Optimization supports these difficult decisions with a mathematical algorithm that assesses all of the options available and selects the ideal consumer-level decision given organizational goals and constraints. This technology can be implemented within current decisioning processes, whether in real time or in batch, and can provide substantial lift in prediction over business-as-usual techniques.
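As a toy illustration of the idea (real optimization engines use mathematical programming across millions of accounts), the Python sketch below enumerates every possible assignment of a collection treatment to a handful of delinquent mortgage accounts and keeps the plan that maximizes expected recoveries without exceeding a collector-hour budget. All accounts, treatments, dollar figures and the hour budget are assumptions.

# Toy constrained decision optimization: pick one collection treatment per
# delinquent mortgage account to maximize expected recoveries within a
# fixed budget of collector hours. All figures are illustrative.
from itertools import product

TREATMENTS = {                     # treatment: collector hours it consumes
    "letter": 0.1,
    "phone_call": 0.5,
    "loan_modification_review": 2.0,
}

# expected dollars recovered per account under each treatment (assumed figures)
ACCOUNTS = {
    "acct_1": {"letter": 50, "phone_call": 300, "loan_modification_review": 900},
    "acct_2": {"letter": 40, "phone_call": 250, "loan_modification_review": 400},
    "acct_3": {"letter": 80, "phone_call": 150, "loan_modification_review": 200},
}

HOUR_BUDGET = 2.6                  # organizational constraint on internal resources

best_value, best_plan = -1, None
for plan in product(TREATMENTS, repeat=len(ACCOUNTS)):           # every possible assignment
    hours = sum(TREATMENTS[t] for t in plan)
    if hours > HOUR_BUDGET:                                       # respect the constraint
        continue
    value = sum(ACCOUNTS[a][t] for a, t in zip(ACCOUNTS, plan))   # organizational goal
    if value > best_value:
        best_value, best_plan = value, dict(zip(ACCOUNTS, plan))

print(best_plan, best_value)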
For the past couple of years, the deterioration of the real estate market and the economy as a whole has been widely reported as a national and international crisis. Several significant events have contributed to this situation: 401(k) plans have fallen in value, homeowners have simply abandoned their now under-valued properties, and the federal government has raced to save the banking and automotive sectors. While most perceive this as a national decline, this is clearly a situation where the real story is in the details.

A closer look reveals that while there are places that have experienced serious real estate and employment issues (California, Florida, Michigan, etc.), there are also areas (Texas, for example) that did not experience the same deterioration. Flash forward to November 2009 – with signs of recovery seemingly beginning to appear on the horizon – there appears to be a great deal of variability between areas that seem poised for recovery and those that continue down the slope of decline. Interestingly, though, the list of usual suspects is changing. In a recent article posted to CNN.com, Julianne Pepitone observes that many cities that were tops in foreclosure a year ago have since shown stabilization, while other cities have regressed.(i) A related article outlines a growing list of cities that, not long ago, considered themselves immune from the problems being experienced in other parts of the country.(ii) Previous economic success stories are now being identified as economic laggards, experiencing the same pains only a year or two later.

So, is there a lesson to be taken from this? From a business intelligence perspective, the lesson is that generalized reporting and forecasting capabilities are not going to be successful in managing risk. Risk management and forecasting techniques will need to be built around specific macro- and micro-economic changes, and will need to incorporate a number of economic scenarios to properly reflect the range of possible future outcomes. Moving forward, it will be vital to understand the differences in unemployment between Dallas and Houston, and between regions that rely on automotive manufacturing and those with hi-tech jobs. These differences will directly impact the performance of lenders' specific footprints, as this year's "Best Place to Live" according to Money.CNN.com can quickly become next year's foreclosure capital.

i. http://money.cnn.com/2009/10/28/real_estate/foreclosures_worst_cities/index.htm?postversion=2009102811
ii. http://money.cnn.com/galleries/2009/real_estate/0910/gallery.foreclosures_worst_cities/2.html
By: Wendy Greenawalt

Optimization has become a "buzz word" in the financial services marketplace, but some organizations still fail to realize all the possible business applications for optimization. As credit card lenders scramble to comply with the pending credit card legislation, optimization can be a quick and easily implemented solution that fits into current processes to ensure compliance with the new regulations.

Optimizing decisions
Specifically, lenders will now be under strict guidelines as to when an APR can be changed on an existing account, and the specific circumstances under which the account must return to its original terms. Optimization can easily handle these constraints and identify which accounts should be modified based on historical account information and existing organizational policies. APR account changes can require a great deal of internal resources to implement and monitor for on-going performance. Implementing an optimized strategy tree within an existing account management strategy allows an organization to easily identify consumer-level decisions while monitoring accounts through on-going batch processing; a simplified sketch of the constraint logic follows below.

New delivery options are now available for lenders to receive optimized strategies for decisions related to:
• Account acquisition
• Customer management
• Collections

Organizations that are not currently utilizing this technology within their processes should investigate the new delivery options. Recent research suggests optimizing decisions can provide an improvement of 7-to-16 percent over current processes.
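As a rough illustration of how such constraints might be encoded in a batch account-review strategy, the sketch below applies two assumed rules. The field names, triggers and thresholds are illustrative approximations, not the actual regulatory requirements.

# A minimal sketch of constraint-aware APR decisioning for one account during
# batch account review. Rules and field names are illustrative assumptions.

def apr_decision(account):
    """Return an action code for one account."""
    # assumed constraint: an APR increase is only considered after a qualifying
    # event, e.g. the account is 60+ days past due, and only up to a cap
    if account["days_past_due"] >= 60 and account["apr"] < account["penalty_apr_cap"]:
        return "INCREASE_APR"
    # assumed constraint: after six consecutive on-time payments following an
    # increase, the account must return to its original terms
    if account["apr"] > account["original_apr"] and account["consecutive_on_time"] >= 6:
        return "RESTORE_ORIGINAL_TERMS"
    return "NO_CHANGE"

print(apr_decision({"days_past_due": 75, "apr": 0.18, "penalty_apr_cap": 0.30,
                    "original_apr": 0.15, "consecutive_on_time": 0}))  # INCREASE_APR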
--by Jeff Bernstein

In the current economic environment, many lenders and issuers across the globe are struggling to manage the volume of caseloads coming into collections. The challenge is that by the time these new cases reach collections in the early phases of delinquency, the borrower is already in distress, and the opportunity for a good outcome is diminished.

One of the real "hot" items on the list of emerging best practices and innovative changes in collections is the concept of early lifecycle treatment strategy. Essentially, what we are referring to is the treatment of current and non-delinquent borrowers who are exhibiting higher risk characteristics and who are at risk of future default at higher-than-average levels. The challenge is how to identify these customers for early intervention and triage in the collections strategy process.

One often-overlooked tool is the use of maturation curves to identify vintages within a portfolio that are performing worse than average. A maturation curve identifies how long it takes from origination until a vintage or segment of the portfolio reaches a normalized rate of delinquency.

Let's assume that you are launching a new credit product into the marketplace. You begin to book new loans under the program in the current month. Beyond that month, you monitor all new loans that were originated/booked during that initial time frame, which we can identify as a "vintage" of the portfolio. Each month's originations are a separate vintage, and we can track the performance of each vintage over time. How many months will it take before the "portfolio" of loans booked in that initial month reaches a normal level of delinquency, given the credit quality of the portfolio and its borrowers, typical collections servicing, delinquency reporting standards, and the passage of time? The answer certainly depends upon those factors, and could be graphed as follows:

Exhibit 1

In Exhibit 1, we examine different vintages, beginning with loans originated during Q2 2002 and continuing year by year through Q2 2008. The purpose of the analysis is to identify those vintages that have a steeper slope towards delinquency, also known as a delinquency maturation curve. The X-axis represents a timeline in months from month of origination, and the Y-axis represents the 90+ delinquency rate expressed as a percentage of balances in the portfolio. Vintages with a steeper slope reach a normalized level of delinquency sooner, and could in fact have a trend line suggesting that they will overshoot the expected delinquency rate for the portfolio based upon credit quality standards.

So how do we use the maturation curve as a tool? In my next blog, I will discuss how to use maturation curves to identify trends across various portfolios, and how to differentiate collections issues from originations or lifecycle risk management opportunities.
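For readers who want to build their own maturation curves, here is a minimal pandas sketch under assumed column names (orig_quarter, months_on_book, balance, days_past_due). It computes the 90+ delinquency rate by months-on-book for each vintage, which can then be plotted to compare the slope of each curve.

# A minimal sketch of building maturation curves from loan-level data.
# Assumes one row per loan per statement month with hypothetical columns.
import pandas as pd

def maturation_curves(df: pd.DataFrame) -> pd.DataFrame:
    """90+ DPD balance rate by months-on-book, one column per origination vintage."""
    df = df.copy()
    df["bal_90plus"] = df["balance"].where(df["days_past_due"] >= 90, 0.0)
    curves = (
        df.groupby(["orig_quarter", "months_on_book"])
          .agg(total_bal=("balance", "sum"), bal_90plus=("bal_90plus", "sum"))
          .assign(dq_rate=lambda g: g["bal_90plus"] / g["total_bal"])
          .reset_index()
          .pivot(index="months_on_book", columns="orig_quarter", values="dq_rate")
    )
    return curves  # plot with curves.plot() to compare the slope of each vintage

# Tiny illustrative input: the 2008Q2 vintage matures toward delinquency faster.
sample = pd.DataFrame({
    "orig_quarter": ["2002Q2"] * 4 + ["2008Q2"] * 4,
    "months_on_book": [6, 6, 12, 12, 6, 6, 12, 12],
    "balance": [100, 100, 100, 100, 100, 100, 100, 100],
    "days_past_due": [0, 0, 95, 0, 0, 95, 95, 95],
})
print(maturation_curves(sample))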
In my last post I discussed the problem with confusing what I would call "real" Knowledge Based Authentication (KBA) with secret questions. However, I don't think that's where the market focus should be. Instead of looking at Knowledge Based Authentication (KBA) today, we should be looking toward the future, and the future starts with risk-based authentication.

If you're like most people, right about now you are wondering exactly what I mean by risk-based authentication, how it differs from Knowledge Based Authentication, and how we get from point A to point B. It is actually pretty simple. Knowledge Based Authentication is one factor of a risk-based authentication fraud prevention strategy. A risk-based authentication approach doesn't rely on questions and answers alone, but instead utilizes fraud models that include Knowledge Based Authentication performance as part of the fraud analytics to improve fraud detection performance. With a risk-based authentication approach, decisioning strategies are more robust and should include many factors, including the results from scoring models.

That isn't to say that Knowledge Based Authentication isn't an important part of a risk-based approach. It is. Knowledge Based Authentication is a necessity because it has gained consumer acceptance. Without some form of Knowledge Based Authentication, consumers question an organization's commitment to security and data protection. Most importantly, consumers now view Knowledge Based Authentication as a tool for their protection; it has become a bellwether to consumers.

As the bellwether, Knowledge Based Authentication has been the perfect vehicle to introduce new and more complex authentication methods to consumers, without them even knowing it. KBA has allowed us to familiarize consumers with out-of-band authentication and IVR, and I have little doubt that it will be one of the tools to play a part in the introduction of voice biometrics to help prevent consumer fraud.

Is it always appropriate to present questions to every consumer? No, but that's where a true risk-based approach comes into play. Is Knowledge Based Authentication always a valuable component of a risk-based authentication tool to minimize fraud losses as part of an overall approach to fraud best practices? Absolutely; always. DING!
--by Andrew Gulledge

Definition and examples
Knowledge Based Authentication (KBA) is when you ask a consumer questions to which only they should know the answer. It is designed to prevent identity theft and other kinds of third-party fraud. Examples of Knowledge Based Authentication (also known as out-of-wallet) questions include "What is your monthly car payment?" or "What are the last four digits of your cell number?" KBA, and the fraud analytics associated with it, is an important part of your fraud best practices strategy.

What makes a good KBA question?

High percentage correct
A good Knowledge Based Authentication question will be easy to answer for the real consumer. Thus we tend to shy away from questions for which a high percentage of consumers give the wrong answer. Using too many of these questions will contribute to false positives in your authentication process (i.e., failing a good consumer). False positives can be costly to a business, either by losing a good customer outright or by overloading your manual review queue (putting pressure on call centers, mailers, etc.).

High fraud separation
It is appropriate to make an exception, however, if a question with a low percentage correct tends to show good fraud detection. (After all, most people use a handful of KBA questions during an authentication session, so you can leave a little room for error.) Look at the fraudsters who successfully get through your authentication process and see which questions they got right and which they got wrong. The Knowledge Based Authentication questions that are your best fraud detectors will have a lower percentage correct in your fraud population than in the overall population. This difference is called fraud separation, and it is a measure of the question's capacity to catch the bad guys (a quick sketch of this calculation follows at the end of this post).

High question generability
A good Knowledge Based Authentication question will also be generable for a high percentage of consumers. It's admirable to beat your chest and say your KBA tool offers 150 different questions, but it's a much better idea to generate a full (and diverse) question set for over 99 percent of your consumers. Some KBA vendors tout a high number of questions, but some of these can only be generated for one or two percent of the population (if that). And, while it's nice to be able to ask for a consumer's SCUBA certification number, this kind of question is not likely to have much effect on your overall production.
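Here is the fraud separation sketch mentioned above: it compares each question's percent-correct in the overall population with its percent-correct in the known-fraud population. The session data and question names are made up for illustration.

# A minimal sketch of measuring fraud separation per KBA question.

def pct_correct(sessions, question):
    asked = [s for s in sessions if question in s["answers"]]
    return sum(s["answers"][question] for s in asked) / len(asked)

def fraud_separation(all_sessions, fraud_sessions, question):
    """Higher separation = better fraud detector (fraudsters miss it more often)."""
    return pct_correct(all_sessions, question) - pct_correct(fraud_sessions, question)

# each session records which questions were asked and whether they were answered correctly
all_sessions = [
    {"answers": {"car_payment": True, "last4_cell": True}},
    {"answers": {"car_payment": True, "last4_cell": True}},
    {"answers": {"car_payment": False, "last4_cell": True}},
    {"answers": {"car_payment": True, "last4_cell": True}},
]
fraud_sessions = [
    {"answers": {"car_payment": False, "last4_cell": True}},
    {"answers": {"car_payment": False, "last4_cell": True}},
]

for q in ("car_payment", "last4_cell"):
    print(q, round(fraud_separation(all_sessions, fraud_sessions, q), 2))
# car_payment shows separation (0.75 vs 0.0); last4_cell shows none (1.0 vs 1.0)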
By: Tom Hannagan

Understanding RORAC and RAROC
I was hoping someone would ask about these risk management terms...and someone did. The obvious answer is that the "A" and the "O" are reversed. But there's more to it than that.

First, let's see how the acronyms were derived. RORAC is Return on Risk-Adjusted Capital. RAROC is Risk-Adjusted Return on Capital. Both of these five-letter abbreviations are a step up from ROE. This is natural, I suppose, since ROE, meaning Return on Equity of course, is merely a three-letter profitability ratio. A serious breakthrough in risk management and profit performance measurement will have to move up to at least six initials in its abbreviation. Nonetheless, ROE is the jumping-off point towards both RORAC and RAROC.

ROE is generally Net Income divided by Equity, and ROE has many advantages over Return on Assets (ROA), which is Net Income divided by Average Assets. I promise, really, no more new acronyms in this post. The calculations themselves are pretty easy. ROA tends to tell us how effectively an organization is generating general ledger earnings on its base of assets. This used to be the most popular way of comparing banks to each other and for banks to monitor their own performance from period to period. Many bank executives in the U.S. still prefer to use ROA, although this tends to be those at smaller banks.

ROE tends to tell us how effectively an organization is taking advantage of its base of equity, or risk-based capital. This has gained in popularity for several reasons and has become the preferred measure at medium and larger U.S. banks, and all international banks. One huge reason for the growing popularity of ROE is simply that it is not asset-dependent. ROE can be applied to any line of business or any product. You must have "assets" for ROA, since one cannot divide by zero. Hopefully your Equity account is always greater than zero. If not, well, let's just say it's too late to read about this general topic.

The flexibility of basing profitability measurement on contribution to Equity allows banks with differing asset structures to be compared to each other, and may even allow banks to be compared to other types of businesses. The asset-independency of ROE can also allow a bank to compare internal product lines to each other. Perhaps most importantly, this permits looking at the comparative profitability of lines of business that are almost complete opposites, like lending versus deposit services, including risk-based pricing considerations. This would be difficult, if even possible, using ROA.

ROE also tells us how effectively a bank (or any business) is using shareholders' equity. Many observers prefer ROE, since equity represents the owners' interest in the business. As we have all learned anew in the past two years, their equity investment is fully at-risk. Equity holders are paid last, compared to other sources of funds supporting the bank. Shareholders are the last in line if the going gets rough. So, equity capital tends to be the most expensive source of funds, carrying the largest risk premium of all funding options. Its successful deployment is critical to the profit performance, even the survival, of the bank. Indeed, capital deployment, or allocation, is the most important executive decision facing the leadership of any organization.

So, why bother with RORAC or RAROC? In short, it is to take risk more fully into account in the measurement process within the institution.
ROA and ROE are somewhat risk-adjusted, but only on a point-in-time basis and only to the extent risks are already mitigated in the net interest margin and other general ledger numbers. The Net Income figure is risk-adjusted for mitigated (hedged) interest rate risk, for mitigated operational risk (insurance expenses) and for the expected risk within the cost of credit (loan loss provision). The big risk management elements missing in general ledger-based numbers include: market risk embedded in the balance sheet and not mitigated, credit risk costs associated with an economic downturn, unmitigated operational risk, and essentially all of the strategic risk (or business risk) associated with being a banking entity. Most of these risks are summed into a lump called Unexpected Loss (UL). Okay, so I fibbed about no more new acronyms. UL is covered by the Equity account, or the solvency of the bank becomes an issue.

RORAC is Net Income divided by Allocated Capital. RORAC doesn't add much risk-adjustment to the numerator, general ledger Net Income, but it can take into account the risk of unexpected loss. It does this by moving beyond just book or average Equity and allocating capital, or equity, differentially to various lines of business and even specific products and clients. This, in turn, makes it possible to move towards risk-based pricing at the relationship management level as well as portfolio risk management. This equity, or capital, allocation should be based on the relative risk of unexpected loss for the different product groups. So, it's a big step in the right direction if you want a profitability metric that goes beyond ROE in addressing risk. And, many of us do.

RAROC is Risk-Adjusted Net Income divided by Allocated Capital. RAROC does add risk-adjustment to the numerator, general ledger Net Income, by taking into account the unmitigated market risk embedded in an asset or liability. RAROC, like RORAC, also takes into account the risk of unexpected loss by allocating capital, or equity, differentially to various lines of business and even specific products and clients. So, RAROC risk-adjusts both the Net Income in the numerator AND the allocated Equity in the denominator. It is a fully risk-adjusted metric or ratio of profitability and is an ultimate goal of modern risk management.

So, RORAC is a big step in the right direction, and RAROC is the full step in the management of risk. RORAC can be a useful step towards RAROC. RAROC takes ROE to a fully risk-adjusted metric that can be used at the entity level. This can also be broken down for any and all lines of business within the organization. Thence, it can be further broken down to the product level, the client relationship level, and summarized by lender portfolio or various market segments. This kind of measurement is invaluable for a highly leveraged business that is built on managing risk successfully as much as it is on operational or marketing prowess.
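To make the four ratios concrete, here is a small worked example with made-up figures. The "risk adjustment" line stands in for the unmitigated market-risk (and other) adjustments to general-ledger Net Income that RAROC requires; every number below is an assumption for illustration only.

# Illustrative calculation of ROA, ROE, RORAC and RAROC (all figures assumed, $MM).
net_income        = 12.0     # general-ledger net income
avg_assets        = 1_000.0  # average assets
book_equity       = 100.0    # book equity
allocated_capital = 80.0     # capital allocated for unexpected loss
risk_adjustments  = 2.5      # additional risk adjustment to income (e.g. unmitigated market risk)

roa   = net_income / avg_assets                              # Return on Assets
roe   = net_income / book_equity                             # Return on Equity
rorac = net_income / allocated_capital                       # Return on Risk-Adjusted Capital
raroc = (net_income - risk_adjustments) / allocated_capital  # Risk-Adjusted Return on Capital

for name, value in [("ROA", roa), ("ROE", roe), ("RORAC", rorac), ("RAROC", raroc)]:
    print(f"{name}: {value:.2%}")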
Round 1 – Pick your corner

There seem to be two viewpoints in the market today about Knowledge Based Authentication (KBA): one positive, one negative. Depending on the corner you choose, you probably view it as either a tool to help reduce identity theft and minimize fraud losses, or a deficiency in the management of risk and the root of all evil. The opinions on both sides are pretty strong, and biases "for" and "against" run pretty deep.

One of the biggest challenges in discussing Knowledge Based Authentication as part of an organization's identity theft prevention program is the perpetual confusion between dynamic out-of-wallet questions and static "secret" questions. At this point, most people in the industry agree that static secret questions offer little consumer protection. Answers are easily guessed or easily researched, and if the questions are preference-based (like "what is your favorite book?") there is a good chance the consumer will fail the authentication session because they forgot the answers or the answers changed over time.

Dynamic Knowledge Based Authentication, on the other hand, presents questions that were not selected by the consumer. Questions are generated from information known about the consumer – things the true consumer would know and a fraudster most likely wouldn't. The questions posed during Knowledge Based Authentication sessions aren't designed to "trick" anyone but a fraudster, though a best-in-class product should offer a number of features and options that allow for flexible configuration of the product and deployment at multiple points of the consumer life cycle without impacting the consumer experience.

The two are as different as night and day. Do those who consider "secret questions" to be Knowledge Based Authentication also consider the password portion of the user name and password process to be KBA? If you want to hold to strict logic and definition, one could argue that a password meets the definition of Knowledge Based Authentication, but common sense and practical use cause us to differentiate it, which is exactly what we should do with secret questions – differentiate them from true KBA. KBA can provide strong authentication or be part of a multifactor authentication environment without a negative impact on the consumer experience.

So, for the record, when we say KBA we mean dynamic, out-of-wallet questions, the kind that are generated "on the fly" and delivered to a consumer via "pop quiz" in a real-time environment; and we think this kind of KBA does work. As part of a risk management strategy, KBA has a place within the authentication framework as a component of risk-based authentication... and risk-based authentication is what it is really all about.
Many compliance regulations, such as the Red Flags Rule, USA PATRIOT Act, and ESIGN, require specific identity elements to be verified and specific high risk conditions to be detected. However, there is still much variance in how individual institutions reconcile referrals generated from the detection of high risk conditions and/or the absence of identity element verification.

With this in mind, risk-based authentication (defined in this context as the "holistic assessment of a consumer and transaction with the end goal of applying the right authentication and decisioning treatment at the right time") offers institutions a viable strategy for balancing the following competing forces and pressures:

• Compliance – the need to ensure each transaction is approved only when compliance requirements are met;
• Approval rates – the need to meet business goals in the booking of new accounts and the facilitation of existing account transactions;
• Risk mitigation – the need to minimize fraud exposure at the account and transaction level.

A flexibly-designed risk-based authentication strategy incorporates a robust breadth of data assets, detailed results, granular information, targeted analytics and automated decisioning. This allows an institution to strike a harmonious balance (or at least something close to that) between the need to remain compliant, the need to approve the vast majority of applications or customer transactions and, oh yeah, the need to minimize fraud and credit risk exposure.

Sole reliance on a binary assessment of the presence or absence of high risk conditions and identity element verifications will, more often than not, create an operational process that is overburdened by manual referral queues and leave an unnecessary proportion of viable consumers unable to be serviced by your business. Use of analytically sound risk assessments and objective, consistent decisioning strategies will provide opportunities to calibrate your process to meet today's pressures and adjust to tomorrow's as well.
The value of a good decision can be $150 or more in customer net present value, while the cost of a bad decision can be $1,000 or more. For example, acquiring a new and profitable customer by making good prospecting, approval and pricing decisions may generate $150 or much more in customer net present value and help you increase net interest margin and other key metrics. The cost of a bad decision (such as approving a fraudulent applicant or inappropriately extending credit that ultimately results in a charge-off), by contrast, can be $1,000 or more.

Why is risk management decisioning important? This issue is critical because an average-sized financial institution or telecom carrier makes as many as eight million customer decisions each year (more than 20,000 per day!). To add to that, very large financial institutions make as many as 50 billion customer decisions annually. By optimizing decisions, even a small 10-to-15 percent improvement in the quality of these customer life cycle decisions can generate substantial business benefit.

Experian recommends that clients examine the types of decisioning strategies they leverage across the customer life cycle, from prospecting and acquisition to customer management and collections. By examining each type of decision, you can identify the opportunities for improvement that will deliver the greatest return on investment by leveraging credit risk attributes, credit risk modeling, predictive analytics and decision-management software.
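A back-of-the-envelope calculation shows why even a modest improvement matters at this volume. The $150 and $1,000 figures and the eight-million-decision volume come from the discussion above; the baseline bad-decision rate and the size of the improvement are purely illustrative assumptions.

# Illustrative arithmetic only; baseline_bad_rate and improvement are assumptions.
decisions_per_year  = 8_000_000  # mid-sized institution (from the text)
value_good_decision = 150        # customer NPV of a good decision (from the text)
cost_bad_decision   = 1_000      # charge-off / fraud cost of a bad decision (from the text)
baseline_bad_rate   = 0.05       # assumed share of decisions that are "bad" today
improvement         = 0.10       # assumed 10% relative reduction in bad decisions

bad_avoided    = decisions_per_year * baseline_bad_rate * improvement
annual_benefit = bad_avoided * (cost_bad_decision + value_good_decision)

print(f"Bad decisions avoided per year: {bad_avoided:,.0f}")
print(f"Illustrative annual benefit:    ${annual_benefit:,.0f}")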
By: Kennis Wong

It's true that intent is difficult to prove. It's also true that financial situations change. That's why financial institutions have not yet successfully fought off first-party fraud. However, there are some tell-tale signs of intent when you look at the consumer's behavior as a whole, particularly across all of his or her financial relationships.

For example, in a classic bust-out case, you would see that the consumer, with a pristine credit history, applies for more and more credit cards while maintaining a relatively low balance and utilization across all issuers. If you graph the number of credit cards and the number of credit applications over time, you would see two hockey-stick lines. When the accounts go bad, they do so at almost the same time. This pattern is not always apparent at the time of origination; that's why it's important to monitor frequently for account review and fraud database alerts.

On the other hand, consumers with financial difficulties have different patterns. They might have more credit lines over time, but you would see that some credit lines go delinquent while others don't. You might also see that consumers cure some lines after delinquencies... you can see their struggle to pay.

Of course the intent "pattern" is not always clear. When dealing with fraudsters in fraud account management, even with the help of the fraud database, fraud trends and fraud alerts, you will find that they change their behaviors and use new techniques.
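As a rough sketch of the bust-out signal described above (rising card counts and applications with deliberately low utilization), the following checks a consumer's monthly snapshots. The thresholds and field names are assumptions for illustration, not a production fraud rule.

# A minimal "hockey-stick" bust-out signal over monthly credit snapshots.

def bust_out_signal(history):
    """history: list of monthly snapshots, oldest first, each with
    open_cards, inquiries_6mo and utilization (0-1) across all issuers."""
    if len(history) < 6:
        return False
    recent, earlier = history[-3:], history[:-3]
    card_growth = (sum(m["open_cards"] for m in recent) / 3
                   - sum(m["open_cards"] for m in earlier) / len(earlier))
    avg_util = sum(m["utilization"] for m in recent) / 3
    new_inqs = recent[-1]["inquiries_6mo"]
    # many new cards and applications, but balances kept deliberately low
    return card_growth >= 3 and new_inqs >= 4 and avg_util < 0.20

history = [
    {"open_cards": 2, "inquiries_6mo": 0, "utilization": 0.10},
    {"open_cards": 2, "inquiries_6mo": 1, "utilization": 0.12},
    {"open_cards": 3, "inquiries_6mo": 1, "utilization": 0.11},
    {"open_cards": 5, "inquiries_6mo": 3, "utilization": 0.09},
    {"open_cards": 7, "inquiries_6mo": 5, "utilization": 0.10},
    {"open_cards": 9, "inquiries_6mo": 6, "utilization": 0.08},
]
print(bust_out_signal(history))  # True for this made-up profile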
By: Tracy Bremmer

There has been a lot of hype these days about people strategically defaulting on their mortgage loans. In other words, a consumer is underwater on their house, and so he or she makes a strategic decision to walk away from it. In these instances, the consumer is current on all of their non-mortgage accounts, but because the value of their home is less than what they owe, they make the decision to default on their mortgage loan.

Experian and Oliver Wyman teamed up to dig into this population and answer these questions:
• Does this population really exist?
• If so, what are the characteristics of this population, for example in terms of credit risk or bankruptcy scores?
• How should loan modification strategies be differentiated based on this population?

This blog will be the first of a three-part series that addresses these questions. Let's begin with the first question.

1. Does this population really exist?
The quick answer is yes – this population does indeed exist. In fact, in 2008 strategic defaulters represented 18 percent of all mortgage defaults, up 500 percent from 2004. When we conducted our study we found there were several distinct populations when it came to mortgage defaults. In fact, we classified mortgage defaulters into five categories: strategic defaulter, cash flow manager, distressed defaulter, no non-real estate trades, and pay-downs. We defined these populations as follows:
• Strategic defaulter – borrowers who are delinquent on their mortgages, even when they can afford the payment, because their loan balance exceeds the value of their home;
• Cash flow manager – borrowers facing delinquency issues with their mortgage because of temporary distress, but who continue to make payments on all credit obligations;
• Distressed defaulter – borrowers facing potential affordability issues who go delinquent on their mortgage along with other credit obligations;
• No non-real estate trades – borrowers who are delinquent on their mortgage but do not have any other non-mortgage trades with which to evaluate whether they have strategically defaulted or are in distress;
• Pay-downs – borrowers who pay down their mortgage loan.
A simplified sketch of how these definitions might translate into classification rules follows below.

In my next blog, I will address the characteristic differences in behavior between these populations. Specifically, I will evaluate what characteristics make strategic defaulters stand out from the rest and what is unique about the cash flow managers.

Source: Experian-Oliver Wyman Market Intelligence Reports; Understanding Strategic Default in Mortgage topical study / webinar, August 2009.
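Here is the simplified classification sketch referred to above. The field names and cut-offs (for example, using loan-to-value above 100 percent to separate strategic defaulters from cash flow managers) are illustrative assumptions, not the study's actual definitions.

# A rule-of-thumb sketch of sorting mortgage defaulters into the five categories.

def classify_defaulter(borrower):
    """borrower: dict describing a mortgage holder who is either delinquent on
    the mortgage or has paid the loan down; fields and cut-offs are assumptions."""
    if borrower["paid_down"]:
        return "pay-down"
    if borrower["other_trade_count"] == 0:
        return "no non-real-estate trades"
    if borrower["other_trades_delinquent"] == 0:
        # delinquent on the mortgage only; being underwater suggests a strategic choice
        return "strategic defaulter" if borrower["ltv"] > 1.0 else "cash flow manager"
    return "distressed defaulter"

print(classify_defaulter({"paid_down": False, "other_trade_count": 6,
                          "other_trades_delinquent": 0, "ltv": 1.25}))  # strategic defaulter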
On Friday, October 30th, the FTC again delayed enforcement of the "Red Flags" Rule – this time until June 1, 2010 – for financial institutions and creditors subject to the FTC's enforcement. Here's the official release: http://www.ftc.gov/opa/2009/10/redflags.shtm.

But this doesn't mean businesses get a free pass until then. The extension doesn't apply to other federal agencies that have enforcement responsibilities for institutions under their jurisdiction. And the extension also doesn't alleviate an institution's need to detect and respond to address discrepancies on credit reports.

Red Flag compliance
Implementing best practices to address identity theft under the Red Flags Rule is not just the law, it's good business. The damage to reputations and consumer confidence from a problem gone unchecked – or worse yet, unidentified – can be catastrophic. I encourage all businesses, if they haven't already done so, to use this extension as an opportunity to proactively strengthen their Red Flags Rule programs and ensure Red Flag compliance. It's an investment in protecting their most important asset – the customer.
By: Kari Michel

Most lenders use a credit scoring model in their decision process for opening new accounts; however, between 35 and 50 million adults in the US may be considered unscoreable with traditional credit scoring models. That is equivalent to 18-to-25 percent of the adult population. Due to recent market conditions and a shrinking pool of qualified candidates, lenders have placed a renewed interest in assessing the risk of this underserved population. Unscoreable consumers could be a pocket of missed opportunity for many lenders.

To assess these consumers, lenders must be able to better distinguish between consumers with a clear track record of unfavorable credit behaviors and those who are just beginning to develop their credit history. Unscoreable consumers can be divided into three populations (a simple segmentation sketch follows at the end of this post):

• Infrequent credit users: consumers who have not been active on their accounts for the past six months, and who prefer to use non-traditional credit tools for their financial needs.
• New entrants: consumers who do not have at least one account with more than six months of activity, including young adults just entering the workforce, recently divorced or widowed individuals with little or no credit history in their name, newly arrived immigrants, or people who avoid the traditional system by choice.
• Thin-file consumers: consumers who have fewer than three accounts, rarely utilize traditional credit, and likely prefer using alternative credit tools.

A study done by VantageScore® Solutions, LLC shows that a large percentage of the unscoreable population can be scored with VantageScore*, and that a portion of these consumers are credit-worthy (defined as consumers whose cumulative likelihood of becoming 90 or more days delinquent is less than 5 percent). The following is a high-level summary of the findings for consumers who had at least one trade:

Lenders can review their credit decisioning process to determine whether they have the tools in place to assess the risk of these unscoreable consumers. Within this population there is an opportunity for portfolio expansion, as demonstrated by the VantageScore study.

*VantageScore is a generic credit scoring model introduced to meet market demands for a highly predictive consumer score. It was developed as a joint venture among the three major credit reporting companies (CRCs) – Equifax, Experian and TransUnion.
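Here is the simple segmentation sketch mentioned above. The field names, cut-offs and the order in which the rules are applied are assumptions for illustration only, not the formal definitions used in the study.

# A minimal sketch of splitting unscoreable consumers into the three segments.

def unscoreable_segment(consumer):
    """consumer: dict with trade_count, months_since_last_activity and
    max_trade_age_months (age in months of the most seasoned trade)."""
    if consumer["trade_count"] < 3:
        return "thin file"
    if consumer["max_trade_age_months"] <= 6:
        return "new entrant"
    if consumer["months_since_last_activity"] > 6:
        return "infrequent credit user"
    return "scoreable"  # enough recent, seasoned history for a traditional score

print(unscoreable_segment({"trade_count": 2, "months_since_last_activity": 2,
                           "max_trade_age_months": 24}))  # thin file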