If you attended any of our past credit trends Webinars, you've heard me mention time and again how auto originations have been a standout during a period when overall consumer lending has been a challenge. In fact, total originated auto volumes topped $100B in the third quarter of 2011, a level not seen since mid-2008. But is this growth sustainable? Since bottoming at the start of 2009, originations have been on a tear for nearly three straight years. Given that, you might think that auto origination's best days are behind it. But these three key factors indicate originations may still have room to run:

1. The economy. Just as it was a factor in declining auto originations during the recession, the economy will drive continued increases in auto sales. If originations were growing during the challenges of the past couple of years, the expected improvements in the economy in 2012 will surely spur new auto originations.

2. Current cars are old. A recent study by Experian Automotive showed that automobiles on the road today have reached an all-time high of 10.6 years of age. Largely a result of the recent recession, this aging fleet will create pent-up demand for newer, more reliable vehicles.

3. Auto lending is more diversified than ever. I'm talking diversification in a couple of ways. First, auto lending has always catered to a broader credit risk range than other products, and in recent years lenders have experimented with moving even further into the subprime space; for example, VantageScore® credit score D consumers now represent 24.4% of all originations vs. 21.2% at the start of 2009. Second, there is a greater selection of lenders that cater to the auto space. With additional players like captives, credit unions and even smaller finance companies competing for new business, consumers have several options for securing a competitively priced auto loan.

With all three variables in motion, auto originations definitely have a formula for continued growth. Come find out whether auto originations do in fact continue to grow in 2012 by signing up for our upcoming Experian-Oliver Wyman credit trends Webinar.

Part II: Where are Models Most Needed Now in Mortgages? (Click here if you missed Part I of this post.)

By: John Straka

A first important question should always be: are all of your models, model uses, model testing strategies, and non-model processes sound and optimal for your business? But in today's environment, two areas in mortgage stand out where better models and decision systems are most needed now: mortgage servicing and loan-quality assurance. I will discuss loan-quality assurance in a future installment.

Mortgage servicing and loss mitigation are clearly an area where better models and new decision analytics continue to have great potential to add significant new value. At the risk of oversimplifying, it is possible that a number of the difficulties and frustrations of mortgage servicers (and regulators) and borrowers in recent years might have been lessened through more efficient automated decision tools and optimization strategies. And because these problems will persist for quite some time, it is certainly not too late to move now toward an improved future state of mortgage servicing, or to continue advancing your existing strategic direction by building on enhancements already underway.

Much has been written about the difficulties faced by mortgage servicers who have been overwhelmed by the demands of many more delinquent and defaulted borrowers and by very extensive, evolving government involvement in new programs, performance incentives and standards. A strategic question on the minds of many executives in the industry today seems to be: where is all of this going? Is there a generally viable strategic direction for mortgage servicers that can help them emerge from their current issues, perhaps similar to the improved data, standards, modeling, and technologies that allowed the mortgage industry in the 1990s to emerge quite successfully from the problems of the late 1980s and early 1990s?

To review briefly, the mortgage industry's problems of the early 1990s were less severe, of course, but not dissimilar to the current environment. There had been a major home-price correction in California, in New England, and in a number of large metro areas elsewhere. A "low doc" mortgage era (and other issues) had left Citicorp nearly insolvent, for example, and caused other significant losses on top of those generated by the home-price declines. A major source of mortgage funding, the Savings & Loan industry, had largely collapsed, with losses having to be resolved by a special government agency.

Statistical mortgage credit scoring and automated underwriting grew out of the improved data, standards, modeling, and technologies that allowed the mortgage industry to recover in the 1990s. Mortgages caught up with the previously established use of this decision technology in cards, autos, and other products, and the industry benefited from reduced costs and significant gains in efficiency and risk management. An important question today is: is a similar "renaissance," so to speak, now in the offing or at hand for mortgage servicers, despite all of the still-ongoing problems?
Let me offer here a very simple analogy, with a disclaimer that this is only a basic starting viewpoint, an oversimplification; mortgage servicing and loss mitigation is extraordinarily complex in its details and often seems only to grow more complex by the day, with added constraints and uncertainties piling on. The analogy is this: consider the loan-level Net Present Value (NPV), or other key objective of loan-level decisions in servicing and loss mitigation, to be analogous to the statistically based mortgage default "score" of automated underwriting for originations in the 1990s.

Viewed this way, a simple question stemming from the figure below is: can you reduce costs and better satisfy borrowers and performance standards by automating more decisions and focusing your servicing representatives primarily on the "Refer" group of borrowers? A corollary question is: can more automated, model-based decision engines confidently reduce costs and add insight and efficiency in servicing the lowest- and highest-NPV delinquent borrowers as well as the Refer range? Another is: are new government-driven performance standards helping, hindering, or even preventing particular moves toward this type of objective?

Is this a generally viable strategic direction for the future (or even the present) of mortgage servicing? Is it your direction today? What is your vision for the future of your quality mortgage servicing?
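To picture the analogy in code, here is a minimal sketch, in Python, of the kind of band-based triage described above: loans whose model-estimated NPV outcome is clearly favorable or clearly unfavorable are decided automatically, and the ambiguous middle band is referred to a servicing representative. This is an illustration only, not an actual servicing decision engine; the class, function and field names, and the dollar thresholds, are all hypothetical.

```python
# A minimal sketch (not an actual decision engine) of the "auto-decide vs. Refer"
# triage the analogy describes: loans whose estimated NPV of a workout is clearly
# better or clearly worse than foreclosure are decided automatically, and the
# uncertain middle band is routed to a servicing representative.
# All names and thresholds here are hypothetical, chosen only for illustration.

from dataclasses import dataclass

@dataclass
class DelinquentLoan:
    loan_id: str
    npv_modification: float   # model-estimated NPV of offering a workout
    npv_foreclosure: float    # model-estimated NPV of proceeding to foreclosure

def route_loan(loan: DelinquentLoan,
               auto_approve_margin: float = 25_000.0,
               auto_decline_margin: float = -10_000.0) -> str:
    """Return 'auto-approve workout', 'auto-decline workout', or 'refer'."""
    npv_gain = loan.npv_modification - loan.npv_foreclosure
    if npv_gain >= auto_approve_margin:
        return "auto-approve workout"   # clearly better to modify
    if npv_gain <= auto_decline_margin:
        return "auto-decline workout"   # clearly better not to modify
    return "refer"                      # ambiguous band: human review

if __name__ == "__main__":
    pipeline = [
        DelinquentLoan("A-100", npv_modification=180_000, npv_foreclosure=140_000),
        DelinquentLoan("A-101", npv_modification=95_000,  npv_foreclosure=110_000),
        DelinquentLoan("A-102", npv_modification=130_000, npv_foreclosure=125_000),
    ]
    for loan in pipeline:
        print(loan.loan_id, "->", route_loan(loan))
```

In practice the cutoffs would come from model validation and the applicable performance standards; the point of the sketch is only the routing structure, with representatives concentrated on the Refer band.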

By: Joel Pruis

One might consider this topic redundant to the last submission on application requirements, and that assessment would be partially true. As such, we are not going to go over the data that has already been collected in the application, such as the demographic information of the applicant and guarantors or the business and personal financial information. That discussion, like Elvis, has "left the building." Rather, we will discuss the use of additional data to support the underwriting/decisioning process, namely:

- Personal/consumer credit data
- Business data
- Scorecards
- Fraud data

Let's get a given out in the open: personal credit data has a high correlation to the payment performance of a small business, and the smaller the business, the higher the correlation. "Your honor, counsel requests the above be stipulated in the court records." "So stipulated for the record." "Thank you, your honor." With that put to rest (remember, you can always comment on the blog if you have any questions or want to comment on any of the content), the real debate in small business lending revolves around the use of business data.

Depth and availability of business data

There are some challenges with the gathering and dissemination of business data for use in decisioning, mainly around the history of the data for the individual entity. A consumer is a single entity; for the vast majority of consumers, one does not bankrupt one entity and then start a new person to refresh one's credit history. No, that is simply bankruptcy, and the bankruptcy stays with the individual. Businesses, however, can and do close one entity and start up another. Restaurants and general contractors come to mind as two examples where an individual will start a business, go bankrupt, and then start another business under a new entity, repeating the cycle multiple times. While this scenario is a challenge, one cannot refute the need to know how both the individual consumer and the individual business are handling their obligations, whether those are credit cards, auto loans or trade payables.

I once worked for a bank president at a small community bank who challenged me with the following mantra: "It's not what you know that you don't know that can hurt you; it is what you think you know but really don't that hurts you the most." I will admit that it took me a while to digest that statement when I first heard it. Once fully digested, though, it proved quite insightful. How many times do we think we know something when we really don't? How many times do we act on an assumed understanding, only to find that our understanding was flawed? How sound was our decision when we had that flawed understanding?

The same holds true for the use (or lack thereof) of business information. We assume that we don't need business information because it will not tell us much as it relates to our underwriting. How can the business data be relevant to our underwriting when we know that the business's performance is highly correlated to the performance of the owner?

Let's look at a study done a couple of years ago by the Business Information group at Experian. The data comes from a whitepaper titled "Predicting Risk: the relationship between business and consumer scores," published in 2008. The purpose of the study was to determine which goes bad first, the business or the owner. If you're interested, you can download the full study here. At a high level, the data shows the following:
A majority of the time, and without any additional segmentation, the business will show signs of stress before the owner. If we segment the data by length of time in business, we see some additional insights.

Figure: Distribution of businesses by years in business

The interesting distinction is that, based on the age of the business, the owner goes bad before the business when the business is 5 years old or less. Once we get beyond the 5-year point, the "first bad" shifts to the business. In either case, there is no clear case for excluding one data source in favor of the other when predicting risk in a small business origination process. While the business goes bad first in the overall majority of cases, and the owner is more likely to go bad first for a young small business, in both situations there is still a significant population where the inverse is true.

Bottom line: gathering both the business and the consumer data allows the financial institution to make a better and more informed decision. In other words, it prevents the damage caused by "thinking we know something when we really don't."

Coming up next month: Decisioning Strategies.
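To make the "use both data sources" bottom line concrete, here is a minimal sketch, in Python, of a blended decisioning input that draws on both the owner's consumer credit data and the business's own credit file, with years in business deciding which signal carries more weight. This is not Experian's product or the whitepaper's model; the scale bounds, weights, field names, and the use of the 5-year point as a weighting cutoff are assumptions for illustration only.

```python
# A minimal illustrative sketch of combining owner (consumer) and business
# credit signals in a small business origination decision. All scales, weights,
# and the 5-year cutoff are assumptions for illustration, not a real model.

from dataclasses import dataclass

@dataclass
class Applicant:
    business_name: str
    years_in_business: float
    owner_consumer_score: int    # assumed 300-850 consumer-style scale
    business_credit_score: int   # assumed 1-100 commercial-style scale

def normalize(score: float, lo: float, hi: float) -> float:
    """Map a raw score onto [0, 1] given its assumed scale bounds."""
    return max(0.0, min(1.0, (score - lo) / (hi - lo)))

def blended_risk_signal(app: Applicant) -> float:
    """Blend normalized owner and business signals into a single 0-1 value.

    Per the study cited in the post, the owner tends to go bad first when the
    business is 5 years old or less, so younger businesses lean more on the
    owner's consumer data; older businesses lean more on the business file.
    """
    owner = normalize(app.owner_consumer_score, 300, 850)
    business = normalize(app.business_credit_score, 1, 100)
    if app.years_in_business <= 5:
        return 0.7 * owner + 0.3 * business
    return 0.4 * owner + 0.6 * business

if __name__ == "__main__":
    app = Applicant("Main Street Bakery", years_in_business=3,
                    owner_consumer_score=690, business_credit_score=42)
    print(f"blended signal: {blended_risk_signal(app):.2f}")
```

Whatever the weighting scheme, the design point is the post's: require both the consumer and the business file as inputs, rather than assuming one can stand in for the other.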


