All posts by Guest Contributor


By: Joel Pruis

The debate on what constitutes a small business application is probably second only to the ongoing debate around centralized vs. decentralized loan authority (but we will get to that topic in a couple of blogs). We have a couple of topics that need to be considered in this discussion, namely:

1. When is an application an application?
2. Do you process an incomplete application?

When is an application an application?

Any request by a small business with annual sales of $1,000,000 or less falls under Reg B. As we all know, because of this regulation we have to maintain proper records of when we received an application and when a decision on the application was made as well as communicated to the client. To keep yourself out of trouble, I recommend that there be a small business application form (paper or electronic) and that your small business application procedures clearly state the information required for a completed application. The form removes ambiguity from the application process and helps with the compliance documentation.

One thing is for certain – when you request a personal credit bureau on the small business owner(s)/guarantor(s) and you do not currently have any credit exposure to the individual(s), you have received an application, and to this there is no debate.

Bottom line: you need to define your application, and do so using objective criteria. Subjective criteria leave room for interpretation, and individual interpretation leaves doubt in the compliance area.

Information requirements

Whether you use a generic or custom small business scorecard or no scorecard at all, there are some baseline data segments that are important to collect on the small business applicant:

- Requested amount and purpose for the funds
- Collateral (if necessary based upon the product terms and conditions)
- General demographics on the business: name and location, business entity type (corporation, LLC, partnership, etc.), product and/or service provided, length of time in business, current banking relationship
- General demographics on the owners/guarantors: names and addresses, current banking relationship, length of time with the business
- External data reports on the business and/or guarantors: business report, personal credit bureau on the owners/guarantors
- Financial statements (??) – we'll talk about that in part II of this post.

The demographics and the existing banking relationship are likely not causing any issues with anyone, and the requested amount and use of funds are elementary to the process. Probably the greatest debate is around the collection of financial information, and we are going to save that debate for the next post.

The non-financial information noted above provides sufficient data to pull personal credit bureaus on the owners/guarantors and the business bureau on the actual borrower. We have even noted some additional data informing us of the length of time the business has been in existence and where the banking relationship is currently held for both the business and the owners. But what additional information should be requested, or should I say required?

We have to remember that the application not only supports the ability to render a decision but also supports the ability to document the loan and may even serve as a portion of the loan documentation. We need to consider the following:

- How standardized are the products we offer?
- Do we allow for customization of the collateral to be offered?
- Do we have standard loan/fee pricing?
- Is automatic debit for the loan payments required? Optional? Not available?
- Are personal guarantees required? Optional?

We again go back to the 80/20 rule. Product standardization is beneficial and optimal when we have high volumes and low dollars. The smaller the dollar size of the request/relationship, the more standardized our products need to be, and as a result our application can be more streamlined. When we do not negotiate rate, we do not need a space to note the requested rate. When we do not negotiate on personal guarantees, we always require that personal financial information be collected on all owners of the business (with some exceptions for very small ownership interests). Auto-debit for the loan payments means we always need some form of a DDA account with our institution. I think you get the point: for the highest volume of applications we standardize and thus streamline the process through the removal of ambiguity.

Do you process an incomplete application?

The most common argument for processing an incomplete application is that if we know we are going to decline the application based upon information on the personal credit bureau, why go through the effort of collecting and spreading the financial information? Two significant factors make this argument moot: customer satisfaction and fair lending regulation.

Customer satisfaction

This is based upon the ease of doing business with the financial institution – more specifically, the number of contact points or information requests that are required during the process. Ideally, the number of contact points required once the applicant has decided to make a financing request should be minimal, with the information requirements clearly communicated up front and fully collected prior to rendering a decision. The idea that a quick "no" is preferable to submitting a full application actually works to make the declination process more efficient than the approval process. In other words, we are making the process more efficient and palatable for the clients we do NOT consider acceptable versus the clients that ARE acceptable. Secondly, if we accept and process incomplete applications, we are actually mis-prioritizing the application volume. Incomplete applications should never be processed ahead of completed packages, yet under the quick-no objective the incomplete application is processed ahead of completed applications simply based upon date and time of submission. Consequently, we are actually incenting and fostering the submission of incomplete applications by our lenders. Bluntly, this is a backward approach that only serves to make the life of the relationship manager more efficient, not the client's.

Fair lending regulation

This perspective poses a potential issue when it comes to consistency. In my 10 years working with hundreds of financial institutions, only in a very small minority of cases have I encountered a financial institution willing to state with absolute certainty that a particular characteristic will cause an application to be declined 100% of the time. As a result, I wish to present this scenario:

- Applicant A provides an incomplete application (missing financial statements, for example).
- The application is processed in an incomplete status with personal and business bureaus pulled.
- The personal credit bureau has blemishes, which causes the financial institution to decline the application.
- The process is complete.

- Applicant B provides a completed application package with financial statements.
- The application is processed with personal and business bureaus pulled, financial statements spread and analysis performed.
- The personal credit bureau has the same blemishes as Applicant A.
- The financial performance prompts the underwriter or lender to pursue an explanation of why the blemishes occurred, and the response is acceptable to the lender/underwriter.

Assuming Applicant A had similar financial performance, we have a case of inconsistency due to a portion of the information that we "state" is required for an application to be complete yet was not received prior to rendering the decision. Bottom line, the approach creates doubt with respect to inconsistent treatment, and we need to avoid any potential doubt in the minds of our regulators.

Let's go back to the question of financial statements. Check back Thursday for my follow-up post, or part II, where we'll cover the topic in greater detail.
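The completeness test itself should be mechanical so that nothing is left to individual interpretation. As a minimal sketch only – the field names below are hypothetical, not a prescribed schema – an objective completeness check might look like this:

```python
# Illustrative only: each institution should define its own required items in its
# small business application procedures. Field names here are hypothetical.
REQUIRED_ITEMS = [
    "requested_amount", "purpose", "business_name", "business_location",
    "entity_type", "years_in_business", "owner_names", "personal_credit_consent",
]

def missing_items(application: dict) -> list:
    """Return the objectively defined items still missing from an application."""
    return [item for item in REQUIRED_ITEMS if not application.get(item)]

app = {
    "requested_amount": 50_000,
    "purpose": "working capital",
    "business_name": "Example Hardware LLC",
    "entity_type": "LLC",
}

gaps = missing_items(app)
if gaps:
    print("Incomplete application - hold for the missing items:", gaps)
else:
    print("Complete application - route for bureau pulls and underwriting.")
```

Everything else – bureau pulls, spreading, decisioning – waits until the list of gaps is empty, which is exactly the discipline the fair lending scenario above argues for.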

Published: January 25, 2012 by Guest Contributor

Within the world of cyber security, a great deal of attention has been focused lately on the escalating hazards and frequency of data breaches, with considerable discussion of the high cost of such breaches. But as the industry has assessed the financial toll of breaches, it has never taken into account how data breaches harm reputations, brand image and, consequently, a company's bottom line. Until now.

A recently released Ponemon Institute study, sponsored by Experian's Data Breach Resolution and believed to be the first of its kind, explores the "Reputation Impact of a Data Breach" to provide more context for the full scope of data breaches. The findings draw enlightening conclusions about the financial toll that data breaches take on harmed corporate reputations, including these key takeaways:

- Reputation is one of an organization's most important and valuable assets. Reputation and brand image are perceived as very valuable – and highly vulnerable to negative events, including a data breach.
- Calculating the value of reputation and brand reveals how valuable these assets are to an organization. The average value of brand and reputation for the study's participating organizations was determined to be approximately $1.5 billion. Depending upon the type of information lost as a result of the breach, the average loss in the value of the brand ranged from $184 million to more than $330 million. Depending upon the type of breach, the value of brand and reputation could decline by as much as 17 percent to 31 percent.
- Not all data breaches are equal. Some breaches are more devastating than others to an organization's reputation and brand image, with the loss or theft of customer information ranked as the most devastating (followed by confidential financial business information and confidential non-financial business information).
- Data breaches occur in most organizations represented in this study and have at least a moderate or significant impact on reputation and brand image. According to 82 percent of respondents, their organizations have had a data breach involving sensitive or confidential information. Fifty-three percent say the data breaches had a moderate impact on reputation and brand image, and 23 percent say the impact was significant.
- Most organizations in the study have had a data breach involving the theft of sensitive or confidential business information. On average these types of breaches have occurred 2.9 times in surveyed organizations, with the theft or loss of confidential financial information having the most significant impact on reputation and brand.
- Respondents strongly believe in understanding the root cause of the breach and protecting victims from identity theft. When asked what their organizations did following a breach to preserve or restore brand and reputation, the top three steps were: conduct investigations and forensics, work closely with law enforcement, and protect those affected from potential harms such as identity theft.

The Ponemon study clearly shows that when data breaches occur, the collateral damage to a company's brand and reputation becomes a significant hard cost that must be factored into the total financial loss.

Download the Ponemon Reputation Impact Study

Published: January 17, 2012 by Guest Contributor

By: Joel Pruis

Part I – New Application Volume and the Business Banker

Generating small business or business banking applications may be one of the hottest topics in this segment at this time. Loan demand is down and the pool of qualified candidates seems to be down as well. Trust me, I am not going to jump on the easy bandwagon and state that financial institutions have stopped pursuing small business loan applications. As I work across the country, I have yet to see a financial institution that is not actively pursuing small business loan applications. Loan growth is high on everyone's priority list and it will be for some time. But where have all the applicants gone?

Based upon our data, the trend in application volume from 2006 to 2010 is as follows:

[Chart: application volume by institution asset size, 2006 vs. 2010]

At face value, we see that overall applications are actually down (1,032 in 2006 to 982 in 2010), while the largest financial institutions in the study were up from 18,616 to 25,427. Furthermore, the smallest financial institutions, with assets of less than $500 million, showed a significant increase from 167 to 276 – an increase of 65% from the 2006 level! But before we get too excited, we need to look a little further.

When we talk about increasing application volume, we are focusing on applications for new exposure or a new extension of credit, not renewals. The application count in the chart above includes renewals. So let's take a look at the comparison of the new request ratio between 2006 and 2010.

[Chart: new request ratio, 2006 vs. 2010]

Using this data in combination with the total application count, we get the measurements of new application volume in actual numbers, and we can classify the change in new application volume by institution type. Once we get under the numbers, we see that the gross application figures truly don't tell the whole story. So why did the credit unions and community banks do so well while the rest held steady or dropped significantly? The answer comes down to a few factors: field resources, application requirements and underwriting criteria. In this blog we are going to focus on the first – field resources. The last two factors – application requirements and underwriting criteria – will be covered in the next two blogs. While they have a significant impact on application volume and are likely the cause of the application volume shift from 2006 to 2010, each represents a significant discussion that cannot be covered as a mere subtopic. More to come on those two items.

Field resources pursuing small business applications: the business banker

Focus. Focus. Focus. The success of the small business segment depends upon the focus of the field pursuing the applications. As we move up in the asset size of the financial institution, we see more dedicated field resources for the small business/business banking segment. Whether these roles are called business bankers, small business development officers or business banking specialists, the common denominator is that they are dedicated to the small business/business banking space. Their goals depend on their performance in this segment and they cannot pursue other avenues to achieve their targets.

When we review the financial institutions in the less-than-$20B segment, the use of a dedicated business banker begins to diminish. Marketing and/or business development segmentation is blurred at best, and the field resource is better characterized as a commercial lender or commercial relationship manager. The commercial lender is tasked with addressing business lending needs across a particular region. Goals are based upon total dollars generated, and there is no restriction outside of the legal or in-house lending limit of the specific financial institution. In this scenario, any focus on small business is left to the individual commercial lender. You will find some commercial lenders who truly enjoy and devote their efforts to the small business/business banking space. These individuals enjoy working with the smaller business for a variety of reasons, such as the consultative approach (small businesses are hungry for advice while larger businesses tend to get their advice elsewhere) or the ability to use one's lending authority. Unfortunately, while your financial institution may have such commercial lenders (ones that are truly working solely in the small business or business banking segment), changing that individual's title or formally committing them to working only in the small business/business banking segment is often perceived as a demotion. It is this perception that continues to hinder the progress of financial institutions with assets between $500 million and $20 billion from truly excelling in the small business/business banking space.

The reality is that the best field resource for generating the small business/business banking application volume available to your financial institution is the dedicated individual known as the business banker. Such an individual is capable of generating up to 250 applications per year (for the truly high performing). Even if we scale this back to 150 applications in a given year for new credit volume at an average request of $106,929 (the lowest dollar amount of the individual peer groups), the business banker would be generating total application dollars of $16,039,350. If we assume a 50% approval/closure rate, the business banker would be able to generate a total of $8,019,675 in new credit exposure annually. Such exposure would have the potential of generating a net interest margin of $240,590, assuming a 3% NIM. Not too bad.
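The production math above is simple enough to verify directly. Here is the same calculation as a short Python sketch, using only the assumptions stated in the post (150 new-credit applications, a $106,929 average request, a 50% approval/closure rate and a 3% NIM):

```python
# Back-of-the-envelope check of the business banker production figures above.
applications_per_year = 150
avg_request = 106_929          # lowest average request among the peer groups
approval_rate = 0.50           # assumed approval/closure rate
net_interest_margin = 0.03     # assumed NIM

total_application_dollars = applications_per_year * avg_request   # 16,039,350
new_exposure = total_application_dollars * approval_rate          # 8,019,675
annual_nim_dollars = new_exposure * net_interest_margin           # ~240,590

print(f"Application dollars:     ${total_application_dollars:,.0f}")
print(f"New credit exposure:     ${new_exposure:,.0f}")
print(f"Annual NIM contribution: ${annual_nim_dollars:,.0f}")
```

Swap in your own averages and approval rates to size the opportunity for your institution.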

Published: December 15, 2011 by Guest Contributor

By: Joel Pruis

Basic segmentation strategy for business banking asks the following questions:

- Is there a uniform definition of small business across the industry?
- How should small business be defined? Sales size of the applicant? Exposure to the financial institution?
- Is small business/business banking a retail or commercial line of business?

No One Size Fits All

The notion of a single definition of small business for any financial institution is inappropriate, as the intent of segmentation is to focus marketing efforts, establish appropriate products to support the segment, develop appropriate delivery methods and use appropriate risk management practices. For the purpose of this discussion we will restrict our content to developing the definition of the segment and high-level credit product terms and conditions to support the segment.

The confusion over how to define the segment is typically due to the multiple sources of such definitions. The Small Business Administration, developers of generic credit risk scorecards (such as Experian), marketing firms and the like all have multiple ways to define small business. While they all have a different method of defining small business, the important factor to consider is that each definition serves the purpose of its creator. As such, the definition of small business should serve the purpose of the specific financial institution.

A general rule of thumb is the tried and true 80/20 rule. Assess your financial institution's business-purpose portfolio by rank-ordering individual relationships by total dollar exposure. Using the 80/20 rule, determine the smallest 80% of the number of relationships by exposure. Typically the result is that the largest 20% of relationships will cover approximately 80% of the total dollars outstanding in your business-purpose portfolio. Conversely, the smallest 80% of relationships will cover only about 20% of the total dollars outstanding.

Just from this basic analysis we can see the primary need for segmentation between the business banking and the commercial (middle market, commercial real estate, etc.) portfolios. If we do not segment, we have a significant imbalance of effort vs. actual risk: if we treat all credit relationships the same, we are spending up to 80% of our time/resources on only 20% of our dollar risk. Looking at this from the other direction, we are spending only 20% of our credit resources assessing 80% of our total dollar risk. Obviously this is a very basic analysis, but any way you look at it, the risk assessment (underwriting and portfolio management) must be "right-sized" in order to provide appropriate risk management while working to maximize the return on such portfolio segments.

The reality of credit risk assessment without segmentation is that the small business segment will be managed by exception, at best. Given the large number of relationships and the small impact that the small business segment has on traditional credit quality metrics such as past dues and charge-offs, the performance of the small business portfolio can, in fact, be hidden. Such metrics focus on the percentage of dollars that are past due or charged off against the entire portfolio. Since the largest dollars are in the 20% of relationships, it will take a significant number of individual small business relationships being delinquent or charged off before the overall metric becomes alarming.
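The 80/20 assessment described above is easy to run on an actual portfolio extract. Below is a minimal pandas sketch, with synthetic data and hypothetical column names standing in for your own rank-ordered relationship exposures:

```python
import numpy as np
import pandas as pd

# Synthetic, skewed exposures purely for illustration; replace with your own
# business-purpose portfolio extract (one row per relationship).
rng = np.random.default_rng(0)
portfolio = pd.DataFrame({
    "relationship_id": np.arange(1, 1001),
    "total_exposure": rng.lognormal(mean=11, sigma=1.5, size=1000).round(0),
})

# Rank-order relationships from largest to smallest exposure.
ranked = portfolio.sort_values("total_exposure", ascending=False).reset_index(drop=True)
total_dollars = ranked["total_exposure"].sum()
ranked["cum_count_share"] = (ranked.index + 1) / len(ranked)

# Share of dollars held by the largest 20% of relationships.
top_20pct = ranked[ranked["cum_count_share"] <= 0.20]
print(f"Largest 20% of relationships hold "
      f"{top_20pct['total_exposure'].sum() / total_dollars:.0%} of total exposure")
```

If the output looks anything like the 80/20 split described above, the case for treating the two ends of the portfolio differently makes itself.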
Working with our clients in defining small business, one of the first exercises that we recommend is assessing the actual delinquency and charge-off rates in the newly defined small business/business banking portfolio. Simply put, determine the total dollars that fit the new definition, then take the charge-offs over the past 12 months for borrowers that meet the definition and divide them by the total outstandings in the new portfolio segment. Similarly, determine the current dollars past due for relationships meeting the definition of small business and divide by the total outstandings of that segment. Such results are typically quite revealing and will at least provide a baseline against which the financial institution can measure improvement and/or success. Without such initial analysis, we have witnessed financial institutions laying blame on the new underwriting and portfolio management processes for performance that existed all along but was never measured.

So basically our first attempt to define the segment has created a total credit exposure limit. Such limits should be used to determine the appropriate underwriting and portfolio management methods (both of which we will discuss further in subsequent blogs), but this type of definition does little to support a business development effort, as the typical small business does not always borrow, nor can we accurately assess the potential dollar exposure of any given business until we actually gather additional data. Thus, for business development purposes we establish the definition of small business primarily by sales size. Looking at the data from your existing relationships, your financial institution can get an accurate indication of the maximum sales size that should be considered in the business development efforts.

As a result we have our business development definition by sales size of a given company and our underwriting and portfolio management defined by total exposure. You may be thinking that such definitions are not always in sync with each other, and you would be correct. You will have some companies with total sales under your definition that borrow more than your total exposure limit, and companies whose total exposure falls under small business but whose total sales exceed the business development limit. It is impossible to catch every possibility, and to try is an exercise in futility. Better that you start with the basics of the segmentation and then measure the new applications that exceed the total exposure cap or the relationships that meet the total exposure cap but exceed the sales limitation. During the initial phase, judgment on a case-by-case basis will need to be used, guided by questions such as:

- Is the borrower that exceeds our sales limitation likely to need to borrow more in the near future?
- Is the borrower that meets our sales size requirement likely to quickly reduce its exposure to meet our definition?
- Will our underwriting techniques be adequate to assess the risk of this relationship?
- Will our portfolio monitoring methods be sufficient to assess the changes in the risk profile after it has been booked?
- Will the relationship management structure be sufficient to support such a borrower?

As you encounter these situations, the frequency and consistency of such exceptions to the existing definition will become obvious to the financial institution and prompt adjustments and/or exclusions. But to try to create the exclusions before collecting the data or examining the actual application volumes is where the futility lies. Best to avoid the futility and act only on actual data.

Further refinement of the segment definition will also be based on the above assessment. Additional criteria will be added, such as:

- Industry segments (commercial real estate, for example)
- Product types (construction lending)

Just know that the definition will not stay static. Based upon the changes in average credit request from 2006 to 2010, changes can and will be significant. The following chart represents the average request amounts from 2010 data compared to the dollar amounts from 2006.

[Chart: average request amounts, 2010 vs. 2006]

So remember that where you start is not where you have to stay. Keep measuring, keep adjusting and your segmentation strategy will serve you very well.

Look for my next post on generating small business applications. Specifically, I'll cover who should be involved in the outbound marketing efforts of your small business segment. I look forward to your continued comments, challenges and debate as we continue our discussion around small business/business banking. And if you're interested, I'm hosting a 3-part Webinar series, Navigating Through The Challenges Affecting Portfolio Performance, that will evaluate how statistics and modeling, combined with strategies from traditional credit management, can create a stronger methodology and protect your bottom line.

Published: November 28, 2011 by Guest Contributor

By: Mike Horrocks

Earlier this week, my wife and I were discussing the dinner plans for Thanksgiving. The yams, cranberries, and pumpkin pies were purchased and the secret family recipes were pulled out of the cupboard. Everything was ready…we thought. Then the topic of the turkey was brought up. In the buzz of work, family, kids, etc., both of us had forgotten to get the turkey. We had each thought the other was covering this purchase and had scratched it off our respective lists. Our Thanksgiving dinner was at risk! This made me think of what best practices from our industry could be utilized if I was going to mitigate risks and pull off the perfect dinner. So I pulled the page from the Basel Committee on Banking Supervision that defines operational risk as "the risk of loss resulting from inadequate or failed internal processes, people, systems or external events," and I have some suggestions that I think work for both your Thanksgiving dinner and your existing loan portfolios.

First, let's cover "inadequate or failed processes." Clearly our shopping list process failed. But how are your portfolio management processes? Are they clearly documented, and can they be implemented throughout the organization? Your processes should be as well communicated and documented as the "Smashed Yam Bake" recipe or you may be at risk.

Next, let's focus on the "people and systems." People make mistakes – learn from them, correct them, and try to get the "systems" to make it so there are fewer mistakes. For example, I don't want the risk of letting the turkey cook too long, so I use a remote meat thermometer. OK, it is a little geeky; however, the turkey has come out perfect every year. What systems do you have in place to make your quarterly reviews of the portfolio more consistent and up to your standards?

Lastly, how do I mitigate those "external events"? Odds are I will still be able to get a turkey tonight. If not, I talked to a friend of mine who is a chef and I have the plans for a goose. How flexible are your operations, and how accessible are you to the subject matter experts that can get you out of those situations? A solid risk management program takes into account unforeseen events and can turn them into opportunities.

So as the Horrocks family gathered in Norman Rockwell-like fashion this Thanksgiving, a moment of thanks was given to the folks on the Basel committee. Likewise, in your next risk review, I hope you can give thanks for the minimized losses and mitigated risks. Otherwise, we will have one thing very much in common…our goose will be cooked.

Published: November 25, 2011 by Guest Contributor

By: John Straka

For many purposes, national home-price averages, MSA figures, or even zip code data cannot adequately gauge local housing markets. The higher the level of the aggregate, the less it reflects the true variety and constant change in prices and conditions across local neighborhood home markets. Financial institutions, investors, and regulators that seek out and learn how to use local housing market data will generally be much closer to true housing markets.

When houses are not good substitutes from the viewpoint of most market participants, they are not part of the same housing market. Different sizes and types and ages of homes, for example, may be in the same county, zip code, block, or even right next door to each other, but they are generally not in the same housing market when they are not good substitutes. This highlights the importance of starting with detailed granular information on local-neighborhood home markets and homes. To be sure, greater granularity in neighborhood home-market evaluation requires analysts and modelers to deal with much more data on literally hundreds of thousands of neighborhoods in the U.S. It is fair to ask if zip-code-level data, for example, might not be generally sufficient. Most housing analysts and portfolio modelers, in fact, have traditionally assumed this, believing that reasonable insights can be gleaned from zip code, county-level, or even MSA data. But this is fully adequate, strictly speaking, only if neighborhood home markets and outcomes are homogenous—at least reasonably so—within the level of aggregation used. Unfortunately, even at the zip-code level, the data suggest otherwise.

Examples

All of the home-price and home-valuation data for this report was supplied by Collateral Analytics. I have focused on zip7s, i.e. zip+2s, which are a more granular neighborhood measure than zip codes. A Hodrick-Prescott (H-P) filter was applied by Collateral Analytics to the raw home-price data in order to attenuate short-term variation and isolate the six-year trends. But as we'll see, this dampening still leaves an unrealistically high range of variation within zip codes, for reasons discussed below. Fortunately, there is an easy way to control for this, which we'll apply for final estimates of the range of within-zip variation in home-price outcomes.

The three charts below show the H-P filtered 2005-2011 percent changes in home price per square foot of living area within three different types of zip codes in San Diego County. Within the first type of zip code, 92319 in this case, the home-price changes in recent years have been relatively homogenous, with a range of -56% to -40% home-price change across the zip7s (i.e., zip+2s) in 92319. But the second type of zip code, illustrated by 92078, is more typical. In this type of case the home-price changes across the zip7s have varied much more: the 2005-2011 zip7 percent changes in home prices within 92078 varied by over 40 percentage points, from -51% to -10%. In the third type of zip code, less frequent but surprisingly common, the home-price changes across the zip7s have had a truly remarkable range of variation. This is illustrated here by zip code 92024, in which the home-price outcomes have varied from -51% to +21%, a 71 percentage-point range of difference—and this is not the zip code with the maximum range of variation observed! All of the San Diego County zip codes are summarized in the bar chart below.
Nearly two-thirds of the zip codes, 65%, have more than a 30 percentage-point within-zip difference in the 2005-2011 zip7 percent changes in home prices. 40% have more than a 40 percentage-point range of different home-price outcomes, 23% have more than a 50 percentage-point range, and 13% have more than a 70 percentage-point range of differences. The average range of the zip7 within-zip code differences is a 37 percentage-point median and a 41 percentage-point mean. These high numbers are surprising, and are most likely unrealistically high.

[Chart: Summary of within-zip (zip+2 level) ranges of variation in home-price changes in San Diego – percentage of zips by range across zip+2s in home price/living area % change, 2005-2011]

Controlling for Factors Inflating the Range of Variation

Such sizable differences within a typical single zip code clearly suggest materially different neighborhood home markets. While this qualitative conclusion is supported further below, the magnitudes of the within-zip variation in home-price changes shown above are quite likely inflated. There is a tendency for a limited number of observations in various zip7s to create statistical "noise" outliers, and the inclusion of distressed property sales here can create further outliers, with cases of both limited observations and distressed sales particularly capable of creating more negative outliers that are not representative of the true price changes for most homes and their true range of variation within zip codes. (My earlier blog on June 29th discussed the biases from including distressed property sales while trying to gauge general price trends for most properties.)

Fortunately, I've been able to access a very convenient way to control for these factors by using the zip7 averages of Collateral Analytics' AVM (Automated Valuation Model) values rather than simply the home-price data summarized above. These industry-leading AVM home valuations have been designed, in part, to filter out statistical noise problems. The bar chart below shows the still-significant zip7 ranges within San Diego County zip codes using the AVM values, but the distribution is now shifted considerably, and more realistically, toward a much smaller share of zip codes with remarkably high zip7 variation. Compared with the chart above, now just 1% of the zips have a zip7 range greater than 60 percentage points, 5% greater than 50, and 11% greater than 40, but there are still 36% greater than 30. To be sure, this distribution, and the average range of zip7 differences—now a 25 percentage-point median and a 26 percentage-point mean—do show a considerable range of local home-market variation within zip codes.

It seems fair to conclude that the typical zip code does not contain the uniformity in home-price outcomes that most housing analysts and modelers have tended to simply assume. The difference between the effects on consumer wealth and behavior of a 10% home-price decline, for example, vs. a 35 to 50% decline would seem to be sizable in most cases. This kind of difference within a zip code is not at all unusual in these data.

How About a Different Type of Urban Area—More Uniform?

It might be thought that the diversity of topography, etc., across San Diego County (from the sea to the mountains) makes its variation of home-market outcomes within zip codes unusually high. To take a quick gauge of this hypothesis, let's look at a more topographically uniform urban area: Columbus, Ohio.
When I informally polled some of my colleagues asking what their prior belief would be about the within-zip code variation in home-price outcomes in Columbus vs. San Diego County, there was unanimous agreement with my prior belief. We all expected greater within-zip uniformity in Columbus. I find it interesting to report here that we were wrong. Both the H-P filtered raw home-price information and the AVM values from Collateral Analytics show relatively greater zip7 variation within Columbus (Franklin County) zip codes than in San Diego County. The bar chart below shows the best-filtered, most attenuated results, the AVM values: 5% of the Columbus zips have a zip7 range greater than 70 percentage points, 8% greater than 60, 23% greater than 50, 35% greater than 40, and 65% greater than 30. The average range of zip7 within-zip code differences in Columbus is a 35 percentage-point median and a 38 percentage-point mean.

Conclusion

These data seem consistent with what experienced appraisers and real estate agents have been trying to tell economists and other housing analysts, investors, financial institutions and policymakers for quite a long time. Although they have quite reasonable uses for aggregate time-series and forecasting purposes, more aggregate-data-based models of housing markets actually miss a lot of the very real and material variation in local neighborhood housing markets. For home valuation and many other purposes, even models that use data down to the zip code level of aggregation—which most analysts have assumed to be sufficiently disaggregated—are not really good enough. These models are not as good as they can or should be. These facts are indicative of the greater challenge to properly define local housing markets empirically, in such a way that better data, models, and analytics can be more rapidly developed and deployed for greater profitability, and for sooner and more sustainable housing market recoveries.

I thank Michael Sklarz for providing the data for this report and for comments, and I thank Stacy Schulman for assistance with this post.
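For readers who want to gauge the same kind of within-zip dispersion on their own data, the core calculation is a simple group-by. Here is a minimal pandas sketch; the tiny inline table and column names are invented for illustration and are not the Collateral Analytics data used in this post:

```python
import pandas as pd

# Six-year price changes (percent) at the zip+2 level; illustrative values only.
zip7_changes = pd.DataFrame({
    "zip5": ["92078", "92078", "92078", "92024", "92024", "92024"],
    "zip7": ["92078.01", "92078.15", "92078.42", "92024.10", "92024.33", "92024.55"],
    "pct_change_2005_2011": [-51.0, -33.0, -10.0, -51.0, -12.0, 21.0],
})

# Within-zip range of outcomes: max minus min zip+2 change in each zip code.
within_zip_range = (
    zip7_changes.groupby("zip5")["pct_change_2005_2011"]
    .agg(lambda s: s.max() - s.min())
    .rename("range_pct_points")
)
print(within_zip_range)

# Distribution of ranges across zip codes, as summarized in the bar charts above.
for threshold in (30, 40, 50, 70):
    share = (within_zip_range > threshold).mean()
    print(f"Zips with a within-zip range above {threshold} points: {share:.0%}")
```

Running the same summary first on filtered prices and then on AVM values, as done above, shows how much of the apparent dispersion is statistical noise versus genuinely different neighborhood markets.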

Published: October 7, 2011 by Guest Contributor

With the most recent guidance newly issued by the Federal Financial Institutions Examination Council (FFIEC), there is renewed conversation about knowledge based authentication. I think this is a good thing. It brings back into the forefront some of the things we have discussed for a while, like the difference between secret questions and dynamic knowledge based authentication, or the importance of risk based authentication.

What does the new FFIEC guidance say about KBA? Acknowledging that many institutions use challenge questions, the FFIEC guidance highlights that the way challenge questions are implemented can greatly impact their efficacy. Chances are you already know this. Of greater importance, though, is the fact that the FFIEC guidelines caution against the use of less sophisticated systems and information that can be easily guessed or obtained from an Internet search, given the amount of information available.

As mentioned above, the FFIEC guidelines call for questions that "do not rely on information that is often publicly available," recommending instead a broad range of data assets on which to base questions. This is an area knowledge based authentication users should review carefully. At this point in time it is perfectly appropriate to ask, "Does my KBA provider rely on data that is publicly sourced?" If you aren't sure, ask for and review data sources. At a minimum, you want to look for the following in your KBA provider:

- Questions! Diverse questions from broad data categories, including credit and noncredit assets
- Consumer question performance as one of the elements within an overall risk-based decisioning policy
- Robust performance monitoring. Monitor against established key performance indicators and do it often
- A process to rotate questions and adjust access parameters and velocity limits. Keep fraudsters guessing!
- Use of the resources that are available to you. Experian has compiled information that you might find helpful: www.experian.com/ffiec

Finally, I think the release of the new FFIEC guidelines may have made some people wonder if this is the end of KBA. I think the answer is a resounding "No." Not only do the FFIEC guidelines support the continued use of knowledge based authentication, recent research suggests that KBA is the authentication tool identified as most effective by consumers. Where I would draw caution is when research doesn't distinguish between "secret questions" and dynamic knowledge based authentication, which we all know are very different.

Published: October 4, 2011 by Guest Contributor

By: Mike Horrocks

Have you ever been struck by a turtle or, even better, burnt by water skis that were on fire? If you are like me, these are not accidents that I think will ever happen to me, and I'm not concerned that my family doctor didn't do a rotation in medical school to specialize in treating them. On October 1, 2013, however, doctors and hospitals across the U.S. will have the ability to identify, log, bill, and track those accidents and thousands of other very specific medical events. In fact, the list will jump from the current 18,000 medical codes to 140,000 medical codes. Some people hail this as a great step toward the management of all types of medical conditions, whereas others view it as an introduction of noise into a medical system that is already overburdened. What does this have to do with credit risk management, you ask?

When I look at the amount of financial and non-financial data that the credit industry has available to understand the risk of our consumer or business clients, I wonder where we are in the range of "take two aspirins and call me in the morning" to "[the accident] occurred inside a chicken coop" (code: Y9272). Are we only identifying a risky consumer after they have defaulted on a loan? Or are we trying to find a pattern in the consumer's purchases at a coffee house that would correlate with some other data point to indicate risk when the moon is full?

The answer is somewhere in between, and it will be different for each institution. Let's start with what is known to be predictive when it comes to monitoring our portfolios - data and analytics, coupled with portfolio risk monitoring to minimize risk exposure - and then expand that over time. Click here for a recent case study that demonstrates this quite successfully with one of our clients. Next steps could include adding analytics and/or triggers to identify certain risks more specifically. When it comes to risk, incorporating attributes or a solid set of triggers that will identify risk early on and can drill down to some of the specific events, combined with technology that streamlines portfolio management processes - whether you have an existing system in place or are searching for a migration - will give you better insight into the risk profile of your consumers.

Think about where your organization lies on the spectrum. If you are already monitoring your portfolio with some of these solutions, consider the next logical step to improve the process - is it more data, advanced analytics using that data, a combination of both, or perhaps a better system to monitor the risk more closely? Wherever you are, don't let your institution have the financial equivalent of a need for the new medical codes W2202XA, W2202XD, and W2202XS (injuries resulting from walking into a lamppost once, twice, and sequentially).

Published: September 19, 2011 by Guest Contributor

By: Mike Horrocks

Let's all admit it, who would not want to be Warren Buffett for a day? While soaking in the tub, the "Sage of Omaha" came up with the idea to purchase shares of Bank of America and managed to close the deal in under 24 hours (and also make $357 million in one day thanks to an uptick in the stock). Clearly investor opinions differ when picking investments, so what did Buffett see that was worth taking that large of a risk? In interviews Buffett simply states that he saw the fundamentals of a good bank (once they fix a few things) that will return his investment many times over. He has also said that he came to this conclusion based on years of seeing opportunities where others only see risk.

So what does that have to do with risk management? First, as you look at your portfolio of customers, ask yourself which ones you are "short-selling" and risk losing, and which customers you are investing in and expecting Buffett-like returns from in the future. Second, ask yourself how you are making that "investment" decision on your customers. And lastly, ask yourself how confident you are in that decision.

If you're not employing some mode of segmentation on your portfolio today, stop and make that happen as soon as you are done reading this blog. You know what a good customer looks like, or looked like once upon a time. Admit to yourself that not every customer looks as good as they used to before 2008, and while you are not "settling," be open-minded about who you would want as a customer in the future.

Amazingly, Buffett did not have Bank of America CEO Brian Moynihan's phone number when he wanted to make the deal. This is where you are head and shoulders above Gorat's Steak House's favorite customer. You have deposit information, loan activity and performance history, credit data, and even the phone number of your customers. This gives you plenty of data and solutions to build that profile of what a good customer looks like – and thereby know who to invest in.

The next part is the hardest. How confident are you in your decision – confident enough to put your money on it? For example, my wife invested in Bank of America the day before Warren put in his $5 billion. She saw some of the same signs that he did in the bank. However, the fact that I am writing this blog is an indicator that she clearly did not invest at the scale that Warren did. But what is stopping you from going all in and investing in your customers' future? If the fundamentals of your customer segmentation are sound, any investment today in your customers will come back to you in loyalty and profits in the future.

So at the risk of conjuring up a mental image, take the last lesson from Warren Buffett's tub-soaking investment process: get up and invest in those customers who are perhaps risky today yet sound tomorrow, or run the risk of future profits going down the drain.

Published: August 30, 2011 by Guest Contributor

By: Kari Michel

The way medical debts are treated in scores may change with the introduction of the Medical Debt Responsibility Act in June 2011. The Medical Debt Responsibility Act would require the three national credit bureaus to expunge medical collection records of $2,500 or less from files within 45 days of their being paid or settled. The bill is co-sponsored by Representatives Heath Shuler (D-N.C.), Don Manzullo (R-Ill.) and Ralph M. Hall (R-Texas).

As a general rule, expunging predictive information is not in the best interest of consumers or credit granters -- both of which benefit when credit reports and scores are as accurate and predictive as possible. If any type of debt information proven to be predictive is expunged, consumers risk exposure to improper credit products, as they may appear to be more financially equipped to handle new debt than they truly are.

Medical debts are never taken into consideration by the VantageScore® credit score, developed by VantageScore Solutions LLC, if the debt reporting is known to be from a medical facility. When a medical debt is outsourced to a third-party collection agency, it is treated the same as other debts that are in collection. Collection accounts of lower than $250, or ones that have been settled, have less impact on a consumer's VantageScore® credit score. With or without the medical debt collection information, the VantageScore® credit score model remains highly predictive.

Published: August 29, 2011 by Guest Contributor

By: Mike Horrocks

The realities of the new economy and the credit crisis are driving businesses and financial institutions to better integrate new data and analytical techniques into operational decision systems. Adjusting credit risk processes in the wake of new regulations, while also increasing profits and customer loyalty, will require a new brand of decision management systems to accelerate more precise customer decisions.

A Webinar scheduled for Thursday will show you how blending business rules, data and analytics inside a continuous-loop decisioning process can empower your organization to control marketing, acquisition and account management activities to minimize risk exposure while ensuring portfolio growth. Topics include:

- What the process is and the key building blocks for operating one over time
- Why the process can improve customer decisions
- How analytical techniques can be embedded in the change control process (including data-driven strategy design or optimization)

If you're interested, check out more - there is still time to register for the Webinar. And if you just want to see a great video, check out this intro.

Published: August 24, 2011 by Guest Contributor

Consumer credit card debt has dipped to levels not seen since 2006, and the memory of pre-recession spending habits continues to get hazier with each passing day. In May, revolving credit card balances totaled over $790 billion, down $180 billion from mid-2008 peak levels. Debit and prepaid volume accounted for 44%, or nearly half, of all plastic spending, growing substantially from 35% in 2005 and 23% a decade ago. Although month-to-month tracking suggests some noise in the trends, as illustrated by the slight uptick in credit card debt from April to May, the changes we are seeing are not at all temporary. What we are experiencing is a combination of many factors, including the aftermath of recession tightening, changes in the level of comfort with financing non-essential purchases, the "new boomer" population entering the workforce in greater numbers and the diligent efforts of Gen Xers to improve the general household wallet composition.

How do card issuers shift existing strategies?

Baby boomers are entering that comfortable stage of life where incomes are higher and expenses are beginning to trail off as the last child is put through college and mortgage payments are predominantly applied toward principal. This group worries more about retirement investments and depressed home values, and as such they demand high value for their spending. Rewards-based credit continues to resonate well with this group. Thirty years ago, baby boomers watched as their parents used cash, money orders and teller checks to manage finances, but today's population has access to many more options and is highly educated. As such, this group demands value for their business, and a constant review of competitive offerings and development of new, relevant rewards products are needed to sustain market share.

The younger generation is focused on technology. Debit and prepaid products accessible through mobile apps are more widely accepted by this group, unlike ten to fifteen years ago when multiple credit cards with four-figure credit limits each were provided to college students on a large scale. Today's new boomer is educated on the risks of using credit, while at the same time parents are apt to absorb more of their children's monthly expenses. Servicing this segment's needs, while helping them to establish a solid credit history, will result in long-term penetration of a growing segment.

The recent CARD Act and subsequent amendments have taken a bite out of revenue previously used to offset the increased risk and related costs that allowed card issuers to service the near-prime sector. However, we are seeing a trend of new lenders getting into the credit card game while existing issuers start to slowly evaluate the next tier. After six quarters of consistent credit card delinquency declines, we are seeing slow signs of relief. The average VantageScore for new card originations increased by 8 points from the end of 2008 into early 2010, driven by credit tightening actions, and has started to slowly come back down in recent months.

What next?

What all of this means is that card issuers have to be more sophisticated with risk management and marketing practices. The ability to define segments through the use of alternate data sources and access channels is critical to the ongoing capture of market share and profitable usage. First, the segmentation will need to identify the "who" and the "what": who wants which products, how much credit a consumer is eligible for, and what rate, terms and rewards structure will be required to achieve desired profit and risk levels, particularly as the economy continues to teeter between further downturn and, at best, slow growth. By incorporating new modeling and data intelligence techniques, we are helping sophisticated lenders cherry-pick the non-super-prime prospects and offering guidance on aligning products that best balance risk and reward dynamics for each group. If done right, card issuers will continue to service a diverse universe of segments and generate profitable growth.

Published: August 22, 2011 by Guest Contributor

What happens when once-desirable models begin to show their age? Not the willowy, glamorous types that prowl high-fashion catwalks, but rather the aging scoring models you use to predict risk and rank-order various consumer segments. Keeping a fresh face on these models can return big dividends, in the form of lower risk, accurate scoring and higher-quality customers. In this post, we provide an overview of custom attributes and present the benefits of overlaying current scoring models with them. We also suggest specific steps communications companies can take to improve the results of an aging or underperforming model.

The beauty of custom attributes

Attributes are highly predictive variables derived from raw data. Custom attributes, like those you've created in house or obtained from third parties, can provide deeper insights into specific behaviors, characteristics and trends. Overlaying your scoring model with custom attributes can further optimize its performance and improve lift. Often, the older the model, the greater the potential for improvement.

Seal it with a KS

Identifying and integrating the most predictive attributes can add power to your overlay, including the ability to accurately rank-order consumers. Overlaying also increases the separation of "goods and bads" (referred to as "KS," for the Kolmogorov-Smirnov statistic) for a model within a particular industry or sub-segment. Not surprisingly, the most predictive attributes vary greatly between industries and sub-segments, mainly due to behavioral differences among their populations.

Getting started

The first step in improving an underperforming model is choosing a data partner—one with proven expertise with multivariate statistical methods and models for the communications industry. Next, you'll compile an unbiased sample of consumers, a reject inference sample and a list of attributes derived from sources you deem most appropriate. Attributes are usually narrowed to 10 or fewer from the larger list, based on predictiveness.

Predefined, custom or do-it-yourself

Your list could include attributes your company has developed over time, or those obtained from other sources, such as Experian Premier AttributesSM (more than 800 predefined consumer-related choices) or Trend ViewSM attributes. Relationship, income/capacity, loan-to-value and other external data may also be overlaid.

Attribute ToolboxTM

Should you choose to design and create your own list of custom attributes, Experian's Attribute ToolboxTM offers a platform for development and deployment of attributes from multiple sources (customer data or third-party data identified by you).

Testing a rejuvenated model

The revised model is tested on both your unbiased and reject inference samples to confirm and evaluate any additional lift induced by the newly overlaid attributes. After completing your analysis and due diligence, attributes are installed into production. Initial testing, in a live environment, can be performed for three to twelve months, depending on the segment (prescreen, collections, fraud, non-pay, etc.), outcome or behavior your model seeks to predict. This measured, deliberate approach is considered more conservative, compared with turning new attributes on right away. Depending on the model's purpose, improvements can be immediate or more tempered. However, the end result of overlaying attributes is usually better accuracy and performance.

Make your model super again

If your scoring model is starting to show its age, consider overlaying it with high-quality predefined or custom attributes. Because in communications, risk prevention is always in vogue. To learn more about improving your model, contact your Experian representative. To read other recent posts related to scoring, click here.
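For those who want to put a number on the "separation of goods and bads," the KS statistic mentioned above is straightforward to compute on a validation sample. The sketch below uses synthetic scores purely for illustration; in practice you would compare KS on your holdout sample before and after overlaying the new attributes:

```python
import numpy as np
from scipy.stats import ks_2samp

# Synthetic score distributions for performing ("good") and defaulted ("bad") accounts.
rng = np.random.default_rng(42)
good_scores = rng.normal(loc=680, scale=50, size=5_000)
bad_scores = rng.normal(loc=620, scale=55, size=500)

# KS = maximum gap between the two cumulative score distributions.
ks_stat, _ = ks_2samp(good_scores, bad_scores)
print(f"KS = {ks_stat:.3f}")  # higher KS means better rank-ordering of risk
```

A lift in KS after the overlay, confirmed on both the unbiased and reject inference samples, is the kind of evidence that justifies moving the new attributes into production.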

Published: August 19, 2011 by Guest Contributor

The following article was originally posted on August 15, 2011 by Mike Myers on the Experian Business Credit Blog.

Last time we talked about how credit policies are like a plant grown from a seed. They need regular review and attention, just like the plants in your garden, to really bloom. A credit policy is simply a consistent guideline to follow when decisioning accounts, reviewing accounts, collecting and setting terms. Opening accounts is just the first step. Here are a few key items to consider in reviewing accounts:

- How many of your approved accounts are paying you late?
- What is their average days beyond terms?
- How much credit have they been extended?
- What attributes of these late-paying accounts can predict future payment behavior?

I recently worked with a client to create an automated credit policy that consistently reviews accounts based on predictive credit attributes, public records and exception rules using the batch account review decisioning tools within BusinessIQ. The credit team now feels like they are proactively managing their accounts instead of just reacting to them. A solid credit policy not only focuses on opening accounts, but also on regular account review, which can help you reduce your overall risk.
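As a companion to those review questions, here is a minimal sketch of how the late-payment metrics might be pulled from a simple receivables extract; the column names and values are hypothetical, not a BusinessIQ output:

```python
import pandas as pd

# Hypothetical accounts-receivable extract: one row per approved account.
accounts = pd.DataFrame({
    "account_id":        [101, 102, 103, 104, 105],
    "credit_extended":   [10_000, 25_000, 5_000, 50_000, 7_500],
    "days_beyond_terms": [0, 12, 35, 3, 0],
})

late = accounts[accounts["days_beyond_terms"] > 0]
print(f"Accounts paying late:           {len(late) / len(accounts):.0%}")
print(f"Average days beyond terms:      {late['days_beyond_terms'].mean():.1f}")
print(f"Credit extended to late payers: ${late['credit_extended'].sum():,}")
```

Feeding numbers like these back into the policy, alongside the predictive attributes mentioned above, keeps the review proactive rather than reactive.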

Published: August 18, 2011 by Guest Contributor

By: Staci Baker

In my last post about the Dodd-Frank Act, I described the new regulatory bodies created by the Act. In this post, I will concentrate on how the Act will affect community banks.

The Dodd-Frank Act comprises over 3,000 pages of proposed and final rules and regulations, including those set forth by the newly created Consumer Financial Protection Bureau (CFPB). For any bank, managing such a massive amount of regulation is a challenge, but for a mid-size bank with fewer employees, it can be overwhelming. The Act has far-reaching unintended consequences for community banks. According to the American Bankers Association, there are five provisions that are particularly troubling for community banks:

1. Risk retention
2. Higher Capital Requirements and Narrower Qualifications for Capital
3. SEC's Municipal Advisors Rule
4. Derivatives Rules
5. Doubling the Size of the Deposit Insurance Fund (DIF)

In order to meet the new regulatory requirements, community banks will need to hire additional compliance staff to review the new rules and regulations, as well as to ensure they are implemented on schedule. This means the additional cost of outside lawyers, which will affect the resources available to the bank for staff, and for its customers and the community.

Community banks will also feel the burden of losing interchange fee income. Small banks are exempt from the new rules; however, the market will follow the lowest-priced product, which will mean another loss of revenue for the banks.

As you can see, community banks will be greatly affected by the Dodd-Frank Act. The increased regulation will mean a loss of revenue, increased oversight, additional outside staffing (and fewer internal resources) and more reporting requirements. If you are a community bank, how do you plan on overcoming some of these obstacles?

Published: August 15, 2011 by Guest Contributor
