Latest Posts


By: Mike Horrocks The realities of the new economy and the credit crisis are driving businesses and financial institutions to better integrate new data and analytical techniques into operational decision systems. Adjusting credit risk processes in the wake of new regulations, while also increasing profits and customer loyalty, will require a new brand of decision management systems to accelerate more precise customer decisions. A Webinar scheduled for Thursday will show you how blending business rules, data and analytics inside a continuous-loop decisioning process can empower your organization to control marketing, acquisition and account management activities, minimizing risk exposure while ensuring portfolio growth. Topics include:
- What the process is and the key building blocks for operating one over time
- Why the process can improve customer decisions
- How analytical techniques can be embedded in the change control process (including data-driven strategy design or optimization)
If interested, check out more - there is still time to register for the Webinar. And if you just want to see a great video, check out this intro.

Published: August 24, 2011 by Guest Contributor

With the raising of the U.S. debt ceiling and its recent ramifications consuming the headlines over the past month, I began to wonder what would happen if the general credit consumer had made a similar argument to their credit lender. Something along the lines of, “Can you please increase my credit line (although I am maxed out)? I promise to reduce my spending in the future!” While novel, probably not possible. In fact, just the opposite typically occurs when an individual begins to borrow up to their personal “debt ceiling.” When the ratio of credit an individual utilizes to the credit available to them rises above a certain percentage, it can adversely affect their credit score, in turn affecting their ability to secure additional credit. This percentage, known as the utilization rate, is one of several factors considered as part of an individual’s credit score calculation. For example, the utilization rate makes up approximately 23% of an individual’s calculated VantageScore® credit score. The good news is that consumers as a whole have been reducing their utilization rates on revolving credit products such as credit cards and home equity lines of credit (HELOCs) to the lowest levels in over two years. Bankcard and HELOC utilization is down to 20.3% and 49.8%, respectively, according to the Q2 2011 Experian – Oliver Wyman Market Intelligence Reports. In addition to lowering their utilization rates, consumers are also doing a better job of managing their current debt, resulting in multi-year lows for delinquency rates, as mentioned in my previous blog post. By lowering their utilization and delinquency rates, consumers are viewed as less of a credit risk and become more attractive to lenders for new products and increased credit limits. Perhaps the government could learn a lesson or two from today’s credit consumer.
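As a back-of-the-envelope illustration of the utilization rate described here (the card balances and limits below are invented for the example; VantageScore’s actual scoring is more involved than a single ratio):

```python
def utilization_rate(balances, limits):
    """Revolving utilization: total balances drawn as a share of total credit available."""
    total_balance = sum(balances)
    total_limit = sum(limits)
    if total_limit <= 0:
        raise ValueError("no available credit to measure against")
    return total_balance / total_limit

# A consumer with two cards: $1,950 drawn on a $2,500 limit, $100 on a $7,500 limit
rate = utilization_rate([1950, 100], [2500, 7500])
print(f"{rate:.1%}")  # 20.5% -- near the Q2 2011 bankcard average of 20.3%
```

Note that paying down either balance, or obtaining a higher limit, lowers the ratio; maxing out a single card while the other sits idle can still leave the combined rate in acceptable territory.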

Published: August 23, 2011 by Alan Ikemura

Consumer credit card debt has dipped to levels not seen since 2006, and the memory of pre-recession spending habits continues to get hazier with each passing day. In May, revolving credit card balances totaled over $790 billion, down $180 billion from mid-2008 peak levels. Debit and prepaid volume accounted for 44%, nearly half, of all plastic spending, growing substantially from 35% in 2005 and 23% a decade ago. Although month-to-month tracking suggests some noise in the trends, as illustrated by the slight uptick in credit card debt from April to May, the changes we are seeing are not at all temporary. What we are experiencing is a combination of many factors, including the aftermath of recession tightening, changes in the level of comfort with financing non-essential purchases, the “new boomer” population entering the workforce in greater numbers and the diligent efforts of Gen Xers to improve the general household wallet composition. How do card issuers shift existing strategies? Baby boomers are entering that comfortable stage of life where incomes are higher and expenses are beginning to trail off as the last child is put through college and mortgage payments are predominantly applied toward principal. This group worries more about retirement investments and depressed home values, and as such, its members demand high value for their spending. Rewards-based credit continues to resonate well with this group. Thirty years ago, baby boomers watched as their parents used cash, money orders and teller checks to manage finances, but today’s population has access to many more options and is highly educated. As such, this group demands value for its business, and a constant review of competitive offerings and development of new, relevant rewards products are needed to sustain market share. The younger generation is focused on technology.
Debit and prepaid products accessible through mobile apps are more widely accepted by this group, unlike ten to fifteen years ago, when multiple credit cards with four-figure credit limits each were provided to college students on a large scale. Today’s new boomer is educated on the risks of using credit, while at the same time, parents are apt to absorb more of their children’s monthly expenses. Servicing this segment's needs, while helping them to establish a solid credit history, will result in long-term penetration in a growing segment. The recent CARD Act and subsequent amendments have taken a bite out of revenue previously used to offset the increased risk and related costs that allowed card issuers to service the near-prime sector. However, we are seeing a trend of new lenders getting into the credit card game while existing issuers slowly start to evaluate the next tier. After six quarters of consistent credit card delinquency declines, we are seeing slow signs of relief. The average VantageScore for new card originations increased by 8 points from the end of 2008 into early 2010, driven by credit-tightening actions, and has started to slowly come back down in recent months. What next? What all of this means is that card issuers have to be more sophisticated with risk management and marketing practices. The ability to define segments through the use of alternate data sources and access channels is critical to the ongoing capture of market share and profitable usage. First, the segmentation will need to identify the “who” and the “what”: who wants what products, how much credit a consumer is eligible for, and what rate, terms and rewards structure will be required to achieve desired profit and risk levels, particularly as the economy continues to teeter between further downturn and, at best, slow growth.
By incorporating new modeling and data intelligence techniques, we are helping sophisticated lenders cherry pick the non-super prime prospects and offering guidance on aligning products that best balance risk and reward dynamics for each group. If done right, card issuers will continue to service a diverse universe of segments and generate profitable growth.

Published: August 22, 2011 by Guest Contributor

As I’m sure you are aware, the Federal Financial Institutions Examination Council (FFIEC) recently released its "Supplement to Authentication in an Internet Banking Environment," guiding financial institutions to mitigate risk using a variety of processes and technologies as part of a multi-layered approach. In light of this updated mandate, businesses need to move beyond simple challenge-and-response questions to more complex out-of-wallet authentication. Additionally, those incorporating device identification should look to more sophisticated technologies, well beyond traditional IP address verification alone. Recently, I contributed to an article on how these new guidelines might affect your institution. Check it out here, in full: http://ffiec.bankinfosecurity.com/articles.php?art_id=3932 For more on what the FFIEC guidelines mean to you, check out these resources, which also give you access to a recent Webinar.

Published: August 19, 2011 by Keir Breitenfeld

What happens when once desirable models begin to show their age? Not the willowy, glamorous types that prowl high-fashion catwalks, but rather the aging scoring models you use to predict risk and rank-order various consumer segments. Keeping a fresh face on these models can return big dividends, in the form of lower risk, accurate scoring and higher-quality customers. In this post, we provide an overview of custom attributes and present the benefits of overlaying current scoring models with them. We also suggest specific steps communications companies can take to improve the results of an aging or underperforming model.
The beauty of custom attributes
Attributes are highly predictive variables derived from raw data. Custom attributes, like those you’ve created in house or obtained from third parties, can provide deeper insights into specific behaviors, characteristics and trends. Overlaying your scoring model with custom attributes can further optimize its performance and improve lift. Often, the older the model, the greater the potential for improvement.
Seal it with a KS
Identifying and integrating the most predictive attributes can add power to your overlay, including the ability to accurately rank-order consumers. Overlaying also increases the separation of “goods and bads” (referred to as “KS”) for a model within a particular industry or sub-segment. Not surprisingly, the most predictive attributes vary greatly between industries and sub-segments, mainly due to behavioral differences among their populations.
Getting started
The first step in improving an underperforming model is choosing a data partner—one with proven expertise with multivariate statistical methods and models for the communications industry. Next, you’ll compile an unbiased sample of consumers, a reject inference sample and a list of attributes derived from sources you deem most appropriate.
Attributes are usually narrowed to 10 or fewer from the larger list, based on predictiveness.
Predefined, custom or do-it-yourself
Your list could include attributes your company has developed over time, or those obtained from other sources, such as Experian Premier AttributesSM (more than 800 predefined consumer-related choices) or Trend ViewSM attributes. Relationship, income/capacity, loan-to-value and other external data may also be overlaid.
Attribute ToolboxTM
Should you choose to design and create your own list of custom attributes, Experian’s Attribute ToolboxTM offers a platform for the development and deployment of attributes from multiple sources (customer data or third-party data identified by you).
Testing a rejuvenated model
The revised model is tested on both your unbiased and reject inference samples to confirm and evaluate any additional lift induced by the newly overlaid attributes. After completing your analysis and due diligence, attributes are installed into production. Initial testing, in a live environment, can be performed for three to twelve months, depending on the segment (prescreen, collections, fraud, non-pay, etc.), outcome or behavior your model seeks to predict. This measured, deliberate approach is considered more conservative, compared with turning new attributes on right away. Depending on the model’s purpose, improvements can be immediate or more tempered. However, the end result of overlaying attributes is usually better accuracy and performance.
Make your model super again
If your scoring model is starting to show its age, consider overlaying it with high-quality predefined or custom attributes. Because in communications, risk prevention is always in vogue. To learn more about improving your model, contact your Experian representative. To read other recent posts related to scoring, click here.
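For readers unfamiliar with the “KS” separation this post relies on, it is the maximum gap between the cumulative score distributions of good and bad accounts; a minimal sketch, with the scores and good/bad labels invented purely for illustration:

```python
def ks_statistic(scores, labels):
    """Kolmogorov-Smirnov separation: the maximum gap between the cumulative
    distributions of 'good' (label 0) and 'bad' (label 1) accounts by score."""
    pairs = sorted(zip(scores, labels))
    n_bad = sum(labels)
    n_good = len(labels) - n_bad
    cum_bad = cum_good = 0
    ks = 0.0
    for _, label in pairs:
        if label == 1:
            cum_bad += 1
        else:
            cum_good += 1
        ks = max(ks, abs(cum_bad / n_bad - cum_good / n_good))
    return ks

# Bads cluster at low scores, goods at high scores -> strong separation
scores = [520, 540, 560, 580, 600, 640, 680, 700, 720, 760]
labels = [1, 1, 1, 0, 1, 0, 0, 0, 0, 0]
print(round(ks_statistic(scores, labels), 2))  # 0.83
```

A higher KS means the model more cleanly rank-orders risk; overlaying predictive attributes aims to push this number up for the specific industry segment being scored.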

Published: August 19, 2011 by Guest Contributor

The following article was originally posted on August 15, 2011 by Mike Myers on the Experian Business Credit Blog. Last time we talked about how credit policies are like a plant grown from a seed. They need regular review and attention, just like the plants in your garden, to really bloom. A credit policy is simply a consistent guideline to follow when decisioning accounts, reviewing accounts, collecting and setting terms. Opening accounts is just the first step. Here are a few key items to consider in reviewing accounts:
- How many of your approved accounts are paying you late?
- What is their average days beyond terms?
- How much credit have they been extended?
- What attributes of these late-paying accounts can predict future payment behavior?
I recently worked with a client to create an automated credit policy that consistently reviews accounts based on predictive credit attributes, public records and exception rules using the batch account review decisioning tools within BusinessIQ. The credit team now feels like they are proactively managing their accounts instead of just reacting to them. A solid credit policy focuses not only on opening accounts but also on regular account review, which can help you reduce your overall risk.

Published: August 18, 2011 by Guest Contributor

By: Staci Baker In my last post about the Dodd-Frank Act, I described the new regulatory bodies created by the Act. In this post, I will concentrate on how the Act will affect community banks. The Dodd-Frank Act is over 3,000 pages of proposed and final rules and regulations set forth by the Consumer Financial Protection Bureau (CFPB). For any bank, managing such a massive amount of regulation is a challenge, but for a median-size bank with fewer employees, it can be overwhelming. The Act has far-reaching unintended consequences for community banks. According to the American Bankers Association, there are five provisions that are particularly troubling for community banks:
1. Risk Retention
2. Higher Capital Requirements and Narrower Qualifications for Capital
3. SEC’s Municipal Advisors Rule
4. Derivatives Rules
5. Doubling the Size of the Deposit Insurance Fund (DIF)
In order to meet the new regulatory requirements, community banks will need to hire additional compliance staff to review the new rules and regulations, as well as to ensure they are implemented on schedule. This means the additional cost of outside lawyers, which will affect the resources available to the bank for staff, and for its customers and the community. Community banks will also feel the burden of losing interchange fee income. Small banks are exempt from the new rules; however, the market will follow the lowest-priced product, which will mean another loss of revenue for the banks. As you can see, community banks will be greatly affected by the Dodd-Frank Act. The increased regulation will mean a loss of revenue, increased oversight, additional outside staffing (fewer internal resources) and reporting requirements. If you are a community bank, how do you plan on overcoming some of these obstacles?

Published: August 15, 2011 by Guest Contributor

It’s time to focus on growth again. In 2010, credit marketers focused on testing new acquisition strategies. In 2011, credit marketers are implementing learnings from those tests. As consumer lending becomes more competitive, lenders are strategically implementing procedures to grow portfolios by expanding their marketable universe. The new universe of prospective customers is moving steadily beyond prime to a variety of near-prime segments outside of the marketing spectrum that lenders have targeted for the past three years. Many credit marketers have moved beyond testing based on new regulatory requirements and have started to market to slightly riskier populations. From testing lower-scoring segments to identifying strategies for unbanked/underbanked consumers, the breadth of methods that lenders are using to acquire new accounts has expanded. Portfolio growth strategies encompass internal process enhancements, product diversification, and precise underwriting and account management techniques that utilize new data assets and analytics to mitigate risk and identify the most profitable target populations. Experian® can help you identify best practices for growth and develop customized strategies that best suit your acquisition objectives. Whether your needs include internal methods to expand your marketable universe (i.e., marketing outside of your current footprint or offers to multiple individuals in a household) or changes to policies for external expansion strategies (i.e., near-prime market sizing or targeting new prospects based on triggered events), Experian has the expertise to help you achieve desired results. For more information on new acquisition strategies and expanding your marketing universe, leave a comment below or call 1 888 414 1120.

Published: August 9, 2011 by Guest Contributor

By: John Straka Unsurprisingly, Washington deficit hawks have been eyeing the “sacred cows” of tax preferences for homeownership for some time now. Policymakers might even unwind or eliminate the mortgage interest deductions and capital-gains exemptions on home appreciation that have been in place in the U.S. for many decades. There is an economic case to be made for doing this—more efficient allocation of capital, the fact that other countries have high ownership rates without such tax preferences, etc. But if you call or email or tweet Congress, and you choose this subject, my advice is to tell them that they should wait until it’s “2005.” In other words, now—or even the next few years, most likely—is definitely not a good time to eliminate these housing tax preferences. We need to wait until it’s something like “2005”—when housing markets are much stronger again (hopefully) and state and local government finances are far from their relatively dire straits at present. If we don’t do this right, and insist on making big changes here now, then housing will take an immediate hit, and so will employment from both the housing sector and state and local governments (with further state and local service cutbacks also, due to budget shortfalls). The reason for this, of course, is that most homeowners today have not really benefited much, and won’t, from those well-established tax preferences. Why not? Because these preferences have been in place for so long that the economic value (expected present discounted value) of these tax savings was long ago baked into the level of home prices that most homeowners paid when they bought their homes. Take the preferences away now, and the value of homes will immediately drop, and therefore so will the property tax revenues collected by local governments across the U.S. This strategy would thus further bash the state-and-local sector in order to plump up (we hope) federal tax revenues by the value of the tax preferences.
Housing will become a further drag on economic growth, and so will the resulting employment losses from both construction and local government services. As a result, it’s possible that on net the federal government may actually lose revenue from making this kind of change at precisely the wrong time. It may very well never be quite like “2005” again. But waiting for greater housing and local government strength to change long-standing housing tax preferences should make the macroeconomic impact smaller, less visible, and more easily absorbed.

Published: August 9, 2011 by Guest Contributor

The high-profile data breaches in recent months not only left millions of consumers vulnerable to the threat of identity theft and caused businesses to incur significant costs, but also brought data security to the top of the agenda in Washington. In Congress, members of both the House and the Senate have used the recent data breaches to demonstrate the need for a uniform national data breach notification standard and increased data security standards for companies that collect consumer information. Hearings have been held on the issue, and it is expected that legislation will be introduced this summer. At the same time, the Obama Administration continues to call for greater data security standards. The White House released its highly anticipated cybersecurity initiative in May. In addition to implementing a national data breach notification law, the proposal would require certain private companies to develop detailed plans to safeguard consumer data. As legislation develops and advances through multiple Congressional committees, Experian will be working with allies and coalitions to ensure that the data security standards established under the Gramm-Leach-Bliley Act and the Fair Credit Reporting Act are not superseded by new, onerous and potentially ineffective mandates. We welcome your questions and comments below.

Published: August 4, 2011 by Guest Contributor

A surprising occurrence is happening in the consumer credit markets. Bank card issuers are back in acquisition mode, enticing consumers with cash back, airline points and other incentives to get a share of their wallet. And while new account originations are nowhere near the levels seen in 2007, recent growth in new bank card accounts has been significant: 17.6% in Q1 2011 when compared to Q1 2010. So what is accounting for this resurgence in the credit card space while the economy is still trying to find its footing and credit is supposedly still difficult to come by for the average consumer? Whether good or bad, the economic crisis of the past few years appears to have improved consumers’ debt management behavior, and card issuers have taken notice. Delinquency rates on bank cards are lower than at any time over the past five years, and when compared to the start of 2009, when bank card delinquency was peaking, current performance has improved by over 40%. These figures have given bank card issuers the confidence to ease their underwriting standards and re-establish their acquisition strategies. What’s interesting, however, is the consumer segments that are driving this new growth. When analyzed by VantageScore, new credit card accounts are growing fastest in the VantageScore D and F tiers, with 46% and 53% increases year over year, respectively. For comparison, the VantageScore A and B tiers saw 5% and 1% increases, respectively, during the same time period. And although VantageScore D and F represent less than 10% of new bank card origination volume ($ limits), it is still surprising to see such a disparity in growth rates between the risk categories. While this is a clear indication that card issuers are making credit more readily available for all consumer segments, it will be interesting to see if the debt management lessons learned over the past few years will stick and delinquency rates will continue to remain low.
If these growth rates are any indication, the card issuers are counting on it.

Published: August 3, 2011 by Alan Ikemura

TRMA’s recent Summer 2011 Conference in San Francisco was another great, insightful event. Experian’s own Greg Carmean gave a presentation regarding the issues involved in providing credit to small-business owners. I recently interviewed Greg to get his impressions about last month’s conference. KM: I’m speaking with Experian Program Manager, Greg Carmean, who spoke at TRMA’s Summer Conference. Hi, Greg. GC: Hi, Kathy. KM: Greg, I know I’ve interviewed you before, but can you please remind everyone what your role is here at Experian? GC: Sure, I’m a Program Manager on the Small Business Credit Share side. I work with small- and medium-size companies, including telecom and cable companies, to reduce credit risk and get more value from their data. KM: Thanks, Greg. So last month, you spoke at TRMA’s Summer Conference. What did you discuss? GC: My presentation was entitled, “Beyond Consumer Credit – Providing a More Comprehensive Assessment of Small-Business Owners.” I talked about how traditional risk management tools can provide a point-in-time look at a business owner, but often fail to show the broader picture of the risk associated with all of their current and previous businesses. KM: Why did you choose this particular topic? GC: Well, Kathy, small business is seen as a large area of opportunity, but there can be a lot of difficulty involved in validation, especially when it comes to remote authentication and new businesses. KM: Would you say there’s more fraud in small business than on the consumer side? GC: Believe it or not, there is 3-4 times more fraud in small business than in consumer.
Business identity theft has become a bigger issue, Tax ID verification is a common problem, and there’s a lot of concern about agents bringing in fraudulent accounts. KM: What can be done about it? GC: Many telecom and cable companies are beginning to adopt more aggressive, manual processes to lower the risk of fraud. Unfortunately, that usually results in lower activation. KM: Sounds like it can be frustrating! GC: It can be, especially for the salespeople who bring in an account, and then find it’s not approved for service. Sometimes clients will pass a fraud check, but not a credit check. One of the topics I touched on is better tools that more accurately identify a small business owner's risk across all of their current and previous businesses to alleviate some of these problems. KM: Is there anything else telecom and cable companies should be doing? GC: I think the best risk-mitigation tool when it comes to account acquisition is leveraging information about both the small business and its owner. As they say, knowledge is power. KM: Definitely! Thanks again for your time today, Greg. Share your thoughts! If you attended TRMA’s Summer Conference, and especially if you attended Greg Carmean’s session, we’d love to hear from you. Please share your thoughts by commenting on this blog post. All of us at Experian look forward to seeing you at TRMA’s Fall Conference in Dallas, Texas, on September 20 – 21, 2011.

Published: July 29, 2011 by Guest Contributor

By: Staci Baker The Durbin Amendment, according to Wikipedia, gave the Federal Reserve the power to regulate debit card interchange fees. The amendment, which will have a profound impact on banks, merchants and anyone who holds a debit card, will take effect on October 1, 2011, rather than the originally announced July 21, 2011, allowing banks additional time to implement the new regulations. The Durbin Amendment caps the interchange fee that card networks, such as Visa and MasterCard, may charge at 21 cents per transaction, and requires that debit cards be able to be processed on at least two independent networks. This will cost banks roughly $9.4 billion annually, according to CardHub.com. As stipulated in the amendment, institutions with less than $10 billion in assets are exempt from the cap. In preparation for the Durbin Amendment, several banks have begun to impose new fees on checking accounts, end rewards programs, raise minimum balance requirements and threaten to cap debit card transactions at $50 to $100 in order to recoup some of the earnings they are expected to lose. These new regulations will be a blow to already hurting consumers as their out-of-pocket expenses keep increasing. As you can see, the Durbin Amendment, which is meant to help consumers, will instead see the cost of lost interchange fees passed along to them in other forms. And the loss of revenue will greatly impact the bottom line of banking institutions. Who will be the bigger winner with this new amendment - the consumer, merchants or the banks? Will banks be able to lower the cost of credit to an amount that will entice consumers away from their debit cards and to use their credit cards again? I think it is still far too soon to tell. But I think over the next few months, we will see consumers use payment methods in a new way as both consumers and banks come to a middle ground that will minimize risk levels for all parties.
Consumers will still need to shop and bankers will still need their tools utilized. What are you doing to prepare for The Durbin Amendment?
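A rough sketch of the revenue math behind an estimate like CardHub.com’s; the transaction volume and pre-cap average fee below are assumptions chosen for illustration, not figures taken from the amendment itself:

```python
def interchange_revenue_loss(transactions, old_avg_fee, capped_fee):
    """Annual interchange revenue lost when the average per-transaction fee falls to the cap."""
    return transactions * (old_avg_fee - capped_fee)

# Assume roughly 41 billion annual debit transactions and a pre-cap average fee of $0.44
loss = interchange_revenue_loss(41e9, 0.44, 0.21)
print(f"${loss / 1e9:.1f} billion per year")  # $9.4 billion per year
```

The point of the sketch is simply that a 23-cent reduction multiplied across tens of billions of transactions lands in the ballpark of the cited $9.4 billion figure, which is why banks are looking for that revenue elsewhere.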

Published: July 20, 2011 by Guest Contributor

Every communications company wants to inoculate its portfolio against bad debt, late payments and painful collections. But many still use traditional generic risk models to uncover potential problems, either because they’ve always used generics or because they see their limited predictive abilities as adequate enough.
Generalization dilutes results
The main problem with generics, however, is how they generalize consumers’ payment behavior and delinquencies across credit cards, mortgages, auto loans and other products. They do not include payment and behavioral data focused on actual communications customers only. Moreover, their scoring methodologies can be too broad to provide the performance, lift or behavioral insights today’s providers strive to attain.
Advantages of industry-specific models
Communications-specific modeling can be more predictive if you want to know who’s more likely to prioritize their phone bill and remit promptly, and who’s not. In multiple market validations pitting an optimized industry-specific model against traditional generic products, Experian’s Tele-Risk ModelSM and Telecommunications, Energy and Cable (TEC) Risk ModelSM more accurately predicted the likelihood of future serious delinquent or derogatory payment behavior. Compared with generics, they also:
- Provided a stronger separation of good and bad accounts
- More precisely classified good vs. bad risk through improved rank ordering
- Accurately scored more consumers, including those a generic score might have considered unscorable
Anatomy of a risk score
These industry risk models are built and optimized using TEC-specific data elements and sample populations, which makes them measurably more predictive for evaluating new or existing communications customers. Optimization also helps identify other potentially troublesome segments, including those that might require special handling during onboarding, “turn-ons” or managing delinquency.
Check the vital signs
To assess the health of your portfolio, ask a few simple questions:
- Does your risk model reflect the unique behaviors of actual communications customers?
- Is overly generic data suppressing lift and masking hidden risk?
- Could you score more files that are currently deemed unscorable?
Unless the answer is “yes” to all three, your model probably needs a check-up—stat.

Published: July 13, 2011 by Guest Contributor

Lately there has been a lot of press about breaches and hacking of user credentials. I thought it might be a good time to pause and distinguish between authentication credentials and identity elements. Identity elements are generally those bits of metadata related to an individual: things like name, address, date of birth, Social Security number, height, eye color, etc. Identity elements are typically used as one part of the authentication process to verify an individual’s identity. Credentials are typically the keys to a system that are granted after someone’s identity elements have been authenticated. Credentials then stand in place of the identity elements and are used to access systems. When credentials are compromised, there is a risk of account takeover by fraudsters with malicious intent. That’s why it’s a good idea to layer in risk-based authentication techniques along with credential access for all businesses. But for financial institutions, the case is clear: a multi-layered approach is a necessity. You only need to review the FFIEC Guidance on Authentication in an Internet Banking Environment to confirm this fact. Boiled down to its essence, the latest guidance issued by the FFIEC is rather simple. Essentially it asks U.S. financial institutions to mitigate risk using a variety of processes and technologies, employed in a layered approach. More specifically, it asks those businesses to move beyond simple device identification — such as IP address checks, static cookies and challenge questions derived from customer enrollment information — to more complex device intelligence and more complex out-of-wallet identity verification procedures. In the world of online security, experience is critical. Layered together, Experian’s authentication capabilities (including device intelligence from 41st Parameter, out-of-wallet questions and analytics) offer a more comprehensive approach to meeting and exceeding the FFIEC’s most recent guidance.
More importantly, they offer the most effective and efficient means of mitigating risk in online environments, ensure a positive customer experience, and have been market-tested in the most challenging financial services applications.

Published: July 10, 2011 by Keir Breitenfeld
