By: Kari Michel

Strategic default has been a hot topic in the media since 2009, and it will remain one until home prices climb and stay there. Terry Stockman (not his real name) earns a handsome income, maintains a high credit score and owns several residential properties, including the Southern California home where he has lived since 2007. Terry is now angling to buy the foreclosed home across the street. What's so unusual about this? Terry hasn't made a mortgage payment on his own home for more than six months. With prices now at 2003 levels, his house is worth only about one-half of what he paid for it. Although he isn't paying his mortgage loan, Terry is current with his other debt payments. Terry is a strategic defaulter — and he isn't alone.

By the end of 2008, a record 1 in 5 mortgages at least 60 days past due was a strategic default. Strategic defaults have fallen below that percentage in every quarter since, through the second quarter of 2010, the most recent quarter for which figures are available. The percentages are still high, however: 16% in the last quarter of 2009 and 17% in the second quarter of last year. Get more details in our 2011 Strategic Default Report.

What does this mean for lenders?

Mortgage lenders need to be able to identify strategic defaulters in order to best employ their resources and set different strategies for consumers who have defaulted on their loans. Specifically designed indicators help lenders identify suspected strategic default behavior as early as possible and can be used to prioritize account management or collections workflow queues for better treatment strategies. They also can be used in prospecting and account acquisition strategies to better understand payment behavior prior to extending an offer. Here is a white paper I thought you might find helpful.
When the Consumer Financial Protection Bureau (CFPB) takes authority on July 21, debt collectors and communications companies should pay close attention. If the CFPB has its way, the rules may be changing.

Old laws, new technologies

The rules governing consumer communications for debt collection haven't seen a major update since they were written in 1977. While the FTC has enforcement power in this area, it can't write rules—Congress must provide direction. Consequently, the rules guiding the debt collection industry have evolved based on decisions by the courts. In the meantime, technology has outpaced the law. Debt collectors have taken advantage of the latest available methods of communication, such as cell phones, autodialers and email, while the compliance requirements have largely remained murky. At the same time, complaints to the FTC about debt collection practices continue to rise. While the number is relatively low compared to the amount of overall activity, the FTC receives more complaints about debt collectors than about any other industry. The agency has also raised concerns about how new communication tools, such as Facebook and Twitter, will affect the future of debt collection.

Priorities for the CFPB

While mortgages, credit cards and payday loans will be the early priorities for the CFPB, high on its to-do list will be updating the laws governing consumer communications for debt collection. Under the Dodd-Frank Act, the CFPB will not only be responsible for enforcing the Fair Debt Collection Practices Act (FDCPA), it will also have new authority to write the rules. This raises new issues, such as how new regulations will affect the ways debt collection companies can contact consumers. Even as lenders and communications companies have expressed concern about the CFPB writing the rules, the hope is that the agency will create a more predictable legal structure that covers new technologies and reduces the uncertainty around compliance. Faced with the prospect of clarified compliance requirements around debt collection, the ACA (Association of Credit and Collection Professionals) has started to get in front of the CFPB by putting together its own blueprint.

Will the CFPB be ready by July 21?

Over the last year, the CFPB has been busy building an organizational structure, but it still lacks a leader appointed by the President and confirmed by the Senate. (Elizabeth Warren is currently the unofficial director.) Without a permanent director in place, the agency will be unable to gain full regulatory authority on July 21 – the date set by the Treasury Department. Until then, the CFPB will be able to enforce existing laws but will be unable to write new regulations. Despite the political uncertainty, debt collectors and communications firms still need to be prepared. One way is to ensure you're following the industry best practices established by ACA. To help you be ready for any outcome, we'll continue to follow this issue and keep you apprised of the CFPB's direction. Let us know your thoughts and concerns in the comment section, or feel free to contact your Experian rep directly with any questions you may have.

Helpful links:
Association of Credit and Collection Professionals
Fair Debt Collection Practices Act (PDF)
Consumer Financial Protection Bureau (CFPB)
This is the third and final post in an interview between Experian's Tom Whitfield and Dr. Michael Turner, founder, president and CEO of the Policy and Economic Research Council (PERC)—a non-partisan, non-profit policy institute devoted to research, public education, and outreach on public and economic policy matters. In this post Dr. Turner discusses mandatory credit-information sharing for communications companies, and the value of engaging and educating state regulators.

_____________________________

Does it make sense for the FTC to mandate carriers to report?

Credit information sharing in the United States is a voluntary system under the Fair Credit Reporting Act (FCRA). Mandating information sharing would break precedent with this successful, decades-old regime, and could result in less rather than more information being shared, as reporting would shift from being a business matter to a compliance issue. Additionally, the voluntary nature of credit reporting allows data furnishers and credit bureaus to modify reporting in response to concerns. For example, in reaction to high utility bills caused by severe weather, a utility provider may wish to report only delinquencies 60 days or more past due. Similarly, a credit bureau may not wish to load data it feels is of questionable quality. A voluntary system allows for these flexible modifications in reporting. Further, under existing federal law, those media and communications firms that decide they want to fully report payment data to one or more national credit bureaus are free to do so. In short, there is simply no need for the FTC to mandate that communications and media companies report payment data to credit bureaus, nor would there be any immediate benefit in doing so.

How much of the decision is based on the influence of the state PUC or other legislative groups?

Credit information sharing is federally regulated by the Fair Credit Reporting Act (FCRA). The FCRA preempts state regulators, and as such, a media or communications firm that wants to fully report may do so regardless of the preferences of the state PUC or PSC. PERC realizes the importance of maintaining good relations with oversight agencies. We recommend that companies inform the PUC or PSC that they are fully reporting payment data and engage in proactive outreach to educate state regulators on the value of credit reporting customer payment data. There have been notable cases of success in this regard. Currently, just four states (CA, OH, NJ and TX) have partial prohibitions on the onward transfer of utility customer payment data to third parties, and none of these provisions envisioned credit reporting when drafted. Instead, most are add-ons to federal privacy legislation. Only one state (CA) has restrictions on the onward transfer of media and communications customer payment data, and again this has nothing to do with credit reporting.

Agree, disagree or comment

Whether you agree with Dr. Turner's assertions or not, we'd love to hear from you. So please, take a moment to share your thoughts about full-file credit reporting in the communications industry. Click here to learn more about current and pending legislation that impacts communications providers.
By: John Straka

The U.S. housing market remains relatively weak, but it's probably not as weak as you think. To what extent are home prices really falling again?

Differing Findings

Most recent media coverage of the "double dip in home prices" has centered on declines in the popular Case-Shiller price index; however, the data entering this index is reported with a lag (the just-released April index reflects data for February through April) and with some limitations. CoreLogic publishes a more up-to-date index value that earlier this month showed a small increase, and, more importantly, CoreLogic also produces an index that excludes distressed sales. This non-distressed index has shown larger recent price increases, and it shows increases over the last 12 months in 20 states. Others, basing their evidence on realtors' listing data, have concluded that there was some double dip last year but that prices have actually been rising for several months now (see Altos). These disparate findings belie the overly simplistic media coverage, and they underscore that "the housing market" is not one single market, of course, but a wide distribution of differing outcomes across very many local neighborhood home markets nationwide. (For a pointed view of this, see Charron.)

Improved Data Sources

Experian is now working with Collateral Analytics, the leading source of the most granular and timely home-market analytics and information built from nationwide local market data, and the best automated valuation model (AVM) provider based on these and other data. (Their AVM leads in accuracy and geographic coverage in most large lender and third-party AVM tests.) While acknowledging their popularity, value, and progress, Collateral Analytics President Dr. Michael Sklarz questions the traditional dominance of repeat-sales home price indexes (from Case-Shiller and others). Repeat-sales data typically includes only around 20 to 30 percent of the total home sales taking place. Collateral Analytics instead studies the full market distribution of home sales and market data and uses this detailed data to construct hedonic price indexes that control for changing home characteristics. This approach provides a "constant quality" claim similar to that of repeat-sales indexes—without throwing away a high percentage of the market observations. Collateral Analytics indexes also cover over 16,000 zip codes, considerably more than others.

Regular vs. Distressed Property Sales

Nationwide, some well-known problem states, areas and neighborhoods continue to fare worse than most others in today's environment, and this skewed national distribution of markets is not well described by overall averages. Indeed, on closer inspection, the recent media-touted gloomy picture of home prices that are "falling again" or that "continue to fall" is a distorted view for many local home markets, where prices have been rising a little or even more, or at least remaining flat or stable. Nationwide or MSA averages that include distressed-property sales (as Case-Shiller tends to do) can be misleading for most markets. The reason is that distressed-property sales, while given much prominence in recent years and lowering overall home-price averages, have affected but not dominated most local home markets. The continued heavy price discounts (twenty percent or significantly more) reported for distressed sales in most areas are a positive sign of market normality.
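As a rough illustration of the hedonic approach described above (a minimal sketch, not Collateral Analytics' actual methodology), a hedonic index can be estimated by regressing log sale price on property characteristics plus time-period dummies; the exponentiated time coefficients trace out a constant-quality price index. The sale records and the two characteristics used below are hypothetical.

```python
# Minimal hedonic price-index sketch (illustrative only, hypothetical data).
# Regress log(sale price) on home characteristics plus quarter dummies;
# the quarter coefficients trace out a "constant quality" price index.
import numpy as np

# Each record: (quarter index 0..3, living area sq ft, bedrooms, sale price $)
sales = [
    (0, 1500, 3, 300_000), (0, 2200, 4, 420_000), (1, 1600, 3, 295_000),
    (1, 2400, 4, 430_000), (2, 1450, 3, 270_000), (2, 2100, 4, 390_000),
    (3, 1700, 3, 310_000), (3, 2300, 4, 415_000),
]
n_quarters = 4

y = np.log([price for *_, price in sales])              # log sale price
X = np.zeros((len(sales), 2 + n_quarters - 1))
for i, (q, sqft, beds, _) in enumerate(sales):
    X[i, 0] = np.log(sqft)                              # size effect
    X[i, 1] = beds                                      # bedroom effect
    if q > 0:
        X[i, 1 + q] = 1.0                               # quarter dummy (base = Q0)
X = np.column_stack([np.ones(len(sales)), X])           # intercept column

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
index = [100.0] + [100.0 * np.exp(b) for b in coef[3:]]  # Q0 = 100
print("Constant-quality index by quarter:", [round(v, 1) for v in index])
```

The same idea extends to many more characteristics (baths, age, lot size, location), which is where more detailed local market data adds value.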
It typically takes a significantly large buildup of distressed-property sales in a local area or neighborhood home market to pull regular property sale prices down to their level. For normal or regular home valuation, distressed sales are typically discounted because of their "fire sale" nature, "as is" terms, and property neglect or damage. This means that non-distressed, or regular, home price trends are the most relevant trends for most homes in most neighborhoods. Several examples are shown below. As suggested in these price-per-living-area charts, regular (non-distressed) home-sale prices have fared considerably better in the housing downturn than the more widely reported overall indexes that combine regular and distressed sales(1).

Regular-Sale and Combined Home Prices in $ Per Square Foot of Living Area and Distress Sales as a Pct of Total Sales

In Los Angeles, combined sale prices fell 46 percent peak-to-trough and are now 16 percent above the trough, while regular sale prices fell by considerably less, 33 percent, and are now 3 percent above the trough. Distressed sales as a percent of total sales peaked at 52 percent in 2009:Q1, but then fell to a little under 30 percent by 2010:Q2, where they have largely remained (this improvement occurred before the general "robo-signer" process concerns slowed down industry foreclosures). L.A. home prices per square foot have remained largely stable for the past two years, with some increase in distressed-sale prices in 2009. Market prices in this area have most recently tended to remain essentially flat—weak, but not declining anew, with some upward pressure from investors and bargain hunters (previously helped by tax credits before they expired). Double-Dip: No.

In Washington DC, single-family home prices per square foot have followed a saw-tooth seasonal pattern, with two drops of 15-20% followed by sizable rebounds in spring sales prices. The current combined regular & REO average price is 17 percent below its peak but 13 percent above its trough, while the regular-sale average price is just 12 percent below the peak and 10 percent above its trough. Distressed sales have been comparatively low, but rose slowly to a peak of a little over 20 percent in 2010, with some slight improvement recently to the high teens. Single-family prices in DC have remained comparatively strong; however, more of the homes in DC are actually condos, and condo prices have not been quite as strong, with the market data showing mixed signals but the average price per square foot remaining essentially flat. Double-Dip: No.

In the Miami area, the combined average home price per square foot fell by 48 percent peak to trough and is now just 1 percent above the 2009:Q2 trough. The regular-sale average price already experienced an earlier double dip, falling by 32 percent to 2009:Q2, then stabilizing for a couple of quarters before falling another 9 percent relative to the peak; since 2010:Q3 this average has been choppy but basically flat, and it is now 3 percent above that second trough. Prices in Miami have been among the weakest in large metro areas, but average prices have been largely flat for the past year, without any sharp new double dip. Distressed sales as a percent of the total peaked at 53 percent in 2009:Q1, but then fell to a little under 30 percent by 2010:Q2; since then there has been some return to a higher distress share, in the mid to upper 30s (but all of these figures are about 10 percentage points lower for condos). New Double-Dip: No.
The Dallas area has seen some of the strongest prices in the nation. The combined price per square foot had an earlier peak and fell by 31 percent peak to trough, but it is now 33 percent above the trough. The regular-sale average price fell briefly by 22 percent peak to trough, but it has since risen by 32 percent from the 2009:Q1 trough and now stands 3 percent above the peak. The increases have occurred in a saw-tooth seasonal pattern, with spring prices the highest, but prices here have been rising considerably overall. Distressed sales as a percent of the total peaked at 22 percent in 2009:Q1 but have largely fallen since and now stand at just 11 percent. Double-Dip: No.

Here You Can See 47 More Examples of Where Double-Dips Are and Are Not:
» Pacific West
» Southwest
» Mountain West
» Midwest
» Northeast
» Mid Atlantic
» Southeast

To summarize this information and gain a little more insight into the general area conditions for most homes and individuals in the U.S., we can add up the number of homes and the total population across the counties examined. To be sure, this information is not a rigorous random sample across homes, but I have tried to include and show the details of both stronger and weaker metro-area counties throughout the U.S. As shown in the tables below, the information used here covers 51 metro-area counties, comprising over 15 million homes and a total population of nearly 75 million individuals(2). These results may be regarded as suggestive of findings from a more thoroughgoing study. Based on these reviews of the market price averages and other data, my assessment is that a little over half of the counties examined are not currently or recently experiencing a double-dip in home prices. Moreover, these counties, where home prices appear to be at least flat or relatively stronger, encompass almost two-thirds (65%) of the total affected U.S. population examined, and nearly three-fifths (58%) of the total properties covered by the data studied.

Conclusion

This is, on balance, good news. But there are remaining concerns. One is the continued high, or more recently rising, shares of distressed sales in many markets, and the "shadow inventory" of distressed sales now being held up in the current foreclosure pipeline. But it is also interesting to see that many of the reductions in the distressed-property shares of total sales in high-stress areas occurred before the foreclosure processing slowdowns. Another interesting observation is that most of the recent double-dips in prices have been relatively mild compared to the original peak-to-trough meltdown. While, to be sure, there are plenty of reasons to remain uncertain and cautious about U.S. home prices, home markets in general vary considerably, with significant elements of improvement and strength as well as continuing weaknesses. Despite many reports today about "the beleaguered housing market," there really is no such thing … not unless the report is referring to a very specific local market. There definitely are double dips in many areas, and reasons for continuing overall concern. But the best available evidence suggests that there are double-dip markets (most of them relatively moderate), stable markets, and stronger markets, with the stable and stronger markets accounting for a majority of the homes and individuals examined.
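For reference, the peak-to-trough declines and rebounds quoted in the metro summaries above reduce to simple arithmetic on a quarterly price series. A minimal sketch with hypothetical price-per-square-foot values (chosen only to echo the Los Angeles figures, not the underlying Collateral Analytics data):

```python
# Compute peak-to-trough decline and recovery from trough for a quarterly
# price series (hypothetical $/sq ft values, for illustration only).
def peak_trough_stats(prices):
    peak = max(prices)
    peak_idx = prices.index(peak)
    trough = min(prices[peak_idx:])           # trough after the peak
    latest = prices[-1]
    decline = (trough - peak) / peak * 100    # peak-to-trough, negative
    recovery = (latest - trough) / trough * 100
    return round(decline, 1), round(recovery, 1)

la_combined = [430, 400, 330, 260, 232, 245, 262, 268, 270]   # hypothetical
decline, recovery = peak_trough_stats(la_combined)
print(f"Peak-to-trough: {decline}%  |  Above trough: {recovery}%")
```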
Note: In the next installment, we'll look at some more granular micro-market data to explore in greater depth the extensive variety of home-price outcomes and market conditions in weak pockets and strong pockets across various local areas and home markets. This will highlight the importance of having very good information, at sub-county and even sub-zip code levels, on local-neighborhood home markets.

Source of Home Price and Market Information: Collateral Analytics HomePriceTrends. I thank Michael Sklarz for providing the extensive information for this report and for comments, and I thank Stacy Schulman for assistance with this posting.

__________________

(1) Based on analysis by Collateral Analytics, price per living square foot is a useful, simple "hedonic" measure that typically controls for around 70 percent or more of the changing characteristics in a housing stock and home-sale mix. Patterns in home prices without dividing by square footage are generally similar, but not always.

(2) The property inventory counts are from Collateral Analytics, while the population estimates are from the 2010 U.S. Census.
This is the second in a three-part interview between Experian's Tom Whitfield and Dr. Michael Turner, founder, president and CEO of the Policy and Economic Research Council (PERC)—a non-partisan, non-profit policy institute devoted to research, public education, and outreach on public and economic policy matters. Dr. Turner is a prominent expert on credit access, credit reporting and scoring, information policy, and economic development. Mr. Whitfield is the Director of Marketing for Experian's Telecommunications, Energy and Cable practice. In this post Dr. Turner explains how full-file credit reporting benefits consumers and why many communications providers haven't yet embraced it.

_____________________________

Why is full-file credit reporting good for communications customers?

Approximately 54 million Americans either have no credit report or have too little information in their credit reports to generate a credit score. Most of these "thin-file/no-file" persons are financially excluded, and many of them are media and communications customers. By having their payment data fully reported to a credit bureau and included in their credit reports, many will be able to access affordable sources of mainstream credit for the first time; others will be helped by repairing their damaged credit. In this way, consumers will save by not relying on high-cost lenders to meet their credit needs.

Why don't providers embrace reporting like other major industries and lenders?

A major reason is inertia—providers haven't done it before and are not sure how they would benefit from change. Just recently, PERC released a major study highlighting the business case for fully reporting customer payment data to one or more nationwide credit bureaus. It includes customer survey results, peer survey results and case studies. The results all point to tremendous upside from fully reporting payment data, with only manageable downsides, chiefly the need to manage external communications and regulator relations.

Misperceptions and misunderstandings

Another significant reason is regulator misperceptions and misunderstandings. State public service and public utility commissions (PSCs and PUCs) aren't experts in credit reporting or the regulatory framework around credit-information sharing. Many mistakenly believe the data is unregulated and can be used for marketing. Not wanting to contribute to an increase in commercial mail and telemarketing calls, some regulators have a knee-jerk reaction when the topic of credit reporting is raised by an interested media, communications or utility company. PERC has been working to educate regulators and has had success in its outreach efforts. PERC can be a resource to firms interested in full-file reporting in direct communications with regulators.

Part 3: Wednesday, June 29

Next, in the concluding post of this interview with PERC founder, president and CEO Dr. Michael Turner, he discusses mandatory credit-information sharing for communications companies and the value of engaging and educating state regulators.

Agree, disagree or comment

Whether you agree with Dr. Turner's assertions or not, we'd love to hear from you. So please, take a moment to share your thoughts about full-file credit reporting in the communications industry.
This is the first in a three-part interview between Experian's Tom Whitfield and Dr. Michael Turner, founder, president and CEO of the Policy and Economic Research Council (PERC)—a non-partisan, non-profit policy institute devoted to research, public education, and outreach on public and economic policy matters. Dr. Turner is a prominent expert on credit access, credit reporting and scoring, information policy, and economic development. Mr. Whitfield is the Director of Marketing for Experian's Telecommunications, Energy and Cable practice. In this post Dr. Turner discusses how communications providers and their customers can both benefit from full-file credit reporting. Comments, suggestions and differing viewpoints are welcome.

_____________________________

Why is full reporting to the bureaus so critical for communications providers?

PERC's research has found at least three good business reasons for media and communications companies to consider this practice:

1) Improved cash flow. In a survey of nearly 1,000 heads of household (those with primary bill-paying responsibility), media and communications payments ranked below payments that were fully reported to credit bureaus. When asked how credit reporting would affect bill payment prioritization, half of all respondents indicated they would be "much more likely" or "more likely" to pay their media and communications bills on time. Such an outcome would represent a significant cash flow improvement. In fact, case study results substantiate this and demonstrate further benefits from reduced delinquencies and charge-offs.

2) Cost savings. In a survey of media, communications and utility companies, the perceived costs of reporting payments to a bureau were substantially greater than the actual costs incurred, and the perceived benefits significantly lower than the actual benefits. In most cases, the actual benefits reported by firms fully reporting payment data to one or more nationwide credit bureaus were multiples higher than the actual costs, which were reported as modest relative to IT and customer service expenditures.

3) More customer loyalty, less churn. In a competitive, deregulated environment, telling customers about the benefits of fully reported payment data (building a good credit history, reducing the cost of credit and insurance, increasing credit access and credit limits, improving the chances of qualifying for an apartment rental or a job) could result in increased loyalty and less churn.

How do providers stand to benefit from reporting?

Providers benefit because fully reporting payment data to a nationwide credit bureau for inclusion in credit reports actually changes customer behavior. Reporting negative-only data doesn't affect customers in the same way and, in the vast majority of cases, does not affect payment behavior at all, as consumers are either entirely unaware of the reporting or see it as a "black list." By communicating the many customer benefits of fully reporting payment data to a credit bureau for inclusion in a credit report, the provider gains improved cash flow, reduced charge-offs, and improved customer loyalty.

Part 2: Monday, June 27

In Part 2 of this interview, Dr. Turner explains how full-file credit reporting actually benefits consumers and why many communications providers haven't yet embraced it. The primary reason uncovered in PERC's research may surprise you, so be sure to come back for Part 2.

Agree, disagree or comment

Whether you agree with Dr. Turner's assertions or not, we'd love to hear from you.
So please, take a moment to share your thoughts about full-file credit reporting in the communications industry.
High-profile data breaches are back in the headlines as businesses—including many in the communications sector—fall prey to a growing number of cyberattacks. So far this year, 251 public notifications of data breaches have been reported, according to the Privacy Rights Clearinghouse. The latest attack comes on the heels of the Obama administration's recent proposal to replace conflicting state laws with a uniform standard. The idea is not a new one—national breach notification legislation has been in discussion on Capitol Hill since 2007. With the addition of the White House proposal, three data breach notification bills are now under consideration. But rather than waiting for passage of a new law, communications companies and businesses in general should be aware of the issues and take steps to prepare.

Replacing 48 laws with one

Currently, notification standards differ on a state-by-state basis: 46 states, plus the District of Columbia and Puerto Rico, each enforce their own standards. The many varying laws make compliance confusing and expensive. While getting to a single standard sounds like a good idea, finding a single solution becomes difficult when there are 48 different laws to reconcile. The challenge is to craft a uniform national law that preempts state laws while providing adequate consumer protection.

Five things to look for in a National Breach Notification Law

Passing a single law will be an uphill battle. In the meantime, these are some of the issues that will need to be resolved before a national breach standard can be enacted:

1. What types of personal information should be protected? For example, first and last name plus other information such as a bank account number.
2. What should be classified as "personal" information? Email addresses and user names? Health and medical information (California now includes this)?
3. What qualifies as a breach, and what are the triggers for notification?
4. What information should be included in a breach notice?
5. How soon after a breach should notification be sent? Some states require notices be sent within a set number of days, others as soon as possible.

Potential penalties

What could happen if a company doesn't comply with the proposed laws? Under the White House bill, fines would be limited to $1,000 per day, with a $1 million cap. The two bills in the House would impose penalties of $11,000 per day, maxing out at $5 million.

How to prepare before a national standard is passed

Although the timing for passage is uncertain, communications companies need not wait for a national law to pass before taking action. Put a plan in place instead of sorting through 48 different laws. Preparation can be as simple as making a phone call to your Experian rep about our data breach protection services. Having managed over 2,300 data breach events, Experian can help you effectively mitigate loss. In addition to following updates on this page, you can also stay informed about the progress of pending data breach legislation by following the Data Breach Blog. Share your thoughts and concerns on the current proposals by leaving a comment.

For further reading on this subject:
Experian Data Breach Blog
State Security Breach Notification Laws
Obama Administration Proposal: Law Enforcement Provisions Related to Computer Security (PDF of the full bill)
Obama national breach notification proposal: Good news, bad news for firms
2011 Data Breach Investigations Report (PDF)
This week, American Express unveiled a new payments offering that will surely compete not just with other prepaid options, but will affect debit and credit sales volume as well. The prepaid card offered by American Express carries no fee for activation, reloading or lost-card replacement. The card also offers consumers the option of drawing cash at an ATM. Since the consumer funds all transactions, default risk goes away, exponentially opening up the market potential. The question becomes: how will this impact other plastic and mobile payment sales, today and down the road?

Back in the year 2000, credit cards dominated purchase volume, generating 77% of all merchant sales on general purpose cards versus 23% on debit. Last year, debit and prepaid purchases captured close to half of all general purpose card spend, with credit sales capturing ~53%, debit 44% and the remaining volume coming from prepaid cards. With all of the regulatory changes affecting bank revenue and cost positions, financial institutions are having to rethink existing practices, eliminating rewards programs tied to debit charge volume and resurrecting monthly checking account fees on a large scale. It's not a question of bank gouging; rather, how do financial service providers offset interchange income falling from $0.40+ per transaction to the $0.12 to $0.20 being mandated with the quickly approaching implementation of the Durbin Amendment on July 21st? Add to that reduced fee income from Dodd-Frank, and institutions have to figure out how they can still afford to offer these services to their customers.

Who will the winners and losers be? Let's start with the merchant perspective. With companies now actively promoting services to help merchants calculate which payment vehicles generate the lowest costs without impacting sales volumes, we may well start to see more of the "Costco" business model, where only certain pay-types are accepted at different merchants. My predictions: first, merchants will continue to send a message loud and clear that they perceive the cost of interchange to be too high. Smaller institutions, presumably protected from interchange caps, will be forced to reduce rates anyway to sustain merchant acceptance, unless existing federal law requirements remain unchanged, precluding retailers from following the laws of capitalism. One certainty is that we will see continued development of alternatives similar to the Wal-Mart and Starbucks payment options.

Credit and debit sales will be affected, although they will continue to be valued highly by specific segments. The affluent will continue to expect rewards and other benefits, keeping credit cards highly relevant and meaningful. Additionally, small business owners will need the payment float to fund services they typically don't get paid for up front. At the same time, many consumers will continue to deleverage and sustain their preference for debit over credit. Prepaid and emerging mobile payments technology will continue to attract younger consumers as well as the early adopters who want to leverage the newest and coolest products and services.

The negatives will take some time to surface. First, how will the CFPB react to prepaid growth and the fact that the product is not subject to the interchange caps stipulated under Durbin? Next, how will merchants react, and again, will we continue to see retailer-specific options dominate merchant acceptance?
Lastly, when fraudsters figure out how to penetrate the prepaid and mobile space, will consumers swarm back to credit before advanced fraud prediction models can be deployed, since consumers bear the brunt of the liability in the world of prepaid?
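To put the interchange numbers discussed above in context, the revenue at stake for a debit issuer is straightforward to estimate. A minimal sketch using the per-transaction figures cited in the post and a purely hypothetical transaction volume:

```python
# Rough estimate of lost debit interchange revenue under a fee cap
# (per-transaction fees from the post; transaction volume is hypothetical).
current_fee = 0.44                        # roughly the $0.40+ earned today
capped_fee_low, capped_fee_high = 0.12, 0.20
annual_debit_transactions = 50_000_000    # hypothetical issuer volume

current_revenue = current_fee * annual_debit_transactions
capped_low = capped_fee_low * annual_debit_transactions
capped_high = capped_fee_high * annual_debit_transactions

print(f"Current interchange revenue: ${current_revenue:,.0f}")
print(f"Capped revenue range: ${capped_low:,.0f} - ${capped_high:,.0f}")
print(f"Lost revenue: ${current_revenue - capped_high:,.0f} "
      f"to ${current_revenue - capped_low:,.0f} per year")
```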
For communications companies, acquiring new accounts is an ongoing challenge. However, it is critical to remember that managing new and existing accounts – and their respective risks – is of tremendous importance. A holistic view of the entire customer lifecycle is something every communications organization can benefit from. The following article was originally posted by Mike Myers on the Experian Business Credit blog.

Most of us are pretty familiar with credit reports and scores, but how many of you are aware of the additional tools available to help you manage the entire credit risk lifecycle? I talk to credit managers every day, and as we're all trying to do more with less, it's easy to forget that opening accounts is just the first step. Managing risk on these accounts is as critical, if not more so, than opening them. While others may choose to "ship and chase," you don't need to. Proactive alert and monitoring services, regular portfolio scoring and segmentation are key components that a successful credit department needs to employ in the constant battle against "bad" accounts. Use these tools to proactively adjust credit terms and limits, both positively and negatively. Inevitably some accounts will go bad, but using collection research tools for skip tracing and targeting services for debt collection will put you first in line for collections. A journey of 1,000 miles begins with a single step; we have tools that can help you with that journey, and all can be accessed online.
Later this month, at TRMA's 2011 Summer Conference in San Francisco, U.S. Cellular's John Stevenson will facilitate a panel discussion by industry experts entitled "How to Make a First-Party Program Successful." Topics will include: roll-out, how to measure success, criteria in choosing a partner, experience around unsuccessful ventures and how to turn them around, and training/recruiting (internal versus external).

Panel – How to Make a 1st Party Program Successful
Moderated by: John Stevenson, U.S. Cellular
Wednesday, June 29 | 10:30 AM – 11:15 AM
Panelists: Dave Hall, West Asset Management; David Rogers, GC Services; Sterling Shepherd, CPA

------------------------------------------------

KM: Thanks for joining us today, John. Before we get started, tell us about your background, including what you do for U.S. Cellular and your work on TRMA's Board of Directors.

JS: My pleasure, Kathy. I have been in the wireless industry for over 25 years now, mostly with service providers, including U.S. Cellular, where I have been for the past five and a half years. I lead the Financial Services organization, which is responsible for cradle-to-grave accounts receivable: credit, collections, fraud management, risk assessment and management, all the way through to debt sales and write-off. I just joined the TRMA board earlier this year and am starting to dive into all the activity going on. It's really a strong trade association for sharing information and best practices that can help all members improve results.

KM: The discussion you'll be moderating is entitled "How to Make a First-Party Program Successful." Can you briefly describe the focus of the proceedings and why you believe companies need this information?

JS: Many of our member companies either already use, or are considering the use of, an outsource partner for their first-party collections. This panel is not going to get into whether a company should or should not, but will focus more on how to make it a success once you have made that choice. We have some real depth on our panel; the panelists have seen a lot of programs and know what it takes to make one a success. We are asking the panel to really focus on sharing some of the key points to address with a first-party program. Our aim is that TRMA members, both new to and experienced with first-party programs, have a couple of those "aha" moments where they pick up something new they can use in their own operations.

KM: What are one or two other emerging telecom issues you think people should know about?

JS: There is a recurring theme, and that is the ever-changing risk profile that telecom risk managers have to deal with. The devices are more expensive, the services more complex, and there is a lot of bundling going on. All of that really emphasizes how important it is to ensure your models and strategy are current and continue to deliver the results you expect. That's part of the value of TRMA: no matter what the latest trend or issue in risk management is, this is a great place to learn more about it and to talk to your peers and support partners about it.

KM: Insightful as ever. Thanks so much for your time.
------------------------------------------------

Other sessions of interest at the TRMA Summer Conference

Beyond Consumer Credit: Providing a More Comprehensive Assessment of Small-Business Owners
Wednesday, June 29 | 3:15 PM – 4:00 PM
Presenter: Greg Carmean, Experian Program Manager, Small Business Credit Share
Main topic: new technologies that help uncover fraud, improve risk assessment and optimize commercial collections by providing deeper insights into the entity relationships between companies and their associated principals.

Not registered for the TRMA Summer Conference? Go here.
The end of 2010 was a transitional time for credit card lenders. Card issuers were faced with the need to jump-start "return to growth" strategies as a result of diminished profits stemming from the Great Recession and all of the credit tightening actions deployed over the last two years. Lenders were deliberate in their actions to shrink balance sheets, eliminating higher-risk customers. At the same time, risk-averse consumers were, and continue to be, more thoughtful about spending, taking deliberate measures to buy only what they perceive to be necessary and able to be paid back. Being the only safe bet in town, the super-prime universe went from saturated to abundantly over-saturated, and only recently have lenders begun to turn the ship in anticipation of continued relief in default trends.

As a result of sustained relief in credit card defaults and over-saturation in the prime+ space, more lenders have begun loosening policies. This has created price competition, with 74% of new offers including low introductory rates for longer durations, averaging 12 months, up from 9 months just one year ago. The percentage of offers carrying annual fees decreased as well, to 21% from 34% one year prior. Continuing the trend of competing for the prime+ segment, lenders have increasingly been promoting loyalty programs, in many cases combined with spend-incented rebates. In fact, over a third of new offers were for rewards-based products, up from 26% prior to the start of the economic turn in 2007.

Lenders are now shifting gears to compete in new ways, focusing on consumer demand for payment choices. Regardless of a consumer's credit profile, lenders and technology providers are investing in innovative payment solutions. Lenders understand that if the Starbucks "My Coffee Card" is only available on their customer's iPhone, Blackberry or Android via a reloadable Starbucks app, then traditional card issuers will lose purchase volume. What is becoming more and more critical is a lender's ability to leverage new data sources in their targeting strategies. It is no longer enough to know what products provide the most relevance to consumer needs. A lender must now know the optimal communication channel for unique segments of the population, their payment preferences, and the product terms and features that competitively match the consumer's needs and risk profile. Lenders are leveraging new data sources around income, wealth, rent payment, ARM reset timing and strategic default, wallet spend and purchase timing.
By: Tracy Bremmer

Score migration has always been a topic of interest among financial institutions. I can remember doing score migration analyses as a consultant at Experian for some of the top financial institutions as far back as 2004, prior to the economic meltdown. Lenders were interested in knowing: if I approve a certain number of people above a particular cut-off, how many of them will be below that cut-off within five or more years? Or conversely, of all the people I've rejected because they were below my cut-off, how many of them would have qualified a year later, or maybe even the following month?

We've done some research recently to gain a better understanding of the impact of score migration given the economic downturn. What we found was that, in aggregate, there is not a lot of change going on, because as consumers move up or down in their scores, the overall average shift tends to be minimal. However, when we track this on a quarterly basis by score band, or even at a consumer level, the shift is more meaningful. The general trend is that the VantageScore "A" band, the best scorers, has been shrinking over time, while the VantageScore "D" and "F" bands, the lower scorers, have grown over time. For instance, in 2010 Q4 the number of consumers in VantageScore A was the lowest it has been in the past three years. Conversely, the number of consumers falling into the VantageScore "D" and "F" bands was the highest it has been during that same time period.

This constant shift in credit scores, driven by changes in a consumer's credit file, can impact risk levels beyond the initial point of applicant approval. For this reason, we recommend updating and refreshing scores on a very regular basis, along with regular scorecard monitoring, to ensure that risk propensity and the offering continue to be appropriately aligned with one another.
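One simple way to make the band-level tracking described above concrete is a quarter-over-quarter migration matrix: for each starting band, the share of consumers who end the next quarter in each band. A minimal sketch with hypothetical scores and illustrative cutoffs (not the official VantageScore band definitions):

```python
# Quarterly score-migration sketch (hypothetical scores, illustrative cutoffs).
from collections import Counter

def band(score):
    # Illustrative band cutoffs -- not the official VantageScore definitions.
    if score >= 900: return "A"
    if score >= 800: return "B"
    if score >= 700: return "C"
    if score >= 600: return "D"
    return "F"

# (consumer_id, score_q1, score_q2) -- hypothetical refreshed scores
scores = [(1, 910, 905), (2, 880, 830), (3, 905, 795), (4, 650, 640),
          (5, 720, 610), (6, 590, 605), (7, 810, 820), (8, 930, 935)]

transitions = Counter((band(q1), band(q2)) for _, q1, q2 in scores)
starts = Counter(band(q1) for _, q1, _ in scores)

# Share of each starting band that lands in each ending band one quarter later.
for (src, dst), n in sorted(transitions.items()):
    print(f"{src} -> {dst}: {n / starts[src]:.0%} of band {src}")
```

Tracking the same matrix each quarter is what reveals the gradual thinning of the top band and growth of the lower bands described above, even when the portfolio-level average score barely moves.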
About a month ago, Senior Decisioning Consultant Krista Santucci and I gave a presentation at Experian's 2011 Vision Conference on Decisioning as a Service. Given the positive feedback we received, I thought it might be of interest to members of the communications industry who might not have had the opportunity to attend.

A common malady

The presentation revolved around a case study of an Experian client. Like many communications industry companies, this client had multiple acquisition systems in place to process consumer and commercial applications. In addition, many of the processes to mitigate fraud and support Red Flag compliance were handled manually. These issues increased both complexity and cost, and limited the client's ability to holistically manage its customer base.

The road to recovery

At the beginning of the presentation, we provided a handout that listed the top ten critical functionalities for decisioning platforms. After a thorough review of the client's system, it was clear that they had none of the ten functionalities. Three main requirements for the new decisioning platform were identified:

A single system to support their application processes (integration)
A minimum of 90% automatic decisions for all applications (waterfall; see the sketch at the end of this post)
The ability to integrate with various data sources without being resource-intensive for their IT department (data access)

Decisioning as a ServiceSM is a custom integrated solution that is easily applied to any type of business and can be implemented to either augment or completely overhaul an organization's current decisioning platforms. We designed this client's solution with a single interface that manages both consumer and commercial transactions and supports a variety of access channels and treatment strategies. Following implementation, the client immediately benefited from:

Streamlined account opening processes
A reduction in manual processes
Decreased demand on IT resources
The ability to make better, more consistent decisions at a lower cost
The agility to quickly respond to changing market needs and regulatory challenges

Evaluate your own business

Do you recognize some of your own challenges in this post? Download our checklist of the top ten critical functionalities for decisioning platforms and evaluate your own system. As you go through the list, think about what benefits you would derive from having access to each of the capabilities. And if you'd like to learn more about Decisioning as a Service, please complete our form.
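As a simple illustration of the waterfall requirement mentioned above (a minimal sketch with hypothetical rules and thresholds, not the actual Decisioning as a Service logic), applications can be passed through an ordered set of automated rules, with only the residual falling to manual review; the share decided automatically is the metric the client targeted at 90%.

```python
# Minimal decisioning-waterfall sketch (illustrative rules, hypothetical
# thresholds -- not the actual Decisioning as a Service implementation).

def decide(app):
    """Return (decision, reason), or None to fall through to manual review."""
    if app.get("fraud_flag"):                 # fraud / Red Flag checks first
        return ("REFER", "fraud review")
    if app["score"] >= 700:
        return ("APPROVE", "score above auto-approve cutoff")
    if app["score"] < 550:
        return ("DECLINE", "score below auto-decline cutoff")
    return None                               # gray zone -> manual review

applications = [
    {"id": 1, "score": 720, "fraud_flag": False},
    {"id": 2, "score": 530, "fraud_flag": False},
    {"id": 3, "score": 640, "fraud_flag": False},   # falls to manual review
    {"id": 4, "score": 760, "fraud_flag": True},
]

auto = sum(1 for a in applications if decide(a) is not None)
print(f"Automatic decision rate: {auto / len(applications):.0%}")
```

In practice the rule set would span many more data sources and both consumer and commercial policies, but the measurement is the same: the fraction of applications that never reach a human.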
By: Kennis Wong

Data is the very core of fraud detection. We are constantly seeking new data sources and mining existing ones to gain more insight into consumers' fraud and identity theft risk. Here is one way to categorize the various data sources.

Account level - When organizations detect fraud, they naturally leverage the data in-house. This type of data usually comes from individual account activities such as transactions, payments, locations or types of purchases. For example, if there's a $5,000 purchase at a dry cleaner, the transaction itself is suspicious enough to raise a red flag.

Customer level - Most of the time we want to see a bigger picture than the account level alone. If the customer also has other accounts with the organization, we want to see the status of those accounts as well. This is important not only from a fraud detection perspective but also from a customer relationship management perspective.

Consumer level - As Experian Decision Analytics' clients can attest, sometimes it's not sufficient to look only at the data within an organization; you also need to look at all the financial relationships of the consumer. For example, in the case of bust-out fraud or first-party fraud, if you only look at the individual account, it wouldn't be clear whether a consumer has truly committed fraud. But when you look at the behavior across all the financial relationships, the picture becomes clear.

Identity level - Fraud detection can also go to the identity level. What I mean is that we can tie a consumer's individual identity elements to those of other consumers to discover hidden inconsistencies and relationships. For example, we can observe the use of the same SSN across different applications and see whether the phones or addresses are the same. In the account management environment, when detecting existing-account fraud or account takeover, this level of linkage is very useful as more data becomes available after the account is open.
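A minimal sketch of the identity-level linkage described above: group applications by SSN and flag cases where the same SSN appears with conflicting names, phones or addresses. The data and the matching rule here are hypothetical and deliberately simplified; production identity resolution relies on fuzzy matching and many more identity elements.

```python
# Identity-level linkage sketch: flag SSNs reused across applications with
# inconsistent identity elements (hypothetical data, simplified matching).
from collections import defaultdict

applications = [
    {"ssn": "123-45-6789", "name": "J. Smith", "phone": "555-0101", "addr": "1 Elm St"},
    {"ssn": "123-45-6789", "name": "J. Smith", "phone": "555-0101", "addr": "1 Elm St"},
    {"ssn": "987-65-4321", "name": "A. Jones", "phone": "555-0202", "addr": "9 Oak Ave"},
    {"ssn": "987-65-4321", "name": "B. Brown", "phone": "555-0303", "addr": "7 Pine Rd"},
]

by_ssn = defaultdict(list)
for app in applications:
    by_ssn[app["ssn"]].append(app)

for ssn, apps in by_ssn.items():
    names = {a["name"] for a in apps}
    phones = {a["phone"] for a in apps}
    addrs = {a["addr"] for a in apps}
    if len(apps) > 1 and (len(names) > 1 or len(phones) > 1 or len(addrs) > 1):
        print(f"SSN {ssn}: reused with inconsistent identity elements -> review")
```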
The Consumer Financial Protection Bureau (CFPB) is a new regulatory agency that is still evolving. But even now it's clear that it will have unprecedented powers with a broad reach across industries – including communications. Although there are questions about how the CFPB will operate, there are still steps you can take to prepare. To help you get ready, let's review a few of the areas where you should expect the CFPB to affect your business, followed by three questions you can help your customers answer.

3 Ways the CFPB Will Impact Business:

Consumer disclosures must be clear and easy to read – The goal is to ensure that financial terms and conditions of services (especially for credit cards and mortgages) are disclosed in clear, easy-to-understand terms that allow consumers to compare offers.

Consumer products to be examined rather than industries – Regulatory agencies are typically structured around the kinds of businesses they supervise. With the CFPB, we'll see a regulator with a perspective more focused on consumer financial products and services.

Transparency on how credit scores affect terms and conditions – Greater transparency about credit scores and how they are used to determine loan rates will also be a priority. Beginning July 21, 2011, lenders are required to disclose the score they used in all risk-based pricing notices and adverse action notices.

CFPB Takes Authority on July 21

The CFPB receives full regulatory and enforcement authority on July 21, so it's important for covered entities to continue complying with current law and striving to follow industry best practices. Companies need to demonstrate that they have taken steps to increase consumer credit education and the transparency of credit scores, as these items top the CFPB agenda.

Experian Consumer Education Resources

Experian is addressing the growing need for consumer education by offering Experian Credit EducatorSM, a credit education service in which consumers engage in a one-on-one credit education session with an Experian credit professional, together reviewing a copy of their credit report and VantageScore®.

Answers to 3 Consumer Questions:

As part of Experian Credit Educator, consumers learn the answers to three main questions:

What's in a credit report?
What is a score, and what types of information can increase or decrease a score?
How does credit affect my financial situation?

Experian Credit Educator allows lenders to provide customers a personalized education service, thereby advancing customer engagement while improving customer satisfaction, loyalty, portfolio quality, and cross-sell opportunities.

Do you have questions about the CFPB's role? Leave a comment, or contact your Experian representative if you need assistance in complying with new regulatory requirements.