Uncategorized


By: Kari Michel

The way medical debts are treated in credit scores may change with the introduction, in June 2011, of the Medical Debt Responsibility Act. The bill would require the three national credit bureaus to expunge medical collection records of $2,500 or less from consumers’ files within 45 days of their being paid or settled. It is co-sponsored by Representatives Heath Shuler (D-N.C.), Don Manzullo (R-Ill.) and Ralph M. Hall (R-Texas).

As a general rule, expunging predictive information is not in the best interest of consumers or credit granters -- both benefit when credit reports and scores are as accurate and predictive as possible. If any type of debt information proven to be predictive is expunged, consumers risk exposure to inappropriate credit products because they may appear more financially equipped to handle new debt than they truly are.

Medical debts are never taken into consideration by VantageScore if the debt is known to be reported by a medical facility. When a medical debt is outsourced to a third-party collection agency, it is treated the same as any other debt in collection. Collection accounts of less than $250, or ones that have been settled, have less impact on a consumer’s VantageScore. With or without medical-debt collection information, the VantageScore model remains highly predictive.

Published: August 29, 2011 by Guest Contributor

By: Mike Horrocks

The realities of the new economy and the credit crisis are driving businesses and financial institutions to better integrate new data and analytical techniques into operational decision systems. Adjusting credit risk processes in the wake of new regulations, while also increasing profits and customer loyalty, will require a new brand of decision management systems that accelerate more precise customer decisions.

A Webinar scheduled for Thursday will show how blending business rules, data and analytics inside a continuous-loop decisioning process can empower your organization to control marketing, acquisition and account management activities, minimizing risk exposure while ensuring portfolio growth. Topics include:

- What the process is and the key building blocks for operating one over time
- Why the process can improve customer decisions
- How analytical techniques can be embedded in the change control process (including data-driven strategy design or optimization)

If you’re interested, there is still time to register for the Webinar. And if you just want to see a great video, check out this intro.

Published: August 24, 2011 by Guest Contributor

With the raising of the U.S. debt ceiling and its recent ramifications consuming the headlines over the past month, I began to wonder what would happen if the general credit consumer had made a similar argument to their credit lender. Something along the lines of, “Can you please increase my credit line (although I am maxed out)? I promise to reduce my spending in the future!” While novel, probably not possible. In fact, just the opposite typically occurs when an individual begins to borrow up to their personal “debt ceiling.”

When the ratio of the credit an individual uses to the credit available to them rises above a certain percentage, it can adversely affect their credit score, in turn affecting their ability to secure additional credit. This percentage, known as the utilization rate, is one of several factors considered in an individual’s credit score calculation. For example, the utilization rate makes up approximately 23% of an individual’s calculated VantageScore.

The good news is that consumers as a whole have been reducing their utilization rate on revolving credit products such as credit cards and home equity lines of credit (HELOCs) to the lowest levels in over two years. Bankcard and HELOC utilization is down to 20.3% and 49.8%, respectively, according to the Q2 2011 Experian – Oliver Wyman Market Intelligence Reports. In addition to lowering their utilization rate, consumers are also doing a better job of managing their current debt, resulting in multi-year lows for delinquency rates, as mentioned in my previous blog post.

By lowering their utilization and delinquency rates, consumers are viewed as less of a credit risk and become more attractive to lenders for new products and increased credit limits. Perhaps the government could learn a lesson or two from today’s credit consumer.
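For readers who want to see the arithmetic behind the utilization rate described above, here is a minimal sketch. The account figures are hypothetical, and the 23% figure is VantageScore’s published approximate weighting, not something computed from an individual account.

```python
def utilization_rate(balances, credit_limits):
    """Revolving utilization: total balances as a share of total available credit."""
    total_balance = sum(balances)
    total_limit = sum(credit_limits)
    if total_limit == 0:
        return None  # no open revolving credit; the rate is undefined
    return total_balance / total_limit

# Hypothetical consumer with two bankcards and a HELOC
balances = [1_200, 800, 25_000]         # current balances in dollars
credit_limits = [5_000, 5_000, 50_000]  # credit limits / credit lines in dollars

rate = utilization_rate(balances, credit_limits)
print(f"Overall revolving utilization: {rate:.1%}")  # -> 45.0% for this example
```

Paying down balances or obtaining a higher limit both lower this ratio, which is why the aggregate declines in bankcard and HELOC utilization noted above are read as a positive sign by lenders.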

Published: August 23, 2011 by Alan Ikemura

Consumer credit card debt has dipped to levels not seen since 2006, and the memory of pre-recession spending habits continues to get hazier with each passing day. In May, revolving credit card balances totaled over $790 billion, down $180 billion from mid-2008 peak levels. Debit and prepaid volume accounted for 44%, or nearly half, of all plastic spending, growing substantially from 35% in 2005 and 23% a decade ago. Although month-to-month tracking suggests some noise in the trends, as illustrated by the slight uptick in credit card debt from April to May, the changes we are seeing are not at all temporary. What we are experiencing is a combination of many factors, including the aftermath of recession tightening, changes in the level of comfort with financing non-essential purchases, the “new boomer” population entering the workforce in greater numbers and the diligent efforts of Gen Xers to improve the general household wallet composition.

How do card issuers shift existing strategies?

Baby boomers are entering that comfortable stage of life where incomes are higher and expenses are beginning to trail off as the last child is put through college and mortgage payments are predominantly applied toward principal. This group worries more about retirement investments and depressed home values, and as such, they demand high value for their spending. Rewards-based credit continues to resonate well with this group. Thirty years ago, baby boomers watched as their parents used cash, money orders and teller checks to manage finances, but today’s population has access to many more options and is highly educated. As such, this group demands value for its business, and a constant review of competitive offerings and development of new, relevant rewards products are needed to sustain market share.

The younger generation is focused on technology. Debit and prepaid products accessible through mobile apps are more widely accepted by this group, unlike ten to fifteen years ago, when multiple credit cards with four-figure credit limits each were provided to college students on a large scale. Today’s new boomer is educated on the risks of using credit, while at the same time parents are apt to absorb more of their children’s monthly expenses. Servicing this segment’s needs, while helping them establish a solid credit history, will result in long-term penetration of a growing segment.

The recent CARD Act and subsequent amendments have taken a bite out of revenue previously used to offset the increased risk and related costs that allowed card issuers to service the near-prime sector. However, we are seeing a trend of new lenders getting into the credit card game while existing issuers slowly start to evaluate the next tier. After six quarters of consistent credit card delinquency declines, we are seeing slow signs of relief. The average VantageScore for new card originations increased by 8 points from the end of 2008 into early 2010, driven by credit tightening actions, and has started to slowly come back down in recent months.

What next?

What all of this means is that card issuers have to be more sophisticated with risk management and marketing practices. The ability to define segments through the use of alternate data sources and access channels is critical to the ongoing capture of market share and profitable usage.

First, the segmentation will need to identify the “who” and the “what”: who wants what products, how much credit a consumer is eligible for, and what rate, terms and rewards structure will be required to achieve desired profit and risk levels, particularly as the economy continues to teeter between further downturn and, at best, slow growth. By incorporating new modeling and data intelligence techniques, we are helping sophisticated lenders cherry-pick non-super-prime prospects and offering guidance on aligning products that best balance risk and reward dynamics for each group. If done right, card issuers will continue to service a diverse universe of segments and generate profitable growth.

Published: August 22, 2011 by Guest Contributor

What happens when once desirable models begin to show their age? Not the willowy, glamorous types that prowl high-fashion catwalks, but rather the aging scoring models you use to predict risk and rank-order various consumer segments. Keeping a fresh face on these models can return big dividends in the form of lower risk, accurate scoring and higher-quality customers. In this post, we provide an overview of custom attributes and present the benefits of overlaying current scoring models with them. We also suggest specific steps communications companies can take to improve the results of an aging or underperforming model.

The beauty of custom attributes

Attributes are highly predictive variables derived from raw data. Custom attributes, like those you’ve created in house or obtained from third parties, can provide deeper insights into specific behaviors, characteristics and trends. Overlaying your scoring model with custom attributes can further optimize its performance and improve lift. Often, the older the model, the greater the potential for improvement.

Seal it with a KS

Identifying and integrating the most predictive attributes can add power to your overlay, including the ability to accurately rank-order consumers. Overlaying also increases the separation of “goods” and “bads” (referred to as “KS”) for a model within a particular industry or sub-segment. Not surprisingly, the most predictive attributes vary greatly between industries and sub-segments, mainly due to behavioral differences among their populations.

Getting started

The first step in improving an underperforming model is choosing a data partner -- one with proven expertise in multivariate statistical methods and models for the communications industry. Next, you’ll compile an unbiased sample of consumers, a reject inference sample and a list of attributes derived from the sources you deem most appropriate. Attributes are usually narrowed to 10 or fewer from the larger list, based on predictiveness.

Predefined, custom or do-it-yourself

Your list could include attributes your company has developed over time, or those obtained from other sources, such as Experian Premier Attributes℠ (more than 800 predefined consumer-related choices) or Trend View℠ attributes. Relationship, income/capacity, loan-to-value and other external data may also be overlaid.

Attribute Toolbox™

Should you choose to design and create your own list of custom attributes, Experian’s Attribute Toolbox™ offers a platform for development and deployment of attributes from multiple sources (customer data or third-party data identified by you).

Testing a rejuvenated model

The revised model is tested on both your unbiased and reject inference samples to confirm and evaluate any additional lift produced by the newly overlaid attributes. After completing your analysis and due diligence, the attributes are installed into production. Initial testing, in a live environment, can be performed for three to twelve months, depending on the segment (prescreen, collections, fraud, non-pay, etc.), outcome or behavior your model seeks to predict. This measured, deliberate approach is considered more conservative than turning new attributes on right away. Depending on the model’s purpose, improvements can be immediate or more tempered. However, the end result of overlaying attributes is usually better accuracy and performance.

Make your model super again

If your scoring model is starting to show its age, consider overlaying it with high-quality predefined or custom attributes. Because in communications, risk prevention is always in vogue. To learn more about improving your model, contact your Experian representative. To read other recent posts related to scoring, click here.
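As a rough illustration of the “KS” separation measure mentioned above, here is a minimal sketch, assuming you already have model scores and good/bad outcome flags for a validation sample. The scores and the function below are hypothetical, not part of any Experian product.

```python
def ks_statistic(scores_good, scores_bad):
    """Kolmogorov-Smirnov separation: the maximum gap between the cumulative
    score distributions of 'good' and 'bad' accounts. Larger = better separation."""
    thresholds = sorted(set(scores_good) | set(scores_bad))
    ks = 0.0
    for t in thresholds:
        cdf_good = sum(s <= t for s in scores_good) / len(scores_good)
        cdf_bad = sum(s <= t for s in scores_bad) / len(scores_bad)
        ks = max(ks, abs(cdf_good - cdf_bad))
    return ks

# Hypothetical validation sample: higher score = lower expected risk
good_scores = [720, 680, 750, 640, 690, 770]  # accounts that performed well
bad_scores = [590, 620, 640, 700, 660]        # seriously delinquent accounts

print(f"KS = {ks_statistic(good_scores, bad_scores):.2f}")  # ~0.63 for this toy sample
```

Comparing the KS of the base model against the model overlaid with candidate attributes, on the same validation and reject inference samples, is one simple way to quantify the lift discussed in this post.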

Published: August 19, 2011 by Guest Contributor

It’s time to focus on growth again. In 2010, credit marketers focused on testing new acquisition strategies. In 2011, credit marketers are implementing learnings from those tests.

As consumer lending becomes more competitive, lenders are strategically implementing procedures to grow portfolios by expanding their marketable universe. The new universe of prospective customers is moving steadily beyond prime to a variety of near-prime segments outside of the marketing spectrum that lenders have targeted for the past three years. Many credit marketers have moved beyond testing based on new regulatory requirements and have started to market to slightly riskier populations. From testing lower-scoring segments to identifying strategies for unbanked/underbanked consumers, the breadth of methods that lenders are using to acquire new accounts has expanded. Portfolio growth strategies encompass internal process enhancements, product diversification, and precise underwriting and account management techniques that utilize new data assets and analytics to mitigate risk and identify the most profitable target populations.

Experian® can help you identify best practices for growth and develop customized strategies that best suit your acquisition objectives. Whether your needs include internal methods to expand your marketable universe (i.e., marketing outside of your current footprint or offers to multiple individuals in a household) or changes to policies for external expansion strategies (i.e., near-prime market sizing or targeting new prospects based on triggered events), Experian has the expertise to help you achieve desired results. For more information on the new acquisition strategies and expanding your marketing universe, leave a comment below or call 1 888 414 1120.

Published: August 9, 2011 by Guest Contributor

The high-profile data breaches in recent months not only left millions of consumers vulnerable to the threat of identity theft and caused businesses to incur significant costs, but they also brought data security to the top of the agenda in Washington. In Congress, members of both the House and the Senate have used the recent data breaches to demonstrate the need for a uniform national data breach notification standard and increased data security standards for companies that collect consumer information. Hearings have been held on the issue, and it is expected that legislation will be introduced this summer.

At the same time, the Obama Administration continues to call for greater data security standards. The White House released its highly anticipated cybersecurity initiative in May. In addition to implementing a national data breach notification law, the proposal would require certain private companies to develop detailed plans to safeguard consumer data.

As legislation develops and advances through multiple Congressional committees, Experian will be working with allies and coalitions to ensure that the data security standards established under the Gramm-Leach-Bliley Act and the Fair Credit Reporting Act are not superseded by new, onerous and potentially ineffective mandates.

We welcome your questions and comments below.

Published: August 4, 2011 by Guest Contributor

It’s here! July 21 marks the official launch of the Consumer Financial Protection Bureau (CFPB). This new government agency gains the power to write and enforce 18 consumer protection laws that guide financial products and services, including the Fair Credit Reporting Act, the Equal Credit Opportunity Act and the Fair Debt Collection Practices Act. There’s one snag, however: a director has yet to be approved by the Senate to lead the CFPB. Earlier this week, President Obama nominated former Ohio Attorney General Richard Cordray, but it will likely take some time for his nomination to move through the confirmation process. This means that although the Bureau can now enforce existing laws, it will not (yet) have the authority to write new rules.

While the agency’s power will be limited at first, the Dodd-Frank Act requires the CFPB to take a number of early steps that will impact the telecom industry (as we’ve noted previously here, here and here).

Three CFPB priorities:

1) Clarify how credit scores affect lender decisions. Beginning today, lenders will be required to disclose the credit score that they used in all risk-based pricing notices and adverse action notices. The CFPB is expected to draft its own compliance rules, but in the meantime the FTC and Federal Reserve have jointly issued a rule that identifies the specific information that must be disclosed and provides model forms of notice. As we’ve discussed earlier, these new requirements can provide valuable relationship-building opportunities for your business.

2) Shorten and simplify consumer disclosure forms. The CFPB has made it clear that one of its first actions will be to make the terms and conditions of financial products and services easier for consumers to understand and compare to other offers (particularly for the mortgage and credit card industries).

3) Review debt collection practices. The CFPB now has the authority to enforce the Fair Debt Collection Practices Act and to review current debt collector practices to determine whether their methods are abusive or unfair. Once a CFPB director is confirmed by the Senate, the Bureau may seek to restructure and update debt collection laws.

As the CFPB hires new staff and develops its regulatory agenda, it will be important for companies to follow industry best practices. Check back frequently as we continue to provide updates on CFPB developments that will impact your business. If you have questions, concerns or comments, please share your thoughts below.

Published: July 21, 2011 by Guest Contributor

Every communications company wants to inoculate its portfolio against bad debt, late payments and painful collections. But many still use traditional generic risk models to uncover potential problems, either because they’ve always used generics or because they see their limited predictive abilities as adequate.

Generalization dilutes results

The main problem with generics, however, is how they generalize consumers’ payment behavior and delinquencies across credit cards, mortgages, auto loans and other products. They do not include payment and behavioral data focused solely on actual communications customers. Moreover, their scoring methodologies can be too broad to provide the performance, lift or behavioral insights today’s providers strive to attain.

Advantages of industry-specific models

Communications-specific modeling can be more predictive if you want to know who’s more likely to prioritize their phone bill and remit promptly, and who’s not. In multiple market validations pitting an optimized industry-specific model against traditional generic products, Experian’s Tele-Risk Model℠ and Telecommunications, Energy and Cable (TEC) Risk Model℠ more accurately predicted the likelihood of future serious delinquent or derogatory payment behavior. Compared with generics, they also:

- Provided a stronger separation of good and bad accounts
- More precisely classified good vs. bad risk through improved rank ordering
- Accurately scored more consumers, including many who might otherwise have been considered unscorable by a generic score

Anatomy of a risk score

These industry risk models are built and optimized using TEC-specific data elements and sample populations, which makes them measurably more predictive for evaluating new or existing communications customers. Optimization also helps identify other potentially troublesome segments, including those that might require special handling during onboarding, “turn ons” or delinquency management.

Check the vital signs

To assess the health of your portfolio, ask a few simple questions:

- Does your risk model reflect the unique behaviors of actual communications customers?
- Is overly generic data suppressing lift and masking hidden risk?
- Could you score more files that are currently deemed unscorable?

Unless the answer is “yes” to all, your model probably needs a check-up—stat.

Published: July 13, 2011 by Guest Contributor

Like their utility counterparts, communications providers routinely participate in federally subsidized assistance programs that discount installation or monthly service for qualified low-income customers. But, as utilities have found, certain challenges must be considered when mining this segment for new growth opportunities, including:

- Thwarting scammers who use falsified income data and/or multiple IDs to game the system and double up on discounts
- Equipping internal teams to efficiently process the potential mountain of program applications and recertification paperwork

The right tool for the job

Experian’s Financial Assistance Checker℠ product is a powerful scoring tool that indicates whether consumers may qualify for low-income assistance programs (such as Lifeline and Link Up). Originally designed for (and currently used by) utilities, Financial Assistance Checker offers risk-reduction and resource-utilization efficiencies that also benefit communications providers.

Automation saves time

For example, Financial Assistance Checker may be used to help qualify specific individuals among new and existing low-income program participants, as well as others who may qualify but have not yet enrolled. The solution also helps automate labor-intensive manual reviews, making the process less costly and more efficient. Some companies have reduced manual intervention by up to 50% by using financial assistance scores to automatically re-certify current enrollees.

Strengthen your overall game plan

Experian’s Financial Assistance Checker may be used to:

- Produce a score that aids in effective decisions
- Reduce the number of manually reviewed applications
- Facilitate more efficient resource allocation
- Mitigate fraud risk by rejecting unqualified applicants

Cautionary caveat

Financial Assistance Checker is derived exclusively from Experian’s credit data, without demographic factors. While it’s good at qualifying applicants and customers, it may not be used as a basis for adverse action or removal from a program — only to determine eligibility for low-income assistance.

Today, acquisition is the name of the game. If your growth strategy calls for leveraging subsidized segments, consider adding Experian’s Financial Assistance Checker product to your starting lineup. After all, the best offense could just be a strong defense.

Link & Learn

This link takes you to a short but informative video about Lifeline and Link Up. See the FCC’s online Lifeline and Link Up program overview here. Hot off the government press! Click to see the FCC’s 6/21/11 report on Lifeline and Link Up Reform and Modernization.

Published: July 8, 2011 by Guest Contributor

By: Kari Michel

Strategic default has been a hot topic in the media as far back as 2009, and it will continue to be, as this problem won’t really go away until home prices climb and stay there.

Terry Stockman (not his real name) earns a handsome income, maintains a high credit score and owns several residential properties, including the Southern California home where he has lived since 2007. Terry is now angling to buy the foreclosed home across the street. What’s so unusual about this? Terry hasn’t made a mortgage payment on his own home for more than six months. With prices now at 2003 levels, his house is worth only about one-half of what he paid for it. Although he isn’t paying his mortgage loan, Terry is current with his other debt payments.

Terry is a strategic defaulter — and he isn’t alone. By the end of 2008, a record 1 in 5 mortgages at least 60 days past due was a strategic default. Since 2008, strategic defaults have fallen below that percentage in every quarter through the second quarter of 2010, the most recent quarter for which figures are available. However, the percentages are still high: 16% in the last quarter of 2009 and 17% in the second quarter of last year. Get more details from our 2011 Strategic Default Report.

What does this mean for lenders?

Mortgage lenders need to be able to identify strategic defaulters in order to best employ their resources and set different strategies for consumers who have defaulted on their loans. Specifically designed indicators help lenders identify suspected strategic default behavior as early as possible and can be used to prioritize account management or collections workflow queues for better treatment strategies. They can also be used in prospecting and account acquisition strategies to better understand payment behavior prior to extending an offer. Here is a white paper I thought you might find helpful.
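As a toy illustration of the kind of pattern such indicators look for, here is a minimal sketch. The rule and thresholds below are hypothetical simplifications for illustration only; they are not Experian’s actual indicator logic.

```python
from dataclasses import dataclass

@dataclass
class Borrower:
    mortgage_days_past_due: int
    other_trades_current: bool   # current on cards, auto loans, etc.
    credit_score: int
    estimated_ltv: float         # mortgage balance / current home value

def possible_strategic_defaulter(b: Borrower) -> bool:
    """Hypothetical screen: seriously delinquent on the mortgage, deeply underwater,
    yet keeping other obligations current with a strong score."""
    return (
        b.mortgage_days_past_due >= 60
        and b.other_trades_current
        and b.credit_score >= 700
        and b.estimated_ltv > 1.2
    )

# A borrower resembling "Terry": six months behind on the mortgage on a home now
# worth roughly half its purchase price, but current everywhere else.
terry = Borrower(mortgage_days_past_due=180, other_trades_current=True,
                 credit_score=760, estimated_ltv=2.0)
print(possible_strategic_defaulter(terry))  # True
```

In practice, lenders would feed flags like this into collections prioritization and acquisition strategies rather than treat any single rule as conclusive.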

Published: July 1, 2011 by Guest Contributor

When the Consumer Financial Protection Bureau (CFPB) takes authority on July 21, debt collectors and communications companies should pay close attention. If the CFPB has its way, the rules may be changing.

Old laws, new technologies

The rules governing consumer communications for debt collection haven’t seen a major update since they were written in 1977. While the FTC has enforcement power in this area, it can’t write rules — Congress must provide direction. Consequently, the rules guiding the debt collection industry have evolved based on decisions by the courts. In the meantime, technology has outpaced the law. Debt collectors have taken advantage of the latest available methods of communication, such as cell phones, autodialers and email, while the compliance requirements have largely remained murky. At the same time, complaints to the FTC about debt collection practices continue to rise. While the number is relatively low compared to the amount of overall activity, the FTC receives more complaints about debt collectors than about any other industry. The agency has also raised concerns about how new communication tools, such as Facebook and Twitter, will affect the future of debt collection.

Priorities for the CFPB

While mortgages, credit cards and payday loans will be the early priorities for the CFPB, high on its list of to-do items will be updating the laws governing consumer communications for debt collection. Under the Dodd-Frank Act, the CFPB will be responsible not only for enforcing the Fair Debt Collection Practices Act (FDCPA), but it will also have a new ability to write the rules. This raises new issues, such as how new regulations will affect the ways debt collection companies can contact consumers. Even as lenders and communications companies have expressed concern about the CFPB writing the rules, the hope is that the agency will create a more predictable legal structure that covers new technologies and reduces the uncertainty around compliance. Faced with the prospect of clarified compliance requirements around debt collection, the ACA (Association of Credit and Collection Professionals) has started to get in front of the CFPB by putting together its own blueprint.

Will the CFPB be ready by July 21?

Over the last year, the CFPB has been busy building an organizational structure but still lacks a leader appointed by the President and confirmed by the Senate. (Elizabeth Warren is currently the unofficial director.) Without a permanent director in place, the agency will be unable to gain full regulatory authority on July 21 – the date set by the Treasury Department. Until then, the CFPB will be able to enforce existing laws but will be unable to write new regulations.

Despite the political uncertainty, debt collectors and communications firms still need to be prepared. One way is to ensure you’re following the industry best practices established by the ACA. To help you be ready for any outcome, we’ll continue to follow this issue and keep you apprised of the CFPB’s direction. Let us know your thoughts and concerns in the comment section, or feel free to contact your Experian rep directly with any questions you may have.

Helpful links:

- Association of Credit and Collection Professionals
- Fair Debt Collection Practices Act (PDF)
- Consumer Financial Protection Bureau (CFPB)

Published: July 1, 2011 by Guest Contributor

This is the third and final post in an interview between Experian’s Tom Whitfield and Dr. Michael Turner, founder, president and CEO of the Policy and Economic Research Council (PERC)—a non-partisan, non-profit policy institute devoted to research, public education, and outreach on public and economic policy matters. In this post Dr. Turner discusses mandatory credit-information sharing for communications companies, and the value of engaging and educating state regulators.

_____________________________

Does it make sense for the FTC to mandate carriers to report?

Credit information sharing in the United States is a voluntary system under the Fair Credit Reporting Act (FCRA). Mandating information sharing would break precedent with this successful, decades-old regime, and could result in less rather than more information being shared, as it shifts from being a business matter to a compliance issue. Additionally, the voluntary nature of credit reporting allows data furnishers and credit bureaus to modify reporting in response to concerns. For example, in reaction to high utility bills as a result of severe weather, a utility provider may wish to report only delinquencies 60 days or more past due. Similarly, a credit bureau may not wish to load data it feels is of questionable quality. A voluntary system allows for these flexible modifications in reporting. Further, under existing federal law, those media and communications firms that decide they want to fully report payment data to one or more national credit bureaus are free to do so. In short, there is simply no need for the FTC to mandate that communications and media companies report payment data to credit bureaus, nor would there be any immediate benefit in doing so.

How much of the decision is based on the influence of the state PUC or other legislative groups?

Credit information sharing is federally regulated by the Fair Credit Reporting Act (FCRA). The FCRA preempts state regulators, and as such, a media or communications firm that wants to fully report may do so regardless of the preferences of the state PUC or PSC. PERC realizes the importance of maintaining good relations with oversight agencies. We recommend that companies communicate the fact that they fully report payment data to a PUC or PSC and engage in proactive outreach to educate state regulators on the value of credit reporting customer payment data. There have been notable cases of success in this regard. Currently, just four states (CA, OH, NJ and TX) have partial prohibitions regarding the onward transfer of utility customer payment data to third parties, and none of these provisions envisioned credit reporting when drafted. Instead, most are add-ons to federal privacy legislation. Only one state (CA) has restrictions on the onward transfer of media and communications customer payment data, and again this has nothing to do with credit reporting.

Agree, disagree or comment

Whether you agree with Dr. Turner’s assertions or not, we’d love to hear from you. So please, take a moment to share your thoughts about full-file credit reporting in the communications industry. Click here to learn more about current and pending legislation that impacts communications providers.

Published: June 29, 2011 by Guest Contributor

By: John Straka

The U.S. housing market remains relatively weak, but it’s probably not as weak as you think. To what extent are home prices really falling again?

Differing Findings

Most recent media coverage of the “double dip in home prices” has centered on declines in the popular Case-Shiller price index; however, the data entering into this index is reported with a lag (the just-released April index reflects data for February-April) and with some limitations. CoreLogic publishes a more up-to-date index value that earlier this month showed a small increase, and more importantly, CoreLogic also produces an index that excludes distressed sales. This non-distressed index has shown larger recent price increases, and it shows increases over the last 12 months in 20 states. Others basing their evidence on realtors’ listing data have concluded that there was some double dip last year, but prices have actually been rising now for several months (see Altos). These disparate findings belie overly simplistic media coverage, and they stress that “the housing market” is not one single market, of course, but a wide distribution of differing outcomes in very many local neighborhood home markets across the nation. (For a pointed view of this, see Charron.)

Improved Data Sources

Experian is now working with Collateral Analytics, the leading source of the most granular and timely home market analytics and information drawn from nationwide local market data, and the best automated valuation model (AVM) provider based on these and other data. (Their AVM leads in accuracy and geographic coverage in most large lender and third-party AVM tests.) While acknowledging their popularity, value, and progress, Collateral Analytics President Dr. Michael Sklarz questions the traditional dominance of repeat-sales home price indexes (from Case-Shiller, etc.). Repeat-sales data typically includes only around 20 to 30 percent of the total home sales taking place. Collateral Analytics instead studies the full market distribution of home sales and market data and uses these detailed data to construct hedonic price indexes that control for changing home characteristics. This approach provides a similar “constant quality” claim as repeat-sales indexes—without throwing away a high percentage of the market observations. Collateral Analytics indexes also cover over 16,000 zip codes, considerably more than others.

Regular vs. Distressed Property Sales

Nationwide, some well-known problem states, areas and neighborhoods continue to fare worse than most others in today’s environment, and this skewed national distribution of markets is not well described by overall averages. Indeed, on closer inspection, the recent media-touted gloomy picture of home prices that are “falling again” or that “continue to fall” is a distorted view for many local home markets, where prices have been rising a little or even more, or at least remaining flat or stable. Nationwide or MSA averages that include distressed-property sales (as Case-Shiller tends to do) can be misleading for most markets. The reason for this is that distressed-property sales, while given much prominence in recent years and lowering overall home-price averages, have affected but not dominated most local home markets. The reporting of continued heavy price discounts (twenty percent or significantly more) for distressed sales in most areas is a positive sign of market normality.
It typically takes a significantly large buildup of distressed property sales in a local area or neighborhood home market to pull regular property sale prices down to their level. For normal or regular home valuation, distressed sales are typically discounted due to their “fire sale” nature, “as is” terms, and property neglect or damage. This means that the non-distressed or regular home price trends are most relevant for most homes in most neighborhoods. Several examples are shown below. As suggested in these price-per-living-area charts, regular (non-distressed) home-sale prices have fared considerably better in the housing downturn than the more widely reported overall indexes that combine regular and distressed sales(1).

[Charts: Regular-Sale and Combined Home Prices in $ Per Square Foot of Living Area, and Distressed Sales as a Percent of Total Sales]

In Los Angeles, combined sale prices fell 46 percent peak-to-trough and are now 16 percent above the trough, while regular sale prices fell by considerably less, 33 percent, and are now 3 percent above the trough. Distressed sales as a percent of total sales peaked at 52 percent in 2009:Q1, but then fell to a little under 30 percent by 2010:Q2, where it has largely remained (this improvement occurred before the general “robo-signer” process concerns slowed down industry foreclosures). L.A. home prices per square foot have remained largely stable for the past two years, with some increase in distressed-sale prices in 2009. Market prices in this area most recently have tended to remain essentially flat—weak, but not declining anew, with some upward pressure from investors and bargain hunters (previously helped by tax credits before they expired). Double-Dip: No.

In Washington, DC, single-family home prices per square foot have been in a saw-tooth seasonal pattern, with two drops of 15-20% followed by sizable rebounds in spring sales prices. The current combined regular & REO average price is 17 percent below its peak but 13 percent above its trough, while the regular-sale average price is just 12 percent below the peak and 10 percent above its trough. Distressed sales have been comparatively low, but rose slowly to a peak of a little over 20 percent in 2010, with some slight improvement recently to the high teens. Single-family prices in DC have remained comparatively strong; however, more of the homes in DC are actually condos, and condo prices have not been quite as strong, with the market data showing mixed signals but with the average price per square foot remaining essentially flat. Double-Dip: No.

In the Miami area, the combined average home price per square foot fell by 48 percent peak to trough and is now just 1 percent above the 2009:Q2 trough. The regular-sale average price already experienced an earlier double-dip, falling by 32 percent to 2009:Q2, then stabilizing for a couple of quarters before falling another 9 percent relative to the peak; since 2010:Q3 this average has been choppy but basically flat, now 3 percent above that second trough. Prices in Miami have been among the weakest of the large metro areas, but average prices have been largely flat for the past year, without any sharp new double dip. Distressed sales as a percent of the total peaked at 53 percent in 2009:Q1, but then fell to a little under 30 percent by 2010:Q2; since then there has been some return to a higher distress share, in the mid to upper 30s (but all of these figures are about 10 percentage points lower for condos). New Double-Dip: No.
The Dallas area has seen some of the strongest prices in the nation. The combined price per square foot had an earlier peak and fell by 31 percent peak to trough, but it is now 33 percent above the trough. The regular-sale average price fell briefly by 22 percent peak to trough, but it has since risen by 32 percent from the 2009:Q1 trough and is now 3 percent above the peak. The increases have occurred in a saw-tooth seasonal pattern with spring prices the highest, but prices here have largely been rising considerably. Distressed sales as a percent of the total peaked at 22 percent in 2009:Q1 but have largely fallen since and now stand at just 11 percent. Double-Dip: No.

Here You Can See 47 More Examples of Where Double-Dips Are and Are Not:

- Pacific West
- Southwest
- Mountain West
- Midwest
- Northeast
- Mid Atlantic
- Southeast

To summarize this information and gain a little more insight into general area conditions for most homes and individuals in the U.S., we can add up the number of homes and the total population across the counties examined. To be sure, this information is not a rigorous random sample across homes, but I have tried to include and show the details of both stronger and weaker metro-area counties throughout the U.S. As shown in the tables below, the information used here covers 51 metro-area counties, encompassing over 15 million homes and a total population of nearly 75 million individuals(2). These results may be regarded as suggestive of findings from a more thoroughgoing study. Based on these reviews of the market price averages and other data, my assessment is that a little over half of the counties examined are not currently or recently experiencing a double-dip in home prices. Moreover, these counties, where home prices appear to be at least flat or relatively stronger, encompass almost two-thirds (65%) of the total affected U.S. population examined, and nearly three-fifths (58%) of the total properties covered by the data studied.

Conclusion

This is, on balance, good news. But there are remaining concerns. One is the continued high, or more recently rising, shares of distressed sales in many markets, and the “shadow inventory” of distressed sales now being held up in the current foreclosure pipeline. But it is also interesting to see that many of the reductions in the distressed-property shares of total sales in high-stress areas occurred before the foreclosure processing slowdowns. Another interesting observation is that most of the recent double-dips in prices have been relatively mild compared to the original peak-to-trough meltdown. While, to be sure, there are plenty of reasons to remain uncertain and cautious about U.S. home prices, home markets in general vary considerably, with significant elements of improvement and strength as well as continuing weaknesses. Despite many reports today about “the beleaguered housing market,” there really is no such thing … not unless the report is referring to a very specific local market. There definitely are double dips in many areas, and reasons for continuing overall concern. But the best available evidence suggests that there are actually double-dip markets (most of them relatively moderate), stable markets, and stronger markets, with the stable and stronger categories covering a majority of homes and individuals.
Note: In the next installment, we’ll look at some more granular micro-market data to explore in greater depth the extensive variety of home-price outcomes and market conditions in weak pockets and strong pockets across various local areas and home markets. This will highlight the importance of having very good information, at sub-county and even sub-zip-code levels, on local neighborhood home markets.

Source of home price and market information: Collateral Analytics HomePriceTrends. I thank Michael Sklarz for providing the extensive information for this report and for comments, and I thank Stacy Schulman for assistance with this posting.

__________________

(1) Based on analysis by Collateral Analytics, price per living square foot is a useful, simple “hedonic” measure which typically controls for around 70 percent or more of the changing characteristics in a housing stock and home sale mix. Patterns in home prices without dividing by the square footage are generally similar, but not always.

(2) The property inventory counts are from Collateral Analytics, while the population estimates are from the 2010 U.S. Census.
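As a rough illustration of the price-per-square-foot comparison used throughout this post (see footnote (1)), here is a minimal sketch. The sale records and field layout are hypothetical, and the calculation is a deliberate simplification of the far more sophisticated indexes Collateral Analytics produces.

```python
from statistics import median

# Hypothetical sale records: (sale price in $, living area in sq ft, distressed flag)
sales = [
    (420_000, 1_800, False),
    (510_000, 2_100, False),
    (265_000, 1_750, True),   # REO / short sale
    (389_000, 1_600, False),
    (240_000, 1_500, True),
]

def median_price_per_sqft(records):
    """Median dollars per square foot of living area for a set of sale records."""
    ratios = [price / sqft for price, sqft, _ in records]
    return median(ratios)

regular = [s for s in sales if not s[2]]
distressed = [s for s in sales if s[2]]

print(f"Regular sales:    ${median_price_per_sqft(regular):.0f}/sq ft")    # ~$243/sq ft
print(f"Distressed sales: ${median_price_per_sqft(distressed):.0f}/sq ft") # ~$156/sq ft
```

Tracking the regular-sale measure separately from the combined measure over time is the essence of the distinction drawn above between overall indexes and non-distressed price trends.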

Published: June 29, 2011 by Guest Contributor

This is the second in a three-part interview between Experian’s Tom Whitfield and Dr. Michael Turner, founder, president and CEO of the Policy and Economic Research Council (PERC)—a non-partisan, non-profit policy institute devoted to research, public education, and outreach on public and economic policy matters. Dr. Turner is a prominent expert on credit access, credit reporting and scoring, information policy, and economic development. Mr. Whitfield is the Director of Marketing for Experian’s Telecommunications, Energy and Cable practice. In this post Dr. Turner explains how full-file credit reporting actually benefits consumers and why many communications providers haven’t yet embraced it.

_____________________________

Why is full-file credit reporting good for communications customers?

Approximately 54 million Americans either have no credit report or have too little information in their credit reports to generate a credit score. Most of these “thin-file/no-file” persons are financially excluded, and many of them are media and communications customers. By having their payment data fully reported to a credit bureau and included in their credit reports, many will be able to access affordable sources of mainstream credit for the first time; others will be helped by repairing their damaged credit. In this way, consumers will save by not relying on high-cost lenders to meet their credit needs.

Why don’t providers embrace reporting like other major industries and lenders?

A major reason is inertia—providers haven’t done it before and are not sure how they would benefit from change. Just recently, PERC released a major study highlighting the business case for fully reporting customer payment data to one or more nationwide credit bureaus. This includes customer survey results, peer survey results and case studies. The results all point to tremendous upside from fully reporting payment data, with only manageable downsides, including external communications and regulators.

Misperceptions and misunderstandings

Another significant reason is regulator misperceptions and misunderstandings. State public service and public utility commissions (PSCs and PUCs) aren’t experts in credit reporting or the regulatory framework around credit-information sharing. Many mistakenly believe the data is unregulated and can be used for marketing. Not wanting to contribute to an increase in commercial mail and telemarketing calls, some regulators have a knee-jerk reaction when the topic of credit reporting is raised by an interested media, communications or utility company. PERC has been working to educate regulators and has had success in its outreach efforts. PERC can also be a resource to firms interested in full-file reporting in their direct communications with regulators.

Part 3: Wednesday, June 29

Next, in the concluding post of this interview with PERC founder, president and CEO Dr. Michael Turner, the doctor discusses mandatory credit-information sharing for communications companies, and the value of engaging and educating state regulators.

Agree, disagree or comment

Whether you agree with Dr. Turner’s assertions or not, we’d love to hear from you. So please, take a moment to share your thoughts about full-file credit reporting in the communications industry.

Published: June 27, 2011 by Guest Contributor
