All posts by Guest Contributor


By: Scott Rhode

This is the second of a three-part blog series on the residential solar market, looking at: 1) the history of solar technology, 2) current trends and financing mechanisms, and finally 3) overcoming market and regulatory challenges with Experian's help. Let's discuss the current trends in solar and, more importantly, the mechanisms used to finance solar in the US residential market.

As I discussed in the last blog, the growth in this space has been astronomical. To illustrate this growth, a recent article in The Washington Post by Matt McFarland highlighted that solar-related jobs are significantly outpacing the rest of the labor market in year-over-year growth. The article states that since 2010 the number of solar-related jobs in the US has doubled, bringing the total number of jobs in this industry to 173,807. While this is still small in comparison to other sectors of our economy, it underscores how much growth has occurred in a short amount of time.

So what is driving this explosive growth? There are a few factors to consider; however, in the residential solar market, financing is the main catalyst. As you might expect, there are a variety of financial products in the market, giving the consumer plenty of choices.

First, there are traditional loans like home improvement loans, home equity loans, or energy efficiency loans offered by a bank, credit union, or specialty finance company. For homeowners who do not want to secure the loan against their property, there are a variety of specialty lenders that will offer long-term, unsecured loans that only file a UCC against the panels themselves.
For these types of offerings, some specialty lenders will even have special credit plans for the 30% Solar Investment Tax Credit: the homeowner gets a deferred-interest plan with the expectation that, once they receive the tax credit from the federal government, they will pay off the special plan and all of the deferred interest will be waived. If the customer does not pay in full, the plan rolls into their regular loan plan and the customer faces a higher cost of financing.

Second, there is a lease product that offers little to zero down and a monthly payment that is less than the savings the homeowner will see on their utility bill. Of all the financing options, the lease has been the biggest driver of growth, since it offers an inexpensive, no-hassle way to get all the benefits of going solar without breaking the bank. What surprises most people unfamiliar with this concept is the term of the lease, which is typically 20 years. However, when you consider that most manufacturers warrant their panels for 25 years and many have a usable life of 40 years, this term does not seem all that unusual. The benefits of this program look something like this:

The homeowner has an average electric utility bill of $350 / month
A solar company quotes the customer a savings of $200 / month in the form of a net metering energy credit, so their bill after solar is now $150 / month
The lease payment for the installed solar array, metering equipment, and monitoring software is $150 / month
The homeowner's net saving is an average of $50 / month with nothing out of pocket
Over the life of the lease, energy prices will increase, which means more savings over time so long as any escalators in the contract do not exceed the increase in energy prices
The lessor "owns" the equipment and is responsible for maintenance, performance, and insurance

With this product comes complexity.
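The lease arithmetic in the example above can be sketched in a few lines of code. This is purely illustrative; the dollar figures are the ones from the example, not real quotes, and the function name is invented here.

```python
# Illustrative sketch of the solar-lease economics described above.
# The dollar figures are the example numbers from the post, not real quotes.

def monthly_solar_savings(utility_bill, net_metering_credit, lease_payment):
    """Return (bill after solar, net monthly savings) for a solar lease."""
    bill_after_solar = utility_bill - net_metering_credit  # $350 - $200 = $150
    total_monthly_cost = bill_after_solar + lease_payment  # $150 + $150 = $300
    net_savings = utility_bill - total_monthly_cost        # $350 - $300 = $50
    return bill_after_solar, net_savings

bill_after, savings = monthly_solar_savings(350, 200, 150)
print(bill_after, savings)  # 150 50
```

The homeowner's position improves as long as the net metering credit plus the avoided utility cost exceeds the lease payment; contract escalators shift that balance over the 20-year term.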
Many companies offering this program do not have the cash or the appetite to take on massive debt, so they partner with Tax Equity investors to make the transaction possible. Because of the 30% ITC and accelerated depreciation, the transaction is very favorable for a Tax Equity investor like Google, US Bank, or Bank of America Merrill Lynch. There are a number of structures they can use; however, the Sale-Leaseback structure is the easiest and most efficient way to fund the transaction. While this is not "known" to the end customer, it is important because the Tax Equity investor effectively owns the asset and has the final say in setting credit policy. The transaction does require that the developer have a stake as well; however, many developers go to the debt market for "back leverage" on their stake to reduce the impact on their balance sheet.

This complexity carries a cost: the cost of capital is higher than for most traditional loan products from established financial services firms. That said, the fact that the lease allows the customer to monetize the tax credit and accelerated depreciation in the amount financed balances out the higher cost of capital. In the next blog we will touch more on the challenges this product, in particular, faces in the market.

Last, but not least, there is another mechanism gaining popularity in the market: community solar. One of the obstacles of the lease and Tax Equity arrangement is that the lease is only available to single-family homeowners and, if they have multiple homes, only for the homeowner's primary residence. That means people who rent, own a condo, own a vacation home, or own a small business do not qualify for this type of lease. As a result, community solar has become a great option. With community solar, the panels are placed in an ideal location for maximum exposure to the sun, and they often produce 10-15% more power than panels on a rooftop.
Portions of this solar farm can be sold, rented, or sublet to consumers regardless of their living situation. As the panels produce electricity, the power is sold to the local utility, and the customer receives money from that utility in the form of a credit on their next bill. In this structure, the customer is not required to put money down in most cases, and they sign up for a specific term. Like a rooftop lease, this structure often has a Tax Equity investor that funds the project. Again, this allows the investor to take the 30% ITC and accelerated depreciation, which, in turn, gets monetized and lowers the cost of construction.

In the final installment of this blog series, I will discuss some of the challenges this market faces as the ITC expiration date approaches and the market matures. Leasing is driving the market, so if the ITC does not get renewed, the industry will need a plan to find other innovative ways to keep solar affordable so more consumers can realize the benefits of going solar.

Published: February 9, 2015 by Guest Contributor

This is the third post in a three-part series. Experian® is not a doctor. We don't even play one on TV. However, because of our unique business model and experience with a large number of data providers, we do know data governance. It is a part of our corporate DNA. Our experiences across our many client relationships give us unique insight into client needs and appropriate best practices. Note the qualifier — appropriate. Just as every patient is different in his or her genetic predispositions and lifestyle influences, every institution is somewhat unique, with its own business model and history. Nor does every institution have the same issues with data governance. Some institutions have stable growth in a defined footprint and a history of conservative audit procedures. Others have grown quickly through aggressive acquisition marketing plans and unique channels and via institution acquisition/merger, leading to multiple receivable systems and data acquisition and retention platforms. Experian has provided valuable services to both environments many times throughout the years.

As the regulatory landscape has evolved, lenders/service providers demand a higher level of hands-on experience and regulatory-facing credibility. Most recently, lenders have required assistance on issues driven by mandates coming from the Comprehensive Capital Analysis and Review (CCAR), Office of the Comptroller of the Currency (OCC) and Consumer Financial Protection Bureau (CFPB) bulletins and guidelines. Lenders are best served to begin their internal review of their data governance controls with a detailed individual attribute audit and documentation of findings. We have seen these reviews cover anywhere from fewer than 200 attributes to more than 1,000. Again, the lender/provider size, analytic sophistication, and legacy growth and IT issues will influence this scope.
The source and definition of each attribute and any calculation routines should be fully documented. The life cycle stage of attribute acquisition and usage also is identified, and the fair lending implications of using the attribute across the life cycle need to be considered and documented. As part of this comprehensive documentation, variances between the intended definition and the subsequent design and deployment are to be identified, and corrective action guidance must be considered and documented for follow-up.

Simultaneously, an assessment of the current risk governance policies, processes and documentation typically is undertaken. A third party frequently is leveraged in this review to ensure an objective perspective is maintained. This initiative usually is a series of exploratory reviews and a process and procedures assessment with the appropriate management team, risk teams, attribute design and development personnel, and finally business and end-user teams, as necessary. From these interviews and the review of available attribute-level documentation, documents depicting findings and a best-practices gap analysis are produced to clarify the findings and provide a hierarchy of need to guide the organization's next steps.

A more recent evolution in this data integrity ecosystem is the implication of leveraging a third party to house and manipulate data within client specifications. When data is collected or processed in "the cloud," consistent data definitions are needed to maintain data integrity and to limit operational costs related to data cleansing and cloud resource consumption. Maintaining the quality of customer personal data is a critical compliance and privacy principle. Another challenge is keeping cloud-stored data in synchronization with on-premises copies of the same data. Delegation to a third party does not discharge the organization from managing risk and compliance or from having to prove compliance to the appropriate authorities.
In summary, a lender/service provider must ensure it has developed a rigorous data governance ecosystem for all internal and external processes supporting data acquisition, retention, manipulation and utilization:

A secure infrastructure includes both physical and system-level access and control. Systemic audit and reporting are a must for basic compliance standards.
If data becomes corrupted, alternative storage, backup or other mechanisms should be available to protect the information. Comprehensive documentation must be developed to reveal the event, the causes and the corrective actions.
Data persistence may have multiple meanings, so it is imperative that the institution document the data definition. Changes to the data must be documented and frequently will lead to the creation of a new data attribute meeting the newer definition so that usage in models and analytics is communicated clearly. Issues of data persistence also include making backups and maintaining multiple archive copies.
Periodic audits must validate that data and usage conform to relevant laws, regulations, standards and industry best practices. Full audit details, files used and reports generated must be maintained for inspection. Periodic reporting of audit results up to the board level is recommended. Documentation of action plans and follow-up results is necessary to disclose implementation of adequate controls.
In the event of lost or stolen data, appropriate response plans and escalation paths should be in place for critical incidents.

Throughout this blog series, we have discussed the risks and benefits of an institution's data governance ecosystem. The external demands show no sign of abating. The regulators are not looking for areas to reduce their oversight. The institutional benefits of an effective data governance program are significant.
Discover how a proven partner with rich experience in data governance, such as Experian, can provide the support your company needs to ensure a rigorous data governance ecosystem. Do more than comply. Succeed with an effective data governance program.

Published: January 26, 2015 by Guest Contributor

By: Scott Rhode

This is the first of a three-part blog series on the residential solar market, looking at: 1) the history of solar technology, 2) current trends and financing mechanisms, and finally 3) overcoming market and regulatory challenges with Experian's help.

Most people tend to think of the solar industry as a recent, and not so stable, market phenomenon. In reality, the residential solar industry has been steadily gaining traction as component prices come down. For more than two thousand years, people have been trying to harness the sun's energy and power. In fact, architects and city planners in early civilizations would look to the sun when designing dwellings, buildings and bathhouses so that they could capture as much of the sun's energy as possible to heat their homes and water. Our ancestors knew that the sun, unlike any other resource, was a consistent and powerful source of energy that fueled life.

Fast forward to the late 19th and early 20th centuries, when renowned scientists in the US and across the globe started looking at ways to harness the sun's energy to generate electricity, and the modern solar industry was born. By the mid-1950s, US architects were trying to incorporate the power of the sun into their designs so that heating water and commercial office space could be done without heavy use of electricity. One architect, Frank Bridgers, was so successful in using this technology that his building still operates this way today. In addition, companies like Bell Labs, Western Electric, and the US Signal Corps Laboratories started to develop the photovoltaic cells that power the panels we use today. These early cells, operating at 7-11% efficiency (the measure of how efficiently a cell converts solar radiation into electricity), gave life to solar-powered electronics, lights, and the panels used by the burgeoning space program to power satellites orbiting Earth.
In reaction to the growing possibilities and the broader oil crisis of the late 1970s, the US Department of Energy created what would later become the National Renewable Energy Laboratory, enabling the federal government to use its resources to help grow the industry and foster technological innovations to improve cell efficiency.

Throughout the 1980s, '90s, and early 2000s, the industry took root with utilities and mainstream energy providers as they looked to the sun to diversify their energy sources away from coal, gas, and oil. This adoption led to a push by the US Department of Energy for "One Million Solar Roofs" in the US so that individual homeowners could realize the benefits of going solar. Soon, retailers like Home Depot started selling panels in their stores for customers to install themselves for "off-grid" properties or other uses. While this allowed a homeowner to use solar, costs were still so high that solar was only available to a select few and, as a result, not competitive with traditional methods of producing energy.

To incentivize homeowners to invest in solar, the US Government created the Solar Investment Tax Credit in 2005. This tax credit allows homeowners to claim a credit of 30% of the fair market value of the system installed on their roof. As a result of this and local incentives from municipalities and utility companies, residential solar installations have grown 1,600% over the last ten years, representing a CAGR of 76%. In fact, through the first half of 2014, 53% of all new electric capacity came from solar, making it the fastest growing source of energy in the market.* Since this tax incentive is unlikely to be renewed after it expires, the industry set out to solve the cost issue in order to manufacture and produce highly efficient and durable panels for individual consumers that could bring production costs down to parity with traditional power.
In this endeavor, manufacturers have poured significant resources into research and development, pushed their manufacturing processes toward ever higher levels of efficiency, and used the latest technology to significantly reduce costs, producing panels that now range from 18-23% cell efficiency. Since 2010 the average price of a panel has come down by 64%, and the industry continues to push to find ways to make solar more affordable. This is especially important given that the tax credit expires on December 31, 2016.

In the next blog in the series, I will talk about solar financing and current industry trends. Financing, as you would expect, has been and will continue to be critical to growth in this space so that more homeowners can afford to move to solar as their primary energy source. As such, the methods used to acquire, originate, and serve these customers must evolve in order for the industry to sustain the impressive growth rates mentioned earlier in this blog.

Published: January 22, 2015 by Guest Contributor

By: Mike Horrocks

Managing your portfolio can be a long and arduous process that ties up your internal resources, but without it you face additional risk and potential losses. The key is to use loan automation to pull together data in a meaningful manner and go from a reactive to a proactive process that can:

Address the overall risks and opportunities within your loan portfolio
Get a complete view of the credit and operational risk associated with a credit relationship or portfolio segment
Monitor and track actionable steps by leveraging both internal and external data

Watch how to avoid the 5 most common mistakes in loan portfolio management to help you not only reduce overall risk but also identify cross-sell and upsell opportunities. With a more automated process, your lending staff can focus on bringing in new business rather than reacting to delinquencies and following up on credit issues.

Published: January 22, 2015 by Guest Contributor

There are two sides to every coin, and in banking the question is often: do you want to chase the depositor of that coin, or lend it out? The Federal Reserve's decision to hold interest rates at record lows since the economic downturn gave U.S. banks' loan portfolios a nice boost from 2010-2011, but the subsequent actions and banking environment resulted in deposit growth outpacing loans – leading to a marked reduction in loan-to-deposit ratios across banks since 2011. In fact, there is currently almost $1.30 in deposits for every $1.00 in loans. This, in turn, has manifested itself as a reduction in net interest margins for all U.S. banks over the last three years – a situation unlikely to improve until the Fed hikes interest rates.

Additionally, banks have found that while they are now holding on to more of these deposits, additional regulations (the CFPB looking to evaluate account origination processes, Basel III liquidity concerns, CCAR, and CIP & KYC) have all made the burden of holding these deposits more costly. In fact, the CFPB suggests four items it believes will improve financial institutions' checking account screening policies and practices:

Increase the accuracy of data used from CRAs
Identify how institutions can incorporate risk screening tools while not excluding potential accountholders unnecessarily
Ensure consumers are aware and notified of information used to decision the account opening process
Ensure consumers are informed of what account options exist and how they can access products that align with their individual needs

Lastly, to add to this already challenging environment, technology has switched the channel of choice to the smartphone, introducing a barrage of risks associated with identity authentication – as well as operational opportunities.
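The $1.30-in-deposits-per-$1.00-in-loans figure above translates directly into the loan-to-deposit ratio banks watch. A minimal sketch (the function name and dollar amounts are illustrative; only the proportion comes from the post):

```python
# Illustrative sketch of the loan-to-deposit (LTD) ratio discussed above.
# Only the ~$1.30-per-$1.00 proportion comes from the post.

def loan_to_deposit_ratio(total_loans, total_deposits):
    """LTD ratio: the share of deposits that has been lent out."""
    return total_loans / total_deposits

# With $1.30 in deposits for every $1.00 in loans:
ltd = loan_to_deposit_ratio(1.00, 1.30)
print(f"{ltd:.0%}")  # 77%
```

An LTD around 77% means roughly a quarter of deposit funding is sitting unlent, which is the margin pressure the paragraph describes.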
As leaders in retail banking and in addressing the needs of your customers, I would like to extend an invitation on behalf of Experian for you to participate in our latest survey on the changing landscape of DDA opportunities. How are regulations changing your product set, what role does mobile play now and in the future, and what are your top priorities for 2015 and beyond? These are just a few of the insights we would like to gain from experts such as you.

To access our survey, please click here. Our brief survey should take no more than seven minutes to complete, and your insights will be highly valued as we look to better support you and your organization's demand deposit product needs. Our survey period will close in three weeks, so please respond now. As a sign of our appreciation for your insights, we will send all participants an anonymized aggregation of the responses so that you can see how others view the retail banking marketplace. So take advantage of this chance to learn from your peers, participate in this industry study, and don't leave your strategy to the flip of a coin.

Published: January 20, 2015 by Guest Contributor

This is the second post in a three-part series. Imagine the circumstances of a traveler coming to a never-before-visited culture. The opportunity is the new sights, cuisine and cultural experiences. Among the risks are never-before-encountered pathogens and the strength of the overall health services infrastructure. In a similar vein, all too frequently we see the following conflict within our client institutions. The internal demands of an ever-increasing competitive landscape drive businesses to seek more data; improved ease of accessibility and manipulation of data; and acceleration in creating new attributes supporting more complex analytic solutions. At the same time, requirements for good governance and heightened regulatory oversight are driving improved documentation and controlled access, all with improved monitoring and documented and tested controls. As always, the traveler/businessperson must respond to the environment, and the best medicine is to be well-informed of both the perils and the opportunities.

The good news is that we have seen many institutions invest significantly in their audit and compliance functions over the past several years. This has provided the lender with both better insights into its current risk ecosystem and the improved skill set to continue to refine those insights. The opportunity is for the lender to leverage this new strength. For many lenders, this investment largely has been in response to broadening regulatory oversight to ensure there are proper protocols in place to confirm adherence to relevant rules and regulations and to identify issues of disparate impact. A list of the more high-profile regulations would include:

Equal Credit Opportunity Act (ECOA) — to facilitate enforcement of fair lending laws and enable communities, governmental entities and creditors to identify business and community development needs and opportunities of women-owned, minority-owned and small businesses.
Home Mortgage Disclosure Act (HMDA) — to require mortgage lenders to collect and report additional data fields.
Truth in Lending Act (TILA) — to prohibit abusive or unfair lending practices that promote disparities among consumers of equal creditworthiness but of different race, ethnicity, gender or age.
Consumer Financial Protection Bureau (CFPB) — evolving rules and regulations with a focus on perceptions of fairness and value through transparency and consumer education.
Gramm-Leach-Bliley Act (GLBA) — requires companies to give consumers privacy notices that explain the institutions' information-sharing practices. In turn, consumers have the right to limit some, but not all, sharing of their information.
Fair Debt Collection Practices Act (FDCPA) — provides guidelines for collection agencies seeking to collect legitimate debts while providing protections and remedies for debtors.

Recently, most lenders have focused their audit/compliance activities on the analytics, models and policies used to treat consumer/client accounts/relationships. This focus is understandable, since it is these analytics and models that are central to the portfolio performance forecasts and Comprehensive Capital Analysis and Review (CCAR)–mandated stress-test exercises that have received greater emphasis in responding to recent regulatory demands. Thus far at many lenders, this same rigor has not yet been applied to the data itself, which is the core component of these policies and frequently complex analytics. The strength of both the individual consumer–level treatments and the portfolio-level forecasts is negatively impacted if the underlying data is compromised. This data/attribute usage ecosystem demands clarity and consistency in attribute definition; extraction; and new attribute design, implementation to models and treatments, validation and audit.
When a lender determines there is a need to enhance its data governance infrastructure, Experian® is a resource to be considered. Experian has this data governance discipline within our corporate DNA — and for good reason. Experian receives large and small files on a daily basis from tens of thousands of data providers. To be sure the data is of high quality and does not contaminate the legacy data, rigorous audits of each file received are conducted, and detailed reports are generated on issues of quality and exceptions. This information is shared with the data provider for a cycle of continuous improvement. To further enhance the predictive insights of the data, Experian then develops new attributes and complex analytics leveraging the base and developed attributes for analytic tools. This data and the analytic tools are then utilized by thousands of authorized users/lenders, who manage broad-ranging relationships with millions of individuals and small businesses. These individuals and businesses in turn have the right to dispute errors, both perceived and actual, with Experian. This demanding cycle underscores the value of the data and of our rigorous data governance infrastructure.

This very same process occurs at many lenders' sites. Certainly, a similar level of data integrity, born from a comprehensive data governance process, also is warranted. In the next and final blog in this series, we will explore how a disciplined business review of an institution's data governance process is conducted.

Discover how a proven partner with rich experience in data governance, such as Experian, can provide the support your company needs to ensure a rigorous data governance ecosystem. Do more than comply. Succeed with an effective data governance program.

Published: December 18, 2014 by Guest Contributor

Opening a new consumer checking account in the 21st century should be simple and easy for the customer to understand, right? Unfortunately, not all banks have 21st-century systems or processes, reflecting the fact that negotiable order of withdrawal (NOW) accounts, or checking accounts, were introduced decades ago, when financial institutions often required the consumer to open the account in person. A lot has changed, and consumers demand simpler, more transparent account opening processes with product choices that match their needs at a price they're willing to pay. Financial institutions that leverage modernized technology capabilities and relevant decision information have the best chance of delivering consumer-friendly experiences that meet consumer expectations. It is obvious to consumers when we in the financial services industry get it right and when we don't.

The process of opening a checking account should be easily understood by consumers and should offer appropriate product choices that aren't "one size fits all". Banks with more advanced core banking systems, incorporating relevant and compliant decision data and transparent, consumer-friendly approval processes, have a huge opportunity to differentiate themselves positively from competitors. The reality is that banking deposit management organizations throughout the United States continue to evolve their checking account screening strategies, technology and processes. This is done in an effort to keep up with evolving expectations from consumer advocacy regulatory bodies such as the Consumer Financial Protection Bureau (CFPB), and it is designed to improve the transparency of checking account screening for new accounts for an increased number of consumers.
The CFPB advocates that financial institutions adopt new checking account decision processes and procedures that maintain sound management practices for mitigating fraud and risk expense while improving consumer transparency and increasing access to basic consumer financial instruments. Bank shareholders demand that these accounts be extended to consumers profitably. The CFPB recognizes that checking accounts are a basic financial product used by almost all consumers, but it has expressed concerns that checking account screening processes may prevent access for some consumers and may be too opaque about the reasons a consumer may be denied an account. The gap between the expectations of the CFPB and shareholders and bank deposit management organizations' current products and procedures is not as wide as it may seem. The solution to closing the gap includes deploying a more holistic approach to checking account screening that utilizes 21st-century technology and decision capabilities. Core banking technology and checking products developed decades ago leave banks struggling to enact much-needed improvements for consumers.

The CFPB recognizes that many financial institutions rely on reports for checking account screening, provided by specialty consumer reporting agencies (CRAs), to decision approval for new customers. These CRAs specialize in checking account screening and provide financial institutions with consumer information that is helpful in determining whether a consumer should be approved. Information such as the consumer's check-writing and account history, including closed accounts or bounced checks, is an important factor in determining eligibility for the new account. Financial institutions are also allowed to screen consumers to assess whether they may be a credit risk when deciding whether to open a consumer checking account, because many consumers opt in to overdraft functionality attached to the checking account.
Richard Cordray, the CFPB Director, clarified the regulatory agency's position on how consumers are treated in checking account screening processes in his prepared remarks at a forum on this topic in October 2014: "The Consumer Bureau has three areas of concern. First, we are concerned about the information accuracy of these reports. Second, we are concerned about people's ability to access these reports and dispute any incorrect information they may find. Third, we are concerned about the ways in which these reports are being used."

The CFPB suggests four items it believes will improve financial institutions' checking account screening policies and practices:

Increase the accuracy of data used from CRAs
Identify how institutions can incorporate risk screening tools while not excluding potential accountholders unnecessarily
Ensure consumers are aware and notified of information used to decision the account opening process
Ensure consumers are informed of what account options exist and how they can access products that align with their individual needs

Implementing these steps shouldn't be too difficult for deposit management organizations, as long as they fully leverage software such as Experian's PowerCurve customized for deposit account origination and relevant decision information such as Experian's Precise ID Platform and VantageScore 3.0, combined with consumer product offerings developed within the bank and offered in an environment that is real-time where possible and considers the consumer's needs. Enhancing checking account screening procedures by taking into account consumers' life stage, affordability considerations, unique risk profile and financial needs will satisfy the expectations of consumers, regulators and financial institution shareholders.
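To make the "holistic, transparent screening" idea above concrete, here is a purely hypothetical sketch of a tiered account decision that records its reasons for the applicant. The thresholds, field names, and tiers are invented for illustration; they do not represent PowerCurve, Precise ID, VantageScore, or any real bank policy.

```python
# Hypothetical sketch of tiered checking-account decisioning with
# consumer-facing reasons. All thresholds and tiers are invented.

def screen_applicant(risk_score, prior_closed_for_cause, unpaid_items):
    """Return a decision string and the reasons behind it (for transparency)."""
    reasons = []
    if prior_closed_for_cause:
        reasons.append("prior account closed for cause")
    if unpaid_items > 0:
        reasons.append(f"{unpaid_items} unpaid item(s) on record")

    if risk_score >= 700 and not reasons:
        return "approve: full-featured account", reasons
    if risk_score >= 600:
        # Rather than a flat denial, offer a basic account matched to need.
        return "approve: basic account, no overdraft", reasons
    return "decline", reasons

decision, why = screen_applicant(risk_score=640,
                                 prior_closed_for_cause=False,
                                 unpaid_items=1)
print(decision, why)
```

The middle tier is the key design choice: instead of "approve or deny", the applicant is steered to a product matching their risk profile, and the recorded reasons address the CFPB's transparency concerns.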
Financial institutions that use technology and data wisely can reduce expenses by efficiently managing fraud, risk and operating costs within the checking account screening process while also delighting consumers.  Regulators, in turn, are satisfied when consumers are well served, and shareholders are satisfied when both regulators and consumers are.  Reengineering checking account opening processes for the modern age results in a win-win-win for consumers, regulators and financial institutions. Discover how an Experian Global Consultant can help you with your banking deposit management needs.

Published: December 12, 2014 by Guest Contributor

By: John Robertson Capital is the lifeblood of financial institutions and has come under closer scrutiny since the global credit crisis. How well one manages capital is driven primarily by how well one manages risk. Using economic capital to measure profitability enhances risk management efforts by providing a common indicator for risk. It supports pricing metrics such as RAROC (risk-adjusted return on capital) and economic value added, which include expected and unexpected losses and consequently broaden the evaluation of capital adequacy in relation to the bank's overall risk profile. The first accounts of economic capital date back to the ancient Phoenicians, who took rudimentary tallies of frequency and severity of illnesses among rural farmers to gain an intuition of expected losses in productivity. These calculations were advanced by correlations with predictions of climate change, political outbreak, and birth rate change. The primary value of economic capital is its application to decision-making and overall risk management. Economic capital is a measure of risk, not of capital held. It represents the amount of money needed to secure survival in a worst-case scenario; it is a buffer against unexpected shocks in market values. Economic capital measures risk using economic realities rather than accounting and regulatory rules, which can be misleading. The concept differs from regulatory capital in that regulatory capital is the mandatory capital regulators require to be maintained, while economic capital is the best estimate of required capital that financial institutions use internally to manage their own risk and to allocate the cost of maintaining regulatory capital among different units within the organization.
The allocation of economic capital to support credit risk begins with similar inputs to derive expected losses but considers other factors to determine unexpected losses, such as credit concentrations and default correlations among borrowers. Economic capital credit risk modeling measures the incremental risk a transaction adds to a portfolio rather than the absolute level of risk associated with an individual transaction. In a previous blog I restated a phrase I had heard long ago: "Margins will narrow forever." How well you manage your capital will help you extend "forever." Has your institution started using these types of risk measures? The Phoenicians did. Learn more about our credit risk solutions.
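As a worked illustration of the pricing metrics mentioned above, the sketch below computes RAROC and economic value added for a single hypothetical loan. All of the figures (exposure, revenue, costs, capital allocation, hurdle rate) are invented for the example.

```python
def raroc(revenue, funding_cost, operating_cost, expected_loss, economic_capital):
    """Risk-adjusted return on capital: risk-adjusted net income over economic capital."""
    risk_adjusted_income = revenue - funding_cost - operating_cost - expected_loss
    return risk_adjusted_income / economic_capital

def economic_value_added(risk_adjusted_income, economic_capital, hurdle_rate):
    """EVA: income earned above the shareholder-required return on allocated capital."""
    return risk_adjusted_income - hurdle_rate * economic_capital

# Hypothetical $1M loan: $60K revenue, $25K cost of funds, $15K operating
# expense, $4K expected loss, and $80K of allocated economic capital.
income = 60_000 - 25_000 - 15_000 - 4_000            # 16,000 risk-adjusted income
print(raroc(60_000, 25_000, 15_000, 4_000, 80_000))  # 0.20, i.e., 20%
print(economic_value_added(income, 80_000, 0.15))    # 4,000 above a 15% hurdle
```

Note that expected loss is deducted from income (it is priced), while unexpected loss appears only through the economic capital in the denominator, which is exactly the buffer role described above.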

Published: December 2, 2014 by Guest Contributor

This is the first post in a three-part series. You’ve probably heard the adage “There is a little poison in every medication,” which typically is attributed to Paracelsus (1493–1541), the father of toxicology. The trick, of course, is to prescribe the correct balance of agents to improve the patient while doing the least harm. One might think of data governance in a similar manner. A well-disciplined and well-executed data governance regimen provides significant improvements to the organization. By the same token, an overly restrictive, poorly designed or ineffectively monitored data governance ecosystem can result in significant harm: less-than-optimal models and scorecards, inaccurate reporting, imprecise portfolio outcome forecasts and poor regulatory reports, ultimately resulting in significant remediation investment and loss of reputation. In this blog series, we will address the issues and best practices associated with the broad mandate of data governance. In its simplest definition, data governance is the management of the availability, usability, integrity and security of the data employed in an enterprise. A sound data governance program includes a governing body or council, a defined set of procedures and a plan to execute those procedures. Upon quick reflection, then, effective data governance is not simple at all. After all, data is ubiquitous, is becoming more available, encompasses aspects of our digital lives not envisioned as little as 15 years ago and is constantly changing as people’s behavior changes. To add another level of complexity, regulatory oversight is becoming more pervasive as regulations passed since the Great Recession have become more intrusive, granular and demanding. When addressing issues of data governance, lenders, service providers and insurers find themselves juggling a wide range of concerns.  Some of these are time-tested best practices, while others previously were never considered.
Here is a reasonable checklist of data governance concerns to consider:

- Who owns the data governance responsibility within the organization?
- Is the data governance group seen as an impediment to change, or is it a ready part of the change management culture?
- Is the backup and retrieval discipline — redundancy and recovery — well-planned and periodically tested?
- How agile/flexible is the governance structure with respect to new data sources?
- How does the governance structure document and reconcile similar data across multiple providers?
- Are there appropriate and documented approvals and consents from the data provider(s) for all disclosures?
- Are systemic access and modification controls and reporting fully deployed and monitored for periodic refinement?
- Does the monitoring of data integrity, persistence and entitled access enable a quick-fix culture where issues are identified and resolved at the source of the problem rather than settled by downstream processes?
- Are all data sources, including those that are proprietary, fully documented and subject to systemic accuracy/integrity reporting?
- Once obtained, how is the data stored and protected, in both definition and accessibility?
- How do we alter data and leverage the modified outcome?
- Are there reasonable audits and tracking of downstream reporting?
- In the event of a data breach, does the organization have well-documented protocols and notification thresholds in place?
- How recently, and to what extent, have all data retrieval, manipulation, usage and protection policies and processes been audited?
- Are there scheduled and periodic reports made to the institution's board on issues of data governance?

Certainly, many institutions have most of these aspects covered. However, “most” is imprecise medicine, and ill effects are certain to follow. As Paracelsus stated, “The doctor can have a stronger impact on the patient than any drug.” As in medical services, for data governance initiatives those impacts can be beneficial or harmful.
In our next blog, we’ll discuss observations of client data governance gaps and lessons learned in evaluating the existing data governance ecosystem. Make sure to read Compliance as a Differentiator perspective paper for deeper insight on regulations affecting financial institutions and how you can prepare your business. Discover how a proven partner with rich experience in data governance, such as Experian, can provide the support your company needs to ensure a rigorous data governance ecosystem. Do more than comply. Succeed with an effective data governance program.  

Published: November 11, 2014 by Guest Contributor

By: Ori Eisen This article originally appeared on WIRED. When I started 41st Parameter more than a decade ago, I had a sense of what fraud was all about. I’d spent several years dealing with fraud while at VeriSign and American Express. As I considered the problem, I realized that fraud was something that could never be fully prevented. It’s a dispiriting thing to accept that committed criminals will always find some way to get through even the toughest defenses. Dispiriting, but not defeating. The reason I chose to dedicate my life to stopping online fraud is that I saw where the money was going. Once you follow the money and you see how it is used, you can’t “un-know.” The money ends up supporting criminal activities around the globe – not used to buy grandma a gift. Over the past 10 years the nature of fraud has become more sophisticated and systematized. Gone are the days of the lone wolf hacker seeing what they could get away with. Today, those days seem almost simple. Not that I should be saying it, but fraud and the people who perpetrated it had a cavalier air about them, a bravado. It was as if they were saying, in the words of my good friend Frank Abagnale, “catch me if you can.” They learned to mimic the behaviors and clone the devices of legitimate users. This allowed them to have a field day, attacking all sorts of businesses and siphoning away their ill-gotten gains. We learned too. We learned to look hard and close at the devices that attempted to access an account. We looked at things that no one knew could be seen. We learned to recognize all of the little parameters that together represented a device. We learned to notice when even one of them was off. The days of those early fraudsters have faded. New forces are at work to perpetrate fraud on an industrial scale. Criminal enterprises have arisen. Specializations have emerged.
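The device-recognition idea described above (noticing "when even one of them was off") can be sketched as a weighted comparison of device parameters against a known profile. The attributes, weights and threshold below are entirely hypothetical, not 41st Parameter's actual model; they only illustrate how a single inconsistent parameter lowers a match score.

```python
# Known device profile captured at enrollment (values are illustrative).
KNOWN_DEVICE = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "timezone_offset": -480,
    "screen": "1920x1080",
    "language": "en-US",
    "fonts_hash": "a41c9d",
}

# Hypothetical weights: harder-to-spoof attributes count more.
WEIGHTS = {"user_agent": 0.2, "timezone_offset": 0.3,
           "screen": 0.1, "language": 0.1, "fonts_hash": 0.3}

def device_match_score(observed: dict) -> float:
    """Weighted share of attributes matching the known profile (1.0 = identical)."""
    return sum(w for attr, w in WEIGHTS.items()
               if observed.get(attr) == KNOWN_DEVICE[attr])

def is_suspicious(observed: dict, threshold: float = 0.7) -> bool:
    # A low match score suggests a cloned or spoofed device.
    return device_match_score(observed) < threshold
```

A fraudster who clones the user agent and language but sits in a different time zone with a different installed-font set scores well below the threshold, even though most visible parameters match.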
Brute force attacks, social engineering, sophisticated malware – all these tools, and so many more – are being applied every day to cracking various security systems. The criminal underworld is awash in credentials, which are being used to create accounts, take over accounts and commit fraudulent transactions. The impact is massive. Every year, billions of dollars are lost due to cyber crime. Aside from the direct monetary losses, customers lose faith in brands and businesses, resources must be allocated to reviewing suspect transactions, and creativity and energy are squandered chasing down new risks and threats. To make life just a little simpler, I operate from the assumption that every account, every user name and every password has been compromised. As I said at the start, fraud isn’t something that can be prevented. By hook or by crook (and mainly by crook), fraudsters are finding cracks they can slip through; it’s bound to happen. By watching carefully, we can see when they slip up and stop them from getting away with their intended crimes. If the earliest days of fraud saw impacts on individuals, and fraud today is impacting enterprises, the future of fraud is far more sinister. We’re already seeing hints of fraud’s dark future. Stories are swirling around the recent Wall Street hack. The President and his security team were watching warily, wondering if this was the result of a state-sponsored activity. Rather than just hurting businesses or their customers, we’re on the brink (if we haven’t crossed it already) of fraud being used to destabilize economies. If that doesn’t keep you up at night I don’t know what will. Think about it: in less than a decade we have gone from fraud being an isolated irritant (not that it wasn’t a problem) to being viewed as a potential, if clandestine, weapon. The stakes are no longer the funds in an account or even the well-being of a business. Today – and certainly tomorrow – the stakes will be higher.
Fraudsters, terrorists really, will look for ways to nudge economies toward the abyss. Sadly, the ability of fraudsters to infiltrate legitimate accounts and networks will never be fully stifled. The options available to them are just too broad for every hole to be plugged. What we can do is recognize when they’ve made it through our defenses and prevent them from taking action. It’s the same approach we’ve always had: they may get in while we do everything possible to prevent them from doing harm. In an ideal world bad guys would never get through in the first place; but we don’t live in an ideal world. In the real world they’re going to get in. Knowing this isn’t easy. It isn’t comforting or comfortable. But in the real world there are real actions we can take to protect the things that matter – your money, your data and your sense of security. We learned how to fight fraud in the past, we are fighting it with new technologies today and we will continue to apply insights and new approaches to protect our future. Download our Perspective Paper to learn about a number of factors that are contributing to the evolving fraud landscape.

Published: November 3, 2014 by Guest Contributor

By: John Robertson I began this blog series by asking the question “How can banks offer such low rates?” and exploring the relationships that drive loan pricing in the current rate environment. I outlined a simplistic view of loan pricing as:

   Interest Income
 + Non-Interest Income
 - Cost of Funds
 - Non-Interest Expense
 - Risk Expense
 = Income before Tax

Along those lines, I noted how perplexing it is to think that, at some of these current levels, banks could possibly make any money. I suggested these offerings must be loss leaders, with the anticipation of more business in the future or, possibly, additional deposits to maintain a hold on the relationship over time. Or, I shudder to think, banks could be short-funding the loans with the excess cash on their balance sheets. I did stumble across another possibility while proving out an old theory, which was very revealing. The old theory, stated by a professor many years ago, was “Margins will continue to narrow… forever.” We’ve certainly seen that in the consumer world. In pursuit of proof of this theory I went to the trusty UBPR and looked at net interest margin results from 2011 until today for two peer groups (insured commercial banks from $300 million to $1 billion, and insured commercial banks greater than $3 billion). What I found was that margins have in fact narrowed anywhere from 10 to 20 basis points for those two groups during that span, even though non-interest expense stayed relatively flat. Not wanting to stop there, I started looking at one of the biggest players individually and found an interesting difference in their C&I portfolio. Their non-interest expense number was comparable to the others, as was their cost of funds, but the swing component was non-interest income.  One line item on the UBPR’s income statement is Overhead (i.e., non-interest expense) minus non-interest income (NII). This bank had a strategic advantage when pricing their loans due to their fee income generation capabilities.
They are not just looking at spread but at contribution as well, to ensure they meet their stated goals. So why do banks hesitate to ask for a fee if a customer wants a certain rate? Someone seems to have figured it out. Your thoughts?
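The simplistic pricing view above translates directly into code. The sketch below uses hypothetical loan figures, but it shows how fee income swings the bottom line even when spread, overhead and risk expense are identical, which is the advantage the fee-generating bank enjoyed.

```python
def loan_pretax_income(interest_income, noninterest_income, cost_of_funds,
                       noninterest_expense, risk_expense):
    """Income before tax, per the simplistic loan-pricing view in the post."""
    return (interest_income + noninterest_income
            - cost_of_funds - noninterest_expense - risk_expense)

# Hypothetical $1M C&I loan: 4.0% coupon, 2.0% cost of funds,
# 1.0% overhead, 0.5% risk expense - with and without a $7,500 fee.
no_fee   = loan_pretax_income(40_000, 0,     20_000, 10_000, 5_000)  # 5,000
with_fee = loan_pretax_income(40_000, 7_500, 20_000, 10_000, 5_000)  # 12,500
print(no_fee, with_fee)
```

With identical spread economics, the fee more than doubles pre-tax income on this example, which is why contribution, not just spread, belongs in the pricing decision.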

Published: October 30, 2014 by Guest Contributor

By: Mike Horrocks I am at the Risk Management Association’s annual conference in DC, and I feel like I am back where my banking career began.  One of the key topics here is how important the risk rating grade is, and what impact a right or wrong risk rating grade can have on the bank. It is amazing to me how a risk rating is often a shot in the dark at some institutions, or can vary with the training of one risk manager versus another.  For example, you could have a commercial credit with fantastic debt service coverage tied to a terrible piece of collateral, and the assigned risk rating grade will range anywhere from a prime-type credit (cash flow is king and the loan will never default, so why concern ourselves with collateral?) to low subprime (do we really want that kind of collateral dragging us down or in our OREO portfolio?), to anywhere in between. Banks need to define the attributes of each risk rating grade and apply them consistently.  Failing to do so will lead to that poor risk rating grade distorting ALLL calculations (either over-allocating or under-allocating) and then rolling into loan pricing (making you more costly than the risk warrants, or not costly enough). The other thing I hear consistently is “we don’t have the right solutions or resources to complete a project like this.”  Fortunately, there is help.  A bank should never feel it has to do this alone.  I recall how it was all hands on deck when I first set out to get the right loan grading and loan pricing in place at the first super-regional bank I worked at – and that was without all the compliance pressure of today. So take a pause and look at your loan grading approach: is it passing or failing your needs? If it is not passing, take some time to read up on the topic, perhaps find a tutor (or business partner you can trust) and form a study group of your best bankers.
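One way to take the shot-in-the-dark out of grading is an explicit dual risk rating: grade cash flow and collateral separately, then combine them through a published matrix, so the strong-cash-flow/terrible-collateral credit in the example always lands on the same grade no matter who rates it. The scale, cutoffs and matrix below are purely illustrative, not a regulatory standard.

```python
def cash_flow_grade(dscr: float) -> int:
    """1 (strong) .. 5 (weak), from the debt service coverage ratio."""
    if dscr >= 2.0: return 1
    if dscr >= 1.5: return 2
    if dscr >= 1.25: return 3
    if dscr >= 1.0: return 4
    return 5

def collateral_grade(ltv: float) -> int:
    """1 (strong) .. 3 (weak), from loan-to-value."""
    if ltv <= 0.5: return 1
    if ltv <= 0.8: return 2
    return 3

# Final grade matrix: rows = cash-flow grade, columns = collateral grade.
MATRIX = [
    [1, 2, 3],
    [2, 2, 3],
    [3, 3, 4],
    [4, 4, 5],
    [5, 5, 5],
]

def risk_rating(dscr: float, ltv: float) -> int:
    return MATRIX[cash_flow_grade(dscr) - 1][collateral_grade(ltv) - 1]
```

Under this sketch, fantastic coverage (DSCR 2.5x) against terrible collateral (95% LTV) always grades a 3: a defined middle grade, rather than anything from prime to subprime depending on who happens to rate it.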
This is one grade that needs to be at the top of the class.  Looking forward to more from RMA 2014!

Published: October 28, 2014 by Guest Contributor

By: Joel Pruis I have just completed the first of two presentations on model risk governance at the RMA Annual Conference.  The focus of the presentation was compliance with the model risk governance guidance at smaller asset-sized financial institutions.  The big theme across all of the attendees at the first session was the need for resources to execute on model risk governance.  Such resources are scarce at smaller institutions, forcing reliance on external vendors to assist in the development and ongoing validation of any models in use. With that said, the one area that cannot be outsourced is the financial institution's own model risk governance responsibility.  While resources are few, look for existing roles within the organization that can support model risk governance, such as:

- Internal Audit: reviewing process, inputs and consistency
- Loan Review: accuracy, consistency, thresholds, etc.
- Compliance: data usage, pricing consistency, etc.

Start gathering your governance team at your organization and begin the effort around model risk governance! Discover how an Experian business consultant can help with your model risk governance strategies and processes. Also, if you are interested in gaining deeper insight on regulations affecting financial institutions and how to prepare your business, download Experian’s Compliance as a Differentiator perspective paper.

Published: October 27, 2014 by Guest Contributor

This is the second of a two-part blog about the state of auto lending in the U.S., where auto lending has been surging.  The previous blog looked at origination trends and noted the attention auto lending has received from banking regulators and in the media.  Those critical of auto lending have noted that, since 2009, non-prime originations have posted a larger growth rate than prime originations.  This is not unexpected: in the trough of a recession, lending to non-prime customers is drastically curtailed, so coming out of a recession, non-prime origination tends to grow quickly. Credit card trends are an excellent example of this tendency: from 2009 to today, the number of non-prime accounts originated has grown almost twice as fast as prime accounts (123% versus 65%).   When comparing growth of auto loan originations, we believe 2006 is a more appropriate reference point.  Prime and super-prime origination amounts have grown faster since that period than non-prime originations, and today auto originations have a lower proportion of non-prime commitments than in the period prior to the recession. In this blog, we again examine auto loan and lease trends using Experian IntelliView data, this time to investigate outstanding balances and performance.  IntelliView is a quarterly update of U.S. lending trends based on credit bureau data, including originations, outstanding loans and lines, and credit performance trends, segmented by product and other characteristics.

Auto Loan and Lease Outstandings and Performance

Growth of outstanding balances is based on a number of factors, such as acquisition volume, maturity term (for loans), utilization (for lines), account attrition and prepayment. Slide 3 shows that auto loan/lease outstandings are presently 25% above 2006 amounts.
First mortgages are 14% above their 2006 amount, and bankcard balances have only just recovered to their 2006 total. As shown in Slide 4, at the present rate of growth, auto loans and leases, now at roughly $900 billion, will soon cross $1 trillion in outstandings.  Auto balances already exceeded second mortgage line and loan balances more than a year ago. Only first mortgages, at more than $8 trillion in outstandings (and student lending), have outstanding balances higher than auto loans and leases. With the shift of GMAC to Ally Bank, Captive Auto companies lost their top share of outstandings to Banks in 2009. Since 2006, Finance company balances have more than doubled and Credit Union balances have grown nearly 49%.  Outstandings for all credit grades have increased since 2006. Slide 5 shows super-prime paper outstandings are up 31.3% and prime is up 28.0%.  Near-prime outstandings are up 20.8% and subprime outstandings are up 24.6%.  Deep-subprime outstandings are up 34.0%, and almost all of the growth in deep-subprime can be attributed to Finance companies.  Obviously, there is movement among credit grades: a customer acquired as super-prime may eventually encounter hardship, stop paying their obligations and reach a deep-subprime grade.  This would be infrequent, and it would be rarer still to move from deep-subprime to super-prime during the course of a loan.  Slide 6 shows a distribution of outstanding balances by credit grade for each type of financial institution as of 2014-Q2 (APRs of each segment are also shown.) Slide 7 shows the distribution of Bank outstanding balances over time.  The proportion of prime and super-prime balances has increased, subprime and deep-subprime balances have declined, and near-prime outstandings have remained steady. The risk profile of Bank auto loan/lease portfolios is actually much better than prior to the recession. Slide 8 shows a distribution of outstanding balances for Finance companies.
Super-prime balances are twice the size they were in 2006, and prime balances are 33% higher.  The proportions of subprime and near-prime outstandings are lower, and deep-subprime balances are about the same.  Once again, the quality of the portfolio among Finance companies is better than it was heading into the recession. Slide 9 shows the auto loan/lease delinquency rate trend.  All levels of delinquency peaked in 2008-Q4.  After a long decline, delinquency rates have remained fairly steady for the last two years.  The 30-59 day rate (and therefore the 30+ day delinquency rate) appears volatile, but all levels of delinquency (the 30-59 day rate in particular) follow a seasonal pattern: delinquency is higher in the 3rd and 4th quarters of the year and lower in the 1st and 2nd quarters. Slides 10, 11 and 12 show delinquency rates by financial institution type.  These charts clearly show that Finance company delinquencies have grown in the last year.  As noted earlier, credit grades are dynamic; nevertheless, they perform with relative consistency. Accounts classified as super-prime have very little 30-59 day delinquency (an average of 0.10%), and deep-subprime accounts have a very high rate (an average of 39%).  This is true across all financial institution types.  The 60-89 day delinquency rates for deep-subprime range from 13.12% to 18.08%, with an average of 15.73%, and 90+ day delinquency ranges between 5.67% and 9.79%, with an average of 8.20%.  However, performance of deep-subprime credit has deteriorated in the last year for Finance companies, particularly the 60-89 day and 90+ day rates, which are closer to the higher end of the range than the average.  Some of this may be due to vintage, as Finance company deep-subprime outstandings tend to be younger accounts, and some may be due to collection issues at specific companies.
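Grade-level rates only matter in proportion to the balances behind them. The sketch below blends grade-level 30-59 day rates into a portfolio-level rate; the super-prime (0.10%) and deep-subprime (39%) averages come from the discussion above, while the middle-grade rates and the balance mix are hypothetical.

```python
# 30-59 day delinquency rate by grade (super-prime and deep-subprime
# averages from the text; the middle grades are illustrative).
RATE_30_59 = {"super": 0.0010, "prime": 0.005, "near": 0.02,
              "sub": 0.08, "deep_sub": 0.39}

# Hypothetical share of outstanding balances in each grade.
MIX = {"super": 0.30, "prime": 0.35, "near": 0.15, "sub": 0.15, "deep_sub": 0.05}

def portfolio_rate(rates: dict, mix: dict) -> float:
    """Balance-weighted average delinquency rate across grades."""
    assert abs(sum(mix.values()) - 1.0) < 1e-9, "mix must sum to 100%"
    return sum(rates[g] * mix[g] for g in rates)

blended = portfolio_rate(RATE_30_59, MIX)
```

Even a thin 5% deep-subprime slice moves the blended rate materially, which is why a shifting balance mix (e.g., Finance company growth in deep-subprime) shows up quickly in portfolio-level delinquency.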
Continued examination of these trends is necessary over the next few quarters to see if Finance company delinquencies return to more normal levels. Concerns over potential problems due to growth in near-prime and subprime auto leases and loans are overstated: the proportion of originations in these groups is lower today than in the period before the recession, and the risk profile of auto lease and loan portfolios is much improved. The up-turn in delinquencies among Finance company portfolios is an issue we will continue to monitor. Learn more about what Experian IntelliView can do for you.

Published: October 22, 2014 by Guest Contributor

By: Joel Pruis When the OCC put forth its supervisory guidance on model risk governance, the big focus in the industry was on the larger financial institutions that had created their own risk models.  The overall intent was to make sure that the larger financial institutions were properly managing the risk they were assuming through the use of the custom risk models they had developed.  While we can’t say that model risk governance was a significant issue, the guidance provided by the OCC is intended to give financial institutions the minimum requirements for model risk governance. Now that the OCC and the Federal Reserve have gone through model risk governance reviews for the largest financial institutions in the U.S., their attention has turned to the rest of the group.  While you may not have developed your own custom scorecard model, you may be using a generic scorecard model to support your credit decisions, for loan origination and/or portfolio management.  As a result of using even generic scorecards and models, you have obligations for model risk governance as stated in the guidance.  While you may not be basing any decisions strictly on a score alone, the questions you have to ask yourself are:

- Does my credit policy or underwriting guidelines reference the use of a score in my decision process?
- While I may not be doing any type of auto-decisioning, do I restrict any credit authority based upon a score?
- Do I adjust any thresholds/underwriting guidelines based upon the score that is returned?  For example, do I allow a higher debt-to-income ratio if the score is above a certain level?
- How long have I been using a score in decision processes where it may have become a significant influence on how credit is decisioned?

As you can see from the questions above, the guidance covers a significant population of the financial institutions in the U.S.
As a result, some of the basic components that your financial institution must demonstrate it has done (or will do) are:

- Recent validation of the scorecard against your portfolio performance
- Demonstration of appropriate policy governing the use of credit risk models per the regulation
- Independence around the authority and review of the model risk governance and validations
- Proper support and documentation from your generic scorecard provider per the guidance

If you would like to learn more on this topic, please join me at the upcoming RMA Annual Risk Management Conference, where I will be speaking on Model Validation for Community Banks on Monday, Oct. 27, 9:30 a.m. – 10:30 a.m. or 11 a.m. – 12 p.m. Also, if you are interested in gaining deeper insight on regulations affecting financial institutions and how to prepare your business, download Experian’s Compliance as a Differentiator perspective paper.
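One concrete piece of "recent validation of the scorecard against your portfolio performance" is a population stability check. The sketch below computes a Population Stability Index (PSI) over score bands; the band mixes and rule-of-thumb thresholds are illustrative, and a full validation under the guidance would also cover discrimination (e.g., KS, Gini) and calibration.

```python
import math

def psi(expected_pct, actual_pct):
    """Population Stability Index between the development-sample score-band
    mix (expected) and the current portfolio's mix (actual)."""
    return sum((a - e) * math.log(a / e)
               for e, a in zip(expected_pct, actual_pct) if e > 0 and a > 0)

dev_mix = [0.10, 0.20, 0.40, 0.20, 0.10]  # share of accounts per band at development
cur_mix = [0.12, 0.22, 0.38, 0.18, 0.10]  # share per band in the current book

shift = psi(dev_mix, cur_mix)
print(round(shift, 4))  # -> 0.0087
# Common rule of thumb: < 0.10 stable, 0.10-0.25 monitor, > 0.25 investigate
```

A drifting PSI does not prove the score is broken, but it is exactly the kind of documented, repeatable evidence an examiner will expect to see for a generic scorecard in active use.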

Published: October 20, 2014 by Guest Contributor
