Preparation is key – whether it's amateur or professional sports, free-soloing up El Capitan, or business contingency planning as part of a recession readiness strategy. It's not so much predicting when events will occur, or trying to foresee and pivot for every possible outcome, but rather acting now so that your business can act faster and smarter in the future. Certain priorities have come to be associated with each of the three environments the economy can sustain at any one time: pro-cycle, counter-cycle and cycle-neutral. As with recessions throughout the country's history, counter-cycle periods have often been characterized by layoffs, charge-offs, delinquencies and other behaviors as the economy turns. Rather than wait to implement reactive strategies, the time to manage accounts, plan, stress test and implement contingency plans for the next economic correction is now. While economists and financial services industry experts argue over when a recession will hit and how severe its implications may be (in comparison with the Great Recession of 2008), there's a need to start tactical business discussions now. Even in the face of a strong economy that has seen high employment levels and increased spending, 45% of Americans (112.5 million) say they do not have enough savings to cover at least three months of living expenses, according to a 2018 survey by the Center for Financial Services Innovation. Regardless of the economic environment – pro-cycle, counter-cycle or cycle-neutral – those statistics paint an alarming picture of consumers' financial health as a whole. These are four crucial considerations you should be acting on now:
Create individualized treatments while reducing manual interactions
Meet the growing expectation for digital consumer self-service
Understand your customer to ensure fair treatment
React quickly and effectively to market changes
While a recession may not be on the immediate horizon just yet, it's important to prepare. For more information, including portfolio mixes, collections considerations and macroeconomic trends, download our latest white paper on recession readiness. Download white paper now
Alex Lintner, Group President at Experian, recently had the chance to sit down with Peter Renton, creator of the Lend Academy Podcast, to discuss alternative credit data,1 UltraFICO, Experian Boost and expanding the credit universe. Lintner spoke about why Experian is determined to be the leader in bringing alternative credit data to the forefront of the lending marketplace to drive greater access to credit for consumers. "To move the tens of millions of 'invisible' or 'thin file' consumers into the financial mainstream will take innovation, and alternative data is one of the ways which we can do that," said Lintner. Many U.S. consumers do not have a credit history or enough record of borrowing to establish a credit score, making it difficult for them to obtain credit from mainstream financial institutions. To ease access to credit for these consumers, financial institutions have sought ways to both extend and improve the methods by which they evaluate borrowers' risk. By leveraging machine learning and alternative data products, like Experian Boost™, lenders can get a more complete view into a consumer's creditworthiness, allowing them to make better decisions and consumers to more easily access financial opportunities. Highlights include:
The impact of Experian Boost on consumers' credit scores
Experian's take on the state of the American consumer today
Leveraging machine learning in the development of credit scores
Expanding the marketable universe
Listen now
Learn more about alternative credit data
1 When we refer to "Alternative Credit Data," this refers to the use of alternative data and its appropriate use in consumer credit lending decisions, as regulated by the Fair Credit Reporting Act. Hence, the term "Expanded FCRA Data" may also apply in this instance and both can be used interchangeably.
You've Got Mail! Probably a lot of it. Birthday cards from Mom, a graduation announcement from your third cousin's kid whose name you can't remember and a postcard from your dentist reminding you you're overdue for a cleaning. Adding to your pile are the nearly 850 pieces of unsolicited mail Americans receive annually, according to Reader's Digest. Many of these are pre-approval offers or invitations to apply for credit cards or personal loans. While many of these offers are getting to the right mailbox, they're hitting a changing consumer at the wrong time. The digital revolution, along with the proliferation and availability of technology, has empowered consumers. They now not only have access to an abundance of choices but also a litany of new tools and channels, which results in them making faster, sometimes subconscious, decisions.
Three Months Too Late
The need to consistently stay in front of customers and prospects with the right message at the right time has caused a shortening of campaign cycles across industries. However, for some financial institutions, the customer acquisition process can take up to 120 days! While this timeframe is extreme, customer prospecting can still take around 45-60 days for most financial institutions and includes:
Bureau processing: Regularly takes 10-15 days, depending on the number of data sources, each time they are requested from a bureau.
Data aggregation: Typically takes anywhere from 20-30 days.
Targeting and selection: Generally takes two to five days.
Processing and campaign deployment: Usually takes anywhere from three days, if the firm handles it internally, to 10 days if an outside company handles the mailing.
A Better Way
That means for many firms, the data their customer acquisition campaigns are based on is at least 60 days old. Often, they are now dealing with a completely different consumer. With new card originations up 20% year-over-year in 2019 alone, it's likely they've moved on, perhaps to one of your competitors. It's time financial institutions made the move to a more modern form of prospecting and targeting that leverages the power of cloud technology, machine learning and artificial intelligence to accelerate and improve the marketing process. Financial marketing systems of the future will allow for advanced segmentation and targeting, dynamic campaign design and immediate deployment, all based on the freshest data (no more than 24-48 hours old). These systems will allow firms to do ongoing analytics and modeling so their campaign testing and learning results can immediately influence next-cycle decisions. Your customers are changing; isn't it time the way you market to them changes as well?
Earlier this month, Experian joined FinovateSpring in San Francisco, CA to demonstrate innovations impacting financial health to over 1,000 attendees. The Finovate conference promotes real-world solutions while highlighting short-form demos and key insights from thought leaders on digital lending, banking, payments, artificial intelligence and the customer experience. With more than 100 million Americans lacking fair access to credit, it's more important than ever for companies to work to improve the financial health of consumers. In addition to the show's abundance of fintech-centered content, Experian hosted an exclusive, cutting-edge breakout series demonstrating innovations that are positively impacting the financial health of consumers across the nation.
Finovate Day One Overview
While fintechs, banks, venture capitalists, entrepreneurs and industry analysts descended on the general conference floor for a fast-paced day of demos, a select subset gathered for a luncheon presented by Experian North America CEO, Craig Boundy, and Group President, Alex Lintner. Attendees were given an in-depth look at new, alternative credit data streams and tools that are helping to increase financial access. Demos included:
Experian Boost™: a free, groundbreaking online platform that allows consumers to instantly boost their credit scores by adding telecommunications and utility bill payments to their credit file. More than half a million consumers have leveraged Experian Boost, increasing their score by an average of 13 points. Cumulatively, Experian Boost has helped add more than 2.8 million points to consumers' credit scores.
Ascend Analytical Sandbox™: a first-of-its-kind data and analytics platform that gives companies instant access to more than 17 years of depersonalized credit data on more than 220 million U.S. consumers. It has been the most successful product launch in Experian's history and recently earned the title of "Best Overall Analytics Platform" at this year's Fintech Breakthrough Awards.
Alternative Credit Data: comprised of data from alternative credit sources, this data helps lenders make smarter and more informed lending decisions. Additionally, Experian's Clear Data Platform is next-level credit data that adds supplemental FCRA-compliant data to enrich decisions across the entire credit spectrum. This new platform features alternative credit data, rental data, public records, consumer-permissioned data and more.
Upon conclusion of the luncheon, Alpa Lally, Experian's Vice President of Data Business at Consumer Information Services, was interviewed for the HousingWire Podcast by Jacob Gaffney, HousingWire Editor in Chief, to discuss how new forms of data streams are helping improve consumers' access to credit by giving lenders a clearer picture of their creditworthiness and risk. "Alternative credit data is different than traditional credit data and helps us paint a fuller picture of the consumer in terms of their ability to pay, willingness to pay and stability. It helps consumers get better access overall to the credit they deserve so that they can actively participate in the economy," said Lally.
Finovate Day Two Overview
On the last day of the conference, expert speakers took to the main stage to analyze the latest fintech trends, opportunities and challenges.
Alex Lintner of Experian and Sandeep Bhandari, Chief Strategy Officer and Chief Risk Officer at Affirm, participated in a fireside chat titled "Improving the Financial Health of America's 100 Million Credit Underserved Consumers." Moderated by David Penn, Finovate Analyst, the session explored the latest innovations, trends and technologies – from machine learning to alternative data – that are making a difference in positively impacting the financial health of Americans and expanding financial opportunities for underserved consumers. The panel discussed the efforts made to put financial health at the center of their businesses and the impact it's had on their organizations. Following the fireside chat, Experian hosted a second lunch briefing, presented by Vijay Mehta, Chief Innovation Officer, and Greg Wright, EVP Chief Product Officer. The lunch included exclusive table discussions and open conversations to help attendees leave with a better understanding of the importance of prioritizing financial health to build trust, reach new customers and ultimately grow their business. "We are actively seeking out unresolved problems and creating products and technologies that will help transform the way businesses operate and consumers thrive in our society. But we know we can't do it alone," Experian North American CEO Craig Boundy said in a recent blog post on Experian's fintech partnerships and Finovate participation. "That's why over the last year, we have built out an entire team of account executives and other support staff that are fully dedicated to developing and supporting partnerships with leading fintech companies. We've made significant strides that will help us pave the way for the next generation of lending while improving the financial health of more people around the world." For more information on how Experian is partnering with fintechs, visit experian.com/fintech or read our recent blog article on consumer-permissioned data for an in-depth discussion of Experian Boost™.
If you’re a credit risk manager or a data scientist responsible for modeling consumer credit risk at a lender, a fintech, a telecommunications company or even a utility company you’re certainly exploring how machine learning (ML) will make you even more successful with predictive analytics. You know your competition is looking beyond the algorithms that have long been used to predict consumer payment behavior: algorithms with names like regression, decision trees and cluster analysis. Perhaps you’re experimenting with or even building a few models with artificial intelligence (AI) algorithms that may be less familiar to your business: neural networks, support vector machines, gradient boosting machines or random forests. One recent survey found that 25 percent of financial services companies are ahead of the industry; they’re already implementing or scaling up adoption of advanced analytics and ML. My alma mater, the Virginia Cavaliers, recently won the 2019 NCAA national championship in nail-biting overtime. With the utmost respect to Coach Tony Bennett, this victory got me thinking more about John Wooden, perhaps the greatest college coach ever. In his book Coach Wooden and Me, Kareem Abdul-Jabbar recalled starting at UCLA in 1965 with what was probably the greatest freshman team in the history of basketball. What was their new coach’s secret as he transformed UCLA into the best college basketball program in the country? I can only imagine their surprise at the first practice when the coach told them, “Today we are going to learn how to put on our sneakers and socks correctly. … Wrinkles cause blisters. Blisters force players to sit on the sideline. And players sitting on the sideline lose games.” What’s that got to do with machine learning? Simply put, the financial services companies ready to move beyond the exploration stage with AI are those that have mastered the tasks that come before and after modeling with the new algorithms. Any ML library — whether it’s TensorFlow, PyTorch, extreme gradient boosting or your company’s in-house library — simply enables a computer to spot patterns in training data that can be generalized for new customers. To win in the ML game, the team and the process are more important than the algorithm. If you’ve assembled the wrong stakeholders, if your project is poorly defined or if you’ve got the wrong training data, you may as well be sitting on the sideline. Consider these important best practices before modeling: Careful project planning is a prerequisite — Assemble all the key project stakeholders, and insist they reach a consensus on specific and measurable project objectives. When during the project life cycle will the model be used? A wealth of new data sources are available. Which data sources and attributes are appropriate candidates for use in the modeling project? Does the final model need to be explainable, or is a black box good enough? If the model will be used to make real-time decisions, what data will be available at runtime? Good ML consultants (like those at Experian) use their experience to help their clients carefully define the model development parameters. Data collection and data preparation are incredibly important — Explore the data to determine not only how important and appropriate each candidate attribute is for your project, but also how you’ll handle missing or corrupt data during training and implementation. Carefully select the training and validation data samples and the performance definition. 
Any biases in the training data will be reflected in the patterns the algorithm learns and therefore in your future business decisions. When ML is used to build a credit scoring model for loan originations, a common source of bias is the difference between the application population and the population of booked accounts. ML experts from outside the credit risk industry may need to work with specialists to appreciate the variety of reject inference techniques available.
Segmentation analysis — In most cases, more than one ML model needs to be built, because different segments of your population perform differently. The segmentation needs to be done in a way that makes sense — both statistically and from a business perspective. Intriguingly, some credit modeling experts have had success using an AI library to inform segmentation and then a more tried-and-true method, such as regression, to develop the actual models.
During modeling: With a good plan and well-designed data sets, the modeling project has a very good chance of succeeding. But no automated tool can make the tough decisions that can make or break whether the model is suitable for use in your business — such as trade-offs between the ML model's accuracy and its simplicity and transparency. Engaged leadership is important.
After modeling:
Model validation — Your project team should be sure the analysts and consultants appreciate and mitigate the risk of overfitting the model parameters to the training data set. Validate that any ML model is stable. Test it with samples from a different group of customers — preferably a different time period from which the training sample was taken.
Documentation — AI models can have important impacts on people's lives. In our industry, they determine whether someone gets a loan, a credit line increase or an unpleasant loss mitigation experience. Good model governance practice insists that a lender won't make decisions based on an unexplained black box. In a globally transparent model, good documentation thoroughly explains the data sources and attributes and how the model considers those inputs. With a locally transparent model, you can further explain how a decision is reached for any specific individual — for example, by providing FCRA-compliant adverse action reasons.
Model implementation — Plan ahead. How will your ML model be put into production? Will it be recoded into a new computer language, or can it be imported into one of your systems using a format such as the Predictive Model Markup Language (PMML)? How will you test that it works as designed?
Post-implementation — Just as with an old-fashioned regression model, it's important to monitor both the usage and the performance of the ML model. Your governance team should check periodically that the model is being used as it was intended. Audit the model periodically to know whether changing internal and external factors — which might range from a change in data definition to a new customer population to a shift in the economic environment — might impact the model's strength and predictive power. (A simple illustration of one such stability check follows this article.)
Coach Wooden used to say, "It isn't what you do. It's how you do it." Just like his players, the most successful ML practitioners understand that a process based on best practices is as important as the "game" itself.
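The post-implementation monitoring described above usually comes down to checking whether the population being scored still resembles the development sample. One common way to quantify that drift is the population stability index (PSI); the article doesn't prescribe a specific metric, so the sketch below is purely illustrative, with simulated scores standing in for real portfolios.

```python
# Illustrative sketch of post-implementation score monitoring using the
# population stability index (PSI). This is an assumption about one common
# monitoring metric, not a prescribed governance procedure.
import numpy as np

def psi(expected_scores, actual_scores, bins=10):
    """Compare the score distribution at development time (expected)
    with the distribution currently seen in production (actual)."""
    # Bin edges taken from the development sample's score deciles.
    edges = np.percentile(expected_scores, np.linspace(0, 100, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf

    expected_pct = np.histogram(expected_scores, edges)[0] / len(expected_scores)
    actual_pct = np.histogram(actual_scores, edges)[0] / len(actual_scores)

    # Guard against empty bins before taking the log.
    expected_pct = np.clip(expected_pct, 1e-6, None)
    actual_pct = np.clip(actual_pct, 1e-6, None)

    return np.sum((actual_pct - expected_pct) * np.log(actual_pct / expected_pct))

# Common rule of thumb: < 0.10 stable, 0.10-0.25 worth investigating,
# > 0.25 a significant shift that may warrant redevelopment.
dev_scores = np.random.normal(680, 50, 10_000)    # simulated development sample
prod_scores = np.random.normal(665, 55, 10_000)   # simulated recent production scores
print(f"PSI = {psi(dev_scores, prod_scores):.3f}")
```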
For most businesses, building the best online experience for consumers requires a balance between security and convenience. But the challenge has always been finding a happy medium between the two – offering enough security that won't get in the way of convenience and vice versa. In the past, it was always believed that one would come at the expense of the other. But technology and innovation are changing how businesses approach security, allowing them to deliver the full potential of both.
Consumers want security AND convenience
Consumers consider security and convenience the foundation of their online experience. Findings from our 2019 Global Identity and Fraud Report revealed that approximately 74 percent of consumers ranked security as the most important part of their online experience, followed by convenience. In other words, they expect businesses to provide them with both. We see this in how consumers typically use the same security information each time they open a new digital account – out of convenience. But if one account is compromised, the consumer becomes vulnerable to possible fraudulent activity. With today's technology, businesses can give consumers an easier and more secure way to access their digital accounts.
Creating the optimal online experience
More security usually meant creating more passwords, answering more security questions, completing CAPTCHA tests, etc. While consumers are willing to work through these friction-inducing methods to complete a transaction or access an account, it's not always the most convenient process. Advanced data and technology have opened doors for new authentication methods, such as physical and behavioral biometrics, digital tokenization, device intelligence and machine learning, to maximize the potential for businesses to provide the best online experience possible. In fact, consumers have expressed greater confidence in businesses that implement these advanced security methods. Consumer confidence in passwords was only 44 percent, compared with a 74 percent rate of confidence in physical biometrics. Consumers are willing to embrace the latest security technology because it provides the security and convenience they want from businesses. While traditional forms of security were once sufficient, advanced authentication methods have proven to be more reliable forms of security that consumers trust and that can improve their online experience. The optimal online experience is a balance between security and convenience. Innovative technologies and data are helping businesses protect people's identities and provide consumers with an improved online experience.
Perhaps more than ever before, technology is changing how companies operate, produce and deliver products and services to their customers. Similarly, technology is also driving a shift in customer expectation in how, when and where they consume products and services. But these changes aren’t just relegated to the arenas where tech giants with household names, like Amazon and Google, play. Likewise, financial institutions of every size are also fielding the changes brought on by innovations to the industry in recent years. According to this report by PWC, 77% of firms plan on dedicating time and budgets to increase innovation. But what areas make the most sense for your business? With a seemingly constant shift in consumer and corporate focus, it can be difficult to know which technological advancements are imperative to your company’s success and which are just the latest fizzling buzzword. As you evaluate innovation investments for your organization in 2019 and beyond, here’s a list of four technology innovations that are already changing the financial sector or will change the banking landscape in the near future. The APIs of Open Banking Ok, it’s not a singular innovation, so I’m cheating a bit here, but it’s a great place to begin the conversation because it comprises and sets the stage for many of the innovations and technologies that are in use today or will be implemented in the future. Created in 2015, the Open Banking Standard defined how a bank’s system data or consumer-permissioned financial data should be created, accessed and shared through the use of application programming interfaces or APIs. When financial institutions open their systems up to third-party developer partners, they can respond to the global trends driving change within the industry while greatly improving the customer experience. With the ability to securely share their financial data with other lenders, greater transparency into the banking process, and more opportunities to compare product offerings, consumers get the frictionless experience they’ve come to expect in just about every aspect of life – just not necessarily one that lenders are known for. But the benefits of open banking are not solely consumer-centric. Financial institutions are able to digitize their product offerings and thus expand their market and more easily share data with partners, all while meeting clients’ individualized needs in the most cost-effective way. Biometrically speaking…and smiling Verifying the identity of a customer is perhaps one of the most fundamental elements to a financial transaction. This ‘Know Your Customer’ (KYC) process is integral to preventing fraud, identity theft, money laundering, etc., but it’s also time-consuming and inconvenient to customers. Technology is changing that. From thumbprint and, now, facial recognition through Apple Pay, consumers have been using biometrics to engage with and authorize financial transactions for some time now. As such, the use of biometrics to authenticate identity and remove friction from the financial process is becoming more mainstream, moving from smartphones to more direct interaction. Chase has now implemented voice biometrics to verify a consumer’s identity in customer service situations, allowing the company to more quickly meet a customer’s needs. 
Meanwhile, in the US and Europe, Visa is testing biometric credit cards with an embedded fingerprint reader that stores the cardholder's fingerprint in order to authenticate their identity during a financial transaction. In China, companies like Alipay are taking this to the next level by allowing customers to bypass the phone entirely with its 'pay with a smile' service. First launched in KFC restaurants in China, the service is now being offered at hospitals as well. How, when and where a consumer accesses their financial institution's data also creates a digital fingerprint that can be verified. While facial and vocal matching are key components of identity verification and protecting the consumer, behavioral biometrics have also become an important part of the fraud prevention arsenal for many financial institutions. These are key components of Experian's CrossCore solution, the first open fraud and identity platform, which partners with a variety of companies through the open APIs discussed above.
Not so New Kid on the Block(chain)
The first Bitcoin transaction took place on January 12, 2009. And for a number of years, all was quiet. Then in 2017, Bitcoin started to blow up, creating a scene reminiscent of the 1850s California gold rush. Growing at a seemingly exponential rate, the cryptocurrency topped out at a per-unit price of more than $20,000. By design, cryptocurrencies are decentralized, meaning they are not controlled or regulated by a single entity, reducing the need for central third-party institutions, i.e., banks and other financial institutions, to function as central authorities of trust. Volatility and regulation aside, it's understandable why financial institutions were uneasy, if not skeptical, of the innovation. But perhaps the most unique characteristic of cryptocurrencies is the technology on which they are built: blockchain. Essentially, a blockchain is just a special kind of database. The database stores, validates, transfers and keeps a ledger of transfers of encrypted data — records of financial transfers in the case of Bitcoin. But these records aren't stored on one computer as is the case with traditional databases. Blockchain leverages a distributed ledger, or distributed trust, approach where a full copy of the database is stored across many distributed processing nodes and the system is constantly checking and validating the contents of the database. But a blockchain can store any type of data, making it useful in a wide variety of applications, including tracking the ownership of digital or physical assets or the provenance of documents. From clearing and settlement to payments, trade finance, identity and fraud prevention, we're already seeing financial institutions explore and utilize the technology. Santander was the first UK bank to utilize blockchain for its international payments app One Pay FX. Similarly, other banks and industry groups are forming consortiums to test the technology for other uses. With all this activity, it's clear that blockchain will become an integral part of financial institutions' technology and operations on some level in the coming years.
Rise in Robots
While Artificial Intelligence seems to have only recently crept into pop-culture and business vernacular, the term was actually coined in 1956 by John McCarthy, a researcher at Dartmouth who thought that any aspect of learning or intelligence could essentially be taught to a machine.
AI allows machines to learn from experience, adjust to new inputs and carry out human-like tasks. It's the prospect of machines becoming 'human-like', or potentially superior to humans, that creeps out people like my father and worries others like Elon Musk. Doomsday scenarios a la Terminator aside, it's easy to see how the tech can be, and is, useful to society. In fact, much of the AI development done today uses human-style reasoning as a model, though not necessarily as the ultimate aim, to deliver better products and services. It's this subset of AI, machine learning, that allows companies like Amazon to provide everything from services like automatic encryption in AWS to products like Amazon Echo. While it's much more complex in practice, a simple way to think about AI is that it functions like billions of conditional if-then-else statements working in a random, varied environment, typically toward a set goal. Whereas in the past programmers would have to code these statements and input reference data themselves, machine learning systems learn, modify and map between inputs and outputs to create new actions based on their learning (a toy contrast is sketched at the end of this article). It works by combining the large amounts of data created on a daily basis with fast, iterative processing and intelligent algorithms, allowing the program to learn from patterns in the data and make decisions. It's this type of machine learning that banks are already using to automate routine, rule-based tasks like fraud monitoring and to drive the analytical environments used in their risk modeling and other predictive analytics. Whether or not you've implemented AI, machine learning or bot technology into your operations, it's highly likely your customers are already leveraging AI in their home lives, with smart home devices like Amazon Echo and Google Home. Conversational AI is the next juncture in how people interface with each other, companies and life in general. We're already seeing previews of what's possible with technologies like Google Duplex. This has huge implications for the financial services industry, from removing friction at a transaction level to creating a stickier, more engaging customer experience. To that end, according to this report from Accenture, AI may begin to provide in-the-moment, holistic financial advice that is in a customer's best interest. It goes without saying that the market will continue to evolve, competition will only grow fiercer, consumer expectations will continue to shift, and regulation will likely become more complex. It's clear technology can be a mitigating factor, even a competitive differentiator, amid these changing industry variables. Financial institutions must evolve corporate mindsets in their approach to prioritize innovations that will have the greatest enterprise-wide impact. By putting together an intelligent mix of people, process and the right technology, financial institutions can better predict consumer need and expectation while modernizing their business models.
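To make the if-then-else analogy above concrete, here is a toy contrast between a rule a programmer writes by hand and a model that learns an equivalent mapping from labeled examples. The feature names, threshold and tiny dataset are invented purely for illustration.

```python
# Toy contrast between the two approaches described above: a hand-coded
# if-then-else rule versus a model that learns the mapping from examples.
from sklearn.tree import DecisionTreeClassifier

def hand_coded_rule(amount, merchant_risk):
    # A programmer wrote this rule explicitly.
    if amount > 500 and merchant_risk > 0.7:
        return 1   # flag as suspicious
    return 0

# A learned model infers its own rules from labeled history instead.
X = [[120, 0.2], [900, 0.9], [40, 0.1], [750, 0.8], [60, 0.3], [880, 0.95]]
y = [0, 1, 0, 1, 0, 1]   # labels: 1 = confirmed fraud

model = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(hand_coded_rule(900, 0.9), model.predict([[900, 0.9]])[0])
```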
2019 is here — with new technology, new regulations and new opportunities on the docket. What does that mean for the financial services space? Here are the five trends you should keep your eye on and how they affect your credit universe.
1. Credit access is at an all-time high
With 121 million Americans categorized as credit-challenged (subprime scores and a thin or nonexistent credit file) and 45 million considered credit-invisible (no credit history), the credit access many consumers take for granted has appeared elusive to others. Until now. The recent launch of Experian Boost™ empowers consumers to improve their credit instantly using payment history from their utility and phone bills, giving them more control over their credit scores and making them more visible to lenders and financial institutions. This means more opportunities for more people. Coupled with alternative credit data, which includes alternative financial services data, rental payments, and full-file public records, lenders and financial institutions can see a whole new universe. In 2019, inclusion is key when it comes to universe expansion goals. Both alternative credit and consumer-permissioned data will continue to be an important part of the conversation.
2. Machine learning for the masses
The financial services industry has long been notorious for being founded on arguably antiquated systems and steeped in compliance and regulations. But the industry's recent speed of disruption, including drastic changes fueled by technology and innovation, may suggest a changing of the guard. Digital transformation is an industry hot topic, but defining what that is — and navigating legacy systems — can be challenging. Successfully integrating innovation is the convergence at the center of the Venn diagram of strategy, technology and operations. The key, according to Deloitte, is getting "a better handle on data to extract the greatest value from technology investments." How do you get the most value? Risk managers need big data, machine learning and artificial intelligence strategies to deliver market insights and risk evaluation. Between the difficulty of leveraging data sets and the significant investment in time and money required, it's impossible for many to justify. To combat this challenge, the availability of and access to an analytical sandbox (which contains depersonalized consumer data and comparative industry intel) is crucial to better serve clients and act on opportunities in lenders' credit universe and beyond. "Making information analysis easily accessible also creates distinct competitive advantages," said Vijay Mehta, Chief Innovation Officer for Experian's Consumer Information Services, in a recent article for BAI Banking Strategies. "Identifying shifts in markets, changes in regulations or unexpected demand allows for quick course corrections. Tightening the analytic life cycle permits organizations to reach new markets and quickly respond to competitor moves." This year is about meaningful metrics for action, not just data visualization.
3. How to fit into the digital-first ecosystem
With so many things available on demand, the need for instant gratification continues to skyrocket. It's no secret that the financial services industry needs to compete for attention across consumers' multiple screens and hours of screen time. What's in the queue for 2019? Personalization, digitalization and monetization.
Consumers’ top banking priorities include customized solutions, omnichannel experience improvement and enhancing the mobile channel (as in, can we “Amazonize” everything?). Financial services leaders’ priorities include some of the same things, such as enhancing the mobile channel and delivering options to customize consumer solutions (BAI Banking Strategies). From geolocation targeting to microinteractions in the user experience journey to leveraging new strategies and consumer data to send personalized credit offers, there’s no shortage of need for consumer hyper-relevance. 33 percent of consumers who abandon business relationships do so because personalization is lacking, according to Accenture data for The Financial Brand. This expectation spans all channels, emphasizing the need for a seamless experience across all devices. 4. Keeping fraudsters out Many IT professionals regard biometric authentication as the most secure authentication method currently available. We see this technology on our personal devices, and many companies have implemented it as well. Biometric hacking is among the predicted threats for 2019, according to Experian’s Data Breach Industry Forecast, released last month. “Sensors can be manipulated and spoofed or deteriorate with too much use. ... Expect hackers to take advantage of not only the flaws found in biometric authentication hardware and devices, but also the collection and storage of data,” according to the report. 5. Regulatory changes and continued trends Under the Trump Administration, the regulatory front has been relatively quiet. But according to the Wall Street Journal, as Democrats gain control of the House of Representatives, lawmakers may be setting their sights on the financial services industry — specifically on legislation in response to the credit data breach in 2017. The Democratic Party leadership has indicated that the House Financial Services Committee will be focused on protecting consumers and investors, preserving sector stability, and encouraging responsible innovation in financial technology, according to Deloitte. In other news, the focus on improving accuracy in data reporting, transparency for consumers in credit scoring and other automated decisions can be expected to continue. Consumer compliance, and specifically the fair and responsible treatment of consumers, will remain a top priority. For all your needs in 2019 and beyond, Experian has you covered. Learn more
I believe it was George Bernard Shaw who once said something along the lines of, "If economists were laid end-to-end, they'd never come to a conclusion, at least not the same conclusion." It often feels the same way when it comes to big data analytics around customer behavior. As you look at new tools to put your customer insights to work for your enterprise, you likely have questions coming from across your organization. Models always seem to take forever to develop; how sure are we that the results are still accurate? What data did we use in this analysis; do we need to worry about compliance or security? To answer these questions and best utilize customer data, the most forward-thinking financial institutions are turning to analytical environments, or sandboxes, to solve their big data problems. But what functionality is right for your financial institution? In your search for a sandbox solution to solve the business problem of big data, make sure you keep these top four features in mind.
Efficiency: Building an internal data archive with effective business intelligence tools is expensive, time-consuming and resource-intensive. That's why investing in a sandbox makes the most sense when it comes to drawing the value out of your customer data. By providing immediate access to the data environment at all times, the best systems can reduce the time from data input to decision by at least 30%. Another way the right sandbox can help you achieve operational efficiencies is by direct integration with your production environment. Pretty charts and graphs are great and can be very insightful, but the best sandbox goes beyond just business intelligence and should allow you to immediately put models into action.
Scalability and Flexibility: In implementing any new software system, scalability and flexibility are key when it comes to integration with your native systems and the system's capabilities. This is even more imperative when implementing an enterprise-wide tool like an analytical sandbox. Look for systems that offer a hosted, cloud-based environment, like Amazon Web Services, that ensures operational redundancy, as well as browser-based access and system availability. The right sandbox will leverage a scalable software framework for efficient processing. It should also be programming-language agnostic, allowing for use of all industry-standard programming languages and analytics tools like SAS, R Studio, H2O, Python, Hue and Tableau. Moreover, you shouldn't have to pay for software suites that your analytics teams aren't going to use.
Support: Whether you have an entire analytics department at your disposal or a lean, start-up style team, you're going to want the highest level of support when it comes to onboarding, implementation and operational success. The best sandbox solution for your company will have a robust support model in place to ensure client success. Look for solutions that offer hands-on instruction, flexible online or in-person training and analytical support. Look for solutions and data partners that also offer the consultative help of industry experts when your company needs it.
Data, Data and More Data: Any analytical environment is only as good as the data you put into it. It should, of course, include your own client data. However, relying exclusively on your own data can lead to incomplete analysis, missed opportunities and reduced impact.
When choosing a sandbox solution, pick a system that will include the most local, regional and national credit data, in addition to alternative data and commercial data assets, on top of your own data. The optimum solutions will have years of full-file, archived tradeline data, along with attributes and models, for the most robust results. Be sure your data partner has accounted for opt-outs, excludes data precluded by legal or regulatory restrictions and anonymizes data files when linking your customer data (one simple de-identification approach is sketched below). Data accuracy is also imperative here. Choose a big data partner who is constantly monitoring and correcting discrepancies in customer files across all bureaus. The best partners will have data accuracy rates at or above 99.9%. Solving the business problem around your big data can be a daunting task. However, investing in analytical environments or sandboxes can offer a solution. Finding the right solution and data partner is critical to your success. As you begin your search for the best sandbox for you, be sure to look for solutions that offer the right combination of operational efficiency, flexibility and support, all combined with the most robust national data along with your own customer data. Are you interested in learning how companies are using sandboxes to make it easier, faster and more cost-effective to drive actionable insights from their data? Join us for this upcoming webinar. Register for the Webinar
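As a generic illustration of the de-identification step mentioned above, the sketch below replaces direct identifiers with a keyed hash so records can still be matched without exposing the raw values. This is an assumption about one simple approach, not a description of how Experian or any particular data partner performs linking, and the field names are hypothetical.

```python
# Illustrative pseudonymization sketch: direct identifiers are replaced with
# a keyed hash used only for matching, while analytic fields are kept.
import hashlib
import hmac

LINKING_SALT = b"rotate-this-secret-regularly"   # known only to the linking process

def pseudonymize(ssn: str, name: str) -> str:
    """Derive a stable match key from identifiers without storing them."""
    key_material = f"{ssn.strip()}|{name.strip().lower()}".encode()
    return hmac.new(LINKING_SALT, key_material, hashlib.sha256).hexdigest()

record = {"ssn": "123-45-6789", "name": "Jane Doe", "balance": 1200}
linked_record = {
    "match_key": pseudonymize(record["ssn"], record["name"]),
    "balance": record["balance"],   # analytic fields stay; identifiers don't
}
print(linked_record)
```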
This is an exciting time to work in big data analytics. Here at Experian, we have more than 2 petabytes of data in the United States alone. In the past few years, because of high data volume, more computing power and the availability of open-source code algorithms, my colleagues and I have watched excitedly as more and more companies are getting into machine learning. We’ve observed the growth of competition sites like Kaggle, open-source code sharing sites like GitHub and various machine learning (ML) data repositories. We’ve noticed that on Kaggle, two algorithms win over and over at supervised learning competitions: If the data is well-structured, teams that use Gradient Boosting Machines (GBM) seem to win. For unstructured data, teams that use neural networks win pretty often. Modeling is both an art and a science. Those winning teams tend to be good at what the machine learning people call feature generation and what we credit scoring people called attribute generation. We have nearly 1,000 expert data scientists in more than 12 countries, many of whom are experts in traditional consumer risk models — techniques such as linear regression, logistic regression, survival analysis, CART (classification and regression trees) and CHAID analysis. So naturally I’ve thought about how GBM could apply in our world. Credit scoring is not quite like a machine learning contest. We have to be sure our decisions are fair and explainable and that any scoring algorithm will generalize to new customer populations and stay stable over time. Increasingly, clients are sending us their data to see what we could do with newer machine learning techniques. We combine their data with our bureau data and even third-party data, we use our world-class attributes and develop custom attributes, and we see what comes out. It’s fun — like getting paid to enter a Kaggle competition! For one financial institution, GBM armed with our patented attributes found a nearly 5 percent lift in KS when compared with traditional statistics. At Experian, we use Extreme Gradient Boosting (XGBoost) implementation of GBM that, out of the box, has regularization features we use to prevent overfitting. But it’s missing some features that we and our clients count on in risk scoring. Our Experian DataLabs team worked with our Decision Analytics team to figure out how to make it work in the real world. We found answers for a couple of important issues: Monotonicity — Risk managers count on the ability to impose what we call monotonicity. In application scoring, applications with better attribute values should score as lower risk than applications with worse values. For example, if consumer Adrienne has fewer delinquent accounts on her credit report than consumer Bill, all other things being equal, Adrienne’s machine learning score should indicate lower risk than Bill’s score. Explainability — We were able to adapt a fairly standard “Adverse Action” methodology from logistic regression to work with GBM. There has been enough enthusiasm around our results that we’ve just turned it into a standard benchmarking service. We help clients appreciate the potential for these new machine learning algorithms by evaluating them on their own data. Over time, the acceptance and use of machine learning techniques will become commonplace among model developers as well as internal validation groups and regulators. 
Whether you’re a data scientist looking for a cool place to work or a risk manager who wants help evaluating the latest techniques, check out our weekly data science video chats and podcasts.
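The monotonicity and regularization points above map directly onto options exposed by the open-source XGBoost library named in the post. The sketch below shows where those controls live; the simulated data, feature choices and constraint directions are illustrative assumptions rather than any Experian model specification.

```python
# Minimal sketch of regularization and per-feature monotonic constraints in
# the open-source XGBoost library. Features and data are simulated.
import numpy as np
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
n = 5_000
delinquent_accounts = rng.poisson(1.0, n)       # more delinquencies -> higher risk
utilization = rng.uniform(0, 1, n)              # higher utilization -> higher risk
X = np.column_stack([delinquent_accounts, utilization])
risk = 0.4 * delinquent_accounts + 1.5 * utilization + rng.normal(0, 0.5, n)
y = (risk > np.median(risk)).astype(int)        # 1 = "bad" (higher risk)

model = XGBClassifier(
    n_estimators=200,
    max_depth=3,
    learning_rate=0.05,
    reg_lambda=1.0,                  # L2 regularization to curb overfitting
    reg_alpha=0.1,                   # L1 regularization
    # Force predicted risk to be non-decreasing in both features, so an
    # applicant with fewer delinquencies can never score as higher risk,
    # all else being equal.
    monotone_constraints="(1,1)",
)
model.fit(X, y)

# Sanity check: identical applicants except for delinquency count.
low, high = np.array([[0, 0.5]]), np.array([[4, 0.5]])
print(model.predict_proba(low)[0, 1], model.predict_proba(high)[0, 1])
```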
Machine learning (ML), the newest buzzword, has swept into the lexicon and captured the interest of us all. Its recent, widespread popularity has stemmed mainly from the consumer perspective. Whether it's virtual assistants, self-driving cars or romantic matchmaking, ML has rapidly positioned itself in the mainstream. Though ML may appear to be a new technology, its use in commercial applications has been around for some time. In fact, many of the data scientists and statisticians at Experian are considered pioneers in the field of ML, going back decades. Our team has developed numerous products and processes leveraging ML, from our world-class consumer fraud and ID protection to credit data products like our Trended 3D™ attributes. In fact, we were just highlighted in the Wall Street Journal for how we're using machine learning to improve our internal IT performance. ML's ability to consume vast amounts of data to uncover patterns and deliver results that are not humanly possible otherwise is what makes it unique and applicable to so many fields. This predictive power has now sparked interest in the credit risk industry. Unlike fraud detection, where ML is well-established and used extensively, credit risk modeling has until recently taken a cautionary approach to adopting newer ML algorithms. Because of regulatory scrutiny and a perceived lack of transparency, ML hasn't experienced the broad acceptance enjoyed by some of credit risk modeling's more utilized applications. When it comes to credit risk models, delivering the most predictive score is not the only consideration for a model's viability. Modelers must be able to explain and detail the model's logic, or its "thought process," for calculating the final score. This means taking steps to ensure the model's compliance with the Equal Credit Opportunity Act, which forbids discriminatory lending practices. Federal laws also require adverse action responses to be sent by the lender if a consumer's credit application has been declined. This requires that the model be able to highlight the top reasons for a less than optimal score. And so, while ML may be able to deliver the best predictive accuracy, its ability to explain how the results are generated has always been a concern. ML has been stigmatized as a "black box," where data mysteriously gets transformed into the final predictions without a clear explanation of how. However, this is changing. Depending on the ML algorithm applied to credit risk modeling, we've found risk models can offer the same transparency as more traditional methods such as logistic regression. For example, gradient boosting machines (GBMs) are designed as a predictive model built from a sequence of several decision tree submodels. The very nature of GBMs' decision tree design allows statisticians to explain the logic behind the model's predictive behavior (an illustrative sketch follows this article). We believe model governance teams and regulators in the United States may become comfortable with this approach more quickly than with deep learning or neural network algorithms, since GBMs are represented as sets of decision trees that can be explained, while neural networks are represented as long sets of cryptic numbers that are much harder to document, manage and understand. In future blog posts, we'll discuss the GBM algorithm in more detail and how we're using its predictability and transparency to maximize credit risk decisioning for our clients.
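The explainability point above is often operationalized by attributing a scored applicant's result back to individual inputs. One widely used open-source route for tree ensembles is SHAP; the sketch below is an illustrative assumption about how per-applicant reasons could be surfaced, not Experian's adverse action methodology, and the feature names are hypothetical.

```python
# Illustrative sketch: per-applicant contribution of each attribute to a GBM
# score, using SHAP values on an XGBoost model trained on simulated data.
import numpy as np
import shap
from xgboost import XGBClassifier

feature_names = ["utilization", "delinquent_accounts", "months_on_file"]
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (2_000, 3))
y = (X @ np.array([1.5, 1.0, -0.8]) + rng.normal(0, 0.3, 2_000) > 0.9).astype(int)

model = XGBClassifier(n_estimators=100, max_depth=3).fit(X, y)

# Contributions of each feature to one applicant's score (label 1 = higher risk).
explainer = shap.TreeExplainer(model)
contributions = explainer.shap_values(X[:1])[0]

# The features pushing this applicant's risk up the most become candidate
# reasons to report alongside a declined application.
top_reasons = [feature_names[i] for i in np.argsort(contributions)[::-1][:2]]
print(top_reasons)
```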
If someone asked you for stats on your retail card portfolio, would you respond with the number of accounts? Average spend per month? Or maybe you know the average revolving balance and profitability. Notice something about that list? Too many lenders think of their portfolio and customers as numbers, when in reality these are individuals expressing themselves through their transactions. In an age where consumers increasingly expect customized experiences, marketing to account #5496115149251 is likely to fall on deaf ears. Credit card transaction data, including bankcard, retail and debit cards, holds a wealth of information about your consumers' tastes and preferences. Think about all the purchases you made using a credit card this past month. Did you shop at high-end retail stores or discount stores? Expensive restaurants or fast food? Did you buy new clothes for your kids? Maybe you went to the movies, or met friends at a bar. How you use your card paints a picture of who you are. The trick is turning all those numbers into insights. You may have been swept up in all the excitement around Apple's announcement of the iPhone X in August. However, you may have overlooked the incorporation of Neural Embedding, or machine learning, as one of the most powerful features of the new phone. Experian DataLabs has developed an innovative approach to analyzing transaction data using similar techniques. Unstructured machine learning is applied, and patterns begin to emerge around customer spending (a toy illustration appears at the end of this article). The patterns are highly intuitive and give personality to what was previously an indecipherable stream of data. For example, one group may be more likely to spend on children's clothing, child care services and theme parks, while another spends on expensive restaurants, airlines and golf courses. If these two consumers happened to spend approximately the same each month on your card, you'd probably treat them as the same category. But understanding that one is a young family and the other is a jet setter allows you to tailor messaging, offers and terms to their needs and use of your products. Further, you can ensure they have the best product based on their lifestyle to minimize silent attrition as their needs evolve. But it's not just about marketing. When your latest attrition dashboard is updated, what period are you measuring? Do you analyze account closures from the previous month? Maybe a few months back? Understanding churn is important, but it's inherently reactive and backward looking. You wouldn't drive a car looking in the rearview mirror, would you? Experian enables clients to actively monitor the portfolio for attrition risk by analyzing usage patterns and predicting future spend. Transactions are then monitored up to daily and, when spend doesn't occur as expected, an alert is sent so you can proactively attempt to save the account before it closes. These algorithms are finely tuned to reduce false positives that can come from seasonality or predictable gaps in spend, such as only using a card at certain times during the week. Most importantly, it gives you an opportunity to manage each account and address changing customer needs instead of waiting for customers to call to cancel. So how well do you know your customers? If you're still looking at them as numbers, it may be time to explore new capabilities that allow you to act small, no matter how large your portfolio. Transaction Data Insights brings cutting-edge machine learning capabilities to lenders of all sizes.
By digging into behavioral segments and having tools to monitor and send alerts when a consumer is showing signs of attrition risk, card portfolios can suddenly treat customers like people, providing the customized experience they increasingly expect.
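As a toy illustration of the behavioral segments described in this post, the sketch below clusters cardholders by their share of spend across merchant categories. The categories, numbers and use of k-means are illustrative assumptions; the production approach referenced above (neural-embedding-based) is more sophisticated.

```python
# Toy spend-pattern segmentation: group cardholders by how their monthly
# spend is distributed across merchant categories.
import numpy as np
from sklearn.cluster import KMeans

categories = ["kids_clothing", "child_care", "theme_parks",
              "fine_dining", "airlines", "golf"]

# Each row: one cardholder's share of monthly spend by category (made up).
spend_share = np.array([
    [0.40, 0.30, 0.20, 0.05, 0.03, 0.02],   # looks like a young family
    [0.35, 0.35, 0.15, 0.05, 0.05, 0.05],
    [0.02, 0.03, 0.05, 0.35, 0.40, 0.15],   # looks like a jet setter
    [0.05, 0.02, 0.03, 0.30, 0.35, 0.25],
])

segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(spend_share)
for row, seg in zip(spend_share, segments):
    top_category = categories[int(np.argmax(row))]
    print(f"segment {seg}: top category = {top_category}")
```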
Risk analysts are insatiable consumers of big data who require better intelligence to develop market insights, evaluate risk and confirm business strategies. While every credit decision, risk assessment model or marketing forecast improves when it is based on better, faster and more current data, leveraging large data sets can be challenging and unproductive. That's why Experian added new functionality to its Analytical Sandbox, giving clients the flexibility they need to analyze big data efficiently. Experian's Analytical Sandbox now utilizes H2O – an open-source machine learning and deep learning platform that can model and predict, with high accuracy, billions of rows of high-dimensional data from multiple sources in various formats. Through machine learning and advanced predictive modeling, the platform enables Experian to better provide on-demand data insights that empower analysts with high-quality intelligence to inform regional trends, provide consumer transactional insight or expose marketing opportunities. As a hosted service, Sandbox is offered as plug-and-play, meaning no internal development is required. Clients can instantly access the data through a secure Web interface on their desktop, giving users access to powerful artificial and business intelligence tools from their own familiar applications. No special training is required. "AI monetizes data," said SriSatish Ambati, CEO of H2O.ai. "Our partnership with Experian democratizes and delivers AI to the wider community of financial and risk analysts. Experian's analytics sandbox can now model and predict with high accuracy billions of rows of high-dimensional data in mere seconds." Through H2O and the Experian Sandbox, machine learning and predictive analytics are giving risk managers from financial institutions of all sizes the ability to incorporate machine learning models into their own big data processing systems.
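For readers who want to see what the open-source H2O library named above looks like in practice, here is a minimal sketch of a typical workflow. The file name, columns and model settings are hypothetical placeholders; this illustrates the public H2O Python API, not the internals of the Experian Sandbox integration.

```python
# Minimal sketch of an analyst workflow with the open-source H2O library.
import h2o
from h2o.estimators import H2OGradientBoostingEstimator

h2o.init()  # starts (or connects to) a local H2O cluster

loans = h2o.import_file("depersonalized_tradelines.csv")   # hypothetical extract
loans["default_24m"] = loans["default_24m"].asfactor()      # classification target

gbm = H2OGradientBoostingEstimator(ntrees=200, max_depth=5, seed=42)
gbm.train(
    x=["utilization", "months_on_file", "delinquent_accounts"],
    y="default_24m",
    training_frame=loans,
)
print(gbm.auc())   # training AUC; pass a validation_frame for an honest estimate
```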
Newest technology doesn't mean best when it comes to stopping fraud
I recently attended the Merchant Risk Conference in Las Vegas, which brings together online merchants and industry vendors, including payment service providers and fraud detection solution providers. The conference continues to grow year over year – similar to the fraud and risk challenges within the industry. In fact, we just released analysis showing that we've seen fraud rates spike to 33% in the past year. This year, the exhibit hall was full of new names on the scene – evidence that there is a growing market for controlling risk and fraud in the e-commerce space. I heard from a few merchants at the conference that there were some "cool" new technologies out to help combat fraud. Things like machine learning, selfies and other two-factor authentication tools were all discussed as the latest in the fight against fraud. The problem is, many of these "cool" new technologies aren't yet efficient enough at identifying and stopping fraud. Cool, yes. Effective, no. Sure, you can ask your customer to take a selfie and send it to you for facial recognition scanning. But can you imagine your mother-in-law trying to manage this process? Machine learning, while very promising, still has some room to grow in truly identifying fraud while minimizing false positives. Many of these "anomaly detection" systems look for just that – anomalies. The problem is, we're fighting motivated and creative fraudsters who are experts at avoiding detection and can beat anomaly detection. I do not doubt that you can stop fraud if you introduce some of these new technologies. The problem is, at what cost? The trick is stopping fraud with efficiency – to stop the fraud and not disrupt the customer experience. Companies, now more than ever, are competing based on customer experience. Adding any amount of friction to the buying process puts your revenue at risk. Consider these tips when evaluating and deploying fraud detection solutions for your online business.
Evaluate solutions based on all metrics (a quick sketch of computing the first two appears at the end of this post):
What is the fraud detection rate?
What impact will it have on approvals?
What is the false positive rate and impact on investigations?
Does the attack rate decline after implementing the solution?
Is the process detectable by fraudsters?
What friction is introduced to the process?
Use all available data at your disposal to make a decision:
Does the consumer exist?
Can we validate the person's identity?
Is the web session and user-entered data consistent with this consumer?
Step up authentication but limit customer friction:
Is the technology appropriate for your audience (e.g., a selfie, text messaging, document verification)?
Are you using jargon in your process?
In the end, any solution can stop 100% of the fraud – but at what cost? It's a balance - a balance between detection and friction. Think about customer friction and the impact on customer satisfaction and revenue.
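As referenced in the metrics list above, here is a quick sketch of computing a fraud detection rate and a false positive rate from a labeled backtest. The ten hand-written outcomes are stand-ins for whatever historical decisions you replay through a candidate solution.

```python
# Sketch of two evaluation metrics for a fraud solution: detection rate and
# false positive rate, computed from labeled historical outcomes.
import numpy as np

actual_fraud = np.array([1, 0, 0, 1, 0, 0, 0, 1, 0, 0])   # 1 = confirmed fraud
flagged      = np.array([1, 0, 1, 1, 0, 0, 0, 0, 0, 0])   # 1 = solution flagged it

detection_rate = flagged[actual_fraud == 1].mean()        # caught / all fraud
false_positive_rate = flagged[actual_fraud == 0].mean()   # flagged / all good orders

print(f"detection rate:      {detection_rate:.0%}")
print(f"false positive rate: {false_positive_rate:.0%}")
```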