By: Mike Horrocks

Living just outside of Indianapolis, I can tell you that the month of May is all about "The Greatest Spectacle in Racing," the Indy 500. The four horsemen of the apocalypse could be in town, but if those horses are not sponsored by Andretti Racing or Pennzoil – forget about it. This year the race was a close one, with three-time Indy 500 winner Helio Castroneves losing by .06 of a second. It doesn't get much closer. So looking back, there are some great lessons from Helio that I want to share with auto lenders:

You have to come out strong and with a well-oiled machine. Castroneves led the race with no contest for 38 laps. You cannot do that without a great car and team. So ask yourself – are you handling your auto lending with a solution that has the ability to lead the market, or are you having to go to the pits often just to keep pace?

You need to stay ahead of the pack until the end. Castroneves will be the first to admit that his car was not giving him all the power he wanted in the 196th lap. Now remember, there are only 200 laps in the race, so with only four laps to go, that is not a good time to have a hiccup. If your lending strategy hasn't changed "since the first lap," you could have the same problem getting across the finish line. Take time to make sure your automated scoring approach is valid, question your existing processes, and consider getting an outside look from leaders in the industry to make sure you are still firing on all cylinders.

Time kills. Castroneves lost by .06 seconds. That .06 of a second means he was denied access into a very select club of four-time winners. That .06 of a second means he does not get to drink that coveted glass of milk. If your solution is not providing your customers with the fastest and best credit offers, how many deals are you losing? What exclusive club of top auto lenders are you being denied access to?

Second place is no fun. If you're Castroneves, there's no substitute for finishing first at the Indianapolis Motor Speedway. Likewise, in today's market, there is more need than ever to be in the Winner's Circle. Take a pit stop, check out your lending process, and see how you're performing against your competitors – and in the spirit of the race: "Ladies and gentlemen, start your engines!"
Julie Conroy - Research Director, Aite Group

Finding patterns indicative of money laundering and other financial crimes is akin to searching for a needle in a haystack. With the increasing pressure on banks' anti-money laundering (AML) and fraud teams, many with this responsibility increasingly feel like they're searching for those needles while a combine is bearing down on them at full speed. These pressures include:

Regulatory scrutiny: The high-profile – and expensive – U.S. enforcement actions that took place during the last couple of years underscore the extent to which regulators are scrutinizing FIs and penalizing those who don't pass muster.

Payment volumes and types increasing: As the U.S. economy gradually eases its way into a recovery, payment volumes are increasing. Not only are volumes rebounding to pre-recession levels, but a number of new financial products and payment formats have been introduced over the last few years, further increasing the workload for the teams who have to screen these payments for money-laundering, sanctions, and global anti-corruption-related exceptions.

Constrained budgets: All of this is taking place at a time when top-line revenue growth is constrained and financial institutions are under pressure to reduce expenses and optimize efficiency.

Illicit activity on the rise: Criminal activity continues to increase at a rapid pace. The array of activity that financial institutions' AML units are responsible for detecting has also expanded significantly in scope over the last decade, as the USA PATRIOT Act widened the mandate from pure money laundering to also encompass terrorist financing. Financial institutions have had to transition from activity primarily focused on account-level monitoring to item-level monitoring, increasing by orders of magnitude the volumes of alerts they must work (Figure 1).

Figure 1: U.S. FIs Are Swimming in Alerts (Source: Aite Group interviews with eight of the top 30 FIs by asset size, March to April 2013)

There are technologies in the market that can help. AML vendors continue to refine their analytic and matching capabilities in an effort to help financial institutions reduce false positives while not adversely affecting detection rates (the short sketch below illustrates the threshold trade-off at the heart of that matching). Hosted solutions are increasingly available, reducing total cost of ownership and making software upgrades easier. And many institutions are working on internal efficiency efforts, reducing vendors, streamlining processes, and eliminating redundant efforts.

How are institutions handling the increasing pressure cooker that is AML compliance? Aite Group wants to know your thoughts. We are conducting a survey of financial institution executives to understand your pain points and proposed solutions. Please take 20 minutes to share your thoughts, and in return, we'll share a complimentary copy of the resulting report. This data can be used to compare your efforts to those of your peers as well as to glean new ideas and best practices. All responses will be kept confidential, and no institution names will be mentioned anywhere in the report. You can access the survey here: SURVEY
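To make the "matching capabilities" point concrete, here is a minimal sketch, assuming a toy watchlist and Python's standard-library string matching, of the threshold trade-off at the heart of sanctions screening. It is purely illustrative and not any vendor's actual algorithm; real engines add aliases, transliteration, dates of birth and network analytics.

```python
# Illustrative only: a toy sanctions-screening matcher. Raising the threshold
# reduces false positives; lowering it improves detection. Names are fictitious.
from difflib import SequenceMatcher

WATCHLIST = ["Ivan Petrov", "Global Trading FZE", "Maria de la Cruz"]  # hypothetical entries

def name_similarity(a: str, b: str) -> float:
    """Return a 0-1 similarity score between two names (case-insensitive)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def screen_payment(counterparty: str, threshold: float = 0.85) -> list[tuple[str, float]]:
    """Return watchlist entries whose similarity meets the alerting threshold."""
    scored = [(entry, name_similarity(counterparty, entry)) for entry in WATCHLIST]
    return [(entry, score) for entry, score in scored if score >= threshold]

if __name__ == "__main__":
    for payee in ["Ivan Petroff", "Acme Widgets LLC"]:
        alerts = screen_payment(payee)
        print(payee, "->", alerts if alerts else "no alert")
```

Even in this toy version, the tension the survey asks about is visible: every notch the threshold drops, the alert queue grows.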
At Experian's 2014 Vision conference (#vision2014), Ori Eisen and Matt Ehrlich presented current trends and practices for taking on the growing industrialization of fraud. Together with a risk executive from a leading bank, the team discussed several themes and emerging tactics, including the cost of single-channel fraud prevention strategies, the necessity of a layered security strategy that includes device and identity intelligence, and true real-time, point-of-contact risk scoring.

Data breaches have become all too common, and one of the most concerning issues around breaches is that many consumers' digital identities are based on a single email address or username/password. With stolen identity data in hand, criminals can submit fraudulent mortgage and credit card applications, and even create fake credit cards, in the names of thousands of unsuspecting victims. Regardless of how the data is used, one thing is certain: breaches pose serious dangers to consumers, retailers and financial institutions. The need for customer-friendly fraud management is stronger than ever.

A single layer of protection is simply ineffective, as criminals are increasingly efficient in obtaining consumer identification details and compromising simple access credentials. While mobile technologies and the Internet have enabled consumers to have anytime access to their financial data, these advances also enable criminals to perpetrate fraud. Eisen and Ehrlich discussed how customer-friendly technologies and policies continue to outpace controls and risk management.

Vision 2014: Know Your Enemy - a financial institution's best practices for preventing the latest fraud attacks

Learn more about fraud intelligence products and services from 41st Parameter, a part of Experian. See related content on this topic: 6 Key Predictions on How Data Breach Concerns Will Evolve in 2014
As we discussed in our earlier Heartbleed post, several new vulnerabilities online and in the mobile space are increasing the challenges that security professionals face. Fraud education is a necessity for companies to help mitigate future fraud occurrences, and another critical component when assessing online and mobile fraud is device intelligence. In order to be fraud-ready, there are three areas within device intelligence that companies must understand and address: device recognition, device configuration and device behavior.

Device recognition
Online situational awareness starts with device recognition. In fraudulent activity there are no human users on online sites, only devices claiming to represent them. Companies need to be able to detect high-risk fraud events. A number of analytical capabilities are built on top of device recognition:
Tracking the device's history with the user and evaluating its trust level.
Tracking the device across multiple users and evaluating whether the device is impersonating them.
Maintaining a list of devices previously associated with confirmed fraud.
Correlating seemingly unrelated frauds to a common fraud ring and profiling its method of operation.

Device configuration
The next level of situational awareness is built around the ability to evaluate a device's configuration in order to identify fraudulent access attempts. This analysis should include the following capabilities:
Making sure the configuration is compatible with the user it claims to represent.
Checking for internal inconsistencies suggesting an attempt to deceive.
Reviewing whether there are any indications of malware present.

Device behavior
Finally, online situational awareness should include robust capabilities for profiling a device's behavior, both within individual accounts and across multiple users:
Validating that the device's focus is not on activity types often associated with fraud staging.
Confirming that the timing of the activities does not seem designed to avoid detection rules.

By proactively managing online channel risk and combining device recognition with a powerful risk engine, organizations can uncover and prevent future fraud trends and potential attacks; a simplified sketch of how these signals might be combined follows below. Learn more about Experian fraud intelligence products and services from 41st Parameter, a part of Experian.
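To make the three layers above concrete, here is a minimal sketch of how recognition, configuration and behavior signals might be combined into a single device risk score. The signal names, weights and thresholds are my own illustrative assumptions, not Experian's or 41st Parameter's scoring logic.

```python
# Illustrative sketch: combining device recognition, configuration and behavior
# signals into one risk score. Weights and rules are hypothetical placeholders.
from dataclasses import dataclass, field

@dataclass
class DeviceObservation:
    device_id: str                                   # from device recognition
    users_seen: set = field(default_factory=set)     # accounts this device has accessed
    on_fraud_list: bool = False                      # device previously tied to confirmed fraud
    config_inconsistencies: int = 0                  # e.g. timezone vs. language vs. IP geolocation
    malware_indicators: int = 0
    velocity_per_hour: int = 0                       # behavior: logins/transactions in the last hour

def device_risk_score(obs: DeviceObservation, expected_user: str) -> int:
    """Higher score = riskier. Thresholds are invented for illustration."""
    score = 0
    if obs.on_fraud_list:
        score += 60                               # recognition: known-bad device
    if expected_user not in obs.users_seen and len(obs.users_seen) > 3:
        score += 25                               # recognition: device hopping across identities
    score += 10 * obs.config_inconsistencies      # configuration: signs of deception
    score += 20 * obs.malware_indicators          # configuration: malware present
    if obs.velocity_per_hour > 20:
        score += 30                               # behavior: activity consistent with fraud staging
    return min(score, 100)

if __name__ == "__main__":
    obs = DeviceObservation("dev-123", users_seen={"alice", "bob", "carol", "dave"},
                            config_inconsistencies=2, velocity_per_hour=35)
    print(device_risk_score(obs, expected_user="erin"))  # elevated risk
```

In practice a score like this would feed a broader risk engine alongside identity intelligence rather than drive decisions on its own.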
The discovery of Heartbleed earlier this year uncovered a large-scale threat that exploits a security vulnerability in OpenSSL, posing a serious security concern. This vulnerability gave hackers access to the servers of many websites and put consumers' credentials and private information at risk. Since the discovery, most organizations with an online presence have been trying to determine whether their servers incorporate the affected versions of OpenSSL. However, the impact will be felt even by organizations that do not use OpenSSL, since some consumers reuse the same password across sites and their password may have been compromised elsewhere.

New vulnerabilities online and in the mobile space increase the challenges that security professionals face, and fraud education is a necessity for companies. Our internal fraud experts share their recommendations in the wake of the Heartbleed bug and what companies can do to help mitigate future occurrences. Here are two suggestions on how to prevent compromised credentials from turning into compromised accounts:
Authentication
Adopting a layered security strategy

Authentication
The importance of multidimensional and risk-based authentication cannot be overstated. Experian Decision Analytics and 41st Parameter® recommend a layered approach when it comes to responding to future threats like the recent Heartbleed bug. Such methods include combining comprehensive authentication processes at customer acquisition with proportionate measures to monitor user activities throughout the life cycle. "Risk-based authentication is best defined and implemented in striking a balance between fraud risk mitigation and positive customer experience," said Keir Breitenfeld, Vice President of Fraud Product Management for Experian Decision Analytics. "Attacks such as the recent Heartbleed bug further highlight the foundational requirement of any online business or agency applications to adopt multifactor identity and device authentication and monitoring processes throughout their Customer Life Cycle."

Some new authentication technologies that do not rely on usernames and passwords could be part of the broader solution. This strategic change involves the incorporation of a broader layered-security strategy. Using authentication alone puts security strategists in a difficult position, since they must balance:
Market pressure for convenience (note that some mobile banking applications now provide access to balances and recent transactions without requiring a formal login)
New automated scripts for large-scale account surveillance
The rapidly growing availability of compromised personal information

Layered security
"Layered security through a continuously refined set of 'locks' that immediately identify fraudulent access attempts helps organizations to protect their invaluable customer relationships," said Mike Gross, Global Risk Strategy Director for 41st Parameter. "Top global sites should be extra vigilant for an expected rush of fraud-related activities and social engineering attempts through call centers as fraudsters try to take advantage of an elevated volume of password resets."

By layering security consistently through a continuously refined set of controls, organizations can identify fraudulent access attempts, unapproved contact information changes and suspicious transactions; a rough sketch of such a risk-based decision appears below. Learn more about fraud intelligence products and services from 41st Parameter, a part of Experian.
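As a rough illustration of the "locks" idea, the sketch below shows a risk-based step-up decision in which a correct password is never sufficient by itself for a risky action. The signals, action list and rules are hypothetical assumptions for illustration, not Experian's or 41st Parameter's product logic.

```python
# Illustrative sketch of layered, risk-based authentication: device trust, action
# risk and recent password-reset activity decide whether to allow, step up, or block.
from enum import Enum

class Decision(Enum):
    ALLOW = "allow"
    STEP_UP = "step_up"      # e.g. one-time passcode or out-of-band confirmation
    BLOCK = "block"

def authenticate(password_ok: bool, device_trusted: bool,
                 action: str, recent_password_reset: bool) -> Decision:
    if not password_ok:
        return Decision.BLOCK
    high_risk_actions = {"money_transfer", "change_contact_info", "add_payee"}
    # Layer 2: an untrusted device or a fresh password reset (a common follow-on
    # to credential compromise) forces step-up authentication for risky actions.
    if action in high_risk_actions and (not device_trusted or recent_password_reset):
        return Decision.STEP_UP
    return Decision.ALLOW

if __name__ == "__main__":
    print(authenticate(True, device_trusted=False,
                       action="money_transfer", recent_password_reset=True))
```

The point of the layering is visible in the example call: the stolen-but-correct password alone no longer completes the riskiest actions.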
Both Visa and MasterCard announced their support for Host Card Emulation (HCE) and their intent to release HCE specifications soon. I have been talking about HCE since late 2012 (partly due to my involvement with SimplyTapp), and you can read why HCE matters and what the Android KitKat HCE announcement meant for payments. But in light of the network certification announcements yesterday, this post is an attempt to provide some perspective on what the Visa/MasterCard moves mean, how their approaches differ in certifying payments using cloud-hosted credentials, what issuers should expect from a device and terminal support perspective, why retailers should take note of the debate around HCE and ultimately – the role I expect Google to continue to play around HCE. All good stuff.

First, what do the Visa/MasterCard announcements mean? It means that it's time for banks and other issuers to stop looking for directions. The network announcements around HCE specifications provide the clarity required by issuers to meaningfully invest in mobile contactless provisioning and payment. Further, it removes some of the unfavorable economics inherited from a secure element-centric model, under which issuers were forced to default to credit cards with higher interchange in the wallet. Renting space on the secure element cost a pretty penny, and that is without taking operational costs into consideration; as an issuer, if you are starting in the red out of the gate, you were not about to put a Durbin-controlled debit card in the wallet. But those compulsions go with the wind now, as you are no longer weighed down by these costs and complexities on day one. And further, the door is open for retailers with private label programs or gift cards to also look at this route with a lot more interest. And they are. MasterCard mentioned bank pilots around HCE in its press release, but MCX is hardly the only retailer payment initiative in town. Let me leave it at that.

How do the Visa/MasterCard specs differ? From the press releases, some of those differences are evident – but I believe they will coalesce at some point in the future. MasterCard's approach speaks to mobile contactless as the only payment modality, whereas Visa refers to augmenting the payWave standard with QR and in-app payments in the future. Both approaches refer to payment tokens (single or multi-use), and one can expect them to work together with cloud-provisioned card profiles to secure the payment transaction and verify transactional integrity. To MasterCard's benefit – it has given much thought to ensuring that these steps – provisioning the card profile, issuing payment tokens et al – are invisible to the consumer, and therefore refrains from adding undue friction. I am a purist at heart – and I go back to the first iteration of Google Wallet – where all I had to do to pay was turn on the screen and place the device on the till. That is the simplicity to beat for any issuer or retailer payment experience using contactless. Otherwise, they are better off ripping out the point-of-sale altogether. MasterCard's details also make reference to a PIN. The PIN will not be verified offline, as it would have been had a secure element been present in the device; rather, it will be verified online – which tells me that an incorrect PIN, if input, would be used to create an "incorrect cryptogram" that would be rejected upstream. (A simplified sketch of this cloud-credential pattern appears at the end of this post.)
Now I am conflicted about using a PIN at the point of sale for anything – to me it is but a Band-Aid; it reflects the inability to reduce fraud without introducing friction. Visa so far seems to be intentionally light on details around mandating a PIN, and I believe not forcing one would be the correct approach – you wouldn't want to constrain issuers to entering a PIN as the means of authentication; instead, lay down the requirements but leave it to the market to decide what would suffice – PIN, biometrics et al. Again – I hope these specs will continue to evolve and move towards a more amenable view of customer authentication.

Where do we stand with device and terminal support? All of this is moot if there are not enough devices that support NFC and, specifically, Android KitKat. But if you consider Samsung devices by themselves (which is all one should consider for Android), they control over 30% of the North American market – 44.1 million devices sold in 2013 alone. The lion's share of those devices support NFC out of the box – including the Galaxy Note II and 3, Galaxy S3 and S4 – and their variants mini, Active, Zoom et al. And still, given the disparity in approaches to secure elements and the continuing lack of standards and Android support, Tap and Pay was largely a dream. What is also worrisome is that three months after the launch of Android KitKat, it still struggles under 2% in device distribution. That being said, things are expected to get markedly better, for Samsung devices at least. Samsung has noted that 14 of its newer devices will receive KitKat. These devices include all the NFC phones I have listed above. Carriers must follow through quickly (tongue firmly in cheek) to deliver on this promise before customers with old S3 devices see their contracts expire and move to a competitor (iPhone 6?). Though there was always speculation as to whether an MNO would reject HCE as part of the Android distribution, I see that as highly unlikely. Even carriers know a dead horse when they see one, and Isis's current model is anything but one. Maybe Isis will move to embrace HCE.

And then there is the issue of merchant terminals. When a large block of merchants is invested in upending the role of networks in the payment value chain, that intent ripples far and wide in the payments ecosystem. Though it's a given that merchants of all sizes can expect to re-terminalize in the next couple of years to chip and PIN (with contactless under the hood), it is still the merchant's prerogative whether the contactless capability is left turned on or off. And if merchants follow Best Buy's strategy of opting to turn it off store-wide, that limits the utility of an NFC wallet. And why wouldn't they? Merchants have always viewed "Accept all cards" to also mean "Accept all cards regardless of the form factor," and believe that contactless could come to occupy a higher interchange tier in the future – as questions around fraud risk are sufficiently answered by the device in real time. This fear, though, is largely unsubstantiated, as networks have not indicated that they could come to view mobile contactless as a "Card Present Plus" category that charges more. But in the absence of any real assurances, fear, uncertainty and doubt run rampant. But what could a retailer do with HCE? If re-terminalization is certain, then retailers could do much to explore how to leverage it to close the gap with their customer.
Private-label credit and closed-loop products are viable alternatives that can now be carried over contactless – and if retailers were previously cut out of the equation due to the heavy costs and complexity of provisioning cards to phones, they have none of those limitations now. A merchant could now fold a closed-loop product (like a gift card) into their mobile app – and accept those payments over contactless without resorting to clunky QR or barcode schemes. There is a lot of potential in the closed-loop space with HCE that retailers are ignoring due to a "scorched earth" approach towards contactless. But smarter merchants are asking "how."

Finally, what about Google? Google deserves much praise for finally including HCE in Android and paving the way for brands to recognize the opportunity and certify the approach. That being said, Google has no unique advantage with HCE. In fact, Google has little to do with HCE going forward, despite Google Wallet's utilization of HCE in the future. I would say HCE has as much to do with Google going forward as Amazon's Kindle Fire has to do with Android. Banks and retailers now have to decide what this means for them – view HCE as separate from Google – and embrace it if they believe it has the potential to keep their brands top of wallet, and top of mind, for the consumer. It is a level playing field, finally.

Where do you go next? Indeed – there is a lot to take in – starting with HCE's role, where it fits into your payment strategy, the impact of and differences between the Visa/MasterCard approaches, and weaving all of these into your mobile assets while not compromising on customer experience. Clarity and context are key, and we can help with both. Reach out to us for a conversation. HCE is a means to an end – freeing you from the costs and complexities of leveraging contactless infrastructure to deliver an end-to-end mobile experience – but there is still the question of how your business should evolve to cater to the needs of your customers in the mobile channel. Payment is, after all, just one piece of the puzzle.
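To tie the cloud-credential discussion above together, here is a deliberately simplified sketch of the general pattern: the handset holds only a short-lived, limited-use key provisioned from the issuer's cloud and uses it to compute a per-transaction cryptogram that is verified online. Every class, name and parameter here is hypothetical; this is not the Visa or MasterCard specification, which differs substantially (tokenization formats, EMV cryptograms, key rotation rules).

```python
# Simplified illustration of HCE-style cloud-hosted credentials: no secure element,
# only a limited-use key (LUK) fetched from the issuer cloud, plus an online-verified
# cryptogram. Purely a sketch of the pattern, not a payment network specification.
import hashlib, hmac, os, secrets

class IssuerCloud:
    """Stands in for the issuer's token service; stores the LUKs it provisions."""
    def __init__(self):
        self._luks = {}                               # token -> [key, remaining uses]

    def provision(self, pan: str) -> tuple[str, bytes]:
        token = "tok_" + secrets.token_hex(8)         # payment token replaces the PAN
        key = os.urandom(32)
        self._luks[token] = [key, 3]                  # key good for a few taps only
        return token, key

    def verify(self, token: str, tx_data: bytes, cryptogram: bytes) -> bool:
        key, uses = self._luks.get(token, (None, 0))
        if not key or uses <= 0:
            return False
        expected = hmac.new(key, tx_data, hashlib.sha256).digest()
        if hmac.compare_digest(expected, cryptogram):
            self._luks[token][1] -= 1
            return True
        return False

def phone_pays(token: str, luk: bytes, amount_cents: int) -> tuple[bytes, bytes]:
    """What the HCE app would send over NFC (heavily simplified)."""
    tx_data = f"{token}|{amount_cents}".encode()
    return tx_data, hmac.new(luk, tx_data, hashlib.sha256).digest()

if __name__ == "__main__":
    cloud = IssuerCloud()
    token, luk = cloud.provision("4111111111111111")
    tx, cg = phone_pays(token, luk, 1250)
    print(cloud.verify(token, tx, cg))   # True; a cryptogram built with the wrong key would fail
```

The online-verified cryptogram is also where the online-PIN idea from earlier in the post slots in: a wrong PIN would simply yield a key or cryptogram that the issuer's cloud rejects upstream.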
By: Matt Sifferlen

On January 17th, we celebrated the 308th birthday of one of America's most famous founding fathers, Ben Franklin. I've been a lifelong fan of his after reading his biography while in middle school, and each year when his birthday rolls around I'm inspired to research him a bit more since there is always something new to learn about his many meaningful contributions to this great nation. I find Ben a true inspiration for his capacity for knowledge, investigation, innovation, and of course for his many witty and memorable quotes. I think Ben would have been an exceptional blogger back in his day, raising the bar even higher for Seth Godin (one of my personal favorites) and other uber bloggers of today. And as a product manager, I highly respect Ben's lifelong devotion to improving society by finding practical solutions to complex problems. Upon a closer examination of many of Ben's quotes, I now feel that Ben was also a pioneer in providing useful lessons in commercial fraud prevention. Below is just a small sampling of what I mean.

"An ounce of prevention is worth a pound of cure" - Preventing commercial fraud before it happens is the key to saving your organization's profits and reputation from harmful damage. If you're focused on detecting fraud after the fact, you've already lost.

"By failing to prepare, you are preparing to fail." - Despite the high costs associated with commercial fraud losses, many organizations don't have a process in place to prevent it. This is primarily due to the fact that commercial fraud happens at a much lower frequency than consumer fraud. Are you one of those businesses that thinks "it'll never happen to me"?

"When the well's dry, we know the worth of water." - So you didn't follow the advice of the first two quotes, and now you're feeling the pain and embarrassment that accompanies commercial fraud. Have you learned your lesson yet?

"After crosses and losses, men grow humbler and wiser." - Ah, no lender likes losses. Nothing like a little scar tissue from "bad deals" related to fraud to remind you of decisions and processes that need to be improved in order to avoid history repeating itself.

"Honesty is the best policy." - Lots of businesses stumble on this part, failing to communicate when they've been compromised by fraud or failing to describe the true scope of the damage. Be honest (quickly!) and set expectations about what you're doing to limit the damage and prevent similar instances in the future.

"Life's tragedy is that we get old too soon and wise too late." - Being too late is a big concern when it comes to fraud prevention. It's impossible to prevent 100% of all fraud, but that shouldn't stop you from making sure that you have adequate preventive processes in place at your organization.

"Never leave that till tomorrow which you can do today." - Get a plan together now to deal with fraud scenarios that your business might be exposed to. Data breaches, online fraud and identity theft rates are higher than they've ever been. Shame on those businesses that aren't getting prepared now.

"Beer is living proof that God loves us and wants us to be happy." - I highly doubt Ben actually said this, but some Internet sites attribute it to him. If you already follow all of his advice above, then maybe you can reward yourself with a nice pale ale of your choice!

So Ben can not only be considered the "First American," but he can also be considered one of the first fraud prevention visionaries.
Guess we'll need to add one more thing to his long list of accomplishments!
By: Teri Tassara

In my blog last month, I covered the importance of using quality credit attributes to gain greater accuracy in risk models. Credit attributes are also powerful in strengthening the decision process by providing granular views of consumers based on unique behavioral characteristics. Effective uses include segmentation, overlay to scores and policy definition – across the entire customer lifecycle, from prospecting to collections and recovery.

Overlay to scores – Credit attributes can be used to effectively segment generic scores to arrive at refined "Yes" or "No" decisions. In essence, this is customization without the added time and expense of custom model development. By overlaying attributes on scores, you can further segment the scored population to achieve appreciable lift over and above the use of a score alone.

Segmentation – Once you have made your "Yes" or "No" decision based on a specific score or within a score range, credit attributes can be used to tailor your final decision based on the "who," "what" and "why." For instance, suppose you have two consumers with the same score. Credit attributes will tell you that Consumer A has a total credit limit of $25K and a balance-to-limit (BTL) ratio of 8%, while Consumer B has a total credit limit of $15K but a BTL of 25%. This insight will allow you to determine the best offer for each consumer (a simple sketch of this kind of overlay appears below).

Policy definition – Policy rules can be applied first to get to the desirable universe. For example, an auto lender may have a strict policy against giving credit to anyone with a repossession in the past, regardless of the consumer's current risk score.

High-quality attributes can play a significant role in the overall decision-making process, and their expansive usage across the customer lifecycle adds greater flexibility, which translates into faster speed to market. In today's dynamic market, credit attributes that are continuously aligned with market trends and purposed across various analytical uses are essential to delivering better decisions.
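As a simple illustration of the overlay and policy ideas above, the sketch below splits consumers who share the same score into different treatments. The cutoffs and offer tiers are invented for illustration, not a recommended strategy.

```python
# Illustrative sketch: overlaying credit attributes on a generic risk score.
# Policy rules run first, then the score, then attributes refine the offer.
from dataclasses import dataclass

@dataclass
class Applicant:
    score: int                  # generic bureau or custom risk score
    total_credit_limit: float   # aggregate revolving credit limit
    btl_ratio: float            # balance-to-limit ratio, 0.0 - 1.0
    has_prior_repossession: bool

def decision(a: Applicant) -> str:
    # Policy definition: knockout rules applied before any score is consulted.
    if a.has_prior_repossession:
        return "decline (policy)"
    # Score cut: the baseline "yes/no".
    if a.score < 660:
        return "decline (score)"
    # Attribute overlay: same score, different offer.
    if a.total_credit_limit >= 20_000 and a.btl_ratio <= 0.10:
        return "approve - premium offer"
    if a.btl_ratio <= 0.30:
        return "approve - standard offer"
    return "approve - reduced line"

if __name__ == "__main__":
    consumer_a = Applicant(score=720, total_credit_limit=25_000, btl_ratio=0.08,
                           has_prior_repossession=False)
    consumer_b = Applicant(score=720, total_credit_limit=15_000, btl_ratio=0.25,
                           has_prior_repossession=False)
    print(decision(consumer_a), "|", decision(consumer_b))
```

Consumer A and Consumer B from the example above land in different offer tiers despite identical scores, which is exactly the lift the overlay is meant to capture.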
In the 1970s, it took an average of 18 days before a decision could be made on a credit card application. Credit decisioning has come a long way since then, and today we have the ability to make decisions faster than it takes to ring up a customer in person at the point of sale. Enabling real-time credit decisions helps retail and online merchants lay a platform for customer loyalty while incentivizing an increased customer basket size. While the benefits are clear, customers still are required to be at predetermined endpoints, such as:
At the receiving end of a prescreened credit offer in the mail
At a merchant point of sale applying for retail credit
In front of a personal computer

The trends clearly show that customers are moving away from these predetermined touch points: they find mailed credit offers antiquated, spend even less time at a retail point of sale in favor of shopping online, and are exchanging personal computers for tablets and smartphones. Despite remaining under 6 percent of retail spending, e-commerce sales for Q2 2013 were reportedly up 18.5 percent from Q2 2012, representing the largest year-over-year increase since Q4 2007, before the 2008 financial crisis. Fueled by a shift from personal computers to connected devices and the continuing growth in maturity of e-commerce and m-commerce platforms, this trend is only expected to grow stronger in the future. To reflect this shift, marketers need to be asking themselves how they should apportion their budgets and energies to digital while executing broader marketing strategies that also may include traditional channels. Generally, traditional card acquisition methods have failed to respond to these behavioral shifts, and, as a whole, retail banking was unprepared to handle the disintermediation of traditional products in favor of the convenience mobile offers. Now that the world of banking is finding its feet in the mobile space, accessibility to credit must also adapt to be on the customer's terms, unencumbered by historical notions around customer and credit risk.

Download this white paper to learn how credit and retail private-label issuers can provide an optimal customer experience in emerging channels such as mobile without sacrificing risk mitigation strategies – leading to increased conversions and satisfied customers. It demonstrates strategies employed by credit and retail private-label issuers who have already made the shift from paper and point of sale to digital, and it provides recommendations that can be used as a business case and/or a road map.
By: Zach Smith

On September 13, the Consumer Financial Protection Bureau (CFPB) announced final amendments to the mortgage rules that it issued earlier this year. The CFPB first issued the final mortgage rules in January 2013 and then released subsequent amendments in June. The final amendments also make some additional clarifications and revisions in response to concerns raised by stakeholders. The final modifications announced by the CFPB in September include:

Amending the prohibition on certain servicing activities during the first 120 days of a delinquency to allow the delivery of certain notices required under state law that may provide beneficial information about legal aid, counseling, or other resources.

Detailing the procedures that servicers should follow when they fail to identify or inform a borrower about missing information from loss mitigation applications, as well as revisions to simplify the offer of short-term forbearance plans to borrowers suffering temporary hardships.

Clarifying best practices for informing borrowers about the address for error resolution documents.

Exempting all small creditors, including those not operating predominantly in rural or underserved areas, from the ban on high-cost mortgages featuring balloon payments. This exemption will continue for the next two years while the CFPB re-examines the definitions of "rural" and "underserved."

Explaining the "financing" of credit insurance premiums to make clear that premiums are considered to be "financed" when a lender allows payments to be deferred past the month in which they are due.

Clarifying the circumstances under which a bank's teller or other administrative staff is considered to be a "loan originator" and the instances in which manufactured housing employees may be classified as originators under the rules.

Clarifying and revising the definition of points and fees for purposes of the qualified mortgage cap on points and fees and the high-cost mortgage points and fees threshold.

Revising the effective dates of many loan originator compensation rules from January 10, 2014 to January 1, 2014.

While the industry continues to advocate for an extension of the effective date to provide additional time to implement the necessary compliance requirements, the CFPB insists that both lenders and mortgage servicers have had ample time to comply with the rules. Most recently, in testimony before the House Financial Services Committee, CFPB Director Richard Cordray stated that "most of the institutions have told us that they will be in compliance" and said he didn't foresee further delays.

Related Research
Experian's Global Consulting Practice released a recent white paper, CCAR: Getting to the Real Objective, that suggests how banks, reviewers and examiners can best actively manage CCAR's objectives with a clear dual strategy that includes both short-term and longer-term goals for stress-testing, modeling and system improvements. Download the paper to understand how CCAR is not a redundant set of regulatory compliance exercises; its effects on risk management include some demanding paradigm shifts from traditional approaches. The paper also reviews the macroeconomic facts around the Great Recession, revealing some useful insights for bank extreme-risk scenario development, econometric modeling and stress simulations.
Related Posts
Where Business Models Worked, and Didn't, and Are Most Needed Now in Mortgages
Now That the CFPB Has Arrived, What's First on Its Agenda
Can the CFPB Bring Debt Collection Laws into the 21st Century
TL;DR: Read within as to how Touch ID is made possible via ARM's TrustZone/TEE, and why this matters in the context of Apple's coming identity framework. I also explain why primary/co-processor combos are here to stay. I believe that eventually Touch ID has a payments angle – but one focused on e-commerce before retail. Carriers will weep over a lost opportunity, while through Touch ID we have front-row seats to Apple's enterprise strategy, its payment strategy and, beyond all, the future direction of its computing platform.

I had shared my take on a possible Apple biometric solution in January of this year, based on its AuthenTec acquisition. I came pretty close, except for the suggestion that NFC was likely to be included. (Sigh.) It's a bit early to play fast and loose with Apple predictions, but its AuthenTec acquisition should rear its head sometime in the near future (2013 – considering Apple's manufacturing lead times): a biometric solution packaged neatly with an NFC chip and secure element could address three factors that have held back customer adoption of biometrics: ubiquity of readers; issues around secure local storage and retrieval of biometric data; and standardization in accessing and communicating said data. An on-chip secure solution to store biometric data – in the phone's secure element – can address qualms around a central database of biometric data open to all sorts of malicious attacks. Standard methods to store and retrieve credentials stored in the SE will apply here as well.

Why didn't Apple open up Touch ID to third-party developers? Apple expects a short, bumpy climb ahead for Touch ID before it stabilizes, as early users begin to use it. By keeping its use limited to authenticating to the device and to iTunes, it can tightly control the potential issues as they arise. If Touch ID had launched with third-party apps and were buggy, it's likely that customers would be confused about where to report issues and whom to blame. That's not to say that it won't open up Touch ID outside of Apple. I believe it will provide fettered access based on the type of app and the type of action that follows user authentication. Banking, payment, productivity, social sharing and shopping apps should come first. Your fart apps? Probably never. Apple could also allow users to set their preferences (for app categories, based on the user's current location, etc.) such that biometrics is how one authenticates for transactions that carry risk, and is not required otherwise. If you are at home and buying an app for a buck – don't ask to authenticate. But if you were initiating a money transfer – then you would. Even better – pair biometrics with your PIN for better security. Chip and PIN? So passé.

Digital Signatures, iPads and DRM 2.0: It won't be long before an iPad shows up in the wild sporting Touch ID. And with BlackBerry's much-awaited and celebrated demise in the enterprise, Apple will be waiting on the sidelines – now with capabilities that allow digital signatures to become ubiquitous and simple – on email, contracts or anything worth putting a signature on. Apple has already made its iWork productivity apps (Pages, Numbers, Keynote), iMovie and iPhoto free for new iOS devices activated with iOS 7. Apple, with a core fan base that includes photographers, designers and other creative types, can now further enable iPads and iPhones to become content creation devices, with the ability to attribute any digital content back to its creator by a set of biometric keys.
Imagine a new way to digitally create and sign content, and to share it freely, without worrying about attribution. Further, Apple's existing DRM frameworks are strengthened with the ability to tag digital content that you download with your own set of biometric keys. Forget disallowing the sharing of content – Apple now has a way to create a secondary marketplace for its customers to resell or loan digital content, and drive incremental revenue for itself and content owners.

Conclaves blowing smoke: In a day and age when we forego the device for storing credentials – whether due to convenience or ease of implementation – Apple opted for an on-device answer to where user biometric keys are stored. There is a reason it opted to do so – other than the obvious brouhaha that would have resulted if it had chosen to store these keys in the cloud. Keys inside the device. Signed content in the cloud. Best of both worlds. Biometric keys need to be held locally so that authentication requires no round trip and therefore imposes no latency. Apple would have chosen local storage (ARM's SecurCore) as a matter of customer experience, and because of what would happen if the customer were out of pocket with no internet access. There is also the obvious concern that a centralized biometric keystore would be in the crosshairs of every malicious entity. By decentralizing it, Apple made it infinitely more difficult to scale an attack or a potential vulnerability. More than the A7, the Trojan horse in Apple's announcement was the M7 chip – referred to as the motion co-processor. I believe the M7 chip does more than just measure motion data.

M7 – a security co-processor? I am positing that Apple is using ARM's TrustZone foundation, and it may be using the A7 or the new M7 co-processor for storing these keys and handling the secure backend processing required. Horace Dediu of Asymco had called into question why Apple opted for the M7 and suggested that it may have an as-yet-unstated use. I believe the M7 is not just a motion co-processor; it is also a security co-processor. I am guessing the M7 is based on the Cortex-M series processors and offloads much of this secure backend logic from the primary A7 processor, and it may be that the keys themselves are stored here on the M7. The Cortex-M4 chip has capabilities that sound very similar to what Apple announced around the M7 – a very low-power chip built to integrate sensor output and wake up only when something interesting happens. We should know soon. This type of combo – splitting functions to be offloaded to different cores – allows each core to focus on the function it is supposed to perform. I suspect Android will not be far behind in this kind of adoption, with each core focusing on one or more specific layers of the Android software stack. Back at Google I/O 2013, Google announced three new APIs (the fused location provider) that enable location tracking without the traditional heavy battery consumption. It looks to me like Android decoupled this so that we will see processor cores that focus on these functions specifically – soon. I am fairly confident that Apple has opted for ARM's TrustZone/TEE. Implementation details of TrustZone are proprietary and therefore not public. Apple could have made revisions to the A7 chip spec and co-opted its own. But using TrustZone/TEE and SecurCore allows Apple to adopt existing standards around accessing and communicating biometric data.
Apple is fully aware of the need to mature iOS as a trusted enterprise computing platform – addressing the hardware security platform technology that low-end x86 devices lack. And this is a significant step towards that future.

What does Touch ID mean for payments? Apple's plans for Touch ID kick off with iTunes purchase authorizations. Beyond that, as iTunes continues to grow into a media store behemoth, Touch ID has the potential to drive fraud risk down for Apple – and to further allow it to drive down risk as it batches up payment transactions to reduce interchange exposure. It's quite likely that, à la Walmart, Apple has negotiated rate reductions – but now it can assume more risk on the front end because it is able to vouch for the authenticity of these transactions. As they say – customers can no longer claim the Fifth on those late-night weekend drunken purchase binges. Along with payment aggregation, or via iTunes gift cards, Apple now has another mechanism to reduce its interchange and risk exposure. Now – imagine if Apple were to extend this capability beyond iTunes purchases and allow app developers to process in-app purchases of physical goods or real-world experiences through iTunes in return for better blended rates (instead of PayPal's 4% + $0.30)? Heck, Apple can opt for short-term lending if it is able to effectively answer the question of identity – as it can with Touch ID. It's PayPal's "Bill Me Later" on steroids. Effectively, for a company like Apple – which has seriously toyed with the idea of a software SIM and a "real-time wireless provider marketplace" where carriers bid against each other to provide you voice, messaging and data access for the day, and your phone picks the most optimal carrier – how far is that notion from picking the cheapest rate across networks for funneling your payment transactions? Based on the level of authentication provided or other known attributes – such as merchant type, location, fraud risk, customer payment history – iTunes can select across a variety of payment options to pick the one that is optimal for the app developer and for itself (a toy illustration of this kind of least-cost routing follows below).

And finally, who had the most to lose with Apple's Touch ID? Carriers. I wrote about this before as well; here's what I wrote then (edited for brevity): Does it mean that carriers have no meaningful role to play in commerce? Au contraire. They do. But it's around fraud and authentication. It's around identity. … But they seem to be stuck imitating Google in figuring out a play at the front end of the purchase funnel, to become a consumer brand (Isis). The last thing they want to do is leave it to Apple to figure out the "identity management" question, which the latter seems best equipped to answer by way of scale, the control it exerts in the ecosystem, its vertical integration strategy that allows it to fold biometrics meaningfully into its lineup, and its ability to start with its own services to offer customer value. So there had to have been much "weeping and moaning and gnashing of teeth" on the carrier front with this launch. Carriers have been so focused on carving out a place in payments that they lost track of what's important – that once you have solved authentication, payments is nothing but accounting. I didn't say that. Ross Anderson of Kansas City Fed did.

What about NFC? I don't have a bloody clue. Maybe iPhone 6?

This is a re-post from Cherian's original blog post "Smoke is rising from Apple's Conclave".
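The "pick the optimal rail" idea in the paragraph above can be sketched very simply. The rails, fee structures and risk limits below are invented for illustration; nothing here reflects anything Apple, the networks or PayPal have announced.

```python
# Illustrative sketch of least-cost payment routing: given several funding rails
# with different fees and risk tolerances, pick the cheapest eligible one.
from dataclasses import dataclass

@dataclass
class Rail:
    name: str
    pct_fee: float        # proportional fee
    fixed_fee: float      # per-transaction fixed fee, dollars
    max_risk: int         # highest fraud-risk score this rail will accept

    def cost(self, amount: float) -> float:
        return amount * self.pct_fee + self.fixed_fee

RAILS = [
    Rail("stored_value", 0.000, 0.00, max_risk=20),   # e.g. gift-card balance
    Rail("ach_debit",    0.005, 0.05, max_risk=40),
    Rail("debit_card",   0.010, 0.15, max_risk=70),
    Rail("credit_card",  0.022, 0.25, max_risk=90),
]

def route(amount: float, fraud_risk: int) -> Rail:
    """Choose the cheapest rail whose risk tolerance covers this transaction."""
    eligible = [r for r in RAILS if fraud_risk <= r.max_risk]
    if not eligible:
        raise ValueError("transaction declined: no rail accepts this risk level")
    return min(eligible, key=lambda r: r.cost(amount))

if __name__ == "__main__":
    # A biometric-authenticated purchase scores lower risk, unlocking cheaper rails.
    print(route(9.99, fraud_risk=15).name)    # stored_value
    print(route(9.99, fraud_risk=65).name)    # debit_card
```

The design point is the same one the post makes about authentication: the better the identity assurance, the lower the risk score, and the cheaper the rails that open up.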
By: Matt Sifferlen

I recently read interesting articles on the Knowledge@Wharton and CNNMoney sites covering the land grab that's taking place among financial services startups that are trying to use a consumer's social media activity and data to make lending decisions. Each of these companies is looking at ways to take the mountains of social media data that sites such as Twitter, Facebook, and LinkedIn generate in order to create new and improved algorithms that will help lenders target potential creditworthy individuals. What are they looking at specifically? Some criteria could be:

History of typing in ALL CAPS or all lower case letters
Frequent usage of inappropriate comments
Number of senior-level connections on LinkedIn
The quantity of posts containing cats or annoying self-portraits (aka "selfies")

Okay, I made that last one up. The point is that these companies are scouring through the data that individuals are creating on social sites and trying to find useful ways to slice and dice it in order to evaluate and target consumers better. On the consumer banking side of the house, there are benefits for tracking down individuals for marketing and collections purposes. A simple search could yield a person's Facebook, Twitter, or LinkedIn profile. The behavioral information can then be leveraged as part of more targeted multi-channel and contact strategies. On the commercial banking side, utilizing social site info can help to supplement traditional underwriting practices. Reviewing a company's reviews on Yelp or Angie's List could offer some insight into how a business is perceived and reveal whether there is any meaningful trend in the level of negative feedback being posted or in the potential growth outlook of the company.

There are some challenges involved with leveraging social media data for these purposes:
1. Easily manipulated information
2. Irrelevant information that doesn't represent actual likes, thoughts or relevant behaviors
3. Regulations

From a fraud perspective, most online information can easily and frequently be manipulated, which creates a constantly moving target for these providers to monitor and link to the right customer. Fake Facebook and Twitter pages, false connections and referrals on LinkedIn, and fabricated positive online reviews of a business can all be accomplished in a matter of minutes. And commercial fraudsters are likely creating false business social media accounts today for shelf company fraud schemes that they plan on hatching months or years down the road. As B2B review websites continue to make it easier to get customers signed up to use their services, the downside is that there will be even more unusable information being created, since there are fewer and fewer hurdles for commercial fraudsters to clear, particularly on sites that offer their services for free. For now, the larger lenders are more likely to utilize alternative data sources that are third-party validated, like rent and utility payment histories, while continuing to rely on tools that can protect against fraud schemes. It will be interesting to see what new credit and non-credit data will be utilized as common practice in the future as lenders continue their efforts to find more useful data to power their credit and marketing decisions.
By: Joel Pruis

As we go through the economic seasons, we need to remember to reassess our strategy. While we use data to accurately assess the environment and determine the best course of action for our future strategy, the one thing that is certain is that the current environment will change. Aspects that we did not anticipate will develop, and trends may start to slow or change direction. Moneyball continues to be a movie that gives us some great examples. We see that Billy Beane and Peter Brand were constantly looking at their position and making adjustments to the team's roster. Even before they made any significant adjustments, Beane and Brand found themselves justifying their strategy to the owner (even though the primary issue was the head coach not playing the roster that maximized the team's probability of winning). The first aspect that worked against the strategy was the head coach, and while we could go off on a tangent about cultural battles within an organization, let's focus on how Beane adjusted. Beane simply traded the players the head coach preferred to play, forcing the use of players preferred by Beane and Brand. Later we see Beane and Brand making final adjustments to the roster by negotiating trades, resulting in the Oakland A's landing Ricardo Rincon. The change in the league that allowed such a trade was that Rincon's team was not doing well and the timing allowed the A's to execute the trade. Beane adjusted with the changes in the league. One thing to note is that he changed the roster while the team was doing well. They were winning, but Beane made adjustments to continue maximizing the team's potential. Too often we adjust when things are going poorly and do not adjust when we seem to be hitting our targets. Overall, we need to continually assess what has changed in our environment and determine what new challenges or new opportunities these changes present. I encourage you to regularly assess what is happening in your local economy. High-level national trends are constantly on the front page of the news, but we need to drill down to see what is happening in the specific market area being served. As Billy Beane did with the Oakland A's throughout the season, I challenge you to assess your current strategies and execution against what is happening in your market territory.

Related posts:
How Financial Institutions can assess the overall conditions for generating the net yield on the assets
How to create decision strategies for small business lending
Upcoming Webinar: Learn about the current state of small business, the economy and how it applies to you
If you're looking to implement and deploy a knowledge-based authentication (KBA) solution in the application process for your online and mobile customer acquisition channels – then I have good news for you! Here's some of the upside you'll see right away:

Revenues (remember, the primary activities of your business?) will accelerate
Your B2C acceptance or approval rates will go up through automation
Manual review of customer applications will go down, which translates to a reduction in your business operation costs
Products will be sold and shipped faster if you're in the retail business, so you can recognize the sales revenue or net sales more quickly
Your customers will appreciate the fact that they can do business in minutes versus going through a lengthy application approval process with turnaround times of days to weeks
And last but not least, your losses due to fraud will go down

To keep you informed about what's relevant when choosing a KBA vendor, here's what separates the good KBA providers from the bad:

The underlying data used to create questions should come from multiple data sources and should vary in the type of data – for example, credit and non-credit
Relying on public record data sources alone is becoming a risky proposition, given the recent adoption of social media websites and public record websites
Technology that will allow you to create a custom KBA setup that is unique to your business and business customers, and a proven support structure to help you grow your business safely
Consulting (performance monitoring) and analytical support that will keep you ahead of the fraudsters trying to game your online environment by ensuring your KBA tool is performing at optimal levels
Solutions that can easily interface with multiple systems and assist from a customer experience perspective

How are your peers in the following three industries doing at adopting a KBA strategy to help grow and protect their businesses?

E-commerce
21% use KBA today and are satisfied with the results*
13% have KBA on the roadmap, and the list is growing fast*

Healthcare
20% use dynamic KBA*

Financial institutions
30% use a combination of dynamic and static KBA*
20% use dynamic KBA*

What are the typical uses of KBA?*
Call center
Web / mobile verification
Enrollment ID verification
Provider authentication
Eligibility

*According to a 2012 report on knowledge-based authentication by Aite Group LLC

Knowledge-based authentication, commonly referred to as KBA, is a method of authentication that seeks to prove the identity of someone accessing a service, such as a website. As the name suggests, KBA requires knowledge of the individual's personal information to grant access to the protected material. There are two types of KBA: "static KBA," which is based on a pre-agreed set of "shared secrets," and "dynamic KBA," which is based on questions generated from a wider base of personal information (the sketch below shows a toy version of how such questions might be assembled).
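For readers new to the mechanics, here is a toy sketch of how a dynamic KBA challenge might be assembled: questions generated on the fly from several data sources, each presented with decoy answers. The data, question bank and pass rule are fabricated for illustration and are not how any particular vendor builds questions.

```python
# Illustrative dynamic-KBA sketch: questions are generated from multiple data
# sources (credit and non-credit), each with decoys mixed in. All data is fake.
import random

# Hypothetical consumer record assembled from multiple sources.
CONSUMER = {
    "prior_street": "Maple Ave",          # address history (non-credit)
    "auto_lender": "First Example Bank",  # tradeline data (credit)
    "county_of_record": "Marion",         # public-record-style data
}

QUESTION_BANK = {
    "prior_street": ("Which street have you previously lived on?",
                     ["Oak St", "Birch Rd", "Cedar Ln"]),
    "auto_lender": ("Which lender financed your most recent auto loan?",
                    ["Acme Credit Union", "Sample Finance Co", "None of the above"]),
    "county_of_record": ("In which county have you owned property?",
                         ["Hamilton", "Boone", "Hendricks"]),
}

def generate_quiz(consumer: dict, num_questions: int = 2) -> list[dict]:
    """Build a randomized multiple-choice quiz from whatever data is available."""
    fields = random.sample(list(consumer), k=min(num_questions, len(consumer)))
    quiz = []
    for field in fields:
        prompt, decoys = QUESTION_BANK[field]
        options = decoys + [consumer[field]]
        random.shuffle(options)
        quiz.append({"prompt": prompt, "options": options, "answer": consumer[field]})
    return quiz

def score_quiz(quiz: list[dict], responses: list[str]) -> bool:
    """Pass only if every question is answered correctly (a deliberately simple policy)."""
    return all(q["answer"] == r for q, r in zip(quiz, responses))

if __name__ == "__main__":
    quiz = generate_quiz(CONSUMER)
    for q in quiz:
        print(q["prompt"], q["options"])
    print("pass:", score_quiz(quiz, [q["answer"] for q in quiz]))
```

The vendor criteria listed above map directly onto this toy: the more varied the underlying sources and the better the performance monitoring, the harder the quiz is for a fraudster to game.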
There are two core fundamentals of evaluating loan loss performance to consider when generating organic portfolio growth through the setting of customer lending limits, neither of which can be discussed without first considering what defines a "customer."

Definition of a customer
The approach used to define a customer is critical for successful customer management and is directly correlated to how joint accounts are managed. Definitions may vary by how joint accounts are allocated and used in risk evaluation. It is important to acknowledge:
Legal restrictions on data usage related to joint account holders throughout the relationship
The impact on predictive model performance and reporting where there are two financially linked individuals with differently assigned exposures
The complexities of multiple relationships with customers within the same household – consumer and small business

Typical customer definitions used by financial services organizations:
Checking account holders: This definition groups together accounts that are "fed" by the same checking account. If an individual holds two checking accounts, then she will be treated as two different and unique customers.
Physical persons: Joint accounts are allocated to each individual. If Mr. Jones has sole accounts and holds joint accounts with Ms. Smith, who also has sole accounts, the joint accounts would be allocated to both Mr. Jones and Ms. Smith.
Consistent entities: If Mr. Jones has sole accounts and holds joint accounts with Ms. Smith, who also has sole accounts, then three "customers" are defined: Jones, Jones & Smith, and Smith.
Financially-linked individuals: Whereas consistent entities are considered three separate customers, financially-linked individuals would be considered one customer: "Mr. Jones & Ms. Smith."
When multiple and complex relationships exist, taking a pragmatic approach and defining your customers as financially linked will lead to a better evaluation of predicted loan performance.

Evaluation of credit and default risk
Most financial institutions calculate a loan default probability on a periodic (monthly) basis for existing loans, in the form of either a custom behavior score or a generic risk score supplied by a credit bureau. For new loan requests, financial institutions often calculate an application risk score, sometimes used in conjunction with a credit bureau score, often in a matrix-based decision. This approach is challenging for new credit requests where the presence and nature of an existing relationship is not factored into the decision. In most cases, customers with existing relationships are treated in an identical manner to new applicants with no relationship – the power and value of the organization's internal data goes overlooked, and customer satisfaction and profits suffer as a result. One way to overcome this challenge is to use a Strength of Relationship (SOR) indicator.

Strength of Relationship (SOR) indicator
The Strength of Relationship (SOR) indicator is a single-digit value used to define the nature of the customer's relationship with the financial institution.
Traditional approaches to the assignment of a SOR are based upon the following factors:
Existence of a primary banking relationship (salary deposits)
Number of transactional products held (DDA, credit cards)
Volume of transactions
Number of loan products held
Length of time with the bank

The SOR has a critical role in the calculation of customer-level risk grades and strategies and is used to point us to the data that will be the most predictive for each customer. Typically, the stronger the relationship, the more we know about our customer, and the more robust our predictive models of consumer behavior will be. The more information we have on our customer, the more our models will lean towards internal data as the primary source. For weaker relationships, internal data alone may not be robust enough to be used to calculate customer-level limits, and there will be a greater dependency on augmenting internal data with external third-party data (credit bureau attributes). As such, the SOR can be used as a tool to select the type and frequency of external data purchases.

Customer Risk Grade (CRG)
A customer-level risk grade or behavior score is a periodic (monthly) statistical assessment of the default risk of an existing customer. This probability rests on the assumption that past performance is the best possible indicator of future performance. The predictive model is calibrated to provide the probability (or odds) that an individual will incur a "default" on one or more of their accounts. The customer risk grade requires a common definition of a customer across the enterprise. This is required to establish a methodology for treating joint accounts. A unique customer reference number is assigned to those customers defined as "financially-linked individuals." Account behavior is aggregated on a monthly basis, and this information is subsequently combined with information from savings accounts and third-party sources to formulate our customer view. Using historical customer information, the behavior score can accurately differentiate between good and bad credit risk individuals. The behavior score is often translated into a Customer Risk Grade (CRG). The purpose of the CRG is to simplify the behavior score for operational purposes, making it easier for non-credit/risk individuals to interpret a grade than a mathematical probability. (A hedged sketch of the customer grouping and SOR ideas follows below.)

Different methods for evaluating credit risk will yield different results, and an important aspect of setting customer exposure thresholds is the ability to perform analytical tests of different strategies in a controlled environment. In my next post, I'll dive deeper into adaptive control, champion/challenger techniques and strategy design fundamentals.

Related content: White paper: Improving decisions across the Customer Life Cycle
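To make the "financially-linked individuals" definition and the SOR's role concrete, here is a hedged sketch under two stated assumptions: joint account holders sharing any account are merged into one customer (a simple union-find), and a crude SOR level then steers how heavily internal behavior data is weighted against bureau data. The factor weights and blend values are illustrative, not a recommended calibration.

```python
# Illustrative sketch: (1) group joint account holders into financially-linked
# customers with a union-find; (2) assign a simple Strength of Relationship (SOR)
# level that steers the blend of internal vs. bureau data. Values are made up.

def build_customers(accounts: list[tuple[str, ...]]) -> dict[str, int]:
    """Each account lists its holders; holders sharing any account are linked."""
    parent: dict[str, str] = {}

    def find(x: str) -> str:
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    def union(a: str, b: str) -> None:
        parent[find(a)] = find(b)

    for holders in accounts:
        for h in holders:
            find(h)                          # register every holder, even sole ones
        for h in holders[1:]:
            union(holders[0], h)
    ids: dict[str, int] = {}
    groups: dict[str, int] = {}
    for person in parent:
        root = find(person)
        ids.setdefault(root, len(ids) + 1)
        groups[person] = ids[root]           # unique customer reference number
    return groups

def sor_level(salary_deposits: bool, transactional_products: int,
              loan_products: int, years_with_bank: float) -> int:
    """1 (weak) to 5 (strong); a crude stand-in for a real SOR assignment."""
    level = 1
    level += 1 if salary_deposits else 0
    level += 1 if transactional_products >= 2 else 0
    level += 1 if loan_products >= 1 else 0
    level += 1 if years_with_bank >= 3 else 0
    return level

def internal_data_weight(sor: int) -> float:
    """Stronger relationships lean more on internal behavior data than bureau data."""
    return {1: 0.2, 2: 0.35, 3: 0.5, 4: 0.7, 5: 0.85}[sor]

if __name__ == "__main__":
    accounts = [("Jones",), ("Jones", "Smith"), ("Smith",), ("Lee",)]
    print(build_customers(accounts))        # Jones & Smith share one customer number
    print(internal_data_weight(sor_level(True, 3, 1, 5.0)))
```

The grouping step is what lets a single customer reference number sit behind the monthly behavior-score aggregation described above, and the SOR-driven weight is one simple way to decide when bureau attributes need to be purchased to supplement thin internal data.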