In my previous three postings, I've covered the basic principles that can define a risk-based authentication process, the associated value propositions, and some best practices to consider. Finally, I'd like to briefly discuss some emerging informational elements and processes that enhance (or have already enhanced) the notion of risk-based authentication in the coming year. For simplicity, I'm boiling these down to three categories:

1. Enterprise Risk Management – As you'd imagine, this concept involves creating a real-time, cross-channel, enterprise-wide (cross-business-unit) view of a consumer and/or transaction. That sounds pretty good, right? Well, the challenge has been, and still remains, the cost of developing and implementing a data sharing and aggregation process that can accomplish this task. There is little doubt that operating in a more siloed environment limits the amount of available high-risk and/or positive authentication data associated with a consumer, and therefore limits the predictive value of tools that use such data. It is only a matter of time before we see more widespread implementation of systems designed to look at a single transaction, an initial application profile, previous authentication results, or other relationships a consumer may have within the same organization, and across all of this information in tandem. It is simply a matter of the business case to do so, and the resources to carry it out.

2. Additional Intelligence – Beyond the data mentioned above, additional informational elements emerging as useful in isolation (or, even better, as one factor among others in a holistic assessment of a consumer's identity and risk profile) include IP address vs. physical address comparisons, device ID or fingerprinting, and biometrics such as voice verification. While these tools are being used and tested in many organizations and markets, there is still work to be done to strike the right balance as they are incorporated into an overall risk-based authentication process. False positives, cost, and implementation challenges still keep these tools from widespread use; that should change over time, and quickly, given the need to manage the cost of credit risk. (A rough sketch of combining such signals appears after this list.)

3. Emerging Verification Techniques – Out-of-band authentication is the use of two separate channels, used simultaneously, to authenticate a customer: for example, using a phone call to verify a person's identity while they perform a Web transaction. Similarly, many institutions are finding success in initiating SMS texts as a means of customer notification and/or verification of monetary or non-monetary transactions. The ability to reach a consumer in a channel other than their transaction channel is a customer-friendly and cost-effective way to perform additional due diligence.
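To make the "holistic assessment" idea above concrete, here is a minimal, hypothetical sketch of combining a few such signals (IP-vs-address distance, device recognition, prior authentication results, cross-channel alerts) into a single risk-based authentication decision. The signal names, weights, and thresholds are illustrative assumptions, not a description of any specific product or scoring model.

```python
from dataclasses import dataclass

@dataclass
class AuthSignals:
    ip_to_address_km: float      # distance between IP geolocation and the address on file
    device_recognized: bool      # device ID / fingerprint previously seen for this consumer
    prior_auth_failures: int     # failed authentication attempts across channels
    cross_channel_alerts: int    # high-risk flags from other business units

def risk_score(s: AuthSignals) -> float:
    """Toy weighted score in [0, 1]; weights are illustrative only."""
    score = 0.0
    score += 0.35 * min(s.ip_to_address_km / 500.0, 1.0)   # far-away IP raises risk
    score += 0.25 * (0.0 if s.device_recognized else 1.0)  # unknown device raises risk
    score += 0.20 * min(s.prior_auth_failures / 3.0, 1.0)
    score += 0.20 * min(s.cross_channel_alerts / 2.0, 1.0)
    return score

def decide(s: AuthSignals, step_up_at: float = 0.4, review_at: float = 0.8) -> str:
    """Route to a frictionless pass, an out-of-band step-up (e.g. SMS), or manual review."""
    r = risk_score(s)
    if r >= review_at:
        return "manual_review"
    if r >= step_up_at:
        return "step_up_out_of_band"   # e.g. one-time passcode sent on a separate channel
    return "pass"

if __name__ == "__main__":
    print(decide(AuthSignals(ip_to_address_km=850, device_recognized=False,
                             prior_auth_failures=1, cross_channel_alerts=0)))
```

In practice the weights and cutoffs would come from validated models and business rules; the point of the sketch is only that several independent signals feed one decision, and that the step-up path is the out-of-band verification described in item 3.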

By: Kennis Wong

In Part 1 of Generic fraud score, we emphasized the importance of a risk-based approach when it comes to fraud detection. Here are some further questions you may want to consider.

What is the performance window? When a model is built, it has a defined performance window, meaning the score predicts a certain outcome within that time period. For example, a traditional risk score may predict accounts that deteriorate within twenty-four months; that score may not perform well if your population typically worsens within two months. This question is particularly important when it comes to scoring your existing accounts. For example, if a bust-out score has a performance window of three months and you score your accounts only at the time of acquisition, it would only catch accounts that bust out within the next three months. As a result, you should score your accounts during periodic account reviews, in addition to the time of acquisition, to ensure you catch all bust-outs.

Which accounts should I score? While it is typical for creditors to use a fraud score on every applicant at the time of acquisition, they may not score all of their accounts during review. For example, they may exclude inactive accounts or older accounts, assuming that a long history means a lower likelihood of fraud. This mistake can be expensive. Typical bust-out behavior is for fraudsters to apply for cards well before they intend to bust out, sometimes forty-eight months or more in advance. So just when you think they are good and profitable customers, they can strike and leave you with a serious loss. As a result, the recommended approach is to score your entire portfolio during account review.

How often do I validate the score? The answer is very often, whether monthly or quarterly. You want to understand whether the score is working for you: do your actual results match the volume and risk projections? Shifts in your score distribution will almost certainly occur over time, so to meet your objectives over the long run, continue to monitor performance, keep the underlying fraud data current, and adjust cutoffs as needed. A rough sketch of this kind of monitoring appears below.
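As an illustration of the distribution-shift monitoring described above, here is a minimal sketch that compares the current portfolio's score distribution against the distribution at model development using a population stability index. The column names, bin construction, and the PSI technique itself are assumptions for illustration, not something the article prescribes.

```python
import numpy as np

def psi(expected_scores, actual_scores, bins=10):
    """Population Stability Index between a baseline and a current score distribution.
    Common rule of thumb: < 0.1 stable, 0.1-0.25 moderate shift, > 0.25 major shift."""
    # Bin edges taken from the baseline (model-development) distribution.
    edges = np.percentile(expected_scores, np.linspace(0, 100, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf
    exp_pct = np.histogram(expected_scores, edges)[0] / len(expected_scores)
    act_pct = np.histogram(actual_scores, edges)[0] / len(actual_scores)
    # Guard against empty bins before taking logs.
    exp_pct = np.clip(exp_pct, 1e-6, None)
    act_pct = np.clip(act_pct, 1e-6, None)
    return float(np.sum((act_pct - exp_pct) * np.log(act_pct / exp_pct)))

# Example: compare this quarter's account-review scores against the development sample.
rng = np.random.default_rng(0)
development_scores = rng.normal(600, 50, 10_000)   # baseline from model build (synthetic)
current_scores = rng.normal(585, 55, 10_000)       # latest periodic account review (synthetic)
print(f"PSI = {psi(development_scores, current_scores):.3f}")  # a large value suggests revisiting cutoffs
```

The same periodic job that rescores the full portfolio (including older and inactive accounts) can emit this statistic, so cutoff adjustments are driven by observed shift rather than by calendar alone.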

When reviewing offers for prospective clients, lenders often deal with a significant amount of missing information in assessing the outcomes of lending decisions: Why did a consumer accept an offer with a competitor? What were the differentiating factors between other offers and my offer? What happened to the consumers we declined? Do they perform as expected, or better than anticipated?

While lenders can easily understand the implications of the loans they have offered and booked with consumers, they often have little information about two important groups of consumers:

1. Lost leads: consumers to whom they made an offer but did not book.
2. Proxy performance: consumers to whom financing was not offered, but who found financing elsewhere.

Performing a lost lead analysis on the applications approved and declined can provide considerable insight into the outcomes and credit performance of consumers who were not added to the lender's portfolio. Lost lead analysis can also help answer key questions for each of these groups: How many of these consumers accepted credit elsewhere, and with whom did they ultimately originate the loan? What were their credit attributes, and what are the credit characteristics of the consumers we are not booking? Were these loans booked by one of my peers or by another type of lender? What were the terms and conditions of these offers? What was the performance of the loans booked elsewhere?

Within each of these groups, further analysis can be conducted to provide lenders with actionable feedback on the implications of their lending policies, possibly identifying opportunities for changes that better fulfill lending objectives. Some key questions can be answered with this information: Are competitors offering longer repayment terms? Are peers offering lower interest rates to the same consumers? Are peers accepting lower-scoring consumers to increase market share? (A rough sketch of this kind of comparison appears below.)

The results of a lost lead analysis can either confirm that the competitive marketplace is behaving in a manner that matches the lender's perspective, or shine a light on aspects of the market where policy changes may lead to superior results. In either case, the information provided is invaluable in making the best decisions in today's highly sensitive lending environment.
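To make those comparison questions concrete, here is a minimal, hypothetical sketch of a lost lead comparison, assuming application records have already been joined to data on where each loan was ultimately booked. The table layout, column names (our_apr, booked_by, booked_term, and so on), and the tiny sample values are illustrative assumptions, not a description of any specific data product.

```python
import pandas as pd

# Hypothetical extract: one row per approved application, joined to where the
# loan was ultimately booked (our book, a peer, or another type of lender).
apps = pd.DataFrame({
    "app_id":      [1, 2, 3, 4, 5, 6],
    "our_apr":     [7.9, 9.5, 11.2, 8.4, 12.0, 10.1],
    "our_term":    [60, 60, 48, 72, 48, 60],
    "score":       [720, 684, 655, 701, 640, 668],
    "booked_by":   ["us", "peer", "peer", "us", "other_lender", "peer"],
    "booked_apr":  [7.9, 8.9, 10.5, 8.4, 13.5, 9.8],
    "booked_term": [60, 72, 60, 72, 48, 72],
})

lost = apps[apps["booked_by"] != "us"]  # the lost leads

# Are competitors offering longer terms or lower rates to the same consumers?
summary = lost.groupby("booked_by").agg(
    n=("app_id", "count"),
    avg_score=("score", "mean"),
    avg_rate_gap=("booked_apr", lambda s: (lost.loc[s.index, "our_apr"] - s).mean()),
    avg_term_gap=("booked_term", lambda s: (s - lost.loc[s.index, "our_term"]).mean()),
)
print(summary)  # positive rate gap: they undercut our APR; positive term gap: longer repayment terms
```

The same grouping can be repeated by score band or product type to see whether losses are concentrated among lower-scoring consumers, which speaks directly to the market-share question above.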