# At A Glance
Paragraph Block: Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry’s standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.


## Heading 2
### Heading 3
#### Heading 4
##### Heading 5
- This is a list
- Item 1
- Item 2
  - Sub list
  - Sub list 2
  - Sub list 3
- More list
- More list 2
- More list 3
  - More more
  - More more
> This is the pull quote block: Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry’s standard dummy text ever since the 1500s.
>
> Experian (this is the citation)

| Table element | Table element | Table element |
| --- | --- | --- |
| my table | my table | my table |
| Table element | Table element | Table element |

Media Text Block
Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry’s standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.
##### My Small H5 Title


As E-Government customer demand and opportunity increase, regulatory requirements and associated guidance will become more standardized and uniformly adopted. Regardless of credentialing techniques and ongoing access management, all enrollment processes must continue to be founded in accurate and, most importantly, predictive risk-based authentication. Such authentication tools must be able to evolve as new technologies and data assets become available, as compliance requirements and guidance become more defined, and as specific fraud threats align with various access channels and unique customer segments.

A risk-based fraud detection system allows institutions to make customer relationship and transactional decisions based not on a handful of rules or conditions in isolation, but on a holistic view of a customer’s identity and the predicted likelihood of associated identity theft. To implement efficient and appropriate risk-based authentication procedures, comprehensive and broadly categorized data assets must be combined with targeted analytics and consistent decisioning policies to achieve a measurably effective balance between fraud detection and positive identity-proofing results. The inherent value of a risk-based approach to authentication lies in the ability to strike that balance not only in the current environment, but as that environment and its underlying forces shift.

The National Institute of Standards and Technology, in Special Publication 800-63, defines electronic authentication (e-authentication) as “the process of establishing confidence in user identities electronically presented to an information system”. Since, as stated in publication 800-63, “individuals are enrolled and undergo an identity proofing process in which their identity is bound to an authentication secret, called a token”, it is imperative that identity proofing be founded in an approach that generates confidence in the authentication process. Experian believes a risk-based approach, one that can separate valid from invalid identities using a combination of data and proven quantitative techniques, is best. Because “individuals are remotely authenticated to systems and applications over an open network, using a token in an authentication protocol”, the enrollment processes that drive the ultimate provision of tokens must be implemented with an eye toward identity risk, not simply a series of checks against one or more third-party data assets. If the “keys to the kingdom” are housed in the ongoing use of tokens provided by Credential Service Providers (CSPs) and the binding of credentials to those tokens, trusted Registration Authorities (RAs) must employ highly predictive identity-proofing techniques designed to segment true, low-risk identities from identities that may have been manipulated or fabricated, or that in true form are subject to fraudulent use, abuse, or victimization.

Many compliance-oriented authentication requirements (e.g., the USA PATRIOT Act, the FACTA Red Flags Rule) and the processes built around them hinge upon identity-element (e.g., name, address, Social Security number, phone number) validation and verification checks. Without minimizing the importance of performing such checks, the purpose of a more risk-based approach to authentication is to leverage other data sources and quantitative techniques to further assess the probability of fraudulent behavior.
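To make the banded, risk-based decisioning described above concrete, here is a minimal sketch in Python. It is purely illustrative, not Experian’s methodology or any NIST-specified algorithm: the element checks, the `risk_score` input, and the accept/refer/decline thresholds are all hypothetical stand-ins for a statistically derived model and a tuned decision policy.

```python
from dataclasses import dataclass

@dataclass
class IdentityInput:
    name_matches: bool      # name validated against a data source
    address_matches: bool   # address verified
    ssn_valid: bool         # SSN format/issuance check
    phone_matches: bool     # phone verified
    risk_score: float       # model-based identity risk, 0 (low) to 1 (high)

def element_checks_pass(ident: IdentityInput) -> bool:
    # Compliance-style validation/verification checks (Red Flags-style
    # element checks). Passing these alone does not establish low risk.
    return all([ident.name_matches, ident.address_matches,
                ident.ssn_valid, ident.phone_matches])

def enrollment_decision(ident: IdentityInput,
                        accept_below: float = 0.2,
                        refer_below: float = 0.6) -> str:
    """Banded, risk-based decision: element checks plus a holistic score."""
    if not element_checks_pass(ident):
        return "refer"                 # failed element checks: manual review
    if ident.risk_score < accept_below:
        return "accept"                # low predicted identity-theft risk
    if ident.risk_score < refer_below:
        return "refer"                 # moderate risk: step-up or review
    return "decline"                   # high predicted risk

# Example: every element check passes, yet the holistic score flags risk.
applicant = IdentityInput(True, True, True, True, risk_score=0.72)
print(enrollment_decision(applicant))  # -> "decline"
```

The point the sketch captures is the one made above: an identity can pass every element-level check and still be declined, because the decision rests on the holistic risk score rather than on any single rule in isolation.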

Experian recently contributed to a TSYS whitepaper focused on the various threats associated with first-party fraud. I think the paper does a good job of summarizing the problem, and it points out some very important strategies that can be employed both to help prevent first-party fraud losses and to detect those already in an institution’s active and collections account populations. I’d urge you to have a look at the paper as you begin asking the right questions within your own organization.

The bad news is that first-party fraud may currently account for up to 20 percent of credit charge-offs. The good news is that scoring models (using a combination of credit attributes and identity-element analysis) targeted at first-party fraud schemes such as Bust Out, Never Pay, and even Synthetic Identity are quite effective in all phases of the customer lifecycle. Appropriate implementation of these models, usually involving coordinated decisioning strategies across both fraud and credit policies, can stem many losses either at account acquisition, or at least early enough in the account management stage to substantially reduce average fraud balances. The key is to prevent these accounts from ending up in collections queues, where they will never have any chance of actually being collected upon.

A traditional customer information program and identity theft prevention program (associated, for example, with the Red Flags Rule) will often fail to identify first-party fraud, as these are founded in identity-element verification and validation, checks that often ‘pass’ when applied to first-party fraudsters.
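As an illustration of the kind of scoring and coordinated decisioning described above, consider the following sketch. The features, weights, and thresholds are invented for this example; a production first-party fraud model (for Bust Out, Never Pay, or Synthetic Identity schemes) would be statistically derived, not hand-weighted.

```python
# Hypothetical first-party fraud score: a weighted combination of credit
# attributes and identity-element signals. These weights are illustrative.
WEIGHTS = {
    "recent_inquiries": 0.30,        # many recent credit inquiries
    "rapid_utilization_rise": 0.35,  # balances ramping toward limits
    "thin_file": 0.15,               # short or sparse credit history
    "identity_inconsistency": 0.20,  # elements that validate individually
                                     # but do not tie together (a synthetic
                                     # identity pattern)
}

def first_party_fraud_score(signals: dict) -> float:
    """Return a 0..1 score; signals map feature name -> 0..1 severity."""
    return sum(WEIGHTS[k] * signals.get(k, 0.0) for k in WEIGHTS)

def lifecycle_action(score: float, stage: str) -> str:
    # Coordinated strategy across fraud and credit policy: decline or
    # review at acquisition; cut exposure early in account management so
    # the account never reaches an uncollectable collections queue.
    if stage == "acquisition":
        return ("decline" if score >= 0.6
                else "review" if score >= 0.35 else "approve")
    return ("freeze_line" if score >= 0.6
            else "monitor" if score >= 0.35 else "no_action")

s = first_party_fraud_score({"recent_inquiries": 0.8,
                             "rapid_utilization_rise": 0.9,
                             "identity_inconsistency": 0.4})
print(round(s, 2), lifecycle_action(s, "account_management"))
```

Note how the same score feeds different actions by lifecycle stage, which is the coordinated fraud-and-credit decisioning the paragraph above describes.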

By: Wendy Greenawalt

Large financial institutions have acknowledged for some time that taking a consumer-centric rather than product-centric approach can be a successful strategy for an organization. Implementing such a strategy can be difficult, however, because organizations inherently want to promote a specific product for one reason or another. With the current economic unrest, organizations are looking for ways to improve loyalty among their most profitable, lowest-risk customers. They are also looking for ways to improve offers to consumers to provide segment-of-one decisioning while still satisfying organizational goals.

Customer management, and specifically cross-sell or up-sell strategy, is a great example of where organizations can implement what I call “segment of one” decisioning. In essence, this refers to identifying the best possible decision or outcome for a specific consumer given multiple offers, scenarios, and objectives. Marketers strive to identify the strategies that make each decision as effective as possible while minimizing costs. For many, this takes the form of models and complex strategy trees or spreadsheets that identify the ideal offering for a segment of consumers. While this approach is effective, algorithm-based decisioning processes exist that can help organizations identify optimal strategies while considering all possible options at the individual-consumer level. By leveraging an optimization tool, organizations can expand the decision process to consider every variable and every alternative, finding the most cost-effective, most-likely-to-succeed strategies. By optimizing decisions, marketers can determine the ideal offer while quantifying the ROI and adhering to budgetary or other campaign constraints.

Many organizations are once again focusing on account growth and building strategies to implement in the near future. With a limited pool of qualified candidates and increased competition, it is more important than ever that each consumer offer be the best one available, to increase response rates, achieve portfolio growth goals, and build a profitable portfolio.
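The “segment of one” idea above reduces to a constrained assignment problem: pick exactly one offer per consumer to maximize total expected value without exceeding the campaign budget. The toy sketch below solves a three-consumer example by brute force; all values, costs, and offer names are invented, and a real optimization tool would handle this at scale with linear or integer programming and many more constraints.

```python
from itertools import product

# Hypothetical per-consumer estimates: expected value and cost per offer.
# A "segment of one" strategy picks the best offer for each consumer
# while honoring portfolio-level constraints (here, a single budget).
offers = ["balance_transfer", "credit_line_increase", "rewards_upgrade"]
expected_value = {   # consumer -> offer -> expected profit if made
    "c1": {"balance_transfer": 120, "credit_line_increase": 90,  "rewards_upgrade": 40},
    "c2": {"balance_transfer": 30,  "credit_line_increase": 110, "rewards_upgrade": 70},
    "c3": {"balance_transfer": 80,  "credit_line_increase": 60,  "rewards_upgrade": 95},
}
cost = {"balance_transfer": 50, "credit_line_increase": 40, "rewards_upgrade": 25}
BUDGET = 100

consumers = list(expected_value)
best_value, best_plan = float("-inf"), None
# Exhaustive search: every assignment of one offer per consumer.
for plan in product(offers, repeat=len(consumers)):
    total_cost = sum(cost[o] for o in plan)
    if total_cost > BUDGET:
        continue  # violates the campaign budget constraint
    value = sum(expected_value[c][o] for c, o in zip(consumers, plan))
    if value > best_value:
        best_value, best_plan = value, dict(zip(consumers, plan))

print(best_plan, "expected value:", best_value)
```

Even in this tiny example the budget constraint changes the answer: the highest-value offer for every consumer individually would exceed the budget, so the optimizer trades one consumer down to a cheaper offer to maximize the total, which is exactly the portfolio-versus-individual balance described above.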
In this article…
It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.


