Financial Services


Well, here we are about two weeks from the Federal Trade Commission’s June 1, 2010 Red Flags Rule enforcement date. While this date has been a bit of a moving target for the past year or so, I believe this one will stick. It appears that the new reality is one in which individual trade associations and advocacy groups will, one by one, seek relief from enforcement and related penalties post-June 1. Here’s why I say that: the American Bar Association has already filed suit against the FTC, and in October 2009, the U.S. District Court for the District of Columbia ruled that the Red Flags Rule is not applicable to attorneys engaged in the practice of law. While an appeal of this case is still pending, in mid-March the same court issued another order declaring that the FTC should postpone enforcement of the Red Flags Rule “with respect to members of the American Institute of Certified Public Accountants” engaged in practice until 90 days after the U.S. Court of Appeals for the District of Columbia renders an opinion in the American Bar Association’s case against the FTC. Slippery slope here. Is this what we can expect for the foreseeable future? A rather ambiguous guideline that leaves openings for specific categories of “covered entities” to seek exemption? The seemingly innocuous element of the definition of “creditor” that includes “businesses or organizations that regularly defer payment for goods or services or provide goods or services and bill customers later” is causing havoc among peripheral industries like healthcare and other professional services. Those of you in banking are locked in for sure, but it ought to be an interesting year as the outliers fight to make sense of it all while they figure out what their identity theft prevention programs should or shouldn’t be.

Published: May 13, 2010 by Keir Breitenfeld

By: Kari Michel Credit quality deteriorated across the credit spectrum during the recession that began in December 2007. As the recession winds down, lenders must start strategically assessing credit risk and targeting creditworthy consumer segments for lending opportunities, while avoiding those segments where consumer credit quality could continue to slip. Studies and analyses by VantageScore Solutions, LLC demonstrate that there are more than 60 million creditworthy borrowers in the United States – 7 million of whom cannot be identified using standard scoring models. Methods that leverage VantageScore® in conjunction with consumer credit behaviors can effectively identify profitable opportunities, as well as segments that require increased risk mitigation, thus optimizing decisions. VantageScore Solutions examined how consumers’ credit scores changed over a 12-month period. The study focused on three types of consumer behavior:

Stable: consumers who stay within the same credit tier for one year
Improving: consumers who move to a higher credit tier in any quarter and remain at the higher tier for the remainder of the timeframe
Deteriorating: consumers who move to a lower credit tier in any quarter and remain at the lower tier for the remainder of the timeframe

A segmentation approach that combines these three credit behaviors with credit quality tiers yields a clearer picture of profitable segments for acquisitions and existing account management strategies. Download the white paper, “Finding creditworthy consumers in a changing economic climate,” for more information from VantageScore Solutions. Lenders can apply a similar segmentation analysis to their own populations to identify pockets of opportunity, move beyond recession-based management strategies, and intelligently re-enter originations to maximize portfolio profitability.
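The three behavior categories above can be sketched as a simple classification over quarterly tier observations. This is an illustrative reconstruction, not VantageScore’s methodology; the tier letters and the `classify_behavior` helper are assumptions.

```python
# Hypothetical sketch: classify a consumer as Stable, Improving, or
# Deteriorating from four quarterly credit-tier observations.
# Tier letters follow the VantageScore letter-grade convention (A = best).

TIER_RANK = {"A": 5, "B": 4, "C": 3, "D": 2, "F": 1}

def classify_behavior(quarterly_tiers):
    """Return 'Stable', 'Improving', or 'Deteriorating' for one consumer."""
    ranks = [TIER_RANK[t] for t in quarterly_tiers]
    start = ranks[0]
    if all(r == start for r in ranks):
        return "Stable"
    # Improving: moved to a higher tier in some quarter and ended above the start
    if any(r > start for r in ranks) and ranks[-1] > start:
        return "Improving"
    # Deteriorating: moved to a lower tier and ended below the start
    if any(r < start for r in ranks) and ranks[-1] < start:
        return "Deteriorating"
    return "Mixed"  # moved both ways; a real study would handle this separately
```

Running the same classification across a portfolio gives the tier-migration counts that drive the segmentation described above.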

Published: May 13, 2010 by Guest Contributor

By: Wendy Greenawalt The auto industry has been hit hard by this Great Recession. Recently, some good news has emerged from the captive lenders, and the industry is beginning to rebound from the business challenges it has faced over the last few years. As such, many lenders are looking for ways to improve risk management and strategically grow their portfolios as the US economy begins to recover. Due to the economic decline, the pool of qualified consumers has shrunk, and competition for the best consumers has significantly increased. As a result, approval terms at the consumer level need to be more robust to increase loan origination and booking rates for new consumers. Leveraging optimized decisions is one way lenders can address regional pricing pressure to improve conversion rates within specific geographies. Specifically, lenders can perform a deep analysis of specific competitors – such as captives, credit unions and banks – to determine whether approved loans are being lost to specific competitor segments. Once the analysis is complete, auto lenders can leverage optimization software to create robust pricing, loan amount and term strategies to compete effectively within specific geographic regions and grow profitable portfolio segments. Optimization software uses a mathematical decisioning approach to identify the ideal consumer-level decision that maximizes organizational goals while honoring defined constraints. The consumer-level decisions can then be converted into a decision tree that can be deployed into current decisioning strategies to improve profitability and meet key business objectives over time.
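The consumer-level decisioning described above can be illustrated with a toy grid search. Everything here – the rate and term grids, the `take_probability` response model, the payment constraint – is a made-up sketch of the general idea, not the vendor’s software, which solves this at portfolio scale with far richer models.

```python
# Toy consumer-level decision optimization: enumerate candidate APR / term /
# loan-amount offers and keep the one that maximizes expected profit subject
# to a competitive monthly-payment constraint. All numbers are illustrative.
import itertools

APRS = [0.049, 0.059, 0.069, 0.079]   # candidate annual rates
TERMS = [48, 60, 72]                  # months
AMOUNTS = [20_000, 25_000]            # loan amounts

def take_probability(apr, term):
    # Assumed response model: lower rates and longer terms book more often.
    return max(0.0, 0.9 - 8 * apr + 0.002 * (term - 48))

def expected_profit(apr, term, amount):
    margin = (apr - 0.03) * amount * term / 12  # 3% assumed cost of funds
    return take_probability(apr, term) * margin

def best_offer(max_payment=450):
    """Pick the highest expected-profit offer whose payment stays competitive."""
    best, best_val = None, float("-inf")
    for apr, term, amount in itertools.product(APRS, TERMS, AMOUNTS):
        # Standard amortized monthly payment
        monthly = amount * (apr / 12) / (1 - (1 + apr / 12) ** -term)
        if monthly > max_payment:  # constraint: stay under the payment cap
            continue
        val = expected_profit(apr, term, amount)
        if val > best_val:
            best, best_val = (apr, term, amount), val
    return best, best_val
```

The winning (APR, term, amount) tuple is the kind of consumer-level decision that would then be generalized into a deployable decision tree.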

Published: May 10, 2010 by Guest Contributor

By: Staci Baker With the shift in the economy, it has become increasingly difficult to gauge – in advance – what a consumer is going to do when it comes to buying an automobile. However, there are tools available that allow auto lenders to gain insight into auto loans/leases that were approved but did not book, and to assess the credit risk of their consumers. By gaining competitive insight and improving risk management, an auto lender can positively impact loan origination strategies by determining the proper loan or lease term and finance offer, and by proactively addressing each unique market and risk segment. As the economy starts to rebound, the auto industry needs to take a more proactive approach to the way its members acquire business; the days of business-as-usual are gone. All other factors being equal, if one auto dealer is extending 60-month loans per its norm and the dealer down the road is extending 72-month loans, a consumer may choose the longer loan period to help conserve cash for other items. This is one scenario for which auto dealers could leverage Experian’s Auto Prospect Intelligence(SM). By performing a thorough analysis of approved loans that booked with other auto lenders, and their corresponding terms, auto lenders get a clear picture of whom they are losing loans to. This information allows an organization to compare account terms within a specific peer group or institution type (captive/bank/credit union) and address discrepancies by creating more robust pricing structures and enhanced loan terms, resulting in strategic portfolio growth.

Published: May 7, 2010 by Guest Contributor

Since 2007, when the housing and credit crises started to unfold, we’ve seen unemployment rates continue to rise (9.7% in March 2010*), with very few indicators that they will return to levels that indicate a healthy economy any time soon. I’ve also found myself reading about the hardship and challenge that people are facing in today’s economy, and the question of creditworthiness keeps coming to mind, especially as it relates to a consumer’s employment, or lack thereof. Specifically, I can’t help but sense that there is a segment of the unemployed that will soon possess a better risk profile than someone who has remained employed throughout this crisis. In times of consistent economic performance, the static state does not create the broad range of unique circumstances that comes with sharp growth or decline. For instance, strategic default is one circumstance where the capacity to pay has not been harmed, but the borrower defaults on the commitment anyway. Strategic defaults are rare in a stable market. In contrast, many unemployed individuals who have encountered unfortunate circumstances and are now out of work may have repayment issues today, but possess highly desirable character traits (willingness to pay) that enhance their long-term desirability as borrowers. Although credit score trends, credit risk modeling and credit attributes are essential in assessing the risk within these different borrower groups, I think new risk models and lending policies will need to adjust to account for the growing number of individuals who might be exceptions to current policies. Will character start to count for more than a steady job? Perhaps. This change in lending policy may, in turn, allow lenders to uncover new and untapped opportunities for growth in segments they wouldn’t traditionally serve. * Source: US Department of Labor. http://www.bls.gov/bls/unemployment.htm

Published: April 29, 2010 by Kelly Kent

A common request for information we receive pertains to shifts in credit score trends. While broader changes in consumer migration are well documented – increases in foreclosure and default have negatively impacted scores for a group of consumers – little analysis exists on the more granular changes between score tiers. For this blog, I conducted a brief analysis of consumers who held at least one mortgage and viewed the changes in their score tier distributions over the past three years to see if there was more to be learned from a closer look. I found the results quite interesting. As you can see from the chart below, the shifts within different VantageScore tiers show two major phases. First, the changes from 2007 to 2008 reflect a decline in the number of consumers in VantageScore tiers B, C, and D, and an increase in the number of consumers in VantageScore F. This is consistent with the housing crisis and economic issues of that time. Also notable at this time is the increase in the VantageScore A proportion: loan origination trends show that lenders continued to supply credit to these consumers in this period, and the number of consumers considered ‘super-prime’ grew. The second phase occurs between 2008 and 2010, where there is a period of stabilization for many of the middle-tier consumers, but a dramatic decline in the number of previously growing super-prime consumers. The chart shows the decline in the proportion of this high-scoring tier and the resulting growth of the next-highest tier, which inherited many of the downward-shifting consumers. I find this analysis intriguing because it highlights the recent patterns within the super-prime and prime population and adds new perspective to the management of risk across the score ranges, not just the problematic subprime population that has garnered so much attention. As for the true causes of this change – is unemployment to blame, or declining housing prices?
Obviously, a deeper study into the changes at the top of the score range is necessary to assess the true credit risk, but what is clear is that changes are not consistent across the score spectrum and further analyses must consider the uniqueness of each consumer.

Published: April 27, 2010 by Kelly Kent

By: Wendy Greenawalt Optimization has become somewhat of a buzzword lately, used to solve all sorts of problems. This got me thinking about what optimizing decisions really means to me. In pondering the question, I decided to start at the beginning and think about what optimization really stands for. For me, it is an unbiased mathematical way to determine the most advantageous solution to a problem given all the options and variables. In its simplest form, optimization is a tool that synthesizes data and can be applied to everyday problems, such as determining the best route to take when running errands. Everyone is pressed for time these days, and finding a few extra minutes – or a few extra dollars in our bank account at the end of the month – is appealing. The first step in determining my ideal route was to identify the different route options, including toll roads, and to factor in the total miles driven, travel time and cost associated with each option. In addition, I incorporated limitations: required stops, avoid Main Street, don’t visit the grocery store before lunch, and be back home as quickly as possible. Optimization takes all of these limitations and objectives and simultaneously compares all possible combinations and outcomes to determine the ideal option for maximizing a goal – in this case, being home as quickly as possible. While this is by its nature a very simple example, optimizing decisions can be applied at home and in business in very imaginative and effective ways. Business is catching on, and optimization is finding its way into more and more organizations to save time and money and provide a competitive advantage. I encourage all of you to think about optimization in a new way and explore the opportunities where it can be applied to improve on business-as-usual – and your quality of life.
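As a sketch, the errand-routing problem above maps to a tiny brute-force optimization: enumerate the orderings of stops, drop those that violate a constraint, and keep the fastest. The travel times and stop names are invented for illustration.

```python
# Toy route optimization: start and end at home, visit every required stop,
# honor a constraint (don't hit the grocery store first), minimize total time.
from itertools import permutations

TRAVEL_MIN = {  # symmetric travel times in minutes between locations (made up)
    ("home", "bank"): 10, ("home", "grocery"): 15, ("home", "post_office"): 12,
    ("bank", "grocery"): 8, ("bank", "post_office"): 6,
    ("grocery", "post_office"): 9,
}

def leg_time(a, b):
    return TRAVEL_MIN.get((a, b)) or TRAVEL_MIN[(b, a)]

def best_route(stops, first_stop_not="grocery"):
    """Try every ordering of stops; keep the fastest route that obeys the rule."""
    best, best_time = None, float("inf")
    for order in permutations(stops):
        if order[0] == first_stop_not:  # constraint: no grocery store first
            continue
        route = ("home",) + order + ("home",)
        total = sum(leg_time(a, b) for a, b in zip(route, route[1:]))
        if total < best_time:
            best, best_time = route, total
    return best, best_time
```

With more stops this brute force explodes combinatorially, which is exactly why business-scale optimization relies on mathematical solvers rather than enumeration.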

Published: April 20, 2010 by Guest Contributor

I received a call on my cell phone the other day. It was my bank calling because a transaction outside of my normal behavior pattern tripped a flag in their fraud models. “Hello!” said the friendly, automated voice, “I’m calling from [bank name] and we need to talk to you about some unusual transaction activity on your account, but before we do, I need to make sure Monica Bellflower has answered the phone. We need to ask you a few questions for security reasons to protect your account. Please hold on a moment.” At this point, the IVR (Interactive Voice Response) system invoked a Knowledge Based Authentication session that the IVR controlled. The IVR, not a call center representative, asked me the Knowledge Based Authentication questions and confirmed the answers with me. When the session was completed, I had been authenticated, and the friendly, automated voice thanked me before launching into the list of transactions to be reviewed. Only when I questioned a transaction was I transferred – immediately, with no hold time – to a human fraud account management specialist. The entire process was seamless and as smooth as butter. Using IVR technology is not new, but using an IVR to control a Knowledge Based Authentication session is one way of controlling operational expenses: it reduces the number of human agents required while increasing the return on investment in both the Knowledge Based Authentication tool and the IVR solution. From a risk management standpoint, the use of decisioning strategies and fraud models allows for the objective review of a customer’s transactions while employing fraud best practices. After all, an IVR never hinted at an answer or helped a customer pass Knowledge Based Authentication, and an IVR didn’t get hired into a call center for the purpose of committing fraud. These technologies lend themselves well to fraud alerts and identity theft prevention programs, and also to account management activities.
Experian has successfully integrated Knowledge Based Authentication with IVR as part of relationship management and/or risk management solutions. To learn more, visit the Experian website at https://www.experian.com/decision-analytics/fraud-detection.html?cat1=fraud-management&cat2=detect-and-reduce-fraud. Trust me, Knowledge Based Authentication with IVR is only the beginning. However, the rest will have to wait; right now my high-tech, automated refrigerator is calling to tell me I’m out of butter.
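In skeleton form, the IVR-driven session described above might look like the following. Question generation, telephony and scoring are all stubbed out; the sample questions, answers and pass threshold are assumptions for illustration, not Experian’s implementation.

```python
# Minimal KBA session sketch: the IVR asks dynamically generated questions
# and authenticates the caller if enough answers match.
def kba_session(questions, answers_given, pass_threshold=0.75):
    """questions: list of (prompt, correct_answer) pairs.
       Returns True when the share of correct answers meets the threshold."""
    correct = sum(
        1 for (_, expected), given in zip(questions, answers_given)
        if given.strip().lower() == expected.strip().lower()
    )
    return correct / len(questions) >= pass_threshold

# Hypothetical out-of-wallet questions generated for one caller
questions = [
    ("Which county have you lived in?", "orange"),
    ("Which of these streets is closest to your home?", "elm"),
    ("What is the approximate range of your mortgage payment?", "1500-2000"),
    ("Which lender holds your auto loan?", "none of the above"),
]
```

On a pass the IVR proceeds to transaction review; on a fail it would route the caller to a fraud specialist rather than simply retrying.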

Published: April 20, 2010 by Monica Pearson

By: Ken Pruett I want to touch a bit on some of the third-party fraud scenarios that are often top of mind with our customers: identity theft, synthetic identities, and account takeover.

Identity Theft

Identity theft usually occurs during the acquisition stage of the customer life cycle. Simply put, identity theft is the use of stolen identity information to fraudulently open a new account. These accounts do not have to be credit card related; for example, there are instances of people using others’ identities to open wireless phone and utility accounts. Recent fraud trends show this type of fraud is on the rise again after a decrease over the past several years. A recent Experian study found that people with better credit scores are more likely to have their identities stolen than those with very poor credit scores. It does seem logical that fraudsters would opt to steal an identity from someone with higher credit limits and available purchasing power. This type of fraud gets the majority of media attention because it is the consumer who is often the victim (as opposed to a major corporation). Fraud changes over time, and recent findings show that looking at data from a historical perspective is a good way to help prevent identity theft. For example, if you see a phone number being used by multiple parties, this could be an indicator of a fraud ring in action. Using these types of data elements can make your fraud models much more predictive and reduce your fraud referral rates.

Synthetic Identities

Synthetic identities are another acquisition fraud problem. This fraud is similar to identity theft, but the information used is fictitious in nature. The perpetrator may take pieces of information from a variety of parties to create a new identity. Trade lines may be purchased from companies that act as middlemen between consumers with good credit and perpetrators who are creating new identities.
This strategy allows the perpetrator to quickly create a fictitious identity that looks like a real person with an active and good credit history. Most of the trade lines will be for authorized users only. The perpetrator opens a variety of accounts in a short period of time using those trade lines, and when creditors try to collect, they can’t find the account owners – because the account owners never existed. As Heather Grover mentioned in her blog, this fraud has leveled off in some areas and even decreased in others, but it is probably still worth keeping an eye on. One particular concern is that these identities are sometimes used for bust-out fraud. The best approach to predicting this type of fraud is strong fraud models that incorporate a variety of non-credit and credit variables in the model development process. These models look beyond basic validation and verification of identity elements (such as name, address, and Social Security number) by leveraging additional attributes associated with a holistic identity – such as inconsistent use of those identity elements.

Account Takeover

Account takeover fraud occurs during the account management period of the customer life cycle, when an individual uses any of a variety of methods to take over another person’s account. This may be accomplished by changing online passwords, changing an address or even adding oneself as an authorized user of a credit card. Some customers have tools in place to try to prevent this, but social networking sites are making it easier to obtain many consumers’ personal information. For example, a person may be asked to provide the name of their high school as the answer to a challenge question before gaining access to a banking account.
Today, that piece of information is often readily available on social networking sites, making it easier for fraud perpetrators to defeat these types of tools. It may be more effective to use out-of-wallet, knowledge-based authentication challenge tools that dynamically generate questions based on credit or public record data to prevent this type of fraud.
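The historical-data signal mentioned in the identity-theft discussion above – one phone number used by multiple parties – can be sketched as a simple grouping check over application records. The field names and the threshold are illustrative assumptions.

```python
# Sketch of a fraud-ring indicator: flag phone numbers that appear on
# applications from more than `max_identities` distinct identities.
from collections import defaultdict

def shared_phone_flags(applications, max_identities=2):
    """applications: list of dicts with 'phone' and 'ssn' keys.
       Returns {phone: sorted list of identities} for suspect numbers."""
    identities_by_phone = defaultdict(set)
    for app in applications:
        identities_by_phone[app["phone"]].add(app["ssn"])
    return {
        phone: sorted(ids)
        for phone, ids in identities_by_phone.items()
        if len(ids) > max_identities
    }
```

In practice an attribute like this would feed a fraud model as one of many historical-velocity variables rather than act as a standalone rule.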

Published: April 5, 2010 by Guest Contributor

By: Wendy Greenawalt In my last few blogs, I have discussed how optimization can be leveraged to make improved decisions across an organization while considering the impact that optimizing decisions has on organizational profits, costs and other business metrics. In this entry, I would like to discuss how optimization can improve decisions at the point of acquisition while minimizing costs. Determining the right account terms at inception is increasingly important due to recent regulatory legislation such as the Credit CARD Act. Doing so plays a role in assessing credit risk, managing relationships, and increasing share of wallet. These regulations have established guidelines specific to consumer age, verification of income, teaser rates and interest rate increases. Complying with them will require changes to existing processes and the creation of new toolsets to ensure organizations adhere to the guidelines. The new regulations will not only increase the costs associated with acquiring new customers, but also affect long-term revenue and value, as changes in account terms will have to be carefully considered. The cost of on-boarding and servicing individual accounts continues to escalate while internal resources remain flat. As a result, organizations of all sizes are looking for ways to improve efficiency and decisions while minimizing costs. Optimizing decisions is an ideal solution to this problem. Optimized strategy trees (trees that optimize decisioning strategies) can be easily implemented into current processes to ensure lending decisions adhere to organizational revenue, growth or cost objectives as well as regulatory requirements. Optimized strategy trees enable organizations to create executable strategies that provide ongoing decisions based on optimization conducted at the consumer level.
Optimized strategy trees outperform manually created trees because they are built using sophisticated mathematical analysis that ensures organizational objectives are met. In addition, an organization can quantify the expected ROI of decisioning strategies and validate strategies before implementation. This type of insight is not available without sophisticated optimization software. By implementing optimized strategy trees, organizations can minimize the volume of accounts that must be manually reviewed, which lowers resource costs. In addition, account terms are determined based on organizational priorities, leading to increased revenue, retention and profitability.
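To illustrate the idea of converting consumer-level optimized decisions into an executable tree, here is a deliberately minimal sketch: bucket consumers by score band and adopt the decision the optimizer chose most often in each band. Real optimized strategy trees use many more variables and formal tree-induction methods; the score bands and helper names here are assumptions.

```python
# Collapse per-consumer optimized decisions into a one-level "strategy tree"
# keyed by score band, so the strategy can be deployed in a rules engine.
from collections import Counter

BANDS = [(300, 579), (580, 669), (670, 739), (740, 850)]  # illustrative bands

def band_of(score):
    for lo, hi in BANDS:
        if lo <= score <= hi:
            return (lo, hi)
    raise ValueError(f"score out of range: {score}")

def build_strategy_tree(optimized_decisions):
    """optimized_decisions: list of (score, decision) pairs from the optimizer.
       Returns {band: majority decision} for every band that has consumers."""
    votes = {band: Counter() for band in BANDS}
    for score, decision in optimized_decisions:
        votes[band_of(score)][decision] += 1
    return {band: c.most_common(1)[0][0] for band, c in votes.items() if c}
```

The resulting band-to-decision map is the deployable artifact; the expensive consumer-level optimization runs offline and only the tree goes into production.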

Published: April 5, 2010 by Guest Contributor

By: Wendy Greenawalt Financial institutions have placed very little focus on portfolio growth over the last few years. Recent market updates have provided little guidance on the future of the marketplace, but there seems to be a consensus that the US economic recovery will be slow compared to previous recessions. The latest economic indicators show that slow employment growth, continued property value fluctuations and lower consumer confidence will continue to influence the demand for, and issuance of, new credit. The positive aspect is that most analysts agree these indicators will improve over the next 12 to 24 months. Because of this, lenders should start thinking about updating acquisition strategies now and consider new tools that can help them reach their short- and long-term portfolio growth goals. Most financial institutions have experienced high account delinquency levels in the past few years, and these delinquencies have had a major impact on consumer credit scores. The bad news is that the pool of qualified candidates continues to shrink, so competition for the best consumers will only increase over the next few years. Identifying target populations and improving response and booking rates will be a challenge for some time, so marketers must create smarter, more tailored offers to remain competitive and strategically grow their portfolios. Recently, new scores have been developed that, combined with consumer credit data, estimate consumer income and debt ratios. This data can be very valuable and, when combined with optimization, can power robust acquisition strategies. Specifically, optimizing decisions allows an organization to feed product offerings, contact methods, timing and known consumer preferences, as well as organizational goals such as response rates, consumer-level profitability and product-specific growth metrics, into a software application.
The optimization software then uses a proven mathematical technique to identify the ideal product offering and timing to meet or exceed the defined organizational goals. The consumer-level decisions can then be executed via normal channels such as mail, email or call centers. Not only does optimization software reduce campaign development time, it also allows marketers to quantify the effectiveness of marketing campaigns – before execution. Today, optimization technology makes decision analytics accessible to organizations of almost any size and can provide an improvement over business-as-usual decisioning strategies. If your organization is looking for new tools to incorporate into existing acquisition processes, I encourage you to consider optimization and the value it can bring to your organization.
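The campaign problem described above can be approximated with a simple greedy sketch: rank each consumer’s best offer by expected profit per contact dollar and spend the budget top-down. Production optimization solves this jointly and exactly; the channel costs and data shapes here are invented for illustration.

```python
# Greedy approximation of offer/channel assignment under a contact budget.
CHANNEL_COST = {"mail": 0.60, "email": 0.05, "call": 4.00}  # assumed costs

def plan_campaign(consumers, budget):
    """consumers: list of dicts like
       {'id': 1, 'offers': {('card', 'mail'): 12.0, ('card', 'email'): 7.5}}
       where values are expected profit per (product, channel).
       Returns ({consumer_id: (product, channel)}, total spend)."""
    # Best option per consumer, ranked by profit per dollar of contact cost.
    candidates = []
    for c in consumers:
        (product, channel), profit = max(c["offers"].items(), key=lambda kv: kv[1])
        cost = CHANNEL_COST[channel]
        candidates.append((profit / cost, cost, c["id"], product, channel))
    plan, spent = {}, 0.0
    for _, cost, cid, product, channel in sorted(candidates, reverse=True):
        if spent + cost <= budget:  # contact-budget constraint
            plan[cid] = (product, channel)
            spent += cost
    return plan, spent
```

Because expected profits are computed before any mail drops, the planner can also report projected campaign effectiveness up front, which mirrors the pre-execution quantification described above.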

Published: April 1, 2010 by Guest Contributor

There seem to be two viewpoints in the market today about Knowledge Based Authentication (KBA): one positive, one negative. Depending on the corner you choose, you probably view it either as a tool to help reduce identity theft and minimize fraud losses, or as a deficiency in the management of risk and the root of all evil. The opinions on both sides are pretty strong, and biases “for” and “against” run pretty deep. One of the biggest challenges in discussing Knowledge Based Authentication as part of an organization’s identity theft prevention program is the perpetual confusion between dynamic out-of-wallet questions and static “secret” questions. At this point, most people in the industry agree that static secret questions offer little consumer protection. Answers are easily guessed or easily researched, and if the questions are preference based (like “what is your favorite book?”) there is a good chance the consumer will fail the authentication session because they forgot the answers or the answers changed over time. Dynamic Knowledge Based Authentication, on the other hand, presents questions that were not selected by the consumer. Questions are generated from information known about the consumer – things the true consumer would know and a fraudster most likely wouldn’t. The questions posed during Knowledge Based Authentication sessions aren’t designed to “trick” anyone but a fraudster, and a best-in-class product should offer a number of features and options that allow for flexible configuration and deployment at multiple points of the consumer life cycle without impacting the consumer experience. The two are as different as night and day. Do those who consider “secret questions” to be Knowledge Based Authentication consider the password portion of the username-and-password process to be KBA as well?
If you want to hold to strict logic and definition, one could argue that a password meets the definition of Knowledge Based Authentication, but common sense and practical use cause us to differentiate it – which is exactly what we should do with secret questions: differentiate them from true KBA. KBA can provide strong authentication or be part of a multifactor authentication environment without a negative impact on the consumer experience. So, for the record, when we say KBA we mean dynamic, out-of-wallet questions – the kind that are generated “on the fly” and delivered to a consumer via “pop quiz” in a real-time environment – and we think this kind of KBA does work. As part of a risk management strategy, KBA has a place within the authentication framework as a component of risk-based authentication… and risk-based authentication is what it is really all about.

Published: March 5, 2010 by Monica Pearson

Knowledge Based Authentication is always about the underlying data – or at least it should be, whenever a client is selecting questions to use. The strength of Knowledge Based Authentication questions will depend, in large part, on the strength and reliability of the data. After all, if you are going to depend on Knowledge Based Authentication as part of your risk management and decisioning strategy, the data had better be accurate. I’ve heard it said within the industry that clients only want a system that works, and have no interest in where the data originates. Personally, I think that opinion is wrong. I think it is closer to the truth to say there are those who would prefer that clients didn’t know where the data supporting their fraud models and Knowledge Based Authentication questions originates – and those people “encourage” clients not to ask. It isn’t a secret that many within the industry use public record data as the primary source for their Knowledge Based Authentication products, but it’s important to consider just how accessible that public record information is. Think about that for a minute. If a vendor can build questions on public record data, can a fraudster find the answers in public record data via an online search? Using Knowledge Based Authentication for fraud account management is a delicate balance between customer experience/relationship management and risk management. Because it is so important, we believe in research – reading the research of well-known and respected groups like Pew, Tower and Javelin, and doing our own. Based on our research, I know consumers prefer questions that are appropriate and relevant to their activity. In other words, if the consumer is engaged in a credit-granting activity, it may be less appropriate to ask questions centered on personal associations and relatives.
Questions should be difficult for the fraudster, but not difficult or perceived as inappropriate or intrusive by the true consumer.  Additionally, I think questions should be applicable to many clients and many consumers.  The question set should use a mix of data sources: public, proprietary, non-credit, credit (if permissible purpose exists) and innovative. Is it appropriate to have in-depth data discussions with clients about each data source?  Debatable.  Is it appropriate to ensure that each client has an understanding of the questions they ask as part of Knowledge Based Authentication and where the data that supports those questions originates?  Absolutely.    

Published: March 2, 2010 by Monica Pearson

By: Kari Michel What is Basel II? Basel II is the international convergence of capital measurement and capital standards. It is a revised framework and the second iteration of an international standard of banking regulation. The purpose of Basel II is to create an international standard that banking regulators can use when creating regulations about how much capital banks need to set aside to guard against the types of financial and operational risk banks face. Basel II ultimately implements standards to assist in maintaining a healthy financial system. The business challenge: the Basel II framework compels supervisors to ensure that banks implement credit rating techniques that represent their particular risk profile. Beyond calculating the risk inputs – Probability of Default (PD), Loss Given Default (LGD) and Exposure at Default (EAD) – the final Basel accord includes the “use test”: the requirement that a firm use its advanced approach more widely in its business, not merely for the calculation of regulatory capital. As a result, many financial institutions must make considerable changes in their approach to risk management (i.e., infrastructure, systems, processes and data requirements). Experian is a leading provider of risk management solutions – products and services for the new Basel Capital Accord (Basel II). Experian’s approach includes consultancy, software and analytics tailored to meet the lender’s Basel II requirements.
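The three risk inputs named above combine into expected loss, EL = PD × LGD × EAD, the quantity that feeds the Basel II capital calculation. A minimal illustration with made-up numbers:

```python
# Expected loss under Basel II: EL = PD x LGD x EAD
def expected_loss(pd, lgd, ead):
    """pd: probability of default (0-1), lgd: loss given default (fraction of
       exposure lost), ead: exposure at default (currency units)."""
    return pd * lgd * ead

# e.g. a loan with a 2% PD, 45% LGD, and $100,000 exposure
el = expected_loss(0.02, 0.45, 100_000)  # roughly $900 of expected loss
```

Expected loss is only the starting point; the advanced approaches then layer unexpected-loss and capital formulas on top of these same three inputs.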

Published: February 26, 2010 by Guest Contributor

A recent January 29, 2010 article in the Wall Street Journal* discussing the repurchasing of loans by banks from Fannie Mae and Freddie Mac included a simple yet compelling statement that I feel is worth further analysis. The article stated that “while growth in subprime defaults is slowing, defaults on prime loans are accelerating.” I think this statement might come as a surprise to those who feel that prime and super-prime consumers – many of whom are highly sought-after in today’s credit market – enjoy some amount of credit risk and economic immunity. To support this statement, I reference a few statistics from the Experian-Oliver Wyman Market Intelligence Reports:

• From Q1 2007 to Q1 2008, 30+ DPD mortgage delinquency rates for VantageScore A and B consumers remained flat (actually down 2%), while near-prime, subprime, and deep-subprime consumers experienced an increase of over 36% in 30+ rates.
• From Q4 2008 to Q4 2009, 30+ DPD mortgage delinquency rates for VantageScore A and B consumers increased by 42%, whereas consumers in the lower VantageScore tiers saw their 30+ DPD rates increase by only 23% over the same period.

Clearly, whether through economic or some other form of impact, the repayment practices of prime and super-prime consumers have been changing of late, and this is translating into higher delinquency rates. The call to action for lenders, in their financial risk management and credit risk modeling efforts, is increased attentiveness in assessing credit risk beyond just a credit score – whether by using a combination of scores or by adding Premier Attributes into lending models – in order to fully assess each consumer’s risk profile. * http://online.wsj.com/article/SB10001424052748704343104575033543886200942.html

Published: February 23, 2010 by Kelly Kent
