Slashing Product Development Time in Financial Services

In a data-rich environment like insurance, it is often easier to quantify the opportunities associated with speed than it is in other industries. A global financial services company knew it took 18 months to get new nationwide variable annuity products to market. The company’s vast database and proven models allowed it to estimate that […]

Using Blended Optimization to Quantify the Qualitative

Because of corporate collapses such as Enron and WorldCom, large financial institutions during the last decade have had to absorb enormous losses. In the wake of these fiscal disasters, financial organizations needed to develop risk-rating scorecards to help them become much more quantitative in how they evaluate and assign risks. Six Sigma can provide a […]

Using DFLSS to Evaluate a New Loan Referral Process

Many a company has gotten caught up in the enthusiasm that comes with a great-sounding new idea. It is great when that idea turns into a profitable new product or service. But it can be a disaster when a lot of time and money is invested only to end up with a dud. How great would […]

Making Use of DFLSS to Cultivate a Shared Compute Farm

Digital check processing. Fraud detection. Compliance monitoring. Trading floor analytics. These are all integral functions for many modern financial services institutions, and all have one thing in common: they require a huge amount of computing power. One approach to providing fast responses even with highly complex computations is distributed computing, or creating a shared compute […]
