Latin Hypercube vs. Monte Carlo Sampling

A copula structure generated using RLHS.

In a recent post on LinkedIn entitled The pros and cons of Latin Hypercube sampling, David Vose compares Latin Hypercube sampling to Monte Carlo and argues that the advantages of Latin Hypercube are so minimal that "LHS does not deserve a place in modern simulation software." [1]  It is an interesting article and he makes several good points. Yet despite extensive application to thousands of real-life probabilistic models over several decades, products like Analytica and Crystal Ball still provide Latin Hypercube sampling, and even use it as the default sampling method. Why is this? Are we, the makers of simulation software products, naïve? As the lead technical architect of Analytica for the past two decades, I've studied the performance of Latin Hypercube vs. Monte Carlo on hundreds of real-world models, and examined the question of whether we should keep Latin Hypercube as the default sampling method. My conclusion is that it does make sense to keep Latin … [Read more...]
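As a rough illustration of the effect under debate, here is a minimal Python sketch (not Analytica's or Crystal Ball's implementation; the standard-normal target, sample size, and trial count are illustrative assumptions) that contrasts plain Monte Carlo with one-dimensional Latin Hypercube sampling, showing the variance reduction LHS can give on a smooth statistic such as the sample mean.

```python
# Minimal sketch: plain Monte Carlo vs. 1-D Latin Hypercube sampling
# of a standard normal. Illustrative only, not any product's code.
import numpy as np
from scipy.stats import norm

def monte_carlo_normal(n, rng):
    # Ordinary Monte Carlo: n independent draws.
    return rng.standard_normal(n)

def latin_hypercube_normal(n, rng):
    # Stratify [0, 1] into n equal intervals, draw one uniform point
    # in each, shuffle the strata (as a multi-dimensional LHS would),
    # then map through the inverse CDF of the target distribution.
    u = (np.arange(n) + rng.uniform(size=n)) / n
    rng.shuffle(u)
    return norm.ppf(u)

rng = np.random.default_rng(0)
trials, n = 2000, 100
mc_means = [monte_carlo_normal(n, rng).mean() for _ in range(trials)]
lhs_means = [latin_hypercube_normal(n, rng).mean() for _ in range(trials)]
print("std of MC mean estimate: ", np.std(mc_means))
print("std of LHS mean estimate:", np.std(lhs_means))
```

Run on a smooth statistic like the mean, the LHS estimate's standard deviation comes out far below Monte Carlo's 1/√n; the debate is over how much of that advantage survives in realistic multi-dimensional models.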

Estimation of Mutual Information

Figure 1: An uncorrelated association.

Abstract: This article explores the difficult problem of estimating the mutual information between two variables from a sample of data. I use examples to demonstrate the challenges, and introduce a new algorithm for estimating mutual information along with an explicit representation of its uncertainty. A measure of association is a function that rates the strength of statistical dependence between two variables. In casual conversation, people often express this idea by asserting that two things are "correlated"; in statistics, however, the term has a more precise meaning: correlation (or more precisely, Pearson correlation) measures how linear a statistical dependence is. Also in common use is rank correlation, which measures how monotonic a statistical dependence is. But many dependencies elude both of these measures, motivating the use of other measures of association. One such measure is mutual information, I(x,y). In Figure 1, the variables have a … [Read more...]
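To make the distinction concrete, here is a small Python sketch (not the article's new estimator; the y = x² example and the histogram bin count are illustrative assumptions) computing Pearson correlation, rank (Spearman) correlation, and a crude plug-in mutual information estimate on a nonlinear, non-monotonic dependence that both correlations miss.

```python
# Minimal sketch: Pearson and Spearman correlation vs. a crude
# histogram-based mutual information estimate. Illustrative only.
import numpy as np
from scipy.stats import pearsonr, spearmanr

def mutual_information_hist(x, y, bins=20):
    # Plug-in estimate I(x,y) = sum p(x,y) * log(p(x,y) / (p(x) p(y)))
    # from a 2-D histogram; simple, but biased for small samples.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0) terms
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 5000)
y = x**2 + 0.05 * rng.standard_normal(5000)  # nonlinear, non-monotonic

print("Pearson:  ", pearsonr(x, y)[0])            # near zero
print("Spearman: ", spearmanr(x, y)[0])           # near zero
print("MI (nats):", mutual_information_hist(x, y))  # clearly positive
```

On this data both correlation coefficients come out near zero while the mutual information estimate is clearly positive; quantifying that dependence reliably, with an explicit account of the estimate's uncertainty, is the problem the article takes up.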

Introduction to Analytica

[Read more...]