Latin Hypercube vs. Monte Carlo Sampling

A copula structure generated using RLHS.

In a recent post on LinkedIn, David Vose argues that the advantages of Latin Hypercube sampling (LHS) over Monte Carlo are so minimal that "LHS does not deserve a place in modern simulation software." [1]  He makes some interesting points, yet products like Analytica and Crystal Ball still provide LHS and even offer it as their default method. Why? Are we, the makers of these simulation products, naïve? As the lead architect of Analytica for two decades, I've explored this question in detail. I've compared the performance of LHS vs. Monte Carlo on hundreds of real-world models. And I've concluded that yes, it does make sense to keep Latin Hypercube as the default method. Let me explain why I disagree with David Vose on some issues and agree with him on others. Several of his complaints are specific to Crystal Ball or @Risk and don't apply to Analytica. I'll then add some key insights garnered from my own experience.

What is Latin Hypercube Sampling? First some background. (Feel … [Read more...]
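To make the comparison concrete, here is a minimal sketch of the idea behind one-dimensional LHS versus plain Monte Carlo: LHS splits [0, 1) into n equal strata and draws exactly one point in each, which typically reduces the variance of sample statistics like the mean. This is an illustrative sketch, not code from Analytica or any of the products discussed.

```python
import numpy as np

def lhs_uniform(n, rng):
    """One-dimensional Latin Hypercube sample of size n on [0, 1):
    divide the interval into n equal strata, draw one point uniformly
    within each stratum, then shuffle the order of the strata."""
    strata = (np.arange(n) + rng.random(n)) / n
    return rng.permutation(strata)

rng = np.random.default_rng(0)
n = 1000
mc = rng.random(n)          # plain Monte Carlo draws
lhs = lhs_uniform(n, rng)   # stratified (LHS) draws

# The true mean of U(0,1) is 0.5. Because LHS places exactly one point
# per stratum, its sample-mean error is bounded by the within-stratum
# jitter, so it is usually far smaller than the Monte Carlo error.
print(abs(mc.mean() - 0.5), abs(lhs.mean() - 0.5))
```

For a multi-dimensional model, each input dimension is stratified this way independently and the columns are shuffled separately, which is where the "hypercube" in the name comes from.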

Estimation of Mutual Information

uncorrelated association

Abstract: This article explores the difficult problem of estimating the mutual information between two variables from a sample of data. I use examples to demonstrate the challenges, and introduce a new algorithm for estimating mutual information along with an explicit representation of its uncertainty.

A measure of association is a function that rates the strength of statistical dependence between two variables. In casual talk, people often express this idea by asserting that two things are "correlated"; however, in statistics the term correlated has a more precise meaning, where correlation (or more precisely, Pearson correlation) is a measure of how linear a statistical dependence is. Also in common usage is rank correlation, which measures how monotonic a statistical dependence is. But many dependencies can elude these measures, motivating the use of other measures of association. Another possible measure is mutual information, I(x,y). In Figure 1, the variables have a … [Read more...]
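A small sketch of the point about dependencies eluding correlation: for a symmetric, non-monotonic relationship such as y ≈ x², the Pearson correlation is near zero even though the variables are strongly dependent, while mutual information detects the dependence. The estimator below is the standard histogram plug-in estimate (known to be biased upward for small samples), shown only to illustrate the idea; it is not the article's new algorithm.

```python
import numpy as np

def mutual_information_plugin(x, y, bins=20):
    """Naive plug-in estimate of I(x, y) in nats from a 2-D histogram:
    I = sum p(x,y) * log( p(x,y) / (p(x) p(y)) ) over non-empty cells."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                         # joint cell probabilities
    px = pxy.sum(axis=1, keepdims=True)      # marginal over y
    py = pxy.sum(axis=0, keepdims=True)      # marginal over x
    nz = pxy > 0                             # skip empty cells (0 log 0 = 0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 50_000)
y = x**2 + rng.normal(0, 0.05, x.size)  # strong but non-monotonic dependence

pearson = np.corrcoef(x, y)[0, 1]     # near 0: linear correlation misses it
mi = mutual_information_plugin(x, y)  # clearly positive: dependence detected
print(pearson, mi)
```

The same sample thus scores as "uncorrelated" under Pearson correlation yet shows substantial mutual information, which is exactly why other measures of association are worth having.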

Introduction to Analytica

[Read more...]