Latin Hypercube Sampling Software Download


By David Vose

Most risk analysis simulation software products offer Latin Hypercube Sampling (LHS), a method for ensuring that each probability distribution in your model is evenly sampled, which at first glance seems very appealing. The technique dates back to 1980 (even though the @RISK manual describes LHS as “a new sampling technique”), when computers were very slow, the number of distributions in a model was extremely modest, and simulations took hours or days to complete. It was appealing at the time because it produced a stable output with far fewer samples than simple Monte Carlo simulation, making simulation more practical with the computing tools then available. However, desktop computers are now at least 1,000 times faster than those of the early 1980s, and the value of LHS has disappeared as a result.


LHS does not deserve a place in modern simulation software. We are often asked why we don’t implement LHS in our ModelRisk software, since nearly all other Monte Carlo simulation applications do, so we thought it would be worthwhile to provide an explanation here. What is Latin Hypercube sampling? LHS is a type of stratified sampling. It works by controlling the way that random samples are generated for a probability distribution. Probability distributions can be described by a cumulative curve, like the one below. The vertical axis represents the probability that the variable will fall at or below the horizontal axis value.

[Figure: cumulative distribution curve]
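The cumulative-curve idea above is the basis of plain Monte Carlo sampling: draw a uniform cumulative probability and map it back through the inverse of the cumulative curve. A minimal sketch in Python, using the standard library's NormalDist purely as an example distribution:

```python
import random
from statistics import NormalDist

def monte_carlo_samples(dist, n, seed=0):
    """Plain Monte Carlo sampling: draw a uniform cumulative probability
    u in (0, 1), then return the value whose cumulative probability is u
    (i.e. invert the cumulative curve)."""
    rng = random.Random(seed)
    return [dist.inv_cdf(rng.random()) for _ in range(n)]

# 1000 samples from a Normal(10, 2) distribution
values = monte_carlo_samples(NormalDist(mu=10, sigma=2), 1000)
```

Nothing constrains where the uniform draws fall, so with few samples some regions of the curve may be over- or under-represented; stratified schemes such as LHS address exactly that.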

Imagine we want to take 5 samples from this distribution. We can split the vertical scale into 5 equal probability ranges: 0-20%, 20-40%, 40-60%, 60-80% and 80-100%. If we take one random sample within each range and calculate the variable value that has that cumulative probability, we have created 5 Latin Hypercube samples for this variable.
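The five-band procedure above can be sketched directly in Python (again using the standard library's NormalDist as the example distribution): split the cumulative scale into n equal-probability bands, draw one uniform point inside each band, and map each point back through the inverse cumulative curve.

```python
import random
from statistics import NormalDist

def latin_hypercube_samples(dist, n, seed=0):
    """Latin Hypercube sampling of a single distribution: exactly one
    sample falls in each of the n equal-probability bands of the
    cumulative scale, e.g. 0-20%, 20-40%, ... for n = 5."""
    rng = random.Random(seed)
    # (i + u) / n is a uniform draw inside the i-th probability band
    samples = [dist.inv_cdf((i + rng.random()) / n) for i in range(n)]
    rng.shuffle(samples)  # deliver the samples in random order
    return samples

print(latin_hypercube_samples(NormalDist(mu=0, sigma=1), 5))
```

The shuffle at the end matters when several variables are sampled this way: pairing the bands in order would artificially correlate the variables.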

TBCUnc is a software package that allows the user to estimate the thermal conductivities, and associated uncertainties, of the component parts of a layered system by matching the predictions of a model to values measured using the laser flash thermal diffusivity experiment. The model results are matched to the measured values using the Levenberg-Marquardt optimisation algorithm. Uncertainties are calculated using a Latin Hypercube sample of a size chosen by the user. Outputs include the mean, standard deviation and cumulative distribution function for each thermal conductivity, and the correlations between each input and the conductivities. A user manual, two test data sets and a licence file are supplied with the download. The software is a stand-alone executable created in Matlab, and requires installation of the MATLAB Runtime. We are seeking feedback from users on the software, and anyone supplying comments will be given an additional application.

If you supply your contact details using the form on this page, you will be contacted in January 2018 for feedback. If you supply comments by the end of January 2018, you will receive an additional executable that packages the model of the laser flash experiment used in the software in a form that generates simulated thermograms for a user-defined layered system. Contact details will be used only for requesting feedback and distributing the additional application, and all contact details will be removed from NPL’s system at the end of the project (May 2018). Please contact Louise Wright with questions or problems.

Version: 1. File size: 2.2 MB. Last updated: October 20, 2017.
