Simulation

There are multiple types of simulation:

1. Discrete Event Simulation

A discrete event simulation (DES) generates a random sample path through a state transition system, with time delays associated with each state.

The times between events are random variables with an associated distribution. To get a sample path, we need to sample these distributions.

1.1 Designing a DES

To design a simulation model, we need to:

  1. Identify the entities in the system that have to be modelled.
  2. Identify the model states (program state variables) - these specify where each entity is and what it is doing.
  3. Identify the event types, recalling that each state transition is triggered by an event.
  4. For each event type, specify how it changes the current state, what new events need to be scheduled, and what events need to be cancelled when it fires.
  5. Add code to accumulate measurements while the simulation is running.
  6. Add code to output results when the program terminates.
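As a minimal sketch (not from the original notes), the steps above can be realised with a future event list kept in a priority queue; the single-server queue, event names, and rates below are illustrative assumptions.

```python
import heapq
import random

# Sketch of an event-scheduling DES loop for a single-server queue.
# The state variables, event types, and rates are illustrative assumptions.

ARRIVAL, DEPARTURE = "arrival", "departure"

def simulate(end_time, arrival_rate=1.0, service_rate=1.5, seed=42):
    rng = random.Random(seed)
    clock = 0.0
    queue_length = 0                      # model state: jobs in the system
    area_under_queue = 0.0                # accumulator for time-averaged queue length
    events = [(rng.expovariate(arrival_rate), ARRIVAL)]   # future event list

    while events and clock < end_time:
        time, kind = heapq.heappop(events)
        area_under_queue += queue_length * (time - clock)  # accumulate measurement
        clock = time
        if kind == ARRIVAL:
            queue_length += 1
            heapq.heappush(events, (clock + rng.expovariate(arrival_rate), ARRIVAL))
            if queue_length == 1:         # server was idle: schedule its departure
                heapq.heappush(events, (clock + rng.expovariate(service_rate), DEPARTURE))
        else:  # DEPARTURE
            queue_length -= 1
            if queue_length > 0:
                heapq.heappush(events, (clock + rng.expovariate(service_rate), DEPARTURE))

    return area_under_queue / clock       # output: mean number in the system

print(simulate(end_time=10_000))
```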

2. Output Analysis

We focus on non-terminating simulations. Assume we are using a simulation to estimate some steady-state performance measure. The initial state is typically fixed, so the initial probability distribution is different from the distribution after some time $t$, and measures take time to converge.

To avoid initialisation bias, we can discard the initialisation transient by resetting the measures after some warm-up time, or run the simulation long enough to render any bias insignificant.

2.1 Confidence Intervals

Discrete event simulations are stochastic, so outputs are random variables, each an observation of some measure $X$. If $x_i$, for $i = 1, \ldots, n$, is a steady-state observation from a simulation, then an estimator for $\mu = E[X]$ is the sample mean $\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i$. Knowing that $E[\bar{x}] = \mu$, we can construct a confidence interval for $\mu$:

  1. For large $n$ and known variance $\sigma^2$, the central limit theorem gives $\bar{x} \approx N(\mu, \sigma^2/n)$.
  2. Unlike in statistics, we do not know $\mu$. However, we can generate many intervals using different simulation runs and conclude that, with 95% confidence, the true $\mu$ lies in at least 95% of the intervals.
  3. This is the 95% confidence interval for $\mu$.

2.1.1 Confidence Interval for Mean

For any desired coverage probability $1 - \alpha$, we can define a confidence interval for the mean as:

$$\left(\bar{x} - z_{1-\alpha/2}\frac{\sigma}{\sqrt{n}},\; \bar{x} + z_{1-\alpha/2}\frac{\sigma}{\sqrt{n}}\right)$$

where $z_{1-\alpha/2}$ is the $(1-\alpha/2)$th quantile of the standard normal distribution. So, amongst all possible intervals, we might observe $100\alpha\%$ of them not containing the true mean.
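A small sketch of this calculation, assuming a known $\sigma$; the observations and $\sigma$ below are made up for illustration:

```python
import math
from statistics import NormalDist

# Sketch: 95% confidence interval for the mean with known variance sigma^2.
xs = [4.2, 3.9, 4.5, 4.1, 4.4, 3.8, 4.0, 4.3]   # illustrative observations
sigma = 0.25                                     # assumed known standard deviation
alpha = 0.05

n = len(xs)
x_bar = sum(xs) / n
z = NormalDist().inv_cdf(1 - alpha / 2)          # z_{1 - alpha/2} ~= 1.96
half_width = z * sigma / math.sqrt(n)
print(f"{x_bar:.3f} +/- {half_width:.3f}")
```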

2.1.2 Unknown Variance

When we measure a population, we don't know $\sigma^2$ but only the bias-corrected sample variance $s^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar{x})^2$. We can use the t-distribution to construct a confidence interval for $\mu$ if the $x_i$ are (approximately) normally distributed:

$$\left(\bar{x} - t_{n-1,\,1-\alpha/2}\frac{s}{\sqrt{n}},\; \bar{x} + t_{n-1,\,1-\alpha/2}\frac{s}{\sqrt{n}}\right)$$

where $t_{n-1,\,1-\alpha/2}$ is the $(1-\alpha/2)$th quantile of the $t$-distribution with $n-1$ degrees of freedom.
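A minimal sketch of the t-interval, using SciPy's t quantile; the sample values are made up:

```python
import math
from scipy import stats

# Sketch: confidence interval for the mean when sigma^2 is unknown.
xs = [4.2, 3.9, 4.5, 4.1, 4.4, 3.8, 4.0, 4.3]      # illustrative observations
alpha = 0.05

n = len(xs)
x_bar = sum(xs) / n
s2 = sum((x - x_bar) ** 2 for x in xs) / (n - 1)   # bias-corrected sample variance
t = stats.t.ppf(1 - alpha / 2, df=n - 1)           # t_{n-1, 1 - alpha/2}
half_width = t * math.sqrt(s2 / n)
print(f"{x_bar:.3f} +/- {half_width:.3f}")
```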

2.1.3 Applying to DES

Looking at a single simulation run, a wider confidence interval indicates more uncertainty in the estimate. However, a sufficiently narrow confidence interval suggests that we can stop the simulation.

Running $n$ independent replications of the simulation guarantees the independence of the $x_i$.

Another approach is to run the model once, wait for it to reach approximate equilibrium, and then divide the measurements into $n$ batches, with $x_i$ coming from batch $i$. If each $x_i$ is the sample mean of batch $i$, this is called the batch means method. However, the $x_i$ may not be independent, because the state at the end of one batch is the state at the beginning of the next. If the $x_i$ are dependent, we need the covariances to construct the confidence interval:

$$\operatorname{Var}(\bar{x}) = \frac{1}{n^2}\left(\sum_{i=1}^{n}\operatorname{Var}(x_i) + 2\sum_{i < j}\operatorname{Cov}(x_i, x_j)\right)$$

If the covariances are positive, then $s^2/n$ becomes an under-estimate of the variance of $\bar{x}$, and the confidence interval is too narrow.
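A hedged sketch of the batch means calculation, treating the batch means as independent (i.e., ignoring the covariance terms) and assuming the warm-up period has already been discarded; the data and batch count are illustrative:

```python
import math
import random
from statistics import mean, stdev

def batch_means_ci(observations, n_batches=10, t_quantile=2.262):
    # Sketch of the batch means method. The t quantile is hard-coded for
    # 9 degrees of freedom at 95% confidence; change it for other settings.
    # Assumes the observations already exclude the warm-up period.
    batch_size = len(observations) // n_batches
    batch_means = [
        mean(observations[i * batch_size:(i + 1) * batch_size])
        for i in range(n_batches)
    ]
    x_bar = mean(batch_means)
    s = stdev(batch_means)                   # treats the batch means as independent
    half_width = t_quantile * s / math.sqrt(n_batches)
    return x_bar, half_width

# Illustrative usage with made-up data.
rng = random.Random(0)
data = [rng.expovariate(1.0) for _ in range(10_000)]
print(batch_means_ci(data))
```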

3. Distribution Sampling

Simulations depend on the ability to sample from continuous random distributions. For a random variable (RV) $X$, we want a sampling function that returns independent samples distributed as $X$.

3.1 Inverse Transform Method

Suppose $X$ is a continuous RV with CDF $F$. By setting a uniform RV $U \sim U(0,1)$ equal to $F(X)$ and solving for $X$ (inverting), we get the transformation $X = F^{-1}(U)$ from $U$ to $X$. This method also works for discrete RVs.
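For example, for an exponential RV with rate $\lambda$, $F(x) = 1 - e^{-\lambda x}$, so $F^{-1}(u) = -\frac{\ln(1-u)}{\lambda}$. A minimal sketch:

```python
import math
import random

def sample_exponential(rate, rng=random):
    # Inverse transform sampling for Exp(rate): X = -ln(1 - U) / rate.
    u = rng.random()                  # U ~ Uniform(0, 1)
    return -math.log(1.0 - u) / rate

print([round(sample_exponential(2.0), 3) for _ in range(5)])
```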

3.2 Acceptance-Rejection Method

If $F$ cannot be inverted, we choose a density function $g(x)$ that is easy to sample from. Then, we find a constant $a$ such that $a \geq 1$ and $f(x) \leq a\,g(x)$ for all $x$, where $f$ is the density of $X$. By construction, $\frac{f(x)}{a\,g(x)} \leq 1$.

Steps:

  1. Let $y$ be a sample from an RV $Y$ whose density function is $g$.
  2. Generate a sample $u$ from $U \sim U(0,1)$.
  3. Let $z = u \cdot a\,g(y)$.
  4. If $z \leq f(y)$ (i.e., $u \leq \frac{f(y)}{a\,g(y)}$), accept $y$ as a sample of $X$. Otherwise, reject it and start again.

The probability of accepting a given sample is $1/a$. The number of iterations required before accepting is geometrically distributed, so the expected number of iterations is $a$.
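As an illustrative sketch (the target density is my choice, not from the notes), sampling from $f(x) = 6x(1-x)$ on $[0,1]$ with a uniform proposal $g(x) = 1$ and $a = 1.5$ (since $\max f = 1.5$):

```python
import random

def sample_beta22(rng=random):
    # Acceptance-rejection sketch for f(x) = 6x(1-x) on [0, 1].
    # Proposal g is Uniform(0, 1), and f(x) <= a * g(x) with a = 1.5.
    a = 1.5
    while True:
        y = rng.random()                  # sample from g
        u = rng.random()                  # uniform for the accept test
        if u * a <= 6 * y * (1 - y):      # accept if u <= f(y) / (a * g(y))
            return y

print([round(sample_beta22(), 3) for _ in range(5)])
```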

3.3 Convolution Method

To sample a sum of independent RVs, sample them individually and then add the results.
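For example, an Erlang-$k$ RV is the sum of $k$ independent exponentials, so it can be sampled by adding $k$ exponential samples; a minimal sketch:

```python
import random

def sample_erlang(k, rate, rng=random):
    # Convolution method sketch: Erlang(k, rate) as the sum of k
    # independent Exp(rate) samples.
    return sum(rng.expovariate(rate) for _ in range(k))

print(sample_erlang(3, 2.0))
```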

3.4 Composition Method

Consider a discrete RV $I$ with $P(I = i) = p_i$ and a continuous RV $X$ whose density is the mixture $f(x) = \sum_i p_i f_i(x)$. To sample:

  1. Pick an $i$ with probability $p_i$.
  2. Sample $x$ from the density $f_i$.
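A sketch for a two-branch hyperexponential distribution, whose density is a probability-weighted mixture of exponential densities (the probabilities and rates are illustrative):

```python
import random

def sample_hyperexponential(probs, rates, rng=random):
    # Composition method sketch: pick branch i with probability probs[i],
    # then sample from the exponential density with rate rates[i].
    u = rng.random()
    cumulative = 0.0
    for p, rate in zip(probs, rates):
        cumulative += p
        if u <= cumulative:
            return rng.expovariate(rate)
    return rng.expovariate(rates[-1])     # guard against rounding error

print(sample_hyperexponential([0.3, 0.7], [1.0, 5.0]))
```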