The performance of an optical communication system is typically quantified by its bit-error rate (BER) and system outage probability (unavailability). The actual BER of a system depends on the noise generated by the optical amplifiers and on other random system parameters (e.g., dispersion, polarization). Most errors are caused by extreme (or rare) fluctuations of these random parameters and processes. As a result, quantifying system performance by estimating the BER requires estimating performance when these extreme events occur. It is very difficult to reproduce a realistic environment in the laboratory, and experimental estimation of the outage probability would take an unrealistically long time; numerical simulation is therefore the tool of choice. Standard Monte Carlo simulation, however, is generally ineffective at capturing rare events. Techniques such as biased Monte Carlo simulation and other variance-reduction methods have evolved to mitigate this difficulty. Yet in order to bias the system toward a rare (or extreme) event that degrades performance, one must understand how biasing a given system variable affects performance. Choosing biasing parameters becomes even more difficult when multiple random variables are involved.
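To make the idea of biased Monte Carlo concrete, the following is a minimal importance-sampling sketch, not taken from the source: it estimates a Gaussian tail probability of order 10^-7 by shifting the sampling mean to the threshold and reweighting each sample by the likelihood ratio, so that the rare event is hit frequently under the biased density. The threshold, sample count, and function name are illustrative assumptions.

```python
import math
import random

def gaussian_tail_importance_sampling(threshold=5.0, n=100_000, seed=1):
    """Estimate P(X > threshold) for X ~ N(0, 1) via importance sampling.

    The true value for threshold = 5 is about 2.9e-7, far too rare for
    naive Monte Carlo at this sample size.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # Sample from the biased density N(threshold, 1), under which
        # exceedances of the threshold are no longer rare.
        x = rng.gauss(threshold, 1.0)
        if x > threshold:
            # Likelihood ratio N(0,1)/N(threshold,1) restores an
            # unbiased estimate under the original distribution.
            weight = math.exp(-threshold * x + 0.5 * threshold**2)
            total += weight
    return total / n

estimate = gaussian_tail_importance_sampling()
```

The difficulty alluded to above is choosing the biased density: here a simple mean shift works because the event is a one-dimensional threshold crossing, but for a system with many coupled random parameters the direction in which to bias is not obvious.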
The difficulty of simulating such rare events in optical system simulation is illustrated with an example based on polarization mode dispersion (PMD). The goal is to review the state of the art in rare-event simulation for PMD and then to ask how this state of the art can be extended to a more comprehensive set of system variables.
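As a sketch of why PMD outage is a rare-event problem, the snippet below evaluates the tail of the Maxwellian distribution that the first-order PMD statistic (the differential group delay, DGD) is known to follow. The parametrization and the 3x-mean threshold are illustrative assumptions, not values from the source text.

```python
import math

def maxwellian_tail(ratio):
    """P(DGD > ratio * mean_DGD) for a Maxwellian-distributed DGD."""
    # The Maxwell scale parameter a relates to the mean DGD via
    # mean = 2 * a * sqrt(2/pi), so express the threshold in units of a.
    x = ratio * 2.0 * math.sqrt(2.0 / math.pi)
    z = x / math.sqrt(2.0)
    # Closed-form survival function of the Maxwell distribution.
    return math.erfc(z) + math.sqrt(2.0 / math.pi) * x * math.exp(-x * x / 2.0)

# Probability that the instantaneous DGD exceeds three times its mean:
# roughly 4e-5, so naive Monte Carlo would need on the order of 1e7
# independent fibre realizations for a usable estimate.
p = maxwellian_tail(3.0)
```

Probabilities at this level, and the far smaller ones relevant to outage specifications, are exactly the regime where biased simulation of the fibre's random birefringence becomes necessary.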