
Nonasymptotic bounds for suboptimal importance sampling

by Carsten Hartmann, Lorenz Richter

Year:

2021

Publication:

eprint arXiv:2102.09606

Abstract:

Importance sampling is a popular variance reduction method for Monte Carlo estimation, where a notorious question is how to design good proposal distributions. While in most cases optimal (zero-variance) estimators are theoretically possible, in practice only suboptimal proposal distributions are available and it can often be observed numerically that those can reduce statistical performance significantly, leading to large relative errors and therefore counteracting the original intention.
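To make the role of the proposal distribution concrete, here is a minimal sketch (not taken from the paper, all names and parameters are illustrative): for the toy target quantity E[exp(X)] with X ~ N(0, 1), the zero-variance proposal is N(1, 1), and shifting the proposal mean away from 1 shows how a suboptimal proposal inflates the relative error of the importance sampling estimator.

```python
# Minimal importance sampling sketch (illustrative only, not from the paper).
# Target: E[exp(X)] with X ~ N(0, 1); the zero-variance proposal is N(1, 1).
# "shift" is the proposal mean, so shift = 1.0 is optimal and other values
# are suboptimal proposals with increasing relative error.
import numpy as np

rng = np.random.default_rng(0)

def is_estimate(shift, n_samples=10_000):
    """Importance sampling estimate of E[exp(X)], X ~ N(0, 1),
    using the proposal N(shift, 1)."""
    y = rng.normal(loc=shift, scale=1.0, size=n_samples)
    # Importance weights: target density N(0, 1) over proposal density N(shift, 1)
    # (normalization constants cancel because both have unit variance).
    log_w = -0.5 * y**2 + 0.5 * (y - shift) ** 2
    vals = np.exp(y + log_w)
    est = vals.mean()
    # Estimated relative error of the Monte Carlo estimator.
    rel_err = vals.std(ddof=1) / (np.sqrt(n_samples) * est)
    return est, rel_err

true_value = np.exp(0.5)  # E[exp(X)] = exp(1/2) for X ~ N(0, 1)
for shift in [1.0, 0.0, -1.0, -2.0]:
    est, rel_err = is_estimate(shift)
    print(f"shift={shift:+.1f}  estimate={est:.4f}  "
          f"true={true_value:.4f}  relative error={rel_err:.4f}")
```

For shift = 1.0 the weighted samples are constant and the relative error is essentially zero, while proposals further from the optimum produce increasingly large relative errors, which is the behavior the paper quantifies with nonasymptotic bounds.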

Link:

Read the paper

Additional Information


A brief introduction of the dida co-author(s) and the relevance of this work for dida's ML developments.

About the Co-Author

Lorenz Richter is a mathematician with an original focus on stochastics and numerics (FU Berlin) who has been working on deep learning algorithms for some time. Alongside his interest in the theory, he has solved numerous practical data science problems over the last 10 years. Lorenz leads dida's machine learning team.