Didrik Nielsen
Title · Cited by · Year
Tree boosting with XGBoost: Why does XGBoost win "every" machine learning competition?
D Nielsen
NTNU, 2016
173* · 2016
Fast and scalable Bayesian deep learning by weight-perturbation in Adam
ME Khan, D Nielsen, V Tangkaratt, W Lin, Y Gal, A Srivastava
arXiv preprint arXiv:1806.04854, 2018
97 · 2018
SLANG: Fast structured covariance approximations for Bayesian deep learning with natural gradient
A Mishkin, F Kunstner, D Nielsen, M Schmidt, ME Khan
Advances in Neural Information Processing Systems, 6245-6255, 2018
21 · 2018
Fast yet simple natural-gradient descent for variational inference in complex models
ME Khan, D Nielsen
2018 International Symposium on Information Theory and Its Applications …, 2018
16 · 2018
Variational adaptive-Newton method for explorative learning
ME Khan, W Lin, V Tangkaratt, Z Liu, D Nielsen
arXiv preprint arXiv:1711.05560, 2017
8 · 2017
SurVAE Flows: Surjections to bridge the gap between VAEs and flows
D Nielsen, P Jaini, E Hoogeboom, O Winther, M Welling
Advances in Neural Information Processing Systems 33, 2020
3 · 2020
Closing the Dequantization Gap: PixelCNN as a Single-Layer Flow
D Nielsen, O Winther
arXiv preprint arXiv:2002.02547, 2020
1 · 2020
Natural-Gradient Stochastic Variational Inference for Non-Conjugate Structured Variational Autoencoder
W Lin, ME Khan, N Hubacher, D Nielsen
1 · 2017
Argmax Flows: Learning Categorical Distributions with Normalizing Flows
E Hoogeboom, D Nielsen, P Jaini, P Forré, M Welling
PixelCNN as a Single-Layer Flow
D Nielsen, O Winther
The Variational Adaptive-Newton Method
ME Khan, W Lin, V Tangkaratt, Z Liu, D Nielsen
Articles 1–11