Francesco D'Angelo
Verified email at epfl.ch
Title · Cited by · Year
Repulsive Deep Ensembles are Bayesian
F D'Angelo, V Fortuin
NeurIPS 2021, 2021
Cited by 126 · 2021
Posterior meta-replay for continual learning
C Henning, M Cervera, F D'Angelo, J Von Oswald, R Traber, B Ehret, ...
Advances in neural information processing systems 34, 14135-14149, 2021
Cited by 75 · 2021
Learning the Ising model with generative neural networks
F D'Angelo, L Böttcher
Physical Review Research 2 (2), 023266, 2020
Cited by 50 · 2020
On Stein Variational Neural Network Ensembles
F D'Angelo, V Fortuin, F Wenzel
Workshop on Uncertainty & Robustness in Deep Learning (ICML), 2021
Cited by 36 · 2021
Annealed Stein Variational Gradient Descent
F D'Angelo, V Fortuin
3rd Symposium on Advances in Approximate Bayesian Inference, 2020
Cited by 31 · 2020
Why do we need weight decay in modern deep learning?
F D'Angelo, M Andriushchenko, A Varre, N Flammarion
NeurIPS 2024, 2024
Cited by 29* · 2024
On out-of-distribution detection with Bayesian neural networks
F D'Angelo, C Henning
arXiv.org, 2021
Cited by 18* · 2021
Are Bayesian neural networks intrinsically good at out-of-distribution detection?
C Henning, F D'Angelo, BF Grewe
ICML 2021 Workshop on Uncertainty and Robustness in Deep Learning, 2021
Cited by 16 · 2021
Uncertainty estimation under model misspecification in neural network regression
MR Cervera, R Dätwyler, F D'Angelo, H Keurti, BF Grewe, C Henning
NeurIPS 2021 Workshop Your Model Is Wrong: Robustness and Misspecification …, 2021
Cited by 6 · 2021
Selective Induction Heads: How Transformers Select Causal Structures in Context
F D'Angelo, F Croce, N Flammarion
ICLR 2025, 2025
2025
Why Do We Need Weight Decay for Overparameterized Deep Networks?
F D'Angelo, A Varre, M Andriushchenko, N Flammarion
NeurIPS 2023 Workshop on Mathematics of Modern Machine Learning, 2023
2023
Articles 1–11