Stefano Recanatesi
Title | Cited by | Year
Dimensionality compression and expansion in deep neural networks
S Recanatesi, M Farrell, M Advani, T Moore, G Lajoie, E Shea-Brown
arXiv preprint arXiv:1906.00443, 2019
Cited by 70, 2019
Predictive learning as a network mechanism for extracting low-dimensional latent space representations
S Recanatesi, M Farrell, G Lajoie, S Deneve, M Rigotti, E Shea-Brown
Nature Communications 12 (1), 1-13, 2021
Cited by 65*, 2021
Dimensionality in recurrent spiking networks: Global trends in activity and local origins in connectivity
S Recanatesi, GK Ocker, MA Buice, E Shea-Brown
PLoS Computational Biology 15 (7), e1006446, 2019
Cited by 59, 2019
Gradient-based learning drives robust representations in recurrent neural networks by balancing compression and expansion
M Farrell, S Recanatesi, T Moore, G Lajoie, E Shea-Brown
Nature Machine Intelligence 4 (6), 564-573, 2022
Cited by 51*, 2022
Neural network model of memory retrieval
S Recanatesi, M Katkov, S Romani, M Tsodyks
Frontiers in Computational Neuroscience 9, 149, 2015
Cited by 47, 2015
Metastable attractors explain the variable timing of stable behavioral action sequences
S Recanatesi, U Pereira-Obilinovic, M Murakami, Z Mainen, L Mazzucato
Neuron 110 (1), 139-153.e9, 2022
Cited by 44, 2022
Strong coupling and local control of dimensionality across brain areas
D Dahmen, S Recanatesi, GK Ocker, X Jia, M Helias, E Shea-Brown
bioRxiv 2020.11.02.365072, 2020
Cited by 22, 2020
Memory states and transitions between them in attractor neural networks
S Recanatesi, M Katkov, M Tsodyks
Neural Computation 29 (10), 2684-2711, 2017
Cited by 16, 2017
A scale-dependent measure of system dimensionality
S Recanatesi, S Bradde, V Balasubramanian, NA Steinmetz, ...
Patterns 3 (8), 2022
Cited by 14, 2022
Emergence of hierarchical organization in memory for random material
M Naim, M Katkov, S Recanatesi, M Tsodyks
Scientific reports 9 (1), 10448, 2019
Cited by 14, 2019
Recurrent neural networks learn robust representations by dynamically balancing compression and expansion
M Farrell, S Recanatesi, T Moore, G Lajoie, E Shea-Brown
bioRxiv 564476, December 3, 2019
Cited by 9, 2019
Autoencoder networks extract latent variables and encode these variables in their connectomes
M Farrell, S Recanatesi, RC Reid, S Mihalas, E Shea-Brown
Neural Networks 141, 330-343, 2021
Cited by 6, 2021
From lazy to rich to exclusive task representations in neural networks and neural codes
M Farrell, S Recanatesi, E Shea-Brown
Current Opinion in Neurobiology 83, 102780, 2023
Cited by 4, 2023
Signatures of low-dimensional neural predictive manifolds
S Recanatesi, M Farrell, G Lajoie, S Deneve, M Rigotti, E Shea-Brown
Cosyne Abstracts, 2019
Cited by 2, 2019
Single circuit in V1 capable of switching contexts during movement using VIP population as a switch
D Voina, S Recanatesi, B Hu, E Shea-Brown, S Mihalas
bioRxiv 2020.09.24.309500, 2020
Cited by 1*, 2020
A non-Hebbian code for episodic memory
R Pang, S Recanatesi
bioRxiv 2024.02.28.582531, 2024
2024
Emergence of Surprise and Predictive Signals from Local Contrastive Learning
AL Smith, LP Jiang, S Recanatesi, MS Bull
2023
A simple connection from loss flatness to compressed representations in neural networks
S Chen, S Recanatesi, E Shea-Brown
arXiv preprint arXiv:2310.01770, 2023
2023
Gradient-based learning drives robust representations in recurrent neural networks by balancing compression and expansion (vol 4, pg 564, 2022)
M Farrell, S Recanatesi, T Moore, G Lajoie, E Shea-Brown
Nature Machine Intelligence 4 (11), 1053, 2022
2022
Author Correction: Gradient-based learning drives robust representations in recurrent neural networks by balancing compression and expansion
M Farrell, S Recanatesi, T Moore, G Lajoie, E Shea-Brown
Nature Machine Intelligence 4 (11), 1053-1053, 2022
2022
Articles 1–20