Libin Zhu
Title · Cited by · Year
Loss landscapes and optimization in over-parameterized non-linear systems and neural networks
C Liu, L Zhu, M Belkin
Applied and Computational Harmonic Analysis 59, 85-116, 2022
Cited by 323* · 2022
On the linearity of large non-linear models: when and why the tangent kernel is constant
C Liu, L Zhu, M Belkin
Advances in Neural Information Processing Systems 33, 15954-15964, 2020
Cited by 172 · 2020
Quadratic models for understanding catapult dynamics of neural networks
L Zhu, C Liu, A Radhakrishnan, M Belkin
The Twelfth International Conference on Learning Representations, 2024
Cited by 20* · 2024
Catapults in SGD: spikes in the training loss and their impact on generalization through feature learning
L Zhu, C Liu, A Radhakrishnan, M Belkin
arXiv preprint arXiv:2306.04815, 2023
Cited by 14 · 2023
Restricted strong convexity of deep learning models with smooth activations
A Banerjee, P Cisneros-Velarde, L Zhu, M Belkin
arXiv preprint arXiv:2209.15106, 2022
Cited by 10 · 2022
Transition to linearity of general neural networks with directed acyclic graph architecture
L Zhu, C Liu, M Belkin
Advances in Neural Information Processing Systems 35, 5363-5375, 2022
Cited by 6 · 2022
Neural tangent kernel at initialization: linear width suffices
A Banerjee, P Cisneros-Velarde, L Zhu, M Belkin
Uncertainty in Artificial Intelligence, 110-118, 2023
Cited by 5 · 2023
Transition to linearity of wide neural networks is an emerging property of assembling weak models
C Liu, L Zhu, M Belkin
arXiv preprint arXiv:2203.05104, 2022
Cited by 5 · 2022
Emergence in non-neural models: grokking modular arithmetic via average gradient outer product
N Mallinar, D Beaglehole, L Zhu, A Radhakrishnan, P Pandit, M Belkin
arXiv preprint arXiv:2407.20199, 2024
Cited by 1 · 2024
Toward Understanding the Dynamics of Over-parameterized Neural Networks
L Zhu
University of California, San Diego, 2024
2024
A note on Linear Bottleneck networks and their Transition to Multilinearity
L Zhu, P Pandit, M Belkin
arXiv preprint arXiv:2206.15058, 2022
2022
Articles 1–11