Sung Min Park
Verified email at mit.edu - Homepage
Title
Cited by
Year
Datamodels: Predicting predictions from training data
A Ilyas, SM Park, L Engstrom, G Leclerc, A Madry
International Conference on Machine Learning, 9525-9587, 2022
148* · 2022
Trak: Attributing model behavior at scale
SM Park, K Georgiev, A Ilyas, G Leclerc, A Madry
arXiv preprint arXiv:2303.14186, 2023
100 · 2023
FFCV: Accelerating training by removing data bottlenecks
G Leclerc, A Ilyas, L Engstrom, SM Park, H Salman, A Mądry
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2023
75* · 2023
A Data-Based Perspective on Transfer Learning
S Jain, H Salman, A Khaddaj, E Wong, SM Park, A Mądry
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2023
39 · 2023
Structured learning of sum-of-submodular higher order energy functions
A Fix, T Joachims, SM Park, R Zabih
Proceedings of the IEEE International Conference on Computer Vision, 3104-3111, 2013
33 · 2013
Modeldiff: A framework for comparing learning algorithms
H Shah, SM Park, A Ilyas, A Madry
International Conference on Machine Learning, 30646-30688, 2023
24 · 2023
The journey, not the destination: How data guides diffusion models
K Georgiev, J Vendrow, H Salman, SM Park, A Madry
arXiv preprint arXiv:2312.06205, 2023
13 · 2023
Sparse PCA from sparse linear regression
G Bresler, SM Park, M Persu
Advances in Neural Information Processing Systems 31, 2018
12 · 2018
On distinctive properties of universal perturbations
SM Park, KA Wei, K Xiao, J Li, A Madry
arXiv preprint arXiv:2112.15329, 2021
3 · 2021
Attribute-to-Delete: Machine Unlearning via Datamodel Matching
K Georgiev, R Rinberg, SM Park, S Garg, A Ilyas, A Madry, S Neel
arXiv preprint arXiv:2410.23232, 2024
2024
On the equivalence of sparse statistical problems
SM Park
Massachusetts Institute of Technology, 2016
2016
Unlearning via Simulated Oracle Matching
K Georgiev, R Rinberg, SM Park, S Garg, A Ilyas, A Madry, S Neel
Articles 1–12