Publications

(2024). Learning Noisy Halfspaces with a Margin: Massart is No Harder than Random. NeurIPS 2024, to appear.

(2024). Smoothed Analysis for Learning Concepts with Low Intrinsic Dimension. Proceedings of the Thirty-Seventh Conference on Learning Theory.
Best paper award.

(2024). Learning Intersections of Halfspaces with Distribution Shift: Improved Algorithms and SQ Lower Bounds. Proceedings of the Thirty-Seventh Conference on Learning Theory.

(2024). Testable Learning with Distribution Shift. Proceedings of the Thirty-Seventh Conference on Learning Theory.

(2024). Tolerant Algorithms for Learning with Arbitrary Covariate Shift. NeurIPS 2024, to appear.

(2024). Efficient Discrepancy Testing for Learning with Distribution Shift. NeurIPS 2024, to appear.

(2024). An Efficient Tester-Learner for Halfspaces. The Twelfth International Conference on Learning Representations.

(2023). Tester-Learners for Halfspaces: Universal Algorithms. Advances in Neural Information Processing Systems.
Oral presentation.

(2023). Agnostically Learning Single-Index Models using Omnipredictors. Advances in Neural Information Processing Systems.

(2022). Learning and Covering Sums of Independent Random Variables with Unbounded Support. Advances in Neural Information Processing Systems.
Oral presentation.

(2021). Aggregating Incomplete and Noisy Rankings. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics.
