Konstantinos Stavropoulos (Kostas)

Ph.D. Student

University of Texas at Austin

About me

I am currently a fourth-year Ph.D. student in Computer Science at UT Austin. I am fortunate to be advised by Prof. Adam Klivans. Before that, I studied Electrical and Computer Engineering at the National Technical University of Athens, where I was fortunate to work with Prof. Dimitris Fotakis.

My research is at the intersection of machine learning and theoretical computer science. I am particularly interested in designing efficient learning algorithms with provable guarantees that do not rely on the strong assumptions typically made in learning theory, especially in challenging scenarios such as learning with distribution shift and/or noise.

My CV can be found here (last update: October 2024).

    [email protected]

Conference Publications (alphabetical author order)

(2024). Smoothed Analysis for Learning Concepts with Low Intrinsic Dimension. Proceedings of the Thirty Seventh Conference on Learning Theory.
Best paper award.


(2024). Learning Intersections of Halfspaces with Distribution Shift: Improved Algorithms and SQ Lower Bounds. Proceedings of the Thirty Seventh Conference on Learning Theory.


(2024). Testable Learning with Distribution Shift. Proceedings of the Thirty Seventh Conference on Learning Theory.


(2024). An Efficient Tester-Learner for Halfspaces. The Twelfth International Conference on Learning Representations.


(2023). Tester-Learners for Halfspaces: Universal Algorithms. Advances in Neural Information Processing Systems.
Oral presentation.


(2023). Agnostically Learning Single-Index Models using Omnipredictors. Advances in Neural Information Processing Systems.


(2022). Learning and Covering Sums of Independent Random Variables with Unbounded Support. Advances in Neural Information Processing Systems.
Oral presentation.


(2021). Aggregating Incomplete and Noisy Rankings. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics.


Manuscripts

(2024). Learning Noisy Halfspaces with a Margin: Massart is No Harder than Random. NeurIPS 2024, to appear.


(2024). Efficient Discrepancy Testing for Learning with Distribution Shift. NeurIPS 2024, to appear.


(2024). Tolerant Algorithms for Learning with Arbitrary Covariate Shift. NeurIPS 2024, to appear.


Awards and Achievements

Best paper award at the Conference on Learning Theory (COLT) 2024
Our paper on Smoothed Analysis for Learning Concepts with Low Intrinsic Dimension won the best paper award at COLT.
Jun 2024


Oral presentations at NeurIPS 2022 and 2023
Our paper Tester-Learners for Halfspaces: Universal Algorithms was selected for oral presentation at NeurIPS 2023 (top 0.54% of all submissions). At NeurIPS 2022, our paper Learning and Covering Sums of Independent Random Variables with Unbounded Support was likewise selected for oral presentation (top 1.76% of all submissions).
Dec 2022, 2023


Scholarships from the Bodossaki and Leventis Foundations and scholarship award from the Hellenic Professional Society of Texas
My Ph.D. studies are generously supported by scholarships from the Bodossaki and Leventis Foundations. In May 2022, I received a scholarship award from HPST in recognition of academic excellence.
Sep 2022 - Aug 2025, May 2022


Award from the State Scholarships Foundation and “Thomaideio” Award from NTUA
Upon graduating from the ECE department of NTUA, I received the Award of Excellence from the State Scholarships Foundation of Greece for graduating first in my cohort within the nominal period of studies. During my studies, I received the “Thomaideio” Award for achieving the highest GPA among all undergraduate students of the ECE department in the academic year 2018-2019.
2019, 2020

Reviewing

ICLR 2024, ICML 2024, NeurIPS 2023