Ahmad Ghasemi
Graph algorithms and theoretical machine learning
Minimax learning theory, graph learning, and structure-aware complexity
Department of Electrical & Computer Engineering
University of Massachusetts Amherst
aghasemi (at) umass [dot] edu
I am a Postdoctoral Research Fellow and Adjunct Faculty in Electrical & Computer Engineering at the University of Massachusetts Amherst. My research is in the foundations of machine learning, with an emphasis on graph-structured learning, minimax analysis, and structure-aware complexity. I study how graph topology, dependence, and problem geometry change the effective difficulty of estimation and generalization, and how these structural effects should inform algorithm design.
My work centers on two linked threads: (i) theoretical foundations of learning under dependence, especially for graph-structured learning (including minimax lower bounds, regime characterization, and topology-aware complexity), and (ii) efficient and reliable inference in modern ML systems.
Quick pathways
- Theory overview (graph learning, minimax theory, learning under dependence): Open
- Systems / CPS / Edge AI overview (deployable ML, autonomy, resource-aware inference): Open
Much of my work starts from a simple observation: classical i.i.d.-based intuition often breaks down in the graph-structured and dependent settings that matter most for modern machine learning. In these regimes, nominal sample size can be a poor proxy for effective information, and performance may be limited by topology, dependence, or computational budget rather than model capacity alone. My approach is to identify the relevant structural regime, prove sharp lower or upper bounds where possible, and use those results to design algorithms, diagnostics, and evaluation protocols that remain meaningful in practice.
Active projects and proof points
- Minimax sample complexity of GNNs under topology and dependence (ICLR 2026): I developed minimax lower bounds and structural regime characterizations for message-passing GNNs, identifying when generalization follows classical sample-size behavior versus when graph-induced dependence and mixing constraints dominate effective sample complexity.
- Low-rank / compact graph inference for deployable ML: I work on low-rank and constrained training for graph models to make inference cost controllable under latency/energy/memory limits, including substantial compression gains with modest performance degradation and tunable deployment tradeoffs.
- Efficient inference and confidence-guided computation for large models (CGES): We developed Confidence-Guided Early Stopping, a Bayesian stopping rule for budgeted inference that halts sampling when posterior evidence crosses a threshold (or a hard budget binds). Across five reasoning benchmarks, CGES reduces model calls by 69.4% while matching self-consistency accuracy.
- NSF CPS AERIAL (Co-PI): I lead work on trustworthy and budget-aware learning-and-control and digital-twin validation protocols for UAV swarm autonomy, with emphasis on reliability metrics, stress testing, and evaluation under operational constraints.
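The confidence-guided stopping idea above can be sketched in a few lines. This is a hedged illustration, not the published CGES implementation: the `sample_answer` callback, the Dirichlet-smoothed confidence estimate, and all defaults (`max_calls`, `min_calls`, `threshold`, `prior`) are assumptions chosen for exposition.

```python
from collections import Counter

def confidence_guided_sampling(sample_answer, max_calls=40, min_calls=3,
                               threshold=0.95, prior=1.0):
    """Illustrative confidence-guided early stopping for self-consistency
    decoding (a sketch, not the published CGES rule).

    Draw answers one call at a time; stop as soon as the posterior mean of
    the leading answer's frequency, under a symmetric Dirichlet(prior) over
    the answers seen so far, crosses `threshold` -- or when the hard call
    budget `max_calls` binds.
    """
    counts = Counter()
    for n in range(1, max_calls + 1):
        counts[sample_answer()] += 1          # one model call per iteration
        leader, c = counts.most_common(1)[0]
        # Posterior mean of the leader's frequency with Dirichlet smoothing.
        confidence = (c + prior) / (n + prior * len(counts))
        if n >= min_calls and confidence >= threshold:
            return leader, n                  # stop early: evidence suffices
    return counts.most_common(1)[0][0], max_calls  # budget binds
```

With a sampler that always agrees, the rule stops after the minimum number of calls; with a sampler that keeps disagreeing, confidence never clears the threshold and the hard budget binds, which is the compute-quality tradeoff the project targets.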
Research agenda
My current research agenda centers on graph-structured learning and structure-aware complexity in modern machine learning, with three connected directions:
- Learning under dependence: minimax limits, regime characterization, and effective sample-size phenomena in graph and sequential settings.
- Graph representation learning theory: topology-aware generalization, structural bottlenecks in message passing, and minimal mechanisms that overcome them.
- Efficient and reliable inference: adaptive computation, confidence-aware stopping, and compute-quality tradeoffs for large models and constrained deployments.
Across these directions, I aim to combine theorem-level analysis with theory-grounded empirical diagnostics that distinguish structural regimes and test predicted scaling behavior.
Teaching
I teach ML, generative models, DSP, image processing, and data science foundations, with an emphasis on linking theory to reproducible engineering practice. My goal is to help students move from technical familiarity to research-grade reasoning: specifying assumptions, building meaningful baselines, debugging under shift and partial observability, and justifying accuracy–reliability–cost tradeoffs. My UMass evaluations reflect strong execution and clarity (instructor effectiveness 4.7/5.0, with 5.0/5.0 for preparation and use of class time in a recent offering).
Students, collaborators, and funding
I am building research around shared interfaces that make systems dependable in practice: reliability contracts, telemetry/monitoring, and trace-based evaluation suites spanning Networking/IoT, embedded/SoC, and autonomy.
I currently mentor PhD and undergraduate researchers and regularly collaborate across ML systems and wireless/autonomy.
I welcome collaborations in graph algorithms, theoretical machine learning, graph learning, learning under dependence, and structure-aware complexity.
News
| Date | News |
|---|---|
| Jul 15, 2025 | Our NSF proposal, “AERIAL (AI-Embedded Responsive Intelligent Agents with Trajectory-Induced Digital Twin Learning)”, has been awarded by the National Science Foundation (NSF). |
| Feb 16, 2024 | Our paper on Tiny Graph Neural Networks has been accepted for presentation at the 2024 TinyML Research Symposium! |
| Jan 17, 2024 | Our paper is accepted to IEEE Transactions on Vehicular Technology. |
| Oct 1, 2023 | I served as a reviewer for IEEE Transactions on Wireless Communications. |
| Jul 1, 2022 | I received a Travel Grant Award from the School of Arts & Sciences at WPI to present my paper at IEEE AP-S/URSI 2022! |
| May 22, 2022 | Our paper is accepted to IEEE AP-S/URSI 2022. |
| Nov 1, 2021 | I served as a reviewer for IEEE Transactions on Wireless Communications and IEEE Transactions on Communications. |