Ishani Karmarkar
I am currently a PhD student at Stanford University. My research primarily focuses on the design and analysis of algorithms for problems arising in graph theory, machine learning, and optimization.
I am advised by Aaron Sidford and Ellen Vitercik.
Previously, I earned a BS in Applied and Computational Mathematics with a minor in Computer Science from the California Institute of Technology (Caltech), where I did undergraduate research with Bamdad Hosseini and Andrew Stuart. I also interned at Facebook (now Meta), working on a variety of machine learning projects.
Email /
GitHub /
Google Scholar /
LinkedIn
From Large to Small Datasets: Size Generalization for Clustering Algorithm Selection
Vaggos Chatziafratis, Ishani Karmarkar, and Ellen Vitercik
arXiv preprint, 2024
Clustering is a fundamental problem in machine learning. There is a wide variety of clustering algorithms in the literature; however, there is scant guidance on choosing a good clustering algorithm for a dataset at hand. In this paper, we address the problem of clustering algorithm selection. We present algorithms that use a small subset of the data to provably estimate the performance on the full dataset of three classic clustering algorithms: k-means++, k-centers, and single-linkage hierarchical clustering. We support our theoretical results with experiments on several real-world clustering instances. (arXiv)
Truncated Variance-Reduced Value Iteration
Yujia Jin, Ishani Karmarkar, Aaron Sidford, and Jiayi Wang
arXiv preprint, 2024
We study the problem of solving a discounted Markov decision process (MDP) with a generative model, a fundamental and extensively studied reinforcement learning setting. We present a new randomized variant of value iteration that improves the runtime for computing a coarsely optimal solution. Importantly, our method is model-free and takes a step toward closing the sample-efficiency gap between model-based and model-free methods in this setting. (arXiv)
Faster Spectral Density Estimation and Sparsification in the Nuclear Norm
Yujia Jin, Ishani Karmarkar, Christopher Musco, Aaron Sidford, and Apoorv Vikram Singh
Conference on Learning Theory (COLT), 2024
Spectral density estimation (SDE) is the problem of efficiently learning the distribution of eigenvalues of a graph, and it has broad applications in computational science and network science. In this paper, we present new, fast, and simple randomized and deterministic algorithms for spectral density estimation via a new notion of graph sparsification, which we call nuclear sparsification. (arXiv)
Towards Optimal Effective Resistance Estimation
Rajat Dwaraknath, Ishani Karmarkar, and Aaron Sidford
Neural Information Processing Systems (NeurIPS), 2023
Computing the effective resistance of nodes in a network (or undirected graph) is a fundamental task with many applications, for instance in graph machine learning, graph data analysis, and the design of efficient graph algorithms. In this paper, we introduce new efficient algorithms for estimating effective resistances in well-connected graphs. We also prove improved time lower bounds for effective resistance estimation. In addition, we provide natural extensions of our work to related problems on symmetric diagonally dominant (SDD) and positive semi-definite (PSD) matrices. (arXiv)
Internship Projects
Webpage in progress, to be updated soon :)
Other Projects
Webpage in progress, to be updated soon :)
Teaching Experience
Aside from research, I also enjoy teaching mathematics. At Stanford, I have had the opportunity to serve as a teaching assistant for CME 302 (Numerical Linear Algebra) and ME 300 (Linear Algebra with Engineering Applications).
At Caltech, I enjoyed being a teaching assistant for CME 213 (Markov Chains, Discrete Stochastic Processes, and Applications), ACM 104 (Applied Linear Algebra), CS 157 (Statistical Inference), and ACM 95/100 (Introductory Methods of Applied Math).