Kalin Nonchev, MSc

"From error to error, one discovers the entire truth." - Sigmund Freud

PhD Student

E-Mail
kalin.nonchev@get-your-addresses-elsewhere.inf.ethz.ch
Address
ETH Zurich
Biomedical Informatics Group
Department of Computer Science
Universitätstrasse 6
Room
CAB F 51.1
Twitter
@nonchevk

Committed to advancing machine learning research in biomedicine to improve clinical decision-making.

My research focuses on understanding the gap between genotype and phenotype in human diseases. Machine learning algorithms are key tools for uncovering biological patterns and interpreting medical datasets. I believe this is the key to more accurate disease diagnoses and, more importantly, to well-founded rationales for patients' therapies.

I studied bioinformatics at the Technical University of Munich and Ludwig Maximilian University of Munich, followed by a Master's degree at ETH Zurich. During my Bachelor's, I worked with Prof. Julien Gagneur on rare-disease genomics. During my Master's, I contributed to transcriptomics projects at the Functional Genomics Center Zurich and at Roche Diagnostics. I am currently a doctoral candidate supervised by Prof. Gunnar Rätsch, and I am grateful to work with exceptional people who continually push the boundaries of biomedical informatics.

I am very open to research collaborations and to mentoring BSc and MSc students. Please reach out if you are interested. Find out more on my homepage: https://kalinnonchev.github.io.

Abstract Spot-based spatial transcriptomics (ST) technologies like 10x Visium quantify genome-wide gene expression and preserve spatial tissue organization. However, their coarse spot-level resolution aggregates signals from multiple cells, preventing accurate single-cell analysis and detailed cellular characterization. Here, we present DeepSpot2Cell, a novel DeepSet neural network that leverages pretrained pathology foundation models and spatial multi-level context to effectively predict virtual single-cell gene expression from histopathological images using spot-level supervision. DeepSpot2Cell substantially improves gene expression correlations on a newly curated benchmark we specifically designed for single-cell ST deconvolution and prediction from H&E images. The benchmark includes 20 lung, 7 breast, and 2 pancreatic cancer samples, across which DeepSpot2Cell outperformed previous super-resolution methods, achieving improvements of 46%, 65%, and 38%, respectively, in cell expression correlation for the top 100 genes. We hope that DeepSpot2Cell and this benchmark will stimulate further advancements in virtual single-cell ST, enabling more precise delineation of cell-type-specific expression patterns and facilitating enhanced downstream analyses. Code availability: https://github.com/ratschlab/DeepSpot
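The "cell expression correlation" reported above is, in essence, a per-gene Pearson correlation between predicted and measured expression. The sketch below illustrates that metric on toy data; the function name, array shapes, and noise level are hypothetical stand-ins, not the benchmark's actual code:

```python
import numpy as np

def per_gene_correlation(pred: np.ndarray, truth: np.ndarray) -> np.ndarray:
    """Pearson correlation per gene between predicted and measured expression.

    pred, truth: (n_cells, n_genes) arrays; returns one correlation per gene.
    """
    p = pred - pred.mean(axis=0)
    t = truth - truth.mean(axis=0)
    cov = (p * t).sum(axis=0)
    denom = np.sqrt((p ** 2).sum(axis=0) * (t ** 2).sum(axis=0))
    return cov / denom

rng = np.random.default_rng(1)
truth = rng.normal(size=(100, 5))                 # 100 cells, 5 genes (toy)
pred = truth + 0.1 * rng.normal(size=(100, 5))    # near-perfect predictions
corr = per_gene_correlation(pred, truth)          # one value per gene, close to 1
```

In benchmarks of this kind, genes are typically ranked (e.g., top 100 most variable or highly expressed) and the correlations summarized across them.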

Authors Kalin Nonchev, Glib Manaiev, Viktor H Koelzer, Gunnar Rätsch

Submitted NeurIPS 2025 Imageomics


Abstract Histopathology refers to the microscopic examination of diseased tissues and routinely guides treatment decisions for cancer and other diseases. Currently, this analysis focuses on morphological features but rarely considers gene expression information, which can add an important molecular dimension. Here, we introduce SpotWhisperer, an AI method that links histopathological images to spatial gene expression profiles and their text annotations, enabling molecularly grounded histopathology analysis through natural language. Our method outperforms pathology vision-language models on a newly curated benchmark dataset, dedicated to spatially resolved H&E annotation. Integrated into a web interface, SpotWhisperer enables interactive exploration of cell types and disease mechanisms using free-text queries with access to inferred spatial gene expression profiles. In summary, SpotWhisperer analyzes cost-effective pathology images with spatial gene expression and natural-language AI, demonstrating a path for routine integration of microscopic molecular information into histopathology.

Authors Moritz Schaefer, Kalin Nonchev, Animesh Awasthi, Jake Burton, Viktor H Koelzer, Gunnar Rätsch, Christoph Bock

Submitted ICML 2025 FM4LS


Abstract Spatial transcriptomics technology remains resource-intensive and is unlikely to be routinely adopted for patient care soon. This hinders the development of novel precision-medicine solutions and, more importantly, limits the translation of research findings into patient treatment. Here, we present DeepSpot, a deep-set neural network that leverages recent pathology foundation models and spatial multi-level tissue context to effectively predict spatial transcriptomics from H&E images. DeepSpot substantially improved gene correlations across multiple datasets from patients with metastatic melanoma, kidney, lung, or colon cancer compared to the previous state of the art. Using DeepSpot, we generated 1,792 TCGA spatial transcriptomics samples (37 million spots) for the melanoma and renal cell cancer cohorts. We anticipate this to be a valuable resource for biological discovery and a benchmark for evaluating spatial transcriptomics models. We hope that DeepSpot and this dataset will stimulate further advancements in computational spatial transcriptomics analysis.
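The deep-set idea behind this line of work can be illustrated with a minimal sketch: a spot is treated as a *set* of tile features, an element-wise encoder is applied per tile, a permutation-invariant pooling aggregates them, and a decoder maps the pooled vector to gene expression. All names, shapes, and random weights below are hypothetical toy stand-ins, not the published model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy weights: 384-dim tile features -> 64-dim set embedding -> 50 genes.
W_phi = rng.normal(scale=0.1, size=(384, 64))
W_rho = rng.normal(scale=0.1, size=(64, 50))

def deepset_predict(tile_features: np.ndarray) -> np.ndarray:
    """Predict a gene-expression vector from a set of tile features.

    tile_features: (n_tiles, 384) array; the order of tiles must not matter.
    """
    encoded = np.maximum(tile_features @ W_phi, 0.0)  # phi: per-element encoder (ReLU)
    pooled = encoded.mean(axis=0)                     # permutation-invariant pooling
    return pooled @ W_rho                             # rho: map to gene expression

tiles = rng.normal(size=(7, 384))                     # one spot with 7 sub-tiles
pred = deepset_predict(tiles)
pred_shuffled = deepset_predict(tiles[rng.permutation(7)])
# Shuffling the tile order leaves the prediction (numerically) unchanged.
```

The key design property is that mean pooling makes the output invariant to tile ordering, which is what qualifies the architecture as a deep set.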

Authors Kalin Nonchev, Sebastian Dawo, Karina Selina, Holger Moch, Sonali Andani, Tumor Profiler Consortium, Viktor Hendrik Koelzer, Gunnar Rätsch

Submitted MedRxiv


Abstract Spatial transcriptomics enables in-depth molecular characterization of samples at the morphological and RNA levels while preserving spatial location. Integrating the resulting multi-modal data is an unsolved problem, and developing new solutions in precision medicine depends on improved methodologies. Here, we introduce AESTETIK, a convolutional deep learning model that jointly integrates spatial, transcriptomic, and morphological information to learn accurate spot representations. AESTETIK yielded substantially improved cluster assignments on widely adopted technology platforms (e.g., 10x Genomics™, NanoString™) across multiple datasets. On structured tissues (e.g., brain), it achieved a 21% increase in median ARI over previous state-of-the-art methods. Notably, AESTETIK also demonstrated superior performance on cancer tissues with heterogeneous cell populations, with a two-fold increase in median ARI for breast cancer, a 79% increase for melanoma, and a 21% increase for liver cancer. We expect that these advances will enable a multi-modal understanding of key biological processes.
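The ARI (adjusted Rand index) figures above measure agreement between predicted cluster assignments and reference annotations, corrected for chance. Below is a self-contained sketch of this standard metric on toy labels (an illustration of the metric itself, not the authors' evaluation code):

```python
import numpy as np

def adjusted_rand_index(labels_true, labels_pred) -> float:
    """Adjusted Rand index between two clusterings of the same items."""
    labels_true = np.asarray(labels_true)
    labels_pred = np.asarray(labels_pred)
    n = labels_true.size
    # Contingency table: how many items fall in each (true, predicted) pair.
    _, true_idx = np.unique(labels_true, return_inverse=True)
    _, pred_idx = np.unique(labels_pred, return_inverse=True)
    table = np.zeros((true_idx.max() + 1, pred_idx.max() + 1), dtype=np.int64)
    np.add.at(table, (true_idx, pred_idx), 1)
    comb2 = lambda x: x * (x - 1) // 2                # "n choose 2", element-wise
    sum_comb = comb2(table).sum()                     # agreeing pairs
    sum_a = comb2(table.sum(axis=1)).sum()            # pairs within true clusters
    sum_b = comb2(table.sum(axis=0)).sum()            # pairs within predicted clusters
    expected = sum_a * sum_b / comb2(n)               # chance-level agreement
    max_index = (sum_a + sum_b) / 2
    return float((sum_comb - expected) / (max_index - expected))

# Identical partitions score 1.0 even when the cluster labels are renamed.
perfect = adjusted_rand_index([0, 0, 1, 1], [1, 1, 0, 0])
```

ARI is invariant to label permutations and is 0 in expectation for random assignments, which is why it is a common choice for comparing unsupervised spot clusterings against annotations.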

Authors Kalin Nonchev, Sonali Andani, Joanna Ficek-Pascual, Marta Nowak, Bettina Sobottka, Tumor Profiler Consortium, Viktor Hendrik Koelzer, Gunnar Rätsch

Submitted MedRxiv
