Akshay Gautam

Machine Learning & Computational Neuroscience Researcher

About

Research

I'm a PhD student at the Institute of Machine Learning, University of Edinburgh, working on applying machine learning methods to computational neuroscience.

My research focuses on a fundamental challenge in systems neuroscience: disentangling how inputs, noise, and dynamics shape neural computation. I develop identifiable models of input-driven dynamics: low-rank input decompositions for linear dynamical systems, and control-theoretic approaches (iLQR, variational inference) for estimating latent inputs in nonlinear recurrent networks. I work primarily in JAX and PyTorch, drawing on dynamical systems theory, probabilistic modeling, and optimal control.

Entrepreneurship & Applications

Beyond research, I'm interested in building things that matter. I'm drawn to entrepreneurship and applying ML methods to solve real problems, whether in neurotechnology, computational tools, or other domains where these techniques can create practical impact. I'm motivated by opportunities to work across different contexts and contribute to projects with real-world value, from early-stage startups to grassroots initiatives.

Expertise: Machine Learning for Complex Systems
Research Focus: Dynamics and Geometry of Neural Computation
Methods & Tools: Probabilistic Modeling and Dynamical Systems using JAX/PyTorch

Projects

Recent work in computational neuroscience and machine learning

Neural Unmixing: Decoding Recurrent Dynamics and Structured Inputs

Implementation and evaluation of structured Multivariate Autoregressive (MVAR) and Latent Dynamical System (LDS) models with low-rank inputs for analyzing neural time series data. Developed tailored parameter estimation algorithms and a nested cross-validation framework to robustly recover latent dynamics and structured variability across trials. This work forms the foundation for my ongoing PhD research.

NumPy · SciPy
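The low-rank input idea above can be illustrated with a minimal NumPy sketch. All dimensions and values here are toy choices of mine, and the input drive is assumed known for brevity (in the actual project it is estimated jointly with the dynamics): a rank-1 drive shapes the trajectory, and the dynamics matrix is recovered by least squares once that drive is accounted for.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 5, 2000  # toy sizes: n neurons, T time bins

# Stable recurrent dynamics matrix A.
A = 0.5 * np.eye(n) + 0.05 * rng.standard_normal((n, n))

# Low-rank (rank-1) input: a fixed spatial loading times a temporal profile.
loading = rng.standard_normal((n, 1))          # which neurons the input drives
profile = np.sin(np.linspace(0, 20, T))[None]  # shared time course
U = loading @ profile                          # rank-1 input, shape (n, T)

# Simulate the MVAR(1) model x_t = A x_{t-1} + u_t + noise.
X = np.zeros((n, T))
for t in range(1, T):
    X[:, t] = A @ X[:, t - 1] + U[:, t] + 0.05 * rng.standard_normal(n)

# With the input drive subtracted, A is a plain least-squares regression
# of x_t on x_{t-1}.
Y = X[:, 1:] - U[:, 1:]
A_hat = Y @ np.linalg.pinv(X[:, :-1])

print(np.max(np.abs(A_hat - A)))  # small estimation error
```

In the real setting the loading and profile are themselves free parameters, which is what makes the low-rank structure a useful constraint: it keeps the input term from absorbing arbitrary variability.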

Disentangling Input-Driven and Intrinsic Neural Variability

Integration of two neural dynamics modeling frameworks, Low Tensor Rank RNNs and iLQR-VAE, to analyze learning-induced changes in neural connectivity by inferring latent inputs and connectivity changes in nonlinear dynamical systems from neural data. The work involved wrestling with fundamental identifiability challenges in separating intrinsic dynamics from external forcing in nonlinear systems. Together with the project above, it forms the foundation of my ongoing PhD research.

JAX · PyTorch
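The identifiability challenge mentioned above is easy to exhibit even in the linear, noise-free case. This NumPy sketch (toy sizes, my own illustration rather than project code) constructs two genuinely different dynamics matrices whose trajectories are indistinguishable once the inferred inputs absorb the difference:

```python
import numpy as np

rng = np.random.default_rng(1)
n, T = 4, 100

# Ground-truth linear dynamics and an external input sequence.
A = 0.5 * np.eye(n) + 0.05 * rng.standard_normal((n, n))
u = 0.2 * rng.standard_normal((n, T))

# Simulate x_t = A x_{t-1} + u_t (noise omitted for clarity).
X = np.zeros((n, T))
for t in range(1, T):
    X[:, t] = A @ X[:, t - 1] + u[:, t]

# A second model with different recurrent dynamics ...
A2 = 0.8 * A
# ... reproduces the data exactly, provided its inferred inputs absorb
# the difference: u2_t = u_t + (A - A2) x_{t-1}.
u2 = u.copy()
u2[:, 1:] += (A - A2) @ X[:, :-1]

X2 = np.zeros((n, T))
for t in range(1, T):
    X2[:, t] = A2 @ X2[:, t - 1] + u2[:, t]

print(np.max(np.abs(X - X2)))  # trajectories are numerically identical
```

Observations alone cannot distinguish the two models, which is why constraints such as low tensor rank on the connectivity, or a prior over inputs as in iLQR-VAE, are needed to break the degeneracy.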

Multi-Label Safety Classification with Active Learning

Multi-label safety classification system for imbalanced datasets, integrating active learning with human annotator feedback. Fine-tuned transformer models (T5, LLaMA, BERT) using LoRA and 8-bit quantization, comparing generative multi-label (seq2seq) and discriminative binary classification approaches. Exploration of LLM-assisted annotation pipelines (LLMaAA methodology) combining prompt engineering with active learning strategies for scalable data collection.

PyTorch · HuggingFace Transformers · LoRA · T5 · LLaMA · BERT
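The query-selection step at the heart of the active learning loop above can be sketched in a few lines of NumPy. The scoring rule (mean closeness to 0.5 across labels, a simple multi-label least-confidence criterion) and the function name are my own simplification, not the project's code:

```python
import numpy as np

def select_for_annotation(probs: np.ndarray, k: int) -> np.ndarray:
    """Pick the k most uncertain pool examples for human labeling.

    probs: (n_examples, n_labels) predicted label probabilities from the
    current classifier. An example is uncertain when its probabilities sit
    near the 0.5 decision boundary across labels.
    """
    uncertainty = -np.abs(probs - 0.5).mean(axis=1)  # higher = less confident
    return np.argsort(uncertainty)[-k:][::-1]        # most uncertain first

# Toy pool: example 1 is maximally uncertain, example 0 is confident.
probs = np.array([
    [0.99, 0.01, 0.98],   # confident on every label
    [0.50, 0.52, 0.49],   # near the boundary everywhere -> queried first
    [0.90, 0.40, 0.95],   # uncertain on one label
])
print(select_for_annotation(probs, 2))  # -> [1 2]
```

In the full pipeline the selected batch would go to human annotators (or an LLM-assisted annotation step, per the LLMaAA methodology), and the fine-tuned classifier would be retrained before the next query round.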