I am a Software Engineer at Google working on the XLA TPU compiler.
My interests are compilers (optimising compilation for deep learning), programming languages (expressive type systems are great) and deep learning. In general, I love complex systems: the MLIR ecosystem is a good example in programming languages, and deep neural architectures are one in AI.
My PhD thesis (2023; PDF, slides) is on programming languages and compilers for deep learning. I did my PhD at the University of Edinburgh (ICSA), co-supervised by Christophe Dubach, Michel Steuwer, Michael O’Boyle and Kenneth Heafield. My PhD examiners were Murray Cole and Jeremy Singer.
My PhD project focused on optimising compilation techniques that benefit from a functional intermediate representation (IR), with deep neural nets and GPUs as a case study. I also worked on this topic with Ryota Tomioka during an internship at Microsoft Research. As a research intern at ARM Research, I worked on software/hardware co-design for deep learning with Giacomo Gabrielli and Ali Zaidi: specifically, on a compiler that generates FPGA designs in the Spatial HLS language, with LSTM networks as a case study. I also helped teach algorithms, machine learning, Java and cognitive science.