Curriculum Vitae
Work experience
January 2023—Present
Software Developer at Grazper Technologies.
Recently started as a software developer, currently working on projects involving augmented reality and computer vision.
March 2018—December 2022
Doctoral candidate at the University of Geneva.
Performed research in applied mathematics after first studying pure mathematics for 2 years. Focused on advanced numerical optimization algorithms, particularly for tensors and their applications to machine learning. Developed 3 high-quality numerical software libraries in Python and contributed to 2 open-source projects as part of several research projects, resulting in 4 manuscripts.
Taught 3 courses per year as an assistant, receiving consistently positive feedback from students for clear solutions and lectures. Designed Python programming homework using Jupyter notebooks and unit tests for 5 courses, well received by both students and teaching staff.
May 2021—December 2022
Senior Scientific Editor at the Science Breaker.
Edited summaries of published scientific papers to make them accessible to a lay audience. Collaborated on manuscripts with 16 different scientists.
Automated part of the publishing process by turning the web versions of articles into PDF versions built with LaTeX, saving 15 minutes per published manuscript.
2014—2016
Teaching assistant at Utrecht University.
Assisted in teaching 4 different courses during my studies at Utrecht.
Education
2018-2022
Ph.D. in Mathematics | University of Geneva (mention: très bien).
2016-2017
Masterclass Geometry, Topology and Physics | University of Geneva.
2015-2018
MSc Mathematical Sciences | Utrecht University (cum laude, GPA 4.00).
2012-2015
BSc Mathematics | Utrecht University (cum laude, GPA 4.00).
BSc Physics and Astronomy | Utrecht University (cum laude, GPA 4.00).
Online courses
2021/02
Neuroscience and Neuroimaging Specialization | Johns Hopkins University (Coursera certificate).
2020/09
Genomic Data Science Specialization | Johns Hopkins University (Coursera certificate).
2019/08
Advanced Machine Learning Specialization | Higher School of Economics (Coursera certificate).
Skills
Programming languages
Advanced
Python
Intermediate
LaTeX
Mathematica
C/C++
Beginner
R
SQL
HTML/CSS
JavaScript / TypeScript
Tools
Armadillo,
Bash,
CVXPY,
Cython,
Docker,
Git,
Linux,
NumPy,
OpenCV,
Pandas,
PyTorch,
SageMath,
SciPy,
TensorFlow,
Windows
Languages
C2 (native) Level
Dutch
English
B1 Level
French
A2 Level
Japanese
Russian
Spanish
Publications and preprints
December 2022 (PhD Thesis)
Tensor Train Approximations: Riemannian Methods, Randomized Linear Algebra and Applications to Machine Learning
This is my PhD thesis, largely an extended version of my two papers with Bart Vandereycken, together with a comprehensive preliminaries section. Slides for my oral defense are also available.
August 2022
Streaming Tensor Train Approximation
Joint work with Daniel Kressner and Bart Vandereycken
Inspired by the generalized Nyström approximation for low-rank matrices, we developed a randomized streaming algorithm for computing tensor trains. The algorithm quickly computes accurate approximations of a wide variety of tensors (multi-dimensional arrays) by 'sketching' the tensor: multiplying it with random tensors to quickly find a small subspace in which the tensor is easy to approximate. Because it is a streaming algorithm, it needs only a single pass over the tensor's data; this also means it works very well in a distributed setting and scales to big data.
March 2022
TTML: tensor trains for general supervised machine learning
Joint work with Bart Vandereycken
We describe how to use tensor trains to parametrize discretized functions, and how to turn this into a useful supervised machine learning estimator. Learning and initializing these tensor trains is somewhat tricky, but definitely doable. I also wrote a blog post about this paper, explaining the main ideas in a more accessible way.
April 2021
Recovering data you have never seen, published in The Science Breaker
I wrote a piece in a science outreach journal describing an article about low-rank matrix completion. The aim is to make the core concepts accessible and interesting to a wide audience.
April 2021
On certain Hochschild cohomology groups for the small quantum group, published in Journal of Algebra
Joint work with Nicolas Hemelsoet
We apply the algorithm for the BGG resolution developed in the previous paper to compute the Hochschild cohomology of blocks of the small quantum group. This allows us to study the center of the small quantum group, and our computations give stronger evidence for several conjectures concerning it. My contribution was writing all the code needed for this project.
November 2019
A computer algorithm for the BGG resolution, published in Journal of Algebra
Joint work with Nicolas Hemelsoet
In this work we describe an algorithm to compute the BGG resolution for modules over a simple Lie algebra. This is then used to compute various things, such as the Hochschild cohomology of some flag varieties. My contribution was implementing the algorithm and solving several algorithmic problems.
October 2018
Parallel 2-transport and 2-group torsors
This work is a continuation of my master's thesis. The idea is to study a toy model of principal 2-bundles and 2-transport by restricting to a stricter notion, where the fibers are all strict 2-groups. This allows us to obtain some nice generalizations of the classical theory which would be harder to prove in the more general setting.
February 2018 (master thesis)
Open source contributions
A randomized streaming algorithm for computing tensor trains. This library implements several fast algorithms for approximating tensors by tensor trains. It is written in an abstract, object-oriented fashion and is easy to extend. It supports quite a few different types of tensors.
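The sketching idea behind this library is easiest to see in its matrix form. Below is a small illustrative sketch (my own toy code, not part of the library) of the generalized Nyström approximation that inspired the tensor-train version: two random sketches of a low-rank matrix suffice to reconstruct it in a single pass over its data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a rank-5 test matrix.
n, r = 100, 5
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))

# Sketch A with random matrices X and Y (Y slightly larger than X).
k, l = 8, 12
X = rng.standard_normal((n, k))
Y = rng.standard_normal((n, l))
AX = A @ X      # right sketch: one pass over A
YtA = Y.T @ A   # left sketch: can be formed in the same pass

# Assemble the approximation from the sketches alone.
A_approx = AX @ np.linalg.pinv(Y.T @ AX) @ YtA
err = np.linalg.norm(A - A_approx) / np.linalg.norm(A)
print(f"relative error: {err:.2e}")  # tiny, since rank(A) <= k
```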
A novel machine learning estimator based on tensor trains. I wrote this library by myself; it includes many features for optimizing tensor trains in addition to the machine learning capabilities.
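To illustrate the data structure this library builds on (generic tensor-train arithmetic, not the library's actual API): an entry of a tensor stored in tensor-train format is a product of small matrices, one per dimension.

```python
import numpy as np

rng = np.random.default_rng(1)

# A 3-dimensional tensor of shape (n, n, n) stored as a tensor train:
# cores G[k] of shape (r_{k-1}, n, r_k) with boundary ranks r_0 = r_3 = 1.
n, r = 10, 3
ranks = [1, r, r, 1]
cores = [rng.standard_normal((ranks[k], n, ranks[k + 1])) for k in range(3)]

def tt_entry(cores, idx):
    """Evaluate one entry T[idx] as a product of 1xr, rxr, rx1 slices."""
    v = cores[0][:, idx[0], :]
    for G, i in zip(cores[1:], idx[1:]):
        v = v @ G[:, i, :]
    return v.item()  # the final product is a 1x1 matrix

# Check against the fully assembled tensor (feasible only for tiny tensors).
full = np.einsum('aib,bjc,ckd->aijkd', *cores).reshape(n, n, n)
print(np.isclose(tt_entry(cores, (2, 5, 7)), full[2, 5, 7]))  # True
```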
I wrote a SageMath package used for computing the BGG resolution of simple Lie algebra modules and the associated cohomology.
This is a package for Riemannian optimization using PyTorch. I added a Riemannian line search and a conjugate gradient optimizer to this project.
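As a rough sketch of the general technique (an illustrative toy in NumPy, not this project's code): Riemannian gradient descent with a backtracking line search on the unit sphere, here minimizing the Rayleigh quotient so the iterates converge to an eigenvector of the smallest eigenvalue.

```python
import numpy as np

# Minimize the Rayleigh quotient x^T A x over the unit sphere; the
# minimizer is an eigenvector for the smallest eigenvalue of A.
n = 20
A = np.diag(np.arange(1.0, n + 1))  # known spectrum 1, 2, ..., 20

def cost(x):
    return x @ A @ x

rng = np.random.default_rng(2)
x = rng.standard_normal(n)
x /= np.linalg.norm(x)

for _ in range(1000):
    egrad = 2 * A @ x                # Euclidean gradient
    rgrad = egrad - (x @ egrad) * x  # project onto the tangent space at x
    t = 1.0
    while True:                      # backtracking (Armijo) line search
        y = x - t * rgrad
        y /= np.linalg.norm(y)       # retract back onto the sphere
        if cost(y) <= cost(x) - 1e-4 * t * (rgrad @ rgrad) or t < 1e-12:
            break
        t *= 0.5
    x = y

print(round(cost(x), 6))  # converges to the smallest eigenvalue, 1.0
```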
This is a package meant to help write backend-agnostic code, i.e. code that can manipulate objects from different numerical libraries such as NumPy, TensorFlow, or PyTorch (among others). I added a few extra translations from the NumPy API to other libraries, improved the functionality for inferring the backend of a function's arguments, and made data-type handling for array-creation operations more consistent.
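The backend-inference idea can be sketched as follows (a simplified toy, not the library's actual implementation): read the backend off the module an array's class lives in, then dispatch same-named functions to that module.

```python
import importlib
import numpy as np

def infer_backend(x):
    """Return the top-level module name of x's class, e.g. 'numpy'."""
    module = type(x).__module__.split('.')[0]
    # Plain Python scalars and lists fall back to numpy.
    return 'numpy' if module == 'builtins' else module

def do(fn_name, x, *args, **kwargs):
    """Dispatch fn_name to the backend that produced x."""
    backend = importlib.import_module(infer_backend(x))
    return getattr(backend, fn_name)(x, *args, **kwargs)

a = np.arange(4.0)
print(do('sum', a))           # 6.0, via numpy.sum
print(infer_backend([1, 2]))  # 'numpy' (builtins fallback)
```

The same `do('sum', t)` call would route to `torch.sum` for a PyTorch tensor, since its class lives in the `torch` module.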
Fixed a bug that caused scipy.linalg.lu_factor to accept only square matrices. Additionally discovered performance issues with the implementation of the matrix logarithm, scipy.linalg.logm.