Curriculum Vitae

Below is a detailed CV.

Research

Research interests:
– Numerical linear algebra
– Tensor networks
– Non-convex and Riemannian optimization
– Machine learning

Generally I’m interested in tensors and numerical linear algebra, with a focus on applications to machine learning. Currently I’m developing a streaming sketch algorithm for tensor trains, which will make it feasible to compute tensor train decompositions of very large tensors in a distributed setting.

On weekends I like to study topics in data science, bioinformatics, and scientific computing to broaden my knowledge. I do this by taking online courses, reading textbooks, or doing small programming projects; for the latter I usually write blog posts on this website.

Publications and preprints

Streaming Tensor Train Approximation August 2022
Joint work with Daniel Kressner and Bart Vandereycken

Inspired by the generalized Nyström approximation for low-rank matrices, we developed a randomized streaming algorithm for computing tensor trains. The algorithm quickly computes accurate approximations of a wide variety of tensors (multi-dimensional arrays) by 'sketching' them: multiplying by random tensors to find a small subspace in which the tensor is easy to approximate. Because it is a streaming algorithm, it needs only a single pass over the data of the tensor being approximated, which also means it works well in a distributed setting and scales to very large tensors.
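To make the idea concrete, here is a minimal NumPy sketch of the matrix case, the generalized Nyström approximation, that inspired our algorithm. The function name, oversampling parameter, and test matrix are illustrative only; this is not the implementation from the paper.

```python
import numpy as np

def generalized_nystroem(A, rank, oversample=5, seed=None):
    """Rank-`rank` approximation of A built from two random sketches (single pass over A)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    X = rng.standard_normal((n, rank))               # right sketching matrix
    Y = rng.standard_normal((m, rank + oversample))  # left sketching matrix (oversampled)
    AX = A @ X                                       # sketch of the column space
    YtA = Y.T @ A                                    # sketch of the row space
    # A ≈ AX (Yᵀ A X)⁺ Yᵀ A; only the two sketches are needed, not A itself
    middle = np.linalg.pinv(Y.T @ AX)
    return AX @ middle @ YtA

# Example: approximate a numerically low-rank matrix
A = np.random.standard_normal((500, 30)) @ np.random.standard_normal((30, 400))
A_hat = generalized_nystroem(A, rank=30)
print(np.linalg.norm(A - A_hat) / np.linalg.norm(A))  # small relative error
```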


TTML: tensor trains for general supervised machine learning March 2022
Joint work with Bart Vandereycken

We describe how to use tensor trains to parametrize discretized functions, and how to turn them into a useful supervised machine learning estimator. Learning and initializing these tensor trains is a bit tricky, but definitely doable. I also wrote a blog post about this paper, explaining the main ideas in a more accessible way.
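As a rough illustration of the underlying idea (not the ttml package API), the sketch below evaluates a function stored as tensor train cores on a discretized grid: each coordinate is binned, and the matching core slices are multiplied together. All names, shapes, and the random cores are illustrative.

```python
import numpy as np

def tt_evaluate(cores, grids, x):
    """Evaluate a TT-parametrized discretized function at the point x."""
    result = np.ones((1, 1))
    for core, grid, xi in zip(cores, grids, x):
        idx = np.searchsorted(grid, xi) - 1       # bin index of coordinate xi
        idx = np.clip(idx, 0, core.shape[1] - 1)
        result = result @ core[:, idx, :]         # contract with the (r_{k-1} x r_k) slice
    return result.item()

# Tiny example: 3 dimensions, 10 grid points per dimension, TT-rank 2
rng = np.random.default_rng(0)
grids = [np.linspace(0.0, 1.0, 10)] * 3
cores = [rng.standard_normal((1, 10, 2)),
         rng.standard_normal((2, 10, 2)),
         rng.standard_normal((2, 10, 1))]
print(tt_evaluate(cores, grids, [0.3, 0.7, 0.1]))
```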


Recovering data you have never seen April 2021, published in The Science Breaker

I wrote a piece in a science outreach journal describing an article about low-rank matrix completion. The aim of the piece is to make the core concepts accessible and interesting to a wide audience.


On certain Hochschild cohomology groups for the small quantum group April 2021, published in Journal of Algebra
Joint work with Nicolas Hemelsoet

We apply the algorithm for the BGG resolution developed in our earlier paper to compute the Hochschild cohomology of blocks of the small quantum group. This allows us to study the center of the small quantum group, and our computations give stronger evidence for several conjectures concerning it. My contribution was writing all the code needed for this project.


A computer algorithm for the BGG resolution November 2019, published in Journal of Algebra
Joint work with Nicolas Hemelsoet

In this work we describe an algorithm to compute the BGG resolution of modules over a simple Lie algebra. This is then used to compute various things, such as the Hochschild cohomology of some flag varieties. My contribution was implementing the algorithm and solving several algorithmic problems.


Parallel 2-transport and 2-group torsors October 2018

This work is a continuation of my master's thesis. The idea is to study a toy model of principal 2-bundles and 2-transport by restricting to a stricter notion in which the fibers are all strict 2-groups. This allows us to obtain some nice generalizations of the classical theory that would be harder to prove in the more general setting.


Higher Gauge Theory February 2018 (master's thesis)


Open source contributions

tt-sketch python package

A randomized streaming algorithm for computing tensor trains. This library implements several fast algorithms for approximating tensors by tensor trains. It is written in an abstract, object-oriented fashion, making it easy to extend, and it supports quite a few different types of tensors.


ttml python package

A novel machine learning estimator based on tensor trains. I wrote this library by myself, and it includes many features for optimizing tensor trains in addition to the machine learning capabilities.


bgg-cohomology sage package

I wrote a Sagemath package used for computing the BGG resolution of simple Lie algebra modules and the associated cohomology.


geoopt

This is a package for Riemannian optimization using PyTorch. I added a Riemannian line-search optimizer and a conjugate-gradient optimizer to this project.
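For context, here is a minimal sketch of the geoopt workflow (assuming torch and geoopt are installed), shown with geoopt's stock RiemannianSGD optimizer; the choice of manifold, loss, and all sizes are my own example, not part of my contribution. The line-search and conjugate-gradient optimizers follow the same ManifoldParameter-based setup.

```python
import torch
import geoopt

torch.manual_seed(0)
A = torch.randn(50, 50)
A = A + A.T                                    # symmetric test matrix

# Orthonormal starting point on the Stiefel manifold of 50x5 matrices
X0 = torch.linalg.qr(torch.randn(50, 5))[0]
X = geoopt.ManifoldParameter(X0, manifold=geoopt.Stiefel())
optimizer = geoopt.optim.RiemannianSGD([X], lr=1e-2)

for step in range(500):
    optimizer.zero_grad()
    loss = -torch.trace(X.T @ A @ X)           # maximize the block Rayleigh quotient
    loss.backward()
    optimizer.step()                           # Riemannian step: project gradient + retract

print(-loss.item())  # increases toward the sum of the 5 largest eigenvalues of A
```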


autoray

This is a package meant to help write backend-agnostic code, i.e. code that can manipulate objects from different numerical libraries such as numpy, tensorflow, or pytorch (among others). I added a few extra translations from the numpy API to other libraries, improved the functionality for inferring the backend of a function's arguments, and made data type handling for array creation operations more consistent.
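As a small illustration of what backend-agnostic code looks like: only do and infer_backend below are autoray's public functions; the normalize_rows helper is my own example, not part of the library or of my contribution.

```python
import numpy as np
from autoray import do, infer_backend

def normalize_rows(x):
    """Divide each row of x by its Euclidean norm, for any supported backend."""
    norms = do('sqrt', do('sum', x * x, axis=-1))
    return x / norms[..., None]

x = np.random.rand(4, 3)
print(infer_backend(x))          # 'numpy'
print(normalize_rows(x).shape)   # (4, 3)
# The same function should work unchanged on e.g. torch or tensorflow arrays,
# with autoray dispatching 'sum' and 'sqrt' to the corresponding backend.
```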


Work experience

March 2018–December 2022 (expected):

PhD student at the University of Geneva.
I worked in pure mathematics from 2018 until early 2020, when I switched research direction to applied mathematics. Over the past few years a significant fraction of my time has been spent writing research code in Python, both numerical code and code for computer algebra. I spend about 20% of my time teaching, and another 20% studying to broaden my knowledge of data science and scientific computing, whether by taking online courses, reading textbooks, or doing small programming projects.


May 2021–present:

Senior Scientific Editor at The Science Breaker.
The goal of this journal is to make the core ideas behind published scientific research accessible to a wide audience to foster interest in science. It is also an excellent and informal way for scientists to get a flavor of the research and scientific methods of very different fields. As an editor I propose new articles and edit them to make them easier to read for laypersons.


2014-2016:

Teaching assistant at Utrecht University.
I was a teaching assistant for four different courses during my time as a student at Utrecht.


Education


2021/02
– Neuroscience and Neuroimaging Specialization, at Coursera.


2020/09
– Genomic Data Science Specialization, at Coursera.


2019/08
– Advanced Machine Learning Specialization, at Coursera.


2016-2017
– Masterclass Geometry, Topology and Physics, at the University of Geneva.


2015-2018
– Master's degree in Mathematical Sciences, at Utrecht University (cum laude, GPA 4.00).

– Honors degree “Utrecht Geometry Center”, at Utrecht University.


2012-2015
– Bachelor's degree in Mathematics, at Utrecht University (cum laude, GPA 4.00).

– Bachelor's degree in Physics and Astronomy, at Utrecht University (cum laude, GPA 4.00).

Skills

Programming languages

Advanced
Python

Intermediate
LaTeX
Mathematica
C/C++

Beginner
R
SQL

Tools
Armadillo, Bash, CVXPY, Cython, Docker, Linux, NetworkX, NumPy, Pandas, PyTorch, SageMath, SciPy, TensorFlow, Windows

Languages

C2 (native) Level
– Dutch
– English

B1 Level
– French

A2 Level
– Japanese
– Russian
– Spanish

Mathematical expertise

I have a wide background in pure and applied mathematics, and I feel comfortable with research-level mathematics in the following areas:

Applied mathematics:
– Bayesian statistics
– Computer vision
– Convex optimization
– Inverse problems
– Machine learning
– Multivariate statistics
– Neural networks
– Non-convex optimization
– Numerical linear algebra
– Quantum computing
– Riemannian optimization
– Signal processing
– Tensor networks
– Time series analysis

Pure mathematics:
– Algebraic topology
– Category theory
– Deformation quantization
– Differential geometry
– Fiber bundles
– Homological algebra
– Lie groupoids / algebroids
– Lie theory
– Moduli spaces
– Operads
– Poisson geometry
– Tensor / monoidal categories