Curriculum Vitae

Below is a detailed CV.


Research interests:
– Numerical linear algebra
– Tensor networks
– Non-convex and Riemannian optimization
– Machine learning

Generally I’m interested in tensors and numerical linear algebra, with a focus on applications to machine learning. Currently I’m working on a streaming sketch algorithm for tensor trains, which will make it feasible to compute tensor train decompositions of very large tensors in a distributed setting.

On the weekends I like to study topics in data science, bioinformatics, and scientific computing to broaden my knowledge. I do this by taking online courses, reading textbooks, or doing small programming projects; for the latter I usually write blog posts on this website.

Publications and preprints

TTML: tensor trains for general supervised machine learning March 2022
Joint work with Bart Vandereycken

We describe how to use tensor trains to parametrize discretized functions, and how to get a useful supervised machine learning estimator out of it. Learning and initializing these tensor trains is a bit tricky, but definitely doable. I also wrote a blog post about this paper, explaining the main ideas in a more accessible way.
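To give a flavor of the idea, here is a minimal sketch of how a discretized function stored as a tensor train is evaluated: each input index selects a small matrix from a core, and the matrices are multiplied together. The core shapes and the `tt_eval` helper are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def tt_eval(cores, idx):
    """Evaluate a tensor train at a multi-index.

    cores: list of arrays of shape (r_{k-1}, n_k, r_k), with r_0 = r_d = 1.
    idx:   one integer index per core.
    """
    v = np.ones((1,))
    for core, i in zip(cores, idx):
        v = v @ core[:, i, :]  # contract with the matrix slice chosen by index i
    return v.item()

# Rank-1 toy example: f(i, j) = a[i] * b[j]
a, b = np.array([1.0, 2.0]), np.array([3.0, 4.0])
cores = [a.reshape(1, 2, 1), b.reshape(1, 2, 1)]
print(tt_eval(cores, (1, 0)))  # 2.0 * 3.0 = 6.0
```

The point of the format is that evaluation costs only one small matrix product per dimension, so a function on an exponentially large grid stays cheap to query.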

Recovering data you have never seen April 2021, published in The Science Breaker

I wrote a piece in a science outreach journal describing an article about low-rank matrix completion. The aim of the piece is to make the core concepts accessible and interesting to a wide audience.

On certain Hochschild cohomology groups for the small quantum group April 2021
Joint work with Nicolas Hemelsoet

We apply the algorithm for the BGG resolution developed in the previous paper to compute Hochschild cohomology of blocks of the small quantum group. This allows us to study the center of the small quantum group, and our computations give stronger evidence for several conjectures concerning the small quantum group. My contribution was writing all the code needed for this project.

A computer algorithm for the BGG resolution November 2019, published in Journal of Algebra
Joint work with Nicolas Hemelsoet

In this work we describe an algorithm to compute the BGG resolution for modules over a simple Lie algebra. This is then used to compute various things, such as the Hochschild cohomology of some flag varieties. My contribution was coding the implementation of the algorithm and solving several algorithmic problems.

Parallel 2-transport and 2-group torsors October 2018

This work is a continuation of my master's thesis. The idea is to study a toy model of principal 2-bundles and 2-transport by restricting to a stricter notion, where the fibers are all strict 2-groups. This allows us to obtain some nice generalizations of the classical theory, which would be harder to prove in the more general setting.

Higher Gauge Theory February 2018 (master thesis)

Open source contributions

ttml python package

A novel machine learning estimator based on tensor trains. I wrote this library myself; in addition to the machine learning capabilities, it includes many features for optimizing tensor trains.

bgg-cohomology sage package

I wrote a Sagemath package used for computing the BGG resolution of simple Lie algebra modules and the associated cohomology.


This is a package for Riemannian optimization using PyTorch. I added a Riemannian line search and conjugate gradient optimizer to this project.


This is a package meant to help write backend-agnostic code, i.e. code that can manipulate objects from different numerical libraries such as numpy, tensorflow, or pytorch (among others). I added a few extra translations from the numpy API to other libraries, improved the functionality for inferring the backend of a function's arguments, and made data type handling for array creation operations more consistent.
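The basic trick behind backend-agnostic code can be sketched in a few lines: since an array's type records which library it came from, the backend can be inferred from the object itself. The `infer_backend` helper below is an illustrative assumption, not this package's actual API.

```python
import numpy as np

def infer_backend(x):
    """Return the name of the top-level module an array-like object comes from.

    E.g. a numpy ndarray reports "numpy", a torch tensor would report "torch".
    """
    return type(x).__module__.split(".")[0]

print(infer_backend(np.ones(3)))  # "numpy"
print(infer_backend([1, 2, 3]))   # "builtins" (plain Python list)
```

Dispatching on this inferred name is what lets one function body transparently call the right library's routines for whatever array type it receives.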

Work experience

May 2021–present:

Senior Scientific Editor at The Science Breaker.
The goal of this journal is to make the core ideas behind published scientific research accessible to a wide audience to foster interest in science. It is also an excellent and informal way for scientists to get a flavor of the research and scientific methods of very different fields. As an editor I propose new articles and edit them to make them easier to read for laypersons.

March 2018–present:

PhD student at University of Geneva.
I worked in pure mathematics from 2018 until early 2020, when I switched research direction to applied math. Over the past few years a significant fraction of my time has been spent writing research code in Python, both numerical code and code for computer algebra. I spend about 20% of my time teaching, and another 20% studying to broaden my knowledge of data science and scientific computing, either by taking online courses, reading textbooks, or doing small programming projects.


Teaching assistant at Utrecht University.
I was a teaching assistant for four different courses during my time as a student at Utrecht.


Education

– Neuroscience and Neuroimaging Specialization, at Coursera.

– Genomic Data Science Specialization, at Coursera.

– Advanced Machine Learning Specialization, at Coursera.

– Masterclass Geometry, Topology and Physics, at University of Geneva.

– Master's degree in Mathematical Sciences, at Utrecht University (cum laude, GPA 4.00).

– Honors degree “Utrecht Geometry Center”, at Utrecht University.

– Bachelor's degree in Mathematics, at Utrecht University (cum laude, GPA 4.00).

– Bachelor's degree in Physics and Astronomy, at Utrecht University (cum laude, GPA 4.00).


Programming languages




Armadillo, Bash, CVXPY, Cython, Docker, Linux, Networkx, NumPy, Pandas, PyTorch, Sagemath, SciPy, Tensorflow, Windows


Languages

C2 Level (native)
– Dutch
– English

B1 Level
– French

A2 Level
– Japanese
– Russian
– Spanish

Mathematical expertise

I have a wide background in pure and applied mathematics, and I feel comfortable with research-level mathematics in the following areas:

Applied mathematics:
– Bayesian statistics
– Computer vision
– Convex optimization
– Inverse problems
– Machine learning
– Neural networks
– Non-convex optimization
– Numerical linear algebra
– Quantum computing
– Riemannian optimization
– Signal processing
– Tensor networks

Pure mathematics:
– Algebraic topology
– Category theory
– Deformation quantization
– Differential geometry
– Fiber bundles
– Homological algebra
– Lie groupoids / algebroids
– Lie theory
– Moduli spaces
– Operads
– Poisson geometry
– Tensor / monoidal categories