I am a Professor of Computer Science at the University Savoie Mont Blanc and a member of the Laboratory of Mathematics. I am in charge of the Cursus Master en Ingénierie Informatique, and I teach in the Computer Science department of UFR Sciences et Montagne. My research interests include digital geometry and topology, image segmentation and analysis, geometry processing, variational models, and discrete calculus. Although not an expert in these fields, I am also quite fond of arithmetic, combinatorics on words, computational geometry, combinatorial topology, and differential geometry, and I often use related results in my research.
Download my resumé: short (en) / short (fr) / long (fr), my HDR dissertation (fr) and slides (fr), and my PhD dissertation (fr)
HDR in Computer Science (Habilitation à Diriger des Recherches), 2006
University Bordeaux 1
PhD in Computer Science, 1998
University Joseph Fourier
MSc in Computer Science, 1994
University Joseph Fourier
Engineer in Applied Mathematics and Computer Science, 1994
ENSIMAG School of Engineering
In a recent work, full convexity has been proposed as an alternative definition of digital convexity. It solves many problems related to the usual definitions: fully convex sets are digitally convex in the usual sense, but are also connected and simply connected. However, full convexity is not a monotone property, so the intersection of fully convex sets may be neither fully convex nor connected. This defect might forbid digital polyhedral models with fully convex faces and edges, which would be detrimental since classical standard and naive planes are fully convex. In this paper, we study several methods that build a fully convex set from a digital set. One of them is particularly appealing and is based on an iterative process: this envelope operator solves in arbitrary dimension the problem of extending a digital set into a fully convex set, while leaving fully convex sets invariant. This extension naturally leads to digital polyhedra whose cells are fully convex. We then propose a relative envelope operator, which can be used to force digital planarity of fully convex sets. We provide experiments showing that our method produces coherent polyhedral models for any polyhedron in arbitrary dimension. Finally, we study how to speed up full convexity checks and envelope operations, lowering the worst-case complexity by a factor $2^d$ in ${\mathbb {Z}}^d$.
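Writing the envelope operator as $\mathrm{FC}$ (notation mine, not from the paper), the properties stated in this abstract can be summarised as:

```latex
X \subseteq \mathrm{FC}(X), \qquad
\mathrm{FC}\big(\mathrm{FC}(X)\big) = \mathrm{FC}(X), \qquad
X \text{ fully convex} \;\Longrightarrow\; \mathrm{FC}(X) = X .
```

That is, the operator is extensive and idempotent, and acts as the identity exactly on fully convex sets, which is what makes it usable as a "fully convex hull" in constructions such as digital polyhedra.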
The estimation of differential quantities on oriented point clouds is a classical step in many geometry processing tasks in computer graphics and vision. Although many solutions exist to estimate such quantities, they usually fail to combine a stable estimation with theoretical guarantees and an efficient algorithm. Relying on the notion of corrected curvature measures [LRT22, LRTC20] designed for surfaces, the method introduced in this paper meets both requirements. Given a point of interest and a few nearest neighbours, our method estimates the whole curvature tensor information by generating random triangles within these neighbours and normalising the corrected curvature measures by the corrected area measure. We provide a stability theorem showing that our pointwise curvatures are accurate and convergent, provided the noise in position and normal information has a variance smaller than the neighbourhood radius. Experiments and comparisons with the state of the art confirm that our approach is more accurate and much faster than alternatives. The method is fully parallelizable, requires only one nearest-neighbour request per point of computation, and is trivial to implement.
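Schematically, the normalisation described above yields pointwise estimates as ratios of measures over a neighbourhood $B(p,r)$ of the point of interest (a sketch of the principle; the exact weighting is in the paper):

```latex
\hat H(p) \;=\; \frac{\mu^{H}\big(B(p,r)\big)}{\mu^{0}\big(B(p,r)\big)},
\qquad
\hat K(p) \;=\; \frac{\mu^{G}\big(B(p,r)\big)}{\mu^{0}\big(B(p,r)\big)},
```

where $\mu^{H}$ and $\mu^{G}$ are the corrected mean and Gaussian curvature measures of [LRT22, LRTC20], accumulated over the random triangles, and $\mu^{0}$ is the corresponding corrected area measure.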
UV mapping is a classical problem in computer graphics: computing a planar parameterization of an input mesh with the lowest possible distortion while minimizing the seam length. Recent works propose optimization methods that solve these two problems jointly with variational models, but they tend to be slower than other cutting methods. We present a new variational approach for this problem inspired by the Ambrosio-Tortorelli functional, which is easier to optimize than existing methods. This functional has been widely used in image and geometry processing for anisotropic denoising and segmentation applications. Its key feature is to model both the regions where smoothing is applied and the loci of discontinuities, which correspond here to the cuts. Our approach relies on this principle to model both the low-distortion objective of the UV map and the minimization of the seam length (sequences of mesh edges). Our method reduces distortion significantly faster than state-of-the-art methods, with comparable seam quality. We also demonstrate the versatility of the approach when external constraints on the parameterization are provided (packing constraints, seam visibility).
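For reference, the classical Ambrosio-Tortorelli functional for a datum $g$ on a domain $\Omega$ reads (the UV-mapping adaptation in the paper replaces the data-fidelity and smoothing terms with distortion terms, so weights and terms differ):

```latex
AT_\varepsilon(u,v) \;=\;
\alpha \int_\Omega (u-g)^2 \,dx
\;+\; \int_\Omega v^2\,|\nabla u|^2 \,dx
\;+\; \lambda \int_\Omega \Big( \varepsilon\,|\nabla v|^2 + \frac{(1-v)^2}{4\varepsilon} \Big) dx ,
```

where $u$ is the smoothed quantity and $v \in [0,1]$ is the discontinuity indicator: $v \approx 0$ marks the loci of discontinuities (here, the seams), while $v \approx 1$ elsewhere lets the smoothing term act.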
Computing differential quantities or solving partial differential equations on discrete surfaces is at the core of many geometry processing and simulation tasks. For digital surfaces in $\mathbb{Z}^3$ (boundaries of voxels), several challenges arise when trying to define a discrete calculus framework that mimics the continuous one: the vertex positions and the geometry of faces do not capture well the geometry of the underlying smooth Euclidean object, even when refined asymptotically. Furthermore, the surface may not be a combinatorial 2-manifold, even for discretizations of smooth regular shapes. In this paper, we adapt a discrete differential calculus defined on polygonal meshes to the specific case of digital surfaces. We show through several experiments that this discrete calculus accurately mimics the continuous calculus operating on the underlying smooth object: convergence of gradient and weak Laplace operators, spectral analysis and geodesic computations, mean curvature approximation, and tolerance to non-manifold loci.
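As a rough sketch of the kind of operators involved (not the paper's exact construction), discrete calculus frameworks of this family typically assemble a weak Laplace operator from a per-face gradient operator $G$ and an inner product $M$ on face vectors:

```latex
L \;=\; -\,G^{\top} M\, G,
\qquad
\langle \Delta u, \phi \rangle \;\approx\; -\,(G\phi)^{\top} M\,(G u),
```

so that convergence of $G$ and $M$ toward their continuous counterparts (here, corrected by the estimated geometry of the underlying smooth object) drives the convergence of the weak Laplacian tested in the experiments.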
This paper proposes full convexity as an alternative definition of digital convexity, valid in arbitrary dimension. It solves many problems related to the usual definitions, such as possible non-connectedness or non-simple connectedness, while encompassing their desirable features. Fully convex sets are digitally convex, but are also connected and simply connected. They have a morphological characterisation, which induces a simple convexity test algorithm. Arithmetic planes are fully convex too. Full convexity implies local full convexity, hence it enables local shape analysis, with an unambiguous definition of convex, concave and planar points. As a kind of relative full convexity, we propose a natural definition of subsets tangent to a digital set. It gives rise to the tangential cover in 2D, and to consistent extensions in arbitrary dimension. Finally, we present two applications of tangency: first, a simple algorithm for building a polygonal mesh from a set of digital points, with a reversibility property; second, the definition and computation of shortest paths within digital sets.
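For intuition, full convexity can be phrased through the cells of the cubical grid complex touched by the convex hull (a paraphrase; see the paper for the precise statement): a digital set $X \subset \mathbb{Z}^d$ is fully convex when, for every cell dimension $k$,

```latex
C_k\big[\mathrm{conv}(X)\big] \;=\; C_k\big[X\big], \qquad 0 \le k \le d,
```

where $C_k[Y]$ denotes the set of $k$-cells of the cubical grid whose topological closure meets $Y$. Usual digital convexity only constrains the integer points ($k=0$), which is why it does not guarantee connectedness.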
This paper proposes a new mathematical and computational tool for inferring the geometry of shapes known only through approximations such as triangulated or digital surfaces. The main idea is to decouple the position of the shape boundary from its normal vector field. To do so, we extend a classical tool of geometric measure theory, the normal cycle, so that it takes as input not only a surface but also a normal vector field. We formalize it as a current in the oriented Grassmann bundle $\mathbb{R}^3 \times \mathbb{S}^2$. By choosing adequate differential forms, we define geometric measures such as area, mean and Gaussian curvatures. We then show the stability of these measures when both position and normal input data are approximations of the underlying continuous shape. As a byproduct, our tool correctly estimates curvatures over polyhedral approximations of shapes, with explicit bounds, even when their natural normals are incorrect, as long as an externally provided convergent normal vector field is available. Finally, the accuracy, convergence and stability under noise perturbation are evaluated experimentally on digital surfaces.
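Schematically (notation mine), the geometric measures arise by evaluating the current $N(S,\mathbf{u})$ associated to a surface $S$ with normal field $\mathbf{u}$ on invariant differential forms of $\mathbb{R}^3 \times \mathbb{S}^2$, restricted to a region $B$ of interest:

```latex
\mu^{A}(B) = N(S,\mathbf{u})\big(\mathbf{1}_B\,\omega^{A}\big), \quad
\mu^{H}(B) = N(S,\mathbf{u})\big(\mathbf{1}_B\,\omega^{H}\big), \quad
\mu^{G}(B) = N(S,\mathbf{u})\big(\mathbf{1}_B\,\omega^{G}\big),
```

where $\omega^{A}$, $\omega^{H}$, $\omega^{G}$ denote the area, mean curvature and Gaussian curvature forms. When $S$ is smooth and $\mathbf{u}$ is its true normal field, these reduce to $\int_B dA$, $\int_B H\,dA$ and $\int_B K\,dA$, which is the sense in which the construction extends the classical normal cycle.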
We propose an original method for vectorizing an image or zooming it at an arbitrary scale. The core of our method relies on the resolution of a geometric variational model and therefore offers theoretical guarantees. More precisely, it associates a total variation energy to every valid triangulation of the image pixels. Its minimization induces a triangulation that reflects image gradients. We then exploit this triangulation to precisely locate discontinuities, which can then simply be vectorized or zoomed. This new approach works on arbitrary images without any learning phase. It is particularly appealing for processing images with low quantization, like pixel art, and can be used for depixelizing such images. The method can be evaluated with an online demonstrator, where users can reproduce the results presented here or upload their own images.
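Schematically (notation mine), the energy attached to a valid triangulation $T$ of the pixel centers is the total variation of the piecewise-linear interpolation of the pixel values:

```latex
E(T) \;=\; \sum_{\tau \in T} \int_{\tau} \big|\nabla u_{\tau}\big| \, dx ,
```

where $u_{\tau}$ is the affine interpolation of the pixel values at the vertices of triangle $\tau$. Minimizing $E$ over valid triangulations flips edges so that triangle boundaries align with image discontinuities, which is what makes the subsequent vectorization or zooming precise.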
Stable Geometry Processing and High Performance Calculus on Heterogeneous Geometrical Data
(ANR project ANR-22-CE46)
Feb. 2023 - Jan. 2028
Team Leader LAMA
Year 2023-2024 (in progress)
Most of my lectures are accessible on LAMA’s wiki.
Research at the Laboratory of Mathematics (LAMA). Teaching computer science at the Department of Computer Science, UFR Scem.
Scientific responsibilities include:
Administrative responsibilities include: