
Publication

# Short homology bases for hyperelliptic hyperbolic surfaces

2023

Journal paper

Abstract

Given a hyperelliptic hyperbolic surface S of genus g ≥ 2, we find bounds on the lengths of homologically independent loops on S. As a consequence, we show that for any λ ∈ (0, 1) there exists a constant N(λ) such that every such surface has at least ⌊λ · 2g/3⌋ homologically independent loops of length at most N(λ), extending the result in [Mu] and [BPS]. This allows us to extend the constant upper bound obtained in [Mu] on the minimal length of non-zero period lattice vectors of hyperelliptic Riemann surfaces to almost 2g/3 linearly independent vectors.

Related concepts (33)

Related publications (43)

Related MOOCs (15)

Linear independence

In the theory of vector spaces, a set of vectors is said to be linearly independent if there exists no nontrivial linear combination of the vectors that equals the zero vector. If such a linear combination exists, then the vectors are said to be linearly dependent. These concepts are central to the definition of dimension: a vector space can be of finite or infinite dimension depending on the maximum number of linearly independent vectors it contains. The definition of linear dependence, and the ability to determine whether a subset of vectors in a vector space is linearly dependent, are therefore central to determining the dimension of a vector space.
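As a small illustration (not part of the original page), the definition above can be checked numerically with numpy: stacking vectors as the rows of a matrix, the set is linearly independent exactly when the matrix rank equals the number of vectors.

```python
import numpy as np

# Two vectors in R^3, neither a scalar multiple of the other.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])

# Rank of the stacked matrix = number of linearly independent vectors.
independent_rank = np.linalg.matrix_rank(np.vstack([v1, v2]))  # 2, so {v1, v2} is independent

# v3 = v1 + v2 gives the nontrivial combination v1 + v2 - v3 = 0,
# so {v1, v2, v3} is linearly dependent: the rank stays at 2 < 3.
v3 = v1 + v2
dependent_rank = np.linalg.matrix_rank(np.vstack([v1, v2, v3]))
```

In floating-point arithmetic the rank test is the standard practical criterion, since `matrix_rank` tolerates small numerical noise via a singular-value threshold.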

Linear span

In mathematics, the linear span (also called the linear hull or just span) of a set S of vectors (from a vector space), denoted span(S), is defined as the set of all linear combinations of the vectors in S. For example, two linearly independent vectors span a plane. The linear span can be characterized either as the intersection of all linear subspaces that contain S, or as the smallest subspace containing S. The linear span of a set of vectors is therefore a vector space itself. Spans can be generalized to matroids and modules.
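A hedged numerical sketch of the span characterization (assuming numpy; the vectors are illustrative, not from the page): a vector v lies in span(S) if and only if appending v to S does not increase the rank.

```python
import numpy as np

# S = {s1, s2}: two linearly independent vectors spanning a plane in R^3.
s1 = np.array([1.0, 0.0, 0.0])
s2 = np.array([0.0, 1.0, 0.0])
S = np.vstack([s1, s2])

def in_span(v, S):
    """v is in span(S) iff appending v leaves the rank unchanged."""
    return np.linalg.matrix_rank(np.vstack([S, v])) == np.linalg.matrix_rank(S)

inside = in_span(np.array([3.0, -2.0, 0.0]), S)   # 3*s1 - 2*s2, a linear combination
outside = in_span(np.array([0.0, 0.0, 1.0]), S)   # leaves the plane spanned by S
```

This matches the definition directly: membership in the span is exactly expressibility as a linear combination of the vectors in S.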

Basis (linear algebra)

In mathematics, a set B of vectors in a vector space V is called a basis (plural: bases) if every element of V may be written in a unique way as a finite linear combination of elements of B. The coefficients of this linear combination are referred to as components or coordinates of the vector with respect to B. The elements of a basis are called basis vectors. Equivalently, a set B is a basis if its elements are linearly independent and every element of V is a linear combination of elements of B.
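The uniqueness of coordinates with respect to a basis can be illustrated with a small numpy sketch (the basis below is a made-up example): writing the basis vectors as the columns of an invertible matrix B, the coordinates of v are the unique solution of B c = v.

```python
import numpy as np

# A non-standard basis of R^2: columns b1 = (1, 0) and b2 = (1, 1)
# are linearly independent, so B is invertible.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

v = np.array([3.0, 2.0])

# Unique coordinates c of v with respect to the basis: B @ c = v.
coords = np.linalg.solve(B, v)

# Reconstructing v from its coordinates confirms the representation.
reconstructed = B @ coords
```

Here v = 1·b1 + 2·b2, so its coordinate vector with respect to B is (1, 2), distinct from its standard coordinates (3, 2).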

Algebra (part 1)

A French-language MOOC on linear algebra, open to everyone, taught rigorously and requiring no prerequisites.

Algebra (part 2)

A French-language MOOC on linear algebra, open to everyone, taught rigorously and requiring no prerequisites.

Jan Sickmann Hesthaven, Junming Duan

Reduced-order models are indispensable for multi-query or real-time problems. However, there are still many challenges to constructing efficient ROMs for time-dependent parametrized problems. Using a linear reduced space is inefficient for time-dependent n ...

We generalize the class vectors found in neural networks to linear subspaces (i.e., points in the Grassmann manifold) and show that the Grassmann Class Representation (GCR) enables simultaneous improvement in accuracy and feature transferability. In GCR, e ...

Fabio Nobile, Yoshihito Kazashi, Fabio Zoccolan

In this paper, we set the mathematical foundations of the Dynamical Low Rank Approximation (DLRA) method for high-dimensional stochastic differential equations. DLRA aims at approximating the solution as a linear combination of a small number of basis vect ...

2023