Finite ring: In mathematics, more specifically abstract algebra, a finite ring is a ring that has a finite number of elements. Every finite field is an example of a finite ring, and the additive part of every finite ring is an example of an abelian finite group, but the concept of finite rings in their own right has a more recent history. Although rings have more structure than groups, the theory of finite rings is simpler than that of finite groups.
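The most familiar finite ring is the integers modulo n. A minimal sketch making the definition concrete (the tables printed here are just an illustration, not part of any standard library):

```python
# Z/4Z: a finite ring with 4 elements. Element arithmetic is
# ordinary integer arithmetic reduced mod n.
n = 4
elements = range(n)

add_table = [[(a + b) % n for b in elements] for a in elements]
mul_table = [[(a * b) % n for b in elements] for a in elements]

print("addition mod 4:")
for row in add_table:
    print(row)
print("multiplication mod 4:")
for row in mul_table:
    print(row)

# The additive part (add_table) is the cyclic group of order 4,
# a finite abelian group, as the text notes.
```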
Symmetric polynomial: In mathematics, a symmetric polynomial is a polynomial $P(X_1, X_2, \dots, X_n)$ in $n$ variables, such that if any of the variables are interchanged, one obtains the same polynomial. Formally, $P$ is a symmetric polynomial if for any permutation $\sigma$ of the subscripts $1, 2, \dots, n$ one has $P(X_{\sigma(1)}, X_{\sigma(2)}, \dots, X_{\sigma(n)}) = P(X_1, X_2, \dots, X_n)$. Symmetric polynomials arise naturally in the study of the relation between the roots of a polynomial in one variable and its coefficients, since the coefficients can be given by polynomial expressions in the roots, and all roots play a similar role in this setting.
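One way to make the definition concrete is to check a candidate polynomial against every permutation of its arguments on sample points. The sketch below (plain Python; the helper name is illustrative, and sample-point testing gives evidence of symmetry rather than a proof) accepts $X_1^2 + X_2^2 + X_3^2$ and rejects $X_1 - X_2$:

```python
from itertools import permutations

def is_symmetric(p, n, samples):
    """Numerically test whether p(x_1, ..., x_n) is unchanged by every
    permutation of its arguments, on the given sample points."""
    return all(
        p(*args) == p(*(args[i] for i in perm))
        for args in samples
        for perm in permutations(range(n))
    )

samples = [(1, 2, 3), (2, 5, 7), (0, 1, 4)]
print(is_symmetric(lambda x, y, z: x**2 + y**2 + z**2, 3, samples))  # True
print(is_symmetric(lambda x, y, z: x - y + 0 * z, 3, samples))       # False
```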
Monic polynomial: In algebra, a monic polynomial is a non-zero univariate polynomial (that is, a polynomial in a single variable) in which the leading coefficient (the nonzero coefficient of highest degree) is equal to 1. That is to say, a monic polynomial is one that can be written as $x^n + c_{n-1}x^{n-1} + \cdots + c_1x + c_0$, with $n \geq 0$. Monic polynomials are widely used in algebra and number theory, since they produce many simplifications and they avoid divisions and denominators. Every non-zero polynomial is associated to a unique monic polynomial, obtained by dividing it by its leading coefficient.
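The associated monic polynomial is computed by dividing every coefficient by the leading one. A minimal sketch, with coefficients stored highest degree first (the function name is illustrative):

```python
from fractions import Fraction

def to_monic(coeffs):
    """Return the monic polynomial associated to a non-zero polynomial.
    Coefficients are listed from highest degree to lowest."""
    if not coeffs or coeffs[0] == 0:
        raise ValueError("leading coefficient must be nonzero")
    lead = Fraction(coeffs[0])
    return [Fraction(c) / lead for c in coeffs]

# 3x^2 - 6x + 9 is associated to the monic x^2 - 2x + 3
print(to_monic([3, -6, 9]))  # [Fraction(1, 1), Fraction(-2, 1), Fraction(3, 1)]
```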
Polynomial long division: In algebra, polynomial long division is an algorithm for dividing a polynomial by another polynomial of the same or lower degree, a generalized version of the familiar arithmetic technique called long division. It can be done easily by hand, because it separates an otherwise complex division problem into smaller ones. Sometimes using a shorthand version called synthetic division is faster, with less writing and fewer calculations. Another abbreviated method is polynomial short division (Blomqvist's method).
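A sketch of the long-division loop itself, under the usual convention that coefficients are listed from highest degree to lowest (function and variable names are illustrative):

```python
def poly_divmod(num, den):
    """Divide polynomial num by den (coefficients highest degree first),
    returning (quotient, remainder) with num = quotient*den + remainder."""
    num = list(num)
    quotient = []
    # Repeatedly eliminate the leading term of the running remainder.
    while len(num) >= len(den):
        factor = num[0] / den[0]          # next quotient coefficient
        quotient.append(factor)
        for i, d in enumerate(den):       # subtract factor * den, aligned
            num[i] -= factor * d
        num.pop(0)                        # leading term is now zero
    return quotient, num                  # num is the remainder

# (x^3 - 2x^2 - 4) / (x - 3): quotient x^2 + x + 3, remainder 5
print(poly_divmod([1, -2, 0, -4], [1, -3]))
```

Each pass through the loop shortens the running remainder by one term, which is why the procedure terminates and why, by hand, the problem splits into a sequence of small single-term divisions.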
Algorithm: In mathematics and computer science, an algorithm (/ˈælɡərɪðəm/) is a finite sequence of rigorous instructions, typically used to solve a class of specific problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms can use conditionals to divert the code execution through various routes (referred to as automated decision-making) and deduce valid inferences (referred to as automated reasoning), eventually achieving automation.
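Euclid's greatest-common-divisor procedure is the textbook instance of such a finite sequence of rigorous instructions, including a conditional that diverts execution (here, the loop test):

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b).
    Terminates because the remainder strictly decreases."""
    while b != 0:          # conditional controlling the flow
        a, b = b, a % b
    return a

print(gcd(252, 198))  # 18
```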
Polynomial-time reduction: In computational complexity theory, a polynomial-time reduction is a method for solving one problem using another. One shows that if a hypothetical subroutine solving the second problem exists, then the first problem can be solved by transforming or reducing it to inputs for the second problem and calling the subroutine one or more times. If both the time required to transform the first problem to the second and the number of times the subroutine is called are polynomial, then the first problem is polynomial-time reducible to the second.
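A classic concrete case is reducing CLIQUE to INDEPENDENT SET: complement the graph in polynomial time, call the independent-set subroutine once, and return its answer. A sketch assuming a hypothetical independent-set oracle (stubbed here with brute force so the example is self-contained):

```python
from itertools import combinations

def has_independent_set(vertices, edges, k):
    """Stand-in for the hypothetical subroutine: does the graph
    contain k pairwise non-adjacent vertices? (brute force)"""
    return any(
        all((u, v) not in edges and (v, u) not in edges
            for u, v in combinations(subset, 2))
        for subset in combinations(vertices, k)
    )

def has_clique(vertices, edges, k):
    """Polynomial-time reduction: a k-clique in G is exactly a
    k-element independent set in the complement of G."""
    complement = {(u, v) for u in vertices for v in vertices
                  if u < v and (u, v) not in edges and (v, u) not in edges}
    return has_independent_set(vertices, complement, k)

V = [1, 2, 3, 4]
E = {(1, 2), (1, 3), (2, 3), (3, 4)}
print(has_clique(V, E, 3))  # True: {1, 2, 3} is a triangle
```

The transformation (building the complement) takes quadratic time and the subroutine is called once, so the reduction is polynomial regardless of how the subroutine itself is implemented.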
Algebraic equation: In mathematics, an algebraic equation or polynomial equation is an equation of the form $P = 0$, where $P$ is a polynomial with coefficients in some field, often the field of the rational numbers. For many authors, the term algebraic equation refers only to univariate equations, that is, polynomial equations that involve only one variable. On the other hand, a polynomial equation may involve several variables. In the case of several variables (the multivariate case), the term polynomial equation is usually preferred to algebraic equation.
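For a concrete univariate case, the roots of the equation $x^5 - x - 1 = 0$ can be approximated numerically even when no closed form is available; a sketch using NumPy's polynomial root finder (the particular quintic is just an illustration):

```python
import numpy as np

# Coefficients of x^5 - x - 1, highest degree first.
coeffs = [1, 0, 0, 0, -1, -1]
roots = np.roots(coeffs)
print(roots)

# Each root satisfies the equation up to floating-point error.
print(np.allclose(np.polyval(coeffs, roots), 0))  # True
```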
Time complexity: In computer science, the time complexity is the computational complexity that describes the amount of computer time it takes to run an algorithm. Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, supposing that each elementary operation takes a fixed amount of time to perform. Thus, the amount of time taken and the number of elementary operations performed by the algorithm are taken to be related by a constant factor.
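To see operation counting in action, instrument a linear search: the number of comparisons grows in direct proportion to the input size, so its worst-case time complexity is linear. A minimal sketch (the counter is purely illustrative):

```python
def linear_search(items, target):
    """Return (index, comparisons). The comparison count is the
    'number of elementary operations' referred to above."""
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

for n in (10, 100, 1000):
    _, ops = linear_search(list(range(n)), -1)  # worst case: target absent
    print(n, ops)  # ops == n: the count scales linearly with input size
```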
Lattice (order): A lattice is an abstract structure studied in the mathematical subdisciplines of order theory and abstract algebra. It consists of a partially ordered set in which every pair of elements has a unique supremum (also called a least upper bound or join) and a unique infimum (also called a greatest lower bound or meet). An example is given by the power set of a set, partially ordered by inclusion, for which the supremum is the union and the infimum is the intersection.
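The power-set example can be computed with directly: Python's frozenset gives the partial order (inclusion via subset comparison), the join (union), and the meet (intersection):

```python
from itertools import combinations

base = {1, 2, 3}
power_set = [frozenset(c) for r in range(len(base) + 1)
             for c in combinations(base, r)]
print(len(power_set))  # 8 subsets of a 3-element set

a, b = frozenset({1, 2}), frozenset({2, 3})
join = a | b   # supremum: the union, {1, 2, 3}
meet = a & b   # infimum: the intersection, {2}
print(join, meet)

# Join and meet are least upper / greatest lower bounds under inclusion:
print(a <= join and b <= join)  # both elements lie below the join
print(meet <= a and meet <= b)  # the meet lies below both elements
```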
Aurifeuillean factorization: In number theory, an aurifeuillean factorization, named after Léon-François-Antoine Aurifeuille, is a factorization of certain integer values of the cyclotomic polynomials. Because cyclotomic polynomials are irreducible polynomials over the integers, such a factorization cannot come from an algebraic factorization of the polynomial. Nevertheless, certain families of integers coming from cyclotomic polynomials have factorizations given by formulas applying to the whole family, as in the examples below.
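One well-known family is $2^{4k+2} + 1 = (2^{2k+1} - 2^{k+1} + 1)(2^{2k+1} + 2^{k+1} + 1)$, a single formula factoring the value $\Phi_4(2^{2k+1}) = 2^{4k+2} + 1$ for every $k$; a quick numeric check of the identity:

```python
def aurifeuillean_factors(k):
    """Factors of 2**(4*k + 2) + 1 given by the classical identity
    2^(4k+2) + 1 = (2^(2k+1) - 2^(k+1) + 1)(2^(2k+1) + 2^(k+1) + 1)."""
    a, b = 2 ** (2 * k + 1), 2 ** (k + 1)
    return a - b + 1, a + b + 1

for k in range(1, 5):
    f, g = aurifeuillean_factors(k)
    n = 2 ** (4 * k + 2) + 1
    assert f * g == n   # the identity holds for the whole family
    print(f"2^{4 * k + 2} + 1 = {f} * {g}")
```

For k = 1 this gives 65 = 5 * 13, a factorization that the irreducibility of $\Phi_4(x) = x^2 + 1$ over the integers cannot supply algebraically.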