Median graph
In graph theory, a division of mathematics, a median graph is an undirected graph in which every three vertices a, b, and c have a unique median: a vertex m(a,b,c) that belongs to shortest paths between each pair of a, b, and c. The concept of median graphs has long been studied, for instance by Birkhoff and Kiss (1947) or (more explicitly) by Avann (1961), but the first paper to call them "median graphs" appears to be Nebeský (1971). As Chung, Graham, and Saks write, "median graphs arise naturally in the study of ordered sets and discrete distributive lattices, and have an extensive literature".
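Hypercube graphs are a standard family of median graphs, and there the median has a concrete form: the bitwise majority of the three vertices. A minimal Python sketch (helper names are mine, purely illustrative):

```python
def hypercube_median(a: str, b: str, c: str) -> str:
    # In a hypercube, vertices are bitstrings and the median of three
    # vertices is the bitwise majority: keep, in each coordinate, the
    # bit shared by at least two of the three.
    return "".join(
        "1" if (x + y + z).count("1") >= 2 else "0"
        for x, y, z in zip(a, b, c)
    )

def hamming(u: str, v: str) -> int:
    # Shortest-path distance in the hypercube is Hamming distance.
    return sum(x != y for x, y in zip(u, v))

a, b, c = "0001", "0111", "1101"
m = hypercube_median(a, b, c)  # "0101"
# The median lies on a shortest path between each pair:
for u, v in [(a, b), (a, c), (b, c)]:
    assert hamming(u, m) + hamming(m, v) == hamming(u, v)
print(m)
```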
Large language model
A large language model (LLM) is a language model characterized by its large size. That size is enabled by AI accelerators, which can process vast amounts of text data, mostly scraped from the Internet. The resulting artificial neural networks can contain from tens of millions up to billions of weights and are (pre-)trained using self-supervised learning and semi-supervised learning. The transformer architecture contributed to faster training.
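Self-supervised pretraining needs no hand-labeled data: the text itself supplies the targets, each one the next token of the input. A toy Python sketch (whitespace "tokenizer" assumed for brevity):

```python
text = "to be or not to be"
tokens = text.split()  # toy whitespace tokenizer, illustrative only

# Self-supervised (input, target) pairs: the target of each token is
# simply the token that follows it, so raw text labels itself.
pairs = list(zip(tokens[:-1], tokens[1:]))
print(pairs)
# [('to', 'be'), ('be', 'or'), ('or', 'not'), ('not', 'to'), ('to', 'be')]
```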
Deep learning
Deep learning is part of a broader family of machine learning methods and is based on artificial neural networks with representation learning. The adjective "deep" in deep learning refers to the use of multiple layers in the network. Methods used can be supervised, semi-supervised, or unsupervised.
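What "multiple layers" means in practice: each layer applies an affine map followed by a nonlinearity, and stacking them composes successively richer representations. A minimal NumPy sketch with random (untrained) weights, purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, w, b):
    # One layer: affine transform followed by a ReLU nonlinearity.
    return np.maximum(0.0, x @ w + b)

x = rng.normal(size=(4, 8))                      # batch of 4 inputs, 8 features
w1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
w2, b2 = rng.normal(size=(16, 3)), np.zeros(3)

hidden = layer(x, w1, b1)   # first layer: learned representation
output = hidden @ w2 + b2   # second layer maps it to 3 outputs
print(output.shape)         # (4, 3)
```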
Types of artificial neural networks
There are many types of artificial neural networks (ANN). Artificial neural networks are computational models inspired by biological neural networks, and are used to approximate functions that are generally unknown. In particular, they are inspired by the behaviour of neurons and the electrical signals they convey between input (such as from the eyes or nerve endings in the hand), processing, and output from the brain (such as reacting to light, touch, or heat). The way neurons semantically communicate is an area of ongoing research.
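The usual computational abstraction of a neuron (a sketch of the standard model, not of any particular biological detail) is a weighted sum of inputs plus a bias, passed through an activation function:

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of inputs plus a bias, squashed by a sigmoid
    # activation: the standard abstraction of a firing neuron.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

print(neuron([0.5, -1.0, 2.0], [0.8, 0.2, 0.1], bias=-0.3))
```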
Machine learning
Machine learning (ML) is an umbrella term for solving problems for which developing algorithms by hand would be cost-prohibitive; instead, machines 'discover' their 'own' algorithms from data, without being explicitly told what to do by any human-developed algorithm. Recently, generative artificial neural networks have been able to surpass the results of many previous approaches.
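A minimal contrast with explicit programming (my example, not from the text above): rather than hand-coding the rule y = 2x + 1, the program estimates it from examples.

```python
import numpy as np

# Data generated by a rule the program is never told: y = 2x + 1 plus noise.
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(scale=0.5, size=x.size)

# The "learning" step: fit the parameters from the data instead of
# writing the rule by hand.
slope, intercept = np.polyfit(x, y, deg=1)
print(round(slope, 2), round(intercept, 2))  # close to 2 and 1
```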
Artificial intelligence
Artificial intelligence (AI) is the intelligence of machines or software, as opposed to the intelligence of human beings or animals. AI applications include advanced web search engines (e.g., Google Search), recommendation systems (used by YouTube, Amazon, and Netflix), understanding human speech (such as Siri and Alexa), self-driving cars (e.g., Waymo), generative or creative tools (ChatGPT and AI art), and competing at the highest level in strategic games (such as chess and Go).
Small-world network
A small-world network is a mathematical graph in which most nodes are not neighbors of one another, but the neighbors of any given node are likely to be neighbors of each other. As a result, most nodes can be reached from every other node by a small number of hops or steps. Specifically, a small-world network is defined to be a network where the typical distance L between two randomly chosen nodes (the number of steps required) grows proportionally to the logarithm of the number of nodes N in the network, that is, L ∝ log N, while the global clustering coefficient is not small.
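The Watts-Strogatz model is the classic way to exhibit both properties at once. A sketch using networkx (the parameters n, k, p are assumptions chosen for illustration):

```python
import math
import networkx as nx

# Ring lattice of n nodes, each joined to k neighbors, with a fraction
# p of edges randomly rewired (Watts-Strogatz model).
n = 1000
G = nx.watts_strogatz_graph(n=n, k=10, p=0.1, seed=42)

L = nx.average_shortest_path_length(G)  # typical distance between nodes
C = nx.average_clustering(G)            # global tendency to form triangles

print(f"L = {L:.2f}  (log N = {math.log(n):.2f})  C = {C:.2f}")
# Small-world signature: L stays on the order of log N while C
# remains far above the ~k/n level of a comparable random graph.
```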
Universal approximation theorem
In the mathematical theory of artificial neural networks, universal approximation theorems are results that characterize what neural networks can theoretically learn: they establish the density of an algorithmically generated class of functions within a given function space of interest. Typically, these results concern the approximation capabilities of the feedforward architecture on the space of continuous functions between two Euclidean spaces, and the approximation is with respect to the compact convergence topology.
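A hand-built illustration of the density claim (a sketch under simple assumptions, not the theorem's proof): a single hidden layer of steep sigmoid units traces a staircase that follows a continuous target, here f(x) = sin x on [0, 2π].

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

f = np.sin                      # continuous target function
a, b = 0.0, 2 * np.pi
n, k = 200, 200.0               # number of units and steepness (assumed)
grid = np.linspace(a, b, n)

def one_hidden_layer(x):
    # Unit i contributes a near-step of height f(grid[i]) - f(grid[i-1])
    # located at grid[i]; the heights telescope, so the sum of steps
    # follows f ever more closely as n and k grow.
    out = np.full_like(x, f(a))
    for i in range(1, n):
        out += (f(grid[i]) - f(grid[i - 1])) * sigmoid(k * (x - grid[i]))
    return out

xs = np.linspace(a, b, 1000)
err = np.max(np.abs(one_hidden_layer(xs) - f(xs)))
print(f"max error: {err:.3f}")  # shrinks as n and k increase
```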
Locally compact space
In topology and related branches of mathematics, a topological space is called locally compact if, roughly speaking, each small portion of the space looks like a small portion of a compact space. More precisely, it is a topological space in which every point has a compact neighborhood. In mathematical analysis, locally compact spaces that are Hausdorff are of particular interest; they are abbreviated as LCH spaces. Let X be a topological space. Most commonly, X is called locally compact if every point x of X has a compact neighbourhood, i.e., there exist an open set U and a compact set K such that x ∈ U ⊆ K.
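Two classical examples, not from the excerpt above, make the definition concrete: Euclidean space is locally compact, the rationals are not.

```latex
% Euclidean space is LCH: by Heine--Borel, every point sits inside an
% open ball whose closure is compact.
\[
  x \in \mathbb{R}^n \implies
  x \in B(x,1) \subseteq \overline{B}(x,1),
  \qquad \overline{B}(x,1) \text{ compact.}
\]
% By contrast, \mathbb{Q} is not locally compact: no neighbourhood of a
% rational point is compact, as it contains Cauchy sequences with
% irrational limits.
```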
Regular space
In topology and related fields of mathematics, a topological space X is called a regular space if every closed subset C of X and every point p not contained in C admit non-overlapping open neighborhoods; thus p and C can be separated by neighborhoods. This condition is known as Axiom T3. The term "T3 space" usually means "a regular Hausdorff space". These conditions are examples of separation axioms. Formally: given any closed set F and any point x that does not belong to F, there exist a neighbourhood U of x and a neighbourhood V of F that are disjoint.
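A classical illustration (again not from the excerpt above): every metric space is regular, because distance to a closed set separates points from it.

```latex
% In a metric space (X, d), take a closed set F and a point x \notin F.
% Then r := d(x, F) > 0, and the following open neighbourhoods are disjoint:
\[
  U = B\!\bigl(x, \tfrac{r}{2}\bigr), \qquad
  V = \bigcup_{y \in F} B\!\bigl(y, \tfrac{r}{2}\bigr),
  \qquad U \cap V = \varnothing .
\]
% Any z in both would give d(x, y) \le d(x, z) + d(z, y) < r for some
% y \in F, contradicting d(x, F) = r.
```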