Animal communication
Animal communication is the transfer of information from one animal or a group of animals (sender or senders) to one or more other animals (receiver or receivers) that affects the current or future behavior of the receivers. Information may be sent intentionally, as in a courtship display, or unintentionally, as in the transfer of scent from predator to prey with kairomones. Information may be transferred to an "audience" of several receivers. Animal communication is a rapidly growing area of study in disciplines including animal behavior, sociology, neurology and animal cognition.
Mutual information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
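As a concrete illustration (a sketch not drawn from the article itself, with an assumed toy distribution), the following Python snippet computes the mutual information I(X; Y) = Σ p(x, y) log2 [p(x, y) / (p(x) p(y))] in bits for a small discrete joint distribution.

# Sketch: mutual information in bits from a discrete joint distribution p(x, y).
import math

def mutual_information(joint):
    # joint: dict mapping (x, y) pairs to probabilities that sum to 1
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p   # marginal p(x)
        py[y] = py.get(y, 0.0) + p   # marginal p(y)
    mi = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            mi += p * math.log2(p / (px[x] * py[y]))
    return mi

# Two perfectly correlated fair bits share exactly 1 bit of information.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))   # -> 1.0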
Quantities of information
The mathematical theory of information is based on probability theory and statistics, and measures information with several quantities of information. The choice of logarithmic base in the following formulae determines the unit of information entropy that is used. The most common unit of information is the bit, or more correctly the shannon, based on the binary logarithm.
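To make the role of the logarithmic base concrete, here is a small Python sketch (the probability value is an assumed example, not taken from the text) expressing the same self-information in shannons (bits), nats and hartleys by changing the base.

# Sketch: one event probability, three units, depending on the logarithm base.
import math

p = 1 / 8                      # assumed probability of the outcome
info_bits = -math.log2(p)      # base 2  -> shannons (bits)
info_nats = -math.log(p)       # base e  -> nats
info_hart = -math.log10(p)     # base 10 -> hartleys (bans)

print(info_bits, info_nats, info_hart)   # 3.0, ~2.079, ~0.903
# The quantities differ only by a constant factor, e.g. 1 bit = ln(2) nats.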
Entropy (information theory)
In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Given a discrete random variable X, which takes values in the alphabet 𝒳 and is distributed according to p : 𝒳 → [0, 1], the entropy is H(X) = −Σ_{x ∈ 𝒳} p(x) log p(x), where Σ denotes the sum over the variable's possible values. The choice of base for log, the logarithm, varies for different applications. Base 2 gives the unit of bits (or "shannons"), while base e gives "natural units" nat, and base 10 gives units of "dits", "bans", or "hartleys".
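A minimal Python sketch of this formula (the example distributions are assumptions chosen for illustration, not from the article), computing H(X) in bits with base-2 logarithms:

# Sketch: Shannon entropy H(X) = -sum p(x) log2 p(x), in bits.
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))      # fair coin -> 1.0 bit
print(entropy([0.25] * 4))      # uniform over 4 outcomes -> 2.0 bits
print(entropy([0.9, 0.1]))      # biased coin -> ~0.469 bits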
Augmentative and alternative communication
Augmentative and alternative communication (AAC) encompasses the communication methods used to supplement or replace speech or writing for those with impairments in the production or comprehension of spoken or written language. AAC is used by those with a wide range of speech and language impairments, including congenital impairments such as cerebral palsy, intellectual impairment and autism, and acquired conditions such as amyotrophic lateral sclerosis and Parkinson's disease.
Interpersonal communication
Interpersonal communication is an exchange of information between two or more people. It is also an area of research that seeks to understand how humans use verbal and nonverbal cues to accomplish a number of personal and relational goals. Interpersonal communication research addresses at least six categories of inquiry: 1) how humans adjust and adapt their verbal communication and nonverbal communication during face-to-face communication; 2) how messages are produced; 3) how uncertainty influences behavior and information-management strategies; 4) deceptive communication; 5) relational dialectics; and 6) social interactions that are mediated by technology.
Symmetric polynomial
In mathematics, a symmetric polynomial is a polynomial P(X1, X2, ..., Xn) in n variables, such that if any of the variables are interchanged, one obtains the same polynomial. Formally, P is a symmetric polynomial if for any permutation σ of the subscripts 1, 2, ..., n one has P(Xσ(1), Xσ(2), ..., Xσ(n)) = P(X1, X2, ..., Xn). Symmetric polynomials arise naturally in the study of the relation between the roots of a polynomial in one variable and its coefficients, since the coefficients can be given by polynomial expressions in the roots, and all roots play a similar role in this setting.
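The defining condition can be checked mechanically. The sketch below (a hypothetical example polynomial, assuming SymPy is available) substitutes every permutation of the variables into P = x*y*z + x + y + z and confirms the result is unchanged.

# Sketch: verify symmetry of P under all permutations of its variables.
from itertools import permutations
import sympy as sp

x, y, z = sp.symbols('x y z')
P = x*y*z + x + y + z

is_symmetric = all(
    sp.expand(P.subs(dict(zip((x, y, z), perm)), simultaneous=True) - P) == 0
    for perm in permutations((x, y, z))
)
print(is_symmetric)   # True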
Trigonometric functionsIn mathematics, the trigonometric functions (also called circular functions, angle functions or goniometric functions) are real functions which relate an angle of a right-angled triangle to ratios of two side lengths. They are widely used in all sciences that are related to geometry, such as navigation, solid mechanics, celestial mechanics, geodesy, and many others. They are among the simplest periodic functions, and as such are also widely used for studying periodic phenomena through Fourier analysis.
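As a brief illustration (the 3-4-5 triangle is an assumed example, not from the text), the Python sketch below checks the side-ratio definitions of sine and cosine and their periodicity with period 2π.

# Sketch: sine and cosine as side-length ratios in a right triangle.
import math

opposite, adjacent = 3.0, 4.0
hypotenuse = math.hypot(opposite, adjacent)   # 5.0
angle = math.atan2(opposite, adjacent)        # angle opposite the side of length 3

print(math.isclose(math.sin(angle), opposite / hypotenuse))            # True
print(math.isclose(math.cos(angle), adjacent / hypotenuse))            # True
print(math.isclose(math.sin(angle), math.sin(angle + 2 * math.pi)))    # True (periodicity)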
Jacobi elliptic functions
In mathematics, the Jacobi elliptic functions are a set of basic elliptic functions. They are found in the description of the motion of a pendulum (see also pendulum (mathematics)), as well as in the design of electronic elliptic filters. While trigonometric functions are defined with reference to a circle, the Jacobi elliptic functions are a generalization which refer to other conic sections, the ellipse in particular. The relation to trigonometric functions is contained in the notation, for example, by the matching notation sn for sin.
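A short numerical sketch of this correspondence (it assumes SciPy's special-function module is available; the text itself does not mention any library): at parameter m = 0 the Jacobi functions reduce to the circular functions, sn(u, 0) = sin(u) and cn(u, 0) = cos(u), while sn² + cn² = 1 holds for every m.

# Sketch: Jacobi elliptic functions via scipy.special.ellipj.
import math
from scipy.special import ellipj

u = 0.7
sn, cn, dn, ph = ellipj(u, 0.0)        # m = 0: the circular limit
print(math.isclose(sn, math.sin(u)))   # True
print(math.isclose(cn, math.cos(u)))   # True

sn, cn, dn, ph = ellipj(u, 0.5)        # 0 < m < 1: a genuinely elliptic case
print(sn**2 + cn**2)                   # ~1.0, since sn^2 + cn^2 = 1 for all m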
Complete homogeneous symmetric polynomial
In mathematics, specifically in algebraic combinatorics and commutative algebra, the complete homogeneous symmetric polynomials are a specific kind of symmetric polynomials. Every symmetric polynomial can be expressed as a polynomial expression in complete homogeneous symmetric polynomials. The complete homogeneous symmetric polynomial of degree k in n variables X1, ..., Xn, written hk for k = 0, 1, 2, ..., is the sum of all monomials of total degree k in the variables.
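A small sketch of this definition (the variable names and the use of SymPy are assumptions made for illustration): hk is built as the sum over all multisets of k variables, i.e. all monomials of total degree k, for example h2(X1, X2) = X1² + X1X2 + X2².

# Sketch: complete homogeneous symmetric polynomial h_k as a sum of all
# degree-k monomials, using combinations with repetition.
from itertools import combinations_with_replacement
import sympy as sp

def complete_homogeneous(k, variables):
    # one monomial per multiset of size k drawn from the variables
    return sp.Add(*(sp.Mul(*combo)
                    for combo in combinations_with_replacement(variables, k)))

x1, x2, x3 = sp.symbols('x1 x2 x3')
print(complete_homogeneous(2, (x1, x2)))       # x1**2 + x1*x2 + x2**2
print(complete_homogeneous(2, (x1, x2, x3)))   # x1**2 + x1*x2 + x1*x3 + x2**2 + x2*x3 + x3**2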