Mutual information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
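For a discrete joint distribution, the definition above can be computed directly: I(X;Y) is the expected log-ratio of the joint probability to the product of the marginals. The following is a minimal sketch; the function name and the nested-list layout of the joint table are illustrative choices, not a standard API.

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits (shannons), given a joint
    probability table as a nested list: joint[x][y] = P(X=x, Y=y)."""
    px = [sum(row) for row in joint]        # marginal P(X=x)
    py = [sum(col) for col in zip(*joint)]  # marginal P(Y=y)
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:  # 0 * log 0 is taken as 0 by convention
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# Perfectly correlated fair bits: observing Y determines X exactly,
# so I(X;Y) equals the entropy H(X) = 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0

# Independent fair bits: observing Y reveals nothing about X.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

The two cases bracket the range of dependence: MI is zero exactly when the variables are independent, and for perfectly dependent variables it equals the entropy of either one.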
Mutualism (biology)
Mutualism describes the ecological interaction between two or more species where each species has a net benefit. Mutualism is a common type of ecological interaction. Prominent examples include most vascular plants engaged in mutualistic interactions with mycorrhizae, flowering plants being pollinated by animals, vascular plants being dispersed by animals, and corals with zooxanthellae, among many others. Mutualism can be contrasted with interspecific competition, in which each species experiences reduced fitness, and exploitation, or parasitism, in which one species benefits at the expense of the other.
FtsZ
FtsZ is a protein encoded by the ftsZ gene that assembles into a ring at the future site of bacterial cell division (also called the Z ring). FtsZ is a prokaryotic homologue of the eukaryotic protein tubulin. The name FtsZ stands for "Filamenting temperature-sensitive mutant Z," reflecting the hypothesis that cell-division mutants of E. coli would grow as filaments because the daughter cells were unable to separate from one another. FtsZ is found in almost all bacteria, many archaea, all chloroplasts and some mitochondria, where it is essential for cell division.