Mutual information

In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of the entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
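As an illustration of the definition above (the symbols here are introduced for illustration, not taken from the text): for two discrete random variables X and Y with joint probability mass function p(x, y) and marginal distributions p(x) and p(y), the mutual information is commonly written as

\[
I(X;Y) \;=\; \sum_{y \in \mathcal{Y}} \sum_{x \in \mathcal{X}} p(x,y)\, \log \frac{p(x,y)}{p(x)\, p(y)}
\]

The base of the logarithm determines the unit: base 2 gives shannons (bits), base e gives nats, and base 10 gives hartleys.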
Information theory

Information theory is the mathematical study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s. A branch of applied mathematics, it lies at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy.
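As a brief illustration of that key measure (symbols introduced here for illustration): the Shannon entropy of a discrete random variable X with probability mass function p(x) is commonly defined as

\[
H(X) \;=\; -\sum_{x \in \mathcal{X}} p(x)\, \log p(x)
\]

measured in bits when the logarithm is taken to base 2; it quantifies the expected amount of information, or uncertainty, associated with X.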
Narrative inquiry

Narrative inquiry or narrative analysis emerged as a discipline from within the broader field of qualitative research in the early 20th century, as evidence exists that this method was used in psychology and sociology. Narrative inquiry uses field texts, such as stories, autobiography, journals, field notes, letters, conversations, interviews, family stories, photos (and other artifacts), and life experience, as the units of analysis to research and understand the way people create meaning in their lives as narratives.