Artificial consciousness
Artificial consciousness (AC), also known as machine consciousness (MC), synthetic consciousness or digital consciousness, is the consciousness hypothesized to be possible in artificial intelligence. It is also the corresponding field of study, which draws insights from philosophy of mind, philosophy of artificial intelligence, cognitive science and neuroscience. The same terminology can be used with the term "sentience" instead of "consciousness" when specifically designating phenomenal consciousness (the ability to feel qualia).
Consciousness
Consciousness, at its simplest, is awareness of internal and external existence. However, its nature has led to millennia of analyses, explanations and debates by philosophers, theologians, linguists, and scientists. Opinions differ about what exactly needs to be studied, or even counted, as consciousness. In some explanations it is synonymous with the mind; in others it is an aspect of the mind. In the past, it was one's "inner life", the world of introspection, of private thought, imagination and volition.
Western philosophy
Western philosophy encompasses the philosophical thought and work of the Western world. Historically, the term refers to the philosophical thinking of Western culture, beginning with the ancient Greek philosophy of the pre-Socratics. The word philosophy itself originates from the Ancient Greek philosophía (φιλοσοφία), literally "the love of wisdom" (φιλεῖν phileîn, "to love", and σοφία sophía, "wisdom").
Philosophical zombie
A philosophical zombie (or "p-zombie") is a being in a thought experiment in philosophy of mind that is physically identical to a normal person but does not have conscious experience. For example, if a philosophical zombie were poked with a sharp object, it would not feel any pain, but it would behave exactly the way any conscious human would. Philosophical zombie arguments are used against forms of physicalism and in defense of the "hard problem of consciousness", which is the problem of accounting in physical terms for subjective, intrinsic, first-person, what-it's-like-ness experiences.
Functionalism (philosophy of mind)
In philosophy of mind, functionalism is the thesis that each and every mental state (for example, the state of having a belief, of having a desire, or of being in pain) is constituted solely by its functional role, which means its causal relation to other mental states, sensory inputs, and behavioral outputs. Functionalism developed largely as an alternative to the identity theory of mind and behaviorism. Functionalism is a theoretical level between the physical implementation and behavioral output.
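The functionalist picture is often illustrated with a machine table: a state such as pain is identified by nothing more than what typically causes it, what behavior it produces, and which other states it leads to. Below is a minimal, hypothetical sketch of that idea in Python; the state names, inputs and outputs (pain, tissue_damage, wince, and so on) are placeholders invented for this illustration, not anything from the source.

```python
# Minimal sketch of a "machine table": each mental state is characterized only
# by its causal role, i.e. which sensory input produces which behavioral output
# and which state comes next. All names here are hypothetical placeholders.
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class FunctionalState:
    """A state defined purely by its transitions, not by what realizes it."""
    name: str
    # sensory input -> (behavioral output, name of the next state)
    transitions: dict[str, tuple[str, str]] = field(default_factory=dict)


STATES = {
    "calm": FunctionalState("calm", {"tissue_damage": ("wince", "pain")}),
    "pain": FunctionalState("pain", {"relief": ("relax", "calm"),
                                     "tissue_damage": ("cry_out", "pain")}),
}


def step(state_name: str, sensory_input: str) -> tuple[str, str]:
    """Return (behavioral output, next state) given the current causal role."""
    state = STATES[state_name]
    return state.transitions.get(sensory_input, ("no_response", state_name))


if __name__ == "__main__":
    print(step("calm", "tissue_damage"))  # ('wince', 'pain')
    print(step("pain", "relief"))         # ('relax', 'calm')
```

On this view, any system that realizes the same transition structure, whether biological or artificial, would thereby be in the same mental states, which is why functionalism is frequently invoked in arguments that artificial consciousness is possible.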
Qualia
In philosophy of mind, qualia (/ˈkwɑːliə/ or /ˈkweɪliə/; singular form: quale) are defined as instances of subjective, conscious experience. The term qualia derives from the Latin neuter plural form (qualia) of the Latin adjective quālis (/ˈkwaːlɪs/), meaning "of what sort" or "of what kind" in relation to a specific instance, such as "what it is like to taste a specific apple, this particular apple now". Examples of qualia include the perceived sensation of pain of a headache, the taste of wine, and the redness of an evening sky.
Property dualism
Property dualism describes a category of positions in the philosophy of mind which hold that, although the world is composed of just one kind of substance, the physical kind, there exist two distinct kinds of properties: physical properties and mental properties. In other words, it is the view that at least some non-physical, mental properties (such as thoughts, imagination and memories) exist in, or naturally supervene upon, certain physical substances (namely brains).
Explanatory gap
In the philosophy of mind and consciousness, the explanatory gap is the difficulty that physicalist philosophies have in explaining how physical properties give rise to the way things feel subjectively when they are experienced. The term was introduced by philosopher Joseph Levine. In the 1983 paper in which he first used it, he took as an example the sentence "Pain is the firing of C fibers", pointing out that while it might be valid in a physiological sense, it does not help us to understand how pain feels.
Neural correlates of consciousness
The neural correlates of consciousness (NCC) refer to the relationships between mental states and neural states and constitute the minimal set of neuronal events and mechanisms sufficient for a specific conscious percept. Neuroscientists use empirical approaches to discover neural correlates of subjective phenomena; that is, neural changes which necessarily and regularly correlate with a specific experience.
Chinese room
The Chinese room argument holds that a digital computer executing a program cannot have a "mind", "understanding", or "consciousness", regardless of how intelligently or human-like the program may make the computer behave. The argument was presented by philosopher John Searle in his paper "Minds, Brains, and Programs", published in Behavioral and Brain Sciences in 1980. Similar arguments were presented by Gottfried Leibniz (1714), Anatoly Dneprov (1961), Lawrence Davis (1974) and Ned Block (1978).
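A toy sketch, assuming nothing beyond the paragraph above, can make the intuition concrete: the person in the room answers Chinese questions by matching symbols against a rule book, so fluent-looking output requires no grasp of what the symbols mean. The phrases in the lookup table below are arbitrary examples invented for this sketch, not part of Searle's argument.

```python
# Toy illustration: the "room" maps Chinese input to Chinese output by pure
# symbol lookup. The rule-follower never attaches meaning to the symbols.
# The entries in RULE_BOOK are arbitrary examples invented for this sketch.

RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",       # "How are you?" -> "I'm fine, thanks."
    "你叫什么名字？": "我没有名字。",   # "What is your name?" -> "I have no name."
}


def chinese_room(input_symbols: str) -> str:
    """Follow the rule book mechanically; shapes in, shapes out."""
    return RULE_BOOK.get(input_symbols, "对不起，我不明白。")  # "Sorry, I don't understand."


if __name__ == "__main__":
    print(chinese_room("你好吗？"))  # fluent-seeming output from lookup alone
```

Searle's point is that, on his view, a digital computer running any program is doing essentially this kind of syntactic symbol manipulation, and that syntax alone does not yield semantics, understanding, or consciousness.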