Central processing unit
A central processing unit (CPU)—also called a central processor or main processor—is the most important processor in a given computer. Its electronic circuitry executes instructions of a computer program, such as arithmetic, logic, control, and input/output (I/O) operations. This role contrasts with that of external components, such as main memory and I/O circuitry, and specialized coprocessors such as graphics processing units (GPUs). The form, design, and implementation of CPUs have changed over time, but their fundamental operation remains almost unchanged.
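The fundamental operation mentioned above is the fetch-decode-execute cycle. A minimal sketch of that cycle, using a hypothetical toy instruction set (the opcodes and program below are invented for illustration, not any real ISA):

```python
# Toy fetch-decode-execute loop: each instruction is an (opcode, operand)
# pair, with an accumulator register and a program counter.

def run(program, memory):
    acc = 0   # accumulator register
    pc = 0    # program counter
    while pc < len(program):
        opcode, operand = program[pc]   # fetch
        pc += 1
        if opcode == "LOAD":            # decode and execute
            acc = memory[operand]
        elif opcode == "ADD":
            acc += memory[operand]
        elif opcode == "STORE":
            memory[operand] = acc
        elif opcode == "HALT":
            break
    return memory

memory = [7, 35, 0]
program = [("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)]
run(program, memory)
# memory[2] now holds 7 + 35 = 42
```

Real CPUs pipeline and reorder these steps, but the logical cycle is the same.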
Dynamic logic (digital electronics)
In integrated circuit design, dynamic logic (or sometimes clocked logic) is a design methodology in combinational logic circuits, particularly those implemented in metal–oxide–semiconductor (MOS) technology. It is distinguished from the so-called static logic by exploiting temporary storage of information in stray and gate capacitances. It was popular in the 1970s and has seen a recent resurgence in the design of high-speed digital electronics, particularly central processing units (CPUs).
Error correction code
In computing, telecommunication, information theory, and coding theory, forward error correction (FEC) or channel coding is a technique used for controlling errors in data transmission over unreliable or noisy communication channels. The central idea is that the sender encodes the message in a redundant way, most often by using an error correction code or error correcting code (ECC). The redundancy allows the receiver not only to detect errors that may occur anywhere in the message, but often to correct a limited number of errors.
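The redundant-encoding idea can be illustrated with the simplest possible ECC, a 3x repetition code: the sender transmits each bit three times, and the receiver takes a majority vote, which corrects any single bit-flip within a triple. This is a minimal sketch, not a practical code:

```python
# 3x repetition code: encode duplicates each bit, decode majority-votes.

def encode(bits):
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    out = []
    for i in range(0, len(received), 3):
        triple = received[i:i + 3]
        out.append(1 if sum(triple) >= 2 else 0)  # majority vote
    return out

msg = [1, 0, 1, 1]
sent = encode(msg)
sent[4] ^= 1                 # channel noise flips one bit in transit
assert decode(sent) == msg   # the receiver still recovers the message
```

Practical codes (Hamming, Reed–Solomon, LDPC) achieve the same effect with far less redundancy, but the sender-adds-redundancy, receiver-corrects structure is identical.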
Magnetic-core memory
Magnetic-core memory was the predominant form of random-access computer memory for 20 years between about 1955 and 1975. Such memory is often just called core memory, or, informally, core. Core memory uses toroids (rings) of a hard magnetic material (usually a semi-hard ferrite) as transformer cores, where each wire threaded through the core serves as a transformer winding. Two or more wires pass through each core. Magnetic hysteresis allows each of the cores to "remember", or store a state.
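A toy model of one core illustrates how hysteresis stores a bit, and a consequence not obvious from the description above: core readout is destructive, since sensing a core means driving it to a known polarity, so the memory controller must rewrite the value it just read. This is a behavioral sketch, not a physical simulation:

```python
# Behavioral model of a single magnetic core with destructive readout.

class Core:
    def __init__(self):
        self.state = 0          # magnetization polarity, 0 or 1

    def write(self, bit):
        self.state = bit        # drive current sets the core's polarity

    def read(self):
        bit = self.state
        self.state = 0          # sensing drives the core to 0 (destructive)
        return bit

core = Core()
core.write(1)
bit = core.read()               # returns 1, but the core is now 0
core.write(bit)                 # the rewrite half of the read cycle
assert bit == 1 and core.state == 1
```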
Error detection and correction
In information theory and coding theory with applications in computer science and telecommunication, error detection and correction (EDAC) or error control are techniques that enable reliable delivery of digital data over unreliable communication channels. Many communication channels are subject to channel noise, and thus errors may be introduced during transmission from the source to a receiver. Error detection techniques allow detecting such errors, while error correction enables reconstruction of the original data in many cases.
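The detection side of that distinction can be sketched with a single even-parity bit: one flipped bit is detectable, but, unlike with an error-correcting code, the receiver cannot tell which bit flipped and must request retransmission. A minimal sketch:

```python
# Even parity: append one bit so the total count of 1s is even.

def add_parity(bits):
    return bits + [sum(bits) % 2]

def check(word):
    return sum(word) % 2 == 0   # True if parity still holds

word = add_parity([1, 0, 1, 1, 0])
assert check(word)              # clean transmission passes the check
word[2] ^= 1                    # channel noise flips one bit
assert not check(word)          # error detected, but not locatable
```

Note that two flipped bits cancel out and escape a single parity check, which is why stronger codes layer multiple parity equations.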
Molten-Salt Reactor Experiment
The Molten-Salt Reactor Experiment (MSRE) was an experimental molten-salt research reactor at the Oak Ridge National Laboratory (ORNL). The technology was researched through the 1960s: the reactor was constructed by 1964, went critical in 1965, and was operated until 1969. The costs of a cleanup project were estimated at about $130 million. The MSRE was a 7.4 MWth test reactor simulating the neutronic "kernel" of a type of inherently safer epithermal thorium breeder reactor called the liquid fluoride thorium reactor.
Liquid fluoride thorium reactor
The liquid fluoride thorium reactor (LFTR; often pronounced lifter) is a type of molten salt reactor. LFTRs use the thorium fuel cycle with a fluoride-based molten (liquid) salt for fuel. In a typical design, the liquid is pumped between a critical core and an external heat exchanger where the heat is transferred to a nonradioactive secondary salt. The secondary salt then transfers its heat to a steam turbine or closed-cycle gas turbine. In molten-salt-fueled reactors (MSRs), the nuclear fuel is mixed into the molten salt itself.
Computer performance
In computing, computer performance is the amount of useful work accomplished by a computer system. Outside of specific contexts, computer performance is estimated in terms of accuracy, efficiency and speed of executing computer program instructions. When it comes to high computer performance, one or more of the following factors might be involved:
- Short response time for a given piece of work.
- High throughput (rate of processing work).
- Low utilization of computing resource(s).
- Fast (or highly compact) data compression and decompression.
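Two of the factors above, response time and throughput, can be measured directly on a toy workload. This is a minimal sketch using Python's standard timer; the workload function and iteration count are arbitrary choices for illustration:

```python
# Measure per-task response time (latency) and overall throughput.
import time

def task():
    sum(i * i for i in range(10_000))   # stand-in piece of work

n = 200
latencies = []
start = time.perf_counter()
for _ in range(n):
    t0 = time.perf_counter()
    task()
    latencies.append(time.perf_counter() - t0)
elapsed = time.perf_counter() - start

print(f"mean response time: {1e3 * sum(latencies) / n:.3f} ms")
print(f"throughput: {n / elapsed:.0f} tasks/sec")
```

Note that the two metrics can diverge: batching or parallelism often raises throughput while worsening the response time of any individual task.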