Quantum supremacy
In quantum computing, quantum supremacy, quantum primacy or quantum advantage is the goal of demonstrating that a programmable quantum computer can solve a problem that no classical computer can solve in any feasible amount of time, irrespective of the usefulness of the problem. The term was coined by John Preskill in 2012, but the concept dates back to Yuri Manin's 1980 and Richard Feynman's 1981 proposals of quantum computing.
Quantum Fourier transform
In quantum computing, the quantum Fourier transform (QFT) is a linear transformation on quantum bits, and is the quantum analogue of the discrete Fourier transform. The quantum Fourier transform is a part of many quantum algorithms, notably Shor's algorithm for factoring and computing the discrete logarithm, the quantum phase estimation algorithm for estimating the eigenvalues of a unitary operator, and algorithms for the hidden subgroup problem. The quantum Fourier transform was discovered by Don Coppersmith.
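For concreteness, the QFT acts on a computational basis state of n qubits as follows (the standard textbook definition, with N = 2^n):

```latex
\mathrm{QFT} : \; |j\rangle \;\longmapsto\; \frac{1}{\sqrt{N}} \sum_{k=0}^{N-1} e^{2\pi i\, jk/N}\, |k\rangle , \qquad N = 2^{n} .
```

Applied to a superposition, this maps the state's amplitude vector by the discrete Fourier transform matrix (up to the sign convention in the exponent), implemented as a unitary operation.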
Thermodynamic temperature
Thermodynamic temperature is a quantity defined in thermodynamics as distinct from kinetic theory or statistical mechanics. Historically, thermodynamic temperature was defined by Lord Kelvin in terms of a macroscopic relation between thermodynamic work and heat transfer, but the kelvin was redefined by international agreement in 2019 in terms of phenomena that are now understood as manifestations of the kinetic energy of free motion of microscopic particles such as atoms, molecules, and electrons.
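Concretely, the 2019 redefinition fixes the Boltzmann constant at an exact value, and the link to microscopic motion is the standard kinetic-theory relation for the mean translational kinetic energy of a free particle (both are well-established results, stated here for illustration):

```latex
k_{\mathrm{B}} = 1.380649 \times 10^{-23}\ \mathrm{J\,K^{-1}} \quad (\text{exact, by definition}),
\qquad
\langle E_{\mathrm{kin}} \rangle = \tfrac{3}{2}\, k_{\mathrm{B}} T .
```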
Quantum logic
In the mathematical study of logic and the physical analysis of quantum foundations, quantum logic is a set of rules for the manipulation of propositions inspired by the structure of quantum theory. The formal system takes as its starting point an observation of Garrett Birkhoff and John von Neumann that the structure of experimental tests in classical mechanics forms a Boolean algebra, whereas the structure of experimental tests in quantum mechanics forms a non-distributive orthomodular lattice: the lattice of closed subspaces of a Hilbert space.
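A minimal example of how the Boolean structure breaks down, using the lattice of subspaces of the plane as a toy model of quantum propositions: let p, q, r be three distinct lines through the origin. The distributive law, valid in any Boolean algebra, fails:

```latex
p \wedge (q \vee r) \;=\; p \wedge \mathbb{R}^{2} \;=\; p ,
\qquad\text{but}\qquad
(p \wedge q) \vee (p \wedge r) \;=\; 0 \vee 0 \;=\; 0 ,
```

since any two distinct lines span the whole plane but intersect only at the origin.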
Cryogenics
In physics, cryogenics is the production and behaviour of materials at very low temperatures. The 13th IIR International Congress of Refrigeration (held in Washington, DC in 1971) endorsed a universal definition of "cryogenics" and "cryogenic" by accepting a threshold of 120 K (−153 °C) to distinguish these terms from conventional refrigeration. This is a logical dividing line, since the normal boiling points of the so-called permanent gases (such as helium, hydrogen, neon, nitrogen, oxygen, and normal air) lie below 120 K, while the Freon refrigerants, hydrocarbons, and other common refrigerants have boiling points above 120 K.
Quantum cryptography
Quantum cryptography is the science of exploiting quantum mechanical properties to perform cryptographic tasks. The best known example of quantum cryptography is quantum key distribution, which offers an information-theoretically secure solution to the key exchange problem. The advantage of quantum cryptography lies in the fact that it allows the completion of various cryptographic tasks that are proven or conjectured to be impossible using only classical (i.e. non-quantum) communication.
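The best-known QKD protocol is BB84. The following is a minimal classical simulation of its sifting step, sketched under idealized assumptions (noiseless channel, no eavesdropper); the function names are illustrative, not a real QKD library. It shows how positions where Alice's and Bob's randomly chosen bases coincide yield a shared secret key.

```python
import secrets

def random_bits(n: int) -> list[int]:
    """Return n uniformly random bits."""
    return [secrets.randbelow(2) for _ in range(n)]

def bb84_sift(n: int = 32) -> list[int]:
    alice_bits = random_bits(n)    # Alice's secret data bits
    alice_bases = random_bits(n)   # 0 = rectilinear basis, 1 = diagonal basis
    bob_bases = random_bits(n)     # Bob's independently chosen measurement bases
    # Idealized noiseless channel, no eavesdropper: measuring in Alice's basis
    # recovers her bit exactly; otherwise the outcome is uniformly random.
    bob_bits = [a if ab == bb else secrets.randbelow(2)
                for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Sifting: the two parties publicly compare bases (never bits) and keep
    # only the positions where the bases agree.
    keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
    key_alice = [alice_bits[i] for i in keep]
    key_bob = [bob_bits[i] for i in keep]
    assert key_alice == key_bob  # holds only because the channel is noiseless
    return key_alice

print(bb84_sift())  # roughly n/2 shared key bits, e.g. [1, 0, 0, 1, ...]
```

In the real protocol, a sample of the sifted key is then compared publicly to estimate the error rate: any eavesdropper measuring in the wrong basis unavoidably disturbs the qubits and reveals herself.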
Quantum annealing
Quantum annealing (QA) is an optimization process for finding the global minimum of a given objective function over a given set of candidate solutions (candidate states) by a process using quantum fluctuations. Quantum annealing is used mainly for problems where the search space is discrete (combinatorial optimization problems) with many local minima, such as finding the ground state of a spin glass or solving the traveling salesman problem. The term "quantum annealing" was first proposed in 1988 by B. Apolloni, N. Cesa Bianchi and D. De Falco.
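In the standard transverse-field Ising formulation (stated here as the textbook form, not any particular machine's Hamiltonian), the annealer evolves under a time-dependent Hamiltonian that interpolates from strong quantum fluctuations to the pure cost function:

```latex
H(t) \;=\; -A(t) \sum_{i} \sigma_i^{x}
\;+\; B(t) \left( \sum_{i} h_i\, \sigma_i^{z} \;+\; \sum_{i<j} J_{ij}\, \sigma_i^{z} \sigma_j^{z} \right),
```

with A(0) ≫ B(0) and A decreasing to nearly zero by the end of the anneal, so that the final ground state encodes the minimum of the classical objective defined by the fields h_i and couplings J_ij.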
Software performance testing
In software quality assurance, performance testing is a testing practice performed to determine how a system performs in terms of responsiveness and stability under a particular workload. It can also serve to investigate, measure, validate or verify other quality attributes of the system, such as scalability, reliability and resource usage. Performance testing is a subset of performance engineering, a computer science practice which strives to build performance standards into the implementation, design and architecture of a system.
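As a minimal sketch of the idea, the following drives a concurrent workload and reports response-time percentiles; `handle_request` is a stand-in for the system under test, and the parameters are illustrative, not a real benchmarking API.

```python
import statistics
import time
import random
from concurrent.futures import ThreadPoolExecutor

def handle_request() -> float:
    """Stand-in for the system under test; returns observed latency in seconds."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.005, 0.030))  # simulated service work
    return time.perf_counter() - start

def load_test(n_requests: int = 200, concurrency: int = 20) -> None:
    # Issue n_requests calls with up to `concurrency` in flight at once.
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = sorted(pool.map(lambda _: handle_request(), range(n_requests)))
    p50 = statistics.median(latencies)
    p95 = latencies[int(0.95 * len(latencies)) - 1]
    print(f"requests={n_requests} concurrency={concurrency} "
          f"p50={p50 * 1000:.1f}ms p95={p95 * 1000:.1f}ms "
          f"max={latencies[-1] * 1000:.1f}ms")

load_test()
```

Repeating such a run while stepping up `concurrency` is the essence of a load/scalability test: responsiveness (the percentiles) should degrade gracefully rather than collapse as the workload grows.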
Operating system
An operating system (OS) is system software that manages computer hardware and software resources, and provides common services for computer programs. Time-sharing operating systems schedule tasks for efficient use of the system and may also include accounting software for cost allocation of processor time, mass storage, peripherals, and other resources.
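Time-sharing is conventionally implemented by preemptive scheduling such as round-robin. The following toy simulation (a sketch, not any real OS scheduler) shows tasks sharing the processor in fixed time slices:

```python
from collections import deque

def round_robin(tasks: dict[str, int], quantum: int = 3) -> None:
    """Simulate round-robin scheduling.

    `tasks` maps a task name to its remaining CPU time; each task runs for
    at most `quantum` ticks before being preempted and requeued.
    """
    ready = deque(tasks.items())
    clock = 0
    while ready:
        name, remaining = ready.popleft()
        slice_ = min(quantum, remaining)
        clock += slice_              # the task occupies the CPU for one slice
        remaining -= slice_
        if remaining > 0:
            ready.append((name, remaining))  # preempted: back of the queue
        else:
            print(f"t={clock:3d}: {name} finished")

round_robin({"editor": 4, "compiler": 9, "daemon": 2})
```

The same bookkeeping (ticks consumed per task) is what accounting software builds on when allocating processor-time costs.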
Reconfigurable computing
Reconfigurable computing is a computer architecture combining some of the flexibility of software with the high performance of hardware by processing with very flexible, high-speed computing fabrics like field-programmable gate arrays (FPGAs). The principal difference when compared to using ordinary microprocessors is the ability to make substantial changes to the datapath itself in addition to the control flow. On the other hand, the main difference from custom hardware, i.e. application-specific integrated circuits (ASICs), is the possibility of adapting the hardware during runtime by "loading" a new circuit onto the reconfigurable fabric.
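A rough software analogy of that distinction (purely illustrative; nothing here resembles real FPGA tooling): a microprocessor keeps a fixed datapath and varies only the instruction stream, whereas a reconfigurable fabric lets the datapath itself be replaced at runtime.

```python
from typing import Callable

class ReconfigurableFabric:
    """Toy analogy: the datapath itself is swappable at runtime."""

    def __init__(self) -> None:
        # Unconfigured fabric: a pass-through datapath.
        self.datapath: Callable[[int], int] = lambda x: x

    def load_bitstream(self, circuit: Callable[[int], int]) -> None:
        # "Reprogram the hardware" by installing a new datapath.
        self.datapath = circuit

    def run(self, value: int) -> int:
        return self.datapath(value)

fabric = ReconfigurableFabric()
fabric.load_bitstream(lambda x: x * x)              # configure as a squarer
print(fabric.run(7))                                # 49
fabric.load_bitstream(lambda x: bin(x).count("1"))  # reconfigure as a popcount unit
print(fabric.run(7))                                # 3
```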