Reconfigurable computing: Reconfigurable computing is a computer architecture that combines some of the flexibility of software with the high performance of hardware by processing on very flexible, high-speed computing fabrics such as field-programmable gate arrays (FPGAs). The principal difference compared to ordinary microprocessors is the ability to make substantial changes to the datapath itself, in addition to the control flow. The main difference from custom hardware, i.e. application-specific integrated circuits (ASICs), is the possibility of adapting the hardware during runtime by loading a new circuit onto the reconfigurable fabric.
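To make the datapath-versus-control-flow distinction concrete, the sketch below models the basic reconfigurable element of many FPGAs, a small lookup table, in Python. The LUT4 class and its reconfigure method are illustrative assumptions for this sketch, not any vendor's API.

```python
# Minimal sketch of the reconfigurable-computing idea: the same "fabric"
# (here a 4-input lookup table, the basic element of many FPGAs) can be
# reprogrammed at runtime to implement different combinational functions.
# The LUT4 class and its interface are illustrative, not a real vendor API.

class LUT4:
    """A 4-input lookup table: 16 configuration bits define its function."""

    def __init__(self, config_bits):
        assert len(config_bits) == 16
        self.config = list(config_bits)

    def reconfigure(self, config_bits):
        # "Loading a new circuit": the datapath itself changes, not just
        # the instructions running on a fixed datapath.
        assert len(config_bits) == 16
        self.config = list(config_bits)

    def evaluate(self, a, b, c, d):
        index = (a << 3) | (b << 2) | (c << 1) | d
        return self.config[index]


# Program the LUT as a 4-input AND, then reprogram it as a 4-input XOR.
and4 = [1 if i == 0b1111 else 0 for i in range(16)]
xor4 = [bin(i).count("1") % 2 for i in range(16)]

lut = LUT4(and4)
print(lut.evaluate(1, 1, 1, 1))  # 1
lut.reconfigure(xor4)
print(lut.evaluate(1, 1, 1, 1))  # 0
```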
Beta particle: A beta particle, also called a beta ray or beta radiation (symbol β), is a high-energy, high-speed electron or positron emitted by the radioactive decay of an atomic nucleus during the process of beta decay. There are two forms of beta decay, β− decay and β+ decay, which produce electrons and positrons respectively. Beta particles with an energy of 0.5 MeV have a range of about one metre in air; the distance depends on the particle's energy.
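As a brief illustration of the two forms, the underlying nucleon-level reactions can be written as:

$$\beta^-\ \text{decay:}\quad n \rightarrow p + e^- + \bar{\nu}_e \qquad\qquad \beta^+\ \text{decay:}\quad p \rightarrow n + e^+ + \nu_e$$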
Programmable logic device: A programmable logic device (PLD) is an electronic component used to build reconfigurable digital circuits. Unlike digital logic constructed using discrete logic gates with fixed functions, a PLD has an undefined function at the time of manufacture. Before the PLD can be used in a circuit it must be programmed to implement the desired function. Compared to fixed logic devices, programmable logic devices simplify the design of complex logic and may offer superior performance.
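As a rough illustration of what "programmed to implement the desired function" means, the Python sketch below models a PLA-style sum-of-products device whose behaviour comes entirely from a programmable list of product terms; the make_pld helper and the term representation are hypothetical, not a real device format.

```python
# Minimal sketch of how a simple PLD (a PLA-style AND-OR array) gets its
# function from programming rather than from fixed wiring.

def make_pld(product_terms):
    """Each product term is a dict mapping input index -> required value.
    The programmed device ORs together all product terms (sum of products)."""
    def evaluate(inputs):
        for term in product_terms:
            if all(inputs[i] == v for i, v in term.items()):
                return 1
        return 0
    return evaluate

# "Program" the blank device to implement f(a, b, c) = a*b + (not c):
f = make_pld([{0: 1, 1: 1}, {2: 0}])

print(f([1, 1, 1]))  # 1 (the a*b term fires)
print(f([0, 0, 1]))  # 0 (no term fires)
print(f([0, 0, 0]))  # 1 (the not-c term fires)
```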
DEC Alpha: Alpha (original name Alpha AXP) is a 64-bit reduced instruction set computer (RISC) instruction set architecture (ISA) developed by Digital Equipment Corporation (DEC). Alpha was designed to replace 32-bit VAX complex instruction set computers (CISC) and to be a highly competitive RISC processor for Unix workstations and similar markets. Alpha is implemented in a series of microprocessors originally developed and fabricated by DEC.
Particle identification: Particle identification is the process of using information left by a particle passing through a particle detector to identify the type of particle. Particle identification reduces backgrounds and improves measurement resolutions, and is essential to many analyses at particle detectors. Charged particles have been identified using a variety of techniques. All methods rely on a measurement of the momentum in a tracking chamber combined with a measurement of the velocity to determine the charged particle mass, and therefore its identity.
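The mass determination follows from relativistic kinematics: writing the measured velocity as $\beta = v/c$ with Lorentz factor $\gamma = 1/\sqrt{1-\beta^2}$, the momentum and velocity together give

$$p = \gamma m \beta c \quad\Longrightarrow\quad m = \frac{p}{\gamma \beta c} = \frac{p}{c}\sqrt{\frac{1}{\beta^2} - 1}.$$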
Particle accelerator: A particle accelerator is a machine that uses electromagnetic fields to propel charged particles to very high speeds and energies, and to contain them in well-defined beams. Large accelerators are used for fundamental research in particle physics. The largest accelerator currently active is the Large Hadron Collider (LHC) near Geneva, Switzerland, operated by CERN. It is a collider accelerator, which can accelerate two beams of protons to an energy of 6.5 TeV and cause them to collide head-on, creating center-of-mass energies of 13 TeV.
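The 13 TeV figure follows directly from the beam energies: for two beams of equal energy colliding head-on, the beam momenta cancel and the invariant (center-of-mass) energy is

$$\sqrt{s} = \sqrt{(E_1 + E_2)^2 - \left(\vec{p}_1 + \vec{p}_2\right)^2 c^2} = 2\,E_{\text{beam}} = 2 \times 6.5\ \text{TeV} = 13\ \text{TeV}.$$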
Place and route: Place and route is a stage in the design of printed circuit boards, integrated circuits, and field-programmable gate arrays. As implied by the name, it is composed of two steps, placement and routing. The first step, placement, involves deciding where to place all electronic components, circuitry, and logic elements in a generally limited amount of space. This is followed by routing, which decides the exact design of all the wires needed to connect the placed components.
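A toy Python sketch of the two steps is shown below: placement picks grid locations for a handful of components, and routing emits a simple L-shaped path for each connection. The cell names, grid, and exhaustive search are illustrative assumptions; real tools use far more sophisticated placement (e.g. simulated annealing or analytic placement) and routing (e.g. maze routing) algorithms.

```python
# Toy sketch of the two place-and-route steps on a small grid.
from itertools import permutations

cells = ["A", "B", "C", "D"]                  # components to place
slots = [(0, 0), (0, 1), (1, 0), (1, 1)]      # available grid locations
nets = [("A", "B"), ("B", "C"), ("A", "D")]   # required connections

def wirelength(placement):
    # Routing cost estimate: Manhattan distance per net.
    return sum(abs(placement[u][0] - placement[v][0]) +
               abs(placement[u][1] - placement[v][1])
               for u, v in nets)

# Placement: exhaustively pick the assignment with the shortest estimated wiring.
best = min((dict(zip(cells, p)) for p in permutations(slots)), key=wirelength)

# Routing: emit an L-shaped (horizontal-then-vertical) path for each net.
for u, v in nets:
    (x1, y1), (x2, y2) = best[u], best[v]
    print(f"{u}->{v}: bend at ({x2}, {y1})")
```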
Graphics processing unit: A graphics processing unit (GPU) is a specialized electronic circuit initially designed to accelerate computer graphics and image processing (either on a video card or embedded in motherboards, mobile phones, personal computers, workstations, and game consoles). After their initial design, GPUs were found to be useful for non-graphics calculations involving embarrassingly parallel problems due to their parallel structure. Other non-graphical uses include the training of neural networks and cryptocurrency mining.
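The sketch below illustrates what "embarrassingly parallel" means: every output element depends only on its own input element, so the work splits cleanly across workers. It uses ordinary CPU processes for portability; a GPU exploits the same independence with thousands of hardware threads. The brighten function and the data are illustrative placeholders.

```python
# Sketch of an embarrassingly parallel workload: each output element
# depends only on the matching input element, so elements can be
# processed in any order and on any worker.
from multiprocessing import Pool

def brighten(pixel):
    # Independent per-element work, e.g. a trivial image adjustment.
    return min(255, pixel + 40)

if __name__ == "__main__":
    pixels = list(range(0, 256, 8))          # stand-in for image data
    with Pool(processes=4) as pool:
        result = pool.map(brighten, pixels)  # each element handled independently
    print(result[:5])
```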
Data-driven testing: Data-driven testing (DDT), also known as table-driven testing or parameterized testing, is a software testing methodology in which test inputs and verifiable outputs are read directly from a table of conditions, and in which test environment settings and control are not hard-coded. In the simplest form, the tester supplies the inputs from a row in the table and expects the outputs that appear in the same row.
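One common way to express this in Python is pytest's parametrize decorator, where each table row supplies the inputs and the expected output while the test body stays generic; the add function and the table values here are illustrative placeholders.

```python
# Data-driven test: the table rows supply both the inputs and the expected
# output, and the test body stays generic.
import pytest

def add(a, b):
    return a + b

CASES = [
    # (input a, input b, expected output)
    (1, 2, 3),
    (0, 0, 0),
    (-4, 9, 5),
]

@pytest.mark.parametrize("a,b,expected", CASES)
def test_add(a, b, expected):
    assert add(a, b) == expected
```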
Overfitting: In mathematical modeling, overfitting is "the production of an analysis that corresponds too closely or exactly to a particular set of data, and may therefore fail to fit additional data or predict future observations reliably". An overfitted model is a mathematical model that contains more parameters than can be justified by the data. In polynomial regression, for example, the number of these parameters corresponds to the degree of the polynomial being fitted. The essence of overfitting is to have unknowingly extracted some of the residual variation (i.e. the noise) as if that variation represented underlying model structure.
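A minimal Python sketch of this effect, using synthetic data: a degree-9 polynomial fitted through 10 noisy points reproduces the sample almost exactly (it absorbs the noise), but typically predicts held-out points worse than a simple degree-1 fit of the same data.

```python
# Sketch of overfitting via polynomial degree: the high-degree fit drives
# the training error toward zero while the test error grows, because the
# extra parameters fit the noise rather than the underlying linear trend.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 10)
y = 2 * x + rng.normal(scale=0.1, size=x.size)   # true relation is linear

x_new = np.linspace(0, 1, 100)                   # held-out points
y_new = 2 * x_new

for degree in (1, 9):
    coeffs = np.polyfit(x, y, degree)
    train_err = np.mean((np.polyval(coeffs, x) - y) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_new) - y_new) ** 2)
    print(f"degree {degree}: train MSE {train_err:.4f}, test MSE {test_err:.4f}")
```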