The history of computing hardware covers the developments from early simple devices used to aid calculation to the modern day computer. The first aids to computation were purely mechanical devices which required the operator to set up the initial values of an elementary arithmetic operation, then manipulate the device to obtain the result. Later, computers represented numbers in a continuous form, for instance as a distance along a scale, the rotation of a shaft, or a voltage. Numbers could also be represented in the form of digits, automatically manipulated by a mechanism. Although this approach generally required more complex mechanisms, it greatly increased the precision of results.

The development of transistor technology and then the integrated circuit chip led to a series of breakthroughs, starting with transistor computers and then integrated circuit computers, causing digital computers to largely replace analog computers. Metal-oxide-semiconductor (MOS) large-scale integration (LSI) then enabled semiconductor memory and the microprocessor, leading to another key breakthrough, the miniaturized personal computer (PC), in the 1970s. The cost of computers gradually fell so low that personal computers became ubiquitous by the 1990s, followed by mobile computers (smartphones and tablets) in the 2000s.

See also: Timeline of computing hardware before 1950

Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was probably a form of tally stick. The Lebombo bone from the mountains between Eswatini and South Africa may be the oldest known mathematical artifact. It dates from 35,000 BCE and consists of 29 distinct notches that were deliberately cut into a baboon's fibula. Later record-keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.), which represented counts of items, probably livestock or grains, sealed in hollow unbaked clay containers. Counting rods are another example of such counting aids, and the abacus was used early on for arithmetic tasks.

Related courses (1)
HUM-439: Digital humanities I
Combining digital methods and history, this course examines how the press covered the major international events of the 20th century. Working with digitized archives, students …
Related lectures (13)
Physics-inspired Computing and Neuromorphic Algorithms
Explores physics-inspired computing, analog computing, optical computing, and the potential of spin waves for physics-based computing.
Neuromorphic Computing: Concepts and Hardware Implementations
Covers neuromorphic computing, challenges in ternary and binary computing, hardware simulations of the brain, and new materials for artificial brain cells.
LabVIEW Programming Essentials
Explores LabVIEW essentials, troubleshooting common issues, managing cache, and data visualization techniques.
Related publications (52)

ALPINE: Analog In-Memory Acceleration with Tight Processor Integration for Deep Learning

David Atienza Alonso, Marina Zapater Sancho, Giovanni Ansaloni, Alexandre Sébastien Julien Levisse, Irem Boybat Kara, Yasir Mahmood Qureshi, Joshua Alexander Harrison Klein, Abu Sebastian

Analog in-memory computing (AIMC) cores offer significant performance and energy benefits for neural network inference with respect to digital logic (e.g., CPUs). AIMCs accelerate matrix-vector multiplications, which dominate these applications’ run-time. ...
2022
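
As a rough intuition for why AIMC speeds up inference (a toy numerical model, not the ALPINE design itself): the weight matrix is held in place as device conductances, so a single analog step yields a full matrix-vector product, traded against device noise. A minimal sketch, assuming a 1% noise level:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model of an analog in-memory MVM: weights are programmed into a
# crossbar as conductances; applying input voltages produces all output
# currents (one full matrix-vector product) in a single analog step.
W = rng.standard_normal((4, 8))   # weights stored in the crossbar
x = rng.standard_normal(8)        # input activations as voltages

exact = W @ x                     # what a digital MAC array would compute

# Analog computation is approximate: model device/readout noise as a
# small Gaussian perturbation on the result (1% is an assumed figure).
analog = exact + 0.01 * rng.standard_normal(exact.shape)

print("max error vs. digital:", np.max(np.abs(analog - exact)))
```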

Parallel Analog Computing Based on a 2×2 Multiple-Input Multiple-Output Metasurface Processor With Asymmetric Response

Romain Christophe Rémy Fleury, Ali Momeni, Amirhossein Babaee

We present a polarization-insensitive metasurface processor to perform spatial asymmetric filtering of an incident beam, thereby allowing for real-time parallel analog processing. To enable massive parallel processing, we introduce a multiple-input multipl ...
2021
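
One way to picture asymmetric spatial filtering numerically (a generic Fourier-optics analogy, not the paper's metasurface implementation) is a transfer function that is odd in the spatial frequency k, such as H(k) = ik, which differentiates the incident field profile:

```python
import numpy as np

# Asymmetric spatial filtering: a transfer function that treats +k and
# -k spatial frequencies differently. H(k) = i*k is odd (H(-k) = -H(k))
# and acts as a spatial differentiator on the incident beam profile.
x = np.linspace(-10, 10, 512)
field = np.exp(-x**2)                      # incident Gaussian beam profile

k = 2 * np.pi * np.fft.fftfreq(x.size, d=x[1] - x[0])
H = 1j * k                                 # asymmetric transfer function

filtered = np.fft.ifft(H * np.fft.fft(field)).real
expected = -2 * x * np.exp(-x**2)          # analytic d/dx of the Gaussian

print("max deviation from d/dx:", np.max(np.abs(filtered - expected)))
```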

Blade: An in-Cache Computing Architecture for Edge Devices

David Atienza Alonso, Marina Zapater Sancho, Alexandre Sébastien Julien Levisse, Marco Antonio Rios, William Andrew Simon, Yasir Mahmood Qureshi

Area- and power-constrained edge devices are increasingly utilized to perform compute-intensive workloads, necessitating increasingly area- and power-efficient accelerators. In this context, in-SRAM computing performs hundreds of parallel operations on spati ...
2020
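
A software analogy for in-SRAM computing (not the BLADE circuitry itself): activating two wordlines at once lets the bitlines read out a bitwise function of both stored words, so a single array access performs many one-bit operations in parallel:

```python
# Software analogy of in-SRAM bitwise computing: reading two rows
# simultaneously lets the bitlines evaluate a bitwise function of both
# words, so one "access" performs many 1-bit operations in parallel.
row_a = 0b10110110
row_b = 0b01110010

print(f"AND: {row_a & row_b:08b}")  # wired-AND style bitline readout
print(f"OR:  {row_a | row_b:08b}")  # wired-OR style bitline readout
```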
Related concepts (34)
Information technology
Information technology (IT) is the use of computers to create, process, store, retrieve and exchange all kinds of data and information. IT forms part of information and communications technology (ICT). An information technology system (IT system) is generally an information system, a communications system, or, more specifically speaking, a computer system — including all hardware, software, and peripheral equipment — operated by a limited group of IT users, and an IT project usually refers to the commissioning and implementation of an IT system.
ENIAC
ENIAC (/ˈɛniæk/; Electronic Numerical Integrator and Computer) was the first programmable, electronic, general-purpose digital computer, completed in 1945. There were other computers that had combinations of these features, but the ENIAC had all of them in one computer. It was Turing-complete and able to solve "a large class of numerical problems" through reprogramming. Although ENIAC was designed and primarily used to calculate artillery firing tables for the United States Army's Ballistic Research Laboratory (which later became a part of the Army Research Laboratory), its first program was a study of the feasibility of the thermonuclear weapon.
Information Age
The Information Age (also known as the Computer Age, Digital Age, Silicon Age, New Media Age, or Media Age) is a historical period that began in the mid-20th century. It is characterized by a rapid shift from traditional industries, as established during the Industrial Revolution, to an economy centered on information technology. The onset of the Information Age has been linked to the development of the transistor in 1947, the optical amplifier in 1957, and Unix time, which began on January 1, 1970.
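Unix time simply counts the seconds elapsed since 1970-01-01 00:00:00 UTC, a convention that is easy to verify:

```python
from datetime import datetime, timezone

# Unix time counts seconds elapsed since the epoch, 1970-01-01 00:00:00 UTC.
epoch = datetime.fromtimestamp(0, tz=timezone.utc)
print(epoch.isoformat())  # 1970-01-01T00:00:00+00:00
```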