Computer simulation is the process of mathematical modelling, performed on a computer, that is designed to predict the behaviour or the outcome of a real-world or physical system. The reliability of some mathematical models can be determined by comparing their results to the real-world outcomes they aim to predict. Computer simulations have become a useful tool for the mathematical modelling of many natural systems in physics (computational physics), astrophysics, climatology, chemistry, biology and manufacturing, as well as human systems in economics, psychology, social science, health care and engineering. Simulating a system amounts to running the system's model. Simulation can be used to explore and gain new insights into new technology and to estimate the performance of systems too complex for analytical solutions.
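As a minimal illustration of the idea (not drawn from any of the examples on this page), the sketch below simulates a simple physical model, Newton's law of cooling, by stepping it forward in time with Euler's method and then compares the result to the known analytical solution, mirroring how a model's reliability is judged against known outcomes. All parameter values are arbitrary.

```python
# Minimal sketch of a computer simulation: Newton's law of cooling,
# dT/dt = -k * (T - T_env), advanced with explicit Euler steps.
# The exact solution is known here, so the simulated result can be
# checked against it.

import math

def simulate_cooling(T0=90.0, T_env=20.0, k=0.1, dt=0.01, t_end=30.0):
    """Return the simulated temperature at t_end using explicit Euler steps."""
    T, t = T0, 0.0
    while t < t_end:
        T += dt * (-k * (T - T_env))   # model: rate of change of temperature
        t += dt
    return T

if __name__ == "__main__":
    t_end = 30.0
    simulated = simulate_cooling(t_end=t_end)
    exact = 20.0 + (90.0 - 20.0) * math.exp(-0.1 * t_end)  # analytical solution
    print(f"simulated: {simulated:.3f} C, exact: {exact:.3f} C")
```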
Computer simulations are realized by running computer programs that can be either small, running almost instantly on small devices, or large-scale programs that run for hours or days on network-based groups of computers. The scale of events being simulated by computer simulations has far exceeded anything possible (or perhaps even imaginable) using traditional paper-and-pencil mathematical modeling. In 1997, a desert-battle simulation of one force invading another involved the modeling of 66,239 tanks, trucks and other vehicles on simulated terrain around Kuwait, using multiple supercomputers in the DoD High Performance Computer Modernization Program.
Other examples include a 1-billion-atom model of material deformation; a 2.64-million-atom model of the complex protein-producing organelle of all living organisms, the ribosome, in 2005; a complete simulation of the life cycle of Mycoplasma genitalium in 2012; and the Blue Brain project at EPFL (Switzerland), begun in May 2005 to create the first computer simulation of the entire human brain, right down to the molecular level.
Because of the computational cost of simulation, computer experiments (designed sets of simulation runs at chosen input settings) are used to perform inference such as uncertainty quantification.
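Since the page only names the idea, here is a hedged sketch of one common approach: plain Monte Carlo propagation of an uncertain input through a deliberately cheap stand-in simulator. In practice the expensive simulator would be run only at a small designed set of inputs, often with an emulator fitted to those runs; all function names and values below are illustrative assumptions.

```python
# Sketch of uncertainty quantification by Monte Carlo over a simulator.
# The "simulator" here is a cheap stand-in used purely to illustrate the idea.

import math
import random
import statistics

def simulator(k):
    """Stand-in for an expensive simulation: cooled temperature after 30 s."""
    return 20.0 + 70.0 * math.exp(-k * 30.0)

def monte_carlo_uq(n_runs=1000, k_mean=0.1, k_sd=0.02, seed=0):
    """Propagate uncertainty in the rate constant k through the simulator."""
    rng = random.Random(seed)
    samples = [simulator(max(rng.gauss(k_mean, k_sd), 1e-6)) for _ in range(n_runs)]
    return statistics.mean(samples), statistics.stdev(samples)

if __name__ == "__main__":
    mean, sd = monte_carlo_uq()
    print(f"output mean: {mean:.2f} C, output std dev: {sd:.2f} C")
```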
An intensive, hands-on, pragmatic introduction to computer programming. Students learn basic concepts like data types, control structures, string processing, functions, input/output. They perform simulations ...
This course addresses the relationship between specific technological features and the learners' cognitive processes. It also covers the methods and results of empirical studies on this topic: do students ...
Computer graphics deals with generating images and art with the aid of computers. Today, computer graphics is a core technology in digital photography, film, video games, digital art, cell phone and computer displays, and many specialized applications. A great deal of specialized hardware and software has been developed, with the displays of most devices being driven by computer graphics hardware. It is a vast and recently developed area of computer science. The phrase was coined in 1960 by computer graphics researchers Verne Hudson and William Fetter of Boeing.
A simulation is the imitation of the operation of a real-world process or system over time. Simulations require the use of models; the model represents the key characteristics or behaviors of the selected system or process, whereas the simulation represents the evolution of the model over time. Often, computers are used to execute the simulation. Simulation is used in many contexts, such as simulation of technology for performance tuning or optimizing, safety engineering, testing, training, education, and video games.
A computer simulation language is used to describe the operation of a simulation on a computer. There are two major types of simulation, continuous and discrete-event, though more modern languages can handle more complex combinations. Most languages also have a graphical interface and at least a simple statistics-gathering capability for the analysis of the results. An important part of discrete-event languages is the ability to generate pseudo-random numbers and variates from different probability distributions.
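To make the last point concrete, the sketch below is a minimal discrete-event-style example, not tied to any particular simulation language: an M/M/1 queue driven by pseudo-random exponential variates, with a simple statistic (mean waiting time) gathered along the way. The rates, run length and seed are arbitrary illustrative choices.

```python
# Minimal discrete-event-style sketch: waiting times in an M/M/1 queue,
# using pseudo-random exponential variates and simple statistics gathering.

import random

def mm1_mean_wait(arrival_rate=0.9, service_rate=1.0, n_customers=100_000, seed=42):
    """Estimate the mean waiting time in queue via Lindley's recursion."""
    rng = random.Random(seed)
    wait = 0.0          # waiting time of the current customer
    total_wait = 0.0    # accumulated statistic
    for _ in range(n_customers):
        interarrival = rng.expovariate(arrival_rate)   # variate ~ Exp(arrival_rate)
        service = rng.expovariate(service_rate)        # variate ~ Exp(service_rate)
        # Lindley's recursion: next wait = max(0, previous wait + service - interarrival)
        wait = max(0.0, wait + service - interarrival)
        total_wait += wait
    return total_wait / n_customers

if __name__ == "__main__":
    print(f"estimated mean wait: {mm1_mean_wait():.2f} (theory: ~9.0 for rho = 0.9)")
```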
The course provides a comprehensive overview of digital signal processing theory, covering discrete time, Fourier analysis, filter design, sampling, interpolation and quantization; it also includes a ...
Adaptive signal processing, A/D and D/A. This module provides the basic tools for adaptive filtering and a solid mathematical framework for sampling and quantization.
Computer simulations are often used as support material for science education, as they can engage students through inquiry-based learning, promote their active interaction in the experimentation phase, and help them visualize abstract concepts. For instance ...
The goal of this work is to use anisotropic adaptive finite elements for the numerical simulation of aluminium electrolysis. The anisotropic adaptive criteria are based on a posteriori error estimates derived for simplified problems. First, we consider an ...
The aircraft assembly system is highly complex, involving different stakeholders from multiple domains. The design of such a system requires comprehensive consideration of various industrial scenarios aiming to optimize key performance indicators. Tradition ...