In computing, code generation is part of the process chain of a compiler: it converts an intermediate representation of the source code into a form (e.g., machine code) that can be readily executed by the target system. Sophisticated compilers typically perform multiple passes over various intermediate forms. This multi-stage process is used because many algorithms for code optimization are easier to apply one at a time, or because the input to one optimization relies on the completed processing performed by another optimization. This organization also facilitates the creation of a single compiler that can target multiple architectures, since only the last of the code generation stages (the backend) needs to change from target to target. (For more information on compiler design, see Compiler.)

The input to the code generator typically consists of a parse tree or an abstract syntax tree. The tree is converted into a linear sequence of instructions, usually in an intermediate language such as three-address code. Further stages of compilation may or may not be referred to as "code generation", depending on whether they involve a significant change in the representation of the program. (For example, a peephole optimization pass would not likely be called "code generation", although a code generator might incorporate a peephole optimization pass.)

In addition to the basic conversion from an intermediate representation into a linear sequence of machine instructions, a typical code generator tries to optimize the generated code in some way. Tasks which are typically part of a sophisticated compiler's "code generation" phase include:
Instruction selection: which instructions to use.
Instruction scheduling: in which order to put those instructions. Scheduling is a speed optimization that can have a critical effect on pipelined machines.
Register allocation: the allocation of variables to processor registers.
Debug data generation, if required, so the code can be debugged.
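To make the lowering and register-allocation steps concrete, here is a minimal sketch in Scala (the language used by several related courses and MOOCs on this page). Everything in it is invented for illustration: the names TinyCodeGen, Expr, TAC, lower, and allocate do not come from any real compiler, and the register allocator is deliberately naive.

```scala
// Illustrative sketch only: lowers a tiny arithmetic AST into three-address code and
// performs a deliberately naive register assignment. All names here are invented for
// this example and do not come from any real compiler.
object TinyCodeGen {

  // Abstract syntax tree handed over by earlier compiler phases.
  sealed trait Expr
  final case class Const(value: Int)         extends Expr
  final case class Var(name: String)         extends Expr
  final case class Add(lhs: Expr, rhs: Expr) extends Expr
  final case class Mul(lhs: Expr, rhs: Expr) extends Expr

  // One three-address instruction: dst := a op b.
  final case class TAC(dst: String, op: String, a: String, b: String)

  private var counter = 0
  private def freshTemp(): String = { counter += 1; s"t$counter" }

  // Lowering: flatten the tree into a linear instruction sequence and return the
  // name (temporary, variable, or literal) that holds the expression's value.
  def lower(e: Expr): (String, List[TAC]) = e match {
    case Const(v)  => (v.toString, Nil)
    case Var(x)    => (x, Nil)
    case Add(l, r) => binOp("add", l, r)
    case Mul(l, r) => binOp("mul", l, r)
  }

  private def binOp(op: String, l: Expr, r: Expr): (String, List[TAC]) = {
    val (la, lc) = lower(l)
    val (ra, rc) = lower(r)
    val dst      = freshTemp()
    (dst, lc ++ rc :+ TAC(dst, op, la, ra))
  }

  // Naive register allocation: every temporary gets its own register. A realistic
  // allocator would compute liveness and use graph coloring or linear scan instead.
  def allocate(code: List[TAC]): Map[String, String] =
    code.map(_.dst).distinct.zipWithIndex.map { case (t, i) => t -> s"r$i" }.toMap

  def main(args: Array[String]): Unit = {
    val ast            = Add(Mul(Var("x"), Const(4)), Var("y")) // x * 4 + y
    val (result, code) = lower(ast)
    val regs           = allocate(code)
    def show(n: String) = regs.getOrElse(n, n)
    code.foreach(i => println(s"${show(i.dst)} = ${i.op} ${show(i.a)}, ${show(i.b)}"))
    println(s"result held in ${show(result)}")
  }
}
```

Running it on the expression x * 4 + y prints "r0 = mul x, 4" followed by "r1 = add r0, y". A production backend would additionally schedule the instructions for the target pipeline and spill values to memory when the machine runs out of registers.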

Related courses (4)
MICRO-315: Embedded Systems and Robotics
This course covers embedded systems programming: cross-compilation, the use of an FPU in microcontrollers, the use of DSP instructions, and the mechanisms available in the ...
CS-320: Computer language processing
We teach the fundamental aspects of analyzing and interpreting computer languages, including the techniques to build compilers. You will build a working compiler from an elegant functional language in
PENS-230: Digital ENAC: le codage en contexte
Digital ENAC aims to provide students with the ability to apply the principles of coding to the practical life of designers and engineers. We will not focus on a specific coding language, but will ext
Related lectures (33)
Name Analysis: Compiler Phases and Symbol Tables
Explores compiler phases and symbol tables' role in mapping variables to declarations.
DBMS Processing Models
Explores DBMS processing models, operator boundaries, fusion, and code generation using LLVM.
Materialization Problems in Database Systems
Discusses materialization challenges in databases, query execution strategies, and performance implications.
Related publications (44)

Scalable Metaprogramming in Scala 3

Nicolas Alexander Stucki

A metaprogrammer should be able to reason about the semantics of the generated code. Multi-stage programming introduced an elegant and powerful solution to this problem. It follows a semantically driven approach to code generation, where semantics are fully ...
EPFL, 2023

Silent Bugs Matter: A Study of Compiler-Introduced Security Bugs

Mathias Josef Payer, Jianhao Xu

Compilers assure that any produced optimized code is semantically equivalent to the original code. However, even "correct" compilers may introduce security bugs as security properties go beyond translation correctness. Security bugs introduced by such corr ...
Berkeley, 2023

MOD2IR: High-Performance Code Generation for a Biophysically Detailed Neuronal Simulation DSL

Felix Schürmann, Pramod Shivaji Kumbhar, Omar Awile, Ioannis Magkanaris

Advances in computational capabilities and large volumes of experimental data have established computer simulations of brain tissue models as an important pillar in modern neuroscience. Alongside, a variety of domain specific languages (DSLs) have been dev ...
ACM, 2023
Related concepts (17)
LLVM
LLVM is a set of compiler and toolchain technologies that can be used to develop a frontend for any programming language and a backend for any instruction set architecture. LLVM is designed around a language-independent intermediate representation (IR) that serves as a portable, high-level assembly language that can be optimized with a variety of transformations over multiple passes. The name LLVM originally stood for Low Level Virtual Machine, though the project has expanded and the name is no longer officially an initialism.
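The decoupling described above (several frontends emitting one shared IR, several backends consuming it) can be sketched in a few lines of Scala. This is a conceptual illustration only, not LLVM's actual API; every name below (SharedIRSketch, Frontend, Backend, LoadConst, BinOp) is invented for this page.

```scala
// Conceptual sketch of a language-independent IR decoupling frontends from backends.
// Not LLVM's API: every name below is invented for illustration.
object SharedIRSketch {

  // A deliberately tiny stand-in for a portable intermediate representation.
  sealed trait IR
  final case class LoadConst(dst: String, value: Int)                   extends IR
  final case class BinOp(dst: String, op: String, a: String, b: String) extends IR

  // Adding a new source language means writing one Frontend; adding a new
  // instruction set means writing one Backend. The IR in the middle is shared.
  trait Frontend { def toIR(source: String): List[IR] }
  trait Backend  { def emit(ir: List[IR]): List[String] }

  // One example backend that renders the shared IR as assembly-like text.
  object TextualBackend extends Backend {
    def emit(ir: List[IR]): List[String] = ir.map {
      case LoadConst(d, v)    => s"$d = const $v"
      case BinOp(d, op, a, b) => s"$d = $op $a, $b"
    }
  }

  def main(args: Array[String]): Unit = {
    val output = TextualBackend.emit(List(LoadConst("t0", 4), BinOp("t1", "mul", "x", "t0")))
    output.foreach(println)
  }
}
```

The point of the sketch is the shape rather than the detail: supporting a new source language means adding one Frontend, and supporting a new instruction set means adding one Backend, while the IR in the middle stays fixed.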
Program optimization
In computer science, program optimization, code optimization, or software optimization is the process of modifying a software system to make some aspect of it work more efficiently or use fewer resources. In general, a computer program may be optimized so that it executes more rapidly, so that it is capable of operating with less memory storage or other resources, or so that it draws less power. Although the word "optimization" shares the same root as "optimal", it is rare for the process of optimization to produce a truly optimal system.
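As one concrete illustration of such a rewrite, the hedged Scala sketch below performs constant folding: subexpressions whose operands are already known at compile time are evaluated once by the compiler instead of on every execution. The AST and function names are invented for this page and mirror the code-generation sketch shown earlier.

```scala
// Minimal sketch of a single optimization pass (constant folding) over a tiny
// expression AST. All names are invented for illustration.
object ConstantFolding {

  sealed trait Expr
  final case class Const(value: Int)         extends Expr
  final case class Var(name: String)         extends Expr
  final case class Add(lhs: Expr, rhs: Expr) extends Expr
  final case class Mul(lhs: Expr, rhs: Expr) extends Expr

  // Evaluate subtrees whose operands are compile-time constants, so the generated
  // code does less work at run time. The rewrite preserves the program's meaning.
  def fold(e: Expr): Expr = e match {
    case Add(l, r) => (fold(l), fold(r)) match {
      case (Const(a), Const(b)) => Const(a + b)
      case (fl, fr)             => Add(fl, fr)
    }
    case Mul(l, r) => (fold(l), fold(r)) match {
      case (Const(a), Const(b)) => Const(a * b)
      case (fl, fr)             => Mul(fl, fr)
    }
    case leaf => leaf // constants and variables are already as simple as possible
  }

  def main(args: Array[String]): Unit = {
    val before = Add(Mul(Const(2), Const(3)), Var("x")) // (2 * 3) + x
    println(fold(before))                               // Add(Const(6),Var(x))
  }
}
```

Folding (2 * 3) + x yields 6 + x, so the multiplication never reaches the generated code; the result is better, but not globally optimal, which matches the remark above that optimization rarely produces a truly optimal system.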
Source-to-source compiler
A source-to-source translator, source-to-source compiler (S2S compiler), transcompiler, or transpiler is a type of translator that takes the source code of a program written in a programming language as its input and produces an equivalent source code in the same or a different programming language. A source-to-source translator converts between programming languages that operate at approximately the same level of abstraction, while a traditional compiler translates from a higher level programming language to a lower level programming language.
Related MOOCs (15)
Parallelism and Concurrency
(merge of parprog1, scala-reactive, scala-spark-big-data)
Functional Programming
In this course you will discover the elements of the functional programming style and learn how to apply them usefully in your daily programming tasks. You will also develop a solid foundation for rea
Functional Programming Principles in Scala [retired]
This advanced undergraduate programming course covers the principles of functional programming using Scala, including the use of functions as values, recursion, immutability, pattern matching, higher-
