Software deployment is all of the activities that make a software system available for use.
The general deployment process consists of several interrelated activities with possible transitions between them. These activities can occur on the producer side or on the consumer side or both. Because every software system is unique, the precise processes or procedures within each activity can hardly be defined. Therefore, "deployment" should be interpreted as a general process that has to be customized according to specific requirements or characteristics.
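As a minimal sketch of how such customization might look in practice (the activity names, file paths, and entry point below are hypothetical, not part of any standard deployment interface), the interrelated activities can be modeled as small, composable steps:

    # Illustrative sketch only: the activities (release, install, activate) and all
    # paths are hypothetical, chosen to show a deployment process composed of
    # discrete, customizable steps on the producer and consumer sides.
    import shutil
    import subprocess
    from pathlib import Path

    def release(build_dir: Path, artifact: Path) -> None:
        """Package the built system into a distributable archive (producer side)."""
        artifact.parent.mkdir(parents=True, exist_ok=True)
        shutil.make_archive(str(artifact.with_suffix("")), "zip", build_dir)

    def install(artifact: Path, target_dir: Path) -> None:
        """Unpack the archive on the consumer side."""
        shutil.unpack_archive(artifact, target_dir)

    def activate(target_dir: Path) -> None:
        """Start the installed system via a hypothetical entry point."""
        subprocess.run(["python", str(target_dir / "main.py")], check=True)

    if __name__ == "__main__":
        release(Path("build"), Path("dist/app.zip"))
        install(Path("dist/app.zip"), Path("/opt/app"))
        activate(Path("/opt/app"))

A real process would add further activities such as update, deactivation, and uninstallation, with transitions between them chosen to fit the specific system's requirements.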
When computers were extremely large, expensive, and bulky (mainframes and minicomputers), the software was often bundled together with the hardware by manufacturers. If business software needed to be installed on an existing computer, this might require an expensive, time-consuming visit by a systems architect or a consultant. For complex, on-premises installation of enterprise software today, this can still sometimes be the case.
However, with the development of mass-market software for the new age of microcomputers in the 1980s came new forms of software distribution: first cartridges, then Compact Cassettes, then floppy disks, then (in the 1990s and later) optical media, the internet, and flash drives. This meant that software deployment could be left to the customer. However, it was also increasingly recognized over time that configuration of the software by the customer was important and that this should ideally have a user-friendly interface (rather than, for example, requiring the customer to edit registry entries on Windows).
In the pre-internet era, deployments (and their closely related cousin, new software releases) were of necessity expensive, infrequent, bulky affairs. It is arguable therefore that the spread of the internet made end-to-end agile software development possible. Indeed, the advent of cloud computing and software as a service meant that software could be deployed to a large number of customers in minutes, over the internet.
This course covers topics related to modern, industrial software architecture: agile project management, requirements specification, the development of critical applications, programming ...
Technology is a driver of long-term welfare. Yet it also sometimes threatens sustainable development. This course investigates the links between technology and sustainable development, models causes o ...
Human Factors Engineering theory and research methods will be covered through an interdisciplinary focus on human cognition, behavior and physiology (ergonomics) in the design, development and evaluat ...
In software engineering, continuous integration (CI) is the practice of merging all developers' working copies to a shared mainline several times a day. Nowadays it is typically implemented in such a way that it triggers an automated build with testing. Grady Booch first proposed the term CI in his 1991 method, although he did not advocate integrating several times a day. Extreme programming (XP) adopted the concept of CI and did advocate integrating more than once per day – perhaps as many as tens of times per day.
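As a hedged illustration (the branch name, test command, and push step are assumptions, not mandated by CI itself), the practice of integrating a working copy into the shared mainline with an automated, test-gated build can be sketched as:

    # Minimal CI sketch: merge the shared mainline into the working copy, run the
    # automated build/test step, and only then publish. Branch name and test
    # command are assumptions for the example.
    import subprocess
    import sys

    def run(*cmd: str) -> None:
        """Run a command; abort the integration if it fails."""
        if subprocess.run(cmd).returncode != 0:
            sys.exit(f"CI step failed: {' '.join(cmd)}")

    def integrate() -> None:
        run("git", "fetch", "origin")
        run("git", "merge", "origin/main")          # bring in the shared mainline
        run("python", "-m", "pytest")               # automated build/test gate
        run("git", "push", "origin", "HEAD:main")   # publish the integrated copy

    if __name__ == "__main__":
        integrate()

Running such a script (or an equivalent job on a CI service) on every merge is what makes integrating several, even tens of, times per day practical.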
In software engineering, a software development process is a process of planning and managing software development. It typically involves dividing software development work into smaller, parallel, or sequential steps or sub-processes to improve design and/or product management. It is also known as a software development life cycle (SDLC). The methodology may include the pre-definition of specific deliverables and artifacts that are created and completed by a project team to develop or maintain an application.
Continuous delivery (CD) is a software engineering approach in which teams produce software in short cycles, ensuring that the software can be reliably released at any time: changes flow through a pipeline into a "production-like environment" without being deployed manually. It aims at building, testing, and releasing software with greater speed and frequency. The approach helps reduce the cost, time, and risk of delivering changes by allowing for more incremental updates to applications in production.
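A hedged sketch of such a pipeline (the stage names, commands, and deploy.py script are illustrative assumptions rather than a standard CD interface): each change moves through build, automated tests, and a production-like environment before it is released.

    # Illustrative CD pipeline sketch: stages run in order, and check=True stops
    # the pipeline on the first failure, so every change that reaches the end is
    # releasable. Commands and environment names are assumptions.
    import subprocess

    PIPELINE = [
        ("build",   ["python", "-m", "build"]),
        ("test",    ["python", "-m", "pytest"]),
        ("staging", ["python", "deploy.py", "--env", "production-like"]),  # hypothetical script
        ("release", ["python", "deploy.py", "--env", "production"]),       # hypothetical script
    ]

    def run_pipeline() -> None:
        for stage, cmd in PIPELINE:
            print(f"[{stage}] {' '.join(cmd)}")
            subprocess.run(cmd, check=True)

    if __name__ == "__main__":
        run_pipeline()

Because every stage is automated and ordered, the pipeline fails fast on the first broken step, which is what keeps each incremental update cheap and low-risk.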
Covers the principles and tools for reproducible research in biostatistics, emphasizing the importance of complete documentation and the use of text editors for compiling source documents.
Explores the integration of security practices within the DevOps culture, emphasizing the importance of adding security measures throughout the software development lifecycle.
Introduces the fundamentals of multiprocessor architecture, covering post-Moore servers, sustainable datacenters, parallel programming, and GPU utilization.
The real-time and accurate inference of model parameters is of great importance in many scientific and engineering disciplines that use computational models (such as a digital twin) for the analysis and prediction of complex physical processes. However, f ...
2022
The paper details the process of developing the ITER Plasma Control System (PCS), that is, how to design and deploy it systematically, in the most efficient and effective manner. The integrated nature of the ITER PCS, with its multitude of coupled control ...
Elsevier Science SA, 2024
With the recent advent of blockchains, we have witnessed a plethora of blockchain proposals. These proposals range from using work to using time, storage or stake in order to select blocks to be appended to the chain. As a drawback, it makes it difficult fo ...