
Computer scientist

Summary

A computer scientist is a scholar who specializes in the academic study of computer science.
Computer scientists typically work on the theoretical side of computation, as opposed to the hardware side on which computer engineers mainly focus (although there is overlap). Although computer scientists can also focus their work and research on specific areas (such as algorithm and data structure development and design, software engineering, information theory, database theory, computational complexity theory, numerical analysis, programming language theory, computer graphics, and computer vision), their foundation is the theoretical study of computing from which these other fields derive.
A primary goal of computer scientists is to develop or validate models, often mathematical, that describe the properties of computational systems (processors, programs, computers interacting with people, computers interacting with other computers, etc.), with the overall objective of discovering designs that yield useful benefits (faster, smaller, cheaper, more precise, etc.).

Official source

This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.


Related courses (6)

DH-405: Foundations of digital humanities

This course gives an introduction to the fundamental concepts and methods of the Digital Humanities, both from a theoretical and applied point of view. The course introduces the Digital Humanities circle of processing and interpretation, from data acquisition to new understandings.

CS-307: Introduction to multiprocessor architecture

Multiprocessors are a core component in all types of computing infrastructure, from phones to datacenters. This course will build on the prerequisites of processor design and concurrency to introduce the essential technologies required to combine multiple processing elements into a single computer.

MATH-329: Continuous optimization

This course introduces students to continuous, nonlinear optimization. We study the theory of optimization with continuous variables (with full proofs), and we analyze and implement important algorithms to solve constrained and unconstrained problems.

Related concepts (39)

Computer science

Computer science is the study of computation, information, and automation. Computer science spans theoretical disciplines (such as algorithms, theory of computation, and information theory) to applied disciplines (including the design and implementation of hardware and software).

Alan Turing

Alan Mathison Turing (ˈtjʊərɪŋ; 23 June 1912 – 7 June 1954) was an English mathematician, computer scientist, logician, cryptanalyst, philosopher, and theoretical biologist. Turing was highly influential in the development of theoretical computer science, providing a formalisation of the concepts of algorithm and computation with the Turing machine.

Computational complexity theory

In theoretical computer science and mathematics, computational complexity theory focuses on classifying computational problems according to their resource usage, and relating these classes to each other.

Related lectures (7)

The conventional CAD/CAM approach to design does not show the essential spatial relationships between user and product that are crucial for intuitive design analysis. As populations age and the home appliance market stagnates, Universal Design principles implemented with computerized virtual worlds become more important for meeting the ergonomic problems of heterogeneous populations that are increasingly difficult to test adequately with real-world subjects. Digital Human Modelling (DHM) is an emerging area that bridges computer-aided engineering design, human factors engineering and applied ergonomics. The most advanced forms of this technology are being used by many researchers for practical applications, including ergonomic analysis. However, a state-of-the-art model of this technology has never been conceived for the conceptual design stage of a product development cycle, as most conventional DHM techniques lack real-time interaction, require considerable user intervention, and have inefficient control facilities and inadequate validation techniques, all contributing to slow production pipelines. They have also not addressed the needs of the growing ageing population in many societies across the globe.

The focus of this dissertation is to introduce a complete framework for ergonomic simulation at the conceptual design stage of a product development cycle, based on parametric virtual humans in a prioritized inverse kinematics framework while taking biomechanical knowledge into account. Using an intuitive control facility, design engineers can input a simple CAD model, design variables and human factors into the system. The evaluation engine generates the required simulation in real time by making use of an Anthropometric Database, a Physical Characteristic Database and a Prioritized Inverse Kinematics architecture. The key components of the total system are described and the results are demonstrated with a few applications, such as kitchen, wash-basin and bath-tub scenarios.

By introducing a quantitative ageing-estimation algorithm for anthropometric digital human models, products can be designed from the start to suit the ergonomic needs of the user rather than the biases and assumptions of the designer. Also, by creating a tool that can be used intuitively by non-specialists in a dynamic, real-time environment, designers can stop relying on specialists to test the safety of their ideas and start to use data about populations effectively to discover designs that can be used more easily by more people. Results have been validated with real human subjects, demonstrating the practical applicability of the total system as an ergonomic design tool for the conceptual design stage of a product development cycle.
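The prioritized inverse kinematics at the core of such a framework can be illustrated with the standard nullspace-projection scheme: a primary task (reach a target) is solved with the Jacobian pseudoinverse, and a secondary task (a preferred posture) is projected into the primary task's nullspace so it never disturbs the reaching motion. This is a generic textbook sketch for a planar arm using NumPy, not the dissertation's actual architecture; all function names and parameters are illustrative assumptions.

```python
import numpy as np

def fk(q, lengths):
    """Forward kinematics: end-effector position of a planar n-link arm."""
    cum = np.cumsum(q)  # absolute link angles
    return np.array([np.sum(lengths * np.cos(cum)),
                     np.sum(lengths * np.sin(cum))])

def jacobian(q, lengths):
    """2 x n Jacobian of the end-effector position w.r.t. joint angles."""
    n = len(q)
    cum = np.cumsum(q)
    J = np.zeros((2, n))
    for i in range(n):
        for k in range(i, n):  # joint i moves every link at or beyond it
            J[0, i] -= lengths[k] * np.sin(cum[k])
            J[1, i] += lengths[k] * np.cos(cum[k])
    return J

def prioritized_ik_step(q, lengths, target, posture, alpha=0.5):
    """One prioritized update: task 1 (reach target) via the pseudoinverse,
    task 2 (preferred posture) projected into task 1's nullspace."""
    J1 = jacobian(q, lengths)
    J1p = np.linalg.pinv(J1)
    dq1 = J1p @ (target - fk(q, lengths))  # primary task
    N1 = np.eye(len(q)) - J1p @ J1         # nullspace projector of task 1
    dq2 = N1 @ (posture - q)               # secondary task, cannot disturb task 1
    return q + alpha * (dq1 + dq2)
```

Iterating `prioritized_ik_step` drives the end-effector to the target, while the joints drift toward the preferred posture only insofar as the primary task's nullspace allows.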


This chapter shows some of the practices engineers use when they are confronted with completely new situations, when they enter an emerging field where methods and paradigms are not yet stabilized. Following the engineers here helps to shed light on their practices when they are confronted with new fields and new interlocutors. This is the case for engineers and computer scientists who engage with the human and social sciences to imagine, design, develop and implement digital humanities (DH) with specific hardware, software and infrastructure.

This thesis focuses on the analysis of the trajectories of a mobile agent. It presents different techniques to acquire a quantitative measure of the difference between two trajectories or two trajectory datasets. A novel approach is presented here, based on the Point Distribution Model (PDM). This model was developed by computer vision scientists to compare deformable shapes. This thesis presents the mathematical reformulation of the PDM to fit spatiotemporal data, such as trajectory information. The behavior of a mobile agent can rarely be represented by a unique trajectory, as its stochastic component would not be taken into account. Thus, the PDM focuses on the comparison of trajectory datasets. If the difference between datasets is greater than the variation within each dataset, it will be observable in the first few dimensions of the PDM. Moreover, this difference can be quantified using the inter-cluster distance defined in this thesis. The resulting measure is much more efficient than the visual comparisons of trajectories often made in the existing scientific literature. This thesis also compares the PDM with standard techniques, such as statistical tests, Hidden Markov Models (HMMs) and Correlated Random Walk (CRW) models. As a PDM is a linear transformation of space, it is much simpler to comprehend. Moreover, spatial representations of the deformation modes can easily be constructed in order to make the model more intuitive. This thesis also presents the limits of the PDM and offers other solutions where it is not adequate. From the different results obtained, it can be concluded that no universal solution exists for the analysis of trajectories; however, solutions were found and described for all of the problems presented in this thesis. As the PDM requires that all trajectories consist of the same number of points, resampling techniques were studied.

The main solution was developed for trajectories generated on a track, such as the trajectory of a car on a road or the trajectory of a pedestrian in a hallway. The different resampling techniques presented in this thesis provide solutions to all the experimental setups studied, and can easily be modified to fit other scenarios. It is, however, very important to understand how they work and to tune their parameters according to the characteristics of the experimental setup. The main principle of this thesis is that analysis techniques and data representations must be selected appropriately with respect to the fundamental goal. Even a simple tool such as the t-test can occasionally be sufficient to measure trajectory differences. However, if no dissimilarity can be observed, it does not necessarily mean that the trajectories are equal; it merely indicates that the analyzed feature is similar. Alternatively, other more complex methods could be used to highlight differences. Ultimately, two trajectories are equal if and only if they consist of the exact same sequence of points; otherwise, a difference can always be found. Thus, it is important to know which trajectory features have to be compared. Finally, the diverse techniques used in this thesis offer a complete methodology for analyzing trajectories.
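The PDM pipeline described above can be sketched roughly as follows: resample every trajectory to a common number of points, flatten each into a vector, extract deformation modes with PCA, and compare two datasets by the distance between their means in mode space. This is a hypothetical minimal reconstruction, not the thesis's actual implementation; the function names, the arc-length resampling, and the mean-to-mean definition of inter-cluster distance are illustrative assumptions.

```python
import numpy as np

def resample(traj, n):
    """Resample a 2-D trajectory to n points, uniformly spaced in arc length."""
    traj = np.asarray(traj, dtype=float)
    seg = np.linalg.norm(np.diff(traj, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])       # cumulative arc length
    s_new = np.linspace(0.0, s[-1], n)
    return np.column_stack([np.interp(s_new, s, traj[:, d])
                            for d in range(traj.shape[1])])

def pdm_modes(trajs, n_points=20):
    """Build a PDM: resample, flatten each trajectory, PCA via SVD."""
    X = np.array([resample(t, n_points).ravel() for t in trajs])
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt  # rows of Vt are the principal deformation modes

def project(trajs, mean, Vt, n_modes, n_points=20):
    """Coordinates of each trajectory in the first n_modes deformation modes."""
    X = np.array([resample(t, n_points).ravel() for t in trajs])
    return (X - mean) @ Vt[:n_modes].T

def inter_cluster_distance(A, B):
    """One plausible definition: distance between dataset means in mode space."""
    return np.linalg.norm(A.mean(axis=0) - B.mean(axis=0))
```

On synthetic data, two datasets drawn from different underlying tracks (say, noisy straight lines versus noisy sine curves) separate clearly in the first few modes, while two halves of the same dataset project close together.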