# Bell Labs

Summary

Nokia Bell Labs, better known as Bell Labs, were founded in 1925 and based in Murray Hill, in the US state of New Jersey. In 2009 they formed part of the research and development arm of Alcatel-Lucent, which was acquired by Nokia in 2016.
By 2012, Bell Labs had filed more than … . Research conducted at Bell Labs has been of capital importance in fields such as telecommunications (telephone networks, television transmission, satellite communications) and computing (Unix, C and C++). The transistor, the photoelectric cell, the laser and the development of fiber-optic communications also originated at Bell Labs.
History
Background
Bell's Volta Laboratory in Washington, D.C. (1893).
The A. G. Bell Laboratory, also known as the Volta Bureau, was created in Washington at the request of Alexander Graham Bell. In 1880, the government …


Related concepts (235)

Unix

Unix, officially UNIX, is a family of multitasking, multi-user operating systems derived from the original Unix created by AT&T, whose development began at the end of the 1960s.

Transistor

A few transistor models.
The transistor is a semiconductor electronic component used to control or amplify electrical voltages and currents. It is the component …

Electrical engineering

Electrical engineering relates to … . It concerns, for example, the production, transmission, distribution, processing, transformation, management and use of electrical energy. Sometimes …

In 1948, Claude Elwood Shannon, then at Bell Labs, published one of the groundbreaking papers in the history of engineering [1]. This paper ("A Mathematical Theory of Communication", Bell System Tech. Journal, Vol. 27, July and October 1948, pp. 379-423 and pp. 623-656) laid the groundwork for an entirely new scientific discipline, Information Theory, which for the first time enabled engineers to deal quantitatively with the elusive concept of information. In this celebrated work, Shannon cleanly laid the foundation for the transmission and storage of information. Using a probabilistic model, his theory gave insight into the achievable limits of information transfer over a perturbed medium called a channel; the very same concepts are used to predict the limits of data compression and the achievable transmission rate over a probabilistic channel.

These underlying concepts can be expressed as inequalities involving measures of probability distributions, several of which Shannon defined in his original work. The field of Information Theory grew as researchers found more results and insights into the fundamental problem of transmission and storage of information using probabilistic models. By the nature of the subject, the results obtained are usually inequalities involving Shannon's basic measures, such as entropies; some are elementary, some rather complicated expressions. Proving further theorems likewise requires checking whether certain expressions hold in an information-theoretic sense, which motivated researchers to seek a formal method for checking such inequalities. In 1998, Raymond Yeung [2] came out with a remarkable framework that can verify many of the inequalities in this field, namely all inequalities derivable from the properties of the basic Shannon measures. A central notion of Information Theory is entropy, which Shannon defined as a measure of information itself.
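Shannon entropy is straightforward to compute for a discrete distribution. A minimal sketch of the standard definition H(p) = −Σ p·log2 p (the function name is ours, unrelated to any software mentioned here):

```python
import math

def entropy(dist):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# A fair coin carries exactly one bit of information.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin carries less.
print(entropy([0.9, 0.1]))   # ~0.469
```

The `if p > 0` guard implements the usual convention 0·log 0 = 0, so degenerate distributions are handled without a domain error.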
Given a set of jointly distributed random variables X1, X2, ..., Xn, we can consider the entropies of all single random variables H(Xi), the entropies of all pairs H(Xi, Xj), and so on: 2^n − 1 entropy values in total, one for each nonempty subset of {X1, X2, ..., Xn}. Every n-tuple of random variables thus yields a point in R^(2^n − 1) representing the entropies of the given distribution. Following [2], we call a point in R^(2^n − 1) constructible if it represents the entropy values of some collection of n random variables. The set of all constructible points is denoted by Γ*_n. It is hard to characterize Γ*_n for arbitrary n (for n ≥ 3 it is not even closed [?]). A more feasible (but still highly non-trivial) problem is to describe the closure Γ̄*_n of the set Γ*_n. The set Γ̄*_n is a convex cone [?], and to characterize it we must describe the class of all linear inequalities of the form

λ1 H(X1) + ... + λn H(Xn) + λ{1,2} H(X1, X2) + ... + λ{1,2,3} H(X1, X2, X3) + ... + λ{1,2,...,n} H(X1, X2, ..., Xn) ≥ 0

which hold for any random variables X1, X2, ..., Xn (the λ's being real coefficients). Information inequalities are widely used for proving converse coding theorems in Information Theory, and interesting applications beyond Information Theory have recently been found [10], [12], [14], so investigating the class of all valid information inequalities is an interesting problem. We refer the reader to [15] for a comprehensive treatment of the subject. Yeung's framework makes it possible to verify all Shannon-type inequalities, and Yeung and Yan also developed software to verify such inequalities computationally. Since that software is rather outdated, we have made an attempt at a more efficient and user-friendly implementation, building on the original work of Yeung. The software, which we call the information inequality solver (iis), is freely available for download from the EPFL website.
The new software suite has the added advantage of being free of dependencies on licensed products such as Matlab (or its toolboxes).
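As a small illustration of the objects such a solver works with, the sketch below builds the entropy vector (all 2^n − 1 entropy values) of two jointly distributed random variables and checks two elementary Shannon-type inequalities at that point. The joint distribution and names are illustrative; this is not the iis implementation:

```python
import itertools
import math

def H(joint, subset):
    """Entropy of the marginal distribution on the coordinates in `subset`.
    `joint` maps outcome tuples to probabilities."""
    marg = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in subset)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

# An illustrative joint distribution of (X1, X2) on {0,1} x {0,1}.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

n = 2
# The 2^n - 1 entropy values, one per nonempty subset of {X1, ..., Xn}.
vector = {S: H(joint, S)
          for k in range(1, n + 1)
          for S in itertools.combinations(range(n), k)}

# Two basic Shannon-type inequalities, verified at this point of R^(2^n - 1):
assert vector[(0, 1)] <= vector[(0,)] + vector[(1,)]      # subadditivity
assert vector[(0, 1)] >= max(vector[(0,)], vector[(1,)])  # monotonicity
print(vector)
```

A solver such as iis works in the opposite direction: instead of checking one distribution, it decides whether a linear expression in these entropies is nonnegative for every constructible point.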

2007

Related courses (48)

This course covers the fundamentals of digital systems. Building on Boolean algebra and on combinational and sequential circuits, including finite-state machines, methods for the analysis and synthesis of logic systems are studied and applied.

The aim of the course is to familiarize students with the basic notions of law and ethics applicable to research in STV and to its transfer into applications, and to provide them with the essential elements for identifying the legal and ethical issues in their future professional practice.

Introduction to the different contrast-enhancing methods in optical microscopy. Basic hands-on experience with optical microscopes at EPFL's BioImaging and Optics Facility. How to investigate biological samples? How to obtain high-quality images?

Only recently has organic light-emitting diode (OLED) technology successfully managed the transition from research labs into the consumer market, taking a 60% share of the global mobile display market in 2018. The recent discovery of thermally activated delayed fluorescence (TADF) has attracted a lot of attention in research and industry due to the potential to fabricate fluorescence-based OLEDs with high efficiencies comparable to the currently used phosphorescence-based OLEDs, but with the advantage of possibly cheaper and more sustainable emitter materials (no Ir or Pt complexes).
To achieve high efficiencies in OLEDs, a substantial number of the layers and interfaces of the multilayer stack have to be optimized. A particularly important role is assigned to the emission layer, within which light is generated by charge recombination and the subsequent energy transfer and radiative decay of excitons. Understanding charge recombination and exciton dynamics and determining the position of light generation are essential for the fabrication of modern OLEDs and are the goal of this thesis. Therefore, two different OLED types, phosphorescence-based OLEDs and state-of-the-art TADF exciplex host OLEDs incorporating a fluorescent emitter, are studied by electro-optical characterization and device modelling.
In a first step, the emission zones are determined and analyzed by angle-dependent steady-state measurements at different biases together with optical simulations. In both OLED types, split emission zones are obtained, with densities of emissive excitons that decay away from both emission-layer interfaces toward the center. For the phosphorescence-based OLEDs, an additional bias dependence of the split emission zone is observed: at low bias the main emission is located at the cathode side, and it shifts to the anode side with increasing bias. In a second step, transient EL decay measurements and electro-optical simulations correlate the split emission zones with an EL peak appearing after OLED turn-off. To study the influence of the emission zone and the exciton dynamics on the OLED efficiency, an electro-optical device model is established to reproduce the experimentally obtained measurement data. As the model includes charge-carrier dynamics, light outcoupling and time- and position-dependent exciton processes, such as formation, diffusion, transfer, decay and quenching, the physical mechanisms in the OLEDs are elucidated. For the phosphorescence-based OLED, a surprising current-efficiency increase of up to 60% with increasing bias, as well as a subsequent decrease, is explained by the shift of the emission zone and its influence on exciton quenching and light outcoupling. Similarly, for the TADF exciplex host OLEDs, a model parameter study illustrates promising EQE-enhancement routes, which could lead to EQEs as high as 42%.
This thesis emphasizes the need for accurate knowledge of the emission zone and its bias dependence, given its potentially strong influence on OLED efficiency and its importance for the optimization of the OLED layer stack. In addition, it shows that full electro-optical device modelling (including electrons, excitons and photons), combined with advanced electro-optical characterization techniques, is crucial for elucidating the physical mechanisms in state-of-the-art OLEDs and for predicting promising routes toward future efficiency enhancements.

Related lectures (60)

Aristide Ramondgwendé Ouédraogo

Risks in academic research and education, as well as the diversity of risk types, have grown along with the development of research. Moreover, academia brings important worsening factors affecting risk, such as the high turnover of collaborators, students who are neither salaried nor trained for lab work, and a high concentration of research labs with diverse hazards. A problem frequently facing safety personnel in research and teaching labs is determining how serious each known hazard is, and deciding to what extent resources should be concentrated on correcting the situation. Questions such as which risks must be assessed first, and how to prioritize them, are commonly asked. In response to this preoccupation, we developed a new risk-ranking index, the Laboratory Criticity Index (LCI), under the so-called Laboratory Assessment and Risk Analysis (LARA) methodology. The proposed methodology is based on multi-criteria modeling, combining two approaches: the Risk Priority Number / Failure Mode, Effects and Criticality Analysis (RPN-FMECA) and the Analytic Hierarchy Process (AHP). The joint project between the two polytechnic schools (EPFL and ETHZ) consisted in formulating and building up a methodology for assessing risks in the research environment. It concentrates on two major areas in which safety plays a role: research and education in chemistry and physics. In research and teaching labs, and specifically for sciences involving chemistry, physics or biology, risk can be defined, as a first step, as a subtle relationship between the probability of a hazard and the severity it induces. This first step is not sufficient: other factors, such as worsening factors, exposure factors and cost factors, also play an important role in defining risk. To calculate the risk level, and hence the LCI index, several parameters characterizing the site and its processes are identified.
These include a description of the process, hazard identification, risk perception (RP), measurement of hazard impact (Ih), probability of occurrence of an accident (POA), research specificities (RS), hazard detectability (HD) and implementation of corrective actions (CM). Based on these independent variables, the LCI becomes a single, unique index: a one-dimensional quantification and prioritization of risks that rapidly locates a risk in the acceptable, tolerable or unacceptable region. Through this project, a database was developed inventorying all hazards present in the laboratories, the related processes in which they are involved, the sources of the hazards, their causes and consequences, and the worsening factors. The parameters for the LCI calculation of each hazard are also stored. The LCI is implemented in the database, and a web application was created to systematize the use of this information. As a unified and interactive tool built on a multidisciplinary approach, it provides a way to perform a preliminary and rapid analysis of hazards and risks, helping and guiding safety teams in research and teaching labs. The developed methodology is simple, easy to perform, robust, user-friendly and intuitive, and it delivers a prioritization of risks for corrective actions. The ranking helps to clearly identify critical areas in research and teaching labs. Associating risk levels with geographical zones on site is essential for good safety management, and risks in over 800 research labs will be analyzed using this methodology.
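The abstract does not give the actual LCI formula, but the general shape of such a multi-criteria index, a weighted combination of criterion scores mapped onto acceptability zones, can be sketched as follows. All criteria names, weights, scales and thresholds below are illustrative assumptions, not the published LARA parameters:

```python
# Hypothetical sketch of a multi-criteria risk index in the spirit of LCI.
# Weights and thresholds are assumptions for illustration only.

CRITERIA_WEIGHTS = {          # AHP-style weights, chosen here to sum to 1
    "severity": 0.30,
    "probability": 0.25,
    "exposure": 0.15,
    "detectability": 0.15,
    "worsening_factors": 0.15,
}

# (upper bound, label) pairs, checked in order.
THRESHOLDS = [(3.0, "acceptable"), (6.0, "tolerable")]

def risk_index(scores):
    """Weighted sum of criterion scores (each assumed on a 1-10 scale)."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

def classify(index):
    """Map an index value to an acceptability zone."""
    for bound, label in THRESHOLDS:
        if index <= bound:
            return label
    return "unacceptable"

# Example hazard, scored on the assumed criteria.
hazard = {"severity": 8, "probability": 4, "exposure": 6,
          "detectability": 5, "worsening_factors": 3}
idx = risk_index(hazard)
print(f"index = {idx:.2f} -> {classify(idx)}")   # index = 5.50 -> tolerable
```

In an AHP-based method, the weights would come from pairwise comparisons of the criteria rather than being fixed by hand as in this sketch.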