Publication

Modeling of the emittance growth due to decoherence in collision at the Large Hadron Collider

Abstract

The transverse emittance growth rate of colliding hadron beams driven by external sources of noise is investigated based on an existing analytical model, macro-particle simulations, and comparisons to experimental data from the Large Hadron Collider (LHC). It is shown that an analytical description of the emittance growth rate that neglects the coherent beam-beam modes can nevertheless provide accurate estimates for operational conditions, notably those featuring a high chromaticity. The model is used to investigate the level of noise experienced by the LHC beams. The results indicate that a significant reduction of the noise floor of the transverse feedback's beam position monitor is required for operation with a large beam-beam tune shift, such as that anticipated for the High-Luminosity LHC (HL-LHC).
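As background, the textbook estimate for noise-driven emittance growth, absent any decoherence or feedback suppression (so a simplified baseline, not the full model of the publication above), is:

```latex
% Growth of the geometric emittance \varepsilon from uncorrelated
% turn-by-turn dipole kicks of rms angle \sigma_\theta applied at a
% location with beta function \beta, at revolution frequency f_{rev}.
% Decoherence and the transverse feedback modify this baseline rate.
\frac{\mathrm{d}\varepsilon}{\mathrm{d}t}
  \;=\; \frac{1}{2}\, f_{\mathrm{rev}}\, \beta\, \sigma_\theta^{2}
```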

Official source

This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.

Related concepts (5)

Related publications (3)

Large Hadron Collider

The Large Hadron Collider (LHC) is the world's largest and highest-energy particle collider. It was built by the European Organization for Nuclear Research (CERN) between 1998 and 2008 in collaboration with over 10,000 scientists and hundreds of universities and laboratories in more than 100 countries. It lies in a tunnel 27 km (17 mi) in circumference and as deep as 175 m (574 ft) beneath the France–Switzerland border near Geneva. The first collisions were achieved in 2010 at an energy of 3.5 TeV per beam.

Very Large Hadron Collider

The Very Large Hadron Collider (VLHC) was a proposed future hadron collider planned to be located at Fermilab, in a ring using the Tevatron as an injector. The VLHC would have run in two stages: the initial Stage-1 VLHC would have had a collision energy of 40 TeV and a luminosity of at least 1⋅10³⁴ cm⁻²⋅s⁻¹, matching or surpassing the LHC design luminosity (which the LHC has since exceeded).

Statistical significance

In statistical hypothesis testing, a result has statistical significance when a result at least as "extreme" would be very infrequent if the null hypothesis were true. More precisely, a study's defined significance level, denoted by α, is the probability of the study rejecting the null hypothesis given that the null hypothesis is true; and the p-value of a result, p, is the probability of obtaining a result at least as extreme, given that the null hypothesis is true. The result is statistically significant, by the standards of the study, when p ≤ α.
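A minimal worked example of these definitions (a sketch with hypothetical numbers, not tied to the publication above):

```python
from math import comb

# Illustrative example: test whether a coin is fair (null hypothesis:
# P(heads) = 0.5) after observing 60 heads in 100 flips, using a
# one-sided exact binomial test.
n, k, alpha = 100, 60, 0.05

# p-value: probability, under the null hypothesis, of a result at least
# as extreme as the one observed (i.e. 60 or more heads).
p_value = sum(comb(n, i) for i in range(k, n + 1)) / 2**n

print(f"p = {p_value:.4f}")  # p = 0.0284
print("significant" if p_value <= alpha else "not significant")  # significant
```

Since p ≈ 0.0284 ≤ α = 0.05, the result is statistically significant by this study's standard.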

Massimiliano Schwarz, Joël Repond

The Super Proton Synchrotron (SPS) at CERN is the injector of the Large Hadron Collider (LHC). Multi-bunch instabilities limit the intensity of the beam that can be accelerated to 450 GeV in the SPS and transferred to the LHC. Without mitigation measures, the bunch-intensity threshold of the longitudinal instability is three times below the nominal bunch intensity of the LHC beam. The High-Luminosity LHC project (HL-LHC), which requires a doubling of the nominal bunch intensity, relies on improvement of beam stability in the SPS. A fourth-harmonic RF system currently allows the beam to be stabilized up to the nominal LHC intensity. It increases the synchrotron frequency spread inside the bunch, providing more efficient Landau damping of the beam instability. However, nonlinearities of the synchrotron frequency distribution inside the bunch pose a limitation on bunch length. This paper explores a possible intensity increase in the SPS by studying the effect on beam stability of the voltage ratio between the two RF systems. The results are substantiated by beam measurements and particle-tracking simulations. An optimized voltage program of the second RF system during the cycle has been tested in operation, and both beam stability and beam quality have been successfully improved.
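For context, the amplitude dependence of the synchrotron frequency that underlies this spread can be sketched with the standard single-RF result (a textbook approximation, not a formula from the paper):

```latex
% Small-amplitude expansion of the synchrotron frequency in a single-RF
% bucket: particles with larger phase amplitude \hat{\varphi} oscillate
% more slowly, producing the frequency spread that enables Landau damping.
% A higher-harmonic RF system steepens this dependence, enlarging the spread.
\omega_s(\hat{\varphi}) \;\approx\; \omega_{s0}\left(1 - \frac{\hat{\varphi}^{2}}{16}\right)
```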

César Octavio Domínguez Sánchez De La Blanca, 2019

The Large Hadron Collider (LHC) is the world’s largest and most powerful particle collider. Its main objectives are to explore the validity of the standard model of particle physics and to look for new physics beyond it, at unprecedented collision energies and rates. A good luminosity performance is imperative to attain these goals. In the last stage of the LHC commissioning (2011-2012), the limiting factor to achieving the design bunch spacing of 25 ns was electron cloud effects. The electron cloud is also expected to be the most important luminosity limitation after the first Long Shutdown of the LHC (LS1), when the machine should be operated at higher energy and with 25-ns spacing, as well as for the planned luminosity upgrade (HL-LHC) and future high-energy proton colliders (HE-LHC and VHE-LHC). This thesis contributes to the understanding of the electron cloud observations during the first run of the LHC (2010-2012), presents the first beam dynamics analysis for the next generation of high-energy hadron colliders, and assists in the prediction of how electron clouds will impact the performance of the future high-luminosity and high-energy machines. In particular, the thesis discusses a method to benchmark pressure measurements at the LHC against electron cloud build-up simulations in order to identify the most relevant surface parameters. This method made it possible to monitor the effectiveness of LHC “scrubbing runs”, revealing that in the warm regions the maximum secondary electron yield, δmax, decreased from an initial value of about 1.9 down to about 1.2 (with a low-energy electron reflectivity R ≈ 0.3), thanks to surface conditioning. In addition, the “map formalism”, a good approximation to quickly explain and predict electron cloud effects, has been further developed and applied, for the first time, to optimize the scrubbing process at the LHC.
For the HL-LHC, several novel filling schemes have been analyzed in terms of luminosity performance and electron cloud activity. Only a few of them are compatible with an electron cloud activity lower than that of the baseline scenario. We highlight a promising option which could be a good fallback scenario in case the electron cloud effects prevent the injection of the baseline beam. This option could also be considered for the nominal LHC after the LS1 if the electron cloud turns out to be a serious obstacle. Regarding the future high-energy proton colliders (HE- and VHE-LHC), a performance model was developed within the framework of this thesis to predict the luminosity as a function of time and to optimize the beam parameters, carrying out the first-ever performance analysis for these machines. Several scenarios have been considered, including round and flat beams as well as different bunch spacings. The parameters presented in this thesis have been submitted as input for the most recent update of the European Strategy for Particle Physics. Finally, we also report the electron cloud studies performed for both high-energy machines. The large amount of primary photoelectrons generated by synchrotron radiation at these high energies motivates the consideration of high-efficiency photon stops as well as other mitigation techniques (e.g. a-C coatings and clearing electrodes). Although for both machines (HE- and VHE-LHC) a tentative bunch spacing of 25 ns has been considered as the baseline assumption, the results of this thesis suggest the possibility of going down to 5 ns, since such a beam would present several advantages.
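The kind of luminosity-versus-time performance model mentioned above can be sketched minimally as follows. This is a toy model under simplifying assumptions (round head-on beams, losses dominated by proton burn-off in collisions); all parameter values are illustrative, nominal-LHC-like numbers, not the thesis's actual inputs.

```python
import math

# Illustrative beam parameters (not the thesis's actual inputs).
F_REV   = 11245.0    # revolution frequency [Hz]
N_BUNCH = 2808       # number of colliding bunch pairs
EMIT_N  = 3.75e-6    # normalized transverse emittance [m rad]
BETA_ST = 0.55       # beta function at the interaction point [m]
GAMMA   = 7460.0     # relativistic gamma at 7 TeV
XSEC    = 1.0e-25    # ~100 mb total pp cross-section [cm^2]

# rms transverse beam size at the interaction point [m]
SIGMA = math.sqrt(EMIT_N * BETA_ST / GAMMA)

def luminosity(n_per_bunch):
    """Round-beam, head-on luminosity L = f n_b N^2 / (4 pi sigma^2),
    returned in cm^-2 s^-1."""
    l_m2 = F_REV * N_BUNCH * n_per_bunch**2 / (4 * math.pi * SIGMA**2)
    return l_m2 * 1e-4  # m^-2 -> cm^-2

def evolve(n0, hours, n_ip=2, dt=60.0):
    """Euler integration of proton burn-off in collisions:
    each bunch loses XSEC * L * n_ip / N_BUNCH protons per second."""
    n, t = n0, 0.0
    while t < hours * 3600.0:
        n -= XSEC * luminosity(n) * n_ip / N_BUNCH * dt
        t += dt
    return n

# With these numbers, luminosity(1.15e11) is about 1.2e34 cm^-2 s^-1,
# close to the nominal LHC design luminosity, and the intensity decays
# on a timescale of tens of hours.
```

Integrating `luminosity(evolve(n0, t))` over the fill length, plus the machine turnaround time, is what allows optimizing beam parameters for integrated luminosity in such models.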

The Super Proton Synchrotron (SPS) at CERN is the injector of the Large Hadron Collider (LHC), the world's largest particle collider. The High-Luminosity LHC (HL-LHC) project is a major step forward in the improvement of the LHC's performance, and it requires a doubling of the nominal bunch intensity of the current LHC beam.
In the SPS, multi-bunch instabilities and particle losses limit the beam intensity that can be accelerated to 450 GeV/c and transferred to the LHC. Without mitigation measures, the bunch intensity threshold for longitudinal instabilities is three times below the nominal intensity of the LHC beam. Moreover, the present limited RF power is not sufficient to accelerate beams with intensities well above nominal without substantial particle losses and a reduction of the RF voltage available for the beam at the flat top energy.
The SPS will undergo significant upgrades but they may not be sufficient to ensure the stability of the HL-LHC beam. The objectives of this doctoral research are to study the longitudinal intensity limitations of the LHC proton beam in the SPS and to find possible mitigation measures to ensure the beam stability and quality at HL-LHC intensity.
Beam measurements and particle simulations are used in conjunction with analytical estimations to study the multi-bunch instabilities during the cycle in the SPS. This work attempts to identify the main sources of instabilities and beam quality degradation. Possible mitigation measures are investigated to explore the beam parameters achievable after the upgrades. The effects on beam stability of the foreseen RF upgrade, of double-RF operation, and of the reduction of various longitudinal beam-coupling impedances are analysed in detail. The scenario of a lower-harmonic RF system in the SPS, aimed at reducing particle losses, is also studied.