This lecture covers Conditional Maximum Likelihood (CML) estimation, in which parameters are obtained by maximizing the conditional likelihood of the observed sample. The instructor explains that the CML estimator is consistent but not efficient, and uses Bayes' theorem to derive the conditional likelihood. The lecture then develops each observation's contribution to the conditional likelihood, shows how it simplifies for logit and MEV models, and discusses the correction required under pure choice-based sampling. Practical procedures for estimating the parameters and correcting the alternative-specific constants are presented, along with examples and sampling strategies. The behavior of the MEV model on choice-based samples is explored, emphasizing the differences with the logit model. The lecture concludes with a bibliography for further reading.
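As an illustration of the constant correction mentioned above, here is a minimal sketch. It assumes the standard result for a logit model with a full set of alternative-specific constants: on a pure choice-based sample, all parameters except the constants are estimated consistently, and each constant must be shifted by the log of the ratio between the alternative's sample share and its population share. The function name and the numbers are hypothetical, not from the lecture.

```python
import math

def correct_constants(asc, sample_share, population_share):
    """Correct logit alternative-specific constants estimated on a
    pure choice-based sample.

    Each constant is shifted by -ln(H_i / W_i), where H_i is the
    share of alternative i in the sample and W_i its share in the
    population (standard result for logit with a full set of ASCs).
    """
    return {
        alt: asc[alt] - math.log(sample_share[alt] / population_share[alt])
        for alt in asc
    }

# Hypothetical example: "car" users were deliberately over-sampled.
asc_estimated = {"car": 1.20, "bus": 0.00}      # constants estimated on the sample
sample_share = {"car": 0.50, "bus": 0.50}       # H_i: shares in the sample
population_share = {"car": 0.25, "bus": 0.75}   # W_i: shares in the population

corrected = correct_constants(asc_estimated, sample_share, population_share)
```

Note that over-sampled alternatives ("car" here, sampled at twice its population share) have their constants reduced, since part of the estimated constant merely reflects the sampling protocol rather than actual preferences.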