This lecture focuses on bounding the error between f'(x0) and its approximation by a backward finite difference formula, stated as Theorem 2.1: for any twice continuously differentiable function f and any x0 in R, the error is bounded by c times h, where the constant c may depend on f and x0 but not on h. The result implies that the error bound is halved when h is halved, and divided by ten when h is divided by ten. The proof, left as an exercise, is similar to the previous one: it consists of exhibiting a constant c that depends on f and x0 but not on h, namely half the maximum of the absolute value of the second derivative between x0 - h and x0.
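Since the lecture describes the bound in words only, a written-out statement may help. The following LaTeX sketch assumes the standard backward difference quotient (f(x0) - f(x0 - h))/h, which the lecture implies but does not display:

```latex
% Sketch of Theorem 2.1, assuming the standard backward difference quotient.
\[
  \left| f'(x_0) - \frac{f(x_0) - f(x_0 - h)}{h} \right| \le c\, h,
  \qquad
  c = \frac{1}{2} \max_{x \in [x_0 - h,\, x_0]} \bigl| f''(x) \bigr|,
\]
% valid for f twice continuously differentiable; c depends on f and x_0
% but not on h, so the error is first order in h.
```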
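To see the first-order behavior numerically, here is a minimal Python sketch; the test function sin and the point x0 = 1.0 are illustrative choices, not from the lecture. It tabulates the error for a sequence of halved step sizes and checks that the ratio error/h stays roughly constant, as the theorem predicts:

```python
import numpy as np

def backward_diff(f, x0, h):
    """Backward finite difference approximation of f'(x0)."""
    return (f(x0) - f(x0 - h)) / h

# Illustrative test case: f(x) = sin(x), so f'(x) = cos(x).
f, fprime = np.sin, np.cos
x0 = 1.0

print(f"{'h':>10} {'error':>12} {'error/h':>10}")
for h in [1e-1, 5e-2, 2.5e-2, 1.25e-2]:
    err = abs(fprime(x0) - backward_diff(f, x0, h))
    # error/h should stay roughly constant (near |f''(x0)|/2 = sin(1)/2),
    # confirming the bound: halving h halves the error.
    print(f"{h:10.4g} {err:12.4e} {err/h:10.4f}")
```

Each halving of h roughly halves the error, while error/h settles near sin(1)/2 ≈ 0.42, which is exactly the constant c the exercise asks to exhibit.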