Unnecessary health care (overutilization, overuse, or overtreatment) is health care provided with a higher volume or cost than is appropriate. In the United States, where health care costs are the highest in the world as a percentage of GDP, overuse was the predominant factor in its expense, accounting for about a third of its health care spending ($2.6 trillion) in 2012.

Factors that drive overuse include paying health professionals more to do more (fee-for-service), defensive medicine practiced to protect against litigation, and insulation from price sensitivity in cases where the consumer is not the payer: the patient receives goods and services, but insurance (whether public, private, or both) pays for them. Such factors leave many actors in the system (doctors, patients, pharmaceutical companies, device manufacturers) with inadequate incentive to restrain health care prices or overuse. This drives payers, such as national health insurance systems or the U.S. Centers for Medicare and Medicaid Services, to focus on medical necessity as a condition for payment. However, the threshold between necessity and lack thereof is often subjective.

Overtreatment, in the strict sense, refers to unnecessary medical interventions, whether treatment of a self-limited condition (overdiagnosis) or extensive treatment for a condition that requires only limited treatment. It is economically linked with overmedicalization.

A forerunner of the term was what Jack Wennberg called unwarranted variation: different rates of treatment based on where people lived rather than on clinical rationale. In studies that began in 1967 and were published in the 1970s and 1980s, he found that: "The basic premise – that medicine was driven by science and by physicians capable of making clinical decisions based on well-established fact and theory – was simply incompatible with the data we saw. It was immediately apparent that suppliers were more important in driving demand than had been previously realized."