Historical references to the disease we call malaria (from the Italian for "bad air") date back almost 4,000 years. Hippocrates, a physician in ancient Greece, was the first to relate the occurrence of the disease to the time of year and to where patients lived. The association with stagnant water (mosquito breeding grounds) led the Romans to begin drainage programs, the first intervention against malaria. Historically, malaria has been present in most human-inhabited regions of the globe.
In the southern United States, for example, records show millions of cases of malaria in the 1930s. In 1946, the Communicable Disease Center (now the Centers for Disease Control and Prevention, CDC) was founded with the specific goal of combating malaria. U.S. government-funded anti-malarial efforts included draining vast wetlands that served as mosquito habitat and breeding grounds. In 1955, the World Health Organization (WHO) launched the Global Malaria Eradication Programme with the goal of eliminating the disease within 10 years. Although the program, which relied heavily on the preventive insecticide dichlorodiphenyltrichloroethane (DDT) and the treatment drug chloroquine, was eventually stopped, it succeeded in ending transmission of the disease in the US, Europe, and other wealthy nations (Finkel, 2007).
The program was ended in large part because of emerging research on the potential adverse effects of insecticides, including DDT, on people and wildlife (Greenwood and Mutabingwa, 2002). Additionally, eliminating malaria with these approaches proved easier in temperate regions than in tropical ones. Seasonal temperature variations and cold, dormant winters slow the mosquito and Plasmodium life cycles in temperate zones, producing smaller, weaker populations that can be suppressed with only moderately intensive control methods (Coluzzi, 1999; Sachs and Malaney, 2002).
The elimination of malaria from higher-income, temperate regions, new research that cast controversy and uncertainty on the use of insecticides, and the more challenging environmental conditions in the tropics led to an era of decreased interest in and attention to malaria control and research from the early 1970s through the late 1990s (Greenwood and Mutabingwa, 2002). The burden of malaria increased during this period despite the disease's reduced geographic range. Factors contributing to this growth included human population growth, expansion of mosquito ranges due to climate change, environmental degradation and deforestation, and weak public health systems. Additionally, the concentration of the disease in the tropics contributed to an economic disparity between wealthier temperate nations and less wealthy tropical nations (Sachs and Malaney, 2002). Malaria control is difficult in less wealthy nations because of the lack of quality health services and limited access to treatments and prevention methods. Factors such as increasing insecticide resistance, drug resistance, and war and civil disturbance add to this difficulty and threaten the efforts of public health workers and governments to control malaria (Greenwood and Mutabingwa, 2002).