It considers historical evidence that correlates a decline in Native American health and fertility with ruptures to indigenous food systems following European colonization. It suggests novel and interdisciplinary ways in which advanced undergraduate or graduate-level students might examine the correlation between disrupted indigenous nutritional practices and a decline in Native American health. These learning objectives bring together students of history and natural science in one classroom and entail new ways of synthesizing hitherto separate scholarly enterprises. In light of the most recent scientific literature on nutrition, metabolic syndrome, and immunology, they require a fresh consideration of the historical association between Native American health and indigenous food systems. In the centuries after European contact, many Native American communities were forced to move away from diets that had been comparatively high in animal proteins, animal fats, and fat-soluble vitamins, and which also often incorporated important starch and plant sources such as wild rice, tubers, chenopods, beans, seeds, maize, squash, berries, and leafy vegetables. Notwithstanding regional variations, the pre-contact Native American diet was thus relatively nutrient-dense, incorporating varied macronutrients and micronutrients through hunting and gathering practices and indigenous forms of horticulture that were subsequently disrupted. Through the deleterious and often deliberate effects of colonization, which can only be understood through careful historical study and analysis, deeply rooted food systems were ruptured. From as early as the sixteenth century, new post-contact circumstances forced many Native Americans to adopt diets that favored imported European grain cultivars, to place greater calorific reliance on New World maize species, and to reduce their consumption of traditionally hunted animals and fish and cultivated plant sources.
It is important to avoid any crude interpretative framework that might “exoticize” pre-contact Native American communities as having avoided any form of managed agriculture, crop monoculture, or organized land husbandry. Recent historical research, after all, has often employed the metaphor of “gardening” to question the notion that pre-contact Native Americans relied solely on hunting and gathering for sustenance. It is also imperative not to elide the distinct variations between indigenous food cultures both during and after the period of European contact: ranging from the cultivation of maize, tubers, and starchy seeds alongside hunted animals in the Southwest to a relatively homogeneous reliance on fats gathered from hunted meats and fish in the sub-Arctic, with many gradations in between, such as the cultivation of wild rice alongside more traditional hunting and gathering patterns in the Great Lakes region. Yet this article – and the proposed educational course it defines – attempts at least some degree of generalization in discussing the differences between indigenous food systems and those that were introduced after European contact, and in discussing how students and researchers might view those distinctions in light of the modern scientific literature on metabolic and nutritional health. In questioning crude definitions of pre-contact Native Americans as noble hunter-gatherers – including those that are sometimes used by advocates of ancestral health and Paleolithic nutritional principles – it is important to avoid going to the other extreme by deemphasizing the relative environmental and dietary importance of hunting and gathering systems in many different parts of North America immediately prior to, and even after, European contact. While indigenous agricultural activities were present throughout the American continent, hunting and gathering practices also continued to a far greater extent than in post-Paleolithic Europe and the Middle East – potentially entailing important ecological and nutritional differences between the two regional populations over the following centuries. Those differences may inform our understanding of the role of nutrition in evolutionary health, particularly by comparing the pre- and post-contact history of Native Americans. Recent scientific research has suggested that we may be able to locate specific loci in the DNA of some Native Americans that affect their insulin sensitivity. Individuals with certain genetic variants at these loci would be more likely to develop diseases such as diabetes following a move towards a higher-carbohydrate diet, as has often taken place from the period of European colonization to the present day.
Examining the Pima Indian community of Arizona as a case study, researchers have found several loci with genetic variants that confer susceptibility to diabetes. For example, a genome-wide association study by Hanson et al. identified polymorphisms in the DNER locus that are associated with increased risk of diabetes in Pima Indians. The Decolonizing the Diet project, and other similar endeavors, thus start with the hypothesis that a return to pre-European-contact diets will improve the health of Native American communities, reducing hitherto disproportionately high rates of diabetes, as well as heart disease and other conditions associated with metabolic syndrome. Native American populations, indeed, have often featured as case studies among those scholars who attempt to define a “thrifty gene hypothesis” to explain why some people are prone to diabetes and/or obesity. A “thrifty” genotype, it is suggested, may have been evolutionarily successful for individuals descended from hunter-gatherer populations. Its occurrence would have allowed those populations, particularly child-bearing women, to gain fat more easily during times of abundance. Those with more fat may have better survived times of food scarcity, and thus passed on their genes. But during times of nutritional abundance, according to the hypothesis, they would be more likely to develop metabolic disorders such as obesity and diabetes. In post-hunter-gatherer populations, a similar paradigm has been hypothesized by Sellayah and others, who have suggested that the thrifty genotype may have appeared among those who had “undergone positive selection for genes that favored energy storage as a consequence of the cyclical episodes of famine and surplus after the advent of farming 10 000 years ago.” In any class, project, or research agenda, however, it is important to avoid unduly deterministic conclusions when assessing the correlation between recent genetic studies and epidemiological data from Native American communities. Firstly, more research is needed to assess whether particular genetic variants for insulin sensitivity are present exclusively or at a higher frequency in Native American populations compared to other populations, or whether they are equally prevalent in other ethnic communities that have not been included in studies to date. Despite carrying the same genetic variants, those other communities might not suffer from diabetes to the same extent as Native Americans. Secondly, greater genetic susceptibility to insulin resistance or any other medical condition need not predetermine the actual onset of diabetes or other disorders, as is evidenced by the relatively positive health markers among Pima and other Native American communities prior to their increased consumption of processed and high-sugar foods.
The notion that Native Americans have suffered from particular genetic predispositions might also prove problematic in encouraging students, scholars, and researchers to adopt an overly deterministic account of health outcomes, overlooking the disruptive role of human interventions against ancestral foodways – either in exacerbating Native American susceptibility to metabolic syndrome and/or infectious disease, or even as a primary factor in their increasing mortality and declining fertility after European contact. Examining – and problematizing – the link between modern disease susceptibility and genetic predispositions should prepare students for a related scholarly endeavor: assessing the potential tensions and pitfalls associated with the concept of a “biological exchange” of infectious diseases at the period of contact between Europeans and Native Americans. Here too a focus on abstract biological forces risks overlooking the role of human interventions in determining the inevitability of demographic decline in the face of disease. Historians, most famously Alfred Crosby, once defined the decimation in the numbers and health of post-contact Native American communities according to a metaphor of biological exchange. Here, we have been told, Native Americans in a “virgin soil” were unable to cope with the pathogens inadvertently introduced by Europeans after the arrival of Columbus. These great killer diseases, introduced by germs, spores, and parasites from European and African sources, included smallpox, measles, influenza, bubonic plague, diphtheria, typhus, cholera, scarlet fever, trachoma, whooping cough, chicken pox, and tropical malaria. Yet as the most recent historical scholarship suggests, human interventions were necessary to bring about the marked decline in Native American health and fertility, and the increase in mortality, in the centuries after the arrival of Columbus in the western hemisphere – as distinct from the notion of an amorphous biological exchange involving a mismatch between European and Native American immunity. There is no doubt that Native American communities and Europeans retained different immunities during the period of contact. But suggesting that Native Americans were predisposed to near-total demographic collapse solely due to their relative lack of immunity may lead students and scholars to eschew any further assessment of nutritional disruption as a co-factor in that collapse; just as modern studies of genetic loci for diabetes might lead researchers to overlook the role of post-colonial interventions in late-nineteenth- and twentieth-century Native American food patterns, which affected insulin sensitivity above and beyond any genetic predisposition. Scholarship on global infectious disease has shown that societies have most often been able to recover demographically from near collapse following massive outbreaks, usually within around 150 years. Disturbances such as epidemics have tended to result in only short-term demographic decline, with populations returning to pre-disease levels of growth, decline, or stability.
Describing the response to the European Black Death, for example, McNeill points out that “the period required for medieval European populations to absorb the shock of renewed exposure to plague seems to have been between 100 and 133 years.” As Gottfried has demonstrated, fourteenth- and fifteenth-century Europeans suffered multiple epidemics, including the Black Death, typhus, influenza, and measles, yet their populations were able to recover demographically after around a century. Herring has even shown that early twentieth-century Native American populations outside reservations were able to recover their numbers following influenza, smallpox, and measles epidemics. Taking these general studies as a starting point, students and researchers in biology and public health policy would gain a broader understanding of immunology and epidemiology through a joint course with historians and anthropologists. Rather than assuming that certain communities are more prone to metabolic syndrome and disease, whether from genetic loci or a comparative lack of exposure to certain pathogens, they would be able to consider the ways in which human interventions – particularly in foodways – exacerbated demographic decline in the face of disease, both in terms of reduced immunity prior to infection and reduced ability to fight pathogenic invasion. Let us now consider how students might use case studies in Native American history to illustrate such a phenomenon, before turning to the ways in which contemporary scientific studies might then inform their analysis of the role of nutrition in enhancing or reducing the potential for recovery after mass epidemics. Historians of Native American health and fertility have drawn methodologically and conceptually from general epidemic studies in order to question the biological exchange thesis for demographic decline between the 1500s and 1800s. Near-total demographic collapse, according to a developing historiographical consensus, was made possible by the rupturing of ancestral social mechanisms by European colonization – rather than simply as a result of differing immune capabilities between the two populations. The failure of Native American populations to return to pre-epidemic numbers, many scholars now assert, derived from the human interventions that accompanied the spread of disease, rather than from the diseases as singular factors. One such human intervention lay in the domesticated agricultural practices that were prescribed by European colonization after first contact with Native Americans. The disastrous decline in Native American demography was partly driven by the growing mismatch between long-evolved indigenous ecological frameworks and European cattle-pens and agricultural methods. The latter exacerbated the spread of diseases that Native American communities were already struggling to fight off due to their impaired immunity. According to the pioneering paleo-archeological work of Armelagos, such a phenomenon has commonly affected societies as they have transitioned to concentrated agricultural settlement and animal husbandry. Examining paleo-archeological evidence from European and Middle Eastern populations, Armelagos has noted the problems that followed relatively sudden proximity to domesticated animals and to human and animal waste in newly agricultural societies. Such proximity increased the spread of parasitic disease. In earlier hunter-gatherer populations, frequent migrations had limited the contact of individuals with human waste.
In more sedentary populations, concentrated around grain production and domesticated animals, human and animal waste became more likely to contaminate drinking water.