CHAPTER 2: VULNERABILITY TO DRY PERIODS: KEY CONCEPTS AND METHODS OF ASSESSMENT
In this chapter, I define and describe the relationship between vulnerability, risk, dry periods, and resource marginality. I use the concept of vulnerability to refer to the potential for subsistence agriculturalists to be negatively impacted by dry-period declines in resource productivity. Where resources are marginal, dry-period declines in productivity are expected to increase the risk of food shortfalls and prompt a behavioral response to manage these risks. Resource marginality, by definition, occurs where and when resource productivity is inherently low relative to human food needs. As I discuss in this chapter, resource marginality is a foundational assumption in the U.S. Southwest, but there are reasons to question this assumption. Importantly for this study, our models of vulnerability to dry periods (discussed in the next chapter) rely on a marginality assumption to make variation in demographic (demand) conditions, environmental (supply) conditions, and dry periods meaningful for human behavior. Reliance on a questionable assumption of resource marginality is a primary reason our models of vulnerability need to be evaluated. I conclude the chapter by describing my method of assessing vulnerability to dry periods.
Key Concepts: Vulnerability, Risk, and Dry Periods
Vulnerability is a composite concept “incorporating environmental, social, economic, political, cultural, and psychological factors" (Meyer et al. 1998:239-240). The study of “vulnerability of populations and activities is the most widely used umbrella concept for those factors that mediate between geophysical events and human losses” (Meyer et al. 1998:239). There are many specific definitions of vulnerability (see Cutter 1996:531-532 for a summary), but it is commonly understood as the "potential for loss" (Cutter 1996:529), the “capacity to be wounded” (Kates 1985:17), or the “potential for negative outcomes or consequences” (Meyer et al. 1998:239). More specifically, it is “the degree to which a system [such as a human-environment system], subsystem, or system component is likely to experience harm due to exposure to a hazard, either a perturbation or stress/stressor” (Turner et al. 2003:8074). Hazards are defined as threats to a system and the consequences they produce (Turner et al. 2003:8074). Differences in definitions reflect different disciplinary knowledge domains and emphases (Adger 1996; Fussel 2007; Janssen and Ostrom 2006).
I follow most closely the definition of vulnerability and the associated "vulnerability as hazard of place" approach developed by Cutter (1996:533). Cutter (1993 as cited in Cutter 1996:532) defines vulnerability as "the likelihood that an individual or group will be exposed to and adversely affected by a hazard. It is the interaction of the hazards of place (risk and mitigation) with the social profile of communities." Vulnerability, in this model, is "conceived as both a biophysical risk as well as a social response, but within a specific areal or geographic domain. This can be geographic space, where vulnerable people and places are located, or social space, who in those places are most vulnerable" (Cutter 1996:533). In other words, "It is the intersection and interaction of both the social vulnerability and biophysical/technological vulnerability that create the vulnerability of places" (Cutter 1996:537). The hazard of place approach to vulnerability is appropriate for my study because it combines characteristics of social units (such as the demographic conditions I consider) with exposures to biophysical hazards or stressors (the dry periods and their severity I consider) and the geographic space (such as the settlement locations and watersheds) where vulnerable people and places may be located.
Other conceptualizations of vulnerability within the hazard of place model appropriate for my study include those developed by the National Research Council and the World Food Programme. The National Research Council (2001:114) defines vulnerability as the "Extent to which a population is liable to be harmed by a hazard event. Depends on [sic] the population's exposure to the hazard and its capacity to adapt or otherwise mitigate adverse impacts." In the context of food insecurity, or what I refer to as the risk of food shortfalls, the World Food Programme (2009) considers vulnerability a “forward-looking concept aimed at assessing community and household exposure and sensitivity to future shocks.” Vulnerability to food insecurity is expected to be “determined by their [households or communities] ability to cope with their exposure to the risk posed by shocks such as droughts, floods, crop blight or infestation, economic fluctuations, and conflict. This ability is determined largely by household and community characteristics, most notably a household’s asset base and the livelihood and food security strategies it pursues” (World Food Programme 2009:27-28). These conceptualizations of vulnerability are also appropriate for my study because they combine 'internal' factors of a vulnerable system with its exposure to 'external' hazards (Fussel 2007:160).
Vulnerability and risk are closely related concepts. Risk has been defined and used in a number of ways (e.g., Cashdan 1990; Tainter and Tainter 1996; Winterhalder et al. 1999) but is generally understood as the probability of a loss (Cashdan 1985; Wiessner 1982; Winterhalder 1986) or negative consequence (such as a food shortfall) multiplied by the magnitude of the consequences. These risks can be real or perceived as human perceptions of changing conditions and associated risks may differ from actual changes in conditions (Burton et al. 1993; Ortiz 1979; Powell 1988:82-86; Whyte 1985). In this study, I do not address perceptions of risk, relying instead on relative changes in key climate, demographic, and environmental variables to indicate relative changes and differences in risk of shortfall. Brooks (2003:6-7) argues that studies of risk and vulnerability are essentially examining the same processes because both are “ultimately interested in the physical hazards that threaten human systems, and in the outcomes of such hazards as mediated by the properties of those systems, described variously in terms of vulnerability, sensitivity, resilience, coping ability and so on" (Brooks 2003:7). In this study, I use a vulnerability framework, while recognizing my analysis is also about risk.
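The probability-times-magnitude definition of risk can be made concrete with a small calculation. The probabilities and shortfall magnitudes below are hypothetical, chosen only to illustrate the definition; they are not estimates for the study area:

```python
# Risk as the probability of a negative consequence (e.g., a food shortfall)
# multiplied by the magnitude of that consequence.
# All numbers are invented for illustration only.

def expected_shortfall_risk(p_shortfall, magnitude_kcal):
    """Return risk as expected loss: probability x magnitude."""
    return p_shortfall * magnitude_kcal

# Two hypothetical hazard profiles:
risk_a = expected_shortfall_risk(0.30, 100_000)  # frequent, modest shortfalls
risk_b = expected_shortfall_risk(0.05, 600_000)  # rare, severe shortfalls

print(risk_a)  # 30000.0
print(risk_b)  # 30000.0
```

Note that the two profiles yield the same expected loss even though they would pose very different challenges to buffering strategies, which is one reason the magnitude term matters as much as the probability term.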
Vulnerability to dry periods is the focus of this study. (Where “to dry periods” is omitted in the text of this study, no different type of vulnerability is implied.) Dry periods are multi-year periods of relatively low precipitation and streamflow (see Chapter Five for methods of identifying these dry periods). Dry periods decrease resource productivity (wild and cultivated) in arid and semi-arid regions because water is a primary limiting factor on plant growth (Fischer and Turner 1978) and precipitation levels are typically below the moisture requirements of most cultivated crops such as maize (Muenchrath and Salvador 1995; Shaw 1977). Animals that rely on plant foods are also affected by changes in climate that influence plant growth (Bright and Hervert 2005; Osborn 1993). Hence, animals that rely on these herbivores are also affected. Since my concern is with vulnerability to dry periods and an associated risk of food shortfalls for humans, I define shortfalls as conditions in which resource productivity is insufficient to meet human food needs. The focus is multi-year shortfalls associated with multi-year dry periods that likely stress or exhaust typical buffering strategies and necessitate a response to prevent the negative effects of long-term food deprivation. The risk of food shortfalls is the same as the concept of “food insecurity,” terminology often used outside of archaeology (e.g., Bohle et al. 1994; World Food Programme 2009). Decreases in resource productivity created by decreased precipitation or streamflow can increase the risk of food shortfalls (create a “hazard”) among subsistence agriculturalists in dry climates. Although I focus on dry-period-related risks of food shortfalls, it is important to note that many factors affect the risk of shortfall. For example, Sen (1999) has shown that famines (food shortfalls) are also a consequence of social, political, and economic inequalities that affect access to food.
In my study area, environmental factors, such as soil type and quality (Sandor et al. 2007), temperatures (Salzer 2000b), insects and plant diseases, and the extent of arable land all affect resource productivity and can also affect shortfall risks. I do not analyze these. Instead, for this study, I consider select demographic and environmental conditions that influence the potential risk of shortfall as identified by the models of vulnerability I evaluate.
Identifying the reasons why people may be vulnerable to dry-period risks of food shortfall is difficult because of the complexity of factors, and changes in these factors, that may contribute to or ameliorate this vulnerability. In general, vulnerability is a function of the exposure and sensitivity of a system to a hazard and the adaptive capacity or resilience of the system to cope, adapt, or recover from the effects of the hazard (Adger 2006:269; Smit et al. 2001:893-895; Smit and Wandel 2006:286; Turner et al. 2003). Specific contributors to vulnerability to dry periods include such factors as population levels and density, subsistence strategies, social stratification, technologies, poverty, health, landlessness, poor soils, social relations including political weakness, gender, occupation, race/ethnicity, immigration status, extent of social networks, and especially combinations of these factors (Blaikie et al. 1994; Liverman 1990a:49; Meyer et al. 1998:238-243; Ribot 1995; Wisner et al. 2004:11-12). Because the states of these potential contributors to vulnerability to dry periods change over time, vulnerability will also vary. Changes in strategies to manage the risk of shortfalls and changes in the extent of implementation and effectiveness of these strategies are also responsible for changes in vulnerability over time. Furthermore, "no standard framework exists for identifying the fundamental sources of differential vulnerability, but clearly they are numerous and complex" (Meyer et al. 1998:240). In brief, "To be vulnerable to drought is to lack environmental, technological, economic, or political defenses against its impacts" (Liverman 1990a:50).
The U.S. Southwest, like many arid and semi-arid regions, is often considered a “fragile and marginal environment for agriculture” (Diamond 2005:137) and a "harsh and variable region" (Dean et al. 1994:86). This widespread perception was, at least in part, promoted by early 20th century scholars in the Southwest whose perspectives were shaped by the agriculture of the temperate eastern states, the technology of the industrial era, and ultimately European traditions (Fish and Fish 2004:187; Fish 2004:116-117). “By comparison, the hard-won harvests of Indian peoples served mainly to illustrate the vicissitudes that these cultivators had to overcome” (Fish 2004:117). Most Southwestern archaeologists have adopted this perception and seem to accept the view that populations in the region were always undergoing a moderate amount of stress because of the region’s aridity and variability in the timing of rainfall (Cordell 1996:253). Lekson et al. (1994:16) assert that 2,000 years of human occupation in the Southwest including associated technological, social, and ideological changes were "constrained by a harsh and unpredictable natural environment." Fish and Fish (1994:88) refer to this focus on resource marginality and its consequences as "a unifying theme throughout southwestern archaeology." Similar notions of native Southwestern peoples struggling to survive in a marginal and harsh environment prevail in the popular imagination as well (e.g., Childs 2007).
This perception of climate-related resource marginality, widespread vulnerability to dry periods, and risks of food shortfalls has been informed by:
- ethnohistoric accounts of harvest failures and crop damage due to both flooding and drought (Abruzzi 1989; Bradfield 1971; Castetter and Bell 1942; Russell 1975; Slatter 1979; Zarbin 1980);
- archaeological evidence of extensive water control strategies to manipulate and maximize available precipitation (Fish and Fish 1984; Vivian 1974);
- skeletal evidence of nutritional deficiencies (El-Najjar et al. 1976; Martin 1994; Sheridan 2003);
- crop and climate studies indicating that annual precipitation levels are mostly below the moisture requirements of major crops such as maize (Muenchrath and Salvador 1995; Shaw 1977); and,
- climate studies that demonstrate low mean and highly variable precipitation conditions (e.g., Sheppard et al. 2002).
There are reasons to question the extent of resource marginality and the prevalence of climate-related food shortfalls. First, inferences of shortfall risk that rely on ethnohistoric examples of dry-period challenges to agriculture may not be valid because the scale, complexity, social organization, and technology of prehistoric groups changed over time and differ from those observed historically. For example, historically-observed indigenous irrigated agriculture in the Phoenix basin was substantially smaller-scale than irrigated agriculture practiced prehistorically (see Howard 1993 for a description of Hohokam irrigation systems). Larger-scale systems increase the potential for agricultural surplus, depending on the size of populations relying on these systems. Second, much of the ethnographic evidence supporting climate-related shortfalls (e.g., Slatter 1979) is from upland portions of the Southwest without extensive perennial rivers and where climatic conditions, especially cool temperatures and shorter growing seasons, are more agriculturally limiting. Third, wild foods in central Arizona were “predictable, storable, and abundant…and harvest times for staples are spread over much of the year” (Fish 1989:22). Thus, wild foods could have substantially offset declines in cultivated crop productivity and reduced the risk of shortfall if these foods were less sensitive to dry-period declines in productivity or located in less affected areas. Fourth, cultural trajectories in the region spanning at least a millennium suggest effective adaptations to prevailing climatic and environmental conditions including a repertoire of strategies to manage shortfall risks (e.g., Fish 1989). Finally, empirical validation of resource marginality and endemic shortfalls is lacking, except for the few skeletal studies of nutritional deficiencies (El-Najjar et al. 1976; Martin 1994; Sheridan 2003). 
Such validation might come from agricultural production models that demonstrate recurring patterns of caloric insufficiency. Production models, however, are difficult to construct and defend due to the plethora of assumptions required and inadequate information about the essential variables (e.g., number of people, water requirements and yields of prehistoric maize varieties, field sizes, etc.). That resource marginality led to endemic risks of shortfall, then, is a plausible hypothesis rather than a well-supported assumption.
Consequences of the Marginality Assumption
The perception of resource marginality has transformed over time into a foundational assumption of many Southwestern archaeological studies (e.g., Doyel and Dean 2006; Gumerman 1988; Larson et al. 1996; Minnis 1985; Rautman 1993; Tainter and Tainter 1996). For example, the assumption allows climate-related decreases in resource productivity to be strongly linked to the increasing risk of food shortfalls and responses to these risks. Marginality implies a strong linkage because resource levels are assumed to always be minimal and fluctuate around a threshold above which there was enough food and below which there was not. Thus, dry-period declines in productivity are expected to have increased the risk of shortfalls and stimulated behavioral responses to manage these shortfall risks. Marginality, then, provides a critical linking argument between climate and human behavior by making climate-related variation in resource productivity meaningful to people because of the risk of shortfalls.
Importantly for this study, models of vulnerability to dry periods in the U.S. Southwest rely on the marginality assumption to link changes and differences in demographic and environmental conditions to differences in vulnerability to dry periods. The marginality assumption implies that any condition that affects the demand or supply of resources will also meaningfully influence shortfall risks and prompt human responses. For example, people living in settlement areas of relatively low inherent productivity (e.g., in an area of low precipitation) should be more vulnerable to dry periods than people living in areas of relatively high inherent productivity. This is because people living in areas of low productivity are assumed to be closest to the threshold above which there is enough food to eat and below which there is not. Thus, when dry periods decrease productivity, the risk of shortfalls is assumed to increase in all areas but be most meaningful in areas of low productivity where resource supplies may have declined below a threshold so that there was not enough food to eat. However, if resources were abundant (not marginal) and dry-period declines in productivity in areas of both relatively low and high productivity did not approach the threshold below which food needs were unable to be met, then differences in inherent productivity among settlements would not have meaningfully influenced shortfall risks. Any change in human behavior associated with dry-period declines in productivity, then, could not be interpreted as responses to the risk of shortfalls.
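The threshold logic of the marginality assumption can be sketched numerically. In the following simulation, every quantity (the food-needs threshold, mean yields, year-to-year variability, and the size of the dry-period decline) is invented purely to illustrate the argument: the same proportional dry-period decline pushes a near-threshold ("marginal") settlement into frequent shortfall while leaving an abundant one largely unaffected.

```python
import random

random.seed(1)  # make this illustration reproducible

THRESHOLD = 100.0          # hypothetical food-needs threshold (arbitrary units)
DRY_PERIOD_DECLINE = 0.25  # hypothetical 25% productivity decline in dry years

def shortfall_years(mean_yield, n_years=1000, dry_fraction=0.3):
    """Count simulated years in which productivity falls below the threshold."""
    shortfalls = 0
    for _ in range(n_years):
        # Year-to-year variability around the mean yield (10% of the mean).
        productivity = random.gauss(mean_yield, mean_yield * 0.10)
        if random.random() < dry_fraction:       # a dry year occurs
            productivity *= (1.0 - DRY_PERIOD_DECLINE)
        if productivity < THRESHOLD:
            shortfalls += 1
    return shortfalls

marginal = shortfall_years(mean_yield=110.0)  # yields hover just above threshold
abundant = shortfall_years(mean_yield=200.0)  # yields well above threshold

print(marginal, abundant)  # the marginal settlement fails often; the abundant one rarely
```

The contrast is the point of the sketch: if resources were abundant rather than marginal, the same dry-period declines would rarely cross the threshold, and behavioral changes coinciding with dry periods could not be read as responses to shortfall risk.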
The marginality assumption inappropriately obviates considerations of the role of human action in ameliorating or contributing to vulnerability. People have a variety of strategies for living in dry climates, managing vulnerability, and responding to shortfall risks. Such strategies include mobility, resource diversification, physical storage, exchange, and population distribution on the landscape (e.g., Braun and Plog 1982; Burns 1983; Dean 2006; Halstead and O’Shea 1989b:3-4; Minnis 1985; Rautman 1993; Slatter 1979:80-84). These strategies can address shortfall risks and ameliorate vulnerability by increasing resources or access to resources. These strategies and responses are choices among available opportunities (Burton et al. 1993) and their ameliorating or contributing influence on vulnerability can be evaluated. Variation in the effectiveness or implementation of these strategies over time may also explain variation in vulnerability to dry periods. That is, why at some places and times climate appears to have influenced human behavior and at other times it does not. When marginality, shortfalls, and vulnerability are assumed, however, the influence of these choices and changes in their implementation or effectiveness are not adequately considered.
Uncritical acceptance of the marginality assumption can slow the pace of theoretical and methodological advancement in climate-human behavior studies in the U.S. Southwest. The pace can be slowed by failing to identify whether serious vulnerabilities existed and/or by failing to stimulate rigorous efforts to identify non-demographic and non-environmental causes of this vulnerability if it existed (e.g., disease, conflict, social stratification). Without a consideration of multiple factors that influence vulnerability to dry periods, we cannot effectively explain the diversity of responses to dry periods, including no response, that is evident when the paleoclimatic record of dry periods is compared to the archaeological record of behavioral change (Nelson et al. 2010). Weaknesses in our ability to account for this diversity of responses can produce frustration with current approaches and discourage analytical and interpretive advancements. The marginality assumption is at least partly responsible for current limitations in archaeological understanding of climatic influences on human behavior because it has focused attention on developing more refined identification and characterization of climatic events and on mostly demographic and environmental explanations of differences in vulnerability. This study’s challenge to the marginality assumption is inspired by and builds on previous work that has questioned the role of drought as the primary cause of abandonments at various times and places in the U.S. Southwest (e.g., Dean 1996; Hill et al. 2004; Kintigh 1985; Kohler and Van West 1996; Lipe 1995; Van West 1990; Van West and Dean 2000; Varien et al. 1996).
The impact of the marginality assumption is not limited to climate-human behavior studies. Yoffee (1994:350) observed that at a conference on prehistoric social complexity in the Southwest, the major theme was "bad weather". In support of his claim he listed multiple references to resource marginality among the papers that comprised the edited volume resulting from the conference (Yoffee 1994:350; in Tainter and Tainter 1994). The shared assumption of marginality clearly had a strong influence on approaches to the problem of complexity. Elsewhere, Harry (2005) examined the influence of agricultural marginality on ceramic specialization at six areas in the prehistoric U.S. Southwest. She found little evidence supporting the influence of agricultural marginality on the adoption of part-time ceramic specializations. Rather than question her method of differentiating settlements as more or less marginal, she argues that attention must be focused on the differing social and economic contexts between prehistoric and historic/modern-day peasants to understand why agricultural marginality did not influence ceramic specialization. I suggest that her reliance on environmental criteria alone (e.g., potential for floodplain agriculture, precipitation levels and variability) to identify differences in marginality among settlements did not allow an effective consideration of the research problem she addressed.
Assessing Vulnerability with Residential Abandonment
Understanding the influence of dry periods on vulnerability requires a method of assessing vulnerability, and changes in vulnerability, during the period of interest throughout the study area. Vulnerability, however, is not a directly observable phenomenon (Moss et al. 2001:8), making measurement and quantification difficult (Luers et al. 2003:256). A number of studies have developed a set or composite of proxy indicators to quantify vulnerability (e.g., Moss et al. 2001; see Luers et al. 2003:256-267 for other examples) and the development of vulnerability assessment tools is now a significant research emphasis. For example, the Pacific Northwest Laboratory Vulnerability Assessment Program developed a composite of sixteen variables such as life expectancy, percent of the population with access to safe water, and the percent of non-managed land to assess vulnerability to climate change for 38 countries (Moss et al. 2001).
In some case studies, however, the “relative impacts of stressors in a region” (e.g., dry-period risks of shortfall) and responses can be used as an objective ex-post facto measure of vulnerability (Luers et al. 2003:256-257). I follow this approach and use one potential response to dry periods to assess vulnerability: residential abandonment. In the nomenclature of modern vulnerability studies, residential abandonment is referred to as "persons displaced" and it has been used as a quantitative index of hazard impacts (Meyer et al. 1998:242). I do not use single abandonment events at any spatial scale as a measure of vulnerability to dry periods; such events may be simply time-space coincidences with a dry period. Rather, I use the long-term (250-year) relationship between changes in dry-period severity and changes in residential abandonment at the scale of settlements, watersheds, and the total study area as an indicator of potential vulnerability to dry periods (as discussed further in Chapter Five). That is, if over the 1200 to 1450 period of study, changes in dry-period severity were strongly associated with proportionate changes in residential abandonment, then dry-period influences on this abandonment may be reasonably concluded.
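The long-term association described above can be assessed with a simple correlation between per-interval dry-period severity and the proportion of residences abandoned. The two series below are fabricated placeholders for illustration only; the actual severity reconstructions and abandonment estimates come from the data and methods described in Chapter Five.

```python
# Hypothetical per-interval series (fabricated for illustration):
# a dry-period severity index and the proportion of residences abandoned.
severity  = [0.2, 0.5, 0.9, 0.3, 0.7, 1.0, 0.4, 0.8]
abandoned = [0.05, 0.10, 0.30, 0.08, 0.22, 0.35, 0.09, 0.28]

def pearson(x, y):
    """Pearson correlation coefficient for two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson(severity, abandoned)
print(round(r, 2))  # a strong positive r across the full 250-year record would be
                    # consistent with dry-period influence on residential abandonment
```

A single interval in which severe drought coincides with abandonment would prove nothing by itself; it is the strength of the association across the whole record, at each spatial scale, that carries the inference.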
Residential abandonment is an archaeological signature of population movement. This movement is a reasonable indicator of potential vulnerability to dry periods for several reasons. First, people can take advantage of the spatial and temporal structure of resource failure across a landscape by moving away from areas of food scarcity and low productivity to areas of higher productivity (Halstead and O’Shea 1989a), reducing their vulnerability to the dry climate conditions. Second, ample ethnohistoric evidence in the U.S. Southwest has documented residential abandonment in response to climate-related resource shortfalls (Abruzzi 1989; Slatter 1979). Third, dry-period influences on residential abandonment (often in combination with other factors) have been identified in a number of archaeological studies in the U.S. Southwest (e.g., Ahlstrom et al. 1995; Adams 1998; Cordell 1975; Cordell et al. 2007; Dean 1988; Dean et al. 1985; Euler et al. 1979; Gumerman 1988; Jett 1964; Judge 1989; Lipe 1995; Minnis 1985; Orcutt 1991; Schlanger 1988; Slatter 1979; Van West and Dean 2000). Fourth, dry periods are among the climatic conditions understood to have led to population movements in many parts of the world (McLeman and Smit 2006; Meze-Hausken 2000). Note that archaeologically identified residential abandonment in the prehistoric Southwest is a record of population movements, not a record of the relinquishment of places, ownership, or the disappearance of a people (Nelson and Schachner 2002:169).
Movements in response to dry-period shortfall risks can be either short-distance (intra-watershed) or long-distance (inter-watershed) when reductions in shortfall risk (real or perceived) can be achieved by either (as discussed below). In recent migration studies of living peoples in Ecuador (Gray 2008), Burkina Faso (Henry et al. 2004), and Nepal (Massey et al. 2007), data show that movement out of communities with adverse environmental conditions (e.g., low rainfall, rainfall variability, environmental degradation) resulted more often in short rather than long-distance moves. Short-distance movements can reduce dry-period shortfall risks when there is substantial local-scale spatial heterogeneity of topographic and environmental conditions that affect potential resource productivity (Massey et al. 2007:7). Short-distance movements to new settlement areas and uncultivated lands can also be sufficient to increase productivity and decrease dry-period shortfall risks if these risks were created or exacerbated by declining productivity of continuously cultivated lands.
Substantial spatial heterogeneity in conditions influencing potential productivity exists within central Arizona and the individual watersheds that comprise the study area (see Chapter Four for the study area description). For example, settlements in the study area were located in areas receiving an average of 7 to 35 inches of precipitation annually, both near and far from perennial rivers, and in areas with average annual temperatures ranging from 49°F to 71°F (based on modern climate data from the Western Regional Climate Center 2010). Areas settled range in elevation from 994 to 6,966 feet. The study area also includes seven of the thirty-four biotic communities identified in the Southwest (Brown et al. 1979; The Nature Conservancy in Arizona 2004) and forty-one soil classifications (Natural Resources Conservation Service 2008). In a single watershed covering 5,612 square kilometers (the Upper Salt), settlements were located in areas receiving an average of 15 to 35 inches of precipitation annually, both near and far from perennial rivers, and in areas with average annual temperatures ranging from 51°F to 69°F. Areas settled range in elevation from 2,097 to 6,966 feet. The watershed includes five biotic communities and fourteen soil classifications. Thus, movements to destinations outside of the study area were not necessary to substantially change environmental conditions and influence potential productivity. This heterogeneity also implies that dry periods impacted neighboring communities differently, creating a mosaic landscape of vulnerability.
Residential abandonment and movement from settlement areas is, however, an imperfect indicator of vulnerability to dry periods because movement, like vulnerability, is a complex phenomenon that lacks a single cause. Thus, any assessment of the extent of movement over time conflates a variety of factors and potential causes (Meze-Hausken 2000). I do not attempt to identify these factors or causes by, for example, distinguishing between moves to places that are far or environmentally different from ones that are close and environmentally similar. Factors that affect decisions to move include “push” factors at the population origin, “pull” factors at the population destination, and the transportation costs between the two (Anthony 1990; Heberle 1938; Lee 1966). Even if people are vulnerable to dry periods and dry-period risks of shortfall provide a “push” to move, they may decide not to move if more productive locations are limited, or perceived to be limited, or the real or perceived costs of moving are greater than the benefits of remaining in place.
Population movement is also understood as “a strategy of resituating, both socially and ecologically, and perhaps even ideologically” (Nelson 1999:22; see also Nelson and Hegmon 2001 and Nelson and Schachner 2002). People can move for "religious, kinship, trade, artistic, and personal obligations" (Kelley 1992:48) and movement itself can be culturally valued (Naranjo 1995). Vulnerability to dry periods may also be addressed by a variety of strategies that do not include residential abandonment. People may diversify their resources and diets, increase physical food storage, acquire food through exchange, decrease food consumption, and socially reorganize (e.g., Bawden and Reycraft 2000; Braun and Plog 1982; Burns 1983; Dean 2006; Halstead and O’Shea 1989b:3-4; Minnis 1985; Rautman 1993; Slatter 1979:80-84). Decisions to move, then, are not simply a function of vulnerability and risk but of real or perceived opportunities, costs and benefits, and the effectiveness of other strategies to manage declines in resource productivity and the risk of shortfall.
In sum, the intersection of different economic, social, and environmental conditions creates different potentials for vulnerability. Some of these conditions are those considered in this study: dry periods, population levels and density, and local-scale resource productivity. If residential abandonments were a response to dry periods, these movements should correlate over the 250-year period of study with the worst combinations of conditions.
In this chapter, I have defined and described the relationships among vulnerability, risk, dry periods, and resource marginality. In both practice and thinking in the U.S. Southwest and other dry climates, these concepts are often closely linked and sometimes difficult to distinguish. One of the contributions of this study is to demonstrate the value of considering each of these concepts separately and questioning the relationship among them. For example, people are vulnerable to dry-period risks of shortfall in dry climates only to the extent that resources are marginal and existing strategies to manage these risks are ineffective. Thus, there is not a necessary and direct linkage between dry periods, risk, vulnerability, and human response. I have argued that a questionable assumption of resource marginality has informed this linkage and our models of vulnerability. Questions regarding the validity of the marginality assumption are a strong reason why our models of vulnerability need to be evaluated. In the next chapter, I describe these models that inform our thinking about the challenges of living in dry climates.