I just returned from a splendid few days in beautiful (though icy) Quebec City, where I attended the annual Society for Historical Archaeology conference. While there, I gave a paper (with Maria Theresia Starzmann) titled “Techniques of Power and Archaeologies of the Contemporary Past,” as part of an excellent session on the political economy of identity.
We received a good response to the paper, and so I wanted to share it here. A few caveats first. This is not a peer-reviewed work; it’s an attempt by Maria and me to push our field’s debate over what constitutes responsible political engagement toward what’s termed praxis, or “theoretically informed action.” Our paper is based upon our reading of relevant literature, our own experience in the field, and our understanding of how archaeology defines the legitimate role of an academic knowledge producer. It has not, though, been subject to a thorough critique by multiple colleagues (“peer review”), so we ask that anyone who wishes to cite it please speak with us first. That said, we welcome any comments. You can download a PDF by clicking the link: Roby and Starzmann SHA 2014-final.
We’d like to publicly thank everyone who was in attendance and who gave us feedback after the session.
EDIT: My mistake: I did not have clearance to post the paper. The edits above reflect this.
The WordPress.com stats helper monkeys prepared a 2013 annual report for this blog.
Here’s an excerpt:
A New York City subway train holds 1,200 people. This blog was viewed about 4,500 times in 2013. If it were a NYC subway train, it would take about 4 trips to carry that many people.
Last week, Fox News personality Megyn Kelly announced on her program, as a true fact, that Santa Claus and Jesus were white. You can see the segment here, and the reactions run the gamut from her Fox colleague Bill O’Reilly totally agreeing with her to a reasoned piece in The Atlantic by Jonathan Merritt, who notes that her comments are both bad history and bad theology.
Kelly herself has protested (here, and here) that her comments were meant to be light-hearted, and dismissed her critics as “race-baiters.” It was for the children, Kelly said last week on the segment in question: “For all you kids watching at home, Santa just is white, but this person is just arguing that maybe we should also have a black Santa.”
Leaving aside Kelly’s severe misunderstanding of her network’s viewer demographics, something’s being missed amid the outrage. As a person who studies and teaches about things like race and history and civil society, I feel the need to weigh in. To me, Santa-Is-White-Gate points to a major failure in our understanding of race and whiteness.
The most obvious problem with Kelly’s comment is that, if we take “white” to mean skin pigmentation that’s more or less similar to that of Kelly and the three others on her panel (see video above), then neither Jesus nor the historical bishop that Santa is modeled on would fit the bill. Both first-century Judea and fourth-century Anatolia were cosmopolitan regions that had seen centuries of population movement and gene flow between people from North and East Africa, Southwest and Central Asia, and Europe. As Merritt notes in his Atlantic article, “If he were taking the red-eye flight from San Francisco to New York today, Jesus might be profiled for additional security screening by TSA.” Same for Saint Nicholas.
But that’s not my point here. Kelly’s remarks are flawed on a deeper level, one that revolves around three related issues.
Skin color varies even among closely related people
The idea that “races” of people fall into neat little categories – white, black, brown, yellow, red – is a conceit that has no relationship to actual human variation. The idea traces back to Linnaeus, who assigned attributes to different populations on the basis of generalizations about skin tone. The problem is, human variation of all traits, including skin color, exists on a continuum. Trying to draw rigid lines of color falls apart with a large sample.
If you were to look at a random person from Nigeria and a random person from Norway, for instance, you’re likely to see a clear difference in skin color. But if you were to look at everyone in the Nigerian’s population, you’d find a huge amount of variation in skin tone – some darker, some lighter – and you might realize that classifying everyone as “black” ignores those who are of lighter skin. The same would be true in reverse for the Norwegian’s population – everyone would have relatively lighter skin than our African sample, but that lightness would exhibit variation itself.
This idea is illustrated very well in an interactive from the American Anthropological Association. Give it a try: Where do you draw the line? And for a really good, accessible book that explores this and related topics, I highly recommend Jon Marks’ Alternative Introduction to Biological Anthropology (note: this is an uncompensated recommendation; I don’t get any money if you click that link or buy the book).
If we acknowledge the reality of variation in skin tone, then what Kelly insisted is a true fact seems much less, well, truthful and factual. But it gets better.
The idea of “white people” is a fairly recent concept
It’s no mystery why Linnaeus decided to assign and group people by skin color. He was writing at the height of European colonization of Africa, Asia, and the Americas. Since the 15th century, Europeans had been encountering people of very different appearances, habits, customs, and beliefs. Classifying those “others” became the necessary first step to controlling them. “White” was how the colonizers distinguished between themselves and the colonized.
But it didn’t happen overnight. In fact, there was quite a bit of overlap between color, language, and religion early in the colonization process. For instance, the Spanish missionary Bartolomé de las Casas argued that conversion to Christianity meant Native Americans could not be enslaved and worked to death. Religion, in that sense, trumped color.
In the early years of the Virginia Colony, indentured servants could be of European or African descent. They lived and worked side by side, mated and married, and were able to secure freedom and property after their indenture. Status, in other words, depended not on skin color, but on one’s status as a servant or a landholder. Over that first century, color-based divisions were created by virtue of court rulings, gradually erasing indenture as a status for African-descended people (and then Europeans), and replacing it with a system of lifetime slavery based on color. For more on this, see Edmund Morgan’s American Slavery, American Freedom and Ira Berlin’s Generations of Captivity.
The point is that “white” was created by law and custom and “science”; it wasn’t a status that had meaning to people in the sense that we understand it today until well into the 18th century. Going back to the ancient world, there’s very little evidence that color was considered an indicator of difference or ability. Nubians ruled Egypt, Romans married North Africans, and so on.
The idea that the past was filled with white people is simply flawed, regardless of what Kelly says. “White” isn’t something that’s self-evident; it’s something that has a history. It had to be invented. And it had to be invented for a specific reason.
“White” is about power and privilege, not skin color
The people who created and policed the distinction between “white” and “other” were the ones who held the power – it was the colonizers, the scientists (Linnaeus, Haeckel, Vogt, Morton, and others), and the politicians (Jesse Helms, Strom Thurmond, Orval Faubus, etc.).
After the end of slavery in the U.S., legal segregation maintained distinctions between white and black for the purpose of keeping political and civil power away from those whose interests would oppose the (white) people in power. Mechanisms to disenfranchise African-descended people were based not just on physical appearance, but on ancestry. This odd system, called “hypodescent,” assigns people whose ancestry comes from multiple groups to the group with less prestige, regardless of their skin tone. So a person who had, say, one grandparent who was black would be classified as “black,” no matter how “white” they appeared. Hence, they might not be able to vote, hold office, get access to education, make a contract with a “white” person, and so on. We still feel vestiges of this today: Consider how President Obama is classified, despite having a white mother and a lightly pigmented skin tone.
Moreover, the category “white” (in the United States at least) has been enlarged over the past two centuries. Immigrant groups including Italians, Irish, Jews, and Scandinavians had to overcome barriers to social mobility and economic access based on the perception that they were dirty, lazy, uncultured, uncivilized, less intelligent, and other traits constructed in opposition to “white” society. “White” in this sense was not at all about skin color, but about class: access to resources and advancement and social capital was reserved for the already wealthy and privileged, the native-born, English- and Scottish-descended Protestant elites. By a variety of mechanisms, immigrant groups came to be included in the class-based construction of whiteness. Many of their descendants became the so-called “white ethnics” of today. For more on these cases, see David Roediger’s The Wages of Whiteness, Noel Ignatiev’s How The Irish Became White, and Karen Brodkin’s How Jews Became White Folks and What That Says About Race in America.
So to sum up: “White” as distinct from any other skin color does not exist in actual human populations; “white people” came about only in the past three centuries, and “whiteness” as an index to power and privilege is a construction that serves to exclude others based on a largely arbitrary difference.
None of this is particularly revolutionary or mysterious. Nor is it hard to teach. In my own experience, students ranging from intro-level college undergrads to graduate students in advanced seminars are profoundly interested in these ideas, and they get them on a visceral level. Yet last week we saw a highly compensated and visible news personality publicly stating the polar opposite of what anyone with a passing knowledge of history knows to be actual fact.
What’s the disconnect? Is the history of race simply not taught? That seems wrong to me, because it touches on so many other topics. You can’t teach the so-called Age of Exploration without discussing it. You can’t teach U.S. colonial or Civil War history without talking about it. You can’t teach a class in 20th Century America without talking about it.
And in case anyone thinks Kelly’s remarks were an isolated issue, realize that when people of power and prestige put out false narratives, it has an outsized impact. I point to a story out of New Mexico from this week, where a teacher is accused of chiding a black ninth-grader who dressed as Santa that “Santa is white.” I’m not suggesting this teacher got the idea from Fox News, rather that a lot of people are woefully misinformed about what “white” means, and why it means what it does.
I suggested up top that the problem with Megyn Kelly’s comment is that it highlights how little we know or think about race and whiteness. Kelly herself is highly educated – she has degrees from Syracuse University and Albany Law School, both of them very good institutions. But we clearly need to do a better job teaching, and we need to call out people in media that propagate bad history.
This is another post in my occasional “Friday Feature” series. Friday Features are published on (surprise!) Fridays, and are longer-form discussions of some aspect of archaeology, history, theory, etc., that doesn’t lend itself to a typical post. Friday Features are archived on a single page, linked at the top, for easy access.
Thanksgiving at my house brought a flurry of cooking, a spot of (very disappointing) football, and some quite tasty dinner and wine. But as usual, I had to go and complicate even the most mundane of tasks.
During the several hours of meal prep, I wondered more and more about one product in particular: corn syrup; specifically, Karo brand corn syrup. This ingredient stuck out for two reasons. First, it was integral to the pecan pie – one of my favorite desserts, and one that I make only during the holiday season. My wife and I have two recipes: one for a traditional pie, from my grandmother, and one for tiny pecan pie tarts called “Tassies,” from my wife’s grandmother. This year, we chose to try a third that we found online. But in all three, corn syrup is the most important ingredient, at least by volume. There’s more corn syrup than anything else, including pecans.
The second odd thing about the Karo was how much it differed from the rest of the ingredients in our dinner. We’re pretty non-traditional when it comes to Thanksgiving – no turkey as a rule, and this year, no meat at all, just veggie sides. And we tend to cook with lots of whole ingredients, so we spent most of Thanksgiving Day peeling sweet potatoes, stemming green beans, roasting walnuts, slicing kale leaves, and so on. Amid all that natural-y stuff, the corn syrup stood alone, with its bright label announcing its ingredients, nutritional information, and place of origin.
Glancing at the ingredients, all I learned was it contained “corn syrup.” I’ve been around lots of corn, and I’ve never noticed it to be particularly syrupy. So I investigated. I wanted to know what corn syrup was, where it came from, who made it, and why it’s so ubiquitous. Here’s what I found out. In order, I’ll discuss corn syrup itself, the Karo brand’s corporate owners, and how this particular commodity ended up in my pie filling.
Sweetness and corn syrup
Corn syrup is a sweetener derived from corn that’s liquid at room temperature and is often used in cooking because it doesn’t crystallize like refined sugar. Corn itself is high in starch, which when extracted can be chemically modified into various types of sweeteners. Corn syrup contains a high percentage of glucose, a simple plant sugar that is easily absorbed during digestion. A glance at the label of nearly any sweetened drink or packaged food (look for corn syrup or glucose syrup, same thing) will show how common an ingredient it is – though the similarly named high fructose corn syrup (HFCS) is a different animal entirely. (Sources: “Other caloric sweeteners” from The Sugar Association; “Corn Syrup” from the Kitchen Dictionary; “Corn Syrup” from Wikipedia. Note: The Sugar Association is a trade group for the U.S. sugar industry that is currently suing the Corn Refiners Association, linked below, over the latter’s claims about HFCS. Caveat emptor.)
Corn sweeteners represent a small but significant share of the total U.S. corn crop. In 2011/2012, the total crop yield was 12.3 billion bushels, of which corn sweeteners made up 6.5 percent. In comparison, alcohols (primarily ethanol fuel) represented 41.6 percent of that year’s crop; the rest went to animal feed and human consumption. (The sources for this and the statistics discussed below are the USDA Economic Research Service and the Farm Service Agency.)
In 2011/2012, the U.S. imported 3.2 million metric tons of raw cane sugar and produced 3.2 million metric tons, mostly in Florida and Louisiana (for simplicity, I’m excluding beet sugar, though it accounts for even more U.S. sugar production). Historically, most U.S. sugar imports have come from the Philippines, Brazil, and the Dominican Republic.
The U.S. protects domestic sugar growers and producers through tariffs and subsidies and import quotas (all summarized here and here). That has certainly helped increase domestic production, but, simply put, we sweeten so many things that domestic sugar alone can’t supply it all. Hence, corn sweeteners.
In terms of glucose syrup alone, U.S. manufacturers produced 4 million short tons in 2011, imported virtually none, exported just a fraction, and sent half of that production to food and beverage use. For HFCS, the numbers are even higher: 9.1 million short tons of production in 2011, of which the vast majority went into foods and beverages. Compare that with 5.5 million short tons of refined sugar in 2011 earmarked for “industrial use” (i.e., everything except the sugar you keep in the pantry or use at your local coffee shop), and it’s clear that domestic sugar is a small chunk of the U.S. sweetener landscape, with corn products owning the lion’s share. And in terms of consumer-level corn syrup, in the U.S., the market pretty much belongs to Karo.
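To make that “lion’s share” claim concrete, here’s a quick back-of-envelope calculation using the rounded 2011 figures above. This is a sketch only; the USDA category definitions (production vs. food-and-beverage deliveries, industrial use) don’t line up perfectly, so treat the result as an approximation.

```python
# Rough comparison of 2011 U.S. corn sweeteners vs. refined sugar,
# in millions of short tons, using the figures cited in the text.
glucose_syrup = 4.0    # glucose syrup production
hfcs = 9.1             # high-fructose corn syrup production
refined_sugar = 5.5    # refined sugar earmarked for "industrial use"

corn_total = glucose_syrup + hfcs
corn_share = corn_total / (corn_total + refined_sugar)

print(f"Corn sweeteners: {corn_total:.1f}M short tons")
print(f"Corn share of the sweetener total: {corn_share:.0%}")
```

By this rough accounting, corn products make up about 70 percent of the combined total, which is what the text means by the lion’s share.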
The Karo label tells me the bottle is a product of ACH Food Companies Inc. of Memphis, Tennessee. ACH makes a range of food products, including Fleischmann’s Yeast, Spice Island spices, and Mazola oils, in addition to Karo. Helpfully (and somewhat troublingly), its Web site notes that “under no circumstance does ACH support or condone the use of forced or slave labor for any human being, especially children” (source).
ACH, though, is itself part of a larger corporate structure. It is a U.S. subsidiary of Associated British Foods plc, one of the world’s largest producers of sugar products and baking yeast, with retail stores in Europe and private label brands including Twinings and Ovaltine (outside the U.S., where that brand is owned by Nestle).
Associated British Foods might sound familiar, as it has been in the news recently. Earlier this year, the company denied charges that another subsidiary moved profits out of Zambia to avoid paying corporate taxes (see the report, media accounts, and corporate responses linked here). More recently, Oxfam has alleged that Associated British Foods and other companies have knowingly worked to seize land from indigenous groups around the world.
That corporate connection isn’t obvious from the Karo label. And that’s by design. Part of the mystery of commodities is the radical separation that they seem to create between production and consumption. To put that another way, consider the way we make purchases. We go to a supermarket and are confronted by things to consume – things that appear to simply exist in finished form. The little act of buying obscures the long train of steps involved in making, distributing, and marketing commodities – steps that involve vast networks of interconnected people at each stage. We confront that train only at the endpoint, and it takes work to piece together the steps leading up to it.
At the point of purchase, though, I was concerned only with getting the right product, and, to a lesser extent, with cost. Because the whole commodity chain is obscured, we end up thinking that the cost of a product appears only at that moment of purchase. The fact is, there are costs associated with each link in the chain, and many of those costs are not directly economic ones.
A taste of the past
Neither my nor my wife’s grandmother knew or cared anything about ACH foods, tax dodging allegations, or global trade policy. They just knew how to make really great pecan pies. And the key to both their recipes was Karo corn syrup. But why corn syrup and not sugar, or honey, or maple syrup? Because they cooked the way they had been taught to cook, using ingredients that were broadly available and relatively cheap. So why is corn syrup widely available and relatively cheap, and how long has it been that way?
The folks at the Corn Refiners Association, Corn.org, give a bit of history:
In 1921, crystalline dextrose hydrate was introduced. Then in the mid-1950s, the technology for commercially preparing low conversion products such as maltodextrin and low DE syrups was developed. The purification and crystallization of dextrose meant for the first time that corn based sweeteners could compete in some markets that had been the sole domain of the sugar industry.
Statistics from the USDA Economic Research Service reveal the impact of corn syrup over time. Per-capita sweetener use, in pounds, was 119 in 1970, of which refined sugar accounted for 101 and glucose syrup for nearly 11. By 2000, per-capita use stood at nearly 149 pounds, with refined sugar accounting for 65, glucose syrup 15, and high-fructose corn syrup, 62 (the numbers are summarized in a simple format here).
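The shift is easier to see as shares of total per-capita use. Here’s a small sketch using the rounded pounds-per-person figures above (the text says “nearly 11” and “nearly 149,” so these are approximations):

```python
# Per-capita U.S. sweetener use in pounds, from the USDA ERS figures
# cited above (values rounded as in the text).
use_1970 = {"refined sugar": 101, "glucose syrup": 11}
total_1970 = 119
use_2000 = {"refined sugar": 65, "glucose syrup": 15, "HFCS": 62}
total_2000 = 149

sugar_share_1970 = use_1970["refined sugar"] / total_1970
sugar_share_2000 = use_2000["refined sugar"] / total_2000
corn_share_1970 = use_1970["glucose syrup"] / total_1970
corn_share_2000 = (use_2000["glucose syrup"] + use_2000["HFCS"]) / total_2000

print(f"Refined sugar share: {sugar_share_1970:.0%} (1970) to {sugar_share_2000:.0%} (2000)")
print(f"Corn sweetener share: {corn_share_1970:.0%} (1970) to {corn_share_2000:.0%} (2000)")
```

In thirty years, refined sugar went from roughly 85 percent of per-capita sweetener use to under half, while corn sweeteners went from under a tenth to roughly half.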
Of course, it is possible to make something similar to corn syrup – thick, sweet, and liquid – for cooking purposes, by boiling cane sugar in water, stirring, and chilling the liquid. That, though, is time-consuming, and in the past few decades would have become increasingly expensive relative to corn syrup, due to the increasing supply of the latter. Time and cost, then, work to favor corn syrup for this particular kitchen application.
Moreover, in cooking, there is an issue of taste and texture. Using sugar in place of corn syrup for this pecan pie would … well, I don’t know what it would do, simply because I’ve never tried it. I can imagine it would turn out runny and somewhat granular. That’s not necessarily bad, but it’s not how I remember my grandmother’s pies or my wife’s family’s Tassies turning out.
That’s more important than it might seem: Part of the allure of using a handed-down recipe is the sensory delight of tasting something from our past, and the intangible joy that comes from cooking “like granny did.” Our tastes are formed in childhood. As adults, we can re-enact the rituals that result in that remembered taste – it’s a powerful, multi-sensory, embodied sort of practice that’s deeply satisfying on many levels. What likely began in past generations as a convenience and cost-control measure has become, over the years, simply the way this recipe is done. That conclusion, though, hides the complex network of politics, trade, and economic factors that lurk behind this simple commodity.
Producing and consuming change
To say I was shocked that what seemed to be a simple purchase in fact connected me to a host of people and processes that stretch around the world and through time would be an understatement. Surprise led to analysis, which I’ve presented here (in an obviously simple, and probably overly simplistic, form).
If this reads like a call to boycott Karo, then I haven’t been clear. It’s overly simplistic to say that all the problems associated with capitalist provisioning can be solved by consumer boycotts, or even by hyper-local consumption. While both of those are probably good ideas, they don’t address the root problems here: a bottom-line focus in food production; rising prices combined with globally low wages; and the tendency to act as though we’re all radically unconnected to other producers and consumers. In other words, a real solution will have to take production and other social factors into account, as well as consumption.
Having said that, there’s value in educating ourselves about corporate food connections and the ways the commodities we consume are produced and distributed. The trick lies in translating knowledge into change, and that takes collective action on many fronts – political, economic, and social.
The 79th Annual Meeting of the Society for American Archaeology will include a section called “Blogging in Archaeology.” I won’t be able to make the meeting next year, but Doug Rocks-Macqueen at Doug’s Archaeology has come up with a cool way to both build buzz and expand the topic beyond the SAA session. He’s launched a blog carnival and invited archaeology bloggers to contribute in the months leading up to April. Each month, anyone who blogs archaeology is invited to respond to a prompt or question, and Doug will be curating the posts for easy access. Also, we’ll be using #BlogArch on Twitter both during the carnival and at the SAA session. Take a look at Doug’s November question/master post here.
To start things off, Doug asks a simple question: Why are you blogging? I’ve been doing this since February 2012, so it’s a good time, I think, to step back and consider the question.
Why are you blogging?
I noticed a couple of interesting things about this blog after looking back on the nearly two years I’ve been doing it. First is that it’s very easy to track the ebb and flow of my employment by looking at the posts I’ve made. Long stretches of silence during the meat of the academic years, and lots of action over summers and semester breaks. You could chalk that up to available time, but I think there’s more to it, and it touches on my answer to Doug’s question. I blog because I like doing it. And I like doing it because it makes me a better scholar and writer. And it makes me a better scholar and writer because it lets me stretch my intellectual and creative legs.
Whenever I teach a class with a large writing assignment, I give my students lots of leeway in choosing a topic. I tell them to write about things they care about. When you write with joy, it shows through. I try never to blog about something simply because I think I should – rather, I try to show my readers why something I think is important or interesting really is, by taking pleasure in my writing. So I blog more when I’m on break because even on break, I enjoy thinking and writing.
The second thing that’s become clear to me in considering Doug’s question is that this blog has shifted in its mission a bit. My About page lays out the vision I had in the beginning:
… the mission here is twofold: First, to highlight the many connections between past and present by drawing attention to and discussing historical material culture and documents; and second, to share my interest in historical objects and archives in the hope that some readers will come to share it.
I still think that’s true. But I’ve noticed that I’ve started to use this platform for more advocacy. Blogging is particularly good for advocacy work because it’s relatively immediate, it’s open, and it’s shareable. My second most-viewed post of all time, “A bad day for a relic hunter,” falls into this category. I’ve also explored issues in higher education, particularly regarding anthropology, like in my most-viewed post, “Anthropology is useless? Not to my students.” Finally, I’ve recently gone a lot more into issues of class and capitalism than I had intended to when I launched (as in here, and here). Aside from a chance to link to a few of my favorite writings, I think this highlights the value of blogging: You can tailor what you write to your interests. To me, teaching and advocacy and all that capitalism stuff is really intertwined. It all fascinates me, and like I said above, that means I write better. (At least that’s the goal: I leave to the readers to judge.) And because of the immediacy of blogging, I can explore ideas in a smaller, more informal format – blogging helps me clarify my thinking, and exposes me to helpful criticism of that thinking early on in the process.
Beyond the question
I’d like to extend Doug’s question a bit and ask aloud: Why aren’t more people in archaeology blogging? Last week I was at the annual convention of the American Anthropological Association, and I had the pleasure of meeting one of my favorite social-media-savvy archaeologists, Bob Muckle (he’s on Twitter @BobMuckle and writes a monthly column for Anthropology News). In a talk on the state of the field, Bob wondered aloud where all the American archaeologists are on social media and in the blogosphere. In part, this blog carnival should help make us a bit more visible. But his point is well taken: Why will Doug’s call draw from (likely) dozens of bloggers, rather than hundreds?
I suspect there are a couple of reasons. First, most American archaeologists work in cultural resource management, and there could be a fear of releasing information that might be proprietary or reflect badly on an employer. Second, and in line with the first, blogging takes time, and time is in short supply for CRM archaeologists, many of whom are not permanently employed. And third, blogging is not considered to be a direct benefit to academic archaeologists – it typically doesn’t count toward tenure, for example.
I think all of those reasons can be dealt with, and I think they’re all worthy of exploring (maybe in future carnival questions?). For now, I’ll open it to readers for suggestions or comments.
Halloween has come and gone, but the memories of this year’s crop of cheaply racist, sexist, and misogynistic costumes linger. It has become a seasonal ritual: For some, an awkward intrusion of uncomfortable politics into a lighthearted holiday; for others, a chance to make a point about “freedom” in the face of political correctness – via a set of cloth and plastic assembled by underpaid workers overseas, bought on credit at a store whose employees earn less than a living wage.
In an article titled 6 Most Wildly Offensive Halloween Costumes posted Oct. 30 at AlterNet.org, April De Costa takes the fact that costumes like “Sexy Indian Maiden,” “The Freshman 15,” and “Oriental Specs” exist, and proceeds to attack the imagined market, labeling hypothetical consumers as “obtuse jagoffs,” “human garbage,” and “everyday racists.” Her outrage is directed at the consumers, and it is overlaid with class hostility. These “racists” are not consuming the right things in the right way.
De Costa reserves little ire for the companies that directed and funded the production of the costumes in the first place. This is not surprising. Late capitalism, mainstream economics, and the culture industry all insist that production and consumption be treated as radically separate domains. Those domains are not created equal: Seemingly limitless choice confronts shoppers in an endless series of markets, each one promising to create distinction if only the right consumer choices are exercised. If occasionally those choices reveal one to be a racist, sexist, or misogynist, then facing the ire of the De Costas of the world is the price one must pay for “individual freedom.”
Meanwhile, labor is devalued through downsizing, offshoring, and industry consolidation; collective bargaining rights are undermined; and social safety nets are removed in service of neoliberal ideology. Workers, who might be united in the labor of production, are on their own. Consumers dissolve their natural ties of class and kin in pursuit of a distinctive self, validated by capitalism. In service of a consumption-driven ideology, the struggles of workers are erased, while the machinations of consumers are celebrated. More ominously, social ills like racism, sexism, and misogyny appear to arise because products that signify those traits are consumed, but the underlying circumstances of their production vanish.
As an alternative, following Bertell Ollman’s (2003) Dance of the Dialectic, we could interpret these racist, sexist, misogynistic Halloween costumes in light of the internal relations of production and consumption within which they are embedded. This sort of view necessarily considers production and consumption not as separate domains, and not as balanced forces, but as forces in contradiction. It might seem odd to suggest that, say, a plastic mask marketed as a Sexy Osama bin Laden costume is conjured into being by forces of global capitalism, but in fact it both reveals the perniciously mundane ways in which the system reproduces itself, and offers clues to how to fight it. Moreover, to ignore production, to place all the attention (and blame) on the consumer, works to further entrench the privilege that led to the creation and marketing of these racist, sexist, misogynist objects in the first instance.
Behind the masks
As far as I could tell from three hours of web searching, all of the costumes mentioned in the AlterNet article are sold by small, privately held companies based in the U.S., all of them apparently in the business of warehousing and shipping. In no case was I able to find information about country of origin of the costumes themselves, let alone any specific information about their manufacture or components. Emails to three companies inquiring about the place of manufacture of specific costumes went unanswered.
In light of this, I shall consider the Sexy Indian Maiden costume, sold, among other places online, at Amazon.com by a vendor called CostumeHub.com, to be representative of the lot. CostumeHub.com is a subsidiary of Best Service Stores Inc., an online retailer incorporated in 2006, based in Kansas City, Missouri. Best Service Stores’ “About” web page helpfully notes that its Lenexa, Kansas, warehouse contains “almost 150,000 feet of … space” and promises that “consumers will typically save themselves the hassle of returns, repairs and disappointment by purchasing the best product the first time around. We only offer products from companies we know and trust.” Those known and trustworthy companies are not named.
It is clear from the online text that Best Service Stores is not in the business of production. It notes: “Best Service Stores stands out from other web sites because we determine what consumers want and then find the best products at the best prices.” Manufacturing is not their business; the costumes, then, must be imported.
The bulk of garments imported to the United States are assembled in China, Vietnam, or Bangladesh (between 40 and 90 percent, according to 2012 figures from the U.S. International Trade Administration, depending on how “garment” is defined and the raw materials that comprise them). In 2010, the Institute for Global Labour and Human Rights reported that average minimum wages for garment workers in those three countries were $0.93 per hour (China), $0.52 per hour (urban Vietnam), $0.36 per hour (rural Vietnam), and $0.21 per hour (Bangladesh). In comparison, the Institute reports, the minimum hourly wage for U.S. garment workers ranges from $8.25 – $14.00 (not including benefits).
The “Sexy Indian Maiden” costume was selling on Amazon.com for $21.21 at the time of this writing (shortly before Halloween), marked down from $29 (“You save $7.79”!). Assuming, for argument’s sake, that it takes one hour to fashion the assemblage of polyester into a finished product, the retail price minus labor at the going rate for Chinese garment workers equals $20.28; the same figure at the going rate for U.S. garment workers ranges from $7.21 – $12.96. I do not consider what appear to be plastic-and-faux-wool boots pictured in the Amazon.com image in this calculation, as the web page notes “The boots do not come with this sassy American Indian costume for tweens.”
Regardless, not counting the cost of raw materials and maintenance (?) of the factory, the exporter, shippers, and retailer can split an additional 56 to 181 percent in profit by outsourcing production to China rather than the United States. Naturally there are environmental costs to such an extensive transportation network, and social costs to maintaining underemployment among domestic garment workers, but those costs are externalized – they’re not a factor to the seller, only to the larger society.
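For the curious, the margin arithmetic above can be checked with a few lines of Python. The prices and wage figures come from the text; the one-hour assembly time is, again, only an assumption made for argument’s sake:

```python
# Check the margin arithmetic for the "Sexy Indian Maiden" costume.
# Figures are from the text; the one-hour labor time is an assumption.
retail_price = 21.21   # Amazon.com price at time of writing
hours_of_labor = 1.0   # assumed, for argument's sake

hourly_wages = {
    "China": 0.93,
    "U.S. (low end)": 8.25,
    "U.S. (high end)": 14.00,
}

# Retail price minus labor cost at each going wage rate
margins = {place: retail_price - rate * hours_of_labor
           for place, rate in hourly_wages.items()}
for place, margin in margins.items():
    print(f"{place}: ${margin:.2f}")

# Extra margin captured by paying Chinese rather than U.S. wages
low = (margins["China"] / margins["U.S. (low end)"] - 1) * 100
high = (margins["China"] / margins["U.S. (high end)"] - 1) * 100
print(f"Additional margin from outsourcing: {low:.0f}% to {high:.0f}%")
```

Running this reproduces the figures in the text: margins of $20.28 (China) versus $7.21 – $12.96 (U.S.), an additional 56 to 181 percent.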
In a sense, it’s easy to ignore the circumstances of production of any commodity. They’re radically opaque, and the more you learn, the more troubled you become. The picture doesn’t get much more comfortable when you think about conditions surrounding consumption – going a bit deeper than simply dismissing costume buyers as “racists.”
Blame the ‘everyday racists’?
Halloween in the United States is an occasion for children and adults to fantasize, to playact, to act out in ways that are culturally acceptable. And during this once-yearly liminal period, the bounds of what is “culturally acceptable” are stretched. Costumed playacting is an indulgence rarely granted, and consequently imbued with additional meaning in light of its generally forbidden nature.
Some of the costumes that made AlterNet’s list play off deep-seated stereotypes that one encounters daily in U.S. culture, and others represent prejudices that are somewhat topical. For instance, the growing sense of outrage over racist Native American mascot caricatures like that of the Washington NFL team has led to a backlash in which some seem to revel in racist, colonialist language. In that light, it is easy to imagine (though not excuse) that some fraction of consumers would make a misguided attempt to play dress up in mockery of a topic that is current and controversial.
We are long past the days when Halloween costumes were home creations. It is rare that one has the time (between working multiple jobs, or working a job and going to school, or working overtime to compensate for company downsizing), or the resources. U.S. official unemployment stood at 7.2 percent in September 2013, with 40 percent of the unemployed out of work for 27 weeks or more, a figure that does not count an additional 2.3 million people who are “marginally attached” to the labor force (source here). In that light, the pressure on consumers to acquire an inexpensive entrée into this meaningful cultural touchstone is intense. Many are driven online, where few questions are asked about conditions of production, and even less information is readily available.
Those who fire up a browser gain access to one of the privileged spaces of late capitalism: The space in which individuality and “freedom” are validated by consumer choice. A shopper who types “Halloween costume” into the search field at Amazon.com finds, at this writing, 295,872 options. It’s dazzling, in its own way. Faced with seemingly limitless options, surely one’s choice represents one’s individuality?
Donning a costume promises two magical moments of self-enhancement: First, the association with the Other that accompanies the wearing; and second, the disassociation from that Other, in which the individual is enhanced and transformed through that double movement. What emerges is both different from one’s self, yet also, through the secret of the fetish, a contributor to one’s new self. One is no longer simply “That guy, John”; he’s now “That guy, John, who dressed up like a sexy Indian.”
That John’s freedom of choice, his validated self, required accepting the trappings of oppression does not bother him. Just as the conditions of the costume’s production were obscure, so were the conditions surrounding John’s consumption of it. This point is made repeatedly by the Frankfurt School critique of the culture industry. Herbert Marcuse notes in One-Dimensional Man that the dominant interests of any society demand repression and encode it in consumable products of culture, which are then bought through an apparently free exercise of choice and so act to perpetuate that repression. John’s consumption of that costume, regardless of his internal reasons for it, enacts for all of society a small drama of privilege and Otherness.
A better (costumed) future
As Marx made clear in Wage Labor and Capital, capital savings through reduction in labor costs force, in turn, greater production to offset lower selling prices, and hence the need to create larger markets to consume those lower-priced goods at a rate that ensures continued profit. This is the key to understanding production and consumption as antagonistic forces: Development of one undermines the other, forcing a quantitative transformation, which reaches but then surpasses the balance point, renewing the whole process.
The “Sexy Indian Maiden” costume is simply an element in that endless, snowballing cycle of contradiction. That does not mean it isn’t significant, and odious. It merely means that it doesn’t exist because a market of “human jagoffs” is clamoring for it. It exists because capitalism demands that it exist, and capitalism demands that consumers buy it. The particular form of the costume, in all its racist and sexist wonder, is the result of a scattershot attempt to push enough cultural buttons to hook enough consumers to cover the (very low) cost of its production. Anything beyond that is profit.
None of this is meant as an excuse for buying, let alone wearing or dressing children in, what are demonstrably repulsive costumes. It is easy to muster outrage over outfits like these. What the righteous fury misses, however, is the political economy that underlies their production and consumption. Blame for the costumes lies at least as much with those who direct the manufacture and sale as it does with the buyers – buyers who are, incidentally, even harder to find than information on the origin and components of the costumes.
Ultimately, racist, sexist, misogynist Halloween costumes are a (small) symptom of a social-economic system that is working exactly as it is intended to work. If we change the system, better costumes will be just the beginning.
Monday was a significant anniversary in the history of labor in New York State, as well as in the upstate city where I live. It was the 100th anniversary of the Binghamton Clothing Co. factory fire, which killed 31 people, mostly immigrant women, on July 22, 1913. To this day, it stands as the single greatest loss of life in Binghamton.
A little backstory first, drawn from speeches at the event today as well as the Wikipedia entry. Binghamton was once a significant manufacturing center, and downtown was home to factories making cigars, pianos, and clothing. It’s hard to imagine now, but thousands of people, largely immigrants who formed the Italian, Irish, and Eastern European communities that still exist here, traveled on streetcars to work downtown. The Binghamton Clothing Co. made men’s overalls, some of them treated with chemical waterproofers, and employed more than 100 workers, most of them women.
Shortly after 2 p.m. on that day, a worker noticed a fire in the basement, tried to douse it with a fire bucket, but was unable to stop the spread. It quickly climbed the single staircase up to the third and fourth floors, where the factory was. Within minutes, the building was engulfed by flames that burned so hot that firefighters couldn’t approach within 100 feet, and fire ladders themselves caught fire when they touched the building.
Before the building collapsed 20 minutes later, most of the workers had escaped down the single staircase. Two in particular are remembered to this day in Binghamton: Nellie Connor and Sidney Dimmock helped many escape, Connor by guiding women down the stairs, Dimmock by carrying people out. It cost both of them their lives. As the memorial plaque at the site notes: “Their names are worthy of honor and praise.”
Some women died after jumping from the roof. Twenty of the 31 dead were burned beyond identification: their names are known because the factory’s employee list survived the fire.
I was privileged to attend the memorial service at the site. It was a moving event, with firefighter bagpiping, prayers and a reading of the names. You can see some of the images in the gallery above.
The parallels to the Triangle Shirtwaist Factory Fire are eerie. In contrast to that deadly fire, which happened just over two years before, the Binghamton Clothing Co. held fire drills, and there were no locked doors that trapped workers. But in the wake of Triangle, New York State had launched a commission to investigate factory fires, and it issued a series of reports over the next few years. The commission investigated the Binghamton fire as well, and that event added a sense of urgency to its mission, helping it eventually put forward 20 regulations that New York adopted to establish stricter safety and occupational health rules for factory workers in the state.
As at Triangle, workers in Binghamton had to die to force factory bosses and money men to spend some money and effort ensuring the most basic safety measures were taken in their properties. The speakers at today’s event took pains to note that the factory owner in Binghamton was heartbroken, and devoted the rest of his life and fortune to caring for the victims’ families. I have no doubt the grief was real. But imagine how different today would have been if, after Triangle, he had taken it upon himself to make improvements in the factory.