Over at her excellent deathsplanation blog, Alison Atkin has laid out an incredibly thoughtful and professional letter to National Geographic International in response to the network’s plans to broadcast a
relic-hunting looting-for-profit “reality archaeology” program called Nazi War Diggers.
While my opposition to the program grew out of more or less simple professional anger at a show that seems to encourage illegal and unethical digging, Alison is an actual, honest-to-goodness osteoarchaeologist (someone who specializes in the archaeological recovery and interpretation of human bones), and
has done* research involving the identification and repatriation of victims of mass casualty events. Put another way, this program steps directly on her professional toes, and therefore I think she ought to be listened to.
She has drafted a letter of concern and sent it to several email accounts in an attempt to reach people at the National Geographic International network (no easy feat, as she explains). I highly recommend reading it, available at this link. Here’s an excerpt:
Regardless of whether these actions undertaken in the name of this programme were legal, they certainly were not ethical. The recovery and repatriation of human remains, from any context, requires the presence of individuals trained in archaeological and osteological techniques and methods. I will freely admit that I felt nauseous while watching the manner in which the human remains being excavated were treated – callously and without care. These are the remains of a human, who was a son, and possibly a brother, husband, father – who died during an unimaginably terrible war and horrific circumstances – and regardless of their nationality they deserve dignity in death. Anything less is completely inexcusable.
See what I mean? Quality stuff.
Go Alison go!
*Edit to clarify Atkin’s research history
The National Geographic Channel International (the U.S. version of which previously brought us the program Diggers) has greenlighted a show called, no kidding, Nazi War Diggers, which it, apparently un-ironically, refers to as a ‘factual series.’
The press surrounding the program is revealing. Let’s take a look at the “team” the National Geographic Channel International network has assembled (quoted from this post here):
Nazi War Diggers was shot in Poland and the Baltics and stars former U.S. Marine Craig Gottlieb; military expert Stephen Taylor; gadget guy Kris Rodgers; and Polish metal detectorist Adrian Kostromski.
A Marine, a “military expert,” a “gadget guy,” and a metal detectorist. All of those can be excellent occupations, and all of those occupations can be filled with people who have a deep knowledge of and respect for history and heritage and material culture. But none of them appear, on the surface, to be qualified to conduct what the network refers to as an “archaeological series.”
And the only video preview that has been released so far is, shall we say, not exactly comforting. (Link here. Caution: It portrays human remains). The outrage in the comments, at least, is encouraging. I’m no bioarchaeologist, but there are few human bones that are more distinctive than a femur. To mistake one for a humerus, as the “team” does, gives me pause.
As we’ve covered before in this space (see this post here, and also this one here, and also this one here), I object to programs like these not out of professional spite, but because they do incredible damage to the archaeological record that they claim to be working to preserve. The Conflict Antiquities blog presents 20 “urgent ethical and legal questions” surrounding the program that I, for one, would like to see the National Geographic Channel address. I also call BS on the network’s basic attempt to justify this program. A quote by executive Russell Barnes, from the same post linked above:
“The Eastern Front of World War II saw probably the bloodiest fighting in human history and time is running out for us to capture the historical truths of the conflict that lie literally hidden in the ground.”
Time is not “running out” at all. There’s no pressing threat to the archaeological record of WWII battlefields besides unchecked development, climate change, and the looting of the sites themselves. By broadcasting this program, the network is (indirectly, to be as kind as possible) encouraging more looting.
And then there’s the Nazi issue.
In his Portable Antiquity Collecting and Heritage Issues blog, archaeologist Paul Barford raises his own serious ethical questions about the series, focusing most interestingly on the Nazi aspect. Barford notes that one of the presenters is chairman of a war relics collecting group, which has a related forum filled with WWII German military regalia, and is sponsored by Nazi memorabilia collectors and re-enactors.
In my own work as an archaeologist who specializes in the African Diaspora, I deal with difficult and painful history and material culture. I have absolutely no respect for anyone who purports to “collect” memorabilia of the slave trade, the Jim Crow era, or other racist relics. The simple act of collecting or trading in objects of the past that are intimately connected with slavery and racism cannot possibly be considered a value-neutral act. The same applies to Nazi objects, as far as I can see. If Nazi artifacts really are in danger of being “lost” to formation processes, then perhaps the world would be a slightly better place for it. It’s certainly a quantitatively worse place if it tolerates Nazi artifacts being bought and sold for profit. Nazi War Diggers cannot help but encourage that.
I just returned from a splendid few days in beautiful (though icy) Quebec City, where I attended the annual Society for Historical Archaeology conference. While there, I gave a paper (with Maria Theresia Starzmann) titled “Techniques of Power and Archaeologies of the Contemporary Past,” as part of an excellent session on the political economy of identity.
We received a good response to the paper, and so I wanted to share a bit about it here. A few caveats first. This is not a peer-reviewed work; it’s an attempt by Maria and me to push our field’s debate over what constitutes responsible political engagement toward what’s termed praxis, or “theoretically informed action.” Our paper is based upon our reading of relevant literature, our own experience in the field, and our understanding of how archaeology defines the legitimate role of an academic knowledge producer. It has not, though, been subject to a thorough critique by multiple colleagues (“peer review”), so we ask that anyone who wishes to cite it please speak with us first. That said, we welcome any comments.
We’d like to publicly thank everyone who was in attendance and who gave us feedback after the session.
EDIT: My mistake, I did not have clearance to post the paper. The edits above reflect this.
The WordPress.com stats helper monkeys prepared a 2013 annual report for this blog.
Here’s an excerpt:
A New York City subway train holds 1,200 people. This blog was viewed about 4,500 times in 2013. If it were a NYC subway train, it would take about 4 trips to carry that many people.
Last week, Fox News personality Megyn Kelly announced on her program as a true fact that Santa Claus and Jesus were white. You can see the segment here, and as for the reactions, they run the gamut from Fox fellow Bill O’Reilly totally agreeing with her, to a reasoned piece in The Atlantic by Jonathan Merritt, who notes that her comments are both bad history and bad theology.
Kelly herself has protested (here, and here) that her comments were meant to be light-hearted, and dismissed her critics as “race-baiters.” It was for the children, Kelly said last week on the segment in question: “For all you kids watching at home, Santa just is white, but this person is just arguing that maybe we should also have a black Santa.”
Leaving aside Kelly’s severe misunderstanding of her network’s viewer demographics, something’s being missed amid the outrage. As a person who studies and teaches about things like race and history and civil society, I feel the need to weigh in. To me, Santa-Is-White-Gate points to a major failure in our understanding of race and whiteness.
The most obvious problem with Kelly’s comment is that, if we take “white” to mean skin pigmentation that’s more or less similar to that of Kelly and the three others on her panel (see video above), then neither Jesus nor the historical bishop that Santa is modeled on would fit the bill. Both first-century Judea and fourth-century Anatolia were cosmopolitan regions that had seen centuries of population movement and gene flow between people from North and East Africa, Southwest and Central Asia, and Europe. As Merritt notes in his Atlantic article, “If he were taking the red-eye flight from San Francisco to New York today, Jesus might be profiled for additional security screening by TSA.” Same for Saint Nicholas.
But that’s not my point here. Kelly’s remarks are flawed on a deeper level, one that revolves around three related issues.
Skin color varies even among closely related people
The idea that “races” of people fall into neat little categories – white, black, brown, yellow, red – is a conceit that has no relationship to actual human variation. The idea traces back to Linnaeus, who assigned attributes to different populations on the basis of generalizations about skin tone. The problem is, human variation in all traits, including skin color, exists on a continuum. Trying to draw rigid lines of color falls apart with any large sample.
If you were to look at a random person from Nigeria and a random person from Norway, for instance, you’re likely to see a clear difference in skin color. But if you were to look at everyone in the Nigerian’s population, you’d find a huge amount of variation in skin tone – some darker, some lighter – and you might realize that classifying everyone as “black” ignores those who are of lighter skin. The same would be true in reverse for the Norwegian’s population – everyone would have relatively lighter skin than our African sample, but that lightness would exhibit variation itself.
This idea is illustrated very well in an interactive from the American Anthropological Association. Give it a try: where do you draw the line? And for a really good, accessible book that explores this and related topics, I highly recommend Jon Marks’ Alternative Introduction to Biological Anthropology (note: this is an uncompensated recommendation; I don’t get any money if you click that link or buy the book).
If we acknowledge the reality of variation in skin tone, then what Kelly insisted is a true fact seems much less, well, truthful and factual. But it gets better.
The idea of “white people” is a fairly recent concept
It’s no mystery why Linnaeus decided to assign and group people by skin color. He was writing at the height of European colonization of Africa, Asia, and the Americas. Since the 15th century, Europeans had been encountering people of very different appearances, habits, customs, and beliefs. Classifying those “others” became the necessary first step to controlling them. “White” was how the colonizers distinguished between themselves and the colonized.
But it didn’t happen overnight. In fact, there was quite a bit of overlap between color, language, and religion early in the colonization process. For instance, the Spanish missionary Bartolomé de las Casas argued that conversion to Christianity meant Native Americans could not be enslaved and worked to death. Religion, in that sense, trumped color.
In the early years of the Virginia Colony, indentured servants could be of European or African descent. They lived and worked side by side, mated and married, and were able to secure freedom and property after their indenture. Status, in other words, depended not on skin color, but on one’s status as a servant or a landholder. Over that first century, color-based divisions were created by virtue of court rulings, gradually erasing indenture as a status for African-descended people (and then Europeans), and replacing it with a system of lifetime slavery based on color. For more on this, see Edmund Morgan’s American Slavery, American Freedom and Ira Berlin’s Generations of Captivity.
The point is that “white” was created by law and custom and “science”; it wasn’t a status that had meaning to people in the sense that we understand it today until well into the 18th century. Going back to the ancient world, there’s very little evidence that color was considered an indicator of difference or ability. Nubians ruled Egypt, Romans married North Africans, and so on.
The idea that the past was filled with white people is simply flawed, regardless of what Kelly says. “White” isn’t something that’s self-evident; it’s something that has a history. It had to be invented. And it had to be invented for a specific reason.
“White” is about power and privilege, not skin color
The people who created and policed the distinction between “white” and “other” were the ones who held the power – it was the colonizers, the scientists (Linnaeus, Haeckel, Vogt, Morton, and others), and the politicians (Jesse Helms, Strom Thurmond, Orval Faubus, etc).
After the end of slavery in the U.S., legal segregation maintained distinctions between white and black for the purpose of keeping political and civil power away from those whose interests would oppose the white folks in power. Mechanisms to disenfranchise African-descended people were based not just on physical appearance, but on ancestry. This odd system, called “hypodescent,” assigns people whose ancestry comes from multiple groups to the group with less prestige, regardless of their skin tone. So a person who had, say, one grandparent who was black would be classified as “black,” no matter how “white” they appeared. Hence, they might not be able to vote, hold office, get access to education, make a contract with a “white” person, and so on. We still feel vestiges of this today: Consider how President Obama is classified, despite having a white mother and a lightly pigmented skin tone.
Moreover, exactly who is included in the category “white” (in the United States at least) has expanded over the past two centuries. Immigrant groups including Italians, Irish, Jews, and Scandinavians had to overcome barriers to social mobility and economic access based on the perception that they were dirty, lazy, uncultured, uncivilized, less intelligent, and other traits constructed in opposition to “white” society. “White” in this sense was not at all about skin color, but about class – access to resources and advancement and social capital was reserved for the already wealthy and privileged: the native-born, English- and Scottish-descended Protestant elites. By a variety of mechanisms, immigrant groups came to be included in the class-based construction of whiteness. Many of their descendants became the so-called “white ethnics” of today. For more on these cases, see David Roediger’s The Wages of Whiteness, Noel Ignatiev’s How The Irish Became White, and Karen Brodkin’s How Jews Became White Folks and What That Says About Race in America.
So to sum up: “White” as distinct from any other skin color does not exist in actual human populations; “white people” came about only in the past three centuries, and “whiteness” as an index to power and privilege is a construction that serves to exclude others based on a largely arbitrary difference.
None of this is particularly revolutionary or mysterious. Nor is it hard to teach. In my own experience, students ranging from intro-level college undergrads to graduate students in advanced seminars are profoundly interested in these ideas, and they get them on a visceral level. Yet last week we saw a highly compensated and visible news personality publicly stating the polar opposite of what anyone with a passing knowledge of history knows to be actual fact.
What’s the disconnect? Is the history of race simply not taught? That seems wrong to me, because it touches on so many other topics. You can’t teach the so-called Age of Exploration without discussing it. You can’t teach U.S. colonial or Civil War history without talking about it. You can’t teach a class in 20th Century America without talking about it.
And in case anyone thinks Kelly’s remarks were an isolated issue, realize that when people of power and prestige put out false narratives, it has an outsized impact. I point to a story out of New Mexico from this week, where a teacher is accused of chiding a black ninth-grader who dressed as Santa that “Santa is white.” I’m not suggesting this teacher got the idea from Fox News, rather that a lot of people are woefully misinformed about what “white” means, and why it means what it does.
I suggested up top that the problem with Megyn Kelly’s comment is that it highlights how little we know or think about race and whiteness. Kelly herself is highly educated – she has degrees from Syracuse University and Albany Law School, both of them very good institutions. But we clearly need to do a better job teaching, and we need to call out people in media who propagate bad history.
This is another post in my occasional “Friday Feature” series. Friday Features are published on (surprise!) Fridays, and are longer-form discussions of some aspect of archaeology, history, theory, etc., that doesn’t lend itself to a typical post. Friday Features are archived on a single page, linked at the top, for easy access.
Thanksgiving at my house brought a flurry of cooking, a spot of (very disappointing) football, and some quite tasty dinner and wine. But as usual, I had to go and complicate even the most mundane of tasks.
During the several hours of meal prep, I wondered more and more about one product in particular: Corn syrup, specifically, Karo brand corn syrup. This ingredient stuck out for two reasons. First, it was integral to the pecan pie – one of my favorite desserts, and one that I make only during the holiday season. My wife and I have two recipes: One for a traditional pie, from my grandmother; one for tiny pecan pie tarts called “Tassies” that’s from my wife’s grandmother. This year, we chose to try a third one that we found online. But in all three, corn syrup is the most important ingredient, at least by volume. There’s more corn syrup than anything else, including pecans.
The second odd thing about the Karo was how much it differed from the rest of the ingredients in our dinner. We’re pretty non-traditional when it comes to Thanksgiving – no turkey as a rule, and this year, no meat at all, just veggie sides. And we tend to cook with lots of whole ingredients, so we spent most of Thanksgiving Day peeling sweet potatoes, stemming green beans, roasting walnuts, slicing kale leaves, and so on. Amid all that natural-y stuff, the corn syrup stood alone, with its bright label announcing its ingredients, nutritional information, and place of origin.
Glancing at the ingredients, all I learned was it contained “corn syrup.” I’ve been around lots of corn, and I’ve never noticed it to be particularly syrupy. So I investigated. I wanted to know what corn syrup was, where it came from, who made it, and why it’s so ubiquitous. Here’s what I found out. In order, I’ll discuss corn syrup itself, the Karo brand’s corporate owners, and how this particular commodity ended up in my pie filling.
Sweetness and corn syrup
Corn syrup is a sweetener derived from corn, that’s liquid at room temperature, and is often used in cooking because it doesn’t crystallize like refined sugar. Corn itself is high in starch, which when extracted can be chemically modified into various types of sweeteners. Corn syrup contains a high percentage of glucose, a simple plant sugar that is easily absorbed during digestion. A glance at the label of nearly any sweetened drink or packaged food (look for corn syrup or glucose syrup, same thing) will show how common an ingredient it is – though the similarly named high fructose corn syrup (HFCS) is a different animal entirely. (Sources: “Other caloric sweeteners” from The Sugar Association; “Corn Syrup” from the Kitchen Dictionary; “Corn Syrup” from Wikipedia. Note: The Sugar Association is a trade group for the U.S. sugar industry that is currently suing the Corn Refiners Association, linked below, over the latter’s claims about HFCS. Caveat emptor.)
Corn sweeteners represent a small but significant destination of the total U.S. corn crop. In 2011/2012, the total crop yield was 12.3 billion bushels, of which corn sweeteners made up 6.5 percent. In comparison, alcohols (primarily ethanol fuel) represented 41.6 percent of that year’s crop; the rest went to animal feed and human consumption. (The sources for this and the statistics discussed below are the USDA Economic Research Service and the Farm Service Agency.)
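The percentages above can be turned into rough bushel counts with a quick back-of-the-envelope calculation. This is just a sketch using the figures quoted in this post; the category labels are mine, not USDA terminology:

```python
# Rough allocation of the 2011/2012 U.S. corn crop, using the figures
# cited above: 12.3 billion bushels total, 6.5% to corn sweeteners,
# 41.6% to alcohols (primarily ethanol fuel).

total_bushels = 12.3e9

shares = {
    "sweeteners": 0.065,
    "alcohols": 0.416,
}

# Multiply each share by the total crop to get absolute bushels.
bushels = {use: total_bushels * share for use, share in shares.items()}

for use, amount in bushels.items():
    print(f"{use}: {amount / 1e9:.2f} billion bushels")
```

That works out to roughly 0.8 billion bushels for sweeteners against more than 5 billion for alcohols – sweeteners are a small slice of the crop, but still an enormous quantity of corn.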
In 2011/2012, the U.S. imported 3.2 million metric tons of raw cane sugar and produced 3.2 million metric tons, mostly in Florida and Louisiana (For simplicity, I’m excluding beet sugar, though it accounts for even more U.S. sugar production). Historically, most U.S. sugar imports have come from the Philippines, Brazil, and the Dominican Republic.
The U.S. protects domestic sugar growers and producers through tariffs and subsidies and import quotas (all summarized here and here). That has certainly helped increase domestic production, but, simply put, we sweeten so many things that domestic sugar alone can’t supply it all. Hence, corn sweeteners.
In terms of glucose syrup alone, U.S. manufacturers produced 4 million short tons in 2011, imported virtually none, exported just a fraction, and sent half of that production to food and beverage use. For HFCS, the numbers are even higher: 9.1 million short tons of production in 2011, of which the vast majority went into foods and beverages. Compare that with 5.5 million short tons of refined sugar in 2011 earmarked for “industrial use” (i.e., everything except the sugar you keep in the pantry or use at your local coffee shop), and it’s clear that domestic sugar is a small chunk of the U.S. sweetener landscape, with corn products owning the lion’s share. And in terms of consumer-level corn syrup, in the U.S., the market pretty much belongs to Karo.
The Karo label tells me the bottle is a product of ACH Food Companies Inc. of Memphis, Tennessee. ACH makes a range of food products, including Fleischmann’s Yeast, Spice Island spices, and Mazola oils, in addition to Karo. Helpfully (and somewhat troublingly), its Web site notes that “under no circumstance does ACH support or condone the use of forced or slave labor for any human being, especially children” (source).
ACH, though, is itself part of a larger corporate structure. It is a U.S. subsidiary of Associated British Foods plc, one of the world’s largest producers of sugar products and baking yeast, with retail stores in Europe and private label brands including Twinings and Ovaltine (outside the U.S., where that brand is owned by Nestle).
Associated British Foods might sound familiar, as it has been in the news recently. Earlier this year, the company denied charges that another subsidiary moved profits out of Zambia to avoid paying corporate taxes (see the report, media accounts, and corporate responses linked here). More recently, Oxfam has alleged that Associated British Foods and other companies have knowingly worked to seize land from indigenous groups around the world.
That corporate connection isn’t obvious from the Karo label. And that’s by design. Part of the mystery of commodities is the radical separation that they seem to create between production and consumption. To put that another way, consider the way we make purchases. We go to a supermarket and are confronted by things to consume – things that appear to simply exist in finished form. The little act of buying obscures the long train of steps involved in making, distributing, and marketing commodities – steps that involve vast networks of interconnected people at each stage. We confront that train only at the endpoint, and it takes work to piece together the steps leading up to it.
At the point of purchase, though, I was concerned only with getting the right product, and, to a lesser extent, with cost. Because the whole commodity chain is obscured, we end up thinking that the cost of a product appears only at that moment of purchase. The fact is, there are costs associated with each link in the chain, and many of those costs are not directly economic ones.
A taste of the past
Neither my nor my wife’s grandmother knew or cared anything about ACH foods, tax dodging allegations, or global trade policy. They just knew how to make really great pecan pies. And the key to both their recipes was Karo corn syrup. But why corn syrup and not sugar, or honey, or maple syrup? Because they cooked the way they had been taught to cook, using ingredients that were broadly available and relatively cheap. So why is corn syrup widely available and relatively cheap, and how long has it been that way?
The folks at the Corn Refiners Association, Corn.org, give a bit of history:
In 1921, crystalline dextrose hydrate was introduced. Then in the mid-1950s, the technology for commercially preparing low conversion products such as maltodextrin and low DE syrups was developed. The purification and crystallization of dextrose meant for the first time that corn based sweeteners could compete in some markets that had been the sole domain of the sugar industry.
Statistics from the USDA Economic Research Service reveal the impact of corn syrup over time. Per-capita sweetener use, in pounds, was 119 in 1970, of which refined sugar accounted for 101 and glucose syrup for nearly 11. By 2000, per-capita use stood at nearly 149 pounds, with refined sugar accounting for 65, glucose syrup 15, and high-fructose corn syrup, 62 (the numbers are summarized in a simple format here).
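The shift in those USDA numbers is easier to see as shares. Here's a small sketch using only the per-capita figures quoted above (note the post's totals of 119 and 149 pounds include other sweeteners beyond the three tracked here, so I sum just the tracked categories):

```python
# Per-capita U.S. sweetener use in pounds, per the USDA ERS figures
# cited above (refined sugar, glucose syrup, and HFCS only).
per_capita = {
    1970: {"refined sugar": 101, "glucose syrup": 11, "HFCS": 0},
    2000: {"refined sugar": 65, "glucose syrup": 15, "HFCS": 62},
}

for year, kinds in per_capita.items():
    corn = kinds["glucose syrup"] + kinds["HFCS"]  # both corn-derived
    tracked = sum(kinds.values())
    print(f"{year}: corn sweeteners {corn} lb of {tracked} lb tracked "
          f"({corn / tracked:.0%})")
```

By this accounting, corn-derived sweeteners go from about a tenth of tracked per-capita use in 1970 to more than half by 2000 – which is the story behind Karo's ubiquity.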
Of course, it is possible to make something similar to corn syrup – thick, sweet, and liquid – for cooking purposes, by boiling cane sugar in water, stirring, and chilling the liquid. That, though, is time-consuming, and in the past few decades would have become increasingly expensive relative to corn syrup, due to the increasing supply of the latter. Time and cost, then, work to favor corn syrup for this particular kitchen application.
Moreover, in cooking, there is an issue of taste and texture. Using sugar in place of corn syrup for this pecan pie would … well, I don’t know what it would do, simply because I’ve never tried it. I can imagine it would turn out runny and somewhat granular. That’s not necessarily bad, but it’s not how I remember my grandmother’s pies or my wife’s family’s Tassies turning out.
That’s more important than it might seem: Part of the allure of using a handed-down recipe is the sensory delight of tasting something from our past, and the intangible joy that comes from cooking “like granny did.” Our tastes are formed in childhood. As adults, we can re-enact the rituals that result in that remembered taste – it’s a powerful, multi-sensory, embodied sort of practice that’s deeply satisfying on many levels. What likely began in past generations as a convenience and cost-control measure has become, over the years, simply the way this recipe is done. That conclusion, though, hides the complex network of politics, trade, and economic factors that lurk behind this simple commodity.
Producing and consuming change
To say I was shocked that what seemed to be a simple purchase in fact connected me to a host of people and processes that stretch around the world and through time would be an understatement. Surprise led to analysis, which I’ve presented here (in obviously a simple, and probably overly simplistic, form).
If this reads like a call to boycott Karo, then I haven’t been clear. It’s overly simplistic to say that all the problems associated with capitalist provisioning can be solved by consumer boycotts, or even by hyper-local consumption. While both of those are probably good ideas, they don’t address the root problems here: a bottom-line focus in food production; rising prices combined with globally low wages; and the tendency to act as though we’re all radically unconnected to other producers and consumers. In other words, a real solution will have to take production and other social factors into account, as well as consumption.
Having said that, there’s value in educating ourselves about corporate food connections and the ways the commodities we consume are produced and distributed. The trick lies in translating knowledge into change, and that takes collective action on many fronts – political, economic, and social.
The 79th Annual Meeting of the Society for American Archaeology will include a section called “Blogging in Archaeology.” I won’t be able to make the meeting next year, but Doug Rocks-Macqueen at Doug’s Archaeology has come up with a cool way to both build buzz and expand the topic beyond the SAA session. He’s launched a blog carnival and invited archaeology bloggers to contribute in the months leading up to April. Each month, anyone who blogs archaeology is invited to respond to a prompt or question, and Doug will be curating the posts for easy access. Also, we’ll be using #BlogArch on Twitter both during the carnival and at the SAA session. Take a look at Doug’s November question/master post here.
To start things off, Doug asks a simple question: Why are you blogging? I’ve been doing this since February 2012, so it’s a good time, I think, to step back and consider the question.
Why are you blogging?
I noticed a couple of interesting things about this blog after looking back on the nearly two years I’ve been doing it. First is that it’s very easy to track the ebb and flow of my employment by looking at the posts I’ve made. Long stretches of silence during the meat of the academic years, and lots of action over summers and semester breaks. You could chalk that up to available time, but I think there’s more to it, and it touches on my answer to Doug’s question. I blog because I like doing it. And I like doing it because it makes me a better scholar and writer. And it makes me a better scholar and writer because it lets me stretch my intellectual and creative legs.
Whenever I teach a class with a large writing assignment, I give my students lots of leeway in choosing a topic. I tell them to write about things they care about. When you write with joy, it shows through. I try never to blog about something simply because I think I should – rather, I try to show my readers why something I think is important or interesting really is, by taking pleasure in my writing. So I blog more when I’m on break because even on break, I enjoy thinking and writing.
The second thing that’s become clear to me in considering Doug’s question is that this blog has shifted in its mission a bit. My About page lays out the vision I had in the beginning:
… the mission here is twofold: First, to highlight the many connections between past and present by drawing attention to and discussing historical material culture and documents; and second, to share my interest in historical objects and archives in the hope that some readers will come to share it.
I still think that’s true. But I’ve noticed that I’ve started to use this platform for more advocacy. Blogging is particularly good for advocacy work because it’s relatively immediate, it’s open, and it’s shareable. My second most-viewed post of all time, “A bad day for a relic hunter,” falls into this category. I’ve also explored issues in higher education, particularly regarding anthropology, like in my most-viewed post, “Anthropology is useless? Not to my students.” Finally, I’ve recently gone a lot more into issues of class and capitalism than I had intended to when I launched (as in here, and here). Aside from a chance to link to a few of my favorite writings, I think this highlights the value of blogging: You can tailor what you write to your interests. To me, teaching and advocacy and all that capitalism stuff is really intertwined. It all fascinates me, and like I said above, that means I write better. (At least that’s the goal: I leave to the readers to judge.) And because of the immediacy of blogging, I can explore ideas in a smaller, more informal format – blogging helps me clarify my thinking, and exposes me to helpful criticism of that thinking early on in the process.
Beyond the question
I’d like to extend Doug’s question a bit, and ask aloud: why aren’t more people in archaeology blogging? Last week I was at the annual convention of the American Anthropological Association, and I had the pleasure of meeting one of my favorite social-media-savvy archaeologists, Bob Muckle (he’s on Twitter @BobMuckle and writes a monthly column for Anthropology News). In a talk on the state of the field, Bob wondered aloud where all the American archaeologists are on social media and in the blogosphere. In part, this blog carnival should help make us a bit more visible. But his point is well-taken: Why will Doug’s call draw (likely) dozens of bloggers, rather than hundreds?
I suspect there are a couple of reasons. First, most American archaeologists work in cultural resource management, and there could be a fear of releasing information that might be proprietary or reflect badly on an employer. Second, and in line with the first, blogging takes time, and time is in short supply for CRM archaeologists, many of whom are not permanently employed. And third, blogging is not considered to be a direct benefit to academic archaeologists – it typically doesn’t count toward tenure, for example.
I think all of those reasons can be dealt with, and I think they’re all worthy of exploring (maybe in future carnival questions?). For now, I’ll open it to readers for suggestions or comments.