The Decline of Rural and Small-Town America and its Social Implications

Photo credit: Creative Commons Zero (CC0)

Last fall I spent several weeks in Berkshire County, Massachusetts, where I had lived for more than three decades prior to my relocation to a new job in Santa Fe four years ago.  Berkshire County is the farthest west and most rural county of Massachusetts.  For New Yorkers and Bostonians, the Berkshires are known for their fields, forests, and outstanding cultural amenities, including the Tanglewood Music Festival in Lenox, the Clark Art Institute and Williams College in Williamstown, and the Massachusetts Museum of Contemporary Art in North Adams.

These are all wealthy, world-class institutions, yet under the glitz lies a darker reality.  As a recent article in the Boston Globe points out, poverty in Berkshire County has risen by nearly a third since 2000.  The median age is rising as younger people leave for places with better job prospects, meaning that the population will continue to grow older, sicker, and poorer in the coming years.  I caught a glimpse of this in a visit to Berkshire Mall in Lanesboro, the only major shopping center within 30 miles of Williamstown.  Having been away for a while, I was shocked by the mall’s post-apocalyptic vibe now that most of its anchor stores have packed up and left.  The county’s economic decline helps to explain why the value of the house that my wife and I still own in Williamstown has declined by as much as 20 percent in the past five years.

To some extent the hollowing out of a place that I love has been under way for decades.  As in many once-prosperous smaller towns and villages in the Northeast, factories here began to close nearly a century ago, a process that accelerated in the middle of the twentieth century.  In the Berkshires, manufacturers of textiles, shoes, furniture, plastics, and electronics moved south, then offshore.  Despite this decline, real estate prices rose dramatically in the 1980s and 90s.  The Great Recession put a stop to that, and the economic arc has trended down ever since.

Journalists and, to a lesser extent, social scientists have now begun to take notice of this situation and assess its implications.  (For examples, see this and this.)  What caught the eye of many of them was the impact that rural and small-town voters had on the election of Donald Trump in 2016.  According to the Washington Post, rural counties favored Trump by 26 points, whereas urban ones voted for Hillary Clinton by a 32-point margin.  (Berkshire County was an exception to that pattern, favoring Clinton 67.5% to 27%, a majority consistent with Massachusetts as a whole.)

So far, I haven’t seen much ethnography focused on dying towns and rural areas.  Notable exceptions include Christine Walley’s Exit Zero (which documents the travails of a deindustrializing urban neighborhood rather than a rural one) and Arlie Russell Hochschild’s much-celebrated Strangers in Their Own Land.  Reaching back to a time before the current political kerfuffle are books like Katie Stewart’s memorable A Space on the Side of the Road (1996).  There are doubtless other insightful works with which I’m not familiar.  Still, it’s hard not to get the sense that most ethnographers prefer to embed with embattled urban minorities—African Americans, Latinos, heroin addicts, LGBTQ youth—rather than with alienated and often angry white people hunkered down in blighted communities.  There is little question, though, that their sense of economic and social abandonment is a major factor in our nation’s current political malaise.  People who feel that they have nothing to lose aren’t likely to put much stock in the niceties of civil debate and dignified leadership.

For an upbeat view of a small American town that has managed to maintain its vitality despite the social and economic headwinds, don’t miss Larissa MacFarquhar’s article on Orange City, Iowa (“Where the Small-Town American Dream Lives On,” published in the New Yorker in November 2017).  As MacFarquhar points out, Orange City is about as politically conservative a place as one can find in the US, yet it doesn’t seem to have embraced the bonkers anti-government and anti-immigrant ideology that has gained traction in other parts of the country.  Therein lies a ray of hope.

Addendum, 2/19/2018.  Just happened upon a new book by sociologist Robert Wuthnow that addresses the sense of abandonment that afflicts much of rural America.  Wuthnow is one of the most prolific and reliably insightful sociologists of American life working today.  A book not to be missed by anyone interested in this issue.

Time to Kill the “Anthropology of X” Meme

It’s usually a bad idea to vent pet peeves in a blog, but I’m going to do it anyway.

My complaint is about the persistent use of “The Anthropology of X” in the title of books and articles.  Feel free to insert the X of your choice:  disability, food, work, policing, lowriders, cell phones, rock videos, lawn care . . .  This usage is arguably the laziest intellectual move in our profession and one of the reasons why I sometimes despair for the future of anthropology, a discipline that needs all the help it can get at a political moment increasingly hostile to what we do and how we do it.

This regrettable disciplinary tic has a long history, going back at least to the mid-nineteenth century.  The early use of the term made sense because it was usually applied to places, as in “On the Comparative Anthropology of Scotland” (an 1865 paper by Hector Maclean).  This is forgivable because the discipline was still in its infancy, and the author needed to communicate the fact that new methods were being deployed to shed light on the history and culture of a particular region.

The notion that “the anthropology of” communicates something specific retains a vestigial validity, I suppose, if one accepts that the document so named is likely to take a comparative, cross-cultural approach and that it’s based on ethnography to a greater or lesser degree.  (Today many anthropologists would disagree even with that minimalist characterization of anthropological methods, especially comparison.)  Mostly it says, “This was written by an anthropologist.”

And therein lies the problem.  Most general readers—and, I suspect, many academic readers as well—couldn’t care less about a writer’s disciplinary affiliation.  They care far more about whether the author brings a fresh, insightful perspective to an important topic.  Alas, this isn’t conveyed at all by “The anthropology of X” except in the increasingly rare case when the topic under consideration has never before been explored by an anthropologist, as in The Anthropology of Puff Pastry.  For all I know, an anthropologist has already written about that, too.

I came to this distaste for “The Anthropology of” during three and a half decades of collaborating with sociologists in a joint undergraduate department in which relations were mostly cordial and creative, largely because both wings were committed to ethnographic methods.  Although we maintained two separate majors, for administrative as well as intellectual reasons we decided to downplay the disciplinary divide except in a handful of courses—e.g., SOC 101 Invitation to Sociology.  So a course that at other colleges and universities might be called “Medical Anthropology” became “Illness & Healing in Comparative Perspective.”  A course that could be labeled “The Sociology of Consumer Society” became “Culture, Consumption, & Modernity.”  This liberated faculty and students from disciplinary constraints while encouraging everyone to focus on a range of analytical approaches relevant to the issues.

Ironically, this downplaying of disciplinary boundaries made our majors more rather than less attractive to graduate programs in those cases when students decided to pursue advanced degrees.  After all, doctoral programs generally do a good job of refashioning students into anthropologists, sociologists, etc.  But at the admissions stage, they want broadly educated applicants with demonstrated skills in thinking, writing, and research methods.

The most creative writers in anthropology avoid the “Anthropology of” meme, especially if they want their books to be widely read.  Sahlins and Graeber could have titled their recent book The Anthropology of Kingship, but they wisely settled for On Kings.  Tanya Luhrmann could have called her much-praised 2012 ethnography The Anthropology of Evangelical Prayer (yawn!) rather than When God Talks Back—although I’m confident that her editor at Vintage would have nixed the “Anthropology of” title in a nanosecond.

What’s at stake isn’t just titles, although they matter.  It’s making the effort to broaden one’s audience and write in a way that reaches them.  When writing for a journal with a narrowly disciplinary audience, “The anthropology of” is fairly harmless even if it flirts with cliché.  For work that aspires to broader impact, however, it qualifies as a missed opportunity.  Let’s lay it to rest in a remote cemetery devoted to expired memes.

Cultural appropriation and its Mayan discontents

Huipil detail, Jilotepeque, Guatemala. Source: Wikimedia Commons, Textile Museum of Canada.

I recently gave a talk on current thinking about cultural theft to an audience at Southwest Seminars, a Santa Fe-based organization that sponsors a public lecture nearly every week of the year as well as frequent field trips in the region.  As one might expect in Santa Fe, a town long known for its artists (Indigenous and otherwise), after the lecture a number of people expressed concern about whether their own artworks represented cultural appropriation.  As one woman put it—and here I paraphrase—“My work is inspired both by the spectacular New Mexican landscape and the work of the Native Americans who portray it in their ancient artistic traditions.  Is that wrong?”

I have no way of knowing whether her paintings represent a commercial activity or only a hobby.  In the latter case, it’s hard to see how imitations of, say, Pueblo pottery designs harm anyone.  Still, it reminded me of how complex and confusing the issue of cultural appropriation is for many people, especially at the non-commercial end of the arts spectrum.  And then there’s the question of how, or even whether, Indigenous artistic productions can be protected when appropriators imitate the style of a given tradition rather than actual works.

Some of these issues are addressed, and others dodged, in a recent news story about efforts to protect the intellectual/cultural property of Maya weavers in Guatemala.  According to the story, the IP laws of Guatemala explicitly exclude Indigenous art from protection.  A group of weavers has filed a lawsuit seeking government protection for their work:

Aspuac says that royalties received as a result of the patent would be divided among the community. The community will designate representatives to negotiate on their behalf with companies seeking to use their designs, and manage the distribution of funds back into the community. Aspuac and other leading members of the movement want to see the money invested in social projects like weaving schools and education for women and children.

The hope is that with the patenting of their textiles and designs, the Maya community would have more autonomy and control over their heritage and culture, thus alleviating two of the major hardships the community faces: cultural appropriation and dispossession. Royalties received from the patent would also give the communities the chance to end a long-standing cycle of poverty. [Source]

This sounds like a promising approach, and I hope it enjoys success.  Nevertheless, it raises the question of whether such a law would effectively prevent the sale of “Maya-inspired” designs that don’t consist of exact copies of existing works.  Where does Mayan creativity end and some other society’s creativity begin?  How far into the past would such protection extend?  And would it protect the work of Maya weavers experimenting with radically new artistic forms?  The latter question might sound hypothetical, but after three years of hosting Native American artist fellows at SAR (the School for Advanced Research), I’ve come to appreciate how many of them are joyously breaking with tradition to pioneer powerful hybridized art.  An example is found in the paintings of Ehren Kee Natay, as well as his work in other media.  Ehren was SAR’s Rollin and Mary Ella King Native Artist Fellow in 2014.  I expect that Indigenous Guatemalan artists aren’t far behind.

One possible solution for the Guatemalan case would be to complement conventional copyright protection of finished works with a licensing program that would allow manufacturers to certify their work as “Mayan” or “Maya-inspired” for a fee.  The licensing fee would have to be modest enough to be absorbed as part of the cost of doing business.  It would be similar to Fair Trade certification, which assures customers that they are doing the right thing by purchasing a certified product.  This strikes me as administratively more plausible than trying to enforce a “cultural copyright” on Maya weaving in all its forms and variations.

On a related front, be sure to check out the website of the Creative Sensitivity Project, the goal of which is to “get as many creatives as possible to understand the effects and ramifications of cultural misappropriation to understand how their job as creative practitioners will effect marginalised groups in society.”

Beyond hoop earrings: The damaging impact of the cultural appropriation meme

The Moana-themed costume that Disney pulled out of stores after intense public criticism, Fall 2016

The vapid debate about cultural appropriation continues in social media, the latest reductio ad absurdum being the claim that hoop earrings belong to Latina culture and shouldn’t be worn by Anglo women.  The neocon press loves these stories because they illustrate the alleged excesses of identity politics in American colleges and universities.

Conservative interest in accusations of cultural appropriation may explain why I was called by Alice Lloyd, a reporter for The Weekly Standard, and invited to explain why appropriation has become such a pervasive meme.  To her credit, she was more interested in efforts to limit the appropriation of indigenous knowledge than in tempests in teapots like the hoop-earrings issue.  Her curiosity appears to have been sparked by criticism of the glacially slow efforts of the World Intellectual Property Organization (WIPO) to develop protocols for the protection of traditional knowledge and indigenous genetic resources.

Although I admire the efforts of legal thinkers such as James Anaya to nudge WIPO to promulgate global policies that provide an umbrella of protection for indigenous peoples, I’m skeptical that protocols on that scale can effectively address the particularities of local situations and multiple conceptual domains (e.g., genetic resources, biological knowledge, expressive culture, sacred understandings).  One has only to read WIPO’s draft documents to wonder whether endless micro-editing of terminology can lead to successful solutions in our lifetime.

Any way one slices things, legal protocols must resolve knotty questions.  Who qualifies as indigenous?  Who legitimately speaks for communities given local disagreements about whether formally constituted Native governments (e.g., the tribal councils of federally recognized Indian nations in the United States) are qualified to represent the community in matters relating to religious knowledge?  Can one ever reconcile a global IP system predicated on time-limited monopolies—patents and copyrights— with what indigenous peoples typically see as the eternal status of their values and practices?  Should the cultural-protection rights of indigenous communities always trump the right of indigenous individuals to share life histories that may include religiously sensitive information?  Can WIPO’s necessary focus on nation-states ever be fully reconciled with the complex and often fraught status of indigenous communities within those nation-states? These and other tough questions have made the journey toward international protections a painfully slow one.

The article that emerged from the Weekly Standard interview is more thoughtful than most, and I’m flattered that Lloyd says nice things about a book I wrote years ago.  Still, I feel obliged to correct an error in the account.  For the record, the School for Advanced Research wasn’t founded by “frontier-minded and Bryn Mawr-educated heiresses with cash to burn.”  It emerged from the efforts of an early anthropologist, Alice Cunningham Fletcher, and the archaeologist Edgar Lee Hewett to establish a center for the study of American prehistory that would rival institutions that studied the archaeology of the classical world.  The Bryn Mawr graduates mentioned by Lloyd are presumably Martha Root White and Amelia Elizabeth White, who built a home in Santa Fe in the 1920s.  Amelia Elizabeth White bequeathed the estate to SAR in 1972, 65 years after SAR’s founding.  Details here.

It’s fair to say, as Lloyd does, that many members of Santa Fe’s Anglo elite had an appropriative attitude toward Native American culture.  Early in the twentieth century, Santa Fe and Taos served as meccas for educated Anglos searching for an America that owed little to European high culture.  They found this primal authenticity in the New Mexico landscape and its indigenous and Hispanic populations, especially the Pueblo peoples of the region.  Although Hewett, the White sisters, and others like them were deeply sympathetic to Native Americans and in some cases fought vigorously to defend indigenous land rights and religious freedoms, their attitudes were often condescending.  They presumed that they were more qualified to speak for Indians than Indian people themselves.  In this sense they were creatures of their time.

The SAR of today is a different place.  In particular, SAR’s Indian Arts Research Center is committed to doing what it can to facilitate the transfer of indigenous knowledge between generations and to working collaboratively with the communities in which the IARC’s collections originated.  And the IARC is extremely careful about maintaining the confidentiality of sensitive religious knowledge, to the limited extent that such knowledge can be found in its records.  Perfection achieved?  Not by a long shot.  But we are making progress despite the current economic and political headwinds.

It remains to be seen whether public understanding can move beyond trivial arguments about hoop earrings, yoga, and Asian cuisine to acknowledge the real injustices suffered by indigenous peoples when their hard-won traditional knowledge is commercialized or otherwise misused by outsiders.

On trademarks.
The recent Supreme Court decision in Matal v. Tam has defined trademarks as a form of speech, thus voiding restrictions on disparaging marks and opening the door to continued legal protection of the controversial name of Washington, D.C.’s football team.  I’m no legal scholar, and I understand that complex issues of free speech are at stake, yet common sense (for what that’s worth these days) says (1) that commercial speech is different from political speech, and (2) that trademarks are not a fundamental constitutional right but a license granted by the government upon satisfaction of a set of stringent conditions.  Commercial speech is held to standards of accuracy that prevent a company from making wildly inaccurate claims (“Our toothpaste cures five forms of cancer!”) that are protected in the context of political speech (as when Donald Trump claims that his election victory was the greatest in American history).

I thus fail to see why the government should be obliged to authorize trademarks that disparage and hurt specific communities, especially minority ones.  Granted, the use of disparaging trademarks doesn’t seem likely to become widespread; after all, it will drive away many customers, thus defeating a trademark’s commercial intent.  But in an era as polarized as ours, I can imagine some people being moved to register and use offensive trademarks . . . well, just because they can.  Even legal scholars who defend the decision accept that it may also void restrictions on “scandalous” trademarks, meaning that we can look forward to more vulgarity in popular media and on the shelves of our local shops.  It is hard to celebrate this decision as a positive validation of American free-speech rights.

Update on the STOP Act.
On a more encouraging note, Senator Martin Heinrich (D-NM) has just introduced in Congress a revised version of the STOP Act (Safeguard Tribal Objects of Patrimony).  The director of SAR’s Indian Arts Research Center, Brian Vallo (Acoma Pueblo), was involved in revisions to this bill and in promoting conversations among Native American leaders, attorneys, and dealers in Native art that have led to refinements in the proposed legislation.

Update, 17 July 2017.  Don’t miss Arthur Krystal’s essay “Is Cultural Appropriation Ever Appropriate?” in the L.A. Review of Books, 17 July 2017.

Update, 7 August 2017.  The Washington Post has just published a fascinating story about two men who have filed trademark applications for the Nazi swastika and a variation on the n-word in order to prevent hate groups from using them.  Their hope is that they can contaminate and degrade the power of the terms. “Maynard [one of the trademark applicants] . . . planned to co-opt the swastika by including it on baby products. Such ‘social satire,’ he said, could change its meaning and restrict its usage among hate groups. ‘One of the hopes is that people look at the swastika flag in 10 years and think: baby wipes,’ he said.”

Anthropological writing for troubled times

When I made the shift from college teaching to the world of fundraising in support of anthropology and Native American arts, I quickly learned something about which I’d previously had only a vague suspicion: that as an occupational group, anthropologists do a poor job of making a case for the importance of our work.

In offering such a sweeping judgment, I’m mostly referring to cultural anthropology, my own subdiscipline.  Biological anthropologists have the advantage of being increasingly involved in genomics research as well as studies of human growth and development that have scientific cachet and often some practical utility.  Archaeologists can draw on consistent popular interest in cultural history.

If you ask an average American what recent work of anthropology he or she has read or at least is aware of, the reply is likely to be something by Jared Diamond, who isn’t an anthropologist at all.  (He was trained in physiology and now holds an academic position in geography.) This causes no end of consternation to anthropologists, who with few exceptions find Diamond’s work simplistic, derivative, and often wrong-headed.  What accounts for his popularity?  I’d say two things: the clarity of his writing and his willingness to explore big ideas. These qualities earned him a Pulitzer Prize for Guns, Germs, and Steel in 1998.

Clarity of expression and big ideas are not easy to find in the everyday writing of anthropologists.  There are occasional exceptions—one is David Graeber, whose work I’ve written about before—as well as the writers recruited by the website Sapiens, whose success since its founding in 2016 is a tribute to the vision of the Wenner-Gren Foundation and the leadership of the site’s editor-in-chief, Chip Colwell.

These exceptions and a handful of others aside, I’ve come to think of anthros as living in a dream-world in which we take for granted the importance and moral urgency of what we write without seriously considering its off-putting characteristics for the public we aspire to reach.   This suspicion was confirmed by the campaign to hold public readings of work by Michel Foucault to protest the inauguration of the current occupant of the White House.  I’m casting no aspersions on those who venerate Saint Michel, only noting the improbability that anything written by him would change the hearts and minds of Americans in a time of marked coarsening of our national discourse.

What do I mean by off-putting characteristics?  There is little point in belaboring the problem of jargon, which afflicts all academic disciplines.  (Addendum, 2/24/2017.  That said, don’t miss this essay on academic BS by Maximillian Alvarez.)  But anthropology seems more prone than most to embrace weird linguistic tics, such as the compulsion to pluralize everything (“anthropologies,” “sexualities”) or to claim to “theorize” an issue when the author is simply undertaking comparison or offering inductive generalizations.  These are normalized in the discipline but are likely to baffle readers who aren’t dues-paying members of the club.

More substantive problems include frequent, plodding references to the structure of one’s presentation—“In this paragraph I talk about X; in the next I discuss Y”—which presumes that the reader is too dense to figure out where the author is going and why (which might indeed be the case if the prose is weak).  Or the now almost inevitable declaration that the author’s goal is to “complicate” an issue.  Some questions merit complication, of course.  Many don’t.  One could even make a case that a good piece of writing is obliged to simplify or at least offer a concise interpretation of the complexities that it addresses.  Equally distracting is the perceived need to cite Big Theorists as a way of displaying cultural capital rather than illuminating an argument.  (Maximilian Forte discusses this in his video lecture “Beyond Public Anthropology,” beginning at the 36-minute mark.)

Brevity is an underrated virtue in contemporary anthropological writing despite the shift in the culture at large to ever shorter forms of written expression.  I’ve lost count of the number of ethnographies I’ve read in recent years that would have been twice as powerful if they’d been half as long.  As many gifted writers have noted, what one leaves out of a book or article is often as important to clarity of expression as what stays in.

I was prompted to think about the future of anthropological writing by a recent message from a longtime supporter of the institution for which I work.  He heads a family foundation that funds a number of cultural institutions and progressive causes.  The present political crisis is leading him to shift his support in the direction of organizations that can effect positive social change and counter the nation’s turn to the populist right.  His advice to me: “No more esoteric stuff.  Deal with real issues and offer answers rather than narratives.”  He urged me to come out of what he called the “academic cave.”

Even if we discount the urgency prompted by recent events, his view has merit.  For anthropology to survive and prosper, its practitioners must become much better at bringing informed perspectives to issues of broad import, writing about them with impeccable clarity, and proposing practical solutions when appropriate.  This isn’t a call to limit our work to applied or engaged or activist anthropology, although these are certainly valuable contributions to the field.  There remains a place for big-picture research—on deep history, human evolution, ancient cultural traditions, and the like—that helps to contextualize current preoccupations within a larger frame.  This work has to be clear, inventive, and engaging.  If we can’t make this transition, we’ll be abandoned in our academic cave, reading Foucault by the flickering light of a dying fire.


A short but useful blog post on writing and editing, drawing on the pithy advice of the late William Zinsser, can be found on the website of Bhaskar Sarma.

The March 2017 issue of Harper’s includes an amusing review by Nat Segnit of the latest crop of writing guides.  Not all of the review is relevant to non-fiction writers and social scientists, but it’s worth a look, especially Segnit’s witty editorial critique of the preamble to the United States Constitution.

More on cultural appropriation


Hats off to a friend for directing me to a recent blog post by Fredrik deBoer questioning the widespread abuse of the idea of cultural appropriation.  His views complement and move beyond my own discussion of degrees of cultural appropriation posted early in 2016. [July 2017: It looks like the post has been taken down from deBoer’s site.]

I confess that I’m a sucker for feisty, against-the-grain assessments of thoughtless pieties of this kind, largely because recognition of the real injustices of certain kinds of inter-cultural theft is undermined by indiscriminate accusations that one group is stealing cultural elements from another.

Living in the Southwest and regularly engaging with Native American nations has sensitized me to the harmful effects of thoughtless imitation, even when well intentioned—a prominent case in point being the history of the Smoki People, a group that imitated Hopi rituals and dress for decades.  In short, cultural appropriation is a real problem worthy of informed criticism.  But critical distinctions need to be made lest it be reduced to an empty slogan, which I take to be the point of deBoer’s post.

A useful place to start a more informed discussion is an online publication by the IPinCH project in Canada: “Think Before You Appropriate.”

A compelling case of biopiracy: The Stevia story

Today we hear less about biopiracy than we did a few years ago.  As I’ve argued elsewhere, some cases of alleged biopiracy are more ambiguous than critics of cultural appropriation typically admit.  But one case of flagrant biopiracy, that of sweeteners derived from the South American species Stevia rebaudiana, is finally starting to get the attention it deserves.

S. rebaudiana is an herbaceous plant native to eastern Paraguay that was long used by indigenous Guaraní peoples as a sweetener for teas and medicinal preparations.  The sweetness of Stevia comes from several glycosides, including stevioside and rebaudioside, which produce a sensation of intense sweetness without increasing the blood glucose of those who consume it.

Use of Stevia as a sweetener was documented by Western science in the late nineteenth century, although its chemical constituents were not identified for another sixty years.  As developed nations began to search for calorie-free sweeteners, the properties of Stevia became of considerable interest.  Stevia seems to have been embraced as an alternative to sugar first by Japanese and Chinese corporations.  In the U.S., use of Stevia initially stalled because of preliminary evidence that its chemical constituents might be carcinogenic, although effective lobbying by manufacturers of competing artificial sweeteners was also a factor.  The carcinogenicity claim was eventually refuted, however, and Stevia‘s commercial value has grown substantially since the 1980s.

The Guaraní, one of South America’s poorest and most endangered indigenous populations, have received negligible benefits from the global market for this potentially billion-dollar product.  Ironically, marketing campaigns for Stevia-based sweeteners often identify it as “traditional” or “indigenous.”


Smallholder farmers in Paraguay derive some income from cultivation of the plant for the market.  But even this modest compensation is being undermined by commercial biosynthesis of Stevia‘s key compounds in the developed world.  In other words, industrial producers no longer need the Stevia plant to manufacture the sweetener that has become a hot product in the competition for zero-calorie alternatives to cane sugar in parts of the world where obesity and diabetes are major public health problems.  The scale of this obvious injustice is staggering.

For more information on this situation and efforts to address it, a good starting point is the publication The Bitter Sweet Taste of Stevia, a report published by a consortium of European and Paraguayan NGOs.  A protest petition directed to Coca-Cola can be found here.

Plus ça change, plus c’est la même chose.

This site offers information about the book UPRIVER (Harvard UP, 2014), other books by Michael F. Brown, issues related to Amazonian peoples, events at the School for Advanced Research–Santa Fe, and occasional meditations on anthropology and human social life in general.
