Recently, I was reading about the controversy over whether consuming quinoa is good or bad for South American farmers. The debate is not new; nonetheless it drew my attention, and I suddenly recognised a pattern I’ve seen time and again in development discourse – the desire to “preserve” traditional cultural practices, and the notion that development disrupts them. One side of the quinoa debate argues that people who traditionally ate quinoa can no longer access the crop because so much of it is exported, which drives up local prices and contributes to food insecurity and malnutrition. The other side argues that many of the farmers who used to sell quinoa locally and now export it can sell far more of it, and at a higher price. They are therefore becoming wealthier, spending more money locally and eating as well as, if not better than, before.
The question is, why does this story of the starving quinoa producers arise, and why do we see these stories crop up everywhere in the developing world but not inversely in the U.S. or Europe? Why is nobody saying, “stop buying U.S. corn so that the American poor can eat”? This would be a simplistic approach to poverty in the U.S., but is it really that different from the quinoa narrative?
I believe these differences arise from the idealisation of traditional lifestyles, particularly when they are linked to indigenous communities. I have been to conferences in Guatemala where people promote food sovereignty: they want to close the borders, produce all food locally and use only traditional agricultural practices. In practice this would raise food prices, through less efficient production and the loss of imported foods, and thereby increase hunger – the opposite of what the proposal intends. There is a romantic notion of people living life in some ancestral form, which ignores the fact that modern technology has brought us to where we are now. Of course, modern technology has also produced terrible tragedies, such as pesticides ruining ecosystems. Yet we produce more food than at any point in human history; the Green Revolution improved crop yields substantially, and the Malthusian catastrophe has been averted.
Unfortunately, many people idealise country life as simple, good, healthy living that should be preserved even at the cost of the national economy and a population’s well-being. Over the last century, we seem to have swung from one extreme to the other. During the industrial revolution, nature was cast as the enemy: many people felt that the wilderness needed taming, and modernisation was allowing people to live longer, more prosperous lives. Granted, we overdid the “nature taming” bit, but swinging to the other extreme and pretending that nature is always warm, kind and embracing, and brings only good things, is nonsense. It is “Mother Nature” that has also given us terrible diseases such as rabies and Ebola, to name just two. Of course it is easy to talk about living “naturally” when you already live in a city with a sewage system, clean tap water and plenty of life and job opportunities. It is a completely different matter to discuss living “naturally” when people are dying every day from easily preventable illnesses such as diarrhoea because their home towns lack basic sanitation and medical services.
In the end, I think we need to strike a balance. Every time local cultures and languages disappear, humanity as a whole suffers a terrible loss; yet everyone has the right to live a healthy life full of opportunities, even if pursuing it comes at the cost of their cultural heritage – provided that is what they choose. ‘Choice’ is the key word here. In my previous job we worked in an area of Northern Mali where the population was primarily nomadic. Setting up schools and clinics in a village they visited only temporarily therefore had little impact on child and maternal mortality, because the rest of the time the population was travelling across the Sahel. This was a struggle for our development agency: we wanted to help this nomadic population, but we could not change their lifestyle to fit our model of providing services. Many nomadic people have, of course, stopped travelling – but that was their choice, and choice, again, is the key.
Forcing a population to change for the sake of development is just as paternalistic – and in some cases even autocratic – as forcing them to remain “traditional” for the sake of their identity and our idealism, without taking their needs into consideration. In our case in Mali, we had to find flexible ways of providing assistance, not simply say, “well, they want to live as nomads, so women will just have to keep dying in childbirth because that is their cultural heritage.”
Overall, a balance must be found. Development and aid agencies must become more flexible, and consumers need to understand that a boycott of, say, quinoa is not helpful. Is what we want to provide to communities actually in their best interest? Communities’ own aspirations and needs must be determined, and we certainly must not impose our ideals on them. Most importantly, development practitioners need to reflect on some serious questions. Are our ideals based on what we think ‘development’ should look like for its recipients? Do we want the potential beneficiaries of development to be frozen in time for our benefit? And is all of this being done to preserve a culture whose fate is not even ours to decide?
Featured image shows farmers threshing quinoa near Puno, in Peru. Photo from Michael Hermann (Wikimedia Commons).