9/16/14

Hurry, Curry Favor With Fava Curry (five times fast...)

“Κυάμων ἀπέχεσθαι” (“Abstain from beans.”)
--Pythagoras of Samos

For a guy who was most famous for a theorem often recognized as one of the building blocks of modern mathematics, Pythagoras was a crackpot.  A total nutjob.  A lunatic.  Really out there, is what we’re getting at.  If you want an early look at a Jim Jones-esque cult in the ancient world, look no further than the Pythagoreans.

However, one of the beliefs espoused by this small society of differently rational individuals which often meets with the most ridicule by those first studying their ways turns out (as so many behaviors of that sort do) to have a reasonable basis.  Pythagoras’ importuning of his followers to abstain from beans was probably a specific reference to the fava (Vicia faba).  And it may have saved at least some of them from a fairly painful death from complications resulting from glucose-6-phosphate dehydrogenase deficiency, also known as favism.

This condition affects portions of Mediterranean and African populations; carriers of the G6PD-deficiency allele have a degree of resistance to malaria, which explains why the condition evolved, but it sadly renders them unable to eat fava beans without sometimes fatal reactions to the high levels of vicine and convicine in the beans.  High levels of tyramine also make favas dangerous for those who take MAOIs (monoamine oxidase inhibitors) as a treatment for depression or other psychiatric conditions.

Now that we’ve scared you... let’s talk about how wonderful fava beans are!  (Because, you know, they are!)  Somewhere around 10% of those with Mediterranean ancestry need to worry about favism; the rest of us need to worry about how to get enough favas in our gardens and on our plates, and Pythagoras can go jump in a lake.

Vicia faba, also known as fava, broad bean, and horse bean, is native to North Africa and Southwest and Southern Asia, and is widely cultivated elsewhere.  In the U.S., the vast majority of the fava beans in cultivation are not actually grown for human consumption, but rather as a cover crop, for forage, and for nitrogen fixation (a quality which makes favas perfect in the backyard garden – more on that in a minute).

The history of fava cultivation is long – as long, in fact, as that of any crop other than the earliest stands of wheat-like plants which represented the first human attempts at agriculture.  To this day, farmers from India, Iran, Egypt, Morocco, Sicily, Sudan, Greece, Ethiopia, Nepal, and even Peru and Colombia grow favas in basically the same way as farmers would have done six or seven thousand years ago.

Favas are a cool weather crop, which surprises those who are not familiar with how they are grown – one would not associate a plant from North Africa and the Mediterranean with “cool weather,” but there you go.  In northern climates, or in mountainous areas (Ethiopia or Iran, for example), favas are easy to grow in the summer, but elsewhere (Egypt?  Southeast Texas?) one would grow them in fall, winter, and early spring.  In fact, one of the best reasons to grow favas is their ability to overwinter.  We have had luck with plants surviving the harshest freezes the Brazos Valley can dish out, and snow has not fazed them.  Last winter’s ice storm, when freezing rain coated them with a half-inch of ice, did wipe out a sizable portion of our crop – sizable, but not complete; several plants survived even that much cold-weather abuse.

In addition to the beans, whose culinary uses we’ll describe in greater detail in a moment, the plants themselves have a lot going for them.  Favas are one of the best beans available in terms of nitrogen-fixing qualities.  A plot which has been overwintered with favas is perfectly ready come spring to grow practically anything you want with no soil additives necessary – corn, tomatoes, whatever you will.

Further, a mature fava plant stands anywhere from three to five feet tall, straight upright, with leaves radiating outward only six to twelve inches and white or purple flowers (depending on variety) which provide the local bee population with all the winter forage their little hearts could desire.  The result is a showpiece in your winter garden that will be the envy of your neighborhood.  When everything else is dead and brown, favas are gloriously and optimistically green and vibrant.

As if that weren’t enough… the leaves and flowers are tasty additions to the salad bowl.  Many of the same nutritional qualities found in the beans are found in the leaves and flowers.  High in fiber and protein, with a full panoply of B vitamins, vitamins C and K, calcium, iron, magnesium, manganese, phosphorus, potassium, and zinc, favas are also one of two kinds of beans (velvet beans, aka Mucuna pruriens, being the other) which contain L-DOPA, a naturally occurring precursor to dopamine; in effect, a natural antidepressant.  And if that weren’t good enough, L-DOPA is also a natriuretic agent and potentially hypotensive; i.e., it can lower blood pressure.

The beans are eaten in about as many different ways as there are cultures which have been exposed to them.  Historically, both the Romans and Greeks took young beans and parboiled them, occasionally eating them as a puree.  Mature beans are often fried and salted or spiced (this is popular everywhere from Latin America to Thailand).  Chinese cooking features a spicy paste of fermented broad beans and chilies called la doubanjiang (“hot bean paste”), and in Mexico habas con chile are eaten much like spiced peanuts are consumed in the U.S.

In Egypt (as opposed to the Levant and most of the rest of the world) favas are the primary ingredient of falafel.  Garbanzos (aka “chickpeas”) are the main ingredient in the falafel with which most people are familiar, but fava-based falafel has a richer, nuttier taste that Egyptian foodies rhapsodize over at great length as superior – we would never turn up our nose at Levantine falafel, of course, but the Egyptians have a point.  This is somewhat akin to a debate about the relative aesthetic merits of a meadow versus a glen, but if the question were merely over nutritive content, the Egyptians with their favas win, hands down, no contest.  Garbanzos are healthy, but favas are the king of beans in that regard.

As such, it comes as no surprise that favas are the most widely consumed food in North Africa.  The Egyptians eat ful medames the same way Norwegians eat fish.  In the Sudan, mashed favas flavored with sesame oil and a bit of Jibna (sheep-cheese, similar to Feta) form the basis for the most common lunch entrée.  In Morocco, street vendors serve up a fava bean dip called bessara the way New Yorkers eat hot dogs.  Ethiopian shiro flour forms the basis of most of their dishes, and one of the main ingredients in the flour is favas.  And numerous dishes important to the Ethiopian Orthodox Church make heavy use of favas.

And speaking of religion… there is a long history of theological/spiritual/supernatural association with the fava.  Pythagoras and his followers were vegetarians owing to an intriguing form of animism in which they believed not only in reincarnation for human and animal spirits, but also for the spirit of beans – details are sketchy because reliable witnesses tended not to get invited to their parties, but that’s as believable an interpretation of what data is available as any other.

The Romans would make an offering of favas to the lemures (“house ghosts”) on the festival known as Lemuria.  In Ubykh culture (a people historically of the northwestern Caucasus along the Black Sea coast, most of whom were deported to modern Turkey in the 19th century) there was a form of fortunetelling anthropologists refer to as “favamancy,” in which fava beans were thrown on the ground (similar to the throwing of bones in Norse culture).  In the Ubykh language, in fact, the word for “fortune teller” was literally “bean thrower.”  And in much of Italy, to this day, fava beans are traditionally planted on November 2nd, All Souls’ Day.

At Myrtle’s place, we do not ascribe any magical powers to the fava bean, nor do we believe they house the souls of long-departed ancestors.  We do treat them with reverence, however, because they meet so many of the standards we have set for what makes a good garden plant:  they grow easily, and once sprouted do not require much in the way of maintenance.  They are hard to kill.  They produce tasty and nutritious fruit.  And by the time they die off each spring, they have left the soil better than it was when they found it.  They are, in fact, the best possible plant for the cool half of the year.

And in the Brazos Valley, October is when to plant them.  We can hardly wait.

Happy farming!

9/15/14

To Eat or Not to Eat... Or What to Eat or Not Eat... Or Something...

“If beef is your idea of ‘real food for real people’ you’d better live real close to a real good hospital.”
--Neal Barnard, M.D.

Meat has always been a problematic question for modern humans, even for those who have chosen not to think about the problems associated with its consumption.  Leaving aside the ethical questions for a moment, and just focusing on health, there are a handful of advantages offered by meat consumption (particularly seafood, but also including red meat), juxtaposed with an ever-mounting pile of disadvantages (particularly as related to red meat).  We aren’t doctors, but we do think it’s a subject worth revisiting from time to time, particularly because most people on both sides of the consumption aisle are (to put it mildly) not used to discussing the matter politely.

Lest you think we’re going to dogmatically say “don’t eat it,” we’d like to start with some interesting data points from a 1999 metastudy of data from 5 different countries, published in the American Journal of Clinical Nutrition: 

Dietary Style                 Mortality Ratio
Pescetarian (fish eater)      0.82
Vegetarian (lacto-ovo)        0.84
Occasional Meat Eater         0.84
Regular Meat Eater            1.0
Vegan                         0.7 to 1.44 (owing to limited data points)

Obviously, as is true of any population study, these findings do not mean that there are absolute truths applicable to each and every individual regarding healthy eating habits… but the trend lines are clear.  As a general rule, one would expect a person whose diet consists of no animal flesh other than fish or the occasional egg or dairy product to outlive, on average, the person who has red meat at every meal.

It is interesting to note, of course, that the statistical differential between this optimal group and the occasional meat eaters is not particularly significant; there is a far greater difference between the frequent meat eaters and the occasional meat eaters (defined as someone who eats no more than two servings of red meat per week) than there is between the occasional meat eaters and the vegetarians and pescetarians.
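The sizes of those gaps are easy to check with a little arithmetic on the mortality ratios quoted above (a toy sketch; the category labels are ours, not the study's):

```python
# Mortality ratios relative to regular meat eaters (1.0), as quoted above.
ratios = {
    "regular_meat": 1.00,
    "occasional_meat": 0.84,
    "lacto_ovo_vegetarian": 0.84,
    "pescetarian": 0.82,
}

# The gap from regular to occasional eaters dwarfs the remaining gaps.
gap_regular_to_occasional = round(ratios["regular_meat"] - ratios["occasional_meat"], 2)
gap_occasional_to_pescetarian = round(ratios["occasional_meat"] - ratios["pescetarian"], 2)

print(gap_regular_to_occasional)      # 0.16
print(gap_occasional_to_pescetarian)  # 0.02
```

In other words, cutting back to two or fewer servings of red meat per week buys eight times the statistical improvement that giving up meat entirely buys on top of that.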

Vegans, naturally, are in a category all their own, owing to the fact that their nutritional intake is perhaps more variable within their category than is true for any of the other categories – a careful vegan is better off than anyone.  A not-so-careful vegan?  May as well be playing in traffic.  We’ll get to why in a future post, more than likely, but the odds are that most vegans reading this blog probably know more about how to eat a healthy vegan diet than we do, anyway.  We’re more concerned with elucidating things for the omnivore crowd, for purely utilitarian reasons, so please, don’t feel excluded.  And for any vegans who don’t know about the nutrients typically not available from plant sources, for heaven’s sake, go find yourself a vegan mentor who does.

Now then, back to meat…

We suspect that a great deal of the differential between the occasional meat eaters and the regular meat eaters has less to do with the dietary value of beef and more to do with the effects of a whole host of corollary factors – quantity consumed at any given meal, preparation methods, what else is eaten, etc.  For example, an occasional consumer of beef is more likely to consume fatty fishes (that is, fishes high in omega-3 fatty acids) than is a regular beef eater; as it turns out, omega-3 fatty acids are essential for a host of bodily functions that have a strong correlation to long-term health.  So… it’s not just that occasional beef eaters eat beef; it’s that they also eat other things in greater proportion than do frequent beef eaters.

Likewise, the occasional beef-eater (especially those who are doing their best to minimize the ecological impact they have vis-à-vis cattle raising method – hello grass-fed free-range, good-bye corn-fed, factory farmed) is much more likely than the regular beef-eater to be getting a healthy dose of dark green vegetables and healthy starches (long grain rice, quinoa, etc.) and is much less likely to be gobbling fried foods and processed flour and sugar – it’s not just what they are eating, it’s also what they are not eating.

Then, too, the occasional beef-eater is more likely to be a gourmet, someone who takes what they eat seriously, and is therefore not likely to be eating lower-quality cuts of meat, nor are they likely to be eating processed meats.

And, as it turns out, there are strong correlations between heavy consumption of processed meats (hot dogs, bologna, pepperoni, Spam, etc.) and several different cancers, as well as cardiovascular disease.  Those same correlations are not specific to red meat.  In other words, there is something about the way in which the meat is processed which makes it inherently unhealthy.  Much the same can be said for processed flour and processed sugar… seems like maybe processing is a bad idea, no?

Lest you think this means there is a green-light for beef consumption, though, just so long as you’re paying extra for the grass-fed good stuff, there are other considerations that require attention. 

Heterocyclic amines (HCAs) are chemical compounds containing at least one heterocyclic ring (atoms of at least two different elements) and at least one amine (nitrogen containing) group – long story short, it’s just a category of organic compounds.  A lot of them are not only beneficial, they are downright essential.  Niacin would be a good example. 

However, there are several HCAs which are classified as carcinogenic (cancer causing), and they are created by the charring of flesh.  Like you might find in, say, the famous “bark” (that tasty outside crust) on a particularly well cooked brisket.

Let that sink in for a minute… the thing that marks beef as “really good” for certainly most Texans, and we’re guessing most people in other parts of the world… is carcinogenic.  Not “might be,” but “is.”

Now, can you cook red meat without charring it?  Yes, you can.  Does it still satisfy your meat cravings?  We can’t answer that for you.  And depending on the method one chooses, there may still be other health risks involved – meat cooked on a grill or over a flame which is not hot enough to char (and therefore not hot enough to create carcinogenic HCAs) may also not be hot enough to destroy flesh-borne pathogens (bacteria and viruses).  Microwaves can kill those pathogens without charring the meat, though heating food in, on, or near plastics can leach unwanted compounds into it.  And note that no ordinary cooking method – microwaving included – reliably destroys prions, which brings us to…

Prion disease.  One form of which is known as Bovine Spongiform Encephalopathy (BSE).  Also known as “Mad Cow Disease”.  There are actually variations of this particularly nasty affliction for every kind of consumable mammalian flesh, including human flesh, if you’re into cannibalism.  And while some sources of beef are free-and-clear of the potential for BSE (that would be local, free range grass-fed beef), the vast majority of red meat sources for grocers, restaurants, etc. are not. 

The safeguards in place are laughable: the only protective measures are designed to prevent the use of bone meal made from already infected animals.  These measures are sensible, of course, in that allowing contaminated animals to be used to make feed for non-contaminated animals would, naturally, spread the condition around.  The problem is, this approach ignores how the condition started in the first place.

Spongiform encephalopathy, whether of the bovine or other variety, is a condition wherein prions (misfolded proteins which propagate themselves by inducing normally folded proteins to misfold; they contain no DNA or RNA at all) run amok in the host animal; they invariably attack the central nervous system, and are only noticeable by their effects.  Autopsies done on diseased animals (including affected humans) find brains eaten away by millions of little swiss-cheese bubbles.

And while on extremely rare occasions these prions arise more or less spontaneously in a genetically prone individual… more often than not, they are acquired by ingesting, digesting, and metabolizing flesh from a creature whose DNA is similar to the affected animal’s own.  Hence the references to cannibalism.

Most beef (and pork… and chicken… and farm-raised fish) in the United States (and increasingly in the rest of the world) is “factory farmed” – that is, raised in cramped conditions and fed a slurry made from a mixture of corn, bone meal, and animal wastes (recycled poop, yum!); which means most meat sources are, in fact, cannibal meat sources.  Animals who have eaten their own kind, or a kind awfully similar to their own.

Given these conditions, it’s not a question of if some new strain of encephalopathy will emerge; it’s a question of when it will make itself known.

Now, there are a few factors limiting the likelihood of onset, and they should be almost as troubling as the event they are forestalling.  A good example is the famed “pink slime” of McDonald’s fame.  Various industrial processes, such as the “cold pasteurizing” (a euphemism for irradiation) of meat, or the use of ammonia baths, etc., are good for removing bacteria and viruses (prions, having no nucleic acids to break, largely shrug off even irradiation)… and if those procedures don’t make you nervous, you are either very brave, or very drunk.

All of which, we are sure, has by now convinced you that it might be easier just to forego that big platter of ribs you were planning on smoking this weekend, right?

No?

Well, at least let us convince you to spend a few extra dollars to make sure that if you are going to continue to be a meat eater, you get your beef from a healthy source.  Archer-Daniels-Midland will do just fine on their own without you throwing away years of your life just to line their pockets.

And make sure that you eat plenty of veggies along with your main dish of choice, no matter in which longevity category you’ve decided to plant yourself.  As we noted when first breaking down the meaning of the statistics, it is quite likely what unhealthy folk aren’t eating that is putting them in the wrong categories; dark green veggies and fatty fishes top that list, so hop to!  We like you; we’d like to have you around reading our blog for a long, long time.


Happy farming!

9/13/14

Water... It's All Wet

天下莫柔弱於水。而攻堅強者、莫之能勝。以其無以易
There is nothing softer and weaker than water,
And yet there is nothing better for attacking hard and strong things.
For this reason there is no substitute for it.
--Lao Tse, Tao te Ching

Shortsightedness with respect to our place in the world seems to be the human condition, and there are sound evolutionary reasons for that.  We won’t go into the nuts-and-bolts of Myers-Briggs personality profiling, but just as a “fer instance,” the ratio of those whose primary data-processing mode is “sensing” rather than “intuitive” (that is, based on sensory perception, tactile solidity, and the present moment rather than on a holistic and fluid understanding of patterns and trends) is somewhere on the order of a 70/30 split.

Sensors, naturally, are more concerned with “how things are,” whereas intuitives tend to focus on “what things are becoming.”  One of the nicknames for sensors is “guardians,” which more or less explains why they are so well adapted for their evolutionary niche – humanity has clearly done well (in terms of evolutionary success) with a wide array of culturally oriented adaptations (see: all technology since the invention of the pointy stick)… but we remain physically vulnerable as individuals, and many of the changes we have foisted upon ourselves have made us more vulnerable still (see: pollution, industrial accidents, snowboarding, etc.).

In general, sensors tend to limit these dangers – for themselves and others – by enforcing conformity, norms, and other social and personal barriers to excessive changes.  And up until such time as the only agents of change capable of global harm came in the form of avoidable innovations, this was a useful (if frequently irritating, especially to visionaries in the arts and sciences) trait for the majority of humanity to display.  Nietzsche may have railed against conformity-masquerading-as-morality, but in general, it has been a good system.

Times they be a’changin’, though, and it is likely that the majority of humankind is (sadly) incapable of philosophically/emotionally/cognitively keeping pace.  “It takes all kinds,” the saying goes, but it unfortunately doesn’t tell us in what proportion each of those kinds ought to be served up.

Nowhere is this more evident than as relates to the problem of water scarcity, exacerbated as it is by the pressures of climate change, urbanization, pollution, expanding agribusiness, and at least a half dozen other factors we haven’t even yet gleaned.

The CEO of the Nestle Corporation recently made news for stating that “access to clean water is not a human right.”  We’ll pass over why he would even wade into such unfathomably muddy moral waters for now, at least as pertaining to the issues at hand in his case, but the why and wherefore of his attitude is fairly self-evident as it relates to human psychology.

Basically, he is refusing to allow the debate over whether access to clean water is a human right to be framed by those who argue that there are teleological (consequence-based reasoning) implications in a moral debate – essentially, he is arguing that the fact that something changes (as an intuitive knows is true of everything) should have no impact on what is “right” or “wrong” with respect to that thing.

Water is scarce?  “Move.”  So say the innately deontologist (a priori moral framework) sensors, like our exemplary CEO (and like a great many people in business and accounting).  Even when a sensor has some compassion, however, their essential reasoning will be similar – and similarly flawed.  A sensor who believes that access to clean water is a human right will face difficulty in accepting the clear evidence that not everyone can have it.  “Human right” or not, there is quickly going to be a day when it will simply not be possible for everyone on planet Earth to have access to clean water.

And we are not psychologically equipped, as a species, to come to terms with what this is going to mean.

To see some of the psychological trauma involved in the evolution of the human relationship with water supplies, a denizen of the Brazos Valley need not travel far.  In fact, though residents of Bryan/College Station like to think of ourselves as “the” Brazos Valley, the name applies to a far larger group of communities, ranging as far north as the Texas panhandle and even into eastern New Mexico.  And many of the cities and townships which draw their water from the Brazos River (particularly those in the northernmost regions of the Brazos Valley)… are running short.  Their neighbors outside the Brazos watershed are faring even worse.

Wichita Falls, as an example, has recently begun not only recycling grey water but actually reusing all treated wastewater.  Their potable water supply comes from a number of sources, but among them are waters which at one time were part of the so-called “black water.”  (You’re not really going to make us spell that out for you, are you?)

Now… it’s perfectly safe, insofar as it is chemically and biologically virtually identical to other West Texas tap water.  We’re not going to get into the tap vs. distilled vs. filtered debate at present; we’re just pointing out that something most Americans think of as “icky” has become a part of daily life in one particularly conservative community.  Exactly the sort of community where you would least expect radical changes to everyday life to be welcome, in fact.  But they have bent to necessity, and are now making full use of every form of water available to them.

What would happen, one might wonder, if even reusing toilet water did not provide enough potable water for the community’s needs…?  What if they simply ran out?  This is the case in more than one West Texas community, in fact.  And what those towns have done is to ship in water from other places.  If you were to search the world over, it would be difficult to come up with a better example of the concept of unsustainability.  Yet “just move” is not an answer that most of these people will apply to themselves.  That is what they might say to someone else, from somewhere else, because nothing like that happens here.

The answer most sensors have had to this crisis over the last 100 years (and yes, strange as it may seem, we have known for even longer than that about the idea that getting enough water might be a problem for people living in arid places like, say, most of the region of the United States between the Mississippi River and Seattle) has been to “drill a well”.  When asked where well water comes from, their answer would typically be “from the ground.”

This was, is, and ever will be, what those focused on global issues call a problem.

Groundwater does, in fact, come from the ground.  It typically comes from naturally occurring reservoirs called aquifers, where water has collected (usually over hundreds to thousands of years, though the largest – and the most used by humans – took millions of years to develop), and most of which provide forms of natural filtration that make aquifer water among the cleanest naturally available sources of water on the planet, behind only a handful of glacier-melt sources in purity and desirability.

The problem (and trust us, it is a huge problem) is that aquifers are a limited resource.  They for the most part do not recharge especially quickly, particularly when compared to the rate at which it is possible to pump water out of them.  The largest aquifer system in the United States from which water is pumped is the Ogallala Aquifer, which spans the plains from South Dakota down through Nebraska, Kansas, Colorado, Oklahoma, New Mexico, and the headwaters of the Brazos River high up on the Llano Estacado in the Texas panhandle.

The population dependent on the Ogallala for drinking water is not especially large by percentage – roughly 2 million people out of the 300+ million citizens of the United States – but that does not begin to tell the whole story.  See… about 27% of the irrigated land in the U.S. lies atop the Ogallala.  Just over a quarter of all irrigated agriculture in the United States is dependent upon well water from an aquifer which is shrinking, has been shrinking for a long time, and will continue to shrink until it disappears.

Since 1950, agricultural irrigation has reduced the saturated volume of the aquifer by 9%; to make matters worse, the rate of decline is accelerating.  Between 2001 and 2009 alone, the aquifer declined by 2%. 
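To put “accelerating” in concrete terms, compare the average annual drawdown implied by the two figures above (a rough back-of-the-envelope sketch; we treat both percentages as fractions of the 1950 saturated volume and use 2009 as the endpoint):

```python
# Average annual decline implied by the quoted figures.
total_decline = 0.09    # 9% of saturated volume lost between 1950 and 2009
recent_decline = 0.02   # 2% lost between 2001 and 2009 alone

avg_rate_1950_2009 = total_decline / (2009 - 1950)   # ~0.15% per year
avg_rate_2001_2009 = recent_decline / (2009 - 2001)  # 0.25% per year

# The recent rate is roughly 1.6x the long-run average: the decline is accelerating.
print(round(avg_rate_1950_2009 * 100, 2), round(avg_rate_2001_2009 * 100, 2))
```

A quarter of a percent a year sounds small until you remember that recharge is effectively zero on human timescales, so every drop of that is permanent.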

The rate at which the aquifer recharges is exceptionally slow.  Scientists confidently classify most of the water being drawn by wells as paleowater – water which entered the aquifer no more recently than the end of the last ice age, roughly 10,000 years ago, and most likely far earlier than that.

It is, in short, foolish to make any plans based on the availability of water in the Great Plains, which just so happens to be the region we also call the nation’s “bread basket”.  It’s not a question of if this system will collapse.  It’s a question of when.

So…

What’s the good news?

We like to present solutions where possible.  Leave ‘em laughing, it’s said, or when they leave, they’ll never come back.  Myrtle likes visitors, so we’ll do our best to not end on a glum note.

There are several strategies for dealing with water scarcity and the coming of “peak water” (a concept about which much needs to be said, though we’ll say it later).  Some are obvious, some a little less so, and all fit with our general philosophy, which is that smaller is better.

To start with, it should be obvious to everyone (though obviously for some, it’s going to take a little longer to catch up with this reality) that the old industrial agriculture paradigm is doomed.  Big agribusiness can’t last, and it won’t last.  GMOs (about which we have written before) are a dangerous nuisance, but they ultimately won’t succeed because they are the philosophical equivalent of rearranging deck chairs on the Titanic. 

GMOs are just the latest in a series of stratagems put forth by agribusiness to try to make farming profitable in environmental niches where large-scale farming is not even sustainable.  Like every other stratagem, modifying crops to use less water, or to be more pest resistant, ignores the reality that eventually there won’t just be less water in these places, there will be no water in these places.   So the “big” model is dead, it just doesn’t know it yet.

What replaces it?

The small model, of course!  Not merely in the Ogallala region, where it will be the only viable model, but also in the places currently feeding off the degradation of the Ogallala (that would be everywhere else in the U.S., including places not currently experiencing water stress).


When you stop to think about it, this will be a necessity-driven change as well.  What happens when 27% of the agriculture sector simply vanishes?  People will still need to eat, right?  Who will feed them?  The only people equipped to do so: market farmers.  People who grow fruits and vegetables (and eggs, and frequently milk and dairy) in their backyards, in vacant lots near their homes, on their roofs, in their converted garages/greenhouses, etc.

Ecological and environmental economists (and weird as it sounds, trust us, those are two separate groups altogether) agree on very little, but one thing they both agree on is the idea that feeding people in the future is a very complicated, difficult task.  They’ve got all kinds of numbers and graphs and pie charts explaining why.

They are all full of poppycock.

It’s not a complicated question at all; the model is simply wrong.  We have been searching for almost a decade now, and we have not seen any serious academic investigation on the root question of how much food can be supplied for the world’s needs on a subsistence-plus scale – that is, production by those whose only aim is to grow “enough for my family, plus a little extra”. 

Urban homesteaders are more than a novelty, now, though, and it is time for their methods and philosophy to get more attention.  There are some things that urban homesteaders do which directly impact water consumption – rainwater collection systems, for example, or the use of hydroponic growing systems – but there are others, most notably the raising of large quantities of produce on small plots of land, using less water than large producers use, that do not get enough attention.  It is our strong belief that the numbers (should they ever be taken) would support the idea that more people worldwide can be fed from smaller plots than from the large ones.

California… as is so often the case… is leading the way.  There is a state law there which subsidizes the use of vacant lots as urban farming land; essentially, a new form of homestead law that allows urban residents to make use of undeveloped urban spaces for market farming projects.  So far, San Francisco is the only place where the model is being actively used, but we envision success there quickly becoming the model for the rest of the state.  Once California has success with urban farming, we suspect it won’t take that long for the rest of the country to see a good thing and copy it.

When human beings have no other choice, we at Myrtle’s place are certain we (as a species) will eventually get around to seeing things from the intuitive point of view.  Long term, it is not only the only choice for survival, it is also (happily) the most sustainable choice, offering the greatest degree of personal happiness and satisfaction.  So… we guess the good news is, yes it’s going to get worse before it gets better… but it is going to get better.


Happy farming!

8/25/14

Chicken Soup for the Vaccinator's Soul

“Nature answers only when she is questioned.”
--Jacob Henle

We are fast approaching flu season in the U.S. and it is time for the general public to start making plans to get vaccinated.  It was time many months ago, however, for the Centers for Disease Control (among many others) to start planning just exactly what kind of vaccines should be made available for this year’s version of this ongoing fight against influenza.
Lots of variations on one basic theme:  Ick! Yuck! Phooey!

We won’t go into the abhorrent pseudoscience involved in the anti-vaccine movement in this post (though there is certainly rich material there for many an essay); rather, we thought this might be a good time to speak about influenza generally, and avian flu specifically, particularly as it relates to concerns anyone may have about the possibility of disease spreading through backyard chicken flocks.

Spoiler alert:  Well-tended backyard chicken flocks are actually beneficial in fighting the spread of avian flu.  We’ll explain in a bit.

Influenza, or “flu”, is usually nowhere near so serious an illness as to justify the enormous amount of attention it gets each year.  Though it is somewhat more severe than the common cold… it is typically not fatal, and typically runs its course in one to two weeks.

The problem, of course, is that word “usually” – when it does take a turn for the worse, it takes a nasty turn for the worse.  The most famous example is the 1918 Spanish influenza pandemic, which took the lives of 50+ million people worldwide.  To provide some context… “only” 16 million people died in World War I over the four years preceding the pandemic.

Why so much death from a disease that usually just sends you to bed for a couple of weeks?

The reason is that the influenza virus is one of the most mutable viruses ever discovered.  Most of the time, the dominant strain of influenza is one which puts those who already have weak health (the elderly, young children, those with compromised immune systems or who are already wracked by some other illness) in mortal peril – it is a rare year in which influenza deaths in the U.S. do not number in the thousands.  In most years, however, those who are reasonably healthy will suffer nothing more than an uncomfortable couple of weeks of fever, chills, headaches, sore throat, maybe nausea (depending on the strain), general body aches, etc.  You know, “being sick”…

In some years, though, the mutated strain of virus is much more dangerous, and ends up generating potentially fatal complications like pneumonia, dangerously high fevers, impaired kidney functioning, etc. in previously healthy persons.

And that is why all the fuss.

The goal of immunization is, therefore, not particularly personal. 

We bill immunization as an attempt to “keep you healthy” but the reality is that immunization is more about collective than individual health.  A population in which a sufficiently large percentage of people have been immunized is much less likely to spread the virus, and is therefore much less likely to play host to the more virulent varieties of the virus.  There will still be individuals who get sick even if they get the vaccine.  There will simply be fewer of them, which is good for everyone.
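The arithmetic behind that collective protection can be sketched in a few lines.  The numbers below – a reproduction number of 1.4 for a seasonal flu strain and a 60% vaccine efficacy – are illustrative assumptions for the sake of the sketch, not figures from this post or any particular study:

```python
# Back-of-the-envelope herd immunity arithmetic (all numbers are
# illustrative assumptions, not measured values).
#
# R0 is the average number of people one sick person infects in a
# fully susceptible population; an outbreak fades once the effective
# reproduction number R_eff drops below 1.

r0 = 1.4          # assumed R0 for a seasonal flu strain
efficacy = 0.6    # assumed vaccine efficacy

def r_eff(coverage):
    """Effective reproduction number when `coverage` (0-1) of the
    population has been vaccinated."""
    return r0 * (1 - coverage * efficacy)

# Coverage at which R_eff falls to exactly 1 (the herd immunity threshold):
threshold = (1 - 1 / r0) / efficacy

print(round(r_eff(0.0), 2))   # 1.4  -> spreading, in an unvaccinated population
print(round(r_eff(0.6), 2))   # 0.9  -> 60% coverage chokes off the spread
print(round(threshold, 2))    # 0.48 -> under half the population suffices here
```

Notice that no individual in this sketch is guaranteed protection – with 60% efficacy, plenty of vaccinated people can still get sick – yet the population-level spread collapses anyway, which is exactly the point.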

So how do birds relate to all of this?

Glad you asked.  Birds, it turns out, are frequently the place where the whole story begins.  There are various strains of flu associated with different species, and those which have evolved specifically for human hosts are almost never particularly dangerous.  The varieties which run amok and kill large numbers of people are viruses which originally evolved in other kinds of animals.  On occasion, those viruses mutate in such a way that they can cross over and infect humans – this has happened with swine flu before, for example, but avian flu (flu associated with birds) is by far the most common of these “crossover” diseases.

There are numerous subtypes of Influenza A, the influenza type whose natural reservoir is birds; the most famous in recent years is H5N1.  This variety has been spreading throughout Asia since 2003; it reached Europe in 2005 and the Middle East in 2006.  One case was reported in Canada in January 2014.

The virus originated in bird populations, and spread to humans.  Which birds?  Well… of the affected populations studied, 84% were domestic populations composed of chickens, ducks and turkeys, and the remainder were wild birds.  So the domestic populations were to blame, right?

Wrong.  At least, not exactly…

The virus almost certainly originated in wild birds, and was then transmitted to domestic flocks via interaction with the wild birds (waste matter dropping into a pen… wild birds scavenging food or water from the flock… there are a lot of ways for the birds to commingle).  Wild birds, however, do not live in confined quarters; as a result, they have a much lower rate of interaction with each other than do domestic birds.

A typical commercial poultry flock, however, will number thousands of birds in very, very tight quarters, and will almost always have insufficiently cleaned ventilation, food and watering appliances.  A typical commercial poultry flock is a pathogenic disaster waiting to happen.
Who in their right mind is dumb enough to think this is a good idea?

The response of government to the finding of an infected bird is informative in this regard – when an infected bird is found, the entire flock is killed.  They literally have no choice – there is no way to quarantine an infected bird; if one bird in a chicken farm has it, they all have it.  When you are raising birds in a 1’ by 1’ cage (or “free-ranging” them by allowing 1,000 birds to roam in a 20’ x 50’ barn) then they are not only breathing the same air, they are quite literally pooping on each other and living in muck and filth.  And those are the good farms, where regulations and procedures are being followed.

The sad, despicable truth, though, is that no one knows exactly how many poultry operations actually follow the rules.  The USDA does not have enough manpower to cover more than a small percentage of the regulatory territory to which they are assigned, and even when inspectors are available to review the health and sanitation of a given poultry operation… it is frightfully easy for operators to misdirect investigators away from problem areas.

Salmonella outbreaks are the most common result of disease-laden poultry farm ventilation systems… but one can easily envision H5N1 being spread in the same way.

Enter the backyard chicken flock…

There are a number of advantages to raising backyard chickens, and we have enumerated many of them on other occasions; aiding in the fight against avian flu, however, is one area in which taking birds out of cramped conditions is an underrated element.  Consider:
  • The principal dangers associated with poultry farming come from population density, which is not a problem for backyard flocks of only 4-10 birds kept in much larger spaces than factory farms afford.
  • Occasional interactions between wild birds and diffuse backyard flocks will still happen… but any one interaction only affects 4-10 birds, not 10,000.
  • Human interaction with a flock of 4-10 birds is limited to one household, not the tens or hundreds of thousands (or even millions) who interact with a single poultry farm in factory farming setups.
  • While flu can spread via the lungs, it is much easier to transmit via fluids – like, say, blood – which are often smeared all over human workers on a poultry farm, and which backyard birders almost never contact.

The logic behind a more diffuse production system for poultry related products is fairly simple – for the same reason that contagion is more common in dense population centers, the spread of food-borne pathogens is much higher when the production of those foods is also performed in dense populations.  When the Black Plague hit London, even the superstitious folk of the pre-Enlightenment era knew to head for the hills, because there was danger in population density.

The advantages posed by backyard birding, however, will not be fully felt unless a sufficient number of folk take up the “hobby” and make it a lifestyle choice.  In order to decrease the population density of commercial poultry farming, sufficient numbers of backyard chicken flocks will need to be raised to take a serious bite out of the bottom line for the monsters of the industry (pejorative sense of the word intended) like Tyson, et al.
Much less chance of contagion, and no equipment to clean.

Barring having one’s own backyard flock, we would at least encourage consumers of eggs or poultry meat to consider purchasing from farmer’s markets or other venues where small producers with free range flocks not raised in pestilential, overpopulated factory farms make their money.  Otherwise, we will all be vulnerable to the inevitable spread of some form of influenza from factory farmed bird-to-human, and then from human-to-human.  Once it’s human borne, it is out of the hands of the backyard birder or responsible consumer.

Which brings us back to immunization…

We began by noting that those in charge of manufacturing each year’s vaccine have to plan months in advance of distribution, owing to the mutability of the virus (there’s your laboratory-verifiable example of evolution in action, if you care to argue with a creationist at any point, though we prefer arguing with brick walls, as they don’t spout nearly as much nonsense).  Because there is so much variability, a flu vaccine is usually really a vaccination against up to four different virus strains.  Which, obviously, takes a lot of planning, based on data gathered from the previous year’s strains, examples of viruses found in poultry populations, examples found in other countries with which the U.S. population has a lot of interaction, and good solid guesses based on historical trends regarding which strains are “due” for a comeback.

As a result of all this variability, measuring the effectiveness of flu vaccines is anything but straightforward.  In most years (for example, 16 of the 19 years before a 2007 study), the strains which actually circulated during flu season were the strains predicted by vaccine manufacturers.  And even in the years when they guess wrong, there is still some crossover protection, given that not every mutation of the virus is radically different from the previous version.

There have been many meta-analyses of the data (basically, reviews of multiple studies) which suggest that overall, for healthy adults, vaccination results in roughly a 75% decrease in the likelihood of getting the bug.  Without doubt everyone involved (particularly those who got the shot but ended up getting sick anyway) would prefer a rate closer to perfection… but remember, the ultimate objective is not necessarily to keep any one person healthy, it’s to keep a pandemic from happening.  If a few people still get sick, but we never again see anything like 50 million people dying from something like the Spanish flu, it will have been well worth the effort.
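In concrete terms, a 75% reduction in relative risk plays out like this (the 10% baseline attack rate below is an assumed, illustrative figure, not one from the meta-analyses):

```python
# What "roughly a 75% decrease in the likelihood of getting the bug" means
# in absolute numbers.  The baseline attack rate is an assumed figure.

baseline_attack_rate = 0.10    # assumed: 10% of unvaccinated healthy adults get the flu
vaccine_effectiveness = 0.75   # the ~75% relative reduction cited above

vaccinated_rate = baseline_attack_rate * (1 - vaccine_effectiveness)
print(round(vaccinated_rate, 3))    # 0.025 -> 2.5% of vaccinated adults still get sick

# Per 1,000 vaccinated people, cases avoided compared to no vaccination:
avoided_per_1000 = round(1000 * (baseline_attack_rate - vaccinated_rate))
print(avoided_per_1000)             # 75
```

So the vaccine doesn’t make sickness impossible – 25 of every 1,000 vaccinated people still catch it under these assumptions – it just makes the pool of potential hosts dramatically smaller.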

So…

Your assignments for this Fall:  Get your flu shot, raise backyard chickens (or shop at farmer’s markets), try to stay warm and dry, wash your hands regularly, and…


Happy farming!

8/20/14

Rewriting the Wealth of Nations, One Happy Person at a Time

Quod satis est cui contigit, nihil amplius optet.
(“Let him who has enough ask for nothing more.”)
-Horace

What do we really need?

Given that there are roughly 7 billion people on Earth, there are probably 21 billion different answers to this question.  But it is, we believe, a question that is not asked nearly often enough.  It is certainly not asked often enough in what so many radical activists in the developing world refer to as “the decadent West”.

A little perspective might help to show the uncomfortable justice in this charge which so many of us find alien and a little off-putting:

There are approximately 300 million people living in the United States.  That means that roughly 1 in 23 folk in the entire world are Americans.  All but the poorest Americans are in the wealthiest 10% of people in the world.  The vast majority of Americans, in fact, are in the top 1% of the world’s wealthy.  The poverty line in the United States for a family of four in 2012 was $23,492.  The median income across 131 countries in 2012 was $9,733.  That means that poor Americans in 2012 earned almost 2½ times as much as the middle-of-the-road earner worldwide.

Almost half the world – over three billion people – survive on less than $2.50 a day.  The World Bank sets the global poverty line at less than $2.00 a day.  Median household income in the U.S. in 2012 was $51,017, meaning that the typical American household makes about $140 a day… 70 times as much as the global poverty standard.
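For anyone who wants to verify the ratios above, the arithmetic is quick (all figures are the ones already cited in this post):

```python
# Checking the income comparisons cited above against the quoted figures.

us_poverty_line_2012 = 23492     # U.S. poverty line, family of four, 2012
world_median_income_2012 = 9733  # median income across 131 countries, 2012
us_median_income_2012 = 51017    # U.S. median household income, 2012
global_poverty_line_daily = 2.0  # the ~$2/day standard quoted above

# Poor Americans vs. the middle-of-the-road earner worldwide:
print(round(us_poverty_line_2012 / world_median_income_2012, 1))  # 2.4

# U.S. median income expressed per day:
per_day = us_median_income_2012 / 365
print(round(per_day))                               # 140

# ...and as a multiple of the global poverty standard:
print(round(per_day / global_poverty_line_daily))   # 70
```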

What do all these statistics really mean, though?

The typical American philosophical response (leaving aside the cretins who simply don’t care about the fates of others) is to wonder what we can do to help raise others out of their poverty.  This is a natural response – how do we help them get what we have?  There are several different approaches within the framework of this basic understanding of “the problem”… ranging from wealth redistribution schemes on the socialist end of the spectrum to sending economic missionaries to teach classes on free enterprise on the libertarian/free market economics end of the spectrum.

We believe they have both misunderstood the problem.

Measuring wealth in terms of monetary income is natural; the meaningful unit of economic exchanges has always been money – in fact, reading, writing and arithmetic, the sine qua non of civilization, were largely invented in order to deal with the problem of measuring wealth in just such terms. 

However… measuring wealth in terms of money ignores the more fundamental reality that genuine wealth can only truly be understood in terms of contentment.  And while the two different ways of looking at economic prosperity (or lack thereof) often mirror one another… there is a point of diminishing returns on wealth, a point beyond which more money brings only unhappiness, discontent, and a spiritual cancer – frequently to others, but almost always to oneself.

There are obvious benefits to “having enough”:  adequate food, clothing and shelter formed the holy trinity of wealth for early humans on the plains of the Serengeti, whose chief concerns lay in getting enough to eat, staying out of the sun and rain, and keeping clear of saber-toothed tigers.  Many of the same concerns still abound today, added to a few others with a more modern feel – adequate drinking water, access to education for one’s children, good health care, sufficient law and order to protect whatever possessions one does have from the marauding of one’s fellow humans.

In 1943, psychologist Abraham Maslow’s paper “A Theory of Human Motivation” laid out a hierarchy of needs which has been the roadmap for much of the discussion on contentment that has taken place ever since.  According to Maslow, human ambition climbs as each level of need is attained – once basic physical needs are met, one can focus on security; once security is obtained, one can seek emotional fulfillment, and so on.  His levels are as follows:

  • Level One – Physiological:  air, food, water, sex, sleep, homeostasis, excretion
  • Level Two – Safety:  security of body, of employment, of resources, of the family, of health, of property
  • Level Three – Love/Belonging:  friendship, family, sexual intimacy
  • Level Four – Esteem:  self-esteem, confidence, achievement, respect of others, respect by others
  • Level Five – Self-actualization:  morality, creativity, spontaneity, problem solving, lack of prejudice, acceptance of facts

In the context of a discussion of wealth, one thing becomes apparent almost immediately when attempting to apply the hierarchy to individual human economies – practically everyone everywhere will have a different way of meeting virtually everything on this list of needs.  At the very base of the hierarchy… it costs more money to get clean air, food, water, etc. in some places than in others.

A hermit on a hillside farm in Bhutan may have plenty of clean air to breathe, plenty of food and water… a single mother in Nairobi may have to wear a kerchief over her nose if smog descends, and she may need to boil her water to kill microbes, and unless she grows her own food in bags of soil hung from the balcony of her apartment building, or in an abandoned alleyway, she will have to buy what her family needs from a local market. 

And the produce someone buys at a market in Beijing may have a different nutritional value than the equivalent produce in Darfur, or São Paulo, or Quebec, or Los Angeles, or Quito.

We cannot even begin climbing the pyramid of needs, in other words, without encountering differences in economic status throughout the world’s populations.

That having been said… taking a deeper look at the hierarchy… no matter what the cost of each item may be, it is possible locally to meet that cost.  The question then becomes, once you’ve met your needs, now what?

It is helpful in answering that question to step back from whatever striving we may be doing in terms of seeking wealth to ask… what would we do with it if we had it?  A quick look around at the kinds of things the typical American family does with what they have (and often, with credit… or put another way, with money they don’t have…) shows that a lot of things we buy with our comparative wealth don’t appear on the hierarchy of needs anywhere.

Television?  Vacations?  Movies?  Electronic gadgets?  Fancy clothes?  Where exactly do all these things fit?

It turns out luxury is an attempt to satisfy unmet needs, too.  We seek the comfort of luxuries because they satisfy our desires for things we cannot directly purchase, and may otherwise feel powerless to acquire – how often, for example, do we spend money to participate in group events as a part of our attempts to find friendship?  At first, the mind rebels against this charge (and make no mistake, it is an accusation), but a simple thought experiment demonstrates the truth of this idea.

Imagine a parent playing happily with their child in the park.  Their child is laughing uproariously over some silly joke they have just told, and is enjoying the kind of frolic one associates with a happy-go-lucky kind of day being enjoyed by a happy-go-lucky family.  Would either the parent or the child want to trade that experience for a new pair of designer jeans?  Would they want to trade that experience for two hours at a movie theater watching a shoot-em-up?  Remember, you’re trading that experience.  Would you give up the good feelings associated with an interpersonal relationship (enjoyed for the staggering economic sum of zero dollars) for any amount of purchased goods or services?

Life is not always so simple as that, of course, but unless we stop to consider a cut-and-dried example, we may not fully appreciate the distinction between emotional satisfaction and the means by which emotional satisfaction is achieved.  And if we don’t see that distinction, we invariably seek the wrong things.

There have been innumerable movements with countless gurus preaching more or less the same thing we are discussing here; whether you call it “Simple Living”, or “Back to Basics” or “Satyagraha” or “The Franciscan Order” the benefits are palpable and real.  Satisfaction comes from gratefulness, and gratefulness is the beginning of not just the humility central to the spiritual teachings of Jesus, Buddha, Gandhi, and countless other gurus and rabbis, but also the beginning of emotional and physical health and well-being.

We have often noted that there is some selection bias involved in any nutritional study – those who eat truly healthy diets tend to live longer and have fewer health problems.  However… those who eat truly healthy diets tend to live more simply and intentionally, as well.  They frequently have more money than they need, not because they make so much money, but because they make so much more money than they have any desire to spend.

Much the same dynamic plays out on every other step of Maslow’s pyramid – once a person satisfies a need, if they are paying attention and being sensible, they find they need less than their fears foretold.  And the more grateful they are for what they have, the more physical, spiritual and emotional capital they have to spend in moving on to the next step.  Ultimately, physical, spiritual and emotional capital are more valuable – in attaining what we need – than financial capital ever could be. 

When American ears, therefore, process the idea of less than $2.50 a day being typical for almost half the world’s population, we are more staggered than we ought to be.  We can’t imagine either how we ourselves could live that way, or how we could ever assuage our guilt for the suffering of those who do live that way, because the overwhelming numbers seem impossible to surmount.

We don’t need to make them financially wealthy in our terms, though.  Yes, there are plenty of people in the world who need more money than they currently have.  And yes, access to food, clean water, health care, and education are huge issues which need to be resolved.

But the real poverty crisis is the poverty of spirit which is felt every bit as much (and often more) in the “wealthy” decadent West as it is in the underdeveloped nations of the world.  The bumper sticker slogan has got it wrong – all too often we say “Live simply, that others may simply live.”  It would be more meaningful to say “Live simply, for cryin’ out loud – if you’re not happy now, you’re not going to be happy after your next million dollars, either.”

A little land, a chair to sit on, some chickens to scratch around... we're good.


Happy farming!