9/28/14

Almost Time for Ghoulies, Ghosties, Gremlins and... Gardeners?

“The basis of optimism is sheer terror.”--Oscar Wilde, ‘The Picture of Dorian Gray’

Scholars from a variety of fields including anthropology, theology, linguistics, history and literature will debate practically anything, including whether Halloween is merely a Christianized version of Samhain or whether it evolved independently.

As is the case in many such scholarly disagreements, they’re all being silly – the important thing about Halloween is that it is fun.  We don’t mean to demean anyone’s deeply held convictions, of course, but if your deeply held convictions lead to a drab, dreary, colorless, somber life of drudgery… well, maybe you ought to consider shopping for a new set of beliefs, beliefs which include the idea that being human should be enjoyable, not miserable.  There are enough troubles that happen to us without our bringing unhappiness on ourselves.

Which makes Halloween a perfect holiday, when you think about it – ghosts and ghouls and creepy-crawlies represent all the horrors of our imagination… and Halloween places all of those creepy-crawlies in an ironic and humorous context.  We cringe and cower in fear on a typical Monday morning when our bosses, or supervisors, or teachers (or students), or customers, or suppliers, or whoever, load us down with a laundry list of complaints and problems with which we are ill-equipped to cope.  However, if we have a freaky automated rubber hand reach up at us out of a candy dish, our hearts pound with fear… but we laugh.

We laugh because we know the fear is not real, and that knowledge equips us to rise above our fears of those things which are real – it gives us the hope and courage necessary to face harsh realities, whether they be in the form of the coming of winter and desolation (oh so important to farmers for as long as there have been farmers, and even more important to hunter-gatherers before that), or more modern fears, whether of problems at work, school or in our personal lives.

Not all of the traditions associated with Halloween speak to this psychological reality, of course; some are simply festive.  Candy corn, caramel apples, bonfires, costumes of all sorts (frightening and – more and more frequently – not so frightening), silly songs, silly movies, and silly decorations form the centerpiece of what is really just one extended party marking the end of summer, and the beginning of something else.

Samhain was the Celtic New Year for a reason; harvest marked the dying of the old year, when the plants have given their bounty and are returning to the dust from whence they came.  There are equivalent harvest festivals in most of the world, and even in places where religious fanaticism drowns out the individual freedom to get freaky on Halloween, something takes its place.

The Puritans, for example, strongly discouraged recognition of All Hallows’ Eve (the evening before All Saints’ Day) – not because of any sort of direct satanic influence, as is frequently the case among extremist fundamentalist Christians in our own era, but because the holiday was seen as essentially Catholic.  (Though, to be fair, they considered Catholics little better than Satanists, but why quibble?  One flavor of crazy intolerance is as good as another.)

In spite of their innate problems with All Hallows’ Eve (and associated harvest festival traditions, Celtic or otherwise)… the Puritans brought us Thanksgiving, which is nothing more than a dressed-down Protestant version of Halloween.  Simple clothes and formal dining (or at least as formal as working-class families get) take the place of free-wheeling frivolity… but the basic message is the same:  the time has come to give thanks for the bounty we have received, that we may be prepared for what follows.

There is plenty of room for both ways of celebrating the harvest; most people never even stop to consider the subtle tension between the perspectives offered by these two intimately related holidays – the vast majority of Americans celebrate both Halloween and Thanksgiving without giving it a second thought.

Halloween is not a uniquely American holiday, of course, but what Americans have done with the day says a lot about why it is important to us.  Historian Nicholas Rogers writes that “some folklorists have detected its origins in the Roman feast of Pomona, the goddess of fruits and seeds, or in the festival of the dead called Parentalia,” but that “it is more typically linked to the Celtic festival of Samhain,” which comes from the Old Irish words meaning “summer’s end”.

The mish-mash of traditions in modern American Halloween festivities has stripped virtually all of the religious overtones from the holiday and replaced them with purely secular meanings and traditions – even where religious or mystical festivals such as Día de los Muertos are celebrated alongside Halloween, they are clearly seen as two separate entities.  You celebrate the one the night of October 31st, and the next day you move on to the festival honoring the dead.  Where there is overlap, it feels very much like a meeting of friends from different backgrounds at Yule, some of whom are celebrating Chanukah, while others are celebrating Christmas, Kwanzaa, or some other festival.  (Except maybe Festivus, because those people don’t get along with anybody.  But we digress.)

Much of the mish-mash of American Halloween is to be expected, based simply on the idea that the American melting pot is itself a mish-mash; we are the mongrels of the world, a mix of ethnicities, races, religions, cultures, languages and traditions so diverse that, within a generation or two, we often lose track even within our own families of just exactly who we are.  It makes perfect sense, then, that what we do will not have the same degree of continuity you would find in places in the world where families have been in residence for hundreds or even thousands of years.

But the strongest influences on the holiday also make it a set of traditions particularly prone to evolving over time.  The Celtic celebration of Halloween is not altogether easy to enumerate – yes, we know that “Samhain” was the fall harvest/New Year celebration… but exactly what early Celts did during this time is a matter of conjecture.  Many of the traditions passed down as “pagan” actually originated during the long and influential era of Celtic Christianity, and drawing distinctions between which Irish traditions date from the first millennium C.E. and which came from before then is an almost completely pointless exercise, both intellectually and philosophically.

Is your Jack O’Lantern carved from a turnip really more authentic than one carved from a Boston squash or some other kind of pumpkin?  And even if it were, would it be any more fun?  Probably not.  Probably, you’d get to display your “authentic” Jack O’Lantern at the kind of party where no one else was much fun to be around, either.  (But hey, that’s just us.  We like candy corn, so what do we know?)

And ultimately, that is the American contribution – while many bemoan the crass commercialism of Halloween (and make no mistake, there is clearly a lot of that on display), this is missing the point.  Commercialism is the manifestation of a vibrant truth, one which is not so negative:  If life is a game, then whoever throws the best parties wins.  We prefer the kinds of parties where everything is homemade… but make no mistake, the reason Halloween sells is because whoever makes “the stuff,” it is stuff people want.  And even if life isn’t “a game,” learned optimists know that you frequently only get where you’re trying to go if you treat it like a game.

This is the optimistic American contribution to All Hallows’ Eve – some are frightened by the idea, because it smacks of the kind of licentiousness which at its worst brings us things like Detroit’s “Devil’s Night” – but that is just one extreme.  At the other end of the spectrum, this spirit of freedom (best represented by the tradition of wearing costumes, and freeing our identities from our workaday selves, which, after all, are just another disguise we wear, albeit on a regular basis) helps us escape the fears and troubles which all too easily overwhelm us.

We have nothing against other harvest festivals:  Jewish Sukkot, Turkmen Hasyl toýy, Persian Mehregan, Russian Dozhynki, Yoruban Ikore, and Korean Chuseok all have unique stories to tell, and each contributes in its own way to the succor of the human spirit.  Some are more closely tied to the simple life which we advocate on a regular basis, and there is much to be said for celebrating traditions in a more agrarian manner, as a means to encourage people to return to the land… but in America, Halloween is what it is because people have become what they are.  As such, we approve.  Strongly.

We’re a little over a month away, but the supply of pumpkins and other strangely shaped winter squash and gourds has started making its way to the vegetable stands around town.  It’s almost time to break out the black and orange, string up some “ghosts” in the trees in the front yard, and hang “Witchy-Poo” on our front door.  Because, you know…. fun!

Happy farming!

9/23/14

Ever Get Tired of Ragging on Ragweed?

“Naturam expellas furca, tamen usque recurret.” (“You can drive nature out with a pitchfork; she will nevertheless come back.”)--Horace

At any given time, roughly 25% of the population is suffering from allergic rhinitis, known more commonly by the inaccurate name “hay fever”.  There are, of course, some people who are allergic to hay, but since they are typically smart enough not to go on hayrides, or venture too close to a horse barn, they generally don’t have much of a problem.  And yet… there’s that nasty fact that airborne pollen from a whole host of non-hay sources gets under their skin (or, at least, into their nostrils).

Roughly half of all reported cases are the result of sensitivity to one particular culprit – ragweed.  It is difficult to know, of course, how many cases go unreported simply because, while irritating, the symptoms were not bad enough to lead to a doctor’s visit.  That having been said… ragweed season is an early autumn affair in the Brazos Valley, and… yeah, it’s here.

We say “ragweed”… but this is actually not just one species, it is an entire category of plants, some of which are even grown on purpose, believe it or not.  Still, the two most commonly cited varieties are most definitely wild and unwelcome in the typical garden (though we know of some atypical gardeners who swear by them!) – common ragweed (Ambrosia artemisiifolia) and great ragweed (Ambrosia trifida).  There are around 50 species in total, found all over the Americas (and now running rampant as invasive species in Europe), and many of them are quite attractive, aside from the fact that they make so many people miserable.

Hay fever was first identified in 1819 by physician John Bostock; pollen was identified as the causal agent in 1859 by Charles Blackley; it was not until 1906 that Clemens von Pirquet identified the hypersensitivity of the human immune system as the mechanism by which the condition creates such misery.

All the years since then, of course, have seen the most typical of human reactions – burn the offending plants!  Kill them all!  And, sure enough, since the identification of the various Ambrosia species known as “ragweed”, an almost uncountable number of strategies have been attempted to eradicate the offending weeds.  Each has met with very limited success.  Some have only made things worse.

Pulling the plants up by hand, of course, presents a high degree of difficulty.  For one thing, they must be identified very young, before their tough root system grows sufficiently to tax one’s muscles mere minutes into what (by nature) must be a long day’s work.  To make matters worse, while the pollen is not toxic (remember, it’s your own immune system that generates the histamines that are making you ill), the leaves and stalks of the plant are mildly toxic, enough to irritate your skin and, after enough exposure, give you a nasty rash.

Mowing is somewhat more effective, provided it is done once the plants are tall enough to cut, but still young enough not to have bloomed.  And, of course, it must be repeated frequently.  And… it only gets those plants growing in a lawn or field… any plants growing in a garden plot or on a farmstead, or in an out-of-the-way ditch, culvert, wild space, vacant lot, etc. will be untouched.  And, since the pollen can stay in the air for days or even weeks at a time, and can be borne hundreds or sometimes even thousands of miles… mowing down your own ragweed does you no good whatsoever if you are downwind from someone else’s.

Not too many years ago, a commonly applied “solution” was to burn fields with ragweed; we’ll leave to your imagination the sum total of what exactly was wrong with this little stratagem.  Among other things, ambrosia smoke is its own form of toxin, irritating in ways the pollen could only dream of being.  Fortunately, this strategy is no longer recommended even by the most backwards of agricultural extension agents.

And then there is the chemical approach.  Vast quantities of herbicide have been applied to ragweed patches over the years.  The problem, of course, is that ambrosia is particularly resistant to the vast majority of commercially available herbicides; pour buckets of Roundup on it, and it will thank you for the watering and go on its merry way.  Even through our sneezes, we can admire its tenacity.

Worse than the fact that herbicides don’t work, however, is that herbicides do kill the one thing that does work.  Ragweed populations can only be kept in check by natural means.  And there are, it turns out, plenty of animals that not only are not irritated by these plants, but positively thrive on them.  A long laundry list of Lepidoptera species forages on ragweed.  That’d be moths and butterflies, species who, while they like eating ragweed, do not like ingesting herbicides.  Oops.

A 1973 review of ragweed control techniques published in the Bulletin of the Torrey Botanical Club found that regardless of which control technique was used, after a few years there was no appreciable difference from simply leaving the ragweeds in place.  We could not find any more recent efficacy studies, but we suspect that given the increase in environmental degradation, fields with herbicidal controls are probably worse than simply leaving the plants in place, owing primarily to the decimation of foraging populations.

Proper population control of any “weed” (basically, any plant you find noxious for whatever reason) cannot take the form of eradication.  The sooner we rid ourselves of the notion that we can “do away with” things we don’t like in nature, the better.

No, “control” can only come in the form of management.  And management means creating balance.

We already noted one part of this equation, tending to the foraging populations, namely moths and butterflies.  Helping those populations along by planting other food sources is one important step in limiting the wild stands of ragweed – plant enough milkweed in your garden, and not only will it take up space that might otherwise be used by a resourceful A. artemisiifolia plant, but it will succor enough flying critters to eat any nearby A. artemisiifolia plants that might have otherwise given you trouble.  Plants in your neighbor’s yard, for example, which you would not have been able to get at with a mower, a blowtorch, or a spray bottle full of poisons.

And we just hinted at another solution – increased biodiversity in your garden, making use of every available space to plant other things.  One of the biggest problems with lawns (and we have written before about how icky we find them) is that they limit biodiversity and create niches for invasive “weed” species.  We hate the term “weed” but in this case, its meaning is appropriate – a “weed” is a plant whose presence is indicative of a problem.  Note that the “weed” is not the problem, it is there because of the problem.

By increasing biodiversity (putting in a flower, herb, veggie, etc. bed, preferably a combination of all of the above) the niche for the offending plant is eliminated.  There may still be room for the odd individual or two… but there is no longer room for a large stand of invasives, and there will be a much greater population of foraging insects and other creatures who, after having sampled one species, move on to another.

Ragweed sensitivity in the general population is higher now than it has been in hundreds of years; there are a lot of reasons for that, all of them related to human activity.  Various forms of pollution have left our immune systems primed for hypersensitivity to pollen; additionally, once pollen sensitivity has been kicked off, all those chemical pollutants in the air are more keenly felt in our nasal passages, lungs, eyes, skin… it’s enough to make you sick.

But it’s not the ragweed’s fault.  It is the fault of the lack of balance in our relationship with nature.  We won’t begin to truly breathe easy until we give up on our stubborn attempt to tell Mother Nature what it is we want her to be doing.  Mother Nature knows exactly what she needs to be doing – in the meantime, try some lemon and local honey in hot water; ragweed won’t be in bloom forever, and a nice hot beverage is as good a way as any to while away the time until the elm trees are in bloom…

Happy farming!

9/21/14

The Circle of Life is Made of Leaves (and Big Logs)

“I like trees because they seem more resigned to the way they have to live than other things do.”--Willa Cather, O Pioneers! (1913)

One of the unfortunate consequences of living in a drought-prone area is the occasional need to cut down dead or dying trees which should have been healthy for many years to come.  We at Myrtle’s place are in the middle of just such a project, with three exceptionally large elm trees all having died during the last couple of summers – if only they had been able to hold out one more year, since Summer 2014 was especially wet by comparison to the dry blast furnaces of the last couple of years.
[Photo: Felled logs make excellent garden bed borders.]

We have undertaken this sad task at precisely the time of year when we are ordinarily thinking a lot about our trees anyway, given that we are on the verge of the autumnal leaf-drop season.  Of course, a lot of leaves drop throughout the year, simply based on the wide variety of species of trees and bushes on our lot, but the largest number of our trees conform to the stereotypical fall foliage festival, and so, we are eagerly anticipating that day (probably in October, but one never knows for sure) when our tallest oaks start looking a little pale, and then overnight turn to gold, and then quickly to red, and then quickly to bare.

The first signs of the coming of fall actually come from our grape vines, which start going all brown and crinkly in late August every year, though they manage to limp along sometimes until cooler weather finally arrives shortly before Halloween.  Every year the timid among Brazos Valley gardeners wonder if they’ve done something wrong, if their muscadine vines are dying… and every year, they come back stronger than before, often because of and not just in spite of the abuse and neglect they received the year before.

We find a lot of surprises when the grape leaves start falling.  Some of these surprises are charming and amusing, like the birds’ nests tucked away in precarious nooks and crannies that are tantalizingly oh-so-close-and-oh-so-far-away from our cat’s reach.  Some are natural and important ecologically, but still give us the willies, like giant nests of wasps going about their business mere inches above our heads on a daily basis with us none the wiser… until the leaf cover falls.

We know, though, that the falling of the grape leaves is the surest sign that we are entering an important part of the leaf cycle of our garden.  Er… life cycle.  Sorry about that.

[Photo: Chickens in the foreground.  Oaks in the background.  The oaks are the tall ones who don't cluck.]
Anyway, as we were saying… we frequently joke that we operate on an oak leaf economic basis.  The bedding for our chickens comes from a three-foot deep layer of oak leaves, for example, and as that stews and composts it becomes the rich humus which we use to fill our raised planter boxes for our various veggies.  Likewise, while we use cypress mulch for our pathways, when we mulch our herbs and vegetables we use the much cheaper and much more readily available oak leaf mulch – sometimes whether we plan to do so or not, particularly in our herb beds directly beneath the canopy of our largest water oak, which sheds its leaves so dramatically that there is simply no way we could keep them out of the rosemary even if we wanted to, which, of course, we don’t.

There is almost unanimous support in the composting world for including leaves in any mix of organic matter being broken down for soil nutrition, and there are good reasons for that.  We have mentioned before that the healthiest soils are those which best reflect the natural growing conditions for whichever ecological niche you happen to occupy.  In most of the world, the biggest contributors to the native “compost” will be trees and shrubs, and usually it will be the deciduous varieties of each.

But… apart from their obvious bounty, why?  What makes leaves so important nutritionally for the myriad plants and animals that live on them?

The primary clue comes from the fact that leaves are so important to the plants which grew them in the first place.  Photosynthesis – the conversion of light to food – is the first and foremost function of leaves.  Their green color comes from the chlorophyll which forms the building blocks of their light-conversion-engines, and the spines and veins you can see in a leaf if you hold it up to the light show that they are very much more akin to animal life forms than we typically give them credit for being.  Trees and shrubs have circulatory systems, and it wouldn’t be too far off the mark to credit them with a central nervous system, albeit one that reacts somewhat differently from what you might see in the animal kingdom.

[Photo: Pretty much everything in our garden is growing in composted oak leaves, even the stuff in baskets.]
The process of leaf loss is technically referred to as abscission, and though it happens every year, it never ceases to amaze us.  In temperate, boreal, and seasonally dry climates, abscission allows the tree to allocate resources properly for the seasonal unavailability of one or more essential components of its usual life-cycle.  Basically, in cooler climates, or in regions far enough north that winter daylight is insufficient for enough photosynthesis to meet usual nutritional needs, or in areas where long monsoon rains are followed by many months of drought-like aridity, trees will shed their leaves and “shut down” – the metabolism of the entire organism slows to as close to nothing as possible, and the plant waits for conditions to become favorable again before sprouting new leaves.

Obviously, given this strategy, the tree or shrub would not want to discard leaves which still had any sort of nutritional value to the organism as a whole, and for that reason alone one might question whether fallen leaves have any organic value, but that would be an oversimplification.  There is still plenty left in the biochemical goodie bag of a fallen leaf, it’s just that it is not in a form readily accessible to the plant “as is”.

[Photo: It doesn't matter where you are on our property, the big oaks out front dominate the horizon.  Good.]
In addition to capturing energy, leaves are often storehouses for energy, as well.  Further, many of the defense mechanisms plants use to fend off foraging animal life – thorns made of tightly arrayed lignins, tannins, and other natural poison compounds – are composed of nutritional components which have been allocated for the purpose of defense, and which are readily broken down in the compost heap (or the natural humus of a decomposing forest floor), but which would not be easily broken down by the tree itself when getting ready for a long winter’s nap.

It makes sense, then, that the animals which forage on green leaves would ignore fallen browns, reds and yellows, but it also makes sense that a whole new category of creatures would feast on the discard pile.  A wide variety of insects and worms (not to mention microbes), in addition to molds and other miscellany, go to town on the forgotten side of the forest salad bar.  And that’s before even considering the leaves that get hauled off to the chicken coop, where they get mixed in with… ahem! other organic materials.

Even if we didn’t use fallen leaves in our garden or our chicken coop, of course, it still goes without saying that a pile of raked up leaves is just plain fun to jump in.

Happy farming!

9/16/14

Hurry, Curry Favor With Fava Curry (five times fast...)

“Κυάμων ἀπέχεσθαι” (“Abstain from beans.”)
--Pythagoras of Samos

For a guy who was most famous for a theorem often recognized as one of the building blocks of modern mathematics, Pythagoras was a crackpot.  A total nutjob.  A lunatic.  Really out there, is what we’re getting at.  If you want an early look at a Jim Jones-esque cult in the ancient world, look no further than the Pythagoreans.

However, one of the beliefs espoused by this small society of differently rational individuals – the one which often meets with the most ridicule from those first studying their ways – turns out (as so many behaviors of that sort do) to have a reasonable basis.  Pythagoras’ importuning of his followers to abstain from beans was probably a specific reference to the fava (Vicia faba).  And it may have saved at least some of them from a fairly painful death from complications of glucose-6-phosphate dehydrogenase (G6PD) deficiency, whose fava-triggered form is known as favism.

This condition affects certain elements of Mediterranean and African populations; carriers of the deficient G6PD allele have a degree of resistance to malaria, which explains why the condition evolved, but it sadly renders them unable to eat fava beans without sometimes fatal reactions to the high levels of vicine and convicine in the beans.  High levels of tyramine also make favas dangerous for those who take monoamine oxidase inhibitor (MAOI) drugs as a treatment for depression or other psychiatric conditions.

Now that we’ve scared you... let’s talk about how wonderful fava beans are!  (Because, you know, they are!)  Somewhere around 10% of those with Mediterranean ancestry need to worry about favism; the rest of us need to worry about how to get enough favas in our gardens and on our plates, and Pythagoras can go jump in a lake.

Vicia faba, also known as fava, broad bean, and horse bean, is native to North Africa, Southwest and Southern Asia, and is widely cultivated elsewhere.  In the U.S., the vast majority of the fava beans in cultivation are not actually grown for human consumption, but rather as a cover crop for forage and for nitrogen fixation (a quality which makes favas perfect in the backyard garden – more on that in a minute).

The history of fava cultivation is long – as long, in fact, as any crop other than the earliest stands of wheat-like plants which represented the first human attempts at agriculture.  To this day, farmers from India, Iran, Egypt, Morocco, Sicily, Sudan, Greece, Ethiopia, Nepal, and even Peru and Colombia grow favas in basically the same way as  farmers would have done six or seven thousand years ago.

Favas are a cool weather crop, which surprises those who are not familiar with how they are grown – one would not associate a plant from North Africa and the Mediterranean with “cool weather” but there you go.  In northern climates, or in mountainous areas (Ethiopia or Iran, for example) favas are easy to grow in the summer, but elsewhere (Egypt?  Southeast Texas?) one would grow them in fall, winter, and early spring.  In fact, one of the best reasons to grow favas is their ability to overwinter; we have had luck with plants surviving the harshest freezes the Brazos Valley can dish out; snow has not fazed them, though last winter’s ice storm, in which freezing rain coated them with a half-inch of ice, did wipe out a sizable portion of our crop – sizable, but not complete; several plants survived even that amount of cold-weather abuse.

In addition to the beans, whose culinary uses we’ll describe in greater detail in a moment, the plants themselves have a lot going for them.  Favas are one of the best beans available in terms of nitrogen-fixing qualities.  A plot which has been overwintered with favas is perfectly ready come spring to grow practically anything you want with no soil additives necessary – corn, tomatoes, whatever you will.

Further, a mature fava plant stands anywhere from three to five feet tall, straight upright, with leaves radiating outward only six to twelve inches, and with white or purple flowers (depending on variety) which provide the local bee population with all the winter forage their little hearts could desire, creating a showpiece in your winter garden that will be the envy of your neighborhood.  When everything else is dead and brown, favas are gloriously and optimistically green and vibrant.

As if that weren’t enough… the leaves and flowers are tasty additions to the salad bowl.  Many of the same nutritional qualities found in the beans are found in the leaves and flowers.  High in fiber, high in protein, with a complete panoply of B vitamins, C and K vitamins, calcium, iron, magnesium, manganese, phosphorus, potassium and zinc, favas are also one of two classes of beans (velvet beans, aka Mucuna pruriens, being the other) which contain L-DOPA, a naturally occurring dopamine precursor – basically a naturally occurring anti-depressant.  And if that weren’t good enough, L-DOPA is also a natriuretic agent, potentially hypotensive… i.e., it lowers blood pressure.

The beans are eaten in about as many different ways as there are cultures which have been exposed to them.  Historically, both the Romans and Greeks took young beans and parboiled them, occasionally eating them as a puree.  Mature beans are often fried and salted or spiced (this is popular everywhere from Latin America to Thailand).  Chinese cooking features a paste called la doubanjiang (“hot pepper beans”) and in Mexico habas con chile are eaten much like spiced peanuts are consumed in the U.S.

In Egypt (as opposed to the Levant and most of the rest of the world) favas are the primary ingredient of falafel.  Garbanzos (aka “chickpeas”) are the main ingredient in the falafel with which most people are familiar, but fava-based falafel has a richer, nuttier taste that Egyptian foodies rhapsodize about at great length as superior – we would never turn up our nose at Levantine falafel, of course, but the Egyptians have a point.  This is somewhat akin to a debate about the relative aesthetic merits of a meadow versus a glen, but if the question were merely over nutritive content, the Egyptians with their favas win, hands down, no contest.  Garbanzos are healthy, but favas are the king of beans in that regard.

As such, it comes as no surprise that favas are the most widely consumed food in North Africa.  The Egyptians eat ful medames the same way Norwegians eat fish.  In the Sudan, mashed favas flavored with sesame oil and a bit of Jibna (sheep-cheese, similar to Feta) form the basis for the most common lunch entrée.  In Morocco, street vendors serve up a fava bean dip called bessara the way New Yorkers eat hot dogs.  Ethiopian shiro flour forms the basis of most of their dishes, and one of the main ingredients in the flour is favas.  And numerous dishes important to the Ethiopian Orthodox Church make heavy use of favas.

And speaking of religion… there is a long history of theological/spiritual/supernatural association with the fava.  Pythagoras and his followers were vegetarians owing to an intriguing form of animism in which they believed not only in reincarnation for human and animal spirits, but also for the spirit of beans – details are sketchy because reliable witnesses tended not to get invited to their parties, but that’s as believable an interpretation of what data is available as any other.

The Romans would make an offering of favas to the lemures (“house ghosts”) on the festival known as Lemuria.  In Ubykh culture (a people historically from the northwestern Caucasus, most of whom were resettled in what is now Turkey in the nineteenth century) there was a form of fortunetelling anthropologists refer to as “favamancy”, where fava beans were thrown on the ground (similar to the throwing of bones in Norse culture).  In the Ubykh language, in fact, the word for “fortune teller” was literally “bean thrower”.  And in much of Italy, to this day, fava beans are traditionally planted on November 2nd, All Souls Day.

At Myrtle’s place, we do not ascribe any magical powers to the fava bean, nor do we believe they house the souls of long departed ancestors.  We do treat them with reverence, however, because they meet so many of the standards we have set for what makes a good garden plant:  they grow easily, and once sprouted do not require much in the way of maintenance.  They are hard to kill.  They produce tasty and nutritious fruit.  And by the time they die off each Spring, they have left the soil better than it was when they found it.  They are the best possible plant for the cool half of the year, in fact.

And in the Brazos Valley, October is when to plant them.  We can hardly wait.

Happy farming!

9/15/14

To Eat or Not to Eat... Or What to Eat or Not Eat... Or Something...

“If beef is your idea of ‘real food for real people’ you’d better live real close to a real good hospital.”
--Neal Barnard, M.D.

Meat has always been a problematic question for modern humans, even for those who have chosen not to think about the problems associated with the consumption of meat.  Leaving aside the ethical questions for a moment, and just focusing on health, there are a handful of advantages posed by meat consumption (particularly seafood, but also including red meat), juxtaposed with an ever mounting pile of disadvantages (particularly as related to red meat).  We aren’t doctors, but we do think it’s a subject worth revisiting from time to time, particularly because most people on both sides of the consumption aisle are (to put it mildly) not used to discussing the matter politely.

Lest you think we’re going to dogmatically say “don’t eat it,” we’d like to start with some interesting data points from a 1999 meta-analysis pooling data from 5 prospective studies, published in the American Journal of Clinical Nutrition:

Dietary Style                 Mortality Ratio
Pescetarian (fish eater)      0.82
Vegetarian (lacto-ovo)        0.84
Occasional Meat Eater         0.84
Regular Meat Eater            1.0
Vegan                         0.7 to 1.44 (owing to limited data points)

Obviously, as is true of any population study, these findings do not mean that there are absolute truths applicable to each and every individual regarding healthy eating habits… but the trend lines are clear.  As a general rule, one would expect a person whose diet consists of no animal flesh other than fish or the occasional egg or dairy product to greatly outlive the person who has red meat at every meal.

It is interesting to note, of course, that the statistical differential between this optimal group and the occasional meat eaters is not particularly significant; there is a far greater difference between the frequent meat eaters and the occasional meat eaters (defined as someone who eats no more than two servings of red meat per week) than there is between the occasional meat eaters and the vegetarians and pescetarians.
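
To spell out the arithmetic behind that observation, here is a minimal sketch in Python.  It is purely illustrative, and it uses nothing but the ratios from the table above (the variable names are ours, not the study’s):

```python
# Mortality ratios from the 1999 meta-analysis quoted above
# (regular meat eaters are the reference group at 1.0).
ratios = {
    "pescetarian": 0.82,
    "vegetarian (lacto-ovo)": 0.84,
    "occasional meat eater": 0.84,
    "regular meat eater": 1.0,
}

# Gap between the occasional meat eaters and the best-off group...
best = min(ratios.values())
gap_to_best = ratios["occasional meat eater"] - best
# ...versus the gap between regular and occasional meat eaters.
gap_to_regular = ratios["regular meat eater"] - ratios["occasional meat eater"]

print(f"Occasional vs. best-off group:      {gap_to_best:.2f}")    # 0.02
print(f"Regular vs. occasional meat eaters: {gap_to_regular:.2f}")  # 0.16
```

By these numbers, moving from regular to occasional meat eating accounts for roughly eight times as large a difference in the mortality ratio as giving up meat (or switching to fish) entirely would.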

Vegans, naturally, are in a category all to themselves, owing to the fact that their nutritional intake is perhaps more variable within their category than is true for any of the other categories – a careful vegan is better off than anyone.  A not-so-careful vegan?  May as well be playing in traffic.  We’ll get to why in a future post, more than likely, but the odds are that most vegans reading this blog probably know more about how to eat a healthy vegan diet than we do, anyway.  We’re more concerned with elucidating for the omnivore crowd for purely utilitarian reasons, so please, don’t feel excluded.  And for any vegans who don’t know about nutrients typically not available in plant sources, for heaven’s sake, go find yourself a vegan mentor who does.

Now then, back to meat…

We suspect that a great deal of the differential between the occasional meat eaters and the regular meat eaters has less to do with the dietary value of beef and more to do with the effects of a whole host of corollary factors – quantity consumed at any given meal, preparation methods, what else is eaten, etc.  For example, an occasional consumer of beef is more likely to consume fatty fishes (that is, fishes high in omega-3 fatty acids) than is a regular beef eater; as it turns out, omega-3 fatty acids are essential for a host of bodily functions that have a strong correlation to long-term health.  So… it’s not just that occasional beef eaters eat beef; it’s that they also eat other things in greater proportion than do frequent beef eaters.

Likewise, the occasional beef-eater (especially those who are doing their best to minimize the ecological impact they have vis-à-vis cattle raising method – hello grass-fed free-range, good-bye corn-fed, factory farmed) is much more likely than the regular beef-eater to be getting a healthy dose of dark green vegetables and healthy starches (long grain rice, quinoa, etc.) and is much less likely to be gobbling fried foods and processed flour and sugar – it’s not just what they are eating, it’s also what they are not eating.

Then, too, the occasional beef-eater is more likely to be a gourmet, someone who takes the tastes they consume seriously, and is therefore not likely to be eating lower quality cuts of meat, nor are they likely to be eating processed meats.

And, as it turns out, there are strong correlations between heavy consumption of processed meats (hot dogs, bologna, pepperoni, spam, etc.) and several different cancers, as well as cardiovascular disease.  Those same correlations are not found to be red-meat specific.  In other words, there is something about the way in which the meat is processed which makes it inherently unhealthy.  Much the same can be said for processed flour, processed sugar… seems like maybe processing is a bad idea, no?

Lest you think this means there is a green-light for beef consumption, though, just so long as you’re paying extra for the grass-fed good stuff, there are other considerations that require attention. 

Heterocyclic amines (HCAs) are chemical compounds containing at least one heterocyclic ring (a ring built from atoms of at least two different elements) and at least one amine (nitrogen-containing) group – long story short, it’s just a category of organic compounds.  A lot of them are not only beneficial, they are downright essential.  Niacin would be a good example.

However, there are several HCAs which are classified as carcinogenic (cancer causing), and they are created by the charring of flesh.  Like you might find in, say, the famous “bark” (that tasty outside crust) on a particularly well cooked brisket.

Let that sink in for a minute… the thing that marks beef as “really good” for most Texans, certainly, and we’re guessing for most people in other parts of the world… is carcinogenic.  Not “might be”, but “is”.

Now, can you cook red meat without charring it?  Yes, you can.  Does it still satisfy your meat cravings?  We can’t answer that for you.  And depending on the method one chooses, there may still be other health risks involved – meat cooked on a grill or over a flame which is not hot enough to char (and therefore not hot enough to create carcinogenic HCAs) may also not be hot enough to destroy flesh-borne pathogens (bacteria and viruses).  Microwaves can kill those pathogens without charring the meat, but they also have the nasty side effect of changing the chemical composition of meat (and of anything else they are used to heat) in unpredictable and hard-to-quantify ways, especially when cooked in, on, or near plastics.  Microwaves do break down a variety of prions, though, which may be beneficial, in light of…

Prion disease.  One form of which is known as Bovine Spongiform Encephalopathy (BSE).  Also known as “Mad Cow Disease”.  There are actually variations of this particularly nasty affliction for every kind of consumable mammalian flesh, including human flesh, if you’re into cannibalism.  And while some sources of beef are free-and-clear of the potential for BSE (that would be local, free range grass-fed beef), the vast majority of red meat sources for grocers, restaurants, etc. are not. 

The safeguards in place are laughable, given that the only protective measures are rules preventing the use of bone meal made from already-infected animals.  These measures are sensible, of course, in that allowing contaminated animals to be used to make feed for non-contaminated animals would, naturally, spread the condition around.  The problem is, this approach ignores how the condition started in the first place.

Spongiform encephalopathy, whether of the bovine or other variety, is a condition wherein prions (misfolded proteins which can induce normal proteins to misfold in turn, replicating themselves without any RNA or DNA of their own) run amok in the host animal; they invariably attack the central nervous system, and are only noticeable by their effects.  Autopsies done on diseased animals (including affected humans) will find brains eaten away like millions of little Swiss-cheese bubbles.

And while on extremely rare occasions these prions are a more-or-less spontaneous creation in a genetically prone individual… on more occasions than not, these prions are created during the process of ingesting, digesting, and metabolizing flesh from a creature with similar DNA to the affected animal’s own DNA.  Hence the references to cannibalism.

Most beef (and pork… and chicken… and farm-raised fish) in the United States (and increasingly in the rest of the world) is “factory farmed” – that is, raised in cramped conditions and fed a slurry made from a mixture of corn, bone meal, and animal wastes (recycled poop, yum!); which means most meat sources are, in fact, cannibal meat sources.  Animals who have eaten their own kind, or a kind awfully similar to their own.

Given these conditions, it’s not a question of if some new strain of encephalopathy will emerge; it’s a question of when it will make itself known.

Now, there are a few factors limiting the likelihood of onset, and they should be almost as troubling as the event they are forestalling.  A good example is the infamous “pink slime” of McDonald’s fame.  Various industrial processes, such as the “cold pasteurizing” (euphemism for irradiation) of meat, or the use of ammonia baths, etc. are good for removing bacteria, viruses, and even (in the case of irradiation) prions… though if those procedures don’t make you nervous, you are either very brave, or very drunk.

All of which, we are sure, has by now convinced you that it might be easier just to forego that big platter of ribs you were planning on smoking this weekend, right?

No?

Well, at least let us convince you to spend a few extra dollars to make sure that if you are going to continue to be a meat eater, you get your beef from a healthy source.  Archer-Daniels-Midland will do just fine on their own without you throwing away years of your life just to line their pockets.

And make sure that you eat plenty of veggies along with your main dish of choice, no matter in which longevity category you’ve decided to plant yourself.  As we noted when first breaking down the meaning of the statistics, it is quite likely what unhealthy folk aren’t eating that is putting them in the wrong categories; dark green veggies and fatty fishes top that list, so hop to!  We like you; we’d like to have you around reading our blog for a long, long time.


Happy farming!

9/13/14

Water... It's All Wet

天下莫柔弱於水。而攻堅強者、莫之能勝。以其無以易之。
There is nothing softer and weaker than water,
And yet there is nothing better for attacking hard and strong things.
For this reason there is no substitute for it.
--Lao Tse, Tao Te Ching

Shortsightedness with respect to our place in the world seems to be the human condition, and there are sound evolutionary reasons for that.  We won’t go into the nuts-and-bolts of Myers-Briggs personality profiling, but just as a “fer instance” the ratio of those whose primary data processing mode is “sensing” rather than “intuitive” (that is, based on sensory perception, tactile solidity, and temporal contemporaneousness rather than on a holistic and fluid understanding of patterns and trends) is somewhere on the order of a 70/30 split by percentages.

Sensors, naturally, are more concerned with “how things are” whereas intuitives tend to focus on “what things are becoming”.  One of the nicknames for sensors is “guardians” which more or less explains why they are so well adapted for their evolutionary niche – humanity has clearly done well (in terms of evolutionary success) with a wide array of culturally oriented adaptations (see: all technology since the invention of the pointy stick)… but we remain physically vulnerable as individuals, and many of the changes we have foisted upon ourselves have made us more vulnerable still (see: pollution, industrial accidents, snowboarding, etc.)

In general, sensors tend to limit these dangers – for themselves and others – by enforcing conformity, norms, and other social and personal barriers to excessive changes.  And so long as the only agents of change capable of global harm came in the form of avoidable innovations, this was a useful (if frequently irritating, especially to visionaries in the arts and sciences) trait for the majority of humanity to display.  Nietzsche may have railed against conformity-masquerading-as-morality, but in general, it has been a good system.

Times they be a’changin’, though, and it is likely that the majority of humankind is (sadly) incapable of philosophically/emotionally/cognitively keeping pace.  “It takes all kinds,” the saying goes, but it unfortunately doesn’t tell us in what proportion each of those kinds ought to be served up.

Nowhere is this more evident than as relates to the problem of water scarcity, exacerbated as it is by the pressures of climate change, urbanization, pollution, expanding agribusiness, and at least a half dozen other factors we haven’t even yet gleaned.

The CEO of the Nestle Corporation recently made news for stating that “access to clean water is not a human right.”  We’ll pass over why he would even wade into such unfathomably muddy moral waters for now, at least as pertaining to the issues at hand in his case, but the why and wherefore of his attitude is fairly self-evident as it relates to human psychology.

Basically, he is refusing to allow the debate over whether access to clean water is a human right to be framed by those who argue that there are teleological (cause-and-effect reasoning) implications in a moral debate – essentially, he is arguing that the fact that something changes (as an intuitive knows is true of everything) should have no impact on what is “right” or “wrong” with respect to that thing. 

Water is scarce?  “Move.”  So say the innately deontologist (a priori moral framework) sensors, like our exemplary CEO (and like a great many people in business and accounting).  Even when a sensor has some compassion, however, their essential reasoning will be similar – and similarly flawed.  A sensor who believes that access to clean water is a human right will face difficulty in accepting the clear evidence that not everyone can have it.  “Human right” or not, there is quickly going to be a day when it will simply not be possible for everyone on planet Earth to have access to clean water.

And we are not psychologically equipped, as a species, to come to terms with what this is going to mean.

To see some of the psychological trauma involved in the evolution of the human relationship with water supplies, a denizen of the Brazos Valley need not travel far.  In fact, though residents of Bryan/College Station like to think of ourselves as “the” Brazos Valley, the name applies to a far larger group of communities, ranging as far north as the Texas panhandle and even into eastern New Mexico.  And many of the cities and townships which draw their water from the Brazos River (particularly those in the northernmost regions of the Brazos Valley)… are running short.  Their neighbors outside the Brazos watershed are faring even worse.

Wichita Falls, as an example, has recently begun not only recycling grey water, but actually reusing all treated wastewater.  Their potable water supply comes from a number of sources, but among them are waters which at one time were part of the so-called “black water”.  (You’re not really going to make us spell that out for you, are you?)

Now… it’s perfectly safe insofar as it is chemically and biologically virtually identical to other West Texas tap water.  We’re not going to get into the tap water vs. distilled vs. filtered debate at present, we’re just pointing out that something most Americans think of as “icky” has become a part of daily life in one particularly conservative community.  Exactly the sort of community where you would least expect radical changes to everyday life to be welcome, in fact.  But they have bent to necessity, and are now making full use of every form of water available to them.

What would happen, one might wonder, if even reusing toilet water did not provide enough potable water for the community’s needs…?  What if they simply ran out?  This is the case in more than one West Texas community, in fact.  And what those towns have done is to ship in water from other places.  If you were to search the world over, it would be difficult to come up with a better example of the concept of unsustainability.  Yet “just move” is not an answer that most of these people will apply to themselves.  That is what they might say to someone else, from somewhere else, because nothing like that happens here.

The answer most sensors have had to this crisis over the last 100 years (and yes, strange as it may seem, we have known for even longer than that about the idea that getting enough water might be a problem for people living in arid places like, say, most of the region of the United States between the Mississippi River and Seattle) has been to “drill a well”.  When asked where well water comes from, their answer would typically be “from the ground.”

This was, is, and ever will be, what those focused on global issues call a problem.

Groundwater does, in fact, come from the ground.  It typically comes from naturally occurring reservoirs called aquifers, where water has collected (usually over hundreds and thousands of years, though in the case of the largest – and most used by humans – these zones took millions of years to develop) and most of which provide forms of natural filtration that make aquifer water among the cleanest naturally available sources of water on the planet, behind only a handful of glacier-melt sources in purity and desirability.

The problem (and trust us, it is a huge problem) is that aquifers are a limited resource.  They for the most part do not recharge especially quickly, particularly when compared to the rate at which it is possible to pump water out of them.  The largest aquifer system in the United States from which water is pumped is the Ogallala Aquifer, which spans the plains from South Dakota down through Nebraska, Kansas, Colorado, Oklahoma, New Mexico, and the headwaters of the Brazos River high up on the Llano Estacado in the Texas panhandle.

The population dependent on the Ogallala for drinking water is not especially large by percentage – roughly 2 million people out of the 300+ million citizens of the United States – but that does not begin to tell the whole story.  See… about 27% of the irrigated land in the U.S. lies atop the Ogallala.  Just over a quarter of the nation’s irrigated agriculture is dependent upon well water from an aquifer which is shrinking, has been shrinking for a long time, and will continue to shrink until it disappears.

Since 1950, agricultural irrigation has reduced the saturated volume of the aquifer by 9%; to make matters worse, the rate of decline is accelerating.  Between 2001 and 2009 alone, the aquifer declined by 2%. 
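
To see why “accelerating” is the right word, here is a minimal back-of-the-envelope sketch in Python.  It is purely illustrative, and it uses nothing but the two figures quoted above; the year spans are our reading of those figures, not precise hydrological data:

```python
# Rough average annual decline in the Ogallala's saturated volume,
# computed only from the two percentages quoted above
# (expressed in percentage points of the original volume per year).
decline_since_1950 = 9.0      # % of saturated volume lost, roughly 1950-2009
decline_2001_2009 = 2.0       # % lost between 2001 and 2009 alone

rate_longterm = decline_since_1950 / (2009 - 1950)   # ~0.15 points per year
rate_recent = decline_2001_2009 / (2009 - 2001)      # 0.25 points per year

print(f"Average rate, 1950-2009: {rate_longterm:.2f} points per year")
print(f"Average rate, 2001-2009: {rate_recent:.2f} points per year")
print(f"The recent rate is about {rate_recent / rate_longterm:.1f}x the long-term average.")
```

Even this crude arithmetic understates the problem, since the declines are not spread evenly across the aquifer; some districts are almost certainly drawing down their local share far faster than the average suggests.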

The rate at which the aquifer recharges is exceptionally slow.  Scientists are quite confident in classifying most of the water being drawn by these wells as paleowater.  That is, it is water which got into the aquifer no more recently than the end of the last ice age roughly 10,000 years ago, and most likely far earlier than that.

It is, in short, foolish to make any plans based on the availability of water in the Great Plains, which just so happens to be the region we also call the nation’s “bread basket”.  It’s not a question of if this system will collapse.  It’s a question of when.

So…

What’s the good news?

We like to present solutions where possible.  Leave ‘em laughing, it’s said, or when they leave, they’ll never come back.  Myrtle likes visitors, so we’ll do our best to not end on a glum note.

There are several strategies for dealing with water scarcity and the coming of “peak water” (a concept about which much needs to be said, though we’ll say it later).  Some are obvious, some a little less so, and all fit with our general philosophy, which is that smaller is better.

To start with, it should be obvious to everyone (though obviously for some, it’s going to take a little longer to catch up with this reality) that the old industrial agriculture paradigm is doomed.  Big agribusiness can’t last, and it won’t last.  GMOs (about which we have written before) are a dangerous nuisance, but they ultimately won’t succeed because they are the philosophical equivalent of rearranging deck chairs on the Titanic. 

GMOs are just the latest in a series of stratagems put forth by agribusiness to try to make farming profitable in environmental niches where large-scale farming is not even sustainable.  Like every other stratagem, modifying crops to use less water, or to be more pest resistant, ignores the reality that eventually there won’t just be less water in these places, there will be no water in these places.   So the “big” model is dead, it just doesn’t know it yet.

What replaces it?

The small model, of course!  Not merely in the Ogallala region, where it will be the only viable model, but also in the places currently feeding off the degradation of the Ogallala (that would be everywhere else in the U.S., including places not currently experiencing water stress).


When you stop to think about it, this will be a necessity-driven change, as well.  What happens when 27% of the agriculture sector simply vanishes?  People will still need to eat, right?  Who will feed them?  The only people equipped to do so.  Market farmers.  People who grow fruits and vegetables (and eggs, and frequently milk and dairy) in their backyards, in vacant lots near their homes, on their roofs, in their converted garages/greenhouses, and so on.

Ecological and environmental economists (and weird as it sounds, trust us, those are two separate groups altogether) agree on very little, but one thing they both agree on is the idea that feeding people in the future is a very complicated, difficult task.  They’ve got all kinds of numbers and graphs and pie charts explaining why.

They are all full of poppycock.

It’s not a complicated question at all; the model is simply wrong.  We have been searching for almost a decade now, and we have not seen any serious academic investigation on the root question of how much food can be supplied for the world’s needs on a subsistence-plus scale – that is, production by those whose only aim is to grow “enough for my family, plus a little extra”. 

Urban homesteaders are more than a novelty now, though, and it is time for their methods and philosophy to get more attention.  There are some things that urban homesteaders do which directly impact water consumption – rainwater collection systems, for example, or the use of hydroponic growing systems – but there are others, most notably the raising of large quantities of produce on small plots of land, using less water than large producers use, that do not get enough attention.  It is our strong belief that the numbers (should they ever be taken) would support the idea that more people worldwide can be fed from smaller plots than from the large ones.

California… as is so often the case… is leading the way.  There is a state law there which subsidizes the use of vacant lots as urban farming land; essentially, a new form of homestead law that allows urban residents to make use of undeveloped urban spaces for market farming projects.  So far, San Francisco is the only place where the model is being actively used, but we envision success there quickly becoming the model for the rest of the state.  Once California has success with urban farming, we suspect it won’t take that long for the rest of the country to see a good thing and copy it.

When human beings have no other choice, we at Myrtle’s place are certain we (as a species) will eventually get around to seeing things from the intuitive point of view.  Long term, it is not only the only choice for survival, it is also (happily) the most sustainable choice, offering the greatest degree of personal happiness and satisfaction.  So… we guess the good news is, yes it’s going to get worse before it gets better… but it is going to get better.


Happy farming!