The summers when I was in middle school I spent hiking out West, and we did a lot of putting one foot in front of the other in the most gorgeous places I’ve ever been. But I don’t think that I really walked, or knew what walking was, at that time.
On the other hand, the summer after college when I painted my parents’ house, on my days off I would walk from our kitchen door into the woods, pass under the highway via the railroad, and head into a wilderness which, oddly, gets you closer to the city. It was around then that I learned to walk. Maybe it was because it was so unbearably hot that summer, and the only thing you can do in heat like that is wander, and walking is wandering.
I began bird watching that summer, which gives reconnoitering a richness and at the same time floats it in its proper uselessness. That swamp is where you can find the towhees. Over the hill, a line of herons will fly heavily into that flooded bend in the river in the late afternoon. If a walk is going to have a goal it had better be a fanciful one: to revisit a dimly recollected grove of pitch pines and see the extent of the sandy seam they grow in; to see what is just a little further west. No “let’s conquer Mt. such and such.” That’s good when you’re with others to avoid the stress of shared decisions, but when you’re by yourself you’d better not have such things in mind; you need to be able to make the most of what you encounter, and take an attractive avenue or field opening to the left. Thoreau writes about the paradox of walking: with what he calls sauntering you are always under the impression that just over the horizon is the promised land, and you’re always already at home.
The fact that I started to walk in Concord, where I grew up—hallowed New England walking country—obscures an important fact: that the walker is not picky. The walker does not require much. This is why the best walk is one that begins as you step out of your doorway—involve a car and you’re unable, unfree, complex and in debt; your feet won’t touch the ground.
Walking does, however, require one essential item: you do need a working pair of legs. I was getting really good at walking when something went wrong with my legs. That is what this essay is about.
For about half a decade I couldn’t walk. In fact I found myself slowing down year by year like the tin man after the rain. The ailment I eventually discovered I had, an autoimmune condition called ankylosing spondylitis (AS), was baffling to me: constant crippling pain, bones fusing to each other, and, in my case, the threat of dropping dead anywhere anytime. If I had had time to write an essay about this a year ago, when I felt like I was walking waist-deep in water, and the water was pain, it would have been one kind of essay—maybe a better one than I am in a position to write now. It could have been about a phantasmagoria of pharmaceuticals and doctors I’d see for fifteen-minute speed sessions. It could have been an essay about what it is to give up something that you truly love. And it would have been written for my sake much more than for yours.
Instead, however, something really remarkable happened. I departed from my doctors’ recommendations—more on that later—and did what was not at all natural to me: I searched around on the internet for information about the disease, and came to find that a group of people with AS claimed to know something simple but very important. There was a problem with what I was eating. They recommended a radical diet that is a lot like the diet man would have eaten before the advent of agriculture. Nine months ago I gave it a try and since then have been, in a word, cured.
So this essay turns out to be about something quite practical. It turns out that almost all of us are not eating food fit for humans. About 72 percent of the food we consume in Western diets comes from sources that were never or very rarely eaten by our ancestors before the last (evolutionarily negligible) 10,000 years. This can make our bodies fall apart, some in small ways, some catastrophically. You or someone you know has these problems or will, which is why you should read this or something like it.
PART I
But I am getting ahead of my story. Let me begin further back. One day in his first summer of life our family dog Max went swimming—his bliss—and spent the rest of the summer with red blotches all over his stomach that he’d sleepily shake a paw at. Though my parents thought this would pass, the second summer it happened again, and this time the blotches grew into nasty red continents. He only got sicker into the fall and winter. My parents were referred to a pet allergist, and tests determined that Max was allergic to “grass, weeds, mosquitoes, trees [a long list of them], feathers, flowers, tobacco, ants, pollen, mites, histamines, horses, cats [!!], sheep, fleas, dust, mold”—pretty much everything they tested for. Max is blessed with a sweet personality and a lovely coat of hair and this must have helped my parents in the face of the news that their dog’s body was so unaccountably unfit for life on Earth. The latter was simply the way it was so far as the vet was concerned; he told them to brace themselves to fork over big bucks for treatment.
Thankfully, just before it was time for this, an acquaintance of my mother’s suggested that she try experimenting with what Max was eating. I remember seeing the shiny silver bag of dog food with a woman and her dog romantically silhouetted by an orange and gold sunset over the italicized slogan “The thinking person’s pet food.” I remember it because it seemed so snobby and also because it sounded a little like the person was going to be eating the pet food. Now, however, the slogan seems to me, whatever else it is, accurate. “The thinking person’s pet food” appears to imply that the demographic that thinks, and thinks profoundly, buys this product. But in fact it might just mean that a dog is an animal that gets sick when it eats grains (which most dog foods are full of), because the ancestors of the dog never ate them and a dog isn’t designed to digest them. The thinking being referred to could be just the thinking of this thought.
That at no time in the long descent to dogness did any wild dog eat the seeds of grasses, and that this is one weighty consideration in the question of whether a domestic dog ought to eat them, is an instance of one of the most straightforward post-Darwinian practical principles available to us. At the zoo they don’t wait for diet research to roll in before they have an accurate idea of what ought to go into the polar bear compound: they know that this animal eats fish. While a wild diet might not be the last word, it isn’t going to lead one far astray. But somehow such reasoning is conspicuously scarce outside the zoo.
The more specific story, we’re finding out, is that if a dog eats lots of grains it is prone to developing autoimmunity and allergies and other health problems because it has a compromised intestinal tract that lets antigens into the blood stream. As far as anyone can tell, Max has been hale and fit ever since he started eating his new food. No more horrendous itchy red continents on his underside. At twelve—getting up there as dogs go—he is a lithe blond beast, and swims, when given the chance, for entire afternoons at a stretch. On a recent visit home I found that when Max gets back from a long walk he still sprints a few smooth laps around the house, sending the turf flying.
Max was the first in the family to return to the diet of his forebears. My mother was next. She had some sort of mysterious gastrointestinal issue, and she followed her dog into the past. The human equivalent to the kind of wild-informed diet Max is on is what is now sometimes called an “ancestral,” or “archevore,” or “cave man” diet, but most often a “Paleo diet,” because it tries to mimic man’s diet as it was in the long Paleolithic era from 2,600,000 B.C. to 8,000 B.C., when we were hunter-gatherers and the makeup of our gastrointestinal system and metabolism was in large part molded. My mother eats a version of a basic Paleo diet: plenty of fish and leafy greens, grass-fed beef, no grains (this covers rice and corn), no legumes (a.k.a. beans), no dairy, no refined oils, little starch, sugar or alcohol.
My own path to a similar diet—not quite a Paleo diet, as I will explain— was rougher than Max’s or my mother’s. It also involved one big blunder.
●
Normally it is a wonderful thing that we have an immune system. This is the part of our body that seeks out and destroys or disables “antigens”—viruses, harmful microorganisms and other foreign things that have entered the body. We wouldn’t survive long in the world without it. But the immune system, we now know, is capable of making mistakes. Some of these are inconvenient but (in most cases) forgivable, like allergies. Allergies are an inappropriate immune response, where one’s body comes to see something that is in fact harmless—feathers, etc.—as a threat.
A different kind of mistake that the immune system can make is much more serious. An autoimmune disorder is a disease in which the immune system accidentally slates for destruction some of one’s own healthy cells. It is the body’s friendly fire, only focused and relentless. This sounds like something exotic—“yes Mr. Bond, the viper’s poison will use your body’s own defenses to destroy you.” So it is. The thing is that today it is more and more common.
Around fifty million Americans, or one in six, suffer from an autoimmune condition, though few people could name one of them with confidence (no, AIDS isn’t). We spend around $120 billion on treatment for autoimmunity each year. To give you some perspective, we spend $70 billion on cancer. This isn’t counting allergies or asthma, nor well-known diseases such as schizophrenia, Alzheimer’s, autism, atherosclerosis or Parkinson’s disease, that are now thought to involve autoimmunity.
Now people like to hear about others’ health maladies about as much as they like to hear them relate their dreams at the breakfast table. So I’ll just tell you where I ended up. After several years—the latter half of my twenties—in which I didn’t know what the fuck was happening, I had three big problems. One was that I couldn’t sleep. Another was that I experienced excruciating pain in my legs and lower back that unaccountably migrated from place to place. The last and in some ways the most distressing was that a series of echocardiograms, the first of which was supposed to be a routine just-in-case kind of thing for palpitations I had been feeling, determined that my heart was only pumping out about 30 percent of its volume with each beat. A healthy heart pumps 55-60 percent. This kind of heart failure almost always gets worse, and I was in immediate danger of what doctors have given the gentle euphemism “sudden death,” where the heart simply stops.
After much meandering from one hypothesis to another, last year I was definitively diagnosed with ankylosing spondylitis. “Spondylitis” means inflammation of the vertebrae, the source of the sufferer’s chief complaint. It’s really inflammation of the spine and sacrum, and it can also affect other organs like the eyes and, occasionally, the heart. “Ankylos” means bent, referring, I found out, to the disease’s most dramatic symptom. Eventually the ligament that runs down the anterior side of the spinal column calcifies, curling the back into a hardened hunchbacked position. I was diagnosed when an MRI showed the milky beginnings of this process.
Why was my body attacking itself? Immunologists now think that an adequate answer to that question may begin with what was going on in my digestive system. Let me explain what I was eating at the time.
●
Although I grew up in Concord, I never really read our local authors until I left—in fact when I arrived at college I hadn’t read a word of Emerson, only glimpsed, out of the corner of my eye, poetry and epigrams etched into stone around town.
As for Thoreau, I believe I could have come up with “I went to the woods because I wished to live deliberately” and drawn the lineaments of his cabin. The exception was that I did know what Thoreau said about food: I had had to read it aloud in English class in the tenth grade. (This in lieu of reading the book on our own, our teacher having told us, from experience I suppose, that if we swam out into Walden we would drown.)
What I later remembered from the passage I read—it is remarkable how well one remembers what one is responsible for before an audience—was Thoreau’s complaint that meat is literally a bloody mess. He says that catching, cooking, etc., his fish takes forever compared to, say, planting potatoes and harvesting them. “The practical objection to animal food in my case was its uncleanness; and, besides, when I had caught and cleaned and cooked and eaten my fish, they seemed not to have fed me essentially. It was insignificant and unnecessary, and cost more than it came to.”
When I got to college I finally read Walden; in fact I read it more often and more seriously than any other book in my life and it became my good friend; and this time I didn’t drown, I don’t think so anyway; I drank up its ideal of simplicity and self-reliance—simplicity as the unappreciated center of the ethics and economics of a responsible life as well as a happy one. I started to try to live up to this in what way I could: taking up the flower over the opera, for the joy of what is wild and gratuitous, not coveted, not sitting on top of grandeur with complex conditions. This, as I understand Thoreau, is one aspect of how to cultivate an ability to take day-to-day undeferred, untriangulated joy in one’s life.
I still think that Thoreau is right about that. The point for our purposes here is that when I was twenty or so I became a vegetarian. Meat was complex, complicitous, expensive and unsustainable. Plants grew straight out of the soil—food for someone living lightly.
I was a vegetarian—a good one who really ate lots and lots of vegetables, not a bageltarian—for five years.
●
Vegetarians like to point to their favorite big vegetarian for evidence that plants can provide all we need for nourishment. Thoreau’s example is the ox:
One farmer says to me: “You cannot live on vegetable food solely, for it furnishes nothing to make bones with”; and so he religiously devotes a part of his day to supplying his system with the raw material of bones; walking all the while he talks behind his oxen, which, with vegetable-made bones, jerk him and his lumbering plow along in spite of every obstacle.
Strictly speaking Thoreau is quite right: it is possible to make bones out of vegetables. We irresistibly tend to imagine that eating muscle makes muscly men; fat, fat ones. A vegetarian is subjected at Thanksgiving to the inevitable uncle chiding, “you need to put some meat on you” and gesturing to the turkey; and Thoreau is correct: this is more or less fallacious.
On the other hand, it is a terrible idea to draw inferences about nutrition and human health from the diet of the ox. We now know that the process of turning raw material into energy and body structure works quite differently in different animals. An ox is a ruminant. It has four stomachs, the first of which, the “rumen,” is designed for digesting grass. This 25-gallon tank contains a huge population of cellulose-breaking microorganisms that we lack entirely. The ox regurgitates and re-chews the contents of its rumen something like 500 times over eight hours to make the most of this extremely nutrient-poor food source—utterly unlike the way we do things.
The example more favored by vegetarians today is the gorilla—much closer to us on the family tree and also a fun one because with the gorilla everyone gets to imagine a leaf-eating silverback playfully pulling one’s skeptical meat-stubborn uncle to pieces. The gorilla, however, is what we would have become if we had evolved a survival strategy like that of the ox. Our line of descent diverged from that of the gorilla over seven million years ago, and in that time gorillas acquired cellulose-processing gut flora.
We took our own way. At no point in our long descent to Homo sapiens did we ever eat grass, as do the gorilla or the ox. Rather we ate the ox, or at least its ancestor the aurochs. To be more specific, we were once tree-dwelling frugivores. Then the American continents collided, which changed the circulation of warm water worldwide. The heavily forested areas to which our ancestors were adapted gave way to an increasingly open landscape, and it was then—about 2.6 million years ago—that we began a brilliant career as the tool-wielding opportunistic omnivore. We grew a huge hungry brain and a shorter gut designed for quickly digesting rich fatty foods.
We don’t know exactly how much meat our hominid ancestors were getting their hands on. Tellingly, modern hunter-gatherer societies take in roughly 65 percent of their calories from animal foods (on average), to our roughly 25 percent. They eat considerably less starch and sugar.1
●
A lot of people, myself included, first heard about the ills of carbohydrates with the phenomenon of the Atkins diet, and assumed that this began with Atkins. In fact the idea that the volume of carbohydrates consumed in the standard Western diet is harmful to humans has been appreciated for a long time. Among the unexpected revelations that came out of British colonialism in the nineteenth century were the reports that returned from the field regarding the absence of civilization’s chief chronic illnesses in pre-agricultural peoples. This came from all corners. Doctors practicing in indigenous non-Westernized populations in Africa, Asia, the South Pacific and America would see plenty of certain sorts of ailments: broken bones, malaria, gangrene. But again and again the doctors observed that a whole range of problems—obesity, cancer, atherosclerosis, asthma, osteoporosis, and a long list of others—were either absent or extremely rare. They were absent, that is, until Western agricultural foods began to be incorporated into the local diet.
It was the same list everywhere, and this led some to call these diet-related diseases “diseases of civilization.” In the twentieth century anthropologists lent support to the general idea when they reported that man’s dental mass and body stature had diminished with the advent of agriculture.
The fact that the most conspicuous change to our diet after agriculture concerned our intake of starch and sugar was not lost on health professionals at the time that these discoveries were made. Furthermore the idea that it was carbohydrates, not protein or fat, that were the chief culprit, fit with the evidence that came in from people living with the super-healthy Masai, who eat exclusively blood, milk and meat; or the Inuit, some of whom get close to 99 percent of their calories from animals. To take one example, there was not a single known case of breast cancer anywhere among the Inuit until 1966—and yes, they were looking for it.
In the early twentieth century, however, a competing theory came on the scene very suddenly. Some hypothesized that the diseases of civilization must be related to food abundance, and blamed what they thought was an increase in calorie consumption in general—the theory of the so-called “diseases of affluence.” Later many of this persuasion began to focus more specifically on saturated fat.
In the 1960s the medical world was well aware that the evidence that this new theory had going for it—epidemiological evidence comparing the diet and health of (selected) disparate populations—was weak. This was the kind of data that is useful for drawing up fresh, to-be-tested hypotheses, but nothing more than that. Still to be performed, of course, were the rigorous scientific studies. These would be large-scale observation and intervention diet studies within a given population, to test more conclusively, in a controlled way, whether or not it was true that high calorie consumption in general and fat consumption in particular were bad for human health.
The story from here on, as Gary Taubes recounts it in his Good Calories, Bad Calories, is an edifying tale of epistemic drift. By the mid-1970s there had been a sort of sea change, and many, including those on record as saying that the more conclusive studies were eagerly anticipated, began acting as though they had already been performed. They had not been performed because they were too expensive. Just a few years later, the idea that fat consumption causes problems like obesity and heart disease had been accepted and canonized so thoroughly that the older hypothesis on starches and sugars was abandoned just because it was so hard to square with it. The rest is history—history that, by way of the 1977 McGovern commission, the USDA’s Food Pyramid with its base in spilling bags of golden grains, and a host of institutions and ads and so forth warning of the hazards of fatty foods, led directly to my dim sense that my vegetarianism wasn’t just ethically and environmentally right, it was also going to help me live long and lean.
●
In the spring of 2005 I began to feel what I at first mistook for a pulled hamstring. This became very painful, and the pain progressed aggressively. Soon I was sort of tottering down the sidewalk, trying to reconstruct how it was that walking was supposed to flow along.
Not long afterwards, pain between my shoulder blades was keeping me up in the middle of the night, and while I was lying in bed I would hear the normal thump-thump … thump-thump of my heartbeat, but then, occasionally, a thump-thump, thumthi ……. THUMP thi ………… THUMP thi…. This was what brought me in to discover the heart condition.
In 2005, after the heart diagnosis, I gave up the vegetarianism just in case. I feel now that I could have been much wiser and more proactive than that in regard to the role of food. For there was one big diet-related clue I could have picked up on.
It is a peculiarity of mine that when I work intensively at a project, I often sort of forget to eat. When my stomach finally forces me to open the fridge and stretch out my hand for the grapes, I unaccountably veer back to my work by some sort of override. It becomes noon, 1:00, 2:00. The later it gets the more focused I am on the redeeming rendezvous with wisdom that I keep feeling is just now upon me. At 3:00 in the afternoon I will have had nothing but a mug of water.
On such days I would notice that my pain disappeared. At the time I put it down to distraction and adrenaline and the healing heat of creative work. As my brain was being trashed, my body, which by now would be found pacing back and forth from desk to door, was lucid. Sitting at a desk is supposed to make you stiff and ragged, but in comparison to the way I generally felt my body was sailing the floor, floating forward by my gaze.
Reader: if you’ve got a chronic health issue, especially one that may involve inflammation, try a 24-hour fast. If the symptoms abate you’ve got good reason to believe it has to do with what you’re eating. I didn’t make the connection. Given what we now know about AS, this experience makes a lot of sense.
●
In 1973 a big advance was made in understanding AS when it was discovered that almost all AS patients share a common gene called HLA-B27. A lot of people have HLA-B27, and only a small proportion of them—about 5 percent—develop the disease. It appears that AS also requires an environmental “trigger” to set it off, which is thought to be true of almost all autoimmune conditions.
Dr. Alan Ebringer is a London immunologist, and the lead protagonist in the story of diet medicine for AS. In 1976 Ebringer reported that he had caught the culprit. A common bowel bacterium named Klebsiella pneumoniae looks like the HLA-B27 protein. Ebringer proposed that the two are so similar that the immune system mixes them up.
Today most researchers think the claim that a Klebsiella overgrowth causes AS via molecular mimicry is probably overstated at best. But most agree that gut flora is the right place to be looking for AS’s pathogenesis.
Gut flora are microorganisms whose environment is the gastrointestinal tract. When we’re in the womb our tract is sterile and doesn’t harbor any bacteria at all, but as soon as we eat and breathe we begin to seed our own personal populations of these mole people. Then they multiply—if you could extract all of the microorganisms living in your gut and pack them together it would look like a grey-brown softball and weigh about three pounds: upwards of 100 trillion microorganisms, ten times the number of cells in the human body. We are beginning to appreciate what a big influence they have on our health, even if our understanding of the causal connections in play is inchoate.2
In many ways the human body, having evolved in the context of gut microbes, is designed to coexist with them. We give the bacteria a place to live and a steady supply of food, and some of them, the ones we call “probiotics,” apparently go to work for us.
We don’t always get along with them, however, and this brings us back to diet. Remember that the human digestive system was set up to handle the kind of food we were eating in the Paleolithic era, and hasn’t changed significantly since. People who advocate a Paleo diet think that all sorts of things go wrong— namely the aforementioned “diseases of civilization”—when we stray too far into the modern Western diet. What may be particularly pertinent in my case is that our healthy, predictable relationship with the predictable populations of bacteria in our gut depends on our old, ancestral diet.
Especially problematic in regard to gut flora are starches. Starches are long chains of sugar molecules that have to be “broken down” into simple sugars in the digestive tract before we can absorb them. Unlike simple sugars, which are pretty much always metabolized, about 10-20 percent of the starches we eat remain undigested in the tract and pass into the colon. There they feed bacteria.
Paleolithic man would have encountered starch primarily in a few fruits, vegetables and roots—a relatively small amount. But 10,000 years ago—far, far too recently to have given our DNA any time to adjust—along came agriculture, which is for the most part the cultivation of starch-rich plants for food. Particularly irresistible were grains. The fact that grains are so starchy is in part what makes them reliable, storable, shippable: starchiness is what makes grains a great commodity.
The USDA food pyramid I grew up with had us taking in about 40-50 percent of our calories in the form of starches. This is one of the rare places where American habits have been on par with health recommendations; and as a vegetarian replacing meat with grain products and beans, I was taking in quite a bit more. The result was that my gut flora population—my “microbiome”—was (to put it vaguely, since we lack the details) out of whack.
This, many immunologists now believe, may explain how it is that my body came to attack itself: the constitution of my gut flora was not to my body’s liking, and my body, attempting to beat back an infestation, ended up accidentally attacking the healthy cells in my spine, sacrum, heart and who knows what else.
PART II
In 2009 I was finally diagnosed with AS. Even then, however, I remained in the dark about the possible role of diet in the disease. Once you have a diagnosis for AS you are referred to a rheumatologist. Mine looked like a battle-hardened wrinkly witch, but she didn’t turn out to be as interesting as she looked. She took me into her office and breezily went over the basics of AS, holding up a plastic chart of what looked like a schema of WWII battleships but was actually an array of anti-inflammatory pills. Five months later I had tried three of these and felt quite a bit worse. I called to find out what I should do and was told that next in her “big bag of tricks”—these were her assistant’s words, though it didn’t seem like a big bag to me—was to begin immunosuppressant infusions.
This was what was on my mind a year ago. The general public doesn’t know much about autoimmune diseases, but investors do. The immunosuppressant drug Humira is set to become the most lucrative pharmaceutical in the world in 2012. Drugs like Humira run a patient $15,000-25,000 per year and higher. Because immunosuppressants knock back the body’s defense system as a whole, they also make you more susceptible to serious infections, cancer, congestive heart failure and liver disease, to name just some of their side effects. Nevertheless they are hailed as a miracle by doctors, and for good reason: they are often the last line of treatment for otherwise debilitating pain and degeneration, and can give a patient who has been suffering for years profound relief within weeks. They have a positive effect on AS symptoms about half the time.
On the same day I learned that immunosuppressants were next up for treatment options, I remember my wife seeing me struggling to lean over to unload the dishwasher and saying: “Oh just go lie down.” It looked like I was probably going to need to give immunosuppressants a try, though this couldn’t be entered into lightly because apparently they can be hard to get off of once you’re on them.
First, though, I thought I should try fiddling with my diet. By this time, as I’ve related, both my mom and Max were on ancestral foods. Also a friend of mine was a proselytizing “Paleo person” with his own blog. I remember Paul at Thanksgiving picking up a huge turkey leg and gnawing on it. Unconvinced that there would be enough dark meat to satisfy his need for fat, he had shown up with his own personal duck. Paul looked like he was consuming about three times more calories than the rest of us, which, to be frank, I found disturbing, but I had to admit he was acting about three times more vigorous than I was feeling. Where was the usual Chicago grad student late November pallor? So I had one big advantage that you, reader, may not enjoy: seeing is believing.
(Incidentally Paul also once told me that it is a good idea to sit for an entire week doing nothing at all when the mood comes upon one, which he figured was what it was like for cavemen back in the day from time to time.)
It was with my inquiries into the future of Paul’s heart health, and his responses to them, that I first began to pay attention to the history regarding fats and dietary recommendations I’ve related above. Today the large scientific studies anticipated in the 1960s are actually beginning to roll in, and by and large the results do not tend to support the idea that saturated fat is bad for you. What continues to pile up is the evidence that starches and sugars at the levels we’re consuming them can be very dangerous. For lots of people, this leads to insulin issues. For others a Western diet, and especially grains, may have the effect of damaging the digestive tract, and causing what is called “increased intestinal permeability.” The end result of increased intestinal permeability is that we fail to properly screen things that pass into the bloodstream, making the whole unfortunate molecular mimicry cascade to autoimmunity that much more likely, just because there is so much more for the immune system to fight and make mistakes with. And then there are the gut floral issues I have described.
Long before I knew I had AS, Paul had mentioned that a Paleo diet alleviates autoimmunity in some people. When I got my diagnosis I remembered the claim. So then, really as a way to sort of prepare myself for a life on immunosuppressants, I decided to experiment with a Paleo diet just to be sure, to rule out everything but the immunosuppressants, I guess you could say.
This was in December of 2011. Doing a little research on my new diet quickly led me to a blog by a guy who had AS and said that, as everyone knows, the key with AS isn’t quite Paleo, but a diet that eliminates all starches. So I switched my dietary test to that instead. I thought it would most likely be a temporary thing—I’d give it two weeks.
●
The idea that you could effectively get over active AS by eating less starch wasn’t immediately obvious, even to those that initially proposed the gut flora connection. Fortunately, at this point serendipity stepped in. A patient of Dr. Ebringer’s who saw him for AS in the early 1980s asked him if he had any advice for losing weight. Ebringer, for reasons that had nothing to do with arthritis and autoimmunity—and departing, obviously, from the opinions of the time—recommended a diet of red meat and tomatoes. The patient took the advice and did lose weight, but more importantly he also came back reporting something shocking: his pain and inflammation were gone as well.
A light went on in Ebringer’s mind: these foods don’t contain complex carbohydrates. The starch must be what feeds the relevant gut flora; and gut floral overgrowth must not be a one-time passing “trigger” for AS but an ongoing cause and condition of its pathology. This means that we can treat AS by reducing the starch we take in. Ebringer came up with what he called the “London AS Low-Starch Diet,” which he administered to patients at his clinic. This basically consisted of four prohibitions: NO BREAD, NO POTATOES, NO CAKES (in Britain this comprehends pastries), NO PASTA. He published his findings on the diet in a 1996 paper, which reported that of the 450 patients he treated at his clinic in London between 1983 and 1996, over half now required no medication. His clinical intervention trial of 36 patients showed a statistically significant decline in markers for inflammation over a period of nine months.
In effect, the idea behind the diet is that eating less starch has the same result as my accidental fasts: it starves out certain gut flora, this time by more precisely targeting their starchy substrate. Bacteria are notoriously hard to eradicate entirely but they do die without sustenance. Whether or not Ebringer turns out to have been correct regarding the role of the Klebsiella organism specifically, it appears that he discovered how to change the constitution of one’s gut flora in such a way as to curtail AS autoimmunity.
The Spondylitis Association of America, the largest disseminator of information to AS patients in the world, until recently made the very visible and important claim on their website regarding Ebringer’s findings that “other studies … have found that the diet has little or no effect on symptoms.” When I ran across this statement, I contacted them to find out more. What followed was an exchange of emails in which I basically asked over and over “but what studies specifically?” They ended up dropping the claim from the site. Because, as they then found, there have been no such studies.
Why the SAA made such a stark and egregious error I’m not sure. (To their credit they amended it quickly.) Why no one in the medical world other than Ebringer and his colleagues has touched the diet, why indeed no large-scale intervention diet studies have ever been performed on AS patients, and why rheumatologists pooh-pooh the very idea of diet-based medicine in spite of mounting consensus among researchers regarding the role of intestinal health and gut flora, is a different discussion for a different time.
●
The idea of treating AS with a diet low in starches could have languished in Ebringer’s ignored 1996 paper, and shed no light on the likes of me, if it weren’t for the excitement and evangelism of Ebringer’s healed patients, and later those influenced by them (this essay, for example, being my contribution).
By the late 1990s those who had their legs back because of Ebringer’s diet connected his work to two of the diet’s present-day heroes. First, someone discovered that there was already a book on eating starch-free called The IBS Low-Starch Diet by a layperson named Carol Sinclair. Sinclair had discovered, after much tinkering and testing, that she could cure her irritable bowel if she proscribed all starch. Her regimen is much more thorough than Ebringer’s rather low-key prohibitions, and today her book is the main how-to treatise for people like me who are on a strict “No Starch Diet,” or NSD. Sinclair recently discovered via genetic testing that she had AS all along.
The other individual Ebringer’s patients connected with was an AS sufferer the no starch community knows as Dragonslayer. Dragonslayer is NSD’s apostle to the internet, the co-founder and owner of kickas.org, the main forum for information on the diet. Dragonslayer has the quirk of capitalizing the word “you,” and when provoked by NSD skeptics he uses words like “demise” and “downfall” in sentences that sound like they should be followed by “mwahahahaha,” e.g. “And with every statement, You crawl further out on the limb You are sawing off.” “NONSENSE! You might want to re-read the study and pay attention this time. There may be a quiz at the end.” Since kickas.org went online in 1999, Dragonslayer reports that he has personally helped over two hundred people to remission through NSD via his online correspondence—which, if it’s accurate, is an astonishing achievement for a layman who never sought to make a penny from it. The night that I began the diet I read Dragonslayer’s “NSD QuickStart and tips.” While in general I try to avoid throwing away my sunny days on the internet like I’m slack-lipped at a rummage sale, this easy ability to share and glean information outside the orthodox medical community radically improved, and maybe saved, my life. I could not be more thankful to those who made the diet a presence online.
PART III
In reading what I am about to describe, you must remember that my intention is not to convince you to try a draconian diet like mine. Such things are for gimps like me. I’m trying to convince you to lean toward our hominid past and thus avoid autoimmunity as well as other diet-related diseases, which together affect 50-65 percent of people eating a contemporary Western diet worldwide.
Some of the things that NSD prohibits: all grains (which includes wheat, which of course means pastries, pasta, etc.; also corn and corn products, rice); all legumes; all root vegetables; a few fruits (like bananas); and certain nuts, including peanuts (anyway they’re peas not nuts).
Some things are tricky and their starchiness has to be determined item by item. For this I use an old middle school chemistry class trick I learned from Sinclair’s The IBS Low-Starch Diet: iodine turns blackish blue if you put a droplet on something starchy. All tomatoes need to be tested in this way. So do apples, vine fruits like watermelons, and coconuts.
That leaves one with all animal foods, most fruits and vegetables, and the remaining nuts. People on NSD can eat dairy, which makes the diet in this one sense more permissive than one that sticks strictly to Paleolithic foods.
This is all, needless to say, an immense pain in the ass. In the first weeks that I was eating the food of the hunter-gatherer, I felt a bit like a hunter-gatherer must have felt in a world where food was hiding and had to be ferreted out from the rocks. At the supermarket I soon found it was not worthwhile to venture into the actual aisles of the store, since almost everything that can be found there is starchy or has had starch added to it. I had to stick to the periphery. There is often literally nothing I can eat at gas stations.
As I say, initially I was sure this was just a two-week experiment. I would soon return to my old diet, the diet of my friends, family and neighbors. I thought it most likely that the online NSD people were willful witnesses and hypochondriacs. Right around two weeks in, however, when my trial was to conclude, I started to feel significantly better in my legs and back, and I understood that this was for real.
The community on kickas.org has built up an edifice of anecdotal info: it appears that the no starch diet doesn’t work for all AS sufferers, and it works best in those who catch the disease early. Relief arrives in two to ten weeks or so (it isn’t as immediate as fasting), and the pain will stay away unless you eat starch, in which case you are likely to get a “flare up” of symptoms. You need to be really rather strict with the diet for it to work well: a bite of yogurt that contains corn starch will set you back.
The latter has indeed been my experience. Diets like mine are very difficult, and I wish I could congratulate myself on the kind of rare restraint that often has to go into such things. The truth is that I open the cupboard at my wife’s parents’ place and am presented with food that, if I ate it, would put me into face-altering pain within five hours, pain that would last two or three days. Whatever respecting that is, it cannot be called restraint.
●
It took two weeks for me to see results, which was very motivating. But the diet of course took much longer to get accustomed to. In the end, what had to happen was a kind of reimagining of what was reasonable in regard to eating meat, and especially fat, and also a sort of magnification in my mind of foods I was already familiar with.
This morning when I got downstairs I sliced a sausage in half and fried it. Once it was almost done I drained off the fat and heaped in baby spinach so that this stuck out over the top of the pan. The result looks like two brownish-gold submarines in weeds. I keep poached eggs in ramekins in the fridge, and I heated two of these in the microwave with some butter (this is how restaurants do it—happily it’s quite delicate). I put blackberries around the edge of the plate.
For a long time I don’t think I quite realized that this or something like it was now to be an every-morning thing for me. It still felt like a once-a-month brunch. Incidentally, however, in this country, right up into the late nineteenth century, for those who could afford it this was a standard morning meal: Thoreau’s contemporaries very often ate a breakfast of eggs and meat much like mine. A meal of boiled grains such as farina would be the last resort of the poor, one of Dickens’s symbols for their suffering.
Actually American expectations in regard to breakfast have a rather dubious history, one worth recounting now, as I’m sure, reader, you’re thinking that I’ve started to stray beyond the bounds of what’s reasonable.
The story of dry grains and our notion that they’re the healthy way to begin the day got its start in religious radicalism. In the early nineteenth century, a Presbyterian minister by the name of Graham, namesake of the cracker, began to promote a high-fiber vegetarian diet as a way to curb carnal passions. He was especially obsessed with “self-abuse,” what they used to call masturbation. People had believed for a long time that a diet that excludes meat, the diet of the holy ascetic, reduces libido. Today there is actually a lot of evidence to back up that claim. In the nineteenth century with people like Graham, however, there emerged the erroneous notion that sexual arousal was itself terrible for our health—the problem, so he wrote, at the root of most modern ailments. So the vegetarian diet was not just the most spiritually pure diet, but the healthiest human diet too.
Graham influenced those who went on to radically change our culinary habits. Among them was a founder and the head of the Seventh-day Adventist church, Ellen White. The Seventh-day Adventists are best known for their belief that Saturday is the Sabbath. They also have a very literal interpretation of Genesis 1:29: “Behold, I have given you every herb bearing seed … to you it shall be meat.” They are vegetarians. For religious reasons they believe that man is not supposed to eat meat. Now Graham and his followers were conveniently adding that this vegetarianism is best for the body as well as the soul. White and the Adventists founded a sanitarium in Battle Creek, Michigan, “The San”—the setting and subject of the 1993 comic bestseller The Road to Wellville—to promote the new health principles, and appointed their fellow parishioner and physician John Harvey Kellogg to run it.
Dr. Kellogg had the charisma and commercial savvy that Graham lacked. He shared with Graham an abhorrence of sexuality. In his manual on avoiding sexual arousal and activity, Plain Facts For Old and Young, Kellogg describes the procedures he used for preventing masturbation at The San:
The prepuce, or foreskin, is drawn forward over the glans, and the needle to which the wire is attached is passed through from one side to the other. After drawing the wire through, the ends are twisted together, and cut off close. It is now impossible for an erection to occur.
Quite simple really. And there was something for the ladies: “In females, the author has found the application of pure carbolic acid (phenol) to the clitoris an excellent means of allaying the abnormal excitement.” Kellogg was a piece of work. When he couldn’t cure someone, he would accuse them of masturbation. He himself was married but celibate, penning parts of Plain Facts on his wedding night. He would have an assistant give him an enema every morning after breakfast.
It was Kellogg who bequeathed to us the reason still given for the healthfulness of dry cereals: that they cleanse us. Really everything Kellogg did was in one way or another about cleansing. He spent much of the latter half of his life coming up with ways to purify the white race (some sort of scheme involving nuts). At the top of Kellogg’s list for avoiding the arousal of the blood in carnal lust and its attendant maladies was bland, high-fiber vegetarian food, especially dry cereal grains, which would pass through the intestines and rake out matter that he thought stimulated the genitals. At this time, with The San at the middle of the action, many of the dry cereals that we know today and that still command a lot of shelf space at the supermarket were developed and marketed: shredded wheat, granola, bran and corn flakes. All were given the same recommendation: that they would keep us clean. Shredded wheat actually looks (and tastes—as Kellogg himself noted) like a bristle brush.
In the twentieth century a lot of fiber-rich dry foods like breakfast cereals and crackers had the fiber processed out of them to make them tastier (Graham would have shaken his head at the sugary sweet fate of the Graham cracker, which basically became a cookie). The scrubbing action of fiber was ripe for a new revival, and that revival—again led by religious men, this time two British missionary doctors, Denis Burkitt and Hugh Trowell—is what we’re still living through.
But aren’t I committing a genetic fallacy here? However misguided its beginnings, surely the benefits of fiber have now been backed up by modern dietary research? Actually no. Not even close. In the 170 years since the cereal revolution, very few controlled clinical or large-scale observation studies of dietary fiber from grains have ever been performed—seriously—and in those that have the results have been dismal. As Gary Taubes writes of the hypothesis that fiber is important for human health: “The pattern is precisely what would be expected of a hypothesis that simply isn’t true: the larger and more rigorous the trials set up to test it, the more consistently negative the evidence.” For fiber, as with anti-fat recommendations, the royal road ran from hypothesis to advertising to accepted health doctrine, without making a stop at meaningful, rigorous research. Only now is that research beginning to roll in. On one point the fiber enthusiasts appear to be correct: fiber does lead to more “regularity” and bigger poops—the obsession of octogenarian weirdoes. The trouble is that we have yet to find any correlation between bowel regularity and a single significant mark of human health, including colon cancer.
The data on fiber isn’t conclusive yet, but by all appearances fiber is going the way of the douche. We used to think that a lot of other things needed to be cleaned: vaginas, veins, throats, ears, sinuses. If there’s a hole, someone has thought that it would be a good idea to get up in there regularly and go to work. With all of these most of us have returned to the common sense idea—having, today, perhaps a little more trust in the body—that the inaccessible insides of an animal will generally do a fine job keeping tidy on their own.
Pretty much everything Kellogg did at The San has turned out to be arrant quackery. What hangs on is our prejudice as to what’s for breakfast. Kellogg was also a leader in the new formula that set the standard for making money in food development in the next century and a half. Basically big business in marketing food has to focus on what it can commodify and brand. It’s not easy to do this with my breakfast—eggs, sausage, spinach, blackberries—at least not in a proprietary way. You can brand things that you’ve changed, cooked up, modified to make edible, and you can commodify what you can store and ship. This you can give a name like “Grape Nuts.” “Grape Nuts,” neither grape nor nut, was invented by C.W. Post, one of Kellogg’s patients at The San and another deplorable mix of religious fervor, charlatanry and commercial savvy.
Interestingly there is one thing Dr. Kellogg may have gotten right: he appears to have been ahead of his time in his speculation that many modern ailments have to do with intestinal flora. Only he thought that this should be treated by eating bristle brush-like substances, and the liberal application of yogurt up the butt.
It so happens that one paper I’ve seen cited in the context of AS and gut flora is a 1973 study of 47 contemporary Seventh-day Adventists. The vegetarian Adventists had on average 30,000 Klebsiella organisms per gram in their feces—the organism that may cause AS via molecular mimicry—compared to 700 organisms per gram in the controls on a normal mixed American diet: in other words, these vegetarians had more than forty times the amount of this microbe in their gut.
There have to date been no epidemiological studies on the link between vegetarianism and autoimmune disease, though many suspect one. For what it’s worth, all four of the people I know with an autoimmune disorder, myself included, were vegetarians in their youth—tragically, two still are.
●
Figuring out Breakfast, or I guess you could say allowing it, was one of the first big adjustments I had to make on the diet. There are plenty of things that I still miss, of course. One that seizes me from time to time is that without starches, there are very few things that I can eat that are crunchy. Not a chomp like a cucumber but a dry crackly crunch—apparently one of the few food sensations a baby doesn’t have to learn to like. I recently baked kale chips purely for crunch, but when I bit into them they made a crinkle.
In general, however, I’ve found that I’ve made a good bargain when it comes to delight and satisfaction in food: starch and sugar for fat. And there is nothing in the diet that is like choking on a bran muffin.
Thoreau, a lax vegetarian, a vegetarian “in his imagination,” as he says, lets fly in one of Walden’s many parrhesic paragraphs:
As I came home through the woods with a string of fish, trailing my pole, it being now quite dark, I caught a glimpse of a woodchuck stealing across my path, and felt a strange thrill of savage delight, and was strongly tempted to seize and devour him raw; not that I was hungry then, except for that wildness which he represented. Once or twice, however, while I lived at the pond, I found myself ranging the woods, like a half-starved hound, with a strange abandonment, seeking some kind of venison which I might devour, and no morsel could have been too savage for me.
The reformed vegetarians I know have some such story to tell. My recollection, which I suppose is the other side of Thoreau’s, is of approaching home across the park below my apartment with numb fingers on a raw February night, looking up at my windows where my wife was cooking and knowing in my heart that no soybean would warm my extremities.
One wonders about Thoreau. Walden is his long paean to the primitive and the wild over the ills of cultivation and civilization—the mock-Georgic agricultural chapter “The Bean Field” not at all excepted. He explains that he is “less and less a fisherman” because he is learning to heed an intuition. He says of this: “It is a faint intimation, yet so are the first streaks of morning.” What he leaves hanging is why he heeded what was so faint, and why he didn’t find a place for his other much less faint feelings.3
For lunch today I have brought to work some leftover beef tips and salad, supplemented by rolled-up turkey cold cuts with mustard, olives, and summer squash. I have an apple for a snack (a Fuji—one of the few reliably non-starchy varieties), and a chunk of chocolate.
I have taken to making a batch of crab cakes on the weekend. When I get home tonight I will probably have a pre-dinner snack of one of these. For dinner it’s wild salmon, of which I will serve myself about three times as much in one sitting as I would have last summer.
●
The question “What ought we to eat?” involves not just nutritional considerations, but also ethical, environmental, and pecuniary ones. These after all were the considerations that led me to my vegetarianism all those years. Where have I put them? I’ll try to be open about what still unsettles me.
The truth is that whereas I remember a time when my wife and I budgeted $150 per week on groceries for the two of us—about average for an American household—now it is more in the range of $220. I bite the bullet on spending significantly more money on what I eat now. We’ve cut back on other things like dining out and, happily, health bills.
If you can’t afford it, well then this just isn’t an option, obviously; but there are different degrees of “can’t” here. As late as the 1940s, Americans allocated more than 40 percent of their income to food. It is important to recognize that the reason we are now able to allocate so little—around 15 percent, including eating out—is that such low prices are sustained by conditions that must change: cheap fossil fuels, abuse of an immigrant work force, and especially the once rich but now increasingly depleted and desiccated soil of the Midwest. In the case of factory-farmed meat, prices are kept low by our tolerance of the animals’ abominable anguish and misery.
My diet has me buying grass-fed beef and bison and other animal foods raised down the road in New Hampshire where I live (which, like much of the rest of the planet, isn’t suitable for grain agriculture anyway). That is what much of my high expenditure is going toward. There’s little that I’m confident about in regard to the social responsibility of my diet, but there is no question, on this one matter, that the pasture is a better life for the cow.
The pasture may also be preferable environmentally. Unlike a cow on a feed lot, what a pastured cow consumes is transformed, firstly, into cow, and then the remainder comes back to the soil as nitrogen-rich pee and manure. The ruminant is part of a process that turns renewable resources—sunlight and rain—into both nutritious food and fertility, the same fifty-million-year-old ecology that built up the rich topsoil of the world’s agricultural areas. It does so largely without the intervention of fossil fuels.
This last point is often passed over. Some of our reflexes when it comes to the social irresponsibility of carnivorism come from the fact that when we compare eating meat to eating vegetables, and impressively add up how many more people can be fed on an acre of soybeans and so forth, we are often obliviously talking about agriculture that draws on various forms of nonrenewable natural capital. One such source, the prairie soil, was actually made from generations of the resident ruminants that the soybeans are said to be preferable to. The truth is that intensive agriculture in the absence of livestock either enters endgame or depends heavily on fertilizers that are themselves made with, and from, fossil fuels. You may have heard that in many agricultural areas aquifers are depleted (read: no water), and the cracking, wizened-white soil is washing away. Increasingly the soil that the world’s grains come from is literally dead, with nothing to supply the vitality required for plant life but pumped-in fertilizers.
In The Omnivore’s Dilemma, Michael Pollan estimates that an acre of corn takes fifty gallons of oil all told to come to table. There may or may not be a catastrophe resulting from a serious hike in the price of fossil fuels, but it isn’t clear to me which food choices will do a better job of averting or assuaging it.
Feeding everyone on spangled cows in a grassy dale with wet nostrils open to the breeze, and fish from the deep blue ocean, is not in the cards. I have no illusions about the hardest question—one certainly on my mind—and I’m only further from an answer to it: What about the whole world? The distressing truth, as far as I can tell, is that comparing, say, responsibly raised animal foods to grains, the choice is between a food that is viable but hardly high-yield enough for all, and a food we can (and do) grow for the whole world, using up resources we’re rapidly running low on. Without the use of fossil fuels for fertilizer, our planet has a human carrying capacity of around four billion. We’re at seven billion. That’s an untidy constraint I have no idea what to do with when something in me inarticulately wants to know how the whole world and its children’s children can live, and wants to live like that.
●
Meanwhile it does seem obvious to me which is the correct choice nutritionally. Five years ago my cardiologist told me I wasn’t likely to live a long life. One year ago I was told that in the event I did live a long life I might not be able to look my friends and family in the eye, because even if treatment for AS went well it was likely that my sacrum and spine all the way up to my skull would be gradually fusing into one bent bone.
And I was in a lot of pain. It is hard to describe the way it feels to have inflammation in the hip area. It doesn’t feel like there is some spot that hurts. Rather it’s as if, as long as you stay still, the world will keep its normal laws; but take a step and a different dimension flickers on, and there is no possibility of understanding how it works. It’s like you’re trying to move in an anti-medium, a thickening thing, pain, with the slow awkwardness of an underwater kung fu scene. I also had insomnia for those years, which may or may not have been related to autoimmunity.
Two weeks after I began to eat a no starch diet my pain pretty much went away—from a 4 or a 5 every day, with no remission for the final two years, to a 1. I also began to sleep well. Three months in, my inflammation had retreated to its deepest strongholds; I didn’t feel pain anymore unless I poked really, really hard in one spot on my tailbone. Then one day that too was gone. Today, nine months in, I’m fine, and I can jog and so forth like old times. At my most recent (and I’m hoping my last) visit to the rheumatologist my blood work showed my indicators for inflammation were normal—low, in fact, as Americans go.
Over the years my heart had mysteriously been improving its function bit by bit. Originally the cardiologist told me that this couldn’t happen, but she and I watched as my heart went from pumping out 30 to 35 to 40 to 45 percent, with each semiannual echocardiogram from 2007 to 2010. I became glad I had declined the invasive internal defibrillator that she had prescribed. Of course I can’t be sure what caused this recovery, but my best guess starts from the fact that when I was tested at 30 percent I had, as I have recounted, been a vegetarian for five years. At my most recent visit to the cardiologist, five months into the no starch diet, the echocardiogram showed my heart pumping out 55 percent of its volume with each beat—in other words, normal. The report literally says “all valves working normally.”
If I had assented to what the experts I saw prescribed to me in the half-decade before I began the diet, I would have a device the size of a cigarette box implanted in my chest with permanent wires running into my heart. This would be liable, scandalously often it turns out, to give me a shock without warning when it isn’t needed. There is no off switch. One poor guy, I learned, had it misfire on him 74 times in one day.
I would today be on anti-inflammatories, antidepressants (I was soon to begin these to treat the insomnia), and needling myself each week for dangerous immunosuppressant infusions. I don’t think that any of these recommendations were out of the ordinary; I sought out and got some of the best help available.
I was so close to this fate—really if any one of the parts of the story of the no starch diet and its dissemination, or any one of the factors leading me to try it, hadn’t played out, I would be on all of these treatments. And I would be lethargic from the unreal sleep afforded by pharmaceuticals. And I would most likely still be in a lot of pain. And fusing up. Then there would be other costs. The defibrillator would have cost my insurance $50,000 (of which I would pay ten percent), with a $5,000 new battery every five years. The immunosuppressants would be running $15,000-25,000 per year, to say nothing of all the continued exams, visits to specialists, surgery. I would be journeying into a dark, dependent and bewildering life. Now, instead, I’m quite well. That’s the bottom line in my personal story.
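For readers who want the arithmetic spelled out, here is a back-of-the-envelope sketch, in Python, of what that path would have cost over ten years. It uses only the figures quoted above; the ten-year horizon, the midpoint price for the immunosuppressants and the battery schedule are my illustrative assumptions, not billing data.

# Rough ten-year cost sketch for the conventional treatment path.
# All dollar inputs are the figures quoted in the text; the rest is assumed.
years = 10

defibrillator = 50_000                  # implanted device, one-time
battery_swaps = (years // 5) * 5_000    # a new $5,000 battery every five years
immunosuppressants = 20_000 * years     # $15,000-25,000 per year; midpoint assumed

total = defibrillator + battery_swaps + immunosuppressants
my_share_of_device = 0.10 * defibrillator   # the ten percent I would have paid

print(f"Ten-year total: ${total:,}")                                  # -> $260,000
print(f"My share on the device alone: ${my_share_of_device:,.0f}")   # -> $5,000

Set against the roughly $70 more per week we now spend on groceries, the comparison makes itself.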
●
If you peruse the health columns in the New York Times, you’ll see that the titans are rebelling: pissed-off Salt, Saturated Fat, and Cholesterol are re-ascending Mt. Olympus. Waify Stretching has just been cast down the rocks. They’ve got blond Grains in their arms and are heaving him back and forth, winding up for the blissful banishing toss.
Next month you’ll see columns cheering on the old order. This sort of twenty-first-century fog of data can make any attempt to address one’s health by diet feel like guesswork. But I think what I am suggesting is a straightforward idea. When archaeologists and anthropologists tell us that hominids ate such and such, that food is probably fine; and when they tell us that hominids never ate such and such, or ate very little of it, there is good reason to suspect it isn’t—because we are hominids. Given that hominids rarely ate grains, for example, we shouldn’t eat them in mass quantities. By no means is that an airtight rubric; but it’s a good general guideline, to be followed in a way one finds reasonable.
Hippocrates is supposed to have said “Let food be your medicine and medicine be your food.” That’s for the afflicted. For ordinary people it ought to be: make sure your food is food for the kind of thing that you are. Michael Pollan’s now-famous definition of food is that it is only food if your grandmother would recognize what it is. The idea of eating like our ancestors, at least the version of it I take up, is something similar: it is human food if your great-great-grandmother out to about 150 generations was eating it.
●
Hippocrates is also supposed to have said: “Walking is man’s best medicine,” and in this he and I are most firmly in agreement. Last April my son Ben was born, our first. Those first few weeks stretch people pretty thin. It’s hard to know what it would even have looked like if my insomnia and inflammation had still been around then. By April, however, I was in full remission from the disease, and I was able to be the legs of the family while my wife recovered from the birth.
I take care of Ben in the morning, and every day I strap him into his sort-of backpack and we take a meandering walk, if a modest one. There is a 50 percent chance that Ben inherited the HLA-B27 gene from me, but hopefully there will never be a need to find out. He and my wife will eat—what becomes inevitable with me eating the way I do—a sharply starch-reduced diet, and if the account I’ve laid out here is correct then there’s not much chance of him developing the disease in our house. I’m hoping that his dad will get stronger and stronger as he grows up.
So this essay turns out to be about something quite practical. Almost all of us are not eating food fit for humans. About 72 percent of the food we consume in Western diets comes from sources that were never or very rarely eaten by our ancestors before the last (evolutionarily negligible) 10,000 years. This can make our bodies fall apart, some in small ways, some catastrophically. You or someone you know has these problems or will, which is why you should read this or something like it.
PART I
But I am getting ahead of my story. Let me begin further back. One day in his first summer of life our family dog Max went swimming—his bliss—and spent the rest of the summer with red blotches all over his stomach that he’d sleepily shake a paw at. Though my parents thought this would pass, the second summer it happened again, and this time the blotches grew into nasty red continents. He only got sicker into the fall and winter. My parents were referred to a pet allergist, and tests determined that Max was allergic to “grass, weeds, mosquitoes, trees [a long list of them], feathers, flowers, tobacco, ants, pollen, mites, histamines, horses, cats [!!], sheep, fleas, dust, mold”—pretty much everything they tested for. Max is blessed with a sweet personality and a lovely coat of hair, and this must have helped my parents in the face of the news that their dog’s body was so unaccountably unfit for life on Earth. The latter was simply the way it was as far as the vet was concerned; he told them to brace themselves to fork over big bucks for treatment.
Thankfully, just before it was time for this, an acquaintance of my mother’s suggested that she try experimenting with what Max was eating. I remember seeing the shiny silver bag of dog food with a woman and her dog romantically silhouetted by an orange and gold sunset over the italicized slogan “The thinking person’s pet food.” I remember it because it seemed so snobby and also because it sounded a little like the person was going to be eating the pet food. Now, however, the slogan seems to me, whatever else it is, accurate. “The thinking person’s pet food” appears to imply that the demographic that thinks, and thinks profoundly, buys this product. But in fact it might just mean that a dog is an animal that gets sick when it eats grains (which most dog foods are full of), because the ancestors of the dog never ate them and a dog isn’t designed to digest them. The thinking being referred to could be just the thinking of this thought.
That at no time in the long descent to dogness did any wild dog eat the seeds of grasses, and that this is one weighty consideration in the question of whether a domestic dog ought to eat them, is an instance of one of the most straightforward post-Darwinian practical principles available to us. At the zoo they don’t wait for diet research to roll in before they have an accurate idea of what ought to go into the polar bear compound: they know that this animal eats fish. While a wild diet might not be the last word, it isn’t going to lead one far astray. But somehow such reasoning is conspicuously scarce outside the zoo.
The more specific story, we’re finding out, is that if a dog eats lots of grains it is prone to developing autoimmunity and allergies and other health problems because it has a compromised intestinal tract that lets antigens into the blood stream. As far as anyone can tell, Max has been hale and fit ever since he started eating his new food. No more horrendous itchy red continents on his underside. At twelve—getting up there as dogs go—he is a lithe blond beast, and swims, when given the chance, for entire afternoons at a stretch. On a recent visit home I found that when Max gets back from a long walk he still sprints a few smooth laps around the house, sending the turf flying.
Max was the first in the family to return to the diet of his forebears. My mother was next. She had some sort of mysterious gastrointestinal issue, and she followed her dog into the past. The human equivalent to the kind of wild-informed diet Max is on is what is now sometimes called an “ancestral,” or “archevore,” or “cave man” diet, but most often a “Paleo diet,” because it tries to mimic man’s diet as it was in the long Paleolithic era from 2,600,000 B.C. to 8,000 B.C., when we were hunter-gatherers and when the makeup of our gastrointestinal system and metabolism was in large part molded. My mother eats a version of a basic Paleo diet: plenty of fish and leafy greens, grass-fed beef, no grains (this covers rice and corn), no legumes (a.k.a. beans), no dairy, no refined oils, little starch, sugar or alcohol.
My own path to a similar diet—not quite a Paleo diet, as I will explain—was rougher than Max’s or my mother’s. It also involved one big blunder.
●
Normally it is a wonderful thing that we have an immune system. This is the part of our body that seeks out and destroys or disables “antigens”—viruses, harmful microorganisms and other foreign things that have entered the body. We wouldn’t survive long in the world without it. But the immune system, we now know, is capable of making mistakes. Some of these are inconvenient but (in most cases) forgivable, like allergies. Allergies are an inappropriate immune response, where one’s body comes to see something that is in fact harmless—feathers, etc.—as a threat.
A different kind of mistake that the immune system can make is much more serious. An autoimmune disorder is a disease in which the immune system accidentally slates for destruction some of one’s own healthy cells. It is the body’s friendly fire, only focused and relentless. This sounds like something exotic—“yes Mr. Bond, the viper’s poison will use your body’s own defenses to destroy you.” So it is. The thing is that today it is more and more common.
Around fifty million Americans, or one in six, suffer from an autoimmune condition, though few people could name one of them with confidence (no, AIDS isn’t one). We spend around $120 billion on treatment for autoimmunity each year. To give you some perspective, we spend $70 billion on cancer. This isn’t counting allergies or asthma, nor well-known diseases such as schizophrenia, Alzheimer’s, autism, atherosclerosis or Parkinson’s, which are now thought to involve autoimmunity.
Now people like to hear about others’ health maladies about as much as they like to hear them relate their dreams at the breakfast table. So I’ll just tell you where I ended up. After several years—the latter half of my twenties—in which I didn’t know what the fuck was happening, I had three big problems. One was that I couldn’t sleep. Another was that I experienced excruciating pain in my legs and lower back that unaccountably migrated from place to place. The last and in some ways the most distressing was that a series of echocardiograms, the first of which was supposed to be a routine just-in-case kind of thing for palpitations I had been feeling, determined that my heart was only pumping out about 30 percent of its volume with each beat. A healthy heart pumps 55-60 percent. This kind of heart failure almost always gets worse, and I was in immediate danger of what doctors have given the gentle euphemism “sudden death,” where the heart simply stops.
After much meandering from one hypothesis to another, last year I was definitively diagnosed with ankylosing spondylitis. “Spondylitis” means inflammation of the vertebrae, the source of the sufferer’s chief complaint. It’s really inflammation of the spine and sacrum, and it can also affect other organs like the eyes and, occasionally, the heart. “Ankylos” means bent, referring, I found out, to the disease’s most dramatic symptom. Eventually the ligament that runs down the anterior side of the spinal column calcifies, curling the back into a hardened hunchbacked position. I was diagnosed when an MRI showed the milky beginnings of this process.
Why was my body attacking itself? Immunologists now think that an adequate answer to that question may begin with what was going on in my digestive system. Let me explain what I was eating at the time.
●
Although I grew up in Concord, I never really read our local authors until I left—in fact when I arrived at college I hadn’t read a word of Emerson, only glimpsed out of the corner of my eye poetry and epigrams etched into stone around town.
As for Thoreau, I believe I could have come up with “I went to the woods because I wished to live deliberately” and drawn the lineaments of his cabin. The exception was that I did know what Thoreau said about food: I had had to read it aloud in English class in the tenth grade. (This in lieu of reading the book on our own, our teacher having told us, from experience I suppose, that if we swam out into Walden we would drown.)
What I later remembered from the passage I read—it is remarkable how well one remembers what one is responsible for before an audience—was Thoreau’s complaint that meat is literally a bloody mess. He says that catching, cooking, etc., his fish takes forever compared to, say, planting potatoes and harvesting them. “The practical objection to animal food in my case was its uncleanness; and, besides, when I had caught and cleaned and cooked and eaten my fish, they seemed not to have fed me essentially. It was insignificant and unnecessary, and cost more than it came to.”
When I got to college I finally read Walden; in fact I read it more often and more seriously than any other book in my life and it became my good friend; and at that time I didn’t drown, I don’t think so anyway, I drank up its ideal of simplicity and self-reliance—simplicity as the unappreciated center of the ethics and economics of a responsible life as well as a happy one. I started to try to live up to this in what way I could: taking up the flower over the opera, for the joy of what is wild and gratuitous, not coveted, not sitting on top of grandeur with complex conditions. This, as I understand Thoreau, is one aspect of how to cultivate an ability to take day-to-day undeferred, untriangulated joy in one’s life.
I still think that Thoreau is right about that. The point for our purposes here is that when I was twenty or so I became a vegetarian. Meat was complex, complicitous, expensive and unsustainable. Plants grew straight out of the soil—food for someone living lightly.
I was a vegetarian—a good one who really ate lots and lots of vegetables, not a bageltarian—for five years.
●
Vegetarians like to point to their favorite big vegetarian for evidence that plants can provide all we need for nourishment. Thoreau’s example is the ox:

“One farmer says to me, ‘You cannot live on vegetable food solely, for it furnishes nothing to make bones with;’ and so he religiously devotes a part of his day to supplying his system with the raw material of bones; walking all the while he talks behind his oxen, which, with vegetable-made bones, jerk him and his lumbering plow along in spite of every obstacle.”
Strictly speaking Thoreau is quite right: it is possible to make bones out of vegetables. We irresistibly tend to imagine that eating muscle makes muscly men; fat, fat ones. A vegetarian is subjected at Thanksgiving to the inevitable uncle chiding, “you need to put some meat on you” and gesturing to the turkey; and Thoreau is correct: this is more or less fallacious.
On the other hand, it is a terrible idea to draw inferences about nutrition and human health from the diet of the ox. We now know that the process of turning raw material into energy and body structure works quite differently in different animals. An ox is a ruminant. It has four stomachs, the first of which, the “rumen,” is designed for digesting grass. This 25-gallon tank contains a huge population of cellulose-digesting microorganisms that we lack entirely. The ox regurgitates and re-chews the contents of its rumen something like 500 times over eight hours to make the most of this extremely nutrient-poor food source—utterly unlike the way we do things.
The example more favored by vegetarians today is the gorilla—much closer to us on the family tree and also a fun one because with the gorilla everyone gets to imagine a leaf-eating silverback playfully pulling one’s skeptical meat-stubborn uncle to pieces. The gorilla, however, is what we would have become if we had evolved a survival strategy like that of the ox. Our line of descent diverged from that of the gorilla over seven million years ago, and in that time gorillas acquired cellulose-processing gut flora.
We took our own way. At no point in our long descent to Homo sapiens did we ever eat grass, as do the gorilla or the ox. Rather we ate the ox, or at least its ancestor the aurochs. To be more specific, we were once tree-dwelling frugivores. Then the American continents collided, which changed the circulation of warm water worldwide. The heavily forested areas to which our ancestors were adapted gave way to an increasingly open landscape, and it was then—about 2.6 million years ago—that we began a brilliant career as the tool-wielding opportunistic omnivore. We grew a huge hungry brain and a shorter gut designed for quickly digesting rich fatty foods.
We don’t know exactly how much meat our hominid ancestors were getting their hands on. Tellingly, modern hunter-gatherer societies take in roughly 65 percent of their calories from animal foods (on average), to our roughly 25 percent. They eat considerably less starch and sugar.1

1. It is not just Thoreau and vegans that favor spurious animal analogies. Researchers have at times run the analogy the other way: such that if something is bad for another animal, it will be bad for us. It turns out, for example, that when rabbits are force-fed cholesterol, they don’t do well: they end up with cholesterol in places where no animal is supposed to store the stuff, like their tendons. This grotesque outcome lent early support to the idea that dietary cholesterol causes problematic cholesterol levels in the blood that lead to atherosclerosis and that sort of thing in Homo sapiens. The same experiment was subsequently performed on guinea pigs, parrots, pigeons, rats, mice and goats, with the same results. Never mind that a rabbit’s digestive and metabolic arrangement is little like ours. Where the ox has its regurgitation and rumination, the rabbit actually eats its own shit to get the nutrients it needs out of grass. On the other hand, its body is not adjusted to animal foods and has no idea what to do with cholesterol from food. It turns out that when you try the same experiment on a meat-eater it is a different story. When you force-feed a dog cholesterol—and they did—it just wags its tail. And if you draw the inference from what happens to the dog, you get the correct conclusion, which is that consuming cholesterol doesn’t significantly raise cholesterol levels in Homo sapiens. This, incidentally, is something researchers of all stripes have agreed on since the 1940s. People accidentally mix it up with the idea that saturated fat raises cholesterol levels, and cholesterol in turn is bad for the heart. (If your doctor tells you to lay off the eggs because of the cholesterol that they contain you shouldn’t see him again: he ought to be able to manage relationships that involve three things.) There is scattered and shaky evidence that laying off saturated fat will help your heart health—but let’s be clear that that’s the only thesis that has actually been on the table. More recently endocrinologists, who study hormones, are telling us that problems with cholesterol arise for humans, and for dogs, when we consume too many carbohydrates. This is because what is really much more important to us is the 80 percent of the cholesterol that our own bodies synthesize, not the 20 percent we eat, and when we consume too many carbohydrates this apparently screws with the system.
●
A lot of people, myself included, first heard about the ills of carbohydrates with the phenomenon of the Atkins diet, and assumed that the idea began with Atkins. In fact the idea that the volume of carbohydrates consumed in the standard Western diet is harmful to humans has been appreciated for a long time. Among the unexpected revelations that came out of British colonialism in the nineteenth century were the reports that returned from the field regarding the absence of civilization’s chief chronic illnesses in pre-agricultural peoples. This came from all corners. Doctors practicing among indigenous, non-Westernized populations in Africa, Asia, the South Pacific and America would see plenty of certain sorts of ailments: broken bones, malaria, gangrene. But again and again the doctors observed that a whole range of problems—obesity, cancer, atherosclerosis, asthma, osteoporosis and a long list of others—were either absent or extremely rare. They were absent, that is, until Western agricultural foods began to be incorporated into the local diet.
It was the same list everywhere, and this led some to call these diet-related diseases “diseases of civilization.” In the twentieth century anthropologists lent support to the general idea when they reported that man’s dental mass and body stature had diminished with the advent of agriculture.
The fact that the most conspicuous change to our diet after agriculture concerned our intake of starch and sugar was not lost on health professionals at the time that these discoveries were made. Furthermore the idea that it was carbohydrates, not protein or fat, that were the chief culprit, fit with the evidence that came in from people living with the super-healthy Masai, who eat exclusively blood, milk and meat; or the Inuit, some of whom get close to 99 percent of their calories from animals. To take one example, there was not a single known case of breast cancer anywhere among the Inuit until 1966—and yes, they were looking for it.
In the early twentieth century, however, a competing theory came on the scene very suddenly. Some hypothesized that the diseases of civilization must be related to food abundance, and blamed what they thought was an increase in calorie consumption in general—the theory of the so-called “diseases of affluence.” Later many of this persuasion began to focus more specifically on saturated fat.
In the 1960s the medical world was well aware that the evidence that this new theory had going for it—epidemiological evidence comparing the diet and health of (selected) disparate populations—was weak. This was the kind of data that is useful for drawing up fresh, to-be-tested hypotheses, but nothing more than that. Still to be performed, of course, were the rigorous scientific studies. These would be large-scale observation and intervention diet studies within a given population, to test more conclusively, in a controlled way, whether or not it was true that high calorie consumption in general and fat consumption in particular were bad for human health.
The story from here on, as Gary Taubes recounts it in his Good Calories, Bad Calories, is an edifying tale of epistemic drift. By the mid-1970s there had been a sort of sea change, and many, including those on record as saying that the more conclusive studies were eagerly anticipated, began acting as though they had already been performed. They had not been performed, because they were too expensive. Just a few years later, the idea that fat consumption causes problems like obesity and heart disease had been accepted and canonized so thoroughly that the older hypothesis on starches and sugars was abandoned just because it was so hard to square with it. The rest is history—history that, by way of the 1977 McGovern commission, the USDA’s Food Pyramid with its base in spilling bags of golden grains, and a host of institutions and ads and so forth warning of the hazards of fatty foods, led directly to my dim sense that my vegetarianism wasn’t just ethically and environmentally right, it was also going to help me live long and lean.
●
In the spring of 2005 I began to feel what I at first mistook for a pulled hamstring. This became very painful, and the pain progressed aggressively. Soon I was sort of tottering down the sidewalk, trying to reconstruct how it was that walking was supposed to flow along.
Not long afterwards, pain between my shoulder blades was keeping me up in the middle of the night, and while I was lying in bed I would hear the normal thump-thump … thump-thump of my heartbeat, but then, occasionally, a thump-thump, thumthi ……. THUMP thi ………… THUMP thi…. This was what brought me in to discover the heart condition.
In 2005, after the heart diagnosis, I gave up the vegetarianism just in case. I feel now that I could have been much wiser and more proactive than that in regard to the role of food. For there was one big diet-related clue I could have picked up on.
It is a peculiarity of mine that when I work intensively at a project, I often sort of forget to eat. When my stomach finally forces me to open the fridge and stretch out my hand for the grapes, I unaccountably veer back to my work by some sort of override. It becomes noon, 1:00, 2:00. The later it gets the more focused I am on the redeeming rendezvous with wisdom that I keep feeling is just now upon me. At 3:00 in the afternoon I will have had nothing but a mug of water.
On such days I would notice that my pain disappeared. At the time I put it down to distraction and adrenaline and the healing heat of creative work. As my brain was being trashed, my body, which by now would be found pacing back and forth from desk to door, was lucid. Sitting at a desk is supposed to make you stiff and ragged, but in comparison to the way I generally felt my body was sailing the floor, floating forward by my gaze.
Reader: if you’ve got a chronic health issue, especially one that may involve inflammation, try a 24-hour fast. If the symptoms abate you’ve got good reason to believe it has to do with what you’re eating. I didn’t make the connection. Given what we now know about AS, this experience makes a lot of sense.
●
In 1973 a big advance was made in understanding AS when it was discovered that almost all AS patients share a common gene called HLA-B27. A lot of people have HLA-B27, and only a small proportion of them—about 5 percent—develop the disease. It appears that AS also requires an environmental “trigger” to set it off, which is thought to be true of almost all autoimmune conditions.
Dr. Alan Ebringer is a London immunologist, and the lead protagonist in the story of diet medicine for AS. In 1976 Ebringer reported that he had caught the culprit. A common bowel bacterium named Klebsiella pneumoniae looks like the HLA-B27 protein. Ebringer proposed that the two are so similar that the immune system mixes them up.
Today most researchers think the claim that a Klebsiella overgrowth causes AS via molecular mimicry is probably overstated at best. But most agree that gut flora is the right place to be looking for AS’s pathogenesis.
Gut flora are microorganisms whose environment is the gastrointestinal tract. When we’re in the womb our tract is sterile and doesn’t harbor any bacteria at all, but as soon as we eat and breathe we begin to seed our own personal populations of these mole people. Then they multiply—if you could extract all of the microorganisms living in your gut and pack them together it would look like a grey-brown softball and weigh about three pounds: upwards of 100 trillion microorganisms, ten times the number of cells in the human body. We are beginning to appreciate what a big influence they have on our health, even if our understanding of the causal connections in play is inchoate.2

2. The study of gut ecology, or the “microbiome,” has taken off in a serious way in the last five years, and is considered to be on the cutting edge of medical science. Among the first lessons learned by the National Institutes of Health’s ambitious Human Microbiome Project, launched in 2007, has been how unimaginably complex the interrelations of our floral populations are, as well as how radically the makeup of these populations varies from individual to individual. Although researchers apparently think that the health claims made for yogurt and so forth are premature, to put it mildly, many are confident about the prospect of treating a range of diseases—especially those that have been on the rise in the last century—by adjusting the makeup of the microbiome. Treatments that are already being experimented with range from the targeted introduction of bacteria to offset problematic populations to, taking a page from the Marquis de Sade, “fecal transplants” from healthy donors. Yes, this last has been done, and with remarkable rates of success.
In many ways the human body, having evolved in the context of gut microbes, is designed to coexist with them. We give the bacteria a place to live and a steady supply of food, and some of them, the ones we call “probiotics,” apparently go to work for us.
We don’t always get along with them, however, and this brings us back to diet. Remember that the human digestive system was set up to handle the kind of food we were eating in the Paleolithic era, and hasn’t changed significantly since. People who advocate a Paleo diet think that all sorts of things go wrong— namely the aforementioned “diseases of civilization”—when we stray too far into the modern Western diet. What may be particularly pertinent in my case is that our healthy, predictable relationship with the predictable populations of bacteria in our gut depends on our old, ancestral diet.
Especially problematic in regard to gut flora are starches. Starches are long chains of carbohydrates that have to be broken down into simple sugars before we can absorb them. Unlike simple sugars, which are pretty much always metabolized, about 10-20 percent of the starches we eat remain undigested in the tract and pass into the colon. There they feed bacteria.
Paleolithic man would have encountered starch primarily in a few fruits, vegetables and roots—a relatively small amount. But 10,000 years ago—far, far too recently to have given our DNA any time to adjust—along came agriculture, which is for the most part the cultivation of starch-rich plants for food. Particularly irresistible were grains. The fact that grains are so starchy is in part what makes them reliable, storable, shippable: starchiness is what makes grains a great commodity.
The USDA food pyramid I grew up with had us taking in about 40-50 percent of our calories in the form of starches. This is one of the rare places where American habits have been on par with health recommendations; and as a vegetarian replacing meat with grain products and beans, I was taking in quite a bit more. The result was that my gut flora population—my “microbiome”—was (to put it vaguely, since we lack the details) out of whack.
This, many immunologists now believe, may explain how it is that my body came to attack itself: the constitution of my gut flora was not to my body’s liking, and my body, attempting to beat back an infestation, ended up accidentally attacking the healthy cells in my spine, sacrum, heart and who knows what else.
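Since the percentages are hard to feel, here is the arithmetic in miniature—purely illustrative, assuming a 2,000-calorie day and carbohydrate’s roughly four calories per gram, and taking the 40-50 percent recommendation and the 10-20 percent undigested fraction from above.

# Illustrative only: how much starch might reach the colon each day under
# the old food-pyramid recommendations, using the figures quoted in the text.
daily_kcal = 2000               # assumed daily intake
kcal_per_gram_carb = 4          # carbohydrate yields roughly 4 kcal per gram

for starch_share, undigested in [(0.40, 0.10), (0.50, 0.20)]:
    grams_eaten = daily_kcal * starch_share / kcal_per_gram_carb
    grams_to_colon = grams_eaten * undigested
    print(f"{starch_share:.0%} of calories as starch -> {grams_eaten:.0f} g eaten, "
          f"~{grams_to_colon:.0f} g left to feed gut bacteria")

# 40% of calories as starch -> 200 g eaten, ~20 g left to feed gut bacteria
# 50% of calories as starch -> 250 g eaten, ~50 g left to feed gut bacteria

Tens of grams of substrate delivered daily to whichever bacteria are equipped to eat it—that is a rough sense of what was on offer to my gut flora.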
PART II
In 2009 I was finally diagnosed with AS. Even then, however, I remained in the dark about the possible role of diet in the disease. Once you have a diagnosis for AS you are referred to a rheumatologist. Mine looked like a battle-hardened wrinkly witch, but she didn’t turn out to be as interesting as she looked. She took me into her office and breezily went over the basics of AS, holding up a plastic chart of what looked like schemas of WWII battleships but were actually anti-inflammatory pills. Five months later I had tried three of these and felt quite a bit worse. I called to find out what I should do and was told that next in her “big bag of tricks”—these were her assistant’s words, though it didn’t seem like a big bag to me—was to begin immunosuppressant infusions.
This was what was on my mind a year ago. The general public doesn’t know much about autoimmune diseases, but investors do. The immunosuppressant drug Humira is set to become the most lucrative pharmaceutical in the world in 2012. Drugs like Humira run a patient $15,000-25,000 per year and higher. Because immunosuppressants knock back the body’s defense system as a whole, they also make you more susceptible to serious infections, cancer, congestive heart failure and liver disease, to name just some of their side effects. Nevertheless they are hailed as a miracle by doctors, and for good reason: they are often the last line of treatment for otherwise debilitating pain and degeneration, and can give a patient who has been suffering for years profound relief within weeks. They have a positive effect on AS symptoms about half the time.
On the same day I learned that immunosuppressants were next up for treatment options, I remember my wife seeing me struggling to lean over to unload the dishwasher and saying: “Oh just go lie down.” It looked like I was probably going to need to give immunosuppressants a try, though this couldn’t be entered into lightly because apparently they can be hard to get off of once you’re on them.
First, though, I thought I should try fiddling with my diet. By this time, as I’ve related, both my mom and Max were on ancestral foods. Also a friend of mine was a proselytizing “Paleo person” with his own blog. I remember Paul at Thanksgiving picking up a huge turkey leg and gnawing on it. Unconvinced that there would be enough dark meat to satisfy his need for fat, he had shown up with his own personal duck. Paul looked like he was consuming about three times more calories than the rest of us, which, to be frank, I found disturbing, but I had to admit he was acting about three times more vigorous than I was feeling. Where was the usual Chicago grad student late-November pallor? So I had one big advantage that you, reader, may not enjoy: seeing is believing.
(Incidentally Paul also once told me that it is a good idea to sit for an entire week doing nothing at all when the mood comes upon one, which he figured was what it was like for cavemen back in the day from time to time.)
It was with my inquiries into the future of Paul’s heart health, and his responses to them, that I first began to pay attention to the history regarding fats and dietary recommendations I’ve related above. Today the large scientific studies anticipated in the 1960s are actually beginning to roll in, and by and large the results do not tend to support the idea that saturated fat is bad for you. What continues to pile up is the evidence that starches and sugars at the levels we’re consuming them can be very dangerous. For lots of people, this leads to insulin issues. For others a Western diet, and especially grains, may have the effect of damaging the digestive tract, and causing what is called “increased intestinal permeability.” The end result of increased intestinal permeability is that we fail to properly screen things that pass into the bloodstream, making the whole unfortunate molecular mimicry cascade to autoimmunity that much more likely, just because there is so much more for the immune system to fight and make mistakes with. And then there are the gut floral issues I have described.
Long before I knew I had AS, Paul had mentioned that a Paleo diet alleviates autoimmunity in some people. When I got my diagnosis I remembered the claim. So then, really as a way to prepare myself for a life on immunosuppressants, I decided to experiment with a Paleo diet just to be sure—to rule out everything but the immunosuppressants, I guess you could say.
This was in December of 2011. Doing a little research on my new diet quickly led me to a blog by a guy who had AS and said that, as everyone knows, the key with AS isn’t quite Paleo, but a diet that eliminates all starches. So I switched my dietary test to that instead. I thought it would most likely be a temporary thing—I’d give it two weeks.
●
The idea that you could effectively get over active AS by eating less starch wasn’t immediately obvious, even to those that initially proposed the gut flora connection. Fortunately, at this point serendipity stepped in. A patient of Dr. Ebringer’s who saw him for AS in the early 1980s asked him if he had any advice for losing weight. Ebringer, for reasons that had nothing to do with arthritis and autoimmunity—and departing, obviously, from the opinions of the time—recommended a diet of red meat and tomatoes. The patient took the advice and did lose weight, but more importantly he also came back reporting something shocking: his pain and inflammation were gone as well.
A light went on in Ebringer’s mind: these foods don’t contain complex carbohydrates. The starch must be what feeds the relevant gut flora; and gut floral overgrowth must not be a one-time passing “trigger” for AS but an ongoing cause and condition of its pathology. This means that we can treat AS by reducing the starch we take in. Ebringer came up with what he called the “London AS Low-Starch Diet,” which he administered to patients at his clinic. This basically consisted of four prohibitions: NO BREAD, NO POTATOES, NO CAKES (in Britain this comprehends pastries), NO PASTA. He published his findings on the diet in a 1996 paper, which reported that of the 450 patients he treated at his clinic in London between 1983 and 1996, over half now required no medication. His clinical intervention trial of 36 patients showed a statistically significant decline in markers for inflammation over a period of nine months.
In effect, the idea behind the diet is that eating less starch has the same result as my accidental fasts: it starves out certain gut flora, this time by more precisely targeting their starchy substrate. Bacteria are notoriously hard to eradicate entirely but they do die without sustenance. Whether or not Ebringer turns out to have been correct regarding the role of the Klebsiella organism specifically, it appears that he discovered how to change the constitution of one’s gut flora in such a way as to curtail AS autoimmunity.
The Spondylitis Association of America, the largest disseminator of information to AS patients in the world, until recently made the very visible and important claim on their website regarding Ebringer’s findings that “other studies … have found that the diet has little or no effect on symptoms.” When I ran across this statement, I contacted them to find out more. What followed was an exchange of emails in which I basically asked over and over “but what studies specifically?” They ended up dropping the claim from the site. Because, as they then found, there have been no such studies.
Why the SAA made such a stark and egregious error I’m not sure. (To their credit they amended it quickly.) Why no one in the medical world other than Ebringer and his colleagues has touched the diet—why, indeed, no large-scale intervention diet studies have ever been performed on AS patients, and rheumatologists pooh-pooh the very idea of diet-based medicine in spite of mounting consensus among researchers regarding the role of intestinal health and gut flora—is a different discussion for a different time.
●
The idea of treating AS with a diet low in starches could have languished in Ebringer’s ignored 1996 paper, and shed no light on the likes of me, if it weren’t for the excitement and evangelism of Ebringer’s healed patients, and later those influenced by them (this essay, for example, being my contribution).
By the late 1990s those who had their legs back because of Ebringer’s diet connected his work to two of the diet’s present-day heroes. First, someone discovered that there was already a book on eating starch-free called The IBS Low-Starch Diet by a layperson named Carol Sinclair. Sinclair had discovered, after much tinkering and testing, that she could cure her irritable bowel if she proscribed all starch. Her regimen is much more thorough than Ebringer’s rather low-key prohibitions, and today her book is the main how-to treatise for people like me who are on a strict “No Starch Diet,” or NSD. Sinclair recently discovered via genetic testing that she had AS all along.
The other individual Ebringer’s patients connected with was an AS sufferer the no starch community knows as Dragonslayer. Dragonslayer is NSD’s apostle to the internet, the co-founder and owner of kickas.org, the main forum for information on the diet. Dragonslayer has the quirk of capitalizing the word “you,” and when provoked by NSD skeptics he uses words like “demise” and “downfall” in sentences that sound like they should be followed by “mwahahahaha,” e.g. “And with every statement, You crawl further out on the limb You are sawing off.” “NONSENSE! You might want to re-read the study and pay attention this time. There may be a quiz at the end.” Since kickas.org went online in 1999, Dragonslayer reports that he has personally helped over two hundred people to remission through NSD via his online correspondence—which, if it’s accurate, is an astonishing achievement for a layman who never sought to make a penny from it. The night that I began the diet I read Dragonslayer’s “NSD QuickStart and tips.” While in general I try to avoid throwing away my sunny days on the internet like I’m slack-lipped at a rummage sale, this easy ability to share and glean information outside the orthodox medical community radically improved, and maybe saved, my life. I could not be more thankful to those who made the diet a presence online.
PART III
In reading what I am about to describe, you must remember that my intention is not to convince you to try a draconian diet like mine. Such things are for gimps such as myself. I’m trying to convince you to lean toward our hominid past and thus avoid autoimmunity as well as other diet-related diseases, which together affect 50-65 percent of people eating a contemporary Western diet worldwide.
Some of the things that NSD prohibits: all grains (which includes wheat, which of course means pastries, pasta, etc.; also corn and corn products, rice); all legumes; all root vegetables; a few fruits (like bananas); and certain nuts, including peanuts (anyway they’re peas not nuts).
Some things are tricky and their starchiness has to be determined item by item. For this I use an old middle school chemistry class trick I learned from Sinclair’s The IBS Low-Starch Diet: iodine turns blackish blue if you put a droplet on something starchy. All tomatoes need to be tested in this way. So do apples, vine fruits like watermelons, and coconuts.
That leaves one with all animal foods, most fruits and vegetables, and the remaining nuts. People on NSD can eat dairy, which makes the diet in this one sense more permissive than one that sticks strictly to Paleolithic foods.
This is all, needless to say, an immense pain in the ass. In the first weeks that I was eating the food of the hunter-gatherer, I felt a bit like a hunter-gatherer must have felt in a world where food was hiding and had to be ferreted out from the rocks. At the supermarket I soon found it was not worthwhile to venture into the actual aisles of the store, since almost everything that can be found there is starchy or has had starch added to it. I had to stick to the periphery. There is often literally nothing I can eat at gas stations.
As I say, initially I was sure this was just a two-week experiment. I would soon return to my old diet, the diet of my friends, family and neighbors. I thought it most likely that the online NSD people were willful witnesses and hypochondriacs. Right around two weeks in, however, when my trial was to conclude, I started to feel significantly better in my legs and back, and I understood that this was for real.
The community on kickas.org has built up an edifice of anecdotal info: it appears that the no starch diet doesn’t work for all AS sufferers, and it works best in those that catch the disease early. Relief arrives in two to ten weeks or so (it isn’t as immediate as fasting), and the pain will stay away unless you eat starch, in which case you are likely to get a “flare up” of symptoms. You need to be really rather strict with the diet for it to work well: a bite of yogurt that contains corn starch will set you back.
The latter has indeed been my experience. Diets like mine are very difficult, and I wish I could congratulate myself on the kind of rare restraint that often has to go into such things. The truth is that I open the cupboard at my wife’s parents’ place and am presented with food that will put me into face-altering pain within five hours, pain that will last two or three days. Whatever resisting that is, it cannot be called restraint.
●
It took two weeks for me to see results, which was very motivating. But the diet of course took much longer to get accustomed to. In the end, what had to happen was a kind of reimagining of what was reasonable in regard to eating meat, and especially fat, and also a sort of magnification in my mind of foods I was already familiar with.
This morning when I got downstairs I sliced a sausage in half and fried it. Once it was almost done I drained off the fat and heaped in baby spinach so that this stuck out over the top of the pan. The result looks like two brownish-gold submarines in weeds. I keep poached eggs in ramekins in the fridge, and I heated two of these in the microwave with some butter (this is how restaurants do it—happily it’s quite delicate). I put blackberries around the edge of the plate.
For a long time I don’t think I quite realized that this or something like it was now to be an every-morning thing for me. It still felt like a once-a-month brunch. Incidentally, however, in this country, right up into the late nineteenth century, for those who could afford it this was a standard morning meal: Thoreau’s contemporaries very often ate a breakfast of eggs and meat much like mine. A meal of boiled grains such as farina would be the last resort of the poor, one of Dickens’s symbols for their suffering.
Actually American expectations in regard to breakfast have a rather dubious history, one worth recounting now, as I’m sure, reader, you’re thinking that I’ve started to stray beyond the bounds of what’s reasonable.
The story of dry grains and our notion that they’re the healthy way to begin the day got its start in religious radicalism. In the early nineteenth century, a Presbyterian minister by the name of Graham, namesake of the cracker, began to promote a high-fiber vegetarian diet as a way to curb carnal passions. He was especially obsessed with “self-abuse,” what they used to call masturbation. People had believed for a long time that a diet that excludes meat, the diet of the holy ascetic, reduces libido. Today there is actually a lot of evidence to back up that claim. In the nineteenth century with people like Graham, however, there emerged the erroneous notion that sexual arousal was itself terrible for our health—the problem, so he wrote, at the root of most modern ailments. So the vegetarian diet was not just the most spiritually pure diet, but the healthiest human diet too.
Graham influenced those who went on to radically change our culinary habits. Among them was the founder and head of the Seventh-day Adventist Church, Ellen White. The Seventh-day Adventists are best known for their belief that Saturday is the Sabbath. They also have a very literal interpretation of Genesis 1:29: “Behold, I have given you every herb bearing seed … to you it shall be for meat.” They are vegetarians. For religious reasons they believe that man is not supposed to eat meat. Now Graham and his followers were conveniently adding that this vegetarianism is best for the body as well as the soul. White and the Adventists founded a sanitarium in Battle Creek, Michigan—“The San,” the setting and subject of the 1993 comic bestseller The Road to Wellville—to promote the new health principles, and appointed their fellow parishioner and physician John Harvey Kellogg to run it.
Dr. Kellogg had the charisma and commercial savvy that Graham lacked. He shared with Graham an abhorrence of sexuality. In his manual on avoiding sexual arousal and activity, Plain Facts for Old and Young, Kellogg describes the procedures he used for preventing masturbation at The San:

“A remedy which is almost always successful in small boys is circumcision. … The operation should be performed by a surgeon without administering an anesthetic, as the brief pain attending the operation will have a salutary effect upon the mind, especially if it be connected with the idea of punishment.”
Quite simple really. And there was something for the ladies: “In females, the author has found the application of pure carbolic acid (phenol) to the clitoris an excellent means of allaying the abnormal excitement.” Kellogg was a piece of work. When he couldn’t cure someone, he would accuse them of masturbation. He himself was married but celibate, penning parts of Plain Facts on his wedding night. He would have an assistant give him an enema every morning after breakfast.
It was Kellogg who bequeathed to us the reason still given for the healthfulness of dry cereals: that they cleanse us. Really everything Kellogg did was in one way or another about cleansing. He spent much of the latter half of his life coming up with ways to purify the white race (some sort of scheme involving nuts). At the top of Kellogg’s list for avoiding the arousal of the blood in carnal lust and its attendant maladies was bland, high-fiber vegetarian food, especially dry cereal grains, which would pass through the intestines and rake out matter that he thought stimulated the genitals. At this time, with The San at the middle of the action, many of the dry cereals that we know today and that still command a lot of shelf space at the supermarket were developed and marketed: shredded wheat, granola, bran and corn flakes. All were given the same recommendation: that they would keep us clean. Shredded wheat actually looks (and tastes—as Kellogg himself noted) like a bristle brush.
In the twentieth century a lot of fiber-rich dry foods like breakfast cereals and crackers had the fiber processed out of them to make them tastier (Graham would have shaken his head at the sugary sweet fate of the Graham cracker, which basically became a cookie). The scrubbing action of fiber was ripe for a new revival, and that revival—again led by religious men, this time two British missionary doctors, Denis Burkitt and Hugh Trowell—is what we’re still living through.
But aren’t I committing a genetic fallacy here? However misguided its beginnings, surely the benefits of fiber have since been backed up by modern dietary research? Actually, no. Not even close. In the 170 years since the cereal revolution, very few controlled clinical or large-scale observational studies of dietary fiber from grains have been performed—seriously—and in those that have been, the results were dismal. As Gary Taubes writes of the hypothesis that fiber is important for human health: “The pattern is precisely what would be expected of a hypothesis that simply isn’t true: the larger and more rigorous the trials set up to test it, the more consistently negative the evidence.” For fiber, as with anti-fat recommendations, the royal road ran from hypothesis to advertising to accepted health doctrine, without ever stopping at meaningful, rigorous research. Only now is that research beginning to roll in. On one point the fiber enthusiasts appear to be correct: fiber does lead to more “regularity” and bigger poops—the obsession of octogenarian weirdos. The trouble is that we have yet to find any correlation between bowel regularity and a single significant marker of human health, colon cancer included.
The data on fiber isn’t conclusive yet, but by all appearances fiber is going the way of the douche. We used to think that a lot of other things needed cleaning: vaginas, veins, throats, ears, sinuses. If there’s a hole, someone has thought it would be a good idea to get up in there regularly and go to work. With all of these, most of us have returned to the common-sense idea—having, today, perhaps a little more trust in the body—that the inaccessible insides of an animal will generally do a fine job keeping tidy on their own.
Pretty much everything Kellogg did at The San has turned out to be arrant quackery. What hangs on is our prejudice as to what’s for breakfast. Kellogg was also a pioneer of the formula that set the standard for making money in food development over the next century and a half. Basically, big business in marketing food has to focus on what it can commodify and brand. It’s not easy to do this with my breakfast—eggs, sausage, spinach, blackberries—at least not in a proprietary way. You can brand things that you’ve changed, cooked up, modified to make edible, and you can commodify what you can store and ship. This you can give a name like “Grape-Nuts.” Grape-Nuts, neither grape nor nut, was invented by C.W. Post, one of Kellogg’s patients at The San and another deplorable mix of religious fervor, charlatanry and commercial savvy.
Interestingly, there is one thing Dr. Kellogg may have gotten right: he appears to have been ahead of his time in speculating that many modern ailments have to do with intestinal flora. Only he thought the treatment was eating bristle-brush-like substances, and the liberal application of yogurt up the butt.
It so happens that one paper I’ve seen cited in the context of AS and gut flora is a 1973 study of 47 contemporary Seventh-day Adventists. The vegetarian Adventists had on average 30,000 Klebsiella organisms—the microbe that may cause AS via molecular mimicry—per gram of feces, compared to 700 per gram in controls on a normal mixed American diet: in other words, these vegetarians had more than forty times as much of this microbe in their gut.
There have to date been no epidemiological studies of the link between vegetarianism and autoimmune disease, though many suspect one. For what it’s worth, all four of the people I know with an autoimmune disorder, including myself, were vegetarians in their youth—tragically, two still are.
●
Figuring out breakfast, or I guess you could say allowing it, was one of the first big adjustments I had to make on the diet. There are plenty of things that I still miss, of course. One craving that seizes me from time to time: without starches, there are very few things I can eat that are crunchy. Not a chomp like a cucumber but a dry, crackly crunch—apparently one of the few food sensations a baby doesn’t have to learn to like. I recently baked kale chips purely for crunch, but when I bit into them they made a crinkle.
In general, however, I’ve found that I’ve made a good bargain when it comes to delight and satisfaction in food: starch and sugar for fat. And there is nothing in the diet that is like choking on a bran muffin.
Thoreau, a lax vegetarian, a vegetarian “in his imagination,” as he says, lets fly in one of Walden’s many parrhesic paragraphs: “As I came home through the woods with my string of fish, trailing my pole, it being now quite dark, I caught a glimpse of a woodchuck stealing across my path, and felt a strange thrill of savage delight, and was strongly tempted to seize and devour him raw; not that I was hungry then, except for that wildness which he represented.”
The reformed vegetarians I know have some such story to tell. My recollection, which I suppose is the other side of Thoreau’s, is of approaching home across the park below my apartment with numb fingers on a raw February night, looking up at my windows where my wife was cooking and knowing in my heart that no soybean would warm my extremities.
One wonders about Thoreau. Walden is his long case against the ills of cultivation and civilization, in favor of the primitive and the wild—the mock-Georgic agricultural chapter “The Bean Field” not at all excepted. He explains that he is “less and less a fisherman” because he is learning to heed an intuition. He says of this: “It is a faint intimation, yet so are the first streaks of morning.” What he leaves hanging is why he heeded what was so faint, and why he didn’t find a place for his other, much less faint feelings.³
³ I can’t help noting that Thoreau died at 44, ostensibly of TB. TB is an infectious disease, but many suspect a connection between TB and autoimmunity, as the two are strongly correlated.
For lunch today I have brought to work some leftover beef tips and salad, supplemented by rolled-up turkey cold cuts with mustard, olives, and summer squash. I have an apple for a snack (a Fuji—one of the few reliably non-starchy varieties), and a chunk of chocolate.
I have taken to making a batch of crab cakes on the weekend; when I get home tonight I will probably have one as a pre-dinner snack. For dinner it’s wild salmon, of which I will serve myself about three times as much in one sitting as I would have last summer.
●
The question “What ought we to eat?” involves not just nutritional considerations, but also ethical, environmental, and pecuniary ones. These after all were the considerations that led me to my vegetarianism all those years. Where have I put them? I’ll try to be open about what still unsettles me.
The truth is that whereas I remember a time when my wife and I budgeted $150 per week for groceries for the two of us—about average for an American household—it is now more in the range of $220. I bite the bullet and spend significantly more on what I eat. We’ve cut back on other things, like dining out and, happily, health bills.
If you can’t afford it, then this just isn’t an option, obviously; but there are different degrees of “can’t” here. As late as the 1940s, Americans allocated more than 40 percent of their income to food. It is important to recognize that the reason we can now allocate so little—around 15 percent, including eating out—is that such low prices are sustained by conditions that must change: cheap fossil fuels, the abuse of an immigrant workforce, and especially the once rich but now increasingly depleted and desiccated soil of the Midwest. In the case of factory-farmed meat, prices are kept low by our tolerance of the animals’ abominable anguish and misery.
My diet has me buying grass-fed beef and bison and other animal foods raised down the road in New Hampshire, where I live (and which, like much of the rest of the planet, isn’t suitable for grain agriculture anyway). That is what much of my high expenditure is going toward. There’s little that I’m confident about in regard to the social responsibility of my diet, but there is no question, on this one matter, that the pasture is a better life for the cow.
The pasture may also be preferable environmentally. A pastured cow, unlike a cow on a feedlot, transforms what it consumes, first, into cow; the remainder then comes back to the soil as nitrogen-rich pee and manure. The ruminant is part of a process that turns renewable resources—sunlight and rain—into both nutritious food and fertility, the same fifty-million-year-old ecology that built up the rich topsoil of the world’s agricultural areas. It does so largely without the intervention of fossil fuels.
This last point is often passed over. Some of our reflexes about the social irresponsibility of carnivorism come from the fact that when we compare eating meat to eating vegetables, and impressively add up how many more people can be fed on an acre of soybeans and so forth, we are often obliviously talking about agriculture that draws on various forms of nonrenewable natural capital. One such source, the prairie soil, was actually made by generations of the very resident ruminants that the soybeans are said to be preferable to. The truth is that intensive agriculture in the absence of livestock either enters endgame or depends heavily on fertilizers that are themselves made with, and from, fossil fuels. You may have heard that in many agricultural areas aquifers are depleted (read: no water), and the cracking, wizened-white soil is washing away. Increasingly the soil that the world’s grains come from is literally dead, with none of the biotic vitality required for plant life, sustained by nothing but pumped-in fertilizers.
In The Omnivore’s Dilemma, Michael Pollan estimates that an acre of corn takes fifty gallons of oil all told to come to table. There may or may not be a catastrophe resulting from a serious hike in the price of fossil fuels, but it isn’t clear to me which food choices will do a better job of averting or assuaging it.
Feeding everyone on spangled cows in a grassy dale with wet nostrils open to the breeze, and fish from the deep blue ocean, is not in the cards. I have no illusions about the hardest question—one certainly on my mind, and one I’m only further from an answer to: What about the whole world? The distressing truth, as far as I can tell, is that comparing, say, responsibly raised animal foods to grains, the choice is between a food that is viable but hardly high-yield enough for all, and a food we can (and do) grow for the whole world, using up resources we’re rapidly running low on. Without the use of fossil fuels for fertilizer, our planet has a human carrying capacity of around four billion. We’re at seven billion. That’s an untidy constraint I have no idea what to do with when something in me inarticulately wants to know how the whole world and its children’s children can live, and wants to live like that.
●
Meanwhile it does seem obvious to me which is the correct choice nutritionally. Five years ago my cardiologist told me I wasn’t likely to live a long life. One year ago I was told that in the event I did live a long life, I might not be able to look my friends and family in the eye, because even if treatment for AS went well it was likely that my sacrum and spine, all the way up to my skull, would gradually fuse into one bent bone.
And I was in a lot of pain. It is hard to describe the way it feels to have inflammation in the hip area. It doesn’t feel like there is some spot that hurts. Rather it’s like as long as you stay still the world will keep its normal laws, but take a step and a different dimension flickers on, and there is no possibility of understanding how it works. It’s like you’re trying to move in an anti-medium, a thickening thing, pain, with the slow awkwardness of an underwater Kung Fu scene. I also had insomnia for those years, which may or may not have been related to autoimmunity.
Two weeks after I began to eat a no-starch diet my pain pretty much went away—from like a 4 or a 5 every day with no remission for the final two years, to a 1. I also began to sleep well. Three months in, my inflammation had retreated to its deepest strongholds; I didn’t feel pain anymore unless I poked really, really hard in one spot on my tailbone. Then one day that too was gone. Today, nine months in, I’m fine, and I can jog and so forth like old times. At my most recent (and I’m hoping my last) visit to the rheumatologist my blood work showed my indicators for inflammation were normal—low, in fact, as Americans go.
Over the years my heart had mysteriously been improving its function bit by bit. Originally the cardiologist told me that this couldn’t happen, but she and I watched as my heart went from pumping out 30 to 35 to 40 to 45 percent, with each semiannual echocardiogram from 2007 to 2010. I became glad I had declined the invasive internal defibrillator that she prescribed. Of course I can’t be sure what caused this recovery, but my best guess has to do with diet: when I was tested at 30 percent I had, as I have recounted, been a vegetarian for five years. At my most recent visit to the cardiologist, five months into the no-starch diet, the echocardiogram showed my heart pumping out 55 percent of its volume with each beat—in other words, normal. The report literally says “all valves working normally.”
If I had assented to what the experts I saw prescribed to me in the half-decade before I began the diet, I would have a device the size of a cigarette box implanted in my chest, with permanent wires running into my heart. This would be liable, scandalously often it turns out, to give me a shock without warning when none was needed. There is no off switch. One poor guy, I learned, had it misfire on him 74 times in one day.
I would today be on anti-inflammatories and antidepressants (I was soon to begin these, to treat the insomnia), and needling myself each week with dangerous immunosuppressant injections. I don’t think that any of these recommendations were out of the ordinary; I sought out and got some of the best help available.
I was so close to this fate—really, if any one part of the story of the no-starch diet and its dissemination, or any one of the factors leading me to try it, hadn’t played out, I would be on all of these treatments. And I would be lethargic from the unreal sleep afforded by pharmaceuticals. And I would most likely still be in a lot of pain. And fusing up. Then there would be the other costs. The defibrillator would have cost my insurance $50,000 (of which I would pay ten percent), with a $5,000 new battery every five years. The immunosuppressants would be running $15,000 to $25,000 per year, to say nothing of all the continued exams, visits to specialists, surgery. I would be journeying into a dark, dependent and bewildering life. Now, instead, I’m quite well. That’s the bottom line in my personal story.
●
If you peruse the health columns in the New York Times, you’ll see that the titans are rebelling: pissed-off Salt, Saturated Fat and Cholesterol are re-ascending Mt. Olympus. Waify Stretching has just been cast down the rocks. They’ve got blond Grains in their arms and are heaving him back and forth, winding up for the blissful banishing toss.
Next month you’ll see columns cheering on the old order. This twenty-first-century fog of data can make any attempt to address one’s health by diet feel like guesswork. But what I am suggesting is, I think, a straightforward idea: when archaeologists and anthropologists can tell us that hominids ate such and such, then this food is probably fine; and when they tell us that hominids never ate such and such, or very little of it, there is good reason to suspect it isn’t—because we are hominids. Given that hominids rarely ate grains, for example, we shouldn’t eat them in mass quantities. By no means is that an airtight rubric; but it’s a good general guideline, to be followed in a way one finds reasonable.
Hippocrates is supposed to have said “Let food be your medicine and medicine be your food.” That’s for the afflicted. For ordinary people it ought to be: make sure your food is food for the kind of thing that you are. Michael Pollan’s now famous definition of food is that it is only food if your grandmother would recognize it as food. The idea of eating like our ancestors, at least the version of it I take up, is something similar: it is human food if your great-great-grandmother, out to about 150 generations, was eating it.
●
Hippocrates is also supposed to have said: “Walking is man’s best medicine,” and in this he and I are most firmly in agreement. Last April my son Ben was born, our first. Those first few weeks stretch people pretty thin. It’s hard to know what they would have even looked like if my insomnia and inflammation had still been around. By April, however, I was in full remission from the disease, and I was able to be the legs of the family while my wife recovered from the birth.
I take care of Ben in the morning, and every day I strap him into his carrier, a sort of backpack, and we take a meandering walk, if a modest one. There is a 50 percent chance that Ben inherited the HLA-B27 gene from me, but hopefully there will be no need ever to find out. He and my wife will eat—as becomes inevitable with me eating the way I do—a sharply starch-reduced diet, and if the account I’ve laid out here is correct, then there’s not much chance of him developing the disease in our house. I’m hoping that his dad will get stronger and stronger as he grows up.