
Appendix 6: Pestilence/Disease

Secular Chronology Confirmation

How current trends corroborate the Bible’s revealed timeline


If man were really as clever as he thinks he is, he would have figured out how to eliminate disease by now. In fact, there was a time, in our not-too-distant past, when we humans actually believed we were on the cusp of wiping out “pestilence” altogether. The serendipitous observation of what the fungus Penicillium rubens would do if grown in the right substrate led to the development of antibiotics—the wonder drugs that were expected to usher us all into a healthy new tomorrow. Scottish scientist Alexander Fleming’s fortuitous 1928 find eventually won him the Nobel Prize, and it began the quest for new drugs that could be used to cure virtually anything that ailed you.

The initial optimism proved to be premature, however. Some diseases are caused not by bacteria (which responded—at least at first—to antibiotics like penicillin), but by viruses: ubiquitous, invasive entities so incredibly tiny they can even infect bacteria themselves. Strategies against viruses (most notably vaccines) were developed that proved quite effective against such dreaded maladies as polio and smallpox. But again, the initial euphoria was dampened when it became apparent that both viruses and bacteria had the ability to defend themselves against our efforts to eradicate them: they changed—just enough to avoid being easy targets for the medicines we’d developed. This defense is usually chalked up to “mutations” in their genomes, but I suspect that most of the pathogens’ self-defense response is “merely” the bringing of recessive genes to the surface—genes that were always present (by God’s design) but had never had to play a role before.

So here we are, over half a century later, with a very different outlook. Responsible doctors are now reluctant to prescribe antibiotics except in extreme circumstances, for fear of needlessly breeding resistance to them in the very pathogens they target. Meanwhile, vaccines have taken on a life of their own: we’ve gone from inoculating our children against a few of the most deadly viruses to trying to head off scores of them. Basically, we’ve tried to outlaw risk—the pursuit of fools in a fallen world. It has gotten to the point where, truth be told, the inoculations themselves are more dangerous than many of the diseases they were designed to stop. The pathogens, meanwhile, are alive and well, forever poised to assume slightly different faces and reintroduce themselves to the human race.

People attuned to Biblical prophecy should find none of this surprising. Pestilence, after all, was predicted as a “feature” of the Last Days, both before and after the rapture. Yahshua warned us, “See that you are not troubled; for all these things must come to pass, but the end is not yet. For nation will rise against nation, and kingdom against kingdom. And there will be famines, pestilences, and earthquakes in various places. All these are the beginning of sorrows.” (Matthew 24:6-8) For all of our medical advancements, pandemics still threaten to sweep the globe. If they aren’t “natural” (i.e., accidental), they’re self-inflicted (like HIV/AIDS), or even part of the arsenal of insane modern nations—the capacity for biological warfare.

But it’s not as if biological WMDs are really necessary to destroy entire populations once war breaks out on a global scale. Disease shows up more or less automatically—or at least, it always has. The last of the so-called “four horsemen of the Apocalypse” (arriving after the introduction of the Antichrist, another world war, and unprecedented food shortages) was described thus: “I looked and a pale-colored horse appeared.” In the original Greek, the color is a pale, sickly green. “Its rider’s name was Death, and Hades came close behind him; and authority was given to them over the fourth part of the earth, to kill with the sword or with famine or pestilence or by means of the wild beasts of the earth.” (Revelation 6:8) We’ve seen this scenario many times in the past, of course, but never before on a worldwide scale. If history teaches us anything, it is that where war goes, famine and pestilence are sure to follow. During the American Civil War, more soldiers died from infections and disease than from gunshot wounds.

It is not my purpose here to catalog all of the diseases that currently (or potentially) threaten mankind. I am merely endeavoring to point out that despite our best efforts and vastly improved knowledge, we live in a world that is just as dangerous in this regard as it ever was—if not more so. But because these essays concern not Biblical prophecy (per se), but rather the many secular evidences tending to confirm the prophetic timeline revealed in scripture (the 2033 hypothesis), we need to consider not only infectious or contagious diseases (those maladies the Bible would call “pestilence”) but also the chronic, non-communicable ailments that are conspiring to bring our whole world to its knees.

Let’s face it: it won’t matter if we vanquish AIDS like we did smallpox, or if we rid the world of ebola, cholera, typhoid, anthrax, E. coli, influenza, the common cold, and a hundred other infectious diseases—if we all end up suffering from things we didn’t “catch” from bacteria or viruses. Would the world still be able to function if everyone had to deal with some debilitating malady? Cancer. Alzheimer’s disease. Cystic Fibrosis. Rheumatoid Arthritis. Huntington’s Chorea. Asthma or Emphysema. ALS (Lou Gehrig’s Disease). Autism. Epilepsy. Down’s Syndrome. Crohn’s Disease. Parkinson’s. Schizophrenia. Multiple sclerosis. Fibromyalgia. Bipolar Disorder. Lupus. Hemophilia. Ulcerative colitis…. Any one of these—and the list could go on practically forever—has the potential to dominate your life, to render you unproductive to some extent, and, to add insult to injury, make caring for you a full-time job for someone else.

At what point does a civilization faced with a growing population of medically dependent citizens simply grind to a halt? For thirty years, my wife invested her whole life in our handicapped kids (and then ailing parents) while I held down a job to keep us afloat financially. It worked, thanks to the grace of God. But if either one of us had become physically disabled or mentally unstable, the whole thing would have collapsed like a house of cards.

In an Associated Press article entitled Chronic Illness Burden Rising Faster Than Expected (November 29, 2000), Lauran Neergaard reported: “Nearly half of Americans suffer at least one chronic disease, everything from allergies to heart disease—20 million more than doctors had anticipated this year, researchers say. And they warn that the fast-growing toll, now at 125 million among a population of 276 million, will reach 157 million by 2020. One-fifth of Americans have two or more chronic illnesses, complicating their care and making it more expensive.” Bear in mind that the article is well over a decade old—the statistics have only gotten worse. “‘The nation is unprepared to cope with the growing burden of chronic disease, with annual medical bills alone expected to almost double to $1.07 trillion by 2020,’ [said] Dr. Gerard Anderson of Baltimore’s Johns Hopkins University…. Already 60 million Americans suffer multiple chronic illnesses, a number expected to reach 81 million by 2020 as the population ages, Anderson reported.” I have reason to believe that even these dire predictions will fall well short of reality.

The rising incidence of diabetes can be taken as a wake-up call. A 1997 article by Harris Coulter stated: “In 1947, there were an estimated 600,000 cases of diabetes in the United States…. Today the Metropolitan Life Insurance Co.’s quarterly Statistical Bulletin estimates that diabetics make up 5 percent of the U.S. population, or 13 million persons…. So, while the U.S. population has approximately doubled since the 1940s, the number of diabetics has risen more than 20 times.” Again, though the article is rather old now, the trend has not reversed itself. By 2011, the percentage had risen to 8.3 percent—that’s 25.8 million people in the U.S. alone. Susan Fenelon Kerr reports: “Some doctors who specialize in pediatric diabetes say patients with the so-called ‘lifestyle’ diabetes, which can be controlled by exercise and nutrition, now make up 15 percent of their patient load compared to 1 or 2 percent in years past.”
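The arithmetic behind Coulter’s claim is easy to check against the figures he cites. Here is a quick sanity check (the numbers come from the paragraph above; the per-capita framing and variable names are mine):

```python
# Diabetes case growth vs. population growth, using the figures quoted above.
cases_1947 = 600_000        # estimated U.S. diabetes cases, 1947
cases_1997 = 13_000_000     # ~5% of the U.S. population, per the 1997 article

growth_factor = cases_1997 / cases_1947
print(f"Case growth: {growth_factor:.1f}x")   # ~21.7x: "more than 20 times"

# The population roughly doubled over the same span, so the
# per-capita prevalence rose on the order of tenfold.
per_capita_rise = growth_factor / 2
print(f"Per-capita rise: roughly {per_capita_rise:.0f}x")
```

In other words, even after accounting for population growth, the per-person rate of diabetes rose by roughly a factor of eleven in fifty years.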

To give you a better feel for the potential for disaster that’s looming before us, let us consider the case of autism, the fastest growing developmental disability in the U.S., though no definitive cause has been identified. (I posited a possible starting point back in our chapter on Famine Factors—the advent of genetically modified foods, the manifestations and statistics for which track perfectly with the incidence of autism. Vaccine over-use has been blamed as well, but there is no consensus on the matter.) The Center for Autism reports that one percent of the population of children ages 3 to 17 in the U.S. have an autism spectrum disorder. At present, its prevalence is estimated at 1 in 88 births—with four times more boys than girls being affected. Between 1 and 1.5 million Americans live with an autism spectrum disorder: that’s an astounding 1,148% growth rate since the rise in autism incidence was noted back in the 1970s—a 600% increase in prevalence over the last two decades alone. Even more alarming, the growth rate today is 10-17% annually (depending on where you live). Put another way, the incidence of autism in 1975 was one in 5,000; by 2009, it was one in only 110! Extrapolate this trend out over the next couple of decades, and you’ll begin to see what I’m concerned about.
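For what it’s worth, the quoted figures hang together mathematically. Treating the 1-in-5,000 (1975) and 1-in-110 (2009) incidence figures as the endpoints of compound growth yields an implied annual rate squarely within the 10-17% band cited above (a rough sketch; the compounding assumption, the naive extrapolation, and the variable names are mine):

```python
import math

# Implied compound annual growth in autism incidence, from the
# figures quoted above: 1 in 5,000 (1975) to 1 in 110 (2009).
rate_1975 = 1 / 5000
rate_2009 = 1 / 110
years = 2009 - 1975   # 34 years

annual_growth = (rate_2009 / rate_1975) ** (1 / years) - 1
print(f"Implied annual growth: {annual_growth:.1%}")   # ~11.9% per year

# Extrapolating (naively) at that same rate: when would incidence hit 1 in 50?
years_to_1_in_50 = math.log(110 / 50) / math.log(1 + annual_growth)
print(f"One child in fifty by roughly {2009 + years_to_1_in_50:.0f}")
```

The extrapolation is crude, of course—trends rarely compound forever—but it illustrates why a “one kid in fifty” scenario is not far-fetched on these numbers.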

Autism is a very labor- and cost-intensive thing to deal with, never mind the emotional toll on the parents (and remember: marriage and family are now considered by many to be obsolete customs—leaving Mom to shoulder the emotional burden all by herself in many cases). What will happen when one kid in fifty develops autism? And another one in fifty has asthma? And another one in fifty has leukemia? And another one in fifty is bipolar? And another one in fifty is dealing with the symptoms of autoimmune disease?

And what will happen if their parents have their own debilitating chronic diseases to deal with—arthritis, fibromyalgia, migraines, lupus, diabetes, depression, etc.? Consider this: socialism is a cultural “disease” in which society crumbles when the unproductive majority become financially dependent on the productive minority. So what will happen to civilization when the majority of people in this world are not healthy enough to take care of themselves (or anybody else) anymore? That’s the direction we’re headed, and the “event horizon,” I’m afraid, is only a few decades off.

SmartGlobalHealth.org puts the issue of “chronic disease” into perspective for us. “The Health Challenge,” they say, is that “rates of non-communicable or chronic disease continue to increase dramatically in all countries (industrialized, middle income, and low income), surpassing infections as a disease burden among adults…. Chronic disease rates have been rising in all countries. In 2002…60% of the 56 million deaths worldwide were due at least in part to chronic diseases. Nearly three quarters of the world’s chronic disease-related deaths that year occurred in developing countries.” So the whole thing can’t be glibly written off as the price of American-style overindulgence.

“Globally, the leading chronic disease problems are: cardiovascular diseases (including strokes), cancer, chronic lung disease (including asthma), and diabetes. These problems are often the result of behaviors that increase disease risk, such as smoking tobacco, alcohol use, unhealthy diet and physical inactivity. These latter risk factors give rise to intermediate conditions such as obesity, high blood pressure and abnormal lipid (cholesterol) and glucose metabolism. The major economic impacts of chronic diseases include higher health care costs, lost productivity due to illness, disability and death among the working age population, and the need to replace these lost workers.”

It is axiomatic that “nobody gets out of here alive.” We all have to die of something. So a pie-chart delineating “Projected main causes of death, worldwide” (all ages, 2005) presented on Smart Global Health’s website can be a bit misleading: after all, it’s always going to add up to 100%. Even the deaths of people who “die of old age” are attributed to some immediate “cause.” Nevertheless, the numbers can be revealing, since the “chronic disease” slice of the pie is steadily increasing, and because “almost 45% of chronic disease deaths occur prematurely, under the age of 70 years.” 9% of deaths are due to injuries or trauma. This figure would presumably include the casualties of war and crime. 30% are due to communicable diseases, maternal and perinatal conditions [meaning that the horrendous worldwide death toll due to abortions—some 45 million souls per year—is not included in the tally], and nutritional deficiencies. That leaves the remainder—61% of all deaths—due to chronic (i.e., non-communicable) diseases. Of those, cardiovascular issues account for 30% of deaths. Cancer: 13%. Chronic respiratory diseases: 7%. Diabetes: 2%. And “other” chronic diseases add up to 9%.
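The slices quoted above do, in fact, close to 100 percent, and the chronic-disease sub-categories sum to exactly their slice of the pie. A simple tally of the numbers in the paragraph (the dictionary labels are my own shorthand):

```python
# "Projected main causes of death, worldwide" (all ages, 2005),
# per the percentages quoted above.
top_level = {
    "injuries/trauma": 9,
    "communicable, maternal/perinatal, nutritional": 30,
    "chronic (non-communicable)": 61,
}
chronic_breakdown = {
    "cardiovascular": 30,
    "cancer": 13,
    "chronic respiratory": 7,
    "diabetes": 2,
    "other chronic": 9,
}

assert sum(top_level.values()) == 100          # the pie must close
assert sum(chronic_breakdown.values()) == 61   # sub-slices match the chronic slice
print("Percentages check out.")
```

As the text notes, the instructive figure isn’t the total (which is always 100%) but the steadily growing share claimed by the chronic-disease slice.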

But since death is a universal reality, it’s not so much a matter of what kills us in the end, but of how we live while life remains. There’s a spiritual component to this, but only a fool would fall into the trap of suggesting that “since God wants the best for us, if we are in the center of His will He will automatically bless us with good health.” It doesn’t work that way. Just ask Job. We live in a fallen, sinful world—ergo, bad things can happen—to anyone. They’re not evidence of the proactive wrath of God (necessarily). Rather, He allows it to rain on the just and the unjust alike, mostly to see what we’ll do when faced with a challenge, I suspect. Our mortal lives are for learning, and in school, there are always tests along the way. They help us (and our instructors) gauge our progress.

So when (or if) we get sick, we should receive it as an opportunity to rely physically on our God, just as we do spiritually. If nothing else, our ailments should remind us that we were not built to dwell in this world permanently—there’s a whole different paradigm in store for God’s children. But in the meantime, we can look for the glory of God even in our infirmities (beyond the Pauline epiphany that His grace is quite sufficient for us). They allow us the opportunity to show courage or patience in the face of adversity; they encourage us to consider the marvelous—dare I say, miraculous—way God has built our bodies to cope with trauma and infection; they encourage us to give thanks for the natural world (and yes, the science of medicine, too) in which can often be found the cure to whatever ails us—if we’re willing to use the eyes God gave us.

But thinking about this in strictly temporal terms, “how we live while life remains” becomes a measure of our ability to be a positive force in this world. If we’re of sound body and mind, then there’s nothing to prevent us from being “part of the solution” if we choose to be. But if we are compromised, whether physically or mentally, by disease of some sort, then we are less able to contribute anything of value—even if we have a heart to do so. So as more and more of us require more and more care, the situation will increasingly become a drag on our society, a boat anchor weighing down our whole civilization to some degree. At some point, the needy will overwhelm society’s ability to care for them. At that point, our real beliefs will make themselves evident: genuine Christians will help the needy or die trying; secular humanists, pagans, Communists, Nazis, and Muslims, on the other hand, will throw them under the bus the moment the pressure’s on. Yahshua once said, “The poor you will always have with you.” That apparently goes for the infirm as well. But if He tarries, the world will eventually find itself overrun with the poor and infirm—and there will be neither money nor people enough to deal with this unstoppable flood of dire need.

Allergies and Auto-Immune Diseases

A hundred years ago, allergies were rare, and auto-immune diseases (as such) were virtually unheard of. Today that is no longer the case. In “civilized” countries, allergies have reached epidemic proportions. UCLA Health recognizes the connection between allergies and a compromised immune system. They ask: “Why are Allergies Increasing? The occurrence of allergic disease is skyrocketing, and some estimates are that as many as one-in-five Americans have an allergic condition.” That’s over sixty million people, in one nation alone. “Allergies are specific and reproducible undesired and unpleasant immune responses that are triggered by naturally occurring substances such as foods, pollens or other influences in our surroundings. Overwhelming evidence from various studies suggests that the ‘hygiene hypothesis’ explains most of the allergy epidemic.

“The hygiene hypothesis states that excessive cleanliness interrupts the normal development of the immune system, and this change leads to an increase in allergies. In short, our ‘developed’ lifestyles have eliminated the natural variation in the types and quantity of germs our immune systems needs for it to develop into a less allergic, better regulated state of being…. Many of the advances of modernization, such as good sanitation and eradicating parasitic infections (helminths—worm-like eukaryotic organisms), may actually be fueling this epidemic of allergies.”

Even those who consider the “hygiene hypothesis” a myth seem to agree with it to some extent. For instance, Professor Sally Bloomfield says, “The underlying idea that microbial exposure is crucial to regulating the immune system is right. But the idea that children who have fewer infections, because of more hygienic homes, are then more likely to develop asthma and other allergies does not hold up.” Professor Graham Rook adds, “The rise in allergies and inflammatory diseases seems at least partly due to gradually losing contact with the range of microbes our immune systems evolved with, way back in the Stone Age. Only now are we seeing the consequences of this, doubtless also driven by genetic predisposition and a range of factors in our modern lifestyle—from different diets and pollution to stress and inactivity. It seems that some people now have inadequately regulated immune systems that are less able to cope with these other factors.” To me, that sounds suspiciously like saying, “the hygiene hypothesis is perfectly correct, but there’s more to it.”

We’ve all heard the old adage, “Cleanliness is next to Godliness.” It’s not a Biblical precept, I’m afraid. And taken to extremes, cleanliness can take on the proportions of a subtle form of idolatry: God isn’t able to keep my child healthy, so I’ve gotta help Him out by keeping the house spotless—all germs must die! Really? Around 1500 B.C., Moses pegged the “normal” lifespan of man at seventy or eighty years (Psalm 90:10). Nothing much has changed in all that time, except, of course, for the relative sterility of our environments. Moses, as far as I know, never once used hand sanitizer or disinfectant. But he picked up manna right off the ground and ate it for forty years—and he lived to be 120.

As it turns out, the people statistically most likely to develop allergies and auto-immune diseases, asthma, eczema, and rhinitis are subjected to relatively “sterile” environments in their youth, isolated from dirt and germs by their overprotective parents. They are most often raised in urban settings, and those most at risk are only-children or firstborn kids. Meanwhile, those raised in rural or farming environments (where dirt and barnyard animals are a normal part of life), those with older siblings or early socialization, those who aren’t shielded from helminth infections and microbial exposure at an early age, are far more likely to grow up healthy and free of allergies. I’m not saying we’d be healthier if we all drank water contaminated with camel poop, but our bodies were crafted by God to develop immunities and antibodies at an early age to protect us from the ordinary microbes that populate our environment—by His design. The Torah strikes the right balance: “You shall have a place outside the camp, and you shall go out to it. And you shall have a trowel with your tools, and when you sit down outside, you shall dig a hole with it and turn back and cover up your excrement.” (Deuteronomy 23:12-13) Even when deployed for battle, rudimentary sanitary safeguards were to be implemented.

And what about food allergies? The UCLA group opines, “Food allergies are increasing as part of the overall trend of increasing allergies due [again] to the hygiene hypothesis. However, there are some specific reasons that food allergies are increasing…. The recent practice of delaying the introduction of some foods, such as peanuts, with high potential for allergy may be associated with higher rates of food allergy…. Different forms of the same food appear to be more likely to provoke an allergic response.” For example: “Roasting peanuts rather than boiling them makes them much more likely to cause an allergic reaction. Also, many people with milk or egg allergy can tolerate baked forms of these foods.” They suggest, however, that part of the increase in these allergies may simply be due to heightened awareness and better reporting.

Drug allergies are also becoming a bigger issue in recent days. “Any unintended, undesired effect of a medication is called an adverse drug reaction. Allergic reactions are just one type of these reactions, caused by specific immune responses. It is important to know if an adverse reaction is actually an allergy due to an immune mechanism because these reactions can be unpredictable, and severe allergic reactions can be very dangerous…. Each new drug or supplement has possible unwanted and undesired side effects and the potential to cause allergic immune-mediated reactions…. It is likely that both adverse drug reactions and drug allergies are increasing.”

Another theory is presented by Barbara Loe Fisher, writing (in November, 2004) for the National Vaccine Information Center (an anti-vaccination advocacy group). She sees a link between the recent increase in allergies and the prevalence of vaccines: “The number of American children suffering from life threatening peanut allergies has doubled in the past five years, and the number of Americans with food allergies has risen from 6 million to 11 million. This runs parallel with the doubling of asthma, learning disabilities, ADHD; the tripling of diabetes and a 200 to 7,000 percent increase in autism in every state in the U.S. during the past 20 years. As more and more vaccines are mandated to prevent more and more infectious diseases in early childhood, more and more Americans are stuck on sick. So the pharmaceutical industry produces drugs and vaccines that medical doctors sell to patients to try to ‘cure’ the chronic illness that vaccines and suppression of all infectious disease helped to cause in the first place. What a racket.” Elsewhere she describes the trade-off: “Instead of epidemics of measles and polio, we have epidemics of chronic autoimmune and neurological disease. In the last 20 years rates of asthma and attention-deficit disorder have doubled, diabetes and learning disabilities have tripled, chronic arthritis now affects nearly one in five Americans, and autism has increased by 300 percent or more in many states.” I’ll leave it to you to decide whether that comprises “progress” or not.


Autoimmune diseases are “kissing cousins” to allergies, and they too are on the rise. Fifty million Americans—75% of them women—suffer from autoimmune disorders, according to Virginia Ladd, president and founder of the American Autoimmune Related Diseases Association, Inc. She writes, “With the rapid increase in autoimmune diseases, it clearly suggests that environmental factors are at play due to the significant increase in these diseases. Genes do not change in such a short period of time.”

Our bodies are protected (by God’s design) from infection and disease through our “immune system,” which, when functioning properly, detects agents foreign to an organism’s own healthy tissue—typically viruses or other parasites—and isolates and destroys them. But under certain conditions, the immune system “turns traitor,” attacking the body’s own healthy cells by mistake. This is called autoimmune disease. The cells being targeted can be from many parts of the body—the gut, the joints, glands, muscles, you name it.

Call me paranoid, but I’m sensing a subtle spiritual lesson in the sudden prevalence of autoimmune diseases—a parallel between autoimmune disease and the compromised human condition of the last days. The earth’s “body” is attacking its own healthy cells—that is, the human race is actively betraying the one group within it that actually functions the way it was supposed to: the followers of Yahweh. It is us (always a small minority) who, after keeping the lines of communication with heaven open for millennia, and keeping the knowledge of God alive in the earth, are now seen as viruses and parasites—outsiders who must be assaulted and eliminated—even though our only “crimes” are admitting that the world is sick and showing it how to get well again. The result, of course, is that the more successful the “body” is in isolating and rejecting us, the sicker it becomes. Left alone to pursue this trend, the earth’s allergy to truth would eventually kill it.

Anyway, the fact that there are over one hundred different types of autoimmune disease—many of them quite similar—makes diagnosis difficult. Common symptoms include fatigue, muscle aches, and a low fever. AlterNet describes the onset of an autoimmune disease: “Imagine the slow, creeping escalation of seemingly amorphous symptoms: a tingling in the arms and fingers, the sudden appearance of a speckled rash across the face, the strange muscle weakness in the legs when climbing stairs, the fiery joints that emerge out of nowhere—any and all of which can signal the onset of a wide range of life-altering and often debilitating autoimmune diseases.” Treatment concentrates on reducing inflammation in joints or tissues. Thus many doctors, in an effort to blunt the symptoms (as they were trained to do), prescribe corticosteroids or other drugs designed to suppress the immune response. A blind man could see the recklessness in this sort of approach. The immune system is what protects us from infection. Repressing the immune system, if you’ll recall, is what kills you if you contract the most dreaded sexually transmitted disease of all—AIDS: Acquired Immune Deficiency Syndrome. Duh!

From personal experience, I can vouch for the insidious nature of autoimmune diseases: my wife has dealt with them (without being accurately diagnosed until relatively recently) her entire adult life, ever since she came down with mononucleosis—twice—when she was in high school, half a century ago. Now she battles severe food allergies, celiac disease, Graves’ disease, hypothyroidism, Raynaud’s disease, Sjögren’s syndrome, fibromyalgia, and the occasional bout of rheumatoid arthritis. (No multiple sclerosis or lupus—yet—thank God.) It all goes together, and none of it is much fun. (In apparent confirmation of the “hygiene hypothesis” demographic profile, she was an only child who was born to urban parents—who had miscarried twice before and were as a result extremely protective and fastidious about sanitation and disease prevention when she was a child.) Now that she knows what the root of the problem is, she controls the symptoms (as much as possible) with a severely restrictive vegan diet. I used to joke that “all she gets to eat is a lettuce leaf and a glass of water,” but these days, that’s a little too close to reality to be funny anymore. While we’re on the subject, note that half a million Americans have been diagnosed with multiple sclerosis (MS)—one of the autoimmune club’s “usual suspects.” But because the symptoms often go unrecognized for what they are, that number could easily fall far short of reality.

Grace Rattue, writing for Medical News Today (June 22, 2012) says, “According to a new study the prevalence and incidence of autoimmune diseases, such as lupus, celiac disease, and type 1 diabetes, are on the rise, and researchers at the Center for Disease Control and Prevention are unsure why. Between 2001 and 2009, the incidence of type 1 diabetes increased by 23%, according to The American Diabetes Association…. Type 1 diabetes occurs when the body’s own immune system destroys the insulin-producing cells of the pancreas, while Type 2 diabetes, the most common form of diabetes, occurs when the body does not produce enough insulin or cannot use the insulin adequately.”

Virtually every manifestation of autoimmune disease is on the rise—especially in the “developed” world. “The incidence of celiac disease, which causes the body’s immune system to attack the small intestine, is also on the rise, according to the U.S. National Institutes of Health and the University of Chicago Celiac Disease Center. In the United States, 1 in 133 people are affected by celiac disease.” As my wife discovered, the most fundamental defense against celiac disease is a strict gluten-free diet—which says something about the times in which we live: wheat (the primary source of gluten) has been a dietary staple for much of mankind since the dawn of history.

Again, it seems as if our own planet has turned against us. Can you really blame it?

Treatment-Resistant Virus/Bacteria Mutations

We’ve all heard horror stories of infectious diseases that once responded well to “standard” treatments reemerging later in an altered form—now resistant to everything the doctors throw at them. Antibiotics like penicillin had stunning successes against their bacterial foes in the 1940s, but by 1947, resistant strains of Staphylococcus aureus had begun to appear. (The dreaded MRSA—methicillin-resistant Staphylococcus aureus—was first seen in Britain in 1961, and has now become commonplace in hospitals worldwide.) Today, at least seventy percent of the bacterial strains responsible for the kind of infections people pick up in hospitals have become resistant to one or more of the antibiotics that are deployed against them.

Contrary to popular opinion, these bacteria aren’t evolving, exactly. That is, they’re not in the process of becoming more complex, better suited organisms, even though mutations, upon which natural selection acts, are sometimes involved. What we’re seeing, rather, is the operation of defense mechanisms God built into the bacteria from the very beginning. In particular, once an antibiotic-resistant gene has been generated, bacteria have the ability to “swap” DNA with each other through plasmid exchange—something called “horizontal gene transfer.” Faced with a new threat from an antibiotic, bacteria can actually share resistance genes. If a bacterium carries several of these resistance genes, it is referred to as being multiresistant—in common parlance, a “superbug.”

It works this way because antibiotics don’t actually destroy the bacterium outright. They merely take one crucial function offline. It’s like derailing an entire railroad train by simply loosening a single segment of track. If a resistance trait emerges that is able to withstand the particular “weapon” brought to bear by an antibiotic, the bacteria are able to share that trait among themselves if given enough time and room to maneuver. (And I’m not talking about a lot of time, either. Under the “right” circumstances, certain Staph bacteria can divide every half hour—in theory, a single cell can multiply into a million-cell colony in only ten hours. And with a genome of 2.8 million nucleotide base pairs, the potential for a fortuitous mutation is ever present.) That’s why it’s so important for patients to finish an entire course of antibiotics, even if they feel better after a few days: the idea is to destroy the entire targeted bacterial population in the host, so that resistance to that particular antibiotic can’t spread, via horizontal gene transfer, from the hardiest survivors.
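The ten-hour figure in the parenthesis is simple doubling arithmetic: twenty thirty-minute generations yield 2^20, or a bit over a million, cells. A sketch of the compounding (the variable names are mine):

```python
# Exponential growth of a single Staph cell dividing every 30 minutes.
doubling_time_hours = 0.5
elapsed_hours = 10

generations = int(elapsed_hours / doubling_time_hours)   # 20 doublings
colony_size = 2 ** generations                           # 2 to the 20th power
print(f"{generations} generations -> {colony_size:,} cells")  # 20 -> 1,048,576 cells
```

With a new generation every half hour, there is ample opportunity for a rare resistant variant to take over a colony within a single day.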

But no new functionality has been created; no new DNA has been formed. In most cases, the change in defense strategy is due to recessive genes being brought to the surface. Something is invariably lost in the process: in order for a mutation or recessive gene to be acted upon by natural selection, some proteins will lose their normal functions. While the “new” variant may be able to fight off an antibiotic, it gives something up in return. It will prove less able to compete against the original strain in environments where antibiotics aren’t a threat.

But don’t take too much comfort in that trade-off. The fact is, new antibiotic-resistant “superbugs” are a very real danger. Mike Adams’ September 17, 2013 article, posted on NaturalNews.com, is headlined, hysterically enough, “The coming plague will not be stopped by drugs: CDC now admits era of antibiotics at an end as bacteria out-wit drug companies.” That’s sort of like the Pope confessing that Martin Luther was right after all—it’s not the kind of statement we’ve come to expect.

“In a breakthrough moment of truth for the CDC [the Centers for Disease Control], the agency now openly admits that prescription antibiotics have led to a catastrophic rise in superbugs, causing the death of at least 23,000 Americans each year (an estimate even the CDC calls ‘conservative’)…. What’s truly astonishing about this report is that it admits, in effect, that modern medicine is a failure when it comes to infectious disease. The whole approach of fighting bugs with isolated chemicals was doomed to fail from the start, of course, since Mother Nature adapts to chemical threats far more quickly than drug companies can roll out new chemicals.” I, for one, would caution against declaring “Mother Nature” your god. As we have seen elsewhere, “she” herself is showing all the symptoms of having caught a debilitating—and perhaps fatal—disease. “Mother Nature” is neither omnipotent nor immortal. Yahweh is both. Worship Him alone.

The article continues, “Sadly, the very approach of using an isolated chemical to combat disease is rooted in a 1950s mentality that has nearly reached its endpoint in the history of medicine. The CDC all but admits this now, saying the era of antibiotics is nearing its end. ‘If we are not careful, we will soon be in a post-antibiotic era,’ says Dr. Tom Frieden, director of the CDC. The admission should send alarm bells ringing across the medical establishment, because what it really means is the day isn’t too far off when doctors and hospitals can no longer offer treatments for common infections….” The fourth decade of the twenty-first century, perhaps? Just a guess.  

“The CDC, predictably, also says part of the solution is to get more people vaccinated.” More on that in a moment. “This makes no sense whatsoever as vaccines only make immune systems weaker while doing nothing to prevent infections of superbugs. On the positive side, the CDC did say that antibiotic use should be curbed in agriculture (meaning fed to animals for meat production). In truth, more antibiotics are used each year in agriculture than in humans….” As I mentioned before, my experience in designing packaging for the natural foods industry revealed that sub-therapeutic levels of antibiotics (in poultry, at least) were administered not so much to ward off disease as to artificially raise the water content of the meat—and thus its weight—allowing chicken and turkey ranchers to butcher their birds fifteen to twenty percent earlier than if they had been raised naturally. When you’re feeding a couple of hundred thousand chickens, the savings aren’t chicken feed. Or rather, they are chicken feed. You know what I mean.

Wisdom and discernment need to be applied here, but these things are always in short supply when wheelbarrows full of money are at stake. “Ultimately, the use of antibiotics needs to be sharply limited. There are still cases in which limited, targeted use of antibiotics is a real lifesaver, but the widespread abuse of antibiotics is what has led our medical system to the brink of collapse when it comes to deadly superbugs….”

Adams points out that a parallel problem is taking place in the world of agriculture, where runaway “superweeds,” the result of the widespread use of glyphosate with GMOs, are making it harder and harder to deal with. The escalation of potency, whether trying to eradicate superweeds or superbugs, can’t be sustained forever. “Both problems are promised to be solved with ‘yet more chemicals’ to overcome the resistance to the previous round of chemicals. There's a fatal problem with this: each round of chemicals needs to be substantially more toxic than the last round, causing a ‘spiral of chemical toxicity’ that will only lead human civilization to its own destruction.  

“Even right now, we are very close to the rise of a highly infectious superbug that is resistant to all known antibiotics. Once unleashed, such a superbug could sweep through the population and cause the death of over a billion people across the planet. In such a scenario, the entire system of western medicine has zero tools to deal with it. There is no vaccine, no drug and no FDA-approved treatment that will even touch it…. In the coming superbug plague, the drug companies will be useless. Your doctor will be useless. Hospitals will be disease-infested death traps. The CDC will be paralyzed with bureaucracy and hopelessly stuck in the era of chemical medicine. Only people who possess the nearly-lost knowledge of natural plant-based antibiotics will have any real chance of surviving such an outbreak if they become infected.” Mike is such a smiley-faced Pollyanna, always looking on the bright side. Level with us, Mr. Adams. Don’t hold back: tell us what you really think.

Of course, if you’ve been paying attention, you know that runaway disease is only one of scores of unprecedented ways for people to die in vast numbers during the next few decades. Adams suggests developing a working knowledge of “natural” remedies and herbal antibiotics. I would recommend a slightly more radical (or at least more counterintuitive) approach: wake up to the fact that these mortal lives we’re living were never intended by our Creator to be all there is to it. Repent before Yahweh, and put your life in His capable hands. He told us that Tribulation was in our future—and He told us how to rise above it.

Most of the world, however, says via their actions, “I’d rather die than accept the idea of a holy, uncreated God with morals and standards, one who wouldn’t approve of the way I’m living my life.” It’s not that they know God’s Law and reject it, however: their own consciences condemn them. Ironically, that same God who revealed His standards of morality and instilled the human race with those inconvenient consciences, gave our race the privilege of making up our own minds on the matter—He built us with free will, the right to make our own moral choices. Our love is apparently all He wants, but He won’t force anyone to love Him (knowing, as He does, that the very idea is nonsense: love that’s compelled isn’t love at all, but something else entirely—surrender, submission, compliance, maybe obeisance). In order to be love, the act must be totally voluntary. The “problem” with having free will is that our choices bring with them their own inevitable consequences. If we choose not to live under God’s “law” (the law of love), then we are on our own. If we choose not to trust Yahweh, we must put our trust in something (or someone) else, something that’s by definition inferior to Him—like, for instance, ourselves.

What does all of that have to do with disease? Ask yourself: would not a loving and omniscient Creator have built our bodies able to fight off most infections? Of course He would. If He were smart enough to have made us in the first place, He would certainly have crafted our species to be in balance with everything else He created—including microorganisms. Our immune systems were designed to ensure that we didn’t succumb to every germ that came along. Granted, that perfect balance, that ironclad immunity, was lost when we fell into sin. But even after the fall—even after our bodies became corrupt and vulnerable—if they had been susceptible to every bacteria or virus that entered our environment, the human race would have died out in the first few generations. And we didn’t. God has a vested interest in seeing us alive and well, for dead people don’t choose, don’t respond, and don’t love.  


But I digress. We were talking about antibiotics, vaccines, and their growing inability to cope with the infectious diseases that beset mankind. It is becoming increasingly obvious that what began as our “best laid plan” has taken on a life of its own. The driving force is no longer an altruistic desire to cure the diseases that afflict our fellow man, but has become something a wee bit less noble: the pursuit of profits. TheHealthyHomeEconomist.com (August 25, 2011) reports: “The market for vaccines is expected to surpass $36 billion dollars by 2013. Vaccines are clearly an important growth industry for Big Pharma. Just witness the rapid increase in the childhood vaccination schedule from 3 shots in 1950 to 68 vaccines by age 11-12 today (25 of those by the age of 6 months)! In 1986, it cost a parent about $80 for her child to receive every government recommended vaccine. In 2011, it cost $2200! Clearly, vaccines are one of the fastest growing sectors of the pharmaceutical industry. $4 billion in tax dollars are spent every year purchasing vaccines for health clinics in the United States alone.”

To be fair, just because somebody is making tons of money pushing an agenda of vaccinating kids against every ailment they can think of, it doesn’t necessarily follow that vaccinations are bad. But it certainly makes it harder to be objective—especially if you’re a Big Pharma executive with Porsche payments to make. But we should heed not our paranoia, but rather the telling and “inconvenient” statistics. They’re becoming harder than ever to ignore or explain away. One broad multinational survey of unvaccinated children revealed that:

“Less than 10% of unvaccinated children suffer from allergies of any kind. This compares with 40% of children in the USA ages 3-17 reporting an allergy to at least one allergen and 22.9% with an allergic disease.” In other words, children who have received the full complement of vaccines as demanded by our government are four times more likely to develop allergies than those who haven’t.

“0.2% of unvaccinated children suffer from asthma. This compares with 14-15% of vaccinated children with asthma in Australia, 4.7% in Germany, and 6% in the USA.” Again, you are thirty times more likely to develop childhood asthma if you are vaccinated in America—and seventy times more likely if you live in Australia. (Statistics from theHealthyHomeEconomist.com.)
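The multipliers quoted in these two paragraphs are simple ratios of the survey percentages. A quick sketch (Python; the figures are the ones cited above, not independently verified data):

```python
# Relative-frequency ratios implied by the survey figures quoted above.
# (Percentages are those cited in the text, not independent data.)

def risk_ratio(vaccinated_pct, unvaccinated_pct):
    """How many times more common a condition is among the vaccinated group."""
    return vaccinated_pct / unvaccinated_pct

print(round(risk_ratio(40, 10)))   # allergies, USA: about 4x
print(round(risk_ratio(6, 0.2)))   # asthma, USA: about 30x
print(round(risk_ratio(14, 0.2)))  # asthma, Australia: about 70x
```

Note that a ratio of rates is not proof of causation; it only quantifies the disparity the survey claims to have found.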

And the bad news (for vaccines) keeps coming: “1.5% of unvaccinated children suffer from hay fever. This compares with 10.7% in Germany. 2% of unvaccinated children had neurodermatitis. This autoimmune disorder affects over 13% of children in Germany. ADHD was present in only 1-2% of the unvaccinated children. This compares with nearly 8% of children in Germany with ADHD and another 5.9% borderline cases. Middle ear infections are very rare in unvaccinated children (less than 0.5%). In Germany, 11% of children suffer from this problem. Less than 1% of unvaccinated children had experienced sinusitis. This compares with over 32% of children in Germany.” If it was just one malady, you could perhaps chalk it up to statistical anomaly. But this evidence points toward a pervasive, systematic assault on the health of an entire generation, perpetrated entirely for monetary gain.

I have elsewhere statistically linked autism to the new prevalence of GMOs. But chronologically, the trend toward overvaccination parallels the genetic modification of our food very closely. So note this stunning fact: “Only four unvaccinated children out of the 7,600+ total surveys reported severe autism. In all four cases, however, the mother tested very high for mercury.” That is, there is environmental toxicity present where the family lives—something that could easily account for the autism issue. “In the USA, approximately 1 in 100 children suffer this neurological illness and 1 in every 38 boys in the UK.” And the rates of autism incidence are continuing to rise—in parallel with both GMOs and the practice of vaccination.

So the logical conclusion is that children today are probably at risk because of the vaccinations they’ve been receiving. What about the elderly (gulp…my generation)? We were the first to receive vaccines on a nationwide level—and for good reason. I remember well the scourge of polio in the ’50s, and the miracle of the Salk vaccine that virtually wiped it out in the western world. (Where the vaccine didn’t go, polio remained a threat. One of my adopted daughters, born in 1974 in India, contracted the polio virus in her orphanage when she was about five years old, taking the use of her legs—adding injury to insult, as it were. And it’s the gift that keeps on giving: what little muscle strength and functionality she had left was stolen in the last few years of her life by post-polio syndrome. Polio is still a recurring scourge in the Middle East, notably in Syria, Egypt, and Pakistan. But the Salk and Sabin vaccines spared millions of westerners from a similar fate.)  

So know this: I’m not anti-vaccination on philosophical grounds. I’m merely reporting what has happened in the recent past with the aim of perceiving what our future might look like. The case of vaccines seems to be sort of like life itself: if one thing doesn’t get you, something else will. Case in point, reported on the website RawForBeauty.com, October 3rd, 2013: “The CDC has admitted [and just as quickly removed the admission from their website] that more than 98 million Americans received one or more doses of polio vaccine contaminated with a cancer causing polyomavirus called SV40, within an 8-year span from 1955-1963. It has been estimated that 10-30 million Americans could have received an SV40 contaminated dose of the vaccine.

“SV40 is an abbreviation for Simian vacuolating virus 40 or Simian virus 40, a polyomavirus that is found in both monkeys and humans. Like other polyomaviruses, SV40 is a DNA virus that has been found to cause tumors and cancer. SV40 is believed to suppress the transcriptional properties of the tumor-suppressing genes in humans through the SV40 Large T-antigen and SV40 Small T-antigen. Mutated genes may contribute to uncontrolled cellular proliferation, leading to cancer…. Polio vaccines contaminated with SV40 virus…caused cancer in nearly every species infected by injection. Many authorities now admit much, possibly most, of the world’s cancers came from the Salk and Sabin polio vaccines, and hepatitis B vaccines, produced in monkeys and chimps.” So the cure may turn out to be as bad as the disease. If the polio doesn’t get you, maybe the cancer or hepatitis will.

What kind of cancer? “It is said mesothelioma is a result of asbestos exposure, but research reveals that 50% of the current mesotheliomas being treated no longer occurs due to asbestos but rather the SV40 virus contained in the polio vaccination. In addition, according to researchers from the Institute of Histology and General Embryology of the University of Ferrara, SV40 has turned up in a variety other tumors. By the end of 1996, dozens of scientists reported finding SV40 in a variety of bone cancers and a wide range of brain cancers, which had risen 30 percent over the previous 20 years….

There were two kinds of polio vaccine—an oral, live-virus version, and an injectable, inactivated version. Both forms were tainted with the SV40 strain. And it has come to light that the technique used to inactivate the polio virus in the injectable version relied on formaldehyde—which was later found unreliable in killing the SV40 virus. Formaldehyde itself is a known carcinogen (causing proteins to irreversibly bind to DNA), and the chemical styrene is suspected as well. But these substances are still found in virtually every vaccine. Somebody is asleep at the switch. Or perhaps he’s making so much money, he just doesn’t care.

The website Whale.to offers a plethora of research tying the over-use of vaccines to increasing rates of diseases that they were not designed to prevent. In an article by Richard Gale and Dr. Gary Null (December 23, 2009) with the unwieldy title, “The Council of Foreign Relations Enters the Vaccine Biz—Desperate Attempts to Salvage a Corrupt Science with Sound-bites,” they make the case that some of the most dreaded virus strains were in retreat long before the vaccines deployed against them were even available. “According to mortality figures from the British Office of National Statistics, measles and pertussis (whooping cough) began their rapid decline at the end of the nineteenth century. Both were down 99 percent from 1838 to the year the vaccines were introduced (pertussis in 1950 and measles in 1968).

“There is another example of an infectious disease that was far more deadly than smallpox that never had a widely accepted vaccine and yet eventually fell into obscurity. During the nineteenth century, scarlet fever was responsible for more deaths than measles, pertussis and smallpox. An ineffective vaccine was created in 1924 but disappeared after the introduction of penicillin. What is important in the example of scarlet fever is that infectious diseases declined not because of vaccine miracles, but because of many other factors including improved health, cleaner water and sanitation, public utilities, better living and working conditions, improved nutrition and other medical advancements. This is the same for just about every infectious disease during the first half of the twentieth century that was already in rapid decline before the advent of their respective vaccines.” So the case can be made that simple sanitary precautions are more effective than vaccines in stamping out many infectious diseases over time. It is as I’ve always suspected: given a fighting chance, our bodies are quite capable of defending themselves against most microbes. It’s the way we were designed.

Maybe it’s a flaw in human nature: we tend to crave the “miracle cure” so we don’t have to stop living like pigs. And I’m not just talking about communicable diseases here, but in every facet of our lives. We shun hard work, but dream of hitting a big lotto jackpot. We defer maintenance because we’re lazy (or perhaps because we don’t really appreciate what we have), hoping that dumb luck will achieve what we didn’t. We’d rather live on welfare than earn the same amount of money doing honest work for minimum wage. We want to have a pill that cures AIDS, but we aren’t willing to restrict our sexual contact to one partner for our entire lives. And some among us love the idea of “salvation by grace” because we have this insane notion that it somehow authorizes us to sin like Caligula, since our transgressions have already been forgiven. What part of “Take up your cross daily and follow Me” didn’t we understand? What part of “Be holy, for I am holy” don’t we comprehend? We need to come to terms with the fact that in God’s economy, although miracles do occasionally happen and providence is an ever-present background reality, we have a big part to play in our own well being and mortal destiny. It’s part of having free will. Yahweh promised to spare the Israelites the ravages of the “diseases of the Egyptians,” but only if they followed His Instructions—like “don’t eat pigs,” “don’t drink blood,” or “Don’t have sex with someone other than your own spouse.” God never offered us a magic pill for anything. So when men do, we should at the very least view it with suspicion.

Dr. Gerhard Buchwald M.D., in The Decline of Tuberculosis despite “Protective” Vaccination, paints a bleak picture: “Vaccinations are now carried out for purely commercial reasons because they fetch huge profits for the pharmaceutical industry…. There is no scientific evidence that vaccinations are of any benefit, but it is clear that they cause a great deal of harm…. Today there are 800,000 children and youngsters under the age of 15 years (in Germany) with asthma. 800,000! Neurodermitis, once a rare complaint, has become so common that there are several support networks with many thousands of members. The ‘Frankenpost’ of April 2004 reported an estimated 27 million people now suffer from hayfever, neurodermitis and allergic asthma in Germany.”

Another “inconvenient truth” from the whale.to website: “A critical point which is never mentioned by those advocating mandatory vaccination of children is that children’s health has declined significantly since 1960 when vaccines began to be widely used. According to the National Health Interview Survey conducted annually by the National Center for Health Statistics since 1957, a shocking 31% of U.S. children today have a chronic health problem, 18% of children require special health care or related services and 6.7% of children have a significant disability due to a chronic physical or mental condition. Respiratory allergies, asthma and learning disabilities are the most common of these.”—Incao’s Hepatitis B Vaccination Testimony. When the statistical correlation between vaccines and chronic childhood diseases is this strong, it would be irresponsible not to consider if there might be a cause-and-effect link. In fact, it’s hard not to perceive a nefarious conspiracy behind the whole thing—one designed to collapse human civilization.

Barbara Loe Fisher in her article In the Wake of Vaccines writes, “One American child in 166 has been diagnosed with autism spectrum disorder…. 9 million American children under 18 have been diagnosed with asthma…. Nearly 3 million children [are] learning disabled…. 4 million children between the ages of 3 and 17 years have been diagnosed with ADHD…. 206,000 Americans under the age of 20 have type 1 diabetes…. 1 in 400 to 500 American children and adolescents are now diabetic. Today, arthritis affects one in three Americans, and about 300,000 American children have juvenile rheumatoid arthritis. Juvenile rheumatoid arthritis used to be so rare that statistics were not kept until its recent rise in children.” A hundred years ago, these maladies were either extremely rare or nonexistent.

In short, it’s not a few isolated cases. It’s not just one disease. It’s not restricted to one geographical region. And there’s probably not just one all-encompassing cause. But something we’re doing (or something that’s being done to us) is making a large contingent of an entire generation chronically ill. And the world’s recent and increasing reliance on drugs and vaccines (instead of reliance on Yahweh’s incredibly robust design for our bodies’ self defense) is harming the human race instead of helping it. Although the vaccine hoax is still largely known only to “conspiracy theorists” here in the U.S. because of our pervasive culture of cover-up and self-deception, the truth has begun to leak out—like Niagara Falls—in other places.

The blog nsnbc posted an article by Andrew Baker about the Freedom of Information Act in the UK, entitled, “The Vaccine Hoax is Over—Documents from UK reveal 30 Years of Coverup.” It stated, “A Freedom of Information Act [request] in the UK filed by a doctor there has revealed 30 years of secret official documents showing that government experts have (1) known the vaccines don’t work; (2) known they cause the diseases they are supposed to prevent; (3) known they are a hazard to children; (4) colluded to lie to the public; and (5) worked to prevent safety studies.” He notes that “Those are the same vaccines that are mandated to children in the U.S. Educated parents can either get their children out of harm’s way or continue living inside one of the largest most evil lies in history: that vaccines—full of heavy metals, viral diseases, mycoplasma, fecal material, DNA fragments from other species, formaldehyde, polysorbate 80 (a sterilizing agent)—are a miracle of modern medicine.” Well, they’re a miracle of modern marketing, at least.

Baker goes on to claim, “The CDC is obviously acting against the health of the American people. But the threat to the lives of the American people posed by the CDC’s behavior does not stop there. It participated in designed pandemic laws that are on the books in every state in the U.S., which arrange for the government to use the military to force unknown, untested vaccines, drugs, chemicals, and ‘medical’ treatments on the entire country if it declares a pandemic emergency.” Call me paranoid, but I’d be willing to bet that such an “emergency” would be more likely to be triggered by a resurgence in patriotism or faith—or even common sense—than by the actual emergence of disease.

So the article goes on to state, “The CDC’s credibility in declaring such a pandemic emergency is non-existent, again based on the Freedom of Information Act. For in 2009, after the CDC had declared the H1N1 ‘pandemic,’ the CDC refused to respond to Freedom of Information Act filed by CBS News, and the CDC also attempted to block their investigation. What the CDC was hiding was its part in one of the largest medical scandals in history, putting out wildly exaggerated data on what it claimed were H1N1 cases, and by doing so, created the false impression of a ‘pandemic’ in the US. The CDC was also covering up a financial scandal to rival the bailout, since the vaccines for the false pandemic cost the U.S. billions. And worse, the CDC put pregnant women first in line for an untested vaccine with a sterilizing agent, polysorbate 80, in it. Thanks to the CDC, the number of vaccine-related ‘fetal demise’ reports increased by 2,440 percent in 2009 compared to previous years.”  

Another vaccine that’s raising a few eyebrows these days is “designed” to prevent HPV (human papilloma virus), a rather common and usually uneventful bug that is purportedly linked to cervical cancer and genital warts. But as Brent Lambert (writing for FeelGuide.com, July 16, 2013) reports, “Dr. Diane Harper was the lead researcher in the development of the human papilloma virus (HPV) vaccines, Gardasil™ and Cervarix™. She is now the latest in a long string of experts who are pressing the red alert button on the devastating consequences and irrelevancy of these vaccines. Dr. Harper made her surprising confession at the 4th International Conference on Vaccination which took place in Reston, Virginia. Her speech, which was originally intended to promote the benefits of the vaccines, took a 180-degree turn when she chose instead to clean her conscience about the deadly vaccines so she “could sleep at night.”

“The following is an excerpt from a story by Sarah Cain: ‘Dr. Harper explained in her presentation that the cervical cancer risk in the U.S. is already extremely low, and that vaccinations are unlikely to have any effect upon the rate of cervical cancer in the United States. In fact, 70% of all HPV infections resolve themselves without treatment in a year, and the number rises to well over 90% in two years. Harper also mentioned the safety angle. All trials of the vaccines were done on children aged 15 and above, despite them currently being marketed for 9-year-olds. So far, 15,037 girls have reported adverse side effects from Gardasil™ alone to the Vaccine Adverse Event Reporting System (VAERS), and this number only reflects parents who underwent the hurdles required for reporting adverse reactions. At the time of writing, 44 girls are officially known to have died from these vaccines. The reported side effects include Guillian Barré Syndrome (paralysis lasting for years, or permanently—sometimes eventually causing suffocation), lupus, seizures, blood clots, and brain inflammation. Parents are usually not made aware of these risks. Dr. Harper, the vaccine developer, claimed that she was speaking out so that she might finally be able to “sleep at night.” “About eight in every ten women who have been sexually active will have HPV at some stage of their life,” Harper says. “Normally there are no symptoms, and in 98 per cent of cases it clears itself. But in those cases where it doesn’t, and isn’t treated, it can lead to pre-cancerous cells which may develop into cervical cancer.”’

“Although these two vaccines are marketed as protection against cervical cancer, this claim is purely hypothetical. Studies have proven there is no demonstrated relationship between the condition being vaccinated for and the rare cancers that the vaccine might prevent, but it is marketed to do that nonetheless. In fact, there is no actual evidence that the vaccine can prevent any cancer. From the manufacturers own admissions, the vaccine only works on 4 strains out of 40 for a specific venereal disease that dies on its own in a relatively short period, so the chance of it actually helping an individual is about the same as the chance of her being struck by a meteorite.”

But at the same time (as reported by the National Conference of State Legislatures), “Since 2006, legislators in at least 42 states and territories have introduced legislation to require the vaccine, fund, or educate the public or school children about the HPV Vaccine. At least 25 states and territories have enacted legislation, including Colorado, District of Columbia, Illinois, Indiana, Iowa, Louisiana, Maine, Maryland, Michigan, Minnesota, Missouri, Nevada, New Mexico, New York, North Carolina, North Dakota, Oregon, Puerto Rico, Rhode Island, South Dakota, Texas, Utah, Virginia, Washington and Wisconsin.” So the HPV vaccines have been proven worthless, dangerous, and unnecessary, and yet they’re required by law throughout half the country.

Gee, if you can’t trust your government, who can you trust? Trick question, of course—governments have never been trustworthy, comprised, as they are, of mortal men. Today’s governments seem to be getting worse, but perhaps that’s only because we’re close enough to see what they’re up to. As for me, I’m trusting the “government” that will soon rest upon the shoulders of Yahshua, the risen and glorified Messiah (Isaiah 9:6; Matthew 28:18; Revelation 19:15-16). I’m certain He will commence His Kingdom upon the earth as the Bible promises, and I’m pretty sure I even know when. But even if I’m mistaken about the timing, one thing still seems certain: if the current trend continues at its present pace, by the fourth decade of the twenty-first century, diseases both chronic and communicable will be crippling human society, and there won’t be a thing mankind can do to stem the tide of misery.

Sexually Transmitted Diseases

The Apostle Paul wisely wrote, “Flee sexual immorality. Every sin that a man does is outside the body, but he who commits sexual immorality sins against his own body.” (I Corinthians 6:18) We’re about to find out precisely what that means, in visceral and practical terms.

If there’s one category of pestilence that is inextricably tied to sin, this is it. Think about it. You can catch typhoid, cholera, African sleeping sickness, or the common cold simply by being in the wrong place at the wrong time. You can be afflicted with autism, cancer, diabetes, or lupus through no fault of your own. But there is a whole range of sexually transmitted diseases rampant in the world today that need never have been a problem—if only mankind had heeded our Creator’s Instructions. Nobody seems to want to hear it, but STDs don’t infect people unless there’s sexual contact, and God’s pattern decrees that one man is to marry one woman, and the two are to remain sexually exclusive for their entire lives—beginning as virgins, and ending as aging lovers with fond memories of the unrestrained passion they shared in their youth.

Furthermore, STDs would disappear in one generation if everyone suddenly began abiding by this one simple principle (yeah, picture that). We shouldn’t write off God’s Instructions just because there is a heavy and obvious symbolic component to them—teaching us what our relationship with Yahweh is supposed to be like as His “bride” and “wife.” They are also eminently practical.

God allowed polygamy—also for symbolic reasons—but careful attention to the scriptural details reveals that whenever a man had more than one wife (or concubine), there was always trouble attached. Widows were allowed to remarry as well (for practical reasons), and here too, there is a symbolic component—something beyond the scope of this present chapter’s subject, I’m afraid. But the bottom line is, God never authorized or promised to bless the sort of non-committal serial monogamy that’s so prevalent in today’s world, much less the unrestrained licentiousness that characterizes broad swaths of humanity in these Last Days—the kinds of things that make sexually transmitted diseases a deadly and growing phenomenon. After all, “He who commits sexual immorality sins against his own body.”

They used to be called VDs, “venereal diseases” (named after Venus, the pagan Roman goddess of love and sex). But STD (sexually transmitted disease) was a more accurate term. Lately, that label has been redefined somewhat by the even broader “STI”—sexually transmitted infection—since “disease” may imply that the carrier is symptomatic, and this is not always the case. But it doesn’t really matter what you call them: these are self-inflicted curses, and totally preventable (which is not to say they’re totally curable once you’ve contracted them).

Unlike other disease “families,” STDs are defined by their mode of transmission (sexual contact) rather than the type of organism that causes the infection, bugs that run the gamut of germ classifications. Wikipedia provides the following list of STDs, grouped by organism type:

(1) Bacterial: Chancroid (Haemophilus ducreyi); Chlamydia (Chlamydia trachomatis); Gonorrhea (Neisseria gonorrhoeae), colloquially known as “the clap”; Granuloma inguinale (or Klebsiella granulomatis); and Syphilis (Treponema pallidum).

(2) Fungal: Candidiasis (yeast infection).

(3) Viral: Viral hepatitis (Hepatitis B virus)—spread via saliva and venereal fluids; Herpes (Herpes simplex virus 1, 2)—skin and mucosal contact, transmissible with or without visible blisters; HIV (Human Immunodeficiency Virus)—venereal fluids, semen, breast milk, blood; HPV (Human Papillomavirus)—skin and mucosal contact. “High risk” types of HPV can cause cervical cancers, as well as some anal, penile, and vulvar cancers. Some other types of HPV cause genital warts; and MCV (molluscum contagiosum virus)—close contact.

(4) Parasites: Crab louse, colloquially known as “crabs” or “pubic lice” (Pthirus pubis); Scabies (Sarcoptes scabiei). 

(5) Protozoal: Trichomoniasis (Trichomonas vaginalis), colloquially known as “trich.”

It is virtually impossible to come down with an STD without sexual contact somewhere in the loop. The Wikipedia article explains: “As may be noted from the name, sexually transmitted diseases are transmitted from one person to another by certain sexual activities, rather than actually being caused by those sexual activities. Bacteria, fungi, protozoa or viruses are still the causative agents. It is not possible to catch any sexually transmitted disease from a sexual activity with a person who is not carrying a disease; conversely, a person who has an STI got it from contact (sexual or otherwise) with someone who had it, or his/her bodily fluids. Some STIs such as HIV can be transmitted from mother to child either during pregnancy or breastfeeding.”

These diseases are passed from one person to another by fluid transfer (blood, semen, saliva, mucous secretions, etc., varying from one disease to another), so the body’s orifices (whether designed by the Creator as sexual apparatus or not) offer pretty much the only opportunity for such fluid transfer to occur. “Although the likelihood of transmitting various diseases by various sexual activities varies a great deal, in general, all sexual activities between two (or more) people should be considered as being a two-way route for the transmission of STIs, i.e., ‘giving’ or ‘receiving’ are both risky although receiving carries a higher risk.”

All of this raises the question: if you can only get an STD from someone who was infected before you, how did these diseases begin in the first place? You’re not going to like the answer, I’m afraid. I was only able to track down a few of them, but they point toward parallel perversions. Syphilis comes from a bacterium that lives naturally (and harmlessly) in sheep; and the HIV virus was first contracted by humans having sexual contact with primates—apes—in Africa. Chlamydia is caused by bacteria carried by dogs. Herpes is endemic in chimpanzee populations. In other words, the problem is bestiality. I’m having trouble seeing the attraction, but I’m told it exists.

Yahweh even went out of His way to forbid the practice in the Torah. Immediately after He forbade homosexuality by saying “You shall not lie with a male as with a woman. It is an abomination,” God commanded, “Nor shall you mate with any animal, to defile yourself with it. Nor shall any woman stand before an animal to mate with it. It is perversion. Do not defile yourselves with any of these things; for by all these the nations are defiled, which I am casting out before you. For the land is defiled; therefore I visit the punishment of its iniquity upon it, and the land vomits out its inhabitants.” (Leviticus 18:22-25) Homosexuality and bestiality were listed as specific reasons why the seven Canaanite nations were being evicted from the Land of Promise. The practices were so common, and so intimately associated with idolatry, that Yahweh promised to throw the Israelites out of the Land too if they tolerated or practiced these perversions of the natural order. Part of the prohibition (a small part, to be sure) was because God didn’t want His people physically weakened by the diseases that can arise and spread from doing these things. So later, the punishment was prescribed: “If a man mates with an animal, he shall surely be put to death, and you shall kill the animal. If a woman approaches any animal and mates with it, you shall kill the woman and the animal. They shall surely be put to death. Their blood is upon them.” (Leviticus 20:15-16) Dead perverts spread no diseases.

The so-called “gay” lobby hates Yahweh for raining on their gay-pride parade. He’s such a killjoy, they say. If I want to stick my sexual apparatus into a wood chipper, I should be allowed to do it! In fact, all you boring “straights” should be required to tolerate—and even support—my God-given right to do whatever God warned us not to do! But God was showing His people how to remain healthy, whether on a physical level or as a spiritual metaphor. He was saying something that you’d think would have been so obvious as to not even require mention: Please don’t soak your head in kerosene and light it on fire. I have no earthly idea where gonorrhea or trichomoniasis first came from. I don’t think I want to know.

Conventional “wisdom” universally recommends “safe sex,” that is, using a condom when having sex with total strangers. While condoms lower the odds of contracting an STD (or causing a pregnancy), this advice misses the point altogether. The only really safe sex is to restrict your sexual contact to your lifelong spouse—the wife (or husband) of your youth—exclusively. Within the marriage bond, however, God encourages—even commands—sex. I’ve always found it strange that so many folks think that God is somehow prudishly anti-sex. He isn’t. He invented it, for cryin’ out loud. And His very first commandment to Adam and Eve was to “Be fruitful and multiply: fill the earth.” (Genesis 1:28) That requires sex, and lots of it. God didn’t forbid sex; He merely told us how to avoid the health issues that can result from its misuse. You don’t have to recognize its symbolic significance—that Yahweh’s relationship with His people is like that of a committed married couple: loving, affirming, fruitful, enduring, and pleasurable—to benefit from the practical results of keeping His Instructions.

In fact, the only time Yahweh said to abstain from sex was during the wife’s menstrual period, typically lasting about five days per month. “You shall not approach a woman to uncover her nakedness as long as she is in her customary impurity.”  (Leviticus 18:19) As usual, there are practical reasons backing up God’s law, though we should follow His Instructions even if we have no clue as to what those reasons are. Intercourse during menstruation, as it turns out, makes a woman more vulnerable to a variety of vaginal infections, and puts her at greater risk for cervical cancer. Moreover, abstinence during menstruation is known today to be a safe, low-tech method for enhancing a couple’s fertility.  

Paul, one of the foremost Torah scholars of his day, suggested this cyclical pattern: “Do not deprive one another [of sexual contact] except with consent for a time [during the wife’s menstrual cycle], that you may give yourselves to fasting and prayer; and come together again so that Satan does not tempt you because of your lack of self-control.” (I Corinthians 7:5) We can only imagine how different the attitude and walk (and love life, for that matter) of the average young Christian husband would be if he and his wife “gave themselves to fasting and prayer” in place of sex for five or six days out of every month while God took care of the routine periodic maintenance chores on his wife’s reproductive apparatus.

The CDC reports (2011) that the rates of STDs are generally increasing, especially among adolescents and young adults. Sadly, it is a “given” that these statistics include very few in traditional marriages. Virtually all STDs, rather, are the result (at some point in the chain of transmission) of extramarital sex. “Estimates suggest that young people aged 15–24 years acquire nearly half of all new STDs. Compared with older adults, sexually active adolescents aged 15–19 years and young adults aged 20–24 years are at higher risk of acquiring STDs for a combination of behavioral, biological, and cultural reasons…. The higher prevalence of STDs among adolescents also may reflect multiple barriers to accessing quality STD prevention services, including lack of health insurance or ability to pay, lack of transportation, discomfort with facilities and services designed for adults, and concerns about confidentiality.” Gee, kids, you’ve managed to catch grown-up diseases. Maybe it’s time to grow up a little yourselves. “Traditionally, intervention efforts have targeted individual-level factors associated with STD risk which do not address higher-level factors (e.g., peer norms and media influences) that may also influence behaviors. Interventions for at-risk adolescents and young adults that address underlying aspects of the social and cultural conditions that affect sexual risk-taking behaviors are needed, as are strategies designed to improve the underlying social conditions themselves.”

I don’t know which saddens me more: the fact that children as young as fifteen are routinely contracting STDs, that our prevailing “social and cultural conditions” encourage “sexual risk-taking behaviors,” or that the CDC (or any other “authority figure”) has nary a clue as to why this is happening or what to do about it. The real underlying cause of the increase in STDs is not “behavioral, biological, or cultural.” It’s not even moral, exactly. It’s spiritual. The past two or three generations in this country have been unrelentingly taught that “man is his own god,” that “you’re only an animal,” and “if it feels good, do it.” And then our government is shocked when social diseases run rampant. The best advice they can conjure up is “wear a condom.” But if someone at the CDC were to publicly recommend that people should remain virgins until marriage, and then remain faithful and sexually exclusive for their whole lifetimes, he’d no doubt be fired on the spot, branded as some sort of hysterical religious bigot.

So here are the CDC’s findings: “Chlamydia: Rates of reported chlamydial infection among persons aged 15–19 years and 20–24 years continue to increase. During 2010–2011, rates increased 4.0% for those aged 15–19 years and 11.0% for those aged 20–24 years…. Gonorrhea: During 2010–2011, gonorrhea rates remained essentially unchanged for persons aged 15–19 years (decreased 0.1%) and increased for persons aged 20–24 years (5.8%)…. Syphilis: rates among women aged 15–19 years increased annually during 2004–2009, from 1.5 cases per 100,000 females to 3.3 cases in 2009, but decreased to 2.9 cases in 2010 and 2.4 cases in 2011.”

And what about AIDS or HIV—the STD that has been getting the lion’s share of the media attention and research money for the past twenty years? “HIV and AIDS remain a persistent problem for the United States and countries around the world. While great progress has been made in preventing and treating HIV, there is still much to do…. About 50,000 people get infected with HIV each year. In 2010, there were around 47,500 new HIV infections in the United States…. If we look at HIV infection by race and ethnicity, we see that African Americans are most affected by HIV. In 2010, African Americans made up only 12% of the US population, but had 44% of all new HIV infections. Additionally, Hispanic-Latinos are also strongly affected, making up 17% of the US population, but had 21% of all new HIV infections. If we look at HIV infections by how people got the virus (transmission category), we see that men who have sex with men (MSM) are most at risk. In 2010, MSM had 63% of all new HIV infections, even though they made up around 2% of the population. Individuals infected through heterosexual sex made up 25% of all new HIV infections in 2010.” That last statistic, of course, doesn’t factor in the participants’ previous sexual liaisons. Gay men aren’t always exclusively homosexual.

Let those statistics sink in. Although the LGBT (lesbian, gay, bisexual, transgender) lobby would like you to believe that one person in every four shares their perverted proclivities, the real number totals out to about 3.5%, and (as the CDC reported above) the number of homosexual men is only about 2% of the population. That means that AIDS is still—after all these years—spread primarily by men having sex with other men: over thirty times more frequently than their statistical demographic would indicate. Why, then, do they rail against God, who warned them, saying, “You shall not lie with a male as with a woman. It is an abomination”? They have no one to blame but themselves for spreading the curse of AIDS. What was it Solomon said? “Reproof is more effective for a wise man than a hundred blows on a fool.” (Proverbs 17:10)
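The “over thirty times” figure above is simple arithmetic on the CDC percentages just quoted. A minimal sketch in Python (purely illustrative; the two percentages are the ones cited in the text) makes the calculation explicit:

```python
# Share of new U.S. HIV infections (2010) vs. share of the population,
# using the percentages quoted from the CDC above.
msm_infection_share = 0.63   # MSM accounted for 63% of new HIV infections
msm_population_share = 0.02  # MSM make up roughly 2% of the population

# How many times more often does infection occur in this group
# than its population share alone would predict?
overrepresentation = msm_infection_share / msm_population_share
print(round(overrepresentation, 1))  # → 31.5, i.e. "over thirty times"
```
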

So the CDC reminds us, “In the United States, about 15,500 people with AIDS died in 2010. HIV disease remains a significant cause of death for certain populations. To date, more than 635,000 individuals with AIDS in the United States have died.” And beyond our borders? “HIV disease continues to be a serious health issue for parts of the world. Worldwide, there were about 2.5 million new cases of HIV in 2011. About 34.2 million people are living with HIV around the world. In 2010, there were about 1.8 million deaths in persons with AIDS, and nearly 30 million people with AIDS have died worldwide since the epidemic began. Even though Sub-Saharan Africa bears the biggest burden of HIV/AIDS, countries in South and Southeast Asia, Eastern Europe and Central Asia, and those in Latin America are significantly affected by HIV and AIDS.”

The world spends between 8 and 10 billion dollars per year on AIDS research, when any idiot knows that it is 100% preventable—just don’t have sex with people (especially homosexual men) who have the HIV virus. But as I said, they’re looking for that magic bullet, that miracle pill, that can take all the risk out of what appears to be suicidal sexual behavior. Call me unsympathetic, but it seems to me there’s nothing particularly “gay” about harboring a death wish.

Ironically, a report published on WhyDontYouTryThis.com claims that “Scientists at the Washington University School of Medicine in St. Louis found that melittin, a toxin found in bee venom, physically destroys the HIV virus, a breakthrough that could potentially lead to drugs that are immune to HIV resistance.” Why is that ironic? Because (1) bees, as we noted previously, are becoming an increasingly endangered species. (2) It’s a natural cure, one with which the drug companies would have a hard time making a killing (so to speak). And (3) the bee-cure researchers state, “Melittin-loaded nanoparticles are well-suited for use as topical vaginal HIV virucidal agents.” Apparently, they didn’t get the memo about the vast majority of HIV infection being spread by men having sex with other men.

A CDC “fact sheet” (February 2013) summarized the extent of the sexually transmitted infection problem in the United States. “CDC’s new estimates show that there are about 20 million new infections in the United States each year [that’s one person in sixteen], costing the American healthcare system nearly $16 billion in direct medical costs alone. America’s youth shoulder a substantial burden of these infections. CDC estimates that half of all new STIs in the country occur among young men and women (up to age 24). In addition, CDC published an overall estimate of the number of prevalent STIs in the nation. Prevalence is the total number of new and existing infections at a given time. CDC’s new data suggest that there are more than 110 million total STIs among men and women across the nation.” Don’t let that last statistic escape unnoticed: one out of every three people living in America is walking around with one or more sexually transmitted diseases!
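The bracketed “one person in sixteen” note and the closing “one out of every three” claim follow directly from the CDC’s numbers, if we assume a U.S. population of roughly 316 million in 2013 (the population figure is my assumption, not part of the fact sheet):

```python
# Back-of-envelope check of the CDC estimates quoted above.
us_population = 316_000_000          # assumed 2013 U.S. population
new_infections_per_year = 20_000_000
prevalent_infections = 110_000_000

print(round(us_population / new_infections_per_year))  # → 16: "one person in sixteen"
print(round(us_population / prevalent_infections, 1))  # → 2.9: roughly one in three

# Caveat: prevalence counts infections, not people; since some people
# carry more than one STI, "one in three Americans" is an approximation.
```
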

Remember our discussion of multidrug-resistant bacteria strains? Some of these “superbugs” are venereal diseases. Fox News (September 16, 2013) reported, “Antibiotic-resistant gonorrhea, a diarrhea-causing superbug, and a class of fast-growing killer bacteria dubbed a ‘nightmare,’ were classified as urgent public-health threats in the United States on Monday. According to a new report by the U.S. Centers for Disease Control and Prevention (CDC), at least 2 million people in the United States develop serious bacterial infections that are resistant to one or more types of antibiotics each year, and at least 23,000 die from the infections….”

When antibiotics like penicillin first became available, they were quite effective against STDs like syphilis and gonorrhea. But now that mankind’s vigilance has dropped to the same level as our morals, we find ourselves in trouble once again. “Overprescribing of antibiotics is a chief cause of antibiotic resistance, affording pathogens the opportunity to outwit the drugs used to treat them. Only a handful of new antibiotics have been developed and brought to market in the past few decades, and only a few companies are working on drugs to replace them. In addition to resistant gonorrhea, the others now seen as urgent threats, according to the first-of-its-kind report released on Monday, are C. difficile [Clostridium difficile] and the killer class known as carbapenem-resistant Enterobacteriaceae, or CRE….

“Last March the chief medical officer for England said antibiotic resistance poses a ‘catastrophic health threat.’ That followed a report last year from the World Health Organization that found a ‘superbug’ strain of gonorrhea had spread to several European countries….”

The article discusses the dangers of C. difficile and CRE, and then goes on to enumerate the specific menace of “the third ‘urgent’ threat in the report…drug-resistant Neisseria gonorrhea, which causes 246,000 U.S. cases of the sexually transmitted disease gonorrhea each year. Gonorrhea is increasingly becoming resistant to tetracycline, cefixime, ceftriaxone and azithromycin—formerly the most successful treatments for the disease. Gonorrhea is especially troublesome because it is easily spread, and infections are easily missed. In the United States, there are approximately 300,000 reported cases [322,000 in 2011, making it the second most commonly reported notifiable infection in the nation—NBC], but because infected people often have no symptoms the CDC estimates the actual number of cases is closer to 820,000. If left untreated, gonorrhea can lead to pelvic inflammatory disease, ectopic pregnancy, stillbirths, severe eye infections in babies and infertility in men and women.

“Not surprisingly, [the report] underscores the need for new antibiotics, citing ever-slowing development efforts by pharmaceutical companies due to the high cost of such programs and relatively low profit margins of the drugs.” Yes, unfortunately, it’s not surprising. What would have been surprising is if they had recommended exercising a little sexual responsibility, a little self control.

Reporting on the same story, NBC News added, “The CDC report points out, there aren’t any new antibiotics in the immediate works that will kill any of these bugs. So patients may have to be treated with older, more toxic drugs, or with cocktails of antibiotics that may cause side-effects…. Gonorrhea may not be immediately life-threatening, but it’s developing resistance to the drugs that used to easily treat it. Patients can be left infertile, and in January, Canadian researchers reported that seven percent of patients weren’t cured by the only pill left to treat gonorrhea.” The whole issue also makes one other fact glaringly obvious: drug companies are not in business to cure disease, but to make money. That is their right, of course: no sane person would suggest that they aren’t entitled to earn a living. But we should never forget that profit—not health—is their driving motivation. It affects everything they do.

Here, then, are the basic facts: (1) In the United States (and perhaps worldwide, though accurate statistics are harder to come by) one third of the entire adult populace has (or has had) one or more sexually transmitted infections. (2) The moral/spiritual condition of the vast majority of mankind is not conducive to either lifelong marriage or sexual abstinence: promiscuity is the “new normal.” (3) The drugs once relied upon to cure these diseases are no longer as effective as they were half a century ago.

So the seemingly “hopeful” prognostication of the World Health Organization concerning what can be expected to kill us off in the fourth decade of the twenty-first century can be misleading: “Large declines in mortality between 2004 and 2030 are projected for all of the principal communicable, maternal, perinatal and nutritional causes, including HIV/AIDS, TB and malaria. Global HIV/AIDS deaths are projected to rise from 2.2 million in 2008 to a maximum of 2.4 million in 2012, and then to decline to 1.2 million in 2030, under a baseline scenario that assumes that coverage with antiretroviral drugs continues to rise at current rates. Overall, noncommunicable conditions [i.e., cancer, heart disease, etc.] are projected to account for just over three quarters of all deaths in 2030.” We’re all going to die, of course. The question is, “From what?”

Why is this misleading? Because most sexually transmitted diseases don’t kill you outright. They merely make your life miserable, unfruitful, and unproductive. Instead of being part of the solution, a person with an untreated STD eventually becomes a drag on society, requiring more and more “symptom management” as the disease progresses. And getting “treatment,” as we have seen, is getting harder by the day, as the arsenal of drugs becomes progressively more ineffective. The WHO’s faith in antiretroviral drugs to treat AIDS is misplaced: they don’t cure the disease—they merely make it possible for you to live with it a bit longer—a win-win scenario for the companies that manufacture these expensive drugs. Once again we must ask ourselves how much strain humanity can put on its own society before it breaks down altogether. What will happen when twenty percent of the populace is asked to carry the weight of the other eighty? Although there are (as we are learning) plenty of candidates for the job, STDs could easily become the proverbial “straw that breaks the camel’s back.” And by the fourth decade of the twenty-first century, sexually transmitted diseases could easily reach such epidemic proportions that finding an uninfected mate could prove to be next to impossible.

Culturally-Inflicted Pestilence

Our “civilization” itself is the source of a whole range of ailments that are making our existence on this planet increasingly untenable. Some of these afflictions are psychological, some environmental, and some physical—but they all conspire to make humanity less able to function as we were intended in this world. We touched on a few of them in other contexts—factors such as air pollution and nutrient depletion. Cultures (and the challenges they present) vary from region to region, of course, so this section will be primarily concerned with the hazards of living in “developed” nations—places (like the U.S. and Western Europe) where we tend to presume that “life is good,” even if it’s not, simply because we don’t have to get our water from a creek or share our living rooms with goats.

One of the most obvious “cultural pestilences” is obesity. It’s a two-edged sword, for while our culture promotes an ideal of “thin at all costs,” it’s not all a style thing. The fact is, we actually have become fatter over the past half century. The healthyliving.msn.com website (Harvard Health Publications) asks, “What accounts for a worldwide epidemic of obesity? It’s hard to understand how human genetics, hormone levels, or metabolic activity could change rapidly and simultaneously in millions of people, yet obesity has been increasing sharply throughout the industrialized world. In less than 40 years, the prevalence of obesity in the U.S. has increased by over 50 percent, so that two of every three American adults are now overweight or obese. Even worse, the obesity epidemic is rapidly spreading to our children.

“Diabetes, hypertension, and heart disease are the most obvious consequences of obesity, but other results range from cancer, arthritis, and depression to kidney stones, fatty liver disease, and erectile dysfunction. All in all, obesity and overweight account for nearly one of every 10 American deaths, and they also drain our society of $223 billion a year [i.e., in health-related expense].” On the other hand, Americans spend over $40 billion a year on diet products and self-help books. We spend $2.6 billion annually on gym memberships (most of it wasted because we don’t actually go). The fitness industry generates $19 billion in business—gotta have the proper shoes, right? $1.5 billion is spent in America every year on dietary supplements. Consumers in the U.S. spend over $60 billion per year trying to lose weight—that’s $200 for every man, woman and child in the country. Diet pills and meal replacements are a $3 billion market. We spend another billion dollars on diet food home delivery services like NutriSystem.

So why are so many of us still so fat? Harvard Health again: “Using data from the U.S. Bureau of Labor Statistics, researchers evaluated the relationship between physical activity in the workplace and obesity over the past several decades. In 1960, nearly half the jobs in the private sector required at least moderate physical activity, but in 2010, less than 20 percent demanded this much physical work. Advances in manufacturing and agriculture explain the drop in human energy needed at work. That’s good news for a man’s back, but not for his belly. In fact, the change in occupational energy expenditure means that the average American man is now burning 142 fewer calories each day than he did in the 1960s. That may not sound like much, but over the years, it adds up. Between 1960 and 1962, the average American man weighed 169 pounds, but during the 2003–2006 time period, he weighed 202 pounds. A decreased energy output of 142 calories a day can account for 28 of those extra 33 pounds. And another 2011 study, this one of 288,498 Europeans, found that inactivity packs on extra weight where it is most harmful, in the abdomen.”

I can probably speak for half of America when I opine that computers are a big part of the obesity problem (not that I’d want to go back to working without one). Speaking strictly for myself, I’m not making any progress if I’m not sitting in this chair, either pounding away at the keyboard or prowling the Internet, my Bible software, or my library looking for clues to life’s mysteries. Even back in the days when I earned my living as a graphic designer, most of my income was earned sitting down. There are any number of people who can relate.

“A decrease in physical activity at work would not lead to weight gain if it were counterbalanced by an increase in leisure-time exercise. Unfortunately, that hasn’t happened; over the decades, the fraction of Americans who say they meet national guidelines for leisure-time exercise has remained stable at 25 percent — but objective measurements suggest the actual percentage of adults who get enough exercise is closer to 5 percent. And all you have to do to get the leisure-time exercise you need is to walk for 30 minutes a day. If people don’t work out in their spare time, what do they do? They sit still; the average American, in fact, spends 55 percent of his waking hours sitting down. And when Americans sit, they are often perched in front of a video display, either a workplace computer or a living room TV. Sedentary work is an inevitable byproduct of the Information Age, but TV watching is voluntary and optional—and it often involves watching seductive ads for junk foods just when snacks are close at hand.” To be fair, of course, reading good books is just as sedentary an activity.

And then, there’s the “calorie creep” factor. Simply stated, we eat more than we used to. “To find out how changing eating habits affect weight, researchers from the University of North Carolina evaluated data from four large national surveys that included 44,754 Americans ages 19 and above. The research covered a 30-year span from 1977 through 2006, during which time the national waistline continued to expand.

“The average daily caloric intake increased steadily during the study period, but the basis for the increase changed over time. During the first half, increasing portion sizes accounted for the lion’s share of the caloric splurge. But in the 1990s, doctors and nutritionists sounded the alarm about the supersizing of American snacks and meals. The warnings seemed to work: starting in 1994, portion size stabilized and then dropped slightly. Food choices also appeared to improve a bit, since the consumption of calorie-rich foods dipped a bit in the 1990s. Was it a triumph for doctors and nutritionists? Sadly, it was not. Although the portion size and caloric density of the average American diet changed for the better, the improvements were very slight. Even worse, these small gains were more than offset by a new threat. Although the caloric content of individual meals and snacks stabilized, Americans began eating more often. Over the 30-year period, the average number of meals and snacks rose from 3.8 a day to 4.9 a day.”

It all happened so gradually, nobody really saw it happening. We were all just going about our lives, chalking up our expanding waistlines to the passing of the years—the “new normal.” “All in all, both portion size and eating frequency accounted for the rise in caloric intake; sugar-sweetened sodas made the single largest contribution to the caloric glut. Because the increases accrued slowly but steadily over 30 years, the annualized average calorie intake increased by 28 calories a day. That may not sound like much, but over three decades, it adds up to several notches on the typical guy’s belt.” A few paragraphs back, we learned that “the average American man is now burning 142 fewer calories each day than he did in the 1960s.” Now we see that he’s consuming 28 more calories—for a net gain of 170 calories—something it would take about half an hour of fast walking every day for the average guy to burn off.
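The net figure here is simple addition, and the “half an hour of fast walking” estimate follows if we assume fast walking burns roughly 5 to 6 calories per minute (the walking rate is my assumption; the two calorie figures are the ones quoted above):

```python
# Net daily calorie surplus implied by the two quoted figures.
fewer_burned = 142  # kcal/day less occupational activity than in the 1960s
more_eaten = 28     # kcal/day higher intake over the 30-year study period

net_surplus = fewer_burned + more_eaten
print(net_surplus)  # → 170 kcal/day

# Time needed to walk it off, assuming ~5.5 kcal/minute of fast walking.
walk_kcal_per_minute = 5.5
print(round(net_surplus / walk_kcal_per_minute))  # → 31 minutes, about half an hour
```
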

I should also reiterate one factor from a previous chapter: perhaps one of the reasons we eat more (when given the chance) is that the nutritional content of our food has plummeted over the past half century. There are fewer vitamins and minerals—by a wide margin—in any given carrot, apple, or slice of bread than would have been there in times past, even though the calorie count has remained relatively constant. So it’s entirely possible that we are subconsciously trying to make up for the deficit.

Another factor that has become increasingly significant during the past half century is leptin resistance. Nutrition guru J.J. Virgin, in her best-selling book The Virgin Diet, writes, “Inflammation…creates leptin resistance, which makes it even harder for you to lose weight. Leptin is a hormone that responds to how much you’ve eaten and signals the brain that you’ve had enough (satiety is the technical term). Leptin also helps burn fat by cueing your metabolism to run faster when you have extra fat to burn and slowing down your metabolism when your body needs to hold onto fat…. When leptin is working well, you eat until you don’t need any more food and then you stop. When your system is leptin-resistant, you have a lot of leptin circulating in your blood, but your brain can’t ‘hear’ it…. As a result, [overweight people] experience increased hunger, food cravings and weight gain. Even though they’re overweight, their bodies believe they’re hungry and go into a state of fat storage.”

Her recommendations for reducing leptin resistance and reducing inflammation include these four nutritional targets: (1) “Instead of consuming inflammatory fats from processed foods, corn, dairy, and eggs, switch to anti-inflammatory fats from wild fish, raw nuts and seeds, and olive oil…. (2) Get rid of the sugar, artificial sweeteners and high-glycemic foods. Sweet and starchy foods raise blood sugar, which raises insulin, which leads to inflammation…. (3) Genetically modified organisms (GMOs)…can disrupt your healthy gut flora (intestinal bacteria), trigger an immune response and create inflammation…. (4) Let go of the top 7 high-FI foods, the ones most likely to cause an adverse reaction.” These foods are gluten, soy, dairy, eggs, corn, peanuts, and sugar and artificial sweeteners (which she counts as a single category).

Obesity is a growing problem not just in the West, but globally. The World Health Organization reports that “worldwide, obesity has nearly doubled since 1980. In 2008, more than 1.4 billion adults 20 and older were overweight. Of these, over 200 million men and nearly 300 million women were obese. 35% of adults aged 20 and over were overweight in 2008, and 11% were obese. 65% of the world’s population live in countries where overweight and obesity kills more people than underweight. More than 40 million children under the age of five were overweight in 2011.” Even in China, the most populous nation on earth, obesity has become an issue in the cities—where increasing prosperity has led to expanding waistlines. “Recent statistics have showed that the average waistline of Chinese urban males has gone from 63.5 cm in 1985 to 76.2 cm in 2012, growing by 20 percent over just 27 years.”—China.org.  


What do people do when they finally decide to get serious about their obesity problem? They go on a diet. There are a thousand ways to do it, and a few of them actually work—for some people, anyway. Health considerations are one thing. The cultural impetus, on the other hand, works out like this: since people who look healthy are more sexually attractive, most people calculate (consciously or unconsciously) that if they look better, they’ll get more sex, all things considered. (That’s why guys reflexively suck in their guts when a pretty woman walks by—it’s practically automatic.) Taken to extremes, however, obsession with self-image and body weight can lead to destructive eating disorders.

The sex drive is part of human nature, of course. It keeps the species viable. God intended it to be channeled through marriage and family, but sadly, since temporary liaisons have become the norm (replacing lifelong marriage commitments), “looking for love (in all the wrong places)” has become a constant avocation for many people. Furthermore, since obesity is universally acknowledged to be unhealthful today, it has taken on the cultural connotation of being unattractive (read: “un-sexy”). Therefore, the goal in modern society is to become healthy, or more to the point, look healthy. (It’s funny how perception changes. A couple of thousand years ago, being fat meant you were wealthy and lived a life of leisure—hence you were by definition “attractive” and “admired.” The only guys with “six-pack abs” were slaves and common laborers.)

Being healthy is a good thing, of course. Our neshamah-equipped bodies are designed to be the temples of the Holy Spirit—a mind-blowing concept if you stop and think about it—so it makes sense to take care of them and treat them with respect, just as with any valuable gift. But our obsession with being thin—whether or not that’s an accurate reflection of our health status—began to be manifested in what are called “eating disorders,” mental illnesses in which people become preoccupied with food and its effect on their bodies. Nutrition, taste, energy, health—all of these things take a back seat to the fact that food is, in the end, what makes a fat person fat: it is deemed the enemy. The supposed excesses of obesity are thus traded for completely different excesses: strategies designed to make one thin, whatever the cost, whatever the consequences.

Image issues—skewed perceptions of what the ideal person looks like, no matter how unrealistic—often play a part. But in truth, that’s an oversimplification. In some cases, logic and purpose have nothing to do with it: there is a complicated dynamic between eating disorders and other types of psychological dysfunction—PTSD, sexual trauma, personality disorders, substance abuse, anxiety, and impulse control issues.

Wikipedia defines the three most prominent types of eating disorder:

(1) “Anorexia nervosa is an eating disorder characterized by immoderate food restriction and irrational fear of gaining weight, as well as a distorted body self-perception. It typically involves excessive weight loss and occurs far more often in females than in males. Because of the fear of gaining weight, people with this disorder restrict the amount of food they consume. This restriction of food intake causes metabolic and hormonal disorders…. Patients suffering from anorexia nervosa may experience dizziness, headaches, drowsiness and a lack of energy.” The word anorexia means “loss of appetite,” but that’s not strictly true in the case of this malady. Hunger is still present, but it is suppressed by the sufferer. The condition actually presents as compulsive fasting, driven by skewed image issues—a feeling that she (there are ten times as many anorexic women as there are men) is fatter than she really is, even if she is actually underweight. Anorexia has the highest fatality rate of any mental illness—as many as four percent of anorexic individuals die from complications of the disease.

(2) “Bulimia nervosa is an eating disorder characterized by binge eating and purging, or consuming a large amount of food in a short amount of time followed by an attempt to rid oneself of the food consumed (purging), typically by vomiting, taking a laxative, diuretic, or stimulant, and/or excessive exercise, because of an extensive concern for body weight…. Bulimia is also commonly accompanied with fasting over an extended period of time. These dangerous, habit-forming practices occur while the sufferer is trying to keep their weight under a self-imposed threshold. It can lead to potassium loss and health deterioration, with depressive symptoms that are often severe and lead to a high risk of suicide. Bulimia nervosa is considered to be less life threatening than anorexia; however, the occurrence of bulimia nervosa is higher….” As many as 4% of women will develop the condition sometime during their lives, but only 6% of them will seek treatment.

“Patients with bulimia nervosa often have impulsive behaviors involving overspending and sexual behaviors as well as having family histories of alcohol and substance abuse, mood and eating disorders. The overwhelming majority (90–95 percent) of individuals with bulimia are women…. Among women, adolescents are the most at risk. A survey of 496 adolescent girls reported that more than 12 percent experienced some form of eating disorder by the time they were 20. Anorexia and bulimia are the most culturally specific psychological disorders yet identified, and for many young women, looking good for their social life is much more important than being healthy. Self-worth, happiness, and success are largely determined by body measurements and percentage of body fat for young women. Over the years the size and weight of the average woman has increased with improved nutrition, but there has also been an increased message from the media to be thin. The media projects a thin ideal rather than a healthy ideal, and this causes women and young girls to work toward having a thin body even if it means purging.”

(3) “Binge eating disorder (BED) is the most common discrete eating disorder in the United States, affecting 3.5% of females and 2% of males, and is prevalent in up to 30% of those seeking weight loss treatment…. BED is frequently comorbid with obesity, although it can occur in normal weight individuals. There may be a genetic inheritance factor involved in BED independent of other obesity risks, and there is also a higher incidence of psychiatric comorbidity, with the percentage of individuals with BED and an Axis I [i.e., psychological diagnostic categories other than mental retardation and personality disorder—including depression, anxiety disorders, bipolar disorder, ADHD, autism spectrum disorders, anorexia nervosa, bulimia nervosa, and schizophrenia] comorbid psychiatric disorder being 78.9%, and for those with subclinical BED, 63.6%.”

In other words, binge eating is not the same as ordinary gluttony: it’s an abnormal psychological compulsion. Those who binge eat often suffer from other common psychiatric disorders as well. About 2.8% of American adults will struggle with Binge Eating Disorder sometime during their lifetimes. But almost 43% of individuals suffering from BED will obtain treatment—the higher statistic most likely due to the fact that many are already being cared for by psychiatric professionals, who spot the danger signs as they deal with their patients’ other problems.

EatingDisorderHope.com provides these statistics, revealing how pervasive the eating disorder problem has become, at least in America: “Eating disorders are a daily struggle for 10 million females and 1 million males in the United States. Four out of ten individuals have either personally experienced an eating disorder or know someone who has.” So although only about three and a half percent of the populace is laboring under an eating disorder at any given time, forty percent of the population is, to one extent or another, bogged down dealing with the disorder. The question—one I’ve asked before—is, how many people can be forced into the role of caregiver before the whole system breaks down? I’m told that the Viet Cong discovered that if you kill an American soldier you’ve taken one man out of the fight; but if you merely disable him, you remove him plus four of his comrades in arms—in addition to sucking up untold resources behind the scenes. That’s how you defeat an enemy fifty times your strength.

Anyway, “Over a lifetime, the following percentages of women and men will experience an eating disorder. Female eating disorder prevalence rates: 0.9% of women will struggle with anorexia in their lifetime, 1.5% of women with bulimia, and 3.5% with binge eating. Male eating disorder statistics: 0.3% of men will struggle with anorexia, 0.5% with bulimia, and 2% with binge eating disorder.

“Prevalence rates of eating disorders in adolescents/students: The National Institute of Mental Health reports that 2.7% of teens, ages 13-18 years old, struggle with an eating disorder…. 50% of teenage girls and 30% of teenage boys use unhealthy weight control behaviors such as skipping meals, fasting, smoking cigarettes, vomiting, and taking laxatives to control their weight. 25% of college-aged women engage in bingeing and purging as a method of managing their weight. 

“Prevalence of eating disorders among athletes: 13.5% of athletes have subclinical to clinical eating disorders. 42% of female athletes competing in aesthetic sports [e.g. dancing or figure skating] demonstrated eating disordered behaviors.

“Dieting statistics and prevalence: Over 50% of teenage girls and 33% of teenage boys are using restrictive measures to lose weight at any given time. 46% of 9-11 year-olds are sometimes, or very often, on diets, and 82% of their families are sometimes, or very often, on diets. 91% of women recently surveyed on a college campus had attempted to control their weight through dieting; 22% dieted often or always. 95% of all dieters will regain their lost weight in 1-5 years. 35% of normal dieters progress to pathological dieting. Of those, 20-25% progress to partial or full-syndrome eating disorders. 25% of American men and 45% of American women are on a diet on any given day.”

In a perfect world, the only “eating disorder” would be over-doing it a bit when you gathered at special times with your friends and family to celebrate Yahweh’s love and provision. God Himself set aside three times a year when this sort of thing was something of a “given.” The Feast of Unleavened Bread (the spring Passover celebration), the Feast of Weeks (seven weeks later), and the Feast of Tabernacles (the harvest festival in the fall) were times when the entire nation of Israel was to gather together and have a huge party—two of them lasting an entire week. Sacrifices of praise and thanksgiving to God were shared between the worshipers, Levites, and priests. These times—especially Tabernacles or Sukkot—practically begged for a bit of overindulgence. But the rest of the year, folks were expected to eat sensibly, work hard, and honor Yahweh in a somewhat—shall we say—less enthusiastic manner, as far as their eating habits were concerned.

In all of that, image perception, sex appeal, and fashion had nothing to do with how or what someone ate. In scripture, fasting is a companion of prayer, meditation, and repentance—not weight loss. (For that matter, fasting was never commanded in scripture—even on the Day of Atonement, when “affliction of soul” was the order of the day, rabbinical extrapolation notwithstanding. It was something one did voluntarily, like taking a Nazirite vow, and if done in the right spirit, was seen as honoring to God.) In the same way, observing the Levitical dietary rules (see Leviticus 11) wasn’t done to control one’s weight, but to demonstrate one’s trust in our Creator’s instruction—thousands of years before we had a clue as to why He had forbidden certain animals as food.

I suppose if you wanted to push the issue, you could say that any dietary practice that compromises the health of your body could be construed as an “eating disorder.” That being said, most of us don’t give our health a second thought when we sit down to eat. And maybe that’s for the best. These days, it’s practically impossible to know what’s in your food unless you grew it yourself—and not too many of us can do that anymore. But dietolatry (i.e., “diet idolatry”) comes in many forms: it could be avoiding (or purging) anything that you think might make you fatter than your super-model ideal (something known in the real world as “food”); but it could just as easily take the guise of the gourmet, glutton, health snob, cultural vegan (as opposed to the merely health conscious type), or junk food junkie. However we slice it, food makes for a poor deity. We are to have only one God—Yahweh. But we are to assimilate Him into our bodies: “Oh, taste and see that Yahweh is good!” (Psalm 34:8)  


When Christ told us we must “eat His flesh and drink His blood” if we wanted to have His life within us (John 6:53), the symbol He used implied something He didn’t say (not there, anyway): nothing needs to be added to what God provides to make our spiritual sustenance complete. Thus Yahweh warned the people through Moses: “Whatever I command you, be careful to observe it; you shall not add to it nor take away from it.” (Deuteronomy 12:32) And John, at the very end of the Bible, warned us in a similar vein not to add or subtract from God’s perfect prophetic recipe: “For I testify to everyone who hears the words of the prophecy of this book: If anyone adds to these things, God will add to him the plagues that are written in this book; and if anyone takes away from the words of the book of this prophecy, God shall take away his part from the Book of Life, from the holy city, and from the things which are written in this book.” (Revelation 22:18-19)

I mentioned a while back that I once designed food packages for a living. The ingredients lists were sometimes a real eye opener. The “health food” guys tended to have short lists, with short, pronounceable words in them, while some of the others read like a chemistry mid-term exam. The DiscoveryHealth website notes, “Oftentimes, you can look at the back of a food package and identify artificial ingredients, preservatives or some long name that doesn't exactly sound like food. But though they can spot them and know they aren’t a healthy option, most people don’t bother to seek these ‘ingredients’ out. In our culture, with such easy access to food, it’s easy to ignore the fact that most of the items we purchase in traditional supermarkets are processed and contain additives.”

The “why” of it isn’t terribly hard to figure out: “Man-made preservatives give food and cosmetics a longer shelf life, which allows manufacturers to bring in a bigger revenue. Additives are also used to preserve flavor and color. For centuries people have used salts, vinegars, herbs, boiling and refrigeration [i.e., ice] to naturally hold food items, but in the last 50 years man-made preservatives have become the common method.

“The most popular chemical additives in the food industry today are benzoates, nitrites, sulphites and sorbates. These additives kill and prevent molds and yeast from growing on food. Sulfur dioxide is the most common man-made preservative; it acts as a bleaching agent in food. There are more than 300 additives used today.” But are these additives dangerous? “It is not uncommon for a food additive that was originally believed to be safe for consumers to later be found toxic. Some studies have found additives are a source of headaches, nausea, weakness and difficulty breathing. New research [source: Karen Lau] has shown that the mixture of additives and certain foods can damage human nerve cells. The truth is, we do not understand all of the long-term effects that additives could have on our health because man-made additives are a relatively new invention.”

They list seven particularly dangerous food additives: food dyes; lead (found in juice drinks); BPA (Bisphenol A—a hormone-mimicking chemical found in 92% of all canned foods); phthalates (another class of hormone-disrupting chemicals found in canned foods); palm oil (a highly saturated fat); MSG (monosodium glutamate, linked to hormonal imbalances, weight gain, brain damage, obesity, and headaches); and hidden trans fats.

MedIndia.net lists their top twelve most dangerous food additives. Not surprisingly, there are some conspicuous overlaps with the Discovery Health list:

(1) “Sodium nitrate: Sodium nitrate tops the list of dangerous food additives. It is highly toxic and stimulates the formation of nitrosamines which are highly carcinogenic (cancer causing) in nature. This deadly compound takes up a good portion of our processed meats, acting as a good preservative which prevents bacterial growth and fast decaying of meat. Skip the hot dogs and bologna and choose from organic chicken and lean meats.” One statistical bombshell: children who eat over twelve hot dogs per month are nine times more likely to develop childhood leukemia—University of California Medical School.

(2) “Trans fats: Making up a considerable amount of our hamburgers, biscuits, chips and popcorns, Trans fats are responsible for triggering obesity and heart problems among millions of people around the globe. While saturated fats raise cholesterol levels, trans fats go a step further. They not only raise cholesterol levels, but also deplete the amount of HDL (the good cholesterol) in your body, making you more susceptible to heart disorders.

(3) “Olestra: Olestra is the ‘non-fat’ fat, found in your potato chips, which is actually nothing but a fake fat. However, it is known to bind to fat soluble vitamins A, D, E and K, which protect the body from cancer and boost immunity. The binding of olestra to these vitamins in general makes you much more prone to cancer. Also, it has demonstrated digestive upset in 12% of population. Replace your packet of potato chips for a fruit or a few wholegrain biscuits.

(4) “Propyl Gallate: Propyl Gallate is a component of meat products, vegetable oils, potato sticks, chewing gum and ready-to-make soup mixes, which prevents them from spoiling, basically acting as a preservative. Studies, however, show that regular consumption of these products may cause colon and stomach cancer.

(5) “Butylated hydroxyanisole: Functionally similar to Propyl Gallate, Butylated hydroxyanisole is responsible for preventing oils and other food stuffs from going rancid. However, research shows they both play a major role in causing cancerous tumors in rats.  

(6) “Monosodium Glutamate: Commonly known as Chinese salt, this compound features near the top of the list of dangerous food additives. Why? First, because MSG is now a part of almost everything you consume. This amino acid is used to flavor salads, soups and other food preparations and was typically used in Chinese food preparations. However, its use has now spread to the Eastern and Western areas, and by now, almost every restaurant or fast food joint literally coats their foods with MSG. Studies conducted on a number of rats showed that almost all of the rats taken under study reported a damaged hypothalamus (an important part of the brain concerned with regulatory activities of the body) and neurons of the inner retinal layer on just one dose of MSG. Also, MSG is used to lab-induce obesity among rats. What makes it far more dangerous is the fact that humans are 3-5 times more sensitive to MSG than rats.” Because of its deservedly bad health reputation, MSG masquerades under several innocuous pseudonyms on ingredients lists: hydrolyzed protein, sodium caseinate, autolyzed yeast or yeast extract, or even gelatin.

(7) “Aspartame: Responsible for causing brain tumors in rats, aspartame is an artificial sweetener—an alternative to sugar, used in low-cal diet foods. Overconsumption of these products is known to increase the susceptibility to lymphomas and leukemias. When it enters the body, aspartame converts to formaldehyde, and causes migraine, vision loss, seizures, multiple sclerosis and even Parkinson’s disease. Short term side effects include headache and dizziness.” It has also been linked to depression, irritability, phobias, severe PMS, hyperactivity in children, chronic fatigue syndrome, and fibromyalgia.

(8) “Potassium bromate: Potassium bromate is an oxidizing agent used in the bread-making process. This compound has demonstrated carcinogenic effects and has also proven to be nephrotoxic (toxic to the kidneys) both in man and animals. It has developed thyroid and kidney tumors among rats when they were fed bread using potassium bromate as the oxidizing agent.

(9) “Food coloring: Blue #1 is responsible for causing cancer, whereas red #40 may lead to ADHD (attention deficit hyperactivity disorder) in children. Red #3, which is widely used in baking and giving artificial color to cherries, induces thyroid tumors in rats. Stay away from foods and drinks that look unnaturally colorful and bright.” More on this in a bit.

(10) “Butylated hydroxytoluene: BHT is found in chewing gums, potato chips and other packaged foods as a preservative. In spite of being approved by the FDA, it has been proven to be cancer-causing, making it among the top 12 dangerous food additives. One simple way to avoid it is to check the label. Many brands do not use this preservative, so you can switch to using those brands of packaged foods.” By the way, it’s best to avoid Butylated hydroxyanisole (BHA) as well. For that matter, just skip whatever you can’t say: it obviously doesn’t belong on your tongue.

(11) “Acesulfame-K: A new sweetener which is now being used in various soft drinks and baked foods, Acesulfame-K, is actually 200 times sweeter than natural sugar, and is responsible for causing cancer among mice. Also, it affects the thyroid gland in other animals like rabbits and dogs.

(12) “Chloropropanols: This family of drugs is common in Asian food sauces like black bean, soy, and oyster sauce. Two specific substances within this category are known to be cancer producing and are banned in many countries.”

My wife, who suffers from a whole range of really annoying autoimmune diseases, has her own personal “Additives-To-Avoid” list (in addition to those listed above): Azodicarbonamide (found in bread); recombinant bovine somatotropin (rBST) and recombinant bovine growth hormone (rBGH), found in dairy products; anything GMO—corn, soybeans, canola oil, etc.; sodium myreth sulfate, sodium laureth sulfate, ethylhexylglycerin, PEG (Polyethylene glycol), TEA (Triethanolamine), and ceteareth (all found in personal care products); carrageenan (a thickener)… and the list goes on. Basically, we try not to eat anything we can’t pronounce.

If we’re eating processed foods (and most of us do to some extent) we are by definition eating chemical additives that are not really food at all. Typical is the ubiquitous ingredient sulfur dioxide. RawGuru.com reports: “The process of preserving food with sulphur dioxide is intended to provide a longer shelf life, kill harmful bacteria that might grow on vegetation and foods, and help food products maintain a certain visual appearance; however, it’s also considered one of the top six air pollutants. Sulphur dioxide occurs naturally in volcanic gases, in some dissolved waters of warm springs, and is a result of coal, fuel, and gas combustion. It is not confined to a specific area as it travels long distances in a short amount of time. It is produced industrially as a bleach alternative, a reducing agent, and for sulfites (preservation). As it has no role in humans or mammalian biology, when introduced it inhibits specific nerve signals, restricts lung performance, and is a direct allergen—over 65% of asthmatic children are sensitive to SO2 (World Health Organization, 1999), and it negatively affects over 70% of children with behavioral problems.

“Sulphur dioxide is used as a common sulphite that is added to dried fruit, fruit juices, fruit, breakfast cereals, and many processed snacks including cookies, soft drinks, meat, cereal bars, muesli bars, yogurt, ice cream, candy, frozen french fries, bread, margarine, and gluten-free flours. It is also used in the process of making wine, even wines that say ‘no sulphites,’ and is found in ingredients like vinegar, corn syrup, corn starch, maltodextrin, potato starch and flakes, beet sugar, bottled lemon juice flavor and dressings, and glucose syrup.”

They have good reasons for using it, of course. “Sulphur dioxide is used as a preservative because it works as an antimicrobial preventing the growth of bacteria, mold, and fungus; an antioxidant preventing rancidity; and as a chemical that attacks enzymes that cause discoloration, ripening, and rotting, usually in fruits after harvest.” So we’re faced with a trade-off: either eat chemicals with our food, or deal with a shelf life that’s shorter than the attention span of a goldfish.

And what about the very first entry on MedIndia’s list—sodium nitrate? Again, it is used as a way to greatly extend the shelf-life of certain meats—ostensibly a good thing. But the Mayo Clinic reports, “Sodium nitrate, a preservative that’s used in some processed meats such as bacon, jerky and luncheon meats, could increase your heart disease risk. Aside from the salt and saturated fat in these meats that can disrupt a heart-healthy diet, sodium nitrate also may harm your heart. [And, I might add, many of these processed meats are on Yahweh’s ‘Don’t Eat’ list in Leviticus 11.] It’s thought that sodium nitrate may damage your blood vessels, making your arteries more likely to harden and narrow, leading to heart disease. Nitrates may also affect the way your body uses sugar, making you more likely to develop diabetes.”  

Staying away from dangerous food additives is not as easy as it looks. Take the case of carrageenan, a “food grade” ingredient made from red seaweed—a substance whose lobby has worked (and spent) hard to encourage us to accept it as a “safe and natural food additive” because it does its job so well. It’s in so many products these days, it’ll make your head swim. WholeGreenLove.com (April 7, 2013) reveals, “It is not used for nutritional value, flavor, or color, but is instead used in many products as a thickener, giving food a fattier texture or sensation on your palate, imitating a thick, rich, full-fat food option…. It is also used as a stabilizer, so it creates an even texture and consistency throughout the entire product. (An example would be a beverage with particles that naturally separate. Instead of having to shake it to reincorporate it, it just stays evenly mixed.)”

Carrageenan is found in all sorts of things: “Dairy: ice cream, chocolate milk, sour cream, cottage cheese, whipping cream, squeezable yogurts…. Dairy alternatives: soy milk, hemp milk, almond milk, coconut milk, a good majority of the previously listed alternative products, i.e. soy yogurt, pudding, ice cream, etc…. Meats: prepared chicken products and sliced turkey…. Nutritional Drinks: Ensure™, SlimFast™, Carnation Breakfast Essentials™, and Orgain™…. Prepared Foods: microwaveable dinners, canned soups, broths, frozen pizza, even pet foods.”

Okay, so it’s ubiquitous, as well as being useful. But what’s wrong with it? “The way carrageenan is chemically structured triggers an autoimmune response. Autoimmune responses lead to inflammation within the body.” Yeah, I think we’ve danced to this tune before. “The inflammation has been noticed more specifically in the gastrointestinal system, ranging from ‘belly bloat,’ to irritable bowel syndrome (IBS), to inflammatory bowel disease (IBD). Prolonged inflammation within the body is a precursor to more serious health morbidity.” Inflammation can and does lead to such conditions as rheumatoid arthritis, arteriosclerosis, and cancers. “There is concern by scientists that the acidity found within the stomach causes ‘food grade’ carrageenan to ‘degrade,’ which would expose your digestive system to a recognized carcinogen…. Numerous studies have been published identifying carrageenan’s unique chemical structure and how it triggers an immune response in the body, which is similar to the effects of pathogenic bacteria like Salmonella.”


Another insidious “additive” to many people’s dietary environment in the U.S. is the fluoride that has been added to many municipal water supplies since the 1940s. There has been a raging controversy from the very beginning about the practice’s purpose, implementation, ethics, and efficacy, ranging from “You all need this stuff to ward off tooth decay” to “It’s a Communist plot—they’re out to get us.” Dr. Paul Connett, writing for FluorideAlert.org, lists fifty reasons to oppose fluoridation—while pointing out that neither extreme of the argument is accurate. I’ll refrain from quoting the whole list, but merely report the factors that stood out to me:

Fluoride is the only chemical added to water for the purpose of medical treatment, but it is done without anyone’s informed consent—a key reason why many European nations have ceased the practice. The dose cannot be controlled, since people don’t drink the same amount of water, nor can the “patient’s” need or vulnerability be individually assessed. Even if fluoride is efficacious in warding off tooth decay (something still being debated), other sources of dietary fluoride are now available. It is not an essential nutrient like vitamins or minerals are: no disease (even tooth decay) is caused by “fluoride deficiency.” Worse, fluoride tends to accumulate in a person’s bones over his entire lifetime.

Furthermore, no one is monitoring fluoride exposure for side effects. In fact, there has never been a single randomized controlled trial to demonstrate (or disprove) its efficacy or safety. Statistically, fluoride is not necessary to control tooth decay. In fact, tooth decay rates remain high in “low-income” communities that have been fluoridated for years, and the rate of dental disease does not rise when fluoridation is halted. While the rate of dental problems has declined in “fluoridated” countries like the U.S. and Ireland, it has also declined—and to a much greater degree—in “non-fluoridated” nations like Iceland, Italy, and Japan. (In other words, simple dental hygiene and improved diet are enough to fix the problem.)

Can fluoride actually cause harm? Perhaps. Its efficacy (if it exists at all) is as a topical treatment: there is nothing to be gained by ingesting the substance. Dental fluorosis is a discoloring of tooth enamel caused by too much fluoride in the diet—especially before the teeth erupt. Because there is so much more fluoride in tap water than in breast milk, bottle-fed babies are receiving far too much fluoride for their body weight. There is also some evidence that fluoride builds up in the bones over time, causing skeletal fluorosis, a bone disease marked by rheumatic attacks, pain, stiffness, and damage to bones and joints. The incidence of bone fractures, especially hip fractures in the elderly, is statistically higher in subjects with lifelong fluoride exposure. According to animal experiments, excessive fluoride may also cause brain damage, learning disabilities, lower I.Q., impaired visual-spatial organization, and behavioral issues. Some studies have linked fluoride consumption to impaired thyroid and pineal gland function, the early onset of puberty, impaired kidney function, and arthritic symptoms. Animal studies have proved that high concentrations of fluoride play havoc with the male reproductive system, damaging sperm and increasing the rate of infertility. It may even be a factor in bone cancer (osteosarcoma).

So if the Commies were really that smart, they would have pushed fluoride onto the American populace. Of course, that’s like saying we introduced vodka and atheistic Marxism to the Russians in order to destroy their society. The effect doesn’t necessarily prove the cause.

And speaking of teeth, risk, and statistics, there is one very common dental procedure—performed twenty-five million times annually in the United States—that bears a shocking statistical correlation. It turns out that 97% of terminally ill cancer patients in this country have had root canals sometime in their past. That, if you ask me, seems well beyond the realm of coincidence. Dr. Joseph Mercola, writing for Realfarmacy.com, explains: “Root-canaled teeth are essentially ‘dead’ teeth that can become silent incubators for highly toxic anaerobic bacteria that can, under certain conditions, make their way into your bloodstream to cause a number of serious medical conditions—many not appearing until decades later. Most of these toxic teeth feel and look fine for many years, which make their role in systemic disease even harder to trace back. Sadly, the vast majority of dentists are oblivious to the serious potential health risks they are exposing their patients to, risks that persist for the rest of their patients’ lives. The American Dental Association claims root canals have been proven safe, but they have NO published data or actual research to substantiate this claim.”

How does a root canal procedure compromise your body’s health? “Your teeth are made of the hardest substances in your body. In the middle of each tooth is the pulp chamber, a soft living inner structure that houses blood vessels and nerves. Surrounding the pulp chamber is the dentin, which is made of living cells that secrete a hard mineral substance. The outermost and hardest layer of your tooth is the white enamel, which encases the dentin. The roots of each tooth descend into your jawbone and are held in place by the periodontal ligament. In dental school, dentists are taught that each tooth has one to four major canals. However, there are accessory canals that are never mentioned. Literally miles of them! Just as your body has large blood vessels that branch down into very small capillaries, each of your teeth has a maze of very tiny tubules that, if stretched out, would extend for three miles…. Microscopic organisms regularly move in and around these tubules, like gophers in underground tunnels.

“When a dentist performs a root canal, he or she hollows out the tooth, then fills the hollow chamber with a substance (called guttapercha), which cuts off the tooth from its blood supply, so fluid can no longer circulate through the tooth. But the maze of tiny tubules remains. And bacteria, cut off from their food supply, hide out in these tunnels where they are remarkably safe from antibiotics and your own body’s immune defenses…. Under the stresses of oxygen and nutrient deprivation, these formerly friendly organisms morph into stronger, more virulent anaerobes that produce a variety of potent toxins. What were once ordinary, friendly oral bacteria mutate into highly toxic pathogens lurking in the tubules of the dead tooth, just awaiting an opportunity to spread.

“No amount of sterilization has been found effective in reaching these tubules—and just about every single root-canaled tooth has been found colonized by these bacteria, especially around the apex and in the periodontal ligament. Oftentimes, the infection extends down into the jawbone where it creates cavitations—areas of necrotic tissue in the jawbone itself. Cavitations are areas of unhealed bone, often accompanied by pockets of infected tissue and gangrene. Sometimes they form after a tooth extraction (such as a wisdom tooth extraction), but they can also follow a root canal. According to Weston Price Foundation, in the records of 5,000 surgical cavitation cleanings, only two were found healed. And all of this occurs with few, if any, accompanying symptoms. So you may have an abscessed dead tooth and not know it. 

“As long as your immune system remains strong, any bacteria that stray away from the infected tooth are captured and destroyed. But once your immune system is weakened by something like an accident or illness or other trauma, your immune system may be unable to keep the infection in check…. Nearly every chronic degenerative disease has been linked with root canals, including: heart disease, kidney disease, arthritis, joint, and rheumatic diseases, neurological diseases (including ALS and MS), autoimmune diseases (Lupus and more).

“There may also be a cancer connection. Dr. Robert Jones, a researcher of the relationship between root canals and breast cancer, found an extremely high correlation between root canals and breast cancer. He claims to have found the following correlations in a five-year study of 300 breast cancer cases: 93 percent of women with breast cancer had root canals; 7 percent had other oral pathology; Tumors, in the majority of cases, occurred on the same side of the body as the root canal(s) or other oral pathology.”

Instead of having a traditional root canal, Dr. Mercola recommends removing the tooth entirely and replacing it with a partial denture, bridge, or dental implant, but beyond that, also extracting the periodontal ligament along with one millimeter of the bony socket—breeding grounds for deadly bacteria that have the ability to compromise or overrun our bodies’ immune systems.

My function is not to give medical advice, but merely to point out that the cultural pestilences that afflict us in these last days can be caused or exacerbated by the most common and ostensibly benign practices. Over the past century or so, our civilization has been introducing one factor after another that stealthily eats away at our bodies’ ability to heal themselves as God designed them to do. With 41,000 root canals being performed every day in this country alone, how long can we continue before the sick outnumber and overwhelm the caregivers?


What about the dyes and colorants added to the foods we eat? This is another item that seems to appear on everybody’s no-no list. Forbes.com gives us the scoop: “For centuries, people and companies used dyes derived from natural ingredients to color food. But many of these natural colors contained toxins such as mercury, copper and arsenic. Around the turn of the 20th century, scientists began formulating synthetic colors, derived from coal tar, to replace the existing toxic natural ones. Unfortunately, these synthetic alternatives have proven to have their own slew of problems.

“In 1906, the Pure Food and Drugs Act (a.k.a. the ‘Wiley Act’) instituted the first restrictions on color additives in the United States. In general terms, the law banned artificial colors that proved ‘injurious to health,’ and the government hired chemist Dr. Bernard Hesse to investigate which of the existing 80 dyes being used in foods were safe enough to keep legal. The next three decades saw a process of eliminating colors that caused recurrent adverse health effects in the public. By 1938, only 15 synthetic colors were still legal, and those were subsequently divided into three categories: those suitable for foods, drugs, and cosmetics; those suitable only for drugs and cosmetics; and those suitable only for cosmetics.

“Today only seven colors remain on the FDA’s approved list. Almost every decade, another coal tar issue surfaces, eliminating more and more of the artificial additives in America. For example, after Halloween in 1950, the government banned Orange #1 when many children became ill after consumption. In the 1970s, scientific testing pointed to Red #2’s potential carcinogenic properties (caused intestinal tumors in rats), and it too was banned. Yellows #1, #2, #3, and #4 are among the others that have since been made illegal, and Yellow #5 is currently undergoing further testing for links to hyperactivity, anxiety, migraines and cancer (the color has already been banned in many European countries).

“The link between artificial colors and behavioral problems is a concern, especially for parents of children diagnosed with ADHD. But conflicting results from studies among scientists explains why there are still seven approved colors in the United States. The CSPI (Center for Science in the Public Interest), a non-profit watchdog group, continues to push to ban the existing dyes, or at least apply warning labels on products that contain them, like the E.U. does for six. After a study in 2007 at the University of Southampton, the six dyes that came to be known as the ‘Southampton Six’ were linked to hyperactivity in children, and now require warning labels in the E.U. The FDA, however, is not so convinced that such measures are necessary.”

Considering the role the FDA plays these days, however—that of protector of big agribusiness political contributors above all else—it seems that “necessary” warnings are reserved for when people begin falling over dead in the streets—not “necessarily” when prudence would suggest consumer caution. It took the government decades to admit the health risks of tobacco use, and they still didn’t ban it (they’d learned that lesson with alcohol back during Prohibition). They waited as long as they could (1964), when they were faced with a growing mountain of evidence. They then slapped a warning label on it, regulated it state by state, and taxed it within an inch of its life (not that I’m sorry I don’t have to breathe in somebody else’s second-hand smoke every time I go out anymore).

Once again, I’m not trying to alarm you, or even suggest that you change the way you eat in hopes of staving off disaster. (Disaster is on its way, no matter what you do.) I only want to awaken you to the fact that the hour may be later than you think. The “deck” is stacked against the human race: the prospects for our long-term survival look less likely with every new revelation. Even when our “benefactors” try to get it right (and I truly believe that the agribusiness people, politicians, and medical professionals mean well), they somehow end up getting it wrong. It’s not just food additives: no matter what direction you look, no matter what area of common human experience you examine, the track we’re on seems destined for a colossal train wreck not too far down the line. There is hope—even assurance of deliverance—but it lies not in fixing what’s wrong with our world, but in letting God rebuild it to His own perfect standards. I’m afraid we don’t even know what that looks like anymore.


We couldn’t leave the subject of “culturally inflicted disease” without touching on the issue of illicit drugs—voluntary pestilence designed to alter one’s connection with the world in which he lives. The breadth of the subject varies, depending on how “strict” you are: we could begin with caffeine in coffee, tea, or soft drinks. Alcohol abuse has been a perennial problem since man learned how to brew beer (three days after he discovered fire). Legal prescription drugs are not always used according to what’s prescribed by the doctor. We could discuss performance enhancers (steroids, etc.), nicotine in cigarettes, or even explore “natural” drugs like endorphins and adrenaline. But for our purposes here, I’m going to be concentrating on the sort of “drug abuse” that’s become endemic over the past century or so—the kinds of substances that are calculated to make people a danger to themselves or others.

Believe it or not, such drug use is mentioned—and condemned—in the Bible. For example, Paul writes, “Now the works of the flesh are evident, which are: adultery, fornication, uncleanness, lewdness, idolatry, sorcery, hatred, contentions, jealousies, outbursts of wrath, selfish ambitions, dissensions, heresies, envy, murders, drunkenness, revelries, and the like; of which I tell you beforehand, just as I also told you in time past, that those who practice such things will not inherit the kingdom of God.” (Galatians 5:19-21) Did you catch it? It’s not “drunkenness,” as we might have guessed. It’s the word translated “sorcery.” The Greek word pharmakeia denotes “the use or administering of drugs; poisoning; sorcery, magical arts, often found in connection with idolatry and fostered by it.” (Strong’s) Zodhiates notes that the word connotes either a curative/medicinal drug or a poisonous one, and again ties the word’s usage to the occult, sorcery, witchcraft, illicit pharmaceuticals, trances, and magical incantations with drugs. The Dictionary of Biblical Languages characterizes pharmakon (the related noun) as a “black magic potion.”

As if to emphasize the role of drugs in the Last Days, the word is used twice (the only other scriptural instances) in the book of Revelation. In reference to the sixth trumpet judgment (i.e., toward the end of the Great Tribulation), John reports, “But the rest of mankind, who were not killed by these plagues, did not repent of the works of their hands, that they should not worship demons, and idols of gold, silver, brass, stone, and wood, which can neither see nor hear nor walk. And they did not repent of their murders or their sorceries [i.e., pharmakon—drug fueled idolatry] or their sexual immorality or their thefts.”  (Revelation 9:20-21)

The surprise comes in the context in which the final use of the word is found: “For your [i.e., commercial Babylon’s] merchants were the great men of the earth, for by your sorcery [pharmakon—drugs] all the nations were deceived. And in her was found the blood of prophets and saints, and of all who were slain on the earth.” (Revelation 18:23-24) The passage speaks not about overt idolatry or drug-fueled pagan orgies, but rather a worldwide system of commerce and industry—something so blatant and obvious in our world, we don’t even see it. It’s a big part of something code-worded “Babylon,” the pervasive, systematic rejection of Yahweh’s truth in favor of man-made counterfeits. (Two other “flavors” of “Babylon bouillabaisse” taken to task in scripture are religion and political-military pursuits—but they all smell pretty fishy.) Here we see a not-so-subtle warning against buying into the lie that there is (or should be) a “magic pill” (whether literal or figurative) to fix all of the world’s ills.

The pharmaceutical industry is most certainly part of this, pushing pills and potions, vaccines, antibiotics, and false hope to people who aren’t willing to trust God with their physical welfare. They pander to people desperate to cling to their fleeting youth, as if we aren’t all destined to be dust. The industry plays both sides of the street. On the one hand, they make products designed to help a man achieve an erection without emotion, or enable a woman to look and feel sexually attractive long past her childbearing prime (both of which seem desirable when sex is divorced from marriage). Then they manufacture condoms and birth control pills to deal with the inevitable result—or failing that, provide tools and drugs designed to facilitate the abortions of tens of millions of unwanted and unloved children in the womb every year. There are big profits to be made in AIDS research and antiretroviral drugs, designed, of course, not to cure the self-inflicted disease, but merely to mask the symptoms, making it seem almost “okay” instead of the moral curse it actually is. It’s all legal, socially acceptable, and very, very profitable. So the Bible identifies these merchants of death: they’re not just inner city gangsters, pimps and drug dealers; rather, they’re hailed as the “great men of the earth,” corporate giants who make a literal killing “deceiving all the nations.”

This bit of semi-satirical fluff from RealFarmacy.com sort of says it all: “I took aspirin for the headache caused by the Zyrtec™ I took for the hay fever I got from the Relenza™ I took for the upset stomach and flu-like symptoms caused by the Viagra™ I took for the erectile dysfunction caused from the Propecia™ for the hair loss caused by the Ritalin™ I took for my short attention span caused by the Scopoderm TTS™ I needed for the motion sickness that I got from the Lomotil™ I took for the diarrhea caused by the Xenical™ I needed for the weight gain that was caused by the Paxil™ I took for the anxiety that I got from the Zocor™ that I’m taking for my high cholesterol because a good diet and exercise is just too much trouble.”  

But think beyond big pharma. They’re only a symptom, a symbol, of what’s really going on—the eternal quest for a “magic pill,” a drug, a pharmakon, that will eliminate the desire or perceived need for a holy God. Note what John says about this permutation of Babylon: “In her was found the blood of prophets and saints, and of all who were slain on the earth.” In their quest for power and profit, these people feel they must attack the only force that stands in their way: the saints (i.e., those set apart to Yahweh) and prophets (the spokespeople of the saints). In the end, God holds Babylon responsible for the blood of all of the earth’s slain—both reprobate and the redeemed.

But for now, let’s just concentrate on one dirty corner of this world—illicit drugs (whether the illegal sort, or those commonly prescribed and abused)—substances designed to mask the reality of life. The DEA (Drug Enforcement Administration) website (Justice.gov) lists the following as the main illicit drugs afflicting our culture today. Broken down by broad category, they are:

Narcotics: Heroin (a highly addictive, rapid acting drug processed from morphine, causing drowsiness, respiratory depression, constricted pupils, nausea, a warm flushing of the skin, dry mouth, and heavy extremities); Hydromorphone (an opioid with an analgesic potency two to eight times that of morphine, causing severe respiratory depression, drowsiness progressing to stupor or coma, lack of skeletal muscle tone, cold and clammy skin, constricted pupils, and reduction in blood pressure and heart rate); Methadone (a synthetic—man-made—narcotic, causing slow and shallow breathing, blue fingernails and lips, stomach spasms, clammy skin, convulsions, weak pulse, coma, and possible death); Morphine (a natural—plant based—narcotic, the principal constituent of opium, effective as a pain reliever, causing cold, clammy skin, lowered blood pressure, sleepiness, slowed breathing, slow pulse rate, coma, and possible death); Opium (a highly addictive narcotic extracted from the opium poppy, the Papaver somniferum, causing slow breathing, seizures, dizziness, weakness, loss of consciousness, coma, and possible death); Oxycodone (a semi-synthetic narcotic analgesic normally prescribed for pain relief or sedation, causing extreme drowsiness, muscle weakness, confusion, cold and clammy skin, pinpoint pupils, shallow breathing, slow heart rate, fainting, coma, and possible death).

Stimulants: Amphetamine (a stimulant that speeds up the body’s system, causing increased blood pressure and pulse rates, insomnia, loss of appetite, physical exhaustion, agitation, increased body temperature, hallucinations, convulsions, and possibly death); Cocaine, including base cocaine, or “crack” (an intense, euphoria-producing stimulant with a strong addiction potential, causing increased alertness and excitation, restlessness, irritability, anxiety, cardiac arrhythmias, ischemic heart conditions, sudden cardiac arrest, convulsions, strokes, and death); Khat (a flowering evergreen shrub abused for its stimulant-like effect, causing grandiose delusions, paranoia, nightmares, hallucinations, hyperactivity, loss of appetite, difficulty with breathing, and increases in both blood pressure and heart rate); Methamphetamine (a stimulant causing increased wakefulness, increased physical activity, decreased appetite, rapid breathing and heart rate, irregular heartbeat, increased blood pressure, and hyperthermia—overheating. High doses can elevate body temperature to dangerous, sometimes lethal, levels, and cause convulsions and even cardiovascular collapse and death. Meth abuse may also cause extreme anorexia, memory loss, and severe dental problems).

Depressants: Barbiturates (drugs that produce a wide spectrum of central nervous system depression from mild sedation to coma, causing mild euphoria, lack of inhibition, relief of anxiety, sleepiness, impairment of memory, judgment and coordination, irritability, paranoid or suicidal ideation, shallow respiration, clammy skin, dilated pupils, weak and rapid pulse, coma, and possible death); Benzodiazepines (depressants that produce sedation, induce sleep, relieve anxiety and muscle spasms and prevent seizures, associated with amnesia, hostility, irritability, and causing vivid or disturbing dreams, shallow respiration, clammy skin, dilated pupils, weak and rapid pulse, coma, and possible death); GHB (Gamma-Hydroxybutyric acid, or sodium oxybate, a versatile date rape drug, causing euphoria, drowsiness, decreased anxiety, confusion, memory impairment, unconsciousness, seizures, slowed heart rate and breathing, lower body temperature, vomiting, nausea, coma, and possibly death); Rohypnol® (a central nervous system depressant used as a date rape drug, causing drowsiness—sedation, sleep—pharmacological hypnosis, decreased anxiety, amnesia, altered reaction time, impaired mental functioning and judgment, confusion, aggression, excitability, slurred speech, loss of motor coordination, weakness, headache, respiratory depression, and possible death).

Hallucinogens: Ecstasy/MDMA (a synthetic chemical that can produce euphoria, feelings of closeness, empathy, and sexuality, as well as confusion, anxiety, depression, paranoia, sleep deprivation, drug cravings, muscle tension, tremors, involuntary teeth clenching, muscle cramps, nausea, faintness, chills, sweating, blurred vision, increased body temperature, resulting in liver, kidney, and cardiovascular system failure—death); K2/Spice (a mixture of herbs sprayed with a compound chemically similar to THC, the psychoactive ingredient in marijuana, causing paranoia, panic attacks, giddiness, increased heart rate and blood pressure); Ketamine (a dissociative anesthetic causing hallucinogenic effects, increased heart rate, involuntary rapid eye movement, dilated pupils, salivation, tear secretions, muscle stiffening, slow breathing, and unconsciousness); LSD (a potent hallucinogen with no recognized medical use, causing dilated pupils, increased body temperature, heart rate, and blood pressure, sweating, loss of appetite, sleeplessness, dry mouth, tremors, psychosis, and possible death); Peyote (a drug made from a small spineless cactus whose active ingredient is the hallucinogen mescaline, causing illusions, hallucinations, altered perception of space and time, altered body image, nausea, vomiting, dilation of the pupils, increased heart rate and blood pressure, a rise in body temperature, in turn causing perspiration, headaches, muscle weakness, and impaired motor coordination); Psilocybin (a chemical obtained from certain mushrooms, causing hallucinations, nausea, vomiting, muscle weakness, psychosis, and possible death); Marijuana/Cannabis (a mind altering or psychoactive drug affecting pleasure, memory, thought, concentration, sensory and time perception, and coordinated movement, causing disinhibition, exhilaration, relaxation, talkativeness, as well as problems with memory and learning, distorted perception, difficulty in thinking and problem solving, loss of coordination, dizziness, tachycardia, facial flushing, dry mouth, and tremors); Inhalants (invisible, volatile substances found in hundreds of common household products, inhaled to induce psychoactive or mind altering effects. Their use can cause damage to the areas of the brain that control thinking, moving, seeing, and hearing, ranging from mild impairment to severe dementia, leading to weight loss, muscle weakness, disorientation, inattentiveness, lack of coordination, irritability, depression, permanent brain damage, heart failure, and possible death).

There’s far more to it, of course. The DEA also lists “Drugs of Concern”: Bath Salts or Designer Cathinones; DXM, Salvia Divinorum, Steroids…. But you get the picture: these drugs (and many others) are taken (or in some cases, given) in order to separate the drugged one, in some way, from reality. The user begins by seeking (and yes, achieving) an artificial state of euphoria or well-being, but then the “side effects” set in—the dementia, nausea, tremors, organ failure, and sometimes even death. Not to mention the fact that while “on” the drugs, the user is invariably a non-productive member of society—unable to attend to his own needs or those of anyone else. And when “coming off” the effects of the drug, the user is likely to be thinking about one thing only: how to get more.

The problem is twofold. First, people feel there is something about their lives from which escape via drugs seems a viable solution. It need not be a “bad” thing like poverty or pain; it could just as easily be the empty feeling that comes from the vague realization that even the best this world has to offer still falls woefully short of the longing for “heaven” we all share—the “God-shaped vacuum” within each of us, aching to be filled. Second, the drugs are habit-forming, if not outright addictive—they are a self-perpetuating prison. WebMD (mostly lifted wholesale from DrugAbuse.gov) notes, “Drug addiction is a chronic, often relapsing brain disease that causes compulsive drug seeking and use, despite harmful consequences to the drug addict and those around them. Drug addiction is a brain disease because the abuse of drugs leads to changes in the structure and function of the brain. Although it is true that for most people the initial decision to take drugs is voluntary, over time the changes in the brain caused by repeated drug abuse can affect a person’s self-control and ability to make sound decisions, and at the same time create an intense impulse to take drugs….

“Drugs are chemicals that tap into the brain’s communication system and disrupt the way nerve cells normally send, receive, and process information…. Nearly all drugs, directly or indirectly, target the brain’s reward system by flooding the circuit with dopamine. Dopamine is a neurotransmitter present in regions of the brain that control movement, emotion, motivation, and feelings of pleasure. The overstimulation of this system, which normally responds to natural behaviors that are linked to survival (eating, spending time with loved ones, etc), produces euphoric effects in response to the drugs. This reaction sets in motion a pattern that ‘teaches’ people to repeat the behavior of abusing drugs. As a person continues to abuse drugs, the brain adapts to the dopamine surges by producing less dopamine or reducing dopamine receptors. The user must therefore keep abusing drugs to bring his or her dopamine function back to ‘normal’ or use more drugs to achieve a dopamine high.”  

The surprising fact is that God Himself built within our bodies the very “drug-delivery system” that humanity universally craves: dopamine, endorphins, adrenaline—all delivered naturally, in the proper “doses,” and as an appropriate response to the various stimuli of a normal, God-centered life—a hug from your children, sex with your spouse, a wonderful piece of music, art, or literature, a glorious sunset, a job well done, and (of course) chocolate.

Since I have never voluntarily taken drugs (even though I attended college in the sixties), you may logically accuse me of being “out of touch” on this issue—concluding that I can’t possibly understand what makes a drug user do what he does. But my few personal brushes with drugs have perhaps given me all the insight I need.

Years ago, I developed a kidney stone. Not knowing what the problem was (but only that I was in severe distress), I found myself in the emergency room being pumped full of fluids to help it pass—along with morphine for the pain. I must admit, the relief—the escape from torment, replaced with an out-of-control sense of euphoria—was a uniquely pleasant experience, one I could easily have sought again, had I been of such a mindset. While “under,” I had absolutely no concern for anything—not my job, my family, my personal responsibilities, or even how I was going to pay for the E.R. I was just floating there on cloud nine, out of it and happy to be so—until the drug wore off and I returned to planet earth, which was still by no means perfect, though it was where I belonged and where I was needed. For the first time in my life, I could “see” the attraction one might have to taking drugs.

Three times I have had terminally ill children in my life undergoing hospice care. (Four of our nine adopted children were severely handicapped when we got them.) The first two died decades ago, but the passing of the third is still quite fresh in my mind. During her last week of life, our daughter Marianne was given some sort of narcotic pain killer to keep her comfortable. It was a good news, bad news story: yes, she was free of pain, but she was also unable to lucidly communicate with her mother and me. I know she wanted to ask me probing questions about her destination (since she was always bubbling with interesting queries), but she couldn’t quite find the words in her drugged state, and I couldn’t be sure she understood my answers or assurances, either.

Those two anecdotes point out the two-edged sword that is drug use: on one hand, people take drugs for a reason—sometimes legitimate, sometimes not so much. And most of them (especially the ones that were designed to deal with real medical problems, like Oxycodone) do their intended jobs reasonably well, if taken “as directed” and only for short periods of time. On the other hand, like virtually anything you can put into your body, these drugs can be abused—used for purposes God never intended for the body to experience. As I’ve said until I’m blue in the face, we were put here on earth to do one thing: to choose whether or not to receive and reciprocate the love of our Creator, Yahweh—demonstrating that choice by how we live our lives.

What we choose will determine our disposition throughout eternity—life, death, or damnation. The problem with drugs is that they compromise our ability to make rational decisions as we walk through life. To one extent or another, they rob us of the capacity for self-determination, if only temporarily. While under their influence, we are not fully able to exercise God’s most fundamental gift to us: free will. We have surrendered our rights and prerogatives to a mere chemical. Satan, of course, doesn’t want us to ponder our place in God’s universe. He squeals with delight whenever we submit ourselves to anything that precludes rational thought, and drugs fill the bill nicely. Our minds define us: it is therefore in the devil’s interests to see us “out of our minds,” distracted at best, and incapacitated at worst.

Like extramarital sex, illicit drugs are a self-inflicted pestilence. But unlike other diseases, this one has a profit motive. It has been calculated that the illegal drug trade comprises almost one percent of the entire world’s economy. Because it’s illegal, certain realities conspire to make the dope business much more destructive than any other “industry.” Organizations involved in the drug trade—from international cartels to local street gangs—think nothing of intimidation, bribery, and murder in order to protect their obscene profits. Because no taxes are paid, the profits are enormous—large enough for the cartels to “own” whole governments. And because the “product” is addictive (whether physically or psychologically), it “pays” the dealers to introduce as many people as possible to their false solution to a basic human need. Those who become dependent on these drugs, of course, tend to get to the point where they think of nothing else—forsaking families, careers, and life itself in pursuit of that ever-more-elusive feeling of euphoria.  

The NPR.org website (August 28, 2013) makes the case that drug abuse as a disease is far more serious a problem than the generally accepted statistics would indicate: “Mental disorders and substance abuse are the leading causes of nonfatal illness on the planet, according to an ambitious analysis of data from around the world. A companion report, the first of its kind, documents the global impact of four illicit drugs: heroin and other opiates, amphetamines, cocaine and cannabis. It calls illegal drugs ‘an important contributor to the global burden of disease.’ The two papers are being published by The Lancet as part of a continuing project called The Global Burden of Disease.” Mental disorders are grouped statistically with substance abuse because the two things are linked both causally and medically: drug abuse can cause mental illness, and mental illness is invariably treated with yet more drugs. I find it fascinating that their focus is on the “global burden of disease.” How many times in this essay have I been compelled to wonder at what point society will finally cease to function altogether because those in need have outnumbered and overwhelmed those able to help?

“The results stand out from previous disease rankings—and conventional thinking—which tend to focus on mortality as the most important metric. While mental illness and substance abuse do lead to premature deaths, the authors say, that’s hard to track because deaths are usually ascribed to the immediate physical cause rather than the underlying reason. For instance, suicides are often categorized as deaths due to injury, and overdoses of illicit drugs are often coded as accidental poisonings.

“When researchers use a different lens—not mortality, but illness and disability—the global burden of mental disorders and illicit drug use becomes clear. They add up to nearly a quarter of the total disease burden, more than any other cause worldwide. The reports couch it in ‘DALYs’—Disability-Adjusted Life-Years. One DALY is a year of healthy life that is lost to disease. By that measure, mental and substance abuse disorders cost nearly 184 million years of healthy life in 2010.” What I’d like to know is: how much coke does a man have to snort, and how much ganja does he have to smoke, before he is fit only for prison or politics? The current president of the United States (as I’m writing this) was legendary in his college years for one thing only: doing dope. The result (if not mere coincidence) was the most far-reaching case of narcissistic personality disorder since Nero, negatively impacting hundreds of millions of lives. So it’s apparently true: “The burden of mental disorders and substance abuse far outweighs the resources devoted to preventing and treating them. A third to a half of people with mental and drug use disorders go without treatment in wealthier countries, and up to 85 percent go untreated in less-developed countries, a previous study reported.”
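The DALY metric quoted above has a standard definition: years of life lost to premature death plus years lived with disability. The sketch below applies that formula to made-up figures purely to illustrate the arithmetic; none of the numbers come from the Lancet study.

```python
# DALY = YLL + YLD: years of healthy life lost to early death (YLL)
# plus years lived with disability (YLD), weighted by severity.
def dalys(deaths, years_lost_per_death, cases, disability_weight, years_with_condition):
    yll = deaths * years_lost_per_death                      # premature mortality
    yld = cases * disability_weight * years_with_condition   # nonfatal illness
    return yll + yld

# Hypothetical condition: 1,000 deaths costing 30 years each, plus
# 50,000 nonfatal cases at a 0.3 disability weight for 10 years.
print(dalys(1000, 30, 50000, 0.3, 10))  # 180000 lost healthy years
```

Notice that the nonfatal term dominates the total here, which is precisely the Lancet authors’ point: mortality statistics alone understate the burden of mental illness and drug abuse.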

“Untreated,” of course, is a loaded word. “Treatment” as often as not is a euphemism for trading one addiction for another—methadone for heroin, for example. Forced “drying out” is a hit-or-miss proposition, depending as it does on the ready willingness of the patient/addict to want to get clean. Call me hysterical, but the only “cure” worthy of the name is neither psychological nor physical: it’s spiritual. If one is filled with the Spirit of God, he has no need (nor room, I’m guessing) for any other “substance” to control his life. But then, I took the “cure” before I ever had the chance to get “sick,” so I may be deemed unqualified to pontificate on the subject.

Genetic Entropy

If anything is considered “certain” in science these days, it is that in the absence of external impetus, any given system tends to become less organized over time—not more complex. That’s an admittedly simplified expression of the second law of thermodynamics, which states that “the entropy of an isolated system does not decrease.” Entropy, in turn, can be simply defined as the “degree of disorder or randomness in a system.” It’s a measure of the loss of information or order or available energy that can be brought to bear.
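The law paraphrased above can be written compactly. These are the standard textbook forms (the entropy statement of the second law plus Boltzmann’s statistical definition), not anything specific to this essay’s argument:

```latex
% Second law: the entropy S of an isolated system never decreases.
\frac{dS}{dt} \geq 0
% Boltzmann's definition: entropy measures the number of microstates W
% consistent with a system's macroscopic state (k_B is Boltzmann's constant).
S = k_B \ln W
```

Fewer accessible microstates means more order; left to itself, a system drifts toward the vastly larger count of disordered arrangements.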

As a universal principle, we all witness practical entropy increasing every day. Nothing exists that was not created, brought into being, built, or assembled by someone or something that was greater in some way than the thing being made. That is, children are “made” by their parents; cars and houses and toaster ovens are built by their manufacturers; planets and galaxies are created by God. Once built, all of these things require intelligence and energy to be expended in order to grow or improve; in fact, laborious maintenance is needed merely to “keep up” with the relentless march of entropy. But in the end, the things that “are” (i.e., that have already come into being) are all in the process of breaking down, wearing out, becoming less fit for their original purpose. People grow old and die. The things we make wear out, become obsolete, go out of style, outlive their usefulness, and finally disappear. Governments or societies rarely endure longer than a few hundred years. Scientific theories come and go. The pyramids of Egypt will eventually be nothing but dust in the wind. Given enough time, even the stars will burn through their nuclear fuel and implode.

And what about living species on planet Earth? They too are prone to going the way of the dodo bird. It has been estimated that there are 8.7 million species of eukaryotic organisms (that is, those with cells whose organelles are contained within membranes) alive on the earth today (plus or minus a couple of million). This includes Animalia, Chromista, Fungi, Plantae, and Protozoa. Of these, 1.2 million species are known and 7.5 million are only hypothetical. The Discovery Channel website, with more flair for the theatrical, puts the total number of species a wee bit higher: “Scientists have estimated that over the course of Earth’s history, anywhere between 1 and 4 billion species have existed on this planet.” I guess they’re including prokaryotes (bacteria and archaea). “Be it through disease, genetic obsolescence, over-predation or any number of other factors, the overwhelming majority of these species are now extinct. Of these billions of species, roughly 50 million still survive into the modern era. While these numbers are certainly extreme at first glance, it serves as proof that extinction, while a sad occurrence, is a part of life for all living things.” Some have estimated that 99.9% of all species that ever lived on Earth are now extinct.

So, the story seems to be: billions of species down; millions left to go. We’ve witnessed quite a few species go extinct in the past century or two, but we’ve never seen a new one emerge. That is, when a new plant or animal is found, it can be safely assumed that it is merely a variation that never got counted before—one of the hypothetical 7.5 million that are said to be lurking incognito on our planet, awaiting discovery by the taxonomists. The “new” species is always very much like something they already recognize—a frog, shark, cockroach, or whatever. The bottom line must be something of an embarrassment for those who insist on teaching the pseudoscience of evolution as gospel truth (as it is in every public school in the land): extinction is much, much faster than evolution ever was, even in their most hopeful hallucinations.

If you’ll recall from our previous chapter, virtually every phylum of eukaryotic life appeared within a blindingly brief twenty million year span of time, some 540 million years ago—the so-called “Cambrian explosion.” If I may quote from the “Water, Air, and Land” chapter, “This rapid and unprecedented diversification of fauna by itself destroys the Darwinian view of evolution. Darwin knew of the fossil evidence for the Cambrian explosion, and it gave him nightmares. He hoped and prayed (to the God whose reputation he was trying to sabotage) that this evidence was due to an incomplete fossil record, but in the century and a half since he published The Origin of Species, the fossil evidence has only gotten stronger: life as we know it fairly leaped onto the world stage within a very short span of time a little over half a billion years ago.  

“And worse (for them), the gaps between basic kinds of animals—a phenomenon evolutionary theory insists shouldn’t be there—have grown more and more distinct as the fossil record became more complete. The “missing links” they so hoped to find simply don’t exist. Of course, this is exactly what you’d expect to find if a Creator-God purposely introduced life-forms onto the planet the way He described the process in Genesis 1. The Cambrian explosion is apparently the fossil record of the fifth day of Creation. As the earth became ready to receive and support them, God placed (not evolved) more and more advanced animal kinds into the biosphere.”

But since we humans began looking for them, we haven’t witnessed the emergence of a single new kind of plant or animal. Once in a great while, we read of the emergence of a new “species,” but it’s invariably like this one, reported on Discovery.com: “Researchers from the National Institute of Health analyzed the genome of two strains of Anopheles gambiae [mosquito] and found unexpected differences. Their research supports the belief that two new species are emerging. Although they are physically indistinguishable, the two emerging species, called Mopti and Savannah, behave differently and prefer different habitats. They react differently to predators too. Mopti out-competes Savannah when predators are around, but Savannah can overtake Mopti when there are no predators.” Really? By that standard, I became a new species when I changed careers in 1996 and moved my family to the other side of the country. I lost many of my old skills, developed all new ones, and broadened my tolerance for cold winters in the process. Voila! A whole new species of Ken, if you believe in that sort of thing. It’s utter nonsense.  

No, what geneticists and sociologists see in real life is not the emergence of new species—and certainly not new animal “kinds”—but rather genetic entropy on a global scale. Like bacteria and viruses adapting to defend themselves against antibiotics and vaccines, “new species” are merely the result of specialization—of dredging up already existing recessive genes that provide an “edge” in certain environments. Something is gained, but at a cost. Nothing new is being created; the genome is not improving, nor is it becoming more complex. It’s like shuffling a deck of cards: if most of the “face cards” end up on the bottom of the deck, you might win a hand of poker with a pair of nines. If you want to call that evolution, you’re kidding yourself. But genetic entropy is even worse. It’s like every time you shuffle the deck, a card is removed, making it harder—or impossible—to win with “complicated” strategies like straights or flushes. Pretty soon, you’re reduced to simple pairs: brute force and dumb luck.

As if to confirm my hypothesis, Michael Snyder, writing for TheDailySheeple.com (August 26, 2013), published an article entitled “The Human Race Is Dying: DNA Degeneration Would Eventually Lead To The Total Extinction Of Humanity.” As unsettling as that may sound, the science leads us to precisely that conclusion. Don’t blame me if it lines up perfectly with what Bible prophecy tells us is going to happen. Snyder writes, “The human race is dying. It certainly won’t happen this year or even this decade, but the steady degeneration of human DNA would eventually lead to the total extinction of humanity, given enough time. The reason that we are heading toward extinction is the increasing number of mutations that are being passed down from generation to generation.” The burden of a compromised gene pool is not so much that genetic diseases kill people outright, but that those who labor under them are not able to be fully functional members of society: they have no choice but to be part of the problem instead of being part of the solution.

“According to Dr. John Sanford of Cornell University, every one of us already carries tens of thousands of harmful mutations, and each of us will pass on approximately 100 new mutations to future generations. Humanity is degenerating at an accelerating pace, and at some point the number of mutations will become so great that we will no longer be able to produce viable offspring. This is not going to happen in the immediate future, but already signs of DNA degeneration are all around us. Despite all of our advanced technology, genetically-related diseases are absolutely exploding. Our bodies are weak and frail, and with each passing generation it is getting even worse.

“Most people don’t understand this. Most average people on the street just assume that the human race will be able to go on indefinitely.” What was it the Apostle Peter wrote? “Scoffers will come in the last days, walking according to their own lusts, and saying, ‘Where is the promise of His coming? For since the fathers fell asleep, all things continue as they were from the beginning of creation.’” (II Peter 3:3-4) “But the geneticists that carefully study these things understand this stuff. Each generation is successively becoming more ‘mutant,’ and if given a long enough period of time it would mean our end. Dr. Sanford puts it this way: ‘We are a perishing people living in a dying world.’” Perhaps the last-days scoffers should reevaluate their position in light of the information being revealed by today’s geneticists. Perhaps the evolutionists should reevaluate their position in light of the fact that the mutations in our genome are not making the species better.
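The accumulation Sanford describes is linear and easy to sketch. The starting load (“tens of thousands”) and the per-generation rate (~100) are the figures quoted above; the model itself is my own toy illustration, not his.

```python
# Toy model: each generation adds a fixed number of new mutations
# to the load inherited from the previous one.
def mutation_load(initial_load, new_per_generation, generations):
    return initial_load + new_per_generation * generations

# Starting from 30,000 inherited mutations at 100 new per generation,
# 300 generations (~6,000 years) doubles the load.
print(mutation_load(30_000, 100, 300))  # 60000
```

The point of the sketch is that the arrow only points one way: nothing in the model (or, per the geneticists quoted here, in the biology) subtracts mutations back out.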

“In school and in the movies, we are taught that mutants are ‘cool’ and that mutations can be a very good thing. But that simply is not solid science.” X-men and comic book superheroes are fictional characters. That should be self-evident, but I guess it’s not. “The following is how Alex Williams describes the incredibly damaging role that mutations play in our biology: ‘…Directly contradicting mutation’s central role in life’s diversity, we have seen growing experimental evidence that mutations destroy life. In medical circles, mutations are universally regarded as deleterious. They are a fundamental cause of ageing, cancer and infectious diseases. Even among evolutionary apologists who search for examples of mutations that are beneficial, the best they can do is to cite damaging mutations that have beneficial side effects (e.g. sickle-cell trait, a 32-base-pair deletion in a human chromosome that confers HIV resistance to homozygotes and delays AIDS onset in heterozygotes, CCR5–delta32 mutation, animal melanism, and stickleback pelvic spine suppression). Such results are not at all surprising in the light of the discovery that DNA undergoes up to a million damage and repair events per cell per day.’

“So no, we are not going to ‘evolve’ into bigger and better creatures. Instead, the human race is steadily breaking down, and our time is running out.” Sooner than you may imagine, I’m thinking. “In essence, the blueprint of human life is being systematically destroyed, and there is not a thing we can do to even significantly slow it down.” This, once again, flies in the face of the core premise of evolutionary hypothesis: that humanity (along with every other kind of life) is constantly, if slowly, getting better and better. We’re not. Quite the opposite, in fact. We’re not evolving: we’re devolving.

Mr. Snyder’s article goes on to quote more specific evidence. “The following is from a paper by Gerald H. McKibben and Everett C. McKibben: ‘…Geneticists have long worried about the impact of mutations on the human population, and worried that at a rate of one deleterious mutation per person per generation, genetic deterioration [of our entire species] would result. Earlier reports were based on estimates of mutation rates considerably lower than what we now know to be the case. Findings going back to 2002 show that the human mutation rate is at least 100 mistakes (misspellings) per person per generation. Some scientists believe the rate is closer to 300. Even a rate of 100 has profound implications, and the mutation rate is itself increasing. Furthermore, most, if not all, mutations in the human genome must be deleterious. And nothing can reverse the damage that has been done during our own generation, even if further mutations could be stopped. It would appear that the process is an irreversible downward spiral that will end in “mutational meltdown.”’

“So how long do we have until this ‘mutational meltdown’ takes place? Well, according to McKibben and McKibben, Dr. Sanford estimates that the human race has a total lifespan of approximately 6,000 years.” Funny: that’s precisely what Yahweh said, in so many words: 6,000 years before a thousand-year period of “rest” under His Personal care would be required—the kingdom age. In fact, factoring in Yahweh’s omniscience, our susceptibility to mutations could well be the basis of the Law of the Sabbath. Maybe the reason God equated “one day with a thousand years” (II Peter 3:8, Psalm 90:4) is that He knew living in a fallen world would subject the collective genome of the human race to a mutational burden that couldn’t be sustained much beyond six thousand years. Not to belabor the point, but the whole premise of this series of appendices is to demonstrate that however you parse the data, unregenerate man will have run his course on this planet by the fourth decade of the twenty-first century—to be specific, October 8, 2033. You may as well stick a fork in us, ’cause we’re done. It’s the end of the world as we’ve known it (which is not to say it’s actually the end).

“The author cites research showing that the human race is currently degenerating at 1-2 % per generation due to accumulation of mutations. At a 1% decline in fitness per generation, there is a sharp reduction in fitness after 300 generations (about 6,000 years).” Remember, under Darwinian dogma, “fitness” is the sole criterion for survival. Oops. I should point out that Adam and Eve were the progenitors of a whole new—and genetically pristine—species introduced by Yahweh about 6,000 years ago. They (according to the scriptural record) were not descended from the proto-human species (e.g., Neanderthals, Cro-Magnons, etc.) that apparently roamed the earth for several hundred thousand years—man-like animals not equipped with a neshamah (the “breath of God” that makes possible Spiritual indwelling—see Genesis 2:7).
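The “1% decline in fitness per generation” figure quoted above compounds geometrically, which is why the drop-off after 300 generations is so sharp. A minimal sketch (the rate is the article’s claim, not an established number):

```python
# Compounded fitness remaining after n generations of a fixed fractional decline.
def remaining_fitness(decline_per_generation, generations):
    return (1.0 - decline_per_generation) ** generations

# At 1% per generation, roughly 5% of original fitness survives
# 300 generations (about 6,000 years at 20 years per generation).
print(round(remaining_fitness(0.01, 300), 3))  # 0.049
```

Doubling the rate to the article’s upper bound of 2% per generation shrinks the survivor to a fraction of one percent, so the “about 6,000 years” figure is, if anything, the optimistic end of the quoted claim.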

“One of the most interesting revelations in Genetic Entropy is Dr. Sanford’s and other workers’ analysis of the Biblical account of life expectancies. In a statistical regression analysis of declining life spans since Noah (who lived 950 years), after 32 centuries since Noah the life expectancy has declined to about 70. The remarkable aspect is that this curve, which shows a sharp drop-off after Noah and a more gradual decline about 1,000 years ago [sic. I think he meant to say, “...a more gradual decline after about 1,000 years had passed—which would make it about the time of Abraham and Job], is that it is very similar to theoretical curves presented by other researchers that show genetic degeneration. Either Moses faithfully recorded the events (and ages) recorded in Genesis, or he was a skilled statistician who made up data with a remarkable fit to an exponential curve!” Actually, the evidence indicates that Moses compiled the chronological and historical data in Genesis from extant—and already ancient—written records.

“Other scientists put the lifespan of the human race significantly higher [as I said, it all depends on how you define “human”], but without a doubt there is a growing awareness in the scientific community that the human race is slowly heading toward extinction. This is how Alex Williams puts it… ‘Like rust eating away the steel in a bridge, mutations are eating away our genomes and there is nothing we can do to stop them.’” The problem with mutations, of course, is that they never “mutate” back to the way they used to be. Once they’re in the collective genome, they’re there for good. “Dr. Sanford makes the same point a little bit more eloquently: ‘The extinction of the human genome appears to be just as certain and deterministic as the extinction of stars, the death of organisms, and the heat death of the universe.’”

Yes, that observation is where I began: at the Second Law of Thermodynamics. I have, of course, quoted qualified sources that tended to make my points for me. There is, I should point out, a plethora of supposedly qualified and highly respected “experts” who would have you believe that we’re just fine: Pay no attention to the disturbing rise in genetic entropy you’ll see all around you if you’re foolish enough to open your eyes and pay attention. The folks from Stanford, for instance, insist that “Few mutations are bad for you. In fact, some mutations can be beneficial. Over time, genetic mutations create genetic diversity, which keeps populations healthy. Many mutations have no effect at all.” Ah, yes: genetic diversity (a.k.a. keeping the races separate, a.k.a., Apartheid), so philosophically dear to the hearts and minds of liberal progressives. I would simply remind you that these are the same people on the same website, who, when asked, “Are GM foods bad for me?” piously opine, “Scientifically there is nothing about the process of genetically modifying a food to make it dangerous. All a scientist does is add one or at most a handful of new genes to the crop. Since plants have tens of thousands of genes, adding a few extra is really no big deal.” A little bit of arsenic surely won’t do you any harm. Knowing what you and I now know about GMOs, we can spot a bald-faced lie (told no doubt in order to keep the funding rolling in) when we see it.

So the Stanford people feel that “Few mutations are bad for you.” Really? Wikipedia lists 771 diseases that are linked to genetic factors. I’d call that more than “a few.” Although these diseases haven’t all been classified as to what sort of disorder they are, there are four basic types: (1) Point mutation, or any insertion/deletion entirely inside one gene; (2) Deletion of an entire gene or genes; (3) A whole chromosome extra, missing, or both; or (4) Trinucleotide repeat disorders, extending the gene in length. The most common of these genetic maladies (listed in alphabetical order) are: 22q11.2 deletion syndrome; Angelman syndrome; Canavan disease; Charcot–Marie–Tooth disease; Color blindness; Cri du chat; Cystic fibrosis; Down syndrome; Duchenne muscular dystrophy; Haemochromatosis; Haemophilia; Klinefelter syndrome; Neurofibromatosis; Phenylketonuria; Polycystic kidney disease; Prader–Willi syndrome; Sickle-cell disease; Tay–Sachs disease; and Turner syndrome. Being merciful, I won’t list all 771 genetic diseases for you. Even the short list is enough to keep the script writers for TV’s “Dr. House” busy for a couple of seasons.

So there it is. Whether or not you “feel sick,” the fact is that pestilence and disease in one form or another are going to increasingly shape your life in the few years left between now and Christ’s kingdom age. Whether communicable or chronic, whether innocently accidental or self-inflicted, whether germ based or genome based, everyone on earth will be touched by disease—if only by being asked to foot the bill for someone who can’t. A strong society requires strong individuals, and there will soon be too much burden for the strong to bear without faltering. What can be done? What will be done? There is only One who can repair the damage, and He has promised to do that very thing: “Bless Yahweh, O my soul; and all that is within me, bless His holy name! Bless Yahweh, O my soul, and forget not all His benefits: who forgives all your iniquities, who heals all your diseases, who redeems your life from destruction, who crowns you with loving kindness and tender mercies, who satisfies your mouth with good things, so that your youth is renewed like the eagle’s.”  (Psalm 103:1-5) I’m looking forward to the day when these things are literal and overt truths, not just promises we know to be trustworthy because of the Holy Spirit dwelling within us.  

(First published 2014)