Written with Charles Cho
Food allergies are a rapidly growing medical and public health problem. Recent studies estimate a prevalence of 5% in children younger than 5 years and 4% in adults. In severe cases, sufferers can experience anaphylaxis and even death if exposed to a food to which they are sensitive. There is no known cure; rather, doctors recommend that the sufferer avoid exposure to the allergen. The mechanism of disease is thought to be immunologic.
A number of drugs can be used during an acute food allergy attack, but only one — intramuscular injection of epinephrine — immediately resolves all of the symptoms associated with the episode. Tellingly, epinephrine is a neurotransmitter/hormone of the autonomic nervous system (ANS) that augments sympathetic function. Emerging data in the literature support a neuro-immune connection, particularly in light of how the ANS innervates and regulates lymphoid tissues and other constituents of the immune system. It is possible that food allergy syndrome (and perhaps all cases of anaphylaxis) may require both an allergic sensitivity and an underlying inability to generate an adequate sympathetic response (or an underlying parasympathetic/vagal dominance).1
This year marks the 50th anniversary of “American Girl in Italy”, Ruth Orkin’s iconic photograph of men ogling Ninalee Craig as she walked down a street. Despite the cultural stigma now attached to the practice, rubbernecking at young, beautiful women is an everyday phenomenon. From an evolutionary perspective, prehistoric males who did not instinctively attune to visual cues of potential mates with beneficial traits and fecundity would have faced adverse natural selection pressure. Presumably, males living today have inherited the tendency to rubberneck nubile females because such proximate behaviors translated into improved ultimate fitness during natural selection.
In similar fashion, people driving by a car accident turn their heads to look. In nature, an organism that does not attune to signals of carnage ignores potentially useful cues of threat in its vicinity and could be subject to elimination. Our tendency to rubberneck trauma, then, is an adaptation inherited through the survivorship bias of those who attended to cues of stress, a habit that improved their Darwinian fitness.
Ethylene is a gaseous hydrocarbon with the molecular formula C2H4. It is commonly produced when hydrocarbons are exposed to oxidative stress, such as that generated by lightning, volcanic eruptions, forest fires, and photochemical reactions at the ocean surface. Plants co-opted ethylene biosynthesis during evolution to manage their response to oxidative stress from biotic and abiotic sources. Further exaptations of ethylene include the modulation of plant life-history events such as development, transformation, senescence, and death.
Due to a number of factors described below, humans may be subject to increasing ethylene exposure. The potential health consequences of ethylene exposure are not part of the public consciousness and warrant further exploration.
In a monastery in New Hampshire in 1981, two groups of men in their seventies and eighties relived the 1950s. The men talked about the launch of the first United States satellite and Castro’s victory. They watched Anatomy of a Murder and black-and-white television, and read back issues of magazines. They discussed the sports figures of the 1950s. The first group pretended they were really experiencing the 1950s for the first time, whereas the second group simply remembered what it was like to live in that period.
Afterward, the men’s minds and bodies were tested — both groups performed better physically and mentally. However, the men who pretended they were youthful again, as opposed to those who reminisced, demonstrated a dramatic improvement in performance. The youth had awakened within them.
Most people believe that aging is inevitable: our bodies decay in a process that culminates in death. Through the study of the two groups of men, psychology professor Ellen Langer found that ideas internalized in childhood can shape the aging process. In fact, research shows that finding the Fountain of Youth is not as far-fetched as it may seem, and that the potential for immortality lies within our own bodies.
Children are trained to count linearly: one, two, three, four, five, and so on. Long before mathematics was invented, however, humans used a subjective process of estimation to quantify and make decisions. If the ability to appreciate quantities in linear terms confers a fitness advantage, that edge appears to have eluded Darwinian selection. Studies of the Amazonian Mundurucu indigenous tribe and of preschool American children suggest that humans are innately wired to use a compressed scale to understand magnitude – not unlike those depicted by logarithmic, exponential, or power-law functions. A compressed scale is biased toward higher resolution at the lower end of the spectrum, where smaller numbers reside and where discriminating subtleties in degrees of scarcity can provide the greatest benefit. Psychophysical studies of the subjective estimation of sensory inputs such as light and sound intensity also reveal innate mapping of signals on compressed scales. From an adaptive perspective, a compressed scale of subjective estimation enables a wider dynamic range of sensory processing, which is valuable in interpreting environmental signals.

The hypothesis that selective pressures favored the cognitive adoption of a compressed scale for subjective estimation is consistent with the reality that natural phenomena generally unfold through iteration, yielding patterns of development best understood through the prism of compounding rather than the lens of linearity. Like an intellectual slide rule, modern mathematics reprograms children. It obligates them to abandon their natural cognitive tendencies, which rely on compressed scales and estimation, and coerces them into adopting linear scales that provide uniform resolution along the entire span. It resigns them to a wholesale exercise of indiscriminate precision with respect to all things.
This force-fed mental framework may help individuals thrive in the artificiality of our modern socio-cultural-economic landscape, replete with man-made straight lines and standardized tests. However, we believe that the conflict between our innate instinct to estimate on a compressed scale and our learned ability to quantify on a linear scale is a source of profound decision dysfunction in the modern world, particularly impairing the ability to assess the possibilities of outlier outcomes.
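The difference between the two scales can be made concrete with a small numerical sketch. The logarithmic mapping below is an illustrative assumption in the spirit of the Weber–Fechner law, not a claim about the exact form of human magnitude perception: under a compressed scale, one "unit" of perceived change corresponds to a small absolute step among small quantities and a much larger absolute step among large ones.

```python
import math

def perceived(magnitude, k=1.0):
    """Weber-Fechner-style compressed mapping: perceived intensity
    grows with the logarithm of the stimulus magnitude."""
    return k * math.log(magnitude)

# Stimulus change needed for one unit of perceived change,
# starting from a small magnitude (2)...
low = math.exp(perceived(2) + 1) - 2

# ...versus starting from a large magnitude (200).
high = math.exp(perceived(200) + 1) - 200

print(round(low, 1))   # a step of roughly 3.4
print(round(high, 1))  # a step of roughly 343.7
```

The same one-unit perceptual step thus resolves differences of about 3 near the low end but only differences of hundreds near the high end, which is exactly the scarcity-sensitive resolution described above; a linear scale, by contrast, spends its resolution uniformly.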
The importance of eating a balanced diet took on major cultural significance in this country when the U.S. Department of Agriculture released its first Dietary Guidelines for Americans in 1980 — a response to an increase in heart disease amongst Americans in the 1960s. The Guidelines are updated every five years to incorporate the latest advances in medical and scientific research, based on the recommendations of the 11-member Dietary Guidelines Advisory Committee, a group of widely recognized nutrition and medical experts. The U.S. government directly or indirectly feeds approximately 54 million people daily according to these guidelines — including over 25 million school children. These numbers are not lost on those in the food, agriculture, and diet industries, who are all busy promoting their particular points of view. They work to install members on the committee whose support they can count on, ostensibly to ensure that the committee itself has a ‘balanced’ view of diet and nutrition. In such a politically charged environment, what do we end up with? A ‘balanced’ diet indeed, with a little something on everyone’s plate.
While the experts disagree on what constitutes a diet balanced for optimal health, most presume the need for ‘balance’ and the importance of consuming a wide variety of foods. The guidelines have changed over time, most recently adding an emphasis on physical activity to offset caloric consumption. The debate, however, remains largely centered on which foods reside at the top and which languish at the bottom of the food pyramid, rather than on the validity of the approach itself.
Some scientists call for a bigger dose of evolution in doctors’ educations
By MITZI BAKER
Joon Yun, MD, began considering how evolution applies to human health a decade ago when his first heart disease patients died. These cases disturbed Yun, then a Stanford radiology resident. But they also intrigued him.
Having studied evolutionary biology in college, Yun tried fitting these medical failures into that framework.
His mind wandered to the early days of humans, when heart disease was a rare trigger of death. In the prehistoric era, a more likely cause of death would have been an attack by a predator. The human body handles this type of assault by immediately springing into action: the blood clots and the blood vessels tighten to slow blood loss, and inflammation kicks in to combat infection. The genes governing these responses to trauma presumably were favored during evolution and have become the “factory setting” in modern humans.