Friday, June 28, 2013

The time you have left

I ran across some YouTube videos today by a guy who calls himself zefrank. As I wandered through them, I found many of them quite funny. His humor is thoughtful and makes you think.

One, however, was a bit more serious, and got me thinking about my own life. It's called The Time You Have Left. I invite you to watch this one and see if you don't walk away with the same questions I did. I won't spoil the ending. It's important for you to take the journey there yourself. I think it will be worth it.

Thursday, June 27, 2013

Body Language: (Don't) Read my lips


This is a reprint from the Wednesday, June 26, 2013 edition of eSkeptic Magazine. I always thought that body language would give me a little insight into people's behavior. I guess I was wrong, but I suppose it still makes good TV.

Body Language: (Don't) Read my lips

By Karen Stollznow


“When you run your hands through your hair like that, it makes me think you’re flirting with me,” a colleague once said. Someone’s been reading those self-help books about so-called “Body Language.”


“Maybe I’m not flirting,” I replied, “but instead I’m just getting my hair out of my eyes, or detangling my hair, or it’s a nervous habit, or I have dandruff, or I’m readjusting my wig. There are many possible reasons for any action.” But not everyone believes that.


Linguistics, kinesics and semiotics are among the academic disciplines that attempt to observe and describe gesture and other forms of non-verbal communication. On the other hand (excuse the pun), body language is the new age, self-help interpretation of behavior. Body language “experts” claim they can “read” posture, facial expressions, and other body movements.


The theory seems to be that we conceal the truth with language, but we reveal the truth with body language. Experts on reading body language can supposedly decode our thoughts and feelings, disclose innermost desires, unlock the powers of intuition, provide searing insights, expose secrets, and uncover the hidden meanings behind our behavior. Julius Fast, author of the seminal book Body Language, says: “Your body doesn’t know how to lie. Unconsciously it telegraphs your thoughts as you fold your arms, cross your legs, stand, walk, move your eyes and mouth.”


Self-proclaimed body language specialists fancy themselves as behavioral scientists, or detectives who see clues in our cues, inspired by Sherlock Holmes and TV shows such as Lie To Me. They are motivational speakers, authors of self-help books, and gossipmongers of tabloids and the talk show circuit. They analyze photographs of celebrities to supposedly decipher personality traits, to guess who’s in bed with whom, and predict who’s dumping whom. They interpret footage of political debates, speeches and interviews, to uncover underlying meaning and detect deception.


With the help of these body language experts, you can be a success. Buy their books, attend their seminars, follow their programs, systems, and methods, and you too can harness these techniques to achieve your goals. They will provide you with the tools you need to give you confidence, to catch a liar in the act, to ace that job interview, to attract love, and to stop sending out those “wrong messages.” Some fear that body language is so accurate that reading people’s behavior is an invasion of privacy, or a type of unethical manipulation.


Proponents of body language theories claim there is standardized meaning in the way you sit, stand and walk, how you shake hands and lick your lips, or even run your hands through your hair. Treating behavior like a visual polygraph test, they believe that touching your nose a certain way while answering a question, or an involuntary, subtle shift of your eyes, betrays a lie. Crossing your arms indicates defensiveness, or a lack of openness, while rubbing the stem of a wine glass suggests that subconsciously, you want to rub something (or someone) else.


Body language gurus promote the use of their techniques in conjunction with other alternative therapies. Allan and Barbara Pease, “Mr. and Mrs. Body Language,” are authors of the self-proclaimed “communication bible,” The Definitive Book of Body Language. They recommend the supplementary use of Neuro-Linguistic Programming techniques. Elizabeth Kuhnke, author of Body Language for Dummies, uses body language in conjunction with chromotherapy (i.e., color therapy). Shelly Hagen adopts principles of Feng Shui: “Cashmere and cotton, for example, are soft and inviting, suggesting that the wearer is a gentle soul. Hard materials, like leather or boiled wool, keep others at a distance.”


Some followers believe that body language can do more than reveal if someone is sexually attracted to us. Bioenergetic Analysis is a form of body psychotherapy, pioneered by Alexander Lowen and John Pierrakos. Bioenergetic therapists read posture, gestures and expressions to diagnose their clients, and treat them using exercise, movement and vocal expression. Proponents of Bioenergetic Analysis claim that body language techniques can heal conditions such as anxiety and depression, and even cure cancer.


Like infomercials that claim we only use 10% of our brains, body language theory claims that only 10% (or 15%, or 20%, or 50%) of our communication is verbal. I once taught a course in Verbal and Nonverbal Communication and was expected to teach that human communication consists of 93% body language and 7% language. There has been much research on this topic, but no study has categorically proven there is a precise ratio of verbal to nonverbal communication. This myth seems to have its basis in a study by Albert Mehrabian, who studied the typical responses of people when communicating their feelings and attitudes. James Borg, author of Body Language: 7 Easy Lessons to Master the Silent Language, and other writers generalized these findings to all communication, creating this unscientific sound bite.


Books about body language are full of anecdotes and hype. They include references to authors who support their theories, but no citations to the studies. Their claims, theories and models are unsupported by evidence. They invoke the names of legitimate scholars and irrelevant research to validate the practice. For example, Charles Darwin’s The Descent of Man and Selection in Relation to Sex addresses the importance of gesture and sign language, and suggests that gestures are relevant to the origins of language. Books about body language seize this reference, but they misinterpret and overextend Darwin’s arguments.


Many body language advocates believe in the universality of gestures, behavior and expressions. Tonya Reiman of Body Language University has developed the Reiman Rapport Method, a “ten step process to master universally pleasing body language.” Dr. Paul Ekman has conducted extensive research into classifying human facial expressions. He claims that a small range of expressions, such as those indicating anger and happiness, are biologically universal to all humans. Yet Amrozi, the Indonesian terrorist known as the “Smiling Bomber,” demonstrated that a smile can also signal confidence and defiance. There are immense differences in expressions, habits, behaviors and customs across cultures and individuals.


In a study of behavioral cues, Ekman now claims he has discovered 29 “wizards”: subjects he claims are “geniuses of lie detection.” His findings have led to the use of behavior-detection officers by the Transportation Security Administration (TSA) in the Screening Passengers by Observation Technique (SPOT) program. However, TSA’s SPOT methodology isn’t supported by scientific evidence; nor is Ekman’s work in this study. A group of independent researchers who examined Ekman’s research concluded: “Analyses reveal that chance can explain results that the authors attribute to wizardry.” Moreover, Ekman’s recent research is not peer-reviewed. He has stated that he doesn’t publish his findings anymore because enemies of the United States follow his work closely.


I do not dispute the idea that nonverbal actions carry meaning, or that we are receptive to nonverbal communication. But there is no formula for understanding behavior, and every act has numerous potential meanings and causes. Our “body language” is subject to context, intent and interpretation. It is influenced by culture and socialization and differs at the individual level. Reading body language is simply the subjective interpretation of the observer, and is open to misinterpretation and misunderstanding.


The analogy to “reading” is helpful. Interpreting body language is a kind of cold reading, or a form of body divination to predict thought.


The claims in books about body language show that reading behavior is a superficial and unreliable practice. It is potentially risky too. So be careful the next time you approach the woman who seems to be looking at you and stroking the stem of her wine glass. You might just get that wine thrown in your face.


Wednesday, June 12, 2013

How To Be a Skeptical News Consumer


This is a reprint from the Wednesday, June 12, 2013 edition of eSkeptic Magazine. We are all bombarded with media from every direction. Some of it we seek out ourselves or receive from friendly sources. The rest is unsolicited, coming from friends, public media, and countless other seemingly reliable sources. How we digest all of this, and what we choose to believe, can be a slippery slope. What is the true source of the information, and in some cases, which organization owns the company where the source originates?

This article is a little long, but it is a pretty good read. Learning how to question the information you take in, and becoming a critical listener, can make the difference between falling for what is phony and recognizing what is genuine.

==============================

How To Be a Skeptical News Consumer

By Donna L. Halper

Even the most skeptical among us have had this happen: A friend or relative forwards an e-mail from an organization with a safe-sounding name (“The Clean Air Initiative,” “The Center for Consumer Freedom”), but the e-mail is filled with scary assertions, usually of a political nature. If the Obamacare health bill is passed, Grandma will face a “death panel” that will decide if she lives or dies; if Barack Obama is re-elected, America will soon become a Marxist or Muslim nation. Some of the chain e-mails are obvious partisan propaganda (There is little if any chance of any president, whether Barack Obama or anyone else, imposing Marxism or Islam on America; and the Affordable Care Act [its official name] contains nothing about “death panels”). But some are more subtle, relying on truncated (or fake) quotes, or manipulated facts. And while we most often see these sorts of false (but credible-looking) assertions made during elections, they can also be generated by interest groups trying to peddle unproven cures for diseases, or anti-science advocacy groups that oppose fluoridation or vaccination.

I’m a professor of media, and I focus on critical thinking in every class I teach; but it’s not just college students who can benefit from a skeptical approach to what they see from both print and online sources. Every school—from elementary right on up—should encourage students to become media literate: able to evaluate and assess the claims made by commercial advertising as well as by politicians and advocates. We are supposed to live in an “information society,” but sadly, much of what we see and hear is not entirely accurate. As a researcher, I’ve noticed the tendency on the Internet for some “fact” to be posted on one site and then reposted hundreds of times, as if sheer repetition will somehow prove it’s true. As any student of philosophy knows, this is an aspect of Argumentum ad Populum, or the Bandwagon effect—if millions of people believe X, it must be true. Or, as my students will often tell me, they saw it on Wikipedia (or some other frequently read site), so it must be true.

In fairness to Wikipedia, although I much prefer encyclopedias where the articles are signed (so that I know who wrote the piece), some of its articles are quite thorough and informative. But others contain well-traveled myths and rely on volunteers to correct them. It’s often a losing battle. I can’t tell you how many times I’ve refuted the myth that radio station KDKA was the first station in the United States (or in the world, depending on which source you read). This is a durable myth, promoted very effectively in the 1920s by the station’s corporate owner—Westinghouse—which had an impressive publicity department. And that is rule number one of media literacy: Know who created the message, so you can factor in whether the creator was pushing a special agenda. Not all agendas are malevolent. Westinghouse may indeed have believed its station was unique, and the company sought to promote that fact. But it was not alone: the Detroit News (which owned a station in Detroit), AMRAD (owners of a station in Medford Hillside MA), and several other American companies had stations on the air at that time, as did the Marconi company in Montreal, and these owners certainly wanted to spread the word about what their stations had done. Yet Westinghouse was so effective in asserting KDKA’s primacy that to this day, the claim is treated as historical fact by otherwise reputable textbooks. History can indeed be written not just by the winners, but by powerful publicists.

Many contemporary media critics treat the proliferation of fake news and erroneous information as something modern, but the truth is we can trace it back several hundred years. In some cases, the misinformation was even intentional, created in order to sell newspapers (a technique still used by today’s tabloids). A good example occurred back in late August 1835, when the New York Sun published an authoritative-looking piece about a famous British astronomer who had supposedly discovered life on the Moon, thanks to an amazing new telescope. It was a time when a college degree was only available to the privileged few, and the Sun used techniques that are still being used even now: they cited an “expert,” used scientific jargon, and claimed that his “discovery” had appeared in a prestigious overseas journal. In an era when fact-checking would have been difficult, few readers asked the questions a skeptic might ask today: Was the expert a real person? Did the expert really write what the article claimed he wrote? Did the journal exist, and was his work really published in it? The case came to be called the Great Moon Hoax and a good summary of it can be found on the website of the Museum of Hoaxes.

Of course, even in 1835, there were skeptics (including some at rival newspapers), and eventually, the story was shown to be an elaborate fraud. But this would not be the only time a media outlet hoaxed the public: Orson Welles’s “War of the Worlds” broadcast from late October 1938 is another frequently cited example. In this case, the broadcast was a radio adaptation of H.G. Wells’ science fiction novel about a Martian invasion of Earth. But so realistic was the presentation, complete with scary sound effects (including the special tones used by radio stations when airing a news bulletin), that many listeners were certain they were hearing news, rather than a play. The effect was further enhanced by the deep-voiced and very serious narrator (Welles), who kept providing new and more frightening “details” about the “invasion.” Today, we know that reports of mass panic after the broadcast were exaggerated (the show didn’t even air in some large cities, Boston among them), but it sounded so authentic that millions of listeners were convinced the United States was under attack from Martians, and there is evidence that some people did in fact run from their homes in terror, convinced the end was near. The broadcast was a mixed blessing for Welles, whose Mercury Theater program had previously suffered from very low ratings. After the “War of the Worlds” hoax, the show got lots of attention, but not all of it favorable—many people were furious that they had been fooled, and some critics demanded that such programs be banned. As for Welles, he claimed to be shocked, shocked that anyone would believe a science-fiction play, and yet many people did. And to this day, there are programs on television about “ghost-hunting” or about houses that are allegedly haunted; and because they are often well-produced and make good use of special effects, gullible viewers think they must be true.

Unfortunately, there have been many times when the media themselves gave credence to pseudoscience, and not just to sell papers or get bigger radio and TV ratings. Some critics have noted that far too often, journalists who lack a background in science simply repeat what a press release claims to be true, or quote from someone else’s article without checking into its veracity. Also, in fairness to journalists, the job of any reporter is to tell a story, and when confronted with a very dense and jargon-filled academic essay, the tendency is to find a way to give it more excitement and mass appeal. The media’s misadventure with science is nothing new: in 1922–1923, many otherwise reputable newspapers were eagerly touting a new “miracle man”—a doctor from France named Emile Coué, who could cure people by teaching them positive thinking, and having them chant “Every day, in every way, I’m getting better and better.” Of course, Coué was not a doctor (at most, he was a pharmacist), and there was little objective evidence of any cures, but that didn’t stop reporters from going to his presentations and marveling at the people who were no longer (pick one) blind, lame, asthmatic, or terminally ill. By most accounts, Coué was quite charismatic, and a number of reporters who saw him seemed genuinely convinced that he was a miracle-worker. The many articles praising him led to the emergence of an entire cottage industry, with radio programs devoted to American “experts” in the Coué method, and schools that claimed to teach anyone how to derive amazing results. Radio also became home to a number of other frauds: fortune tellers, faith healers, and assorted other quacks, some of whom were criticized by the press, but most of whom became very popular anyway. One of the most famous examples was Dr. John R. Brinkley, another fake physician, whose “cure” for impotence involved goat gland implant surgery for men, many of whom underwent the painful procedure in hopes of improving their performance in the bedroom. The story of his successful radio career and his eventual downfall is well told in R. Alton Lee’s 2002 book The Bizarre Careers of John R. Brinkley.

These days, it’s not just scary chain e-mails that should warrant skepticism and critical thinking. Politicians love to give non-threatening or positive names to laws that would otherwise inspire debate and controversy. Two good examples: After 9/11, Congress quickly passed the PATRIOT Act, which evoked emotions of standing up to terrorists and showing pride in being an American. But the act, whose name is an acronym for “Providing Appropriate Tools Required (to) Intercept (and) Obstruct Terrorism,” contained some provisions that are still being debated today, and a number that civil libertarians and privacy advocates have vehemently opposed. Another example was the 2002 “Healthy Forests Act,” which certainly sounded like something worth doing: who isn’t in favor of healthy forests? But when skeptics, many of whom were also passionate about the environment, delved further into this act, which was a priority for President Bush, they found it actually encouraged more logging in national forests. Whether logging is a good thing or not, the name did not reflect the provisions the act contained. Another media literacy rule: Find out who is actually behind the innocuous-sounding name, so you can decide whether the facts they are presenting can be trusted.

And then there are fake quotes. Did you know that the Founding Fathers said America is supposed to be a Christian nation? Did you know that they also insisted that a nation that did not rely on the Bible would never prosper? If you believe the chain e-mails sent by conservative Christian advocacy groups, often citing the work of David Barton (an evangelical Christian minister, former co-chair of the Republican Party of Texas, and the founder of WallBuilders, a Texas-based group that claims the separation of church and state is a myth), then you have probably been told that the American founders were opposed to the government helping the poor (especially the undeserving poor), and that they especially feared the rise of socialism. In journalism, it’s a truism that “If your mother says she loves you, check it out.” In other words, just because you got the quote from Mom, that doesn’t mean she had accurate information. I always encourage my students to fact-check quotes, because even if the person actually said it, often the quote is taken out of context (this can frequently be seen in political ads, where both parties try to make their opponent look bad by using a particular quote to fit a narrative of what a horrible person he or she is). The Internet has been a great benefit in finding actual sources for quotes, but it has also been part of the problem: It is very easy to put up an authoritative-looking website with a very ideological agenda. When it comes to quotes, skepticism is especially needed, to make sure that: (a) the person really said it, and (b) the context supports the way the quote is being used. This is not just a good rule for political ads: it even applies to classic movie quotes: The words “Play it again, Sam” were nowhere to be found in the movie “Casablanca,” but millions of people think that’s what Ingrid Bergman said. Thus, in order to make sure your evidence is accurate, take the time to fact-check the quotes, even the ones that “everybody” believes to be accurate.

The bottom line is that it pays to be skeptical because so much of what we encounter in the media turns out to be entirely false, mythically inflated, politically charged, ideologically loaded, or a mixture of facts and fiction. And as we see with the “Birthers,” the segment of the public who insist that Barack Obama was actually born in Kenya no matter how much credible evidence is presented that he was born in Hawaii, some people have trouble distinguishing between verifiable fact and unproven opinion. But this is not just a problem that affects Birthers, climate change deniers, or people who think we never walked on the Moon; as we see every day, it is surprisingly easy to misinform the average person. Back in 1938, after the furor over “War of the Worlds,” the Boston Globe’s pseudonymous “Uncle Dudley” gave readers some good advice, words that still resonate today. He said that we all have a duty to think for ourselves and not rush to judgment just because of something we heard in a broadcast. And whether the information is in print or broadcast, he concluded, “…a robust will to doubt, to examine statements, and to measure them alongside common sense and experience… is a hallmark of the civilized mind.”




Tuesday, June 11, 2013

What do you know about ALS?

My brother, Tracy, died four years ago today from ALS. We still feel the effects of his passing as much as the day we lost him. His kids miss him terribly and think it's totally unfair to lose someone so full of life to something so devastating. I wrote about Tracy in my blog four years ago. It was tough to see my little brother pass the way he did, life stripped away inch by inch, until there was nothing left. To this day, I can still hear his laugh and feel his passions, remembering the stories we shared sitting across from each other at the kitchen table over a beer.

You can read more about this terrible disease by going here. Educate yourself. Know the signs. It could be more important than you know.

sois sage (be good)

Wednesday, June 05, 2013

Food for thought...

This is a reprint from the Wednesday, June 5, 2013 edition of eSkeptic Magazine. As you read this, think about your own approach to food and nutrition, and how it relates to what the author is saying.


Food for Thought
By Kenneth W. Krause
Americans currently spend less than ten percent of their disposable income on food, as opposed to more than twenty percent in 1950. Pasteurized milk has saved millions of us from outbreaks of Campylobacter and E. coli. Our meats and vegetables last longer than ever and, when prepared with a modicum of skill and restraint, taste pretty good too. Why? Because natural is not always better and because science delivers marvelous outcomes.
Unadulterated science, that is. The equation gets a little messier, on the other hand, when incorrigible greed, governmental hypocrisy, and popular indifference (or blind faith) are entered into the calculation. So the new, real-world result for most Americans is, to say the least, less than appetizing: Seventy percent of American calories now come from industrially processed foods, an $850 billion per year venture.
Highly nutritious and generally affordable vegetables, eggs, fresh meats, fruits, beans, and nuts, for example, are commonly forsaken for their obscurely constructed, pre-packaged, and fast-food counterparts. In only the last century or so, says business journalist Melanie Warner, we have acceded to the “most dramatic nutritional shift in human history,” consuming twice the added fats, half the fiber, sixty percent more added sugar, three times the sodium, and immeasurably more corn and soybean product.
In Pandora’s Lunchbox, Warner exposes the “weird science” of food disassembly and reconstruction commonly applied by various food technologists and manufacturers including National Starch, Kraft, Tyson, General Mills, Sysco, and Pepsi. Subway’s Sweet Onion Teriyaki sandwich, for instance, contains 105 ingredients, more than half of which are “dry, dusty substances” added to the meat (13), bread (22), teriyaki glaze (12), and fat-free sweet onion sauce (8). “Eat fresh” indeed!
Yes, corporate food scientists have a lot on their plate. The end product must not only taste good and withstand the heat and physical wear and tear of processing; it must be consistent from package to package and possess an uncannily protracted shelf life. Perhaps most imperatively, however, processed foods need to be cheap, efficiently produced, and at the same time, marketable as “healthy.”
Consider breakfast cereals as one especially egregious example. There are good ones (Cheerios and Corn or Bran Flakes) and bad ones (Froot Loops and Cocoa Puffs), right? Not so much.
One-fifth of all Americans and a whopping one-third of their kids support a $10 billion per annum business nearly every morning. Boxed cereals are creatures of the 20th century and, after beer, wine, cheese, soda, milk, salty snacks, and bread, they are presently the most popular food item in U.S. grocery stores.
In 1905, Will Keith Kellogg first altered the original Corn Flakes recipe to make his product last. He sacrificed the grain’s germ and bran because they caused corn and wheat to go rancid. Thus, only the starchy center was left for consumption and, as a result, most of the vitamins and minerals were eliminated as well. Although the company’s scientists later discovered how to deactivate the specific enzymes causing the problem, Kellogg’s never restored the more nutritious whole-grain formula.
By the 1960s, many packaged cereals were produced through extrusion machines that cooked any number of ingredients into whatever shapes manufacturers thought average consumers would find appealing. Exceptionally harsh and “nutritionally devastating,” as Warner describes, extruders literally rip food molecules apart and melt the remains under extreme temperatures and pressures in a process called “plasticization.” Vitamins A, B1, C, E, and folate, along with natural antioxidants, fare most dreadfully according to a Texas A&M study published in 2009.
Following extrusion, many cereals are pressure cooked, dried, and toasted at temperatures between 525 and 625 degrees Fahrenheit to ensure resistance to decomposition and an extended shelf life. As such, cereal boxes can line grocery store aisles for many consecutive months prior to purchase. There is a downside, of course: whatever vitamins might have survived extrusion and cooking will tend to degrade as the products sit.
But the processed food industry devised a solution to that problem too—though not a particularly good one, according to Warner. To compensate for nutritional loss, manufacturers add synthetic vitamins, often two or more times the amount printed on the package. In other words, if the label says consumers get 30 percent of their recommended daily allowance of vitamin C, for instance, the processor may have actually added 75 percent.
But maybe the very definition of “vitamin” is flawed. As early as the 1970s, studies suggested that added synthetic vitamins, as we currently conceive them, might provide little nutritional benefit absent certain phytochemicals that always accompany them in nature—carotenoids, flavonoids, and polyphenols, in particular. Plants use these chemicals to ward off pathogens, and they may benefit us as well by thwarting heart disease and cancer, for example, and even by slowing the aging process. As Warner reports, cereal companies have tried very hard, but so far failed, to conjoin this “complex web of nutrients” into their products.
An average consumer might be surprised to know how synthetic vitamins are actually constructed. Most are concocted in Chinese factories few Americans would tolerate as neighbors, and few are produced through natural processes. Vitamin D, for instance, requires multiple industrial chemicals to transform sheep grease into the supplement commonly dumped into our milk.
Vitamin B1 starts with coal tar, and vitamin A comes from lemongrass oil and acetone. Vitamin B3 emanates from a waste product in the manufacture of nylon 6,6, a material used to make carpets and vehicle air bags. In fact, the most food-based synthetic vitamins are C, B2, and B12, produced through genetically modified bacteria and the fermentation of corn derivatives.
None of which is to necessarily imply toxicity, of course. But, again, synthetically derived vitamins may be of little nutritional value when split from their natural complements. Even if manufacturers one day discover how to recombine vitamins and phytochemicals, the effort might be all for naught. As Warner recounts, some scientists believe that only the complete biological environments inherent in fruits and vegetables will suffice. And the addition of sugars and nitrates could cause problems too. In other words, if Americans think they can continue to eat poorly and supplement their way to health, they very likely have another thing coming.
So, despite a barrage of patently deceptive advertising to the contrary, breakfast cereals are not good for us. But the converse and more crucial question remains—are processed cereals demonstrably bad? As Warner notes, the industry has long relied on the less than inspired “better than a donut” defense. I suppose that depends on the donut, but, in general terms, certain metabolic facts tend to equate rather than distinguish the two foods.
Our ancestors crushed, milled, and cooked their grains for many centuries before cereal companies assumed control. But the old way’s objective was very different from that of the new. Our forebears labored over bowls and pots in order to gain access to the cereal grain’s well-concealed and highly-prized nutrients. Today, the industry extrudes and “gun puffs” our grains to the extreme point where both digestion and nutrition have become practically irrelevant.
Indeed, much processed food—packaged cereals, most notably—comes to us essentially “predigested.” As such, we invite its starches to surge into our bloodstreams, triggering spiked insulin levels and, potentially, insulin resistance, metabolic syndrome, obesity, and type II diabetes. In this specific context, the conventional hypothesis that all calories are created equal is plainly flawed.
Perhaps it’s time to adjust our definition of what constitutes a “normal diet”—in particular, to distinguish it from an “average” or “typical diet.” A normal diet for any given species or population is one that the species or population evolved to consume. The evidence is clear, and I hope compelling: humans did not evolve to receive anything close to seventy percent of their calories from industrially processed foods.