7 key practices to help you achieve your toughest, most important goals. Part 2
By Craig Weller, CPT, US Navy SWCC
4). Use behavior to change negative feelings: One way to deal with negative feelings—which will inevitably come up when pursuing a challenging goal—is to put behavior first. Over time, this allows us to have more control over how we approach any situation. In special operations selection, we used the phrase “quit tomorrow.” When we had particularly bad days, we would tell one another (or ourselves) that we’d just finish out the day. Tomorrow, we could be done with it all and never have to do burpees while soaked in saltwater and covered in sand again.
Inevitably, the next day would come. We’d realize the low point the day before wasn’t that bad, and we’d keep going. In the long run, this took advantage of a phenomenon called self-herding. Self-herding is forming a new behavioral habit by subconsciously referring to what you did in the past under similar circumstances. By not quitting in our low moments, we built a habit of finding a way to keep going whenever things got really bad. Over time, the urge to quit faded because we repeatedly reinforced that bad days still meant that we’d be okay.
Our choices don’t just reveal our preferences. They shape them. If you’re applying this to your own habits, it’s the same process. When you hit a low point, promise yourself you can quit tomorrow. After this workout. After this last round of meal prep. After this section or chapter or lesson. Over time, you’ll reinforce the decision and action to “do the thing that’s good for me right now,” and it’ll shape your future impulses and preferences.
5). Find meaning in being uncomfortable: The Latin root of the word passion is patior, which means to suffer or endure. This is where phrases like The Passion of the Christ got their name. Eventually, the word came to mean not just the suffering itself, but the thing that sustains you while suffering. When we think of people who consistently overcome hardships in order to achieve a big goal, patior is what we see, and it’s easy for us to mistake patior for motivation.
It’s not that these people feel like making small daily sacrifices and trading short-term comfort for long-term happiness. It’s that they have a purpose for doing so. Their suffering has meaning. In order to keep working towards something big, this purpose needs to be a frequent, daily presence in your mind.
In Okinawa, where people have the longest, healthiest lifespans in the world, they call this ikigai: their reason for living. When surveyed, most Okinawans know their ikigai immediately, just as clearly as you know what you had for lunch. The ikigai of one 102-year-old karate master was to teach his martial art. For a 100-year-old fisherman, it was bringing fish back to his family three days a week. A 102-year-old woman named spending time with her great-great-granddaughter as her ikigai.
This is different from the deepest reason I will describe later on. That deep reason is something rooted in your past, that helps to drive you forward and, as the ancient Greeks used to say, “live as though all of your ancestors were living again through you.” Your ikigai is more about being and becoming. It’s present and future. It’s defining, through your actions, the words that you might put on your tombstone.
6). Use low moments to your advantage: When we experience something that disturbs our equilibrium, such as a tough workout or a bad day at work, a subconscious part of our mind rapidly assesses two things:
Do I know what’s happening?
Do I have what it takes to cope with it?
Our perception of both is derived from experience. The more things we throw ourselves into, whether we succeed or fail, the broader the range of experiences we have to refer to when assessing future stressors.
As years of varied experiences accumulate, we can begin to formulate a universal lesson: No matter how many bad things you went through in the past, you were still alive when they were over. This isn’t something you consciously decide. It’s something you teach a deeper part of your brain through practice.
The next time you crash and burn or feel like you keep getting knocked down, remember that even failure provides an opportunity. It’s an earned experience that helps create a more accurate and effective stress appraisal in the future. At some point, your mind will know that you’ve been there, done that—even when you’re in the middle of something awful. You can calmly and rationally move forward with the benefit of hard-earned knowledge.
7). Have a deeper reason: When I had my lowest points in training, I fell back to a mental image of my Dad’s snow boots sitting by our front door. Growing up, we had two cars. My mom was a paramedic and needed one of them. My dad chose to walk to work in the snow every morning so my siblings and I could use the other car to get to school. The mental image of his snow boots represented the countless little sacrifices my parents made for me over the years. Knowing all these sacrifices gave me a deep reason to persevere: I didn’t ever want to have to tell my parents I’d given up.
A deeper reason is the fail-safe that keeps you going when you’ve got nothing else left in your tank.
This mental image has to be uncomplicated, because when you’re hitting rock-bottom from stress, you won’t have the capacity to sort through complex, abstract concepts. You need one image that cuts directly to your core, no matter how tired you are. There’s no surefire way to find that image. Each of our inner worlds is too complicated for this to be an easy exercise. But for a place to start, ask yourself: When you have your biggest successes or failures, who do you want to talk to about them? Why? Think back to a time when someone truly cared about and helped you. Imagine that person watching you in one of your most difficult moments. What do you want them to see?
To summarize, using these 7 concepts, no matter what happens, you can move forward and make progress on any given day. That progress, even if small, actually feels good and can be enough to keep you going… until the next day. This is how you achieve great things—day by day.
Yes, it might be a long, slow, hard journey. But when we look back on our lives, what we remember most will be the things that were worth struggling for—and the way it felt to earn our happiness. Hopefully, this dialogue will help you realize that, if anything, motivation is an outcome, not a starting gun. You can’t control motivation. It can’t be directly pursued.
What you can control is the series of factors that underpin motivation. Just knowing this can help you. Stop waiting for a green light to get started and realize that, even if it’s hard, taking action gets you closer to your goal(s). Also, understand that doing the right thing in the moment is totally within your control.
My blog has evolved considerably since I first started it in 2004. I still attempt to update it with sometimes relevant and/or random observations as often as possible, but I can never promise which way the wind will blow on these things. Change is the only certainty.
Tuesday, June 30, 2020
Wear a Mask, But Act as If It Doesn’t Work
BY HARRIET HALL, M.D.
As I write this on June 29, 2020, we are in the midst of a global pandemic with a scary, rapidly spreading new virus that we don’t understand very well yet. Globally, 10,199,798 have become infected and 502,947 have died, for a 4.93% death rate. In the U.S., 125,928 people have died out of 2,564,163 infections, for a 4.91% death rate. The actual number of infected people is very likely much higher, since testing has been hit and miss, which means the true death rate may not be quite so high. Nevertheless, the raw number of deaths is a staggering figure and everyone is, or ought to be, worried about getting COVID-19. What can you do to prevent catching the disease?
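To see why undercounted infections would pull the true rate down, here is a minimal sketch of the death-rate arithmetic using the counts quoted above; the undercount multipliers in the sketch are purely illustrative assumptions, not estimates from this article.

```python
# Case-fatality arithmetic from the counts quoted above (June 29, 2020).
# The undercount multipliers below are illustrative assumptions only.

def death_rate_pct(deaths, infections):
    """Crude case-fatality rate as a percentage."""
    return 100 * deaths / infections

global_deaths, global_cases = 502_947, 10_199_798
us_deaths, us_cases = 125_928, 2_564_163

print(f"Global: {death_rate_pct(global_deaths, global_cases):.2f}%")  # ~4.93%
print(f"U.S.:   {death_rate_pct(us_deaths, us_cases):.2f}%")          # ~4.91%

# If the true number of infections were, say, 3x or 10x the confirmed count,
# the implied fatality rate would be correspondingly lower:
for multiplier in (3, 10):
    rate = death_rate_pct(us_deaths, us_cases * multiplier)
    print(f"U.S. assuming {multiplier}x infections: {rate:.2f}%")
```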
The rational response would be to listen to expert advice and follow the best public health precautions to minimize the risk of catching or spreading COVID-19, which includes frequent handwashing, remaining isolated as much as you can, social distancing when out in public, and wearing a mask around others. But this relatively simple advice has morphed from a public health recommendation to a political hot-button issue. Medical experts recommend masks, for example, but the President sets a bad example by refusing to wear one. Medical authorities recommend social distancing, but people are still crowding together in many settings, and many of them are not wearing masks. Where this has happened in places that began to reopen in May, most notably Florida, Arizona, and California, COVID-19 has come roaring back. Here are some reasons why wearing masks is important.
Surgeons Wear Masks
Surgeons wear surgical masks for many hours a day, day after day. They don’t have any trouble breathing. They are well trained. They know they have to keep their meticulously scrubbed and gloved hands sterile. They know they can’t touch their eyes or their faces and must not touch the mask. If it needs adjusting, they get a nurse or technician to do it for them. If something itches, they know they mustn’t scratch; they ask someone who is not scrubbed in to scratch it for them. They tolerate these minor annoyances not for their own personal benefit, but for the benefit of others. They know the precautions are necessary to minimize the risk of infection in the patients they are operating on.
I wonder how mask refusers would react if their surgeon wanted to perform major surgery on them without wearing a mask. Can you imagine the reaction if a surgeon said, “This is America and I am free to not wear a mask in surgery if I don’t feel like it!” Hopefully mask deniers would realize that would not be a good idea, but I’m not so sure they would.
Surgeons themselves were slow to adapt. In the not so “good old days,” they operated in street clothes with no gowns or gloves, and with unwashed hands. Sometimes the street clothes were visibly filthy or even blood-stained. When Ignaz Semmelweis (1818–1865) discovered that doctors had been causing puerperal fever, they refused to believe they were at fault. They had been going directly from autopsies to the wards, where they touched obstetrical patients, transmitting the deadly bacteria. No wonder they resisted Semmelweis: they didn’t believe in germs (the germ theory of disease wasn’t established until long after his death). Handwashing was shown to reduce mortality to less than 1%, but they refused to believe the evidence. A frustrated Semmelweis had a nervous breakdown and died in a mental institution.
Surgeons didn’t start consistently wearing gowns until 1901, caps until 1930, gloves until 1937, and masks until 1937. Today, no one in medicine is a mask denier. Not only because they have pledged an oath to “first, do no harm,” but also because they understand the principle of freedom: that the freedom for me to swing my fist ends at your nose. People should be free from other people’s germs where possible, and that is why masks and social distancing are advised. […]
Read the complete article
Tuesday, June 23, 2020
Motivation might get you started: Part One
7 key practices to help you achieve your toughest, most important goals. Part One
By Craig Weller, CPT, US Navy SWCC
Special operations courses are designed to weed people out. In the Navy, the screening test just to qualify for these courses has about a 90 percent failure rate. From there, anywhere from 60 to 90 percent of candidates don’t make it through the course itself. Those who do make it, more than anything else, display the ability to just keep going through a painfully discouraging process. They face a daily onslaught of being pushed to their limits: hypothermia, hypoxia, hypoglycemia, and sand-in-your-everything. Yet some persevere and ultimately graduate.
How do you stay motivated through something that’s designed to make you feel terrible, day after day? The answer is more complex than you might imagine. Contrary to what most people think, accomplishing big-picture dreams has very little to do with feeling motivated from moment to moment. It has even less to do with being good at something from the start. This is true whether you’re trying to get through a grueling selection course, a fat loss journey, a career change, or a marathon training plan. My story is a prime example.
Right after graduating high school in small town North Dakota, I joined the Navy. I volunteered for a Special Operations unit. When I left for boot camp, I didn’t know how to swim. As you can imagine, swimming is a pretty important skill in Naval Special Operations. My odds of success were near zero.
I learned to swim by taking the screening test, failing it, and going to an hour of stroke development to practice. I passed that test by 7 seconds on my third and final attempt. Then began two and a half years of suffering. I spent 16 months in preparatory training and was 2 weeks from graduating my first Special Warfare Combatant Crewmember (SWCC) selection course when I failed a timed swim. Because I was so far along, I was given the option of repeating the entire course.
Before starting over, I spent four months in a BUD/S (Basic Underwater Demolition/SEAL) development program. Then I went through SWCC selection again. This time, I graduated. Along the way, I watched thousands of people—nearly all better swimmers than me—fail out or quit. During this process, I learned the characteristics that help someone succeed. (I also learned the factors that lead to failure.) What I discovered surprised me: Initial talent was only a small piece of the picture, and physical fitness? It was only one of many factors. The best athletes often quit early and reliably.
As it turns out, where you start out is far less important than where you’re willing to go. One of the main differences between those who succeeded and those who didn’t was the word “yet”.
“I’m not strong enough. Yet.”
“I don’t know how to do this. Yet.”
“I can’t handle this. Yet.”
Like everyone else in the program, the people who graduated struggled plenty, suffered setbacks, and had bad days. But the difference maker? They were also the ones who managed to consistently do a difficult, discouraging thing for a long time in order to finally reach a long-term goal. Which leads us to ask: how?
Here’s the secret: IT WASN’T MOTIVATION THAT GOT THEM THERE. Motivation might be what gets you started, but almost everything after that is just doing what needs to be done in the moment… until you eventually get where you want to be. Motivation may return at some point, but it’s never guaranteed. Given that, here are 7 ways to keep moving forward even when you don’t feel motivated.
Doing the right thing when the right thing is hard isn’t limited to the tiny, bizarre world of special operations. It’s a universal concept. A new parent getting out of bed at 3 a.m. to soothe a screaming baby for the fifth night in a row isn’t enthusiastic about it. The entrepreneur spending their Friday night combing through bank statements and receipts isn’t madly in love with do-it-yourself accounting. The athlete putting in 5 a.m. workouts doesn’t hate warm blankets and sleep. But if not motivation, then what helps people do the hard stuff? People who consistently do the hard thing have several core ideals and practices in common. Here’s how you can adopt them yourself.
1). View life as a series of learnable skills, learn the skills you need to accomplish your aims, and practice those skills, again, and again, and again…: Refer back to the power of the word “yet.” Resilient, effective people don’t just “try harder.” Rather, they see any process as a skill that can be developed. Perhaps your self-talk turns toxic when you’re having a terrible day. Don’t just tell yourself to self-talk better. Identify the specific components of that process you can improve upon—and the contextual cues that will trigger you to do so.
Here’s how it might work: Identify a past experience when your self-talk became self-sabotage. Take that apart. What exactly was happening in your mind, and what were you doing? Decide on a specific practice that could be instituted in a similar situation in the future.
Perhaps when you were trying to get up for a 5 a.m. workout, you began mentally complaining and negotiating with yourself about getting out of bed. Your future practice: Instead of complaining about how tired you are, you replace that dialogue with a different narrative. You tell yourself that you’re supposed to feel tired when you’re waking up, and that this early morning is the path you chose as a necessary step toward doing the thing that you truly want to do. Or maybe you just replace the negative self-talk with a mantra or meaningful song lyric.
Whatever it is, be specific about what you’ll practice. Then, in the same way that a runner times their splits on the track, time your ability to maintain this new practice. If you can replace or alter your negative self-talk for five minutes before breaking down, that’s your split. Reset your timer and start over next time.
The starting point doesn’t matter nearly as much as your willingness to improve, little by little.
2). Prioritize systems over willpower: If motivation isn’t the answer, willpower must be what we need, right? Not quite. Here’s an example: When I was a student in the early portion of the Naval Special Warfare pipeline, I had to get up at 3 a.m. for workouts. Being late or missing a workout could mean being dropped from the program. I made it to the workouts on time, but not by making myself promises or being super-duper disciplined every day. I simply put my alarm clock on the other side of my room. I slept in a top bunk and had roommates, so when the alarm clock went off, I had to literally jump out of bed to shut it off. I removed the possibility of failure from the path. It didn’t matter if I felt like getting out of bed. I had to. Essentially, I created a system to help make getting out of bed feel like the obvious path forward—rather than an uphill slog. Setting the alarm clock across the room was my system.
Systems help us prioritize what to do and when to do it. Systems also remove a lot of the effort and willpower we think are required to get things done. This approach of shaping your environment to help yourself succeed works with any type of habit you’re struggling to stick to.
3). Separate your feelings from your identity: In BUD/S, I was once in a support role keeping an eye on other students in the middle of Hell Week. The students were about 3 days into the week and were given a brief nap in tents on the beach. I was assigned to watch them for medical issues and get them to walk the 100 yards or so to the bathroom—rather than peeing in the same sand we’d be doing pushups in the next day. One of those students stepped out of the tent and trudged past me toward the bathroom. His uniform was still wet with saltwater, and he shuffled along as if trying to shrink inward to avoid touching cold, wet cotton. He paused briefly in front of me, staring off into the distance, then burst into a full-body shudder.
With his eyes still fixed on the horizon, he said: “S**t I’m cold.” With that, he resumed his slow, steady walk to the gate. He was probably as miserably cold as he’d ever be in his life. He was hitting bottom, and he didn’t hide from it. He acknowledged what he was feeling and set it aside. Being cold was a passing, unpleasant thing, like bad weather. It wasn’t his identity, and it didn’t shape who he was or what he chose to do. Eventually, he graduated: a newly minted SEAL.
We often assume that our feelings should drive our behavior. That if we feel tired or sad or discouraged, we should do tired, sad, and discouraged things. Expressing your feelings is one thing, but allowing them to control your every action can lead you down some destructive paths. It doesn’t have to be that way. We can recognize and accept those feelings in the same way that we grab a jacket when we see storm clouds passing over. Our moment-to-moment feelings don’t have to determine who we are or what we choose to do. Simply knowing this can make it easier to carry on when we don’t feel like it.
Check back next week for Part 2.
The Biggest Bluff: How I Learned to Pay Attention, Master Myself, and Win
It’s true that Maria Konnikova had never actually played poker before and didn’t even know the rules when she approached Erik Seidel, Poker Hall of Fame inductee and winner of tens of millions of dollars in earnings, and convinced him to be her mentor. But she knew her man: a famously thoughtful and broad-minded player, he was intrigued by her pitch that she wasn’t interested in making money so much as learning about life. She had faced a stretch of personal bad luck, and her reflections on the role of chance had led her to a giant of game theory, who pointed her to poker as the ultimate master class in learning to distinguish between what can be controlled and what can’t. And she certainly brought something to the table, including a PhD in psychology and an acclaimed and growing body of work on human behavior and how to hack it. So Seidel was in, and soon she was down the rabbit hole with him, into the wild, fiercely competitive, overwhelmingly masculine world of high-stakes Texas Hold’em, their initial end point the following year’s World Series of Poker.
But then something extraordinary happened. Under Seidel’s guidance, Konnikova did have many epiphanies about life that derived from her new pursuit, including how to better read, not just her opponents but far more importantly herself; how to identify what tilted her into an emotional state that got in the way of good decisions; and how to get to a place where she could accept luck for what it was, and what it wasn’t. But she also began to win. And win. In a little over a year, she began making serious money from tournaments, ultimately totaling hundreds of thousands of dollars. She won a major title, got a sponsor, and got used to being on television, and to headlines like “How one writer’s book deal turned her into a professional poker player.” In this wide-ranging conversation Konnikova and Shermer discuss:
- the balance of luck, skill, intelligence and emotions in how lives turn out
- the real meaning of the marshmallow test
- time discounting and how to improve yours
- rapid cognition and intuition
- how to improve your use of emotions in gambling and in life
- what it was like being a woman in an almost exclusively male game, and
- the nature of human nature in the context of the BLM movement and protests.
Maria Konnikova is the author of Mastermind and The Confidence Game. She is a regular contributing writer for The New Yorker, and has written for The Atlantic, The New York Times, Slate, The New Republic, The Paris Review, The Wall Street Journal, Salon, The Boston Globe, Scientific American, Wired, and Smithsonian, among many other publications. Her writing has won numerous awards, including the 2019 Excellence in Science Journalism Award from the Society of Personality and Social Psychology. While researching The Biggest Bluff, Maria became an international poker champion and the winner of over $300,000 in tournament earnings. Maria also hosts the podcast The Grift from Panoply Media and is currently a visiting fellow at NYU’s School of Journalism. Her podcasting work earned her a National Magazine Award nomination in 2019. Maria graduated from Harvard University and received her PhD in Psychology from Columbia University.
Tuesday, June 16, 2020
Your Mind—Added Sugar Contributes To Depression
One of the recent mysteries of science is why depression, diabetes and dementia seem to cluster in epidemiological studies, and why having one of these health issues seems to increase your risk for the others. The answer: in a study in the journal Diabetologia, researchers found that when blood glucose levels are elevated, BDNF levels drop. BDNF is a compound that helps brain cells communicate with one another, build memories, and learn new things; decreased levels of BDNF have been linked to both Alzheimer's and depression. That means that the simple act of eating sugar compromises your brain and quickly; the more you do it, the greater your risk of diabetes, and the greater your risk of depression and dementia as well. In a 2015 study of post-menopausal women, higher levels of added sugars and refined carbs were associated with an increased likelihood of depression, while higher consumption of fiber, dairy, fruit and vegetables was associated with a lower risk.
In a study of nearly 1,000 seniors (median age: 79.5), researchers found that eating a diet high in simple carbs significantly increased the risk of developing dementia. All of the subjects were cognitively normal at the beginning of the study, and about 200 developed signs of dementia over the next 3.7 years. The risk of mental decline was higher in those who ate high-carb diets, and lower in those whose diets were higher in fat and/or protein.
Your Heart—Sugar Doubles Your Risk Of Dying From Heart Disease: People who get 25 percent or more of their calories from added sugar are more than twice as likely to die from heart disease as those who eat less than 10 percent, according to a study in the Journal of the American Medical Association. One out of ten of us falls into that category.
Now, if you're an average American, your daily sugar consumption is about 17 percent of calories, according to the study. That's hardly a laurel to rest on: people who ate between 17 and 21 percent of their calories from added sugar had a 38 percent higher risk of dying from heart disease, compared with people who consumed 8 percent or less of their calories from added sugar.
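To make those percentages concrete, here is a rough, hypothetical conversion into everyday amounts; the conversion factors (a 2,000-calorie reference diet, 4 calories per gram of sugar, and roughly 4 grams of sugar per teaspoon) are standard rules of thumb, not figures from the study.

```python
# Hypothetical conversion of "percent of calories from added sugar" into
# grams and teaspoons per day. Assumptions (not from the study): 2,000 kcal
# daily diet, 4 kcal per gram of sugar, ~4 g of sugar per teaspoon.

KCAL_PER_DAY = 2000
KCAL_PER_GRAM = 4
GRAMS_PER_TSP = 4

def sugar_amounts(percent_of_calories):
    kcal = KCAL_PER_DAY * percent_of_calories / 100
    grams = kcal / KCAL_PER_GRAM
    teaspoons = grams / GRAMS_PER_TSP
    return kcal, grams, teaspoons

for pct in (8, 10, 17, 21, 25):
    kcal, grams, tsp = sugar_amounts(pct)
    print(f"{pct:>2}% of calories ≈ {kcal:.0f} kcal ≈ {grams:.0f} g ≈ {tsp:.0f} tsp/day")
# e.g. 10% ≈ 200 kcal ≈ 50 g ≈ 12 tsp; 25% ≈ 500 kcal ≈ 125 g ≈ 31 tsp
```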
At first, the researchers figured that since those who ingest more sugar have poorer diets, that might be a main cause, but even after making adjustments for the quality of one's diet, the link between sweets and cardiovascular risk remained the same.
The study found that the major sources of added sugar in the American diet were:
Sugar-sweetened beverages (37.1%)
Grain-based desserts like cookies or cake (13.7%)
Fruit drinks (8.9%)
Dairy desserts like ice cream (6.1%)
Candy (5.8%)
Sodas and other sweet drinks are a major red flag: the researchers found that a higher consumption of sugar-sweetened beverages was directly tied to an increased risk of dying from heart disease. The impact is so great that you don't need to be meandering through middle age to see it: even teenagers who consume food and beverages high in added sugars show evidence of risk factors for heart disease and diabetes in their blood, according to a second study in The Journal of Nutrition.
Your Blood Pressure—Added Sugar Raises Your Blood Pressure: In fact, sugar may be worse for your blood pressure than salt, according to a paper published in the journal Open Heart. Just a few weeks on a high-sucrose diet can increase both systolic and diastolic blood pressure. Another study found that for every sugar-sweetened beverage, risk of developing hypertension increased 8 percent. Too much sugar leads to higher insulin levels, which in turn activate the sympathetic nervous system and lead to increased blood pressure, according to James J. DiNicolantonio, PharmD, cardiovascular research scientist at Saint Luke's Mid America Heart Institute in Kansas City, Missouri. "It may also cause sodium to accumulate within the cell, causing calcium to build up within the cell, leading to vasoconstriction and hypertension," he says.
Your Skin—Sugar Causes Your Skin To Sag: Your skin has its own support system in the form of collagen and elastin, two compounds that keep your skin tight and plump. When elevated levels of glucose and fructose enter the body, they link to the amino acids present in the collagen and elastin, producing advanced glycation end products, or "AGEs." That damages these two critical compounds and makes it hard for the body to repair them. This process is accelerated in the skin when sugar is elevated, and further stimulated by ultraviolet light, according to a study in Clinical Dermatology. In other words, eating lots of sugar poolside is the worst thing you can do for your skin.
Beyond the Known: How Exploration Created the Modern World and Will Take Us to the Stars
For the first time in history, the human species has the technology to destroy itself. But having developed that power, humans are also able to leave Earth and voyage into the vastness of space. After millions of years of evolution, we’ve arrived at the point where we can settle other worlds and begin the process of becoming multi-planetary. How did we get here? What does the future hold for us? Divided into four accessible sections, Beyond the Known examines major periods of discovery and rediscovery, from Classical Times, when Phoenicians, Persians and Greeks ventured forth; to The Age of European Exploration, which saw colonies sprout on nearly every continent; to The Era of Scientific Inquiry, when researchers developed brand new tools for mapping and traveling farther; to Our Spacefaring Future, which unveils plans currently underway for settling other planets and, eventually, traveling to the stars.
A Mission Manager at SpaceX with a light, engaging voice, Andrew Rader is at the forefront of space exploration. As a gifted historian, Rader, who has won global acclaim for his stunning breadth of knowledge, is singularly positioned to reveal the story of human exploration that is also the story of scientific achievement. Told with an infectious zeal for traveling beyond the known, Beyond the Known illuminates how very human it is to emerge from the cave and walk toward an infinitely expanding horizon. Rader and Shermer also discuss:
- the human nature to explore: adaptation or spandrel?
- what the Greeks and Romans knew about the world that was lost for centuries
- how dark were the Dark Ages for exploration?
- the economic and religious drivers of early exploration
- the political and practical drivers of 20th century exploration
- Mars direct or to the moon first?
- how to terraform Mars
- how to get people to the moons of the outer planets
- how to get people to the stars
- are we living in a simulation?
- should we be worried about A.I.?
Andrew Rader is a Mission Manager at SpaceX. He holds a Ph.D. in Aerospace Engineering from MIT specializing in long-duration spaceflight. In 2013, he won the Discovery Channel’s competitive television series Canada’s Greatest Know-It-All. He also co-hosts the weekly podcast Spellbound, which covers topics from science to economics to history and psychology. Beyond the Known is Rader’s first book for adults. You can find him at Andrew-Rader.com.
Tuesday, June 09, 2020
Docs Need A Prescription For Nutrition 101
by Matthew Kady, MS, RD for IDEA Fitness
Harvard report urges nutrition education in medical schools. Although diet can be a significant factor in many chronic health conditions, surprisingly, U.S.-trained doctors receive little or no formal training in nutrition. (Estimates are that, on average, students in medical schools spend less than 1% of lecture time learning about diet.) Staff and students at the Harvard Law School Food Law and Policy Clinic would like to see that knowledge gap rectified.
In the report Doctoring Our Diet: Policy Tools to Include Nutrition Training in U.S. Medical Training, the group issued recommendations for improving nutrition education in undergraduate, graduate and continuing medical education. The report says that nutrition education should be required in medical school and that physicians should be required to take continuing education courses in nutrition to maintain medical licenses. The end goal? Supporting better health outcomes for patients.
This Is Your Body On Sugar from Eat This, Not That
Oh, you don't recall slurping down any high-fructose corn syrup, the hyper-sweet corn extract, in one sitting? Well, you did—about eight teaspoons' worth, according to the U.S. Department of Agriculture. In fact, the average American consumed 27 pounds of the stuff last year.
But while 8 teaspoons of artificially manufactured syrup may seem like an awful lot, it's only a drop in the sugar bucket. The USDA's most recent figures find that Americans consume, on average, about 32 teaspoons of added sugar every single day. That sugar comes to us in the form of candies, ice cream and other desserts, yes. But the most troubling sugar of all isn't the added sugar we consume on purpose; it's the stuff we don't even know we're eating.
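For a rough sense of scale, here is a hypothetical back-of-the-envelope conversion of that 32-teaspoon figure into calories; the conversion factors (roughly 4 grams of sugar per teaspoon, 4 calories per gram, and a 2,000-calorie reference diet) are standard rules of thumb, not numbers from the USDA report.

```python
# Illustrative conversion of ~32 teaspoons/day of added sugar into calories.
# Assumed conversion factors (not from the article): ~4 g sugar per teaspoon,
# 4 kcal per gram of sugar, 2,000 kcal/day reference diet.

GRAMS_PER_TSP = 4
KCAL_PER_GRAM = 4
REFERENCE_DIET_KCAL = 2000

teaspoons_per_day = 32
grams = teaspoons_per_day * GRAMS_PER_TSP        # 128 g
kcal = grams * KCAL_PER_GRAM                     # 512 kcal
share = 100 * kcal / REFERENCE_DIET_KCAL         # ~26% of daily calories

print(f"{grams} g of added sugar ≈ {kcal} kcal ≈ {share:.0f}% of a 2,000-kcal diet")
```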
In recent years, the medical community has begun to coalesce around a powerful new way of looking at added sugar: as perhaps the most significant health threat in America. What exactly is "added sugar," and why do experts suddenly believe that it's the Freddy Krueger of nutrition? Read on to find out!
The Deal With Added Sugar: When they talk about "added sugar," health experts aren't talking about the stuff that we consume from eating whole foods. "Added sugars are sugars that are contributed during the processing or preparation of foods and beverages," says Rachel K. Johnson, PhD, RD, professor of nutrition at The University of Vermont. Lactose, the sugar naturally found in milk and dairy products, and naturally occurring fructose, the sugar that appears in fruit, don't count. Ingredients that are used in foods to provide added sweetness and calories, from the much-maligned high fructose corn syrup to healthier-sounding ones like agave, date syrup, cane sugar, and honey, are all considered added sugars.
That's because naturally occurring sugars, like what you find in an apple, come with their own balanced health posse—fiber and micronutrients, which slow the digestion of the sugar and prevent it from spiking insulin response and damaging your liver, two serious side effects of added sugar.
Fortunately, giving up added sugar has been shown to have several dramatic and rapid impacts on your health. In a newly released study, children who cut added sugars from their diets for just 9 days showed dramatic improvements in cholesterol and blood sugar levels.
On the flip side, adding sugar to your diet can quickly put your health into a spiral: according to a 2015 study in the American Journal of Clinical Nutrition, people who consumed beverages containing high fructose corn syrup for 2 weeks significantly increased their levels of triglycerides and LDL cholesterol (the unfavorable kind), plus two proteins associated with elevated cholesterol and another compound, uric acid, that's associated with diabetes and gout.
In fact, in a 2014 editorial in the journal JAMA Internal Medicine, the authors made a bold statement: "Too much sugar does not only make us fat; it can also make us sick."
The editors of Eat This, Not That! took a look at the most recent research and discovered just how much harm added sugars are doing to us:
Your Belly--Added Sugar Causes Your Body To Store Fat Around Your Gut: Within 24 hours of eating fructose, your body is flooded with elevated levels of triglycerides. Does that sound bad? It is.
Triglycerides are the fatty deposits in your blood. Your liver makes them because they're essential for building and repairing the tissues in your body. But, when it's hit with high doses of fructose, the liver responds by pumping out more triglycerides; that's a signal to your body that it's time to store some abdominal fat. In one study, researchers fed subjects beverages sweetened with either glucose or fructose. Both gained the same amount of weight over the next 8 weeks, but the fructose group gained its weight primarily as belly fat, thanks to the way this type of sugar is processed in the liver.
What's unique to fructose is that it seems to be a universal obesogen—in other words, every creature that eats it gains weight. Princeton researchers recently found that high-fructose corn syrup seemed to have a unique impact on weight in their animal studies. "When rats are drinking high-fructose corn syrup at levels well below those in soda pop, they're becoming obese—every single one, across the board," psychology professor Bart Hoebel, a specialist in appetite and sugar addiction, said in a report from the university. "Even when rats are fed a high-fat diet, you don't see this; they don't all gain weight." Fructose is the freak show of fat.
Your Blood Sugar--Added Sugar Is The #1 Factor In Your Risk Of Dying From Diabetes: The link between increased sugar and diabetes risk is right up there with "smoking causes lung cancer" on the list of immutable medical truths— despite what soda manufacturers are trying to tell us. Researchers at the Mayo Clinic have come right out and said that added fructose—either as a constituent of table sugar or as the main component of high-fructose corn syrup—may be the number one cause of diabetes, and that cutting sugar alone could translate into a reduced number of diabetes deaths the world over.
Harvard report urges nutrition education in medical schools. Although diet can be a significant factor in many chronic health conditions, surprisingly, U.S.-trained doctors receive little or no formal training in nutrition. (Estimates are that, on average, students in medical schools spend less than 1% of lecture time learning about diet.) Staff and students at the Harvard Law School Food Law and Policy Clinic would like to see that knowledge gap rectified.
In the report Doctoring Our Diet: Policy Tools to Include Nutrition Training in U.S. Medical Training, the group issued recommendations for improving nutrition education in undergraduate, graduate and continuing medical education. The report says that nutrition education should be required in medical school and that physicians should be required to take continuing education courses in nutrition to maintain medical licenses. The end goal? Supporting better health outcomes for patients.
This Is Your Body On Sugar from Eat This, Not That
Oh, you don't recall slurping down any of the hyper-sweet corn extract in one sitting? Well, you did—about eight teaspoons' worth, according to the U.S. Department of Agriculture. In fact, the average American consumed 27 pounds of the stuff last year.
But while 8 teaspoons of artificially manufactured syrup may seem like an awful lot, it's only a drop in the sugar bucket. The USDA's most recent figures find that Americans consume, on average, about 32 teaspoons of added sugar every single day. That sugar comes to us in the form of candies, ice cream and other desserts, yes. But the most troubling sugar of all isn't the added sugar we consume on purpose; it's the stuff we don't even know we're eating.
In recent years, the medical community has begun to coalesce around a powerful new way of looking at added sugar: as perhaps the number one most significant health threat in America. What exactly is "added sugar," and why do experts suddenly believe that it's the Freddy Kreuger of nutrition? Read on to find out!
The Deal With Added Sugar: When they talk about "added sugar," health experts aren't talking about the stuff that we consume from eating whole foods. "Added sugars are sugars that are contributed during the processing or preparation of foods and beverages," says Rachel K. Johnson, PhD, RD, professor of nutrition at The University of Vermont. Lactose, the sugar naturally found in milk and dairy products, and naturally occurring fructose, the sugar that appears in fruit, don't count. Ingredients that are used in foods to provide added sweetness and calories, from the much-maligned high fructose corn syrup to healthier-sounding ones like agave, date syrup, cane sugar, and honey, are all considered added sugars.
That's because naturally occurring sugars, like what you find in an apple, come with their own balanced health posse—fiber and micronutrients, which slows the digestion of the sugar and prevents it from spiking insulin response and damaging your liver, two serious side effects of added sugar.
Fortunately, giving up added sugar has been shown to have several dramatic and rapid impacts on your health. In a newly released study, children who cut added sugars from their diets for just 9 days showed dramatic improvements in cholesterol and blood sugar levels.
On the flip side, adding sugar to your diet can quickly send your health into a spiral. In a 2015 study in the American Journal of Clinical Nutrition, people who consumed beverages containing high-fructose corn syrup for two weeks significantly increased their levels of triglycerides and LDL cholesterol (the unfavorable kind), along with two proteins associated with elevated cholesterol and uric acid, a compound linked to diabetes and gout.
In fact, in a 2014 editorial in the journal JAMA Internal Medicine, the authors made a bold statement: "Too much sugar does not only make us fat; it can also make us sick."
The editors of Eat This, Not That! took a look at the most recent research and discovered just how much harm added sugars are doing to us:
Your Belly--Added Sugar Causes Your Body To Store Fat Around Your Gut: Within 24 hours of eating fructose, your body is flooded with elevated levels of triglycerides. Does that sound bad? It is.
Triglycerides are the fatty deposits in your blood. Your liver makes them because they're essential for building and repairing the tissues in your body. But when it's hit with high doses of fructose, the liver responds by pumping out more triglycerides, which signals to your body that it's time to store some abdominal fat. In one study, researchers fed subjects beverages sweetened with either glucose or fructose. Both groups gained the same amount of weight over the next 8 weeks, but the fructose group gained its weight primarily as belly fat, thanks to the way this type of sugar is processed in the liver.
What's unique to fructose is that it seems to be a universal obesogen—in other words, every creature that eats it gains weight. Princeton researchers recently found that high-fructose corn syrup seemed to have a unique impact on weight in their animal studies. "When rats are drinking high-fructose corn syrup at levels well below those in soda pop, they're becoming obese—every single one, across the board," psychology professor Bart Hoebel, a specialist in appetite and sugar addiction, said in a report from the university. "Even when rats are fed a high-fat diet, you don't see this; they don't all gain weight." Fructose is the freak show of fat.
Your Blood Sugar--Added Sugar Is The #1 Factor In Your Risk Of Dying From Diabetes: The link between increased sugar and diabetes risk is right up there with "smoking causes lung cancer" on the list of immutable medical truths— despite what soda manufacturers are trying to tell us. Researchers at the Mayo Clinic have come right out and said that added fructose—either as a constituent of table sugar or as the main component of high-fructose corn syrup—may be the number one cause of diabetes, and that cutting sugar alone could translate into a reduced number of diabetes deaths the world over.
Wednesday, June 03, 2020
Human Compatible: Artificial Intelligence and the Problem of Control
In the popular imagination, superhuman artificial intelligence is an approaching tidal wave that threatens not just jobs and human relationships, but civilization itself. Conflict between humans and machines is seen as inevitable and its outcome all too predictable. In this groundbreaking book, distinguished AI researcher Stuart Russell argues that this scenario can be avoided, but only if we rethink AI from the ground up.
Russell begins by exploring the idea of intelligence in humans and in machines. He describes the near-term benefits we can expect, from intelligent personal assistants to vastly accelerated scientific research, and outlines the AI breakthroughs that still have to happen before we reach superhuman AI. He also spells out the ways humans are already finding to misuse AI, from lethal autonomous weapons to viral sabotage.
If the predicted breakthroughs occur and superhuman AI emerges, we will have created entities far more powerful than ourselves. How can we ensure they never, ever, have power over us? Russell suggests that we can rebuild AI on a new foundation, according to which machines are designed to be inherently uncertain about the human preferences they are required to satisfy. Such machines would be humble, altruistic, and committed to pursue our objectives, not theirs. This new foundation would allow us to create machines that are provably deferential and provably beneficial.
Shermer and Russell also discuss:
- natural intelligence vs. artificial intelligence
- “g” in human intelligence vs. G in AGI (Artificial General Intelligence)
- the values alignment problem
- Hume’s “Is-Ought” naturalistic fallacy as it applies to AI values vs. human values
- regulating AI
- Russell’s response to the arguments of AI apocalypse skeptics Kevin Kelly and Steven Pinker
- the Chinese social control AI system and what it could lead to
- autonomous vehicles, weapons, and other systems and how they can be hacked
- AI and the hacking of elections, and
- what keeps Stuart up at night.
Stuart Russell is a professor of Computer Science and holder of the Smith-Zadeh Chair in Engineering at the University of California, Berkeley. He has served as the Vice-Chair of the World Economic Forum’s Council on AI and Robotics and as an advisor to the United Nations on arms control. He is a Fellow of the American Association for Artificial Intelligence, the Association for Computing Machinery, and the American Association for the Advancement of Science. He is the author (with Peter Norvig) of the definitive and universally acclaimed textbook on AI, Artificial Intelligence: A Modern Approach.
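As a footnote on the "values alignment" and deference ideas above, here is a toy numerical sketch in Python. It is my own illustration of the general intuition, not anything taken from the book or the episode: an assistant that is uncertain about the human's preference for an action can expect to do better by deferring (letting the human veto), and that advantage disappears as its uncertainty about those preferences shrinks to zero.

# Toy illustration (my own, not from the book): why uncertainty about human
# preferences makes deference valuable. The machine models the human's utility
# u for an action as Normal(mean, std). The human knows u and will only approve
# the action if u > 0, so deferring is worth E[max(u, 0)], which is never less
# than acting unilaterally (worth E[u] = mean).
import math

def value_of_acting(mean):
    return mean

def value_of_deferring(mean, std):
    # E[max(u, 0)] for u ~ Normal(mean, std)
    if std == 0:
        return max(mean, 0.0)
    z = mean / std
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return mean * cdf + std * pdf

mean = 0.5   # the machine thinks the action is probably good for the human
for std in (2.0, 1.0, 0.5, 0.0):
    act = value_of_acting(mean)
    defer = value_of_deferring(mean, std)
    print(f"uncertainty={std:.1f}  act={act:.3f}  defer={defer:.3f}")
# With high uncertainty, deferring clearly beats acting unilaterally; with zero
# uncertainty the two are equal, and the machine gains nothing from asking.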
Tuesday, June 02, 2020
A Natural & Less Taxing Option
by Tom Venuto
Many people believe that walking isn't intense enough exercise to do your health much good, and that you'd get more benefit if you ran. In a similar vein, many fitness enthusiasts believe that unless cardio is high intensity (like interval training or sprints), it's not beneficial (the "high intensity or nothing" mentality).
Yes, it's true: high intensity interval training (HIIT) can give you major improvements in health and cardiovascular fitness in a short period of time. But does that mean lower intensity exercise like walking is not effective at all? Not according to a study from Lawrence Berkeley National Laboratory, which I wanted to share with you in today's newsletter.
The study found that walking briskly can lower your risk of high blood pressure, high cholesterol, and diabetes (all risk factors for heart disease) just as much as running, and in some cases more, and that the benefits increased in a dose-response manner.
There's great confusion to this day about the benefits of exercise - including at various intensities (low vs moderate vs vigorous). An even bigger problem is that the majority of people confuse health benefits, cardio fitness benefits, and fat loss benefits (three entirely different subjects!).
This study looked at health benefits, specifically risk factors for cardiovascular disease in subjects 18 to 80 years old, clustered largely in their 40s and 50s. They found the following:
Running reduced the risk of first-time hypertension by 4.2 percent; walking reduced it by 7.2 percent.
Running reduced first-time high cholesterol by 4.3 percent; walking by 7 percent.
Running reduced first-time diabetes by 12.1 percent, compared to 12.3 percent for walking.
Running reduced coronary heart disease by 4.5 percent, compared to 9.3 percent for walking.
Isn't that fascinating? They said benefits were "similar" but if you look at the numbers, walking was actually better.
Of course, the devil is in the details: they weren't comparing equal amounts of time spent, like 30 minutes of walking vs 30 minutes of running (running would win that comparison). They were testing whether equal amounts of energy (calorie) expenditure from moderate-intensity walking and vigorous-intensity running provided equivalent benefits.
When energy expenditure was matched between moderate exercise (walking) and vigorous exercise (running), the health risk reductions were similar. But that does mean walking requires a much larger time investment to reach the same energy expenditure you'd get with running or other higher-intensity options.
In other words, high intensity exercise is more efficient than moderate intensity exercise, and moderate intensity exercise is more efficient than low intensity exercise. So, contrary to what many people believe, it's not that walking doesn't benefit your health as much as higher intensity exercise; it's simply not as time efficient.
It's the same thing if we shift the subject to fat loss. Low, moderate, and high intensity cardio can all burn fat, but the higher the intensity, the more efficient the exercise is (less time required). This is why people who are short on time and are physically able to do intense exercise often choose the higher or moderate intensity cardio over lower intensity cardio.
The thing is, high intensity cardio is difficult and not appropriate for everyone. The author of the study, Paul Williams, pointed out that walking may be a more sustainable exercise for many people when compared to running.
He said that people are always looking for an excuse not to exercise ("I can't run or do HIIT because of my knee, my hip, my foot," and so on), but thanks to research results like these, we now know we can get the same health benefits (and fat loss benefits) from walking as we do from running or other types of intense cardio. It will simply require more time: you have to walk longer. For example, you might have to walk 60 minutes to get the same benefit as 30 minutes of more intense exercise.
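To put a rough number on that time trade-off, here's a quick back-of-the-envelope sketch in Python. The MET values are commonly cited estimates for brisk walking and easy running, not figures from the Berkeley study, and the body weight is an arbitrary example:

# Rough sketch: how long you'd have to walk to match the energy expenditure
# of a 30-minute run. MET values are typical published estimates (assumptions),
# not numbers from the study.
MET_WALK_BRISK = 4.3   # walking briskly, roughly 3.5 mph (assumed)
MET_RUN = 9.8          # running roughly 6 mph, a 10-minute mile (assumed)

def calories_burned(met, weight_kg, minutes):
    # Standard approximation: kcal = MET x body weight (kg) x hours
    return met * weight_kg * (minutes / 60)

weight_kg = 80          # example body weight
run_minutes = 30
run_kcal = calories_burned(MET_RUN, weight_kg, run_minutes)

# Minutes of brisk walking needed to burn the same number of calories
walk_minutes = run_kcal / (MET_WALK_BRISK * weight_kg / 60)

print(round(run_kcal))       # about 392 kcal from the run
print(round(walk_minutes))   # about 68 minutes of walking to match it

Under those assumptions you'd need roughly twice the time walking, which lines up with the 60-versus-30-minute rule of thumb above.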