
Six of the Most Common Health Myths and Misconceptions

There are tons of myths and misconceptions that people believe. Some, like Bigfoot being real, are outlandish, while others, like French fries originating in France, are more plausible. In the health world especially, 'facts' are thrown around and sometimes believed for years on end before they are debunked. While there are many more, the following are six common myths and misconceptions about healthcare and personal health that quite simply aren't true.


“Feed a cold, starve a fever”

The age-old adage “Feed a cold, starve a fever” is something most kids hear from their mother when they're home sick. Despite being a common saying, there is almost no modern evidence supporting it, and studies have actually shown it to be false.

The origins of the saying are unknown, but the main idea behind it has been attributed to the Greek philosopher Aristotle. Aristotle hypothesized that the more food someone ate, the higher their body temperature would be, comparing eating to adding wood to a fire. Since a fever involves your body temperature rising, by Aristotle's reasoning, if you ate less, your body temperature would go down.

Aristotle's idea gained steam as the centuries rolled on, with more and more physicians endorsing 'starving a fever'. By the 19th century, withholding food from patients was considered a fairly normal way to treat a fever. However, not everyone was convinced. One American physician, Charles Gatchell, who opposed 'starving a fever', reported that patients were actually dying of starvation under the treatment.

Fast forward to the 21st century, and evidence has surfaced that eating actually boosts the immune system's ability to fight off illness. A Dutch study revealed that after subjects ate a meal, their levels of gamma interferon, a messenger that triggers T cells to destroy infected cells, increased. By contrast, when subjects drank nothing but water, their gamma interferon levels remained constant. All of this suggests that, in stark contrast to what your mother may have told you, eating actually helps you fight off a fever.

So, the next time one of your loved ones is home with a fever, remember the saying: Feed a cold, feed a fever.

Being out in the cold will give you a cold

The common cold got its name from the belief that people caught one because they ventured out in the cold. However, in recent times this belief has been quashed by numerous studies revealing that cold temperatures have no direct impact on catching the cold virus.

Ironically, it's escaping from the frigid weather outdoors that causes the winter spike in colds. Whether people are huddling around a heater for warmth or simply spending more time indoors, the increased contact between people during winter raises the chances of the virus spreading from one person to the next.

The study that confirmed outside temperatures have no effect on whether or not you catch the common cold virus compared the infectivity, illness severity and antibody response of the virus, among other things, in volunteers. Some volunteers were tested in a 4 degree Celsius room while others sat in a 32 degree bath, and observations about both sets were made.

The experiment concluded that there were no noticeable differences between the viruses in the two situations, and that the cold didn't make people more or less vulnerable to catching a cold.

Sugar makes kids hyper

Many parents dread their kids eating sugary snacks for fear that their child will become hyperactive. A number of scientific studies have tackled the commonly held belief that sugar makes kids hyper, and all revealed that there is no discernible connection between sugar and hyperactivity.

One of these studies involved giving a group of children a sugar-free beverage. The group was split in half, with the parents of one half being told that their kids had just consumed a sugary beverage. The parents were then asked to make observations about their kids' behavior. The parents who had been told their child had consumed a sugary beverage reported that their kids were hyperactive, even though the kids hadn't had a sugary drink at all.

If anything, sugar causing hyperactivity seems to be a bit of a placebo effect. We've been told for so long that sugar makes us hyper that, even though it isn't true, we've convinced ourselves it is. When kids were given a sugary treat and told that sugar made them hyper, they exhibited hyperactive behavior. On the flip side, when they were given sugar but told that it has no effect, they behaved totally normally.

In some cases, specifically in the case of soda, it could be the caffeine that causes hyperactivity rather than the sugar. Scientists have also pointed out that the context of when a child consumes sugary substances is important. Kids normally eat things high in sugar at birthday parties, while on holiday, or at celebrations where the atmosphere causes them to become overly excited.

So even though sugar isn’t the cause of hyperactivity, it does have many other downsides and still shouldn’t be eaten in copious amounts.

Only 10 Percent Of Our Brain Is Being Used

It has long been said that humans only use 10 percent of our brains, and that huge amounts of knowledge and insane abilities lie hidden away in the other 90 percent, waiting to be utilized. While it certainly makes for awesome fantasies and ideas, if we truly only used 10 percent of our brains, we'd all be dead.

The whole concept that we use only a fraction of our brains stems from the work of Harvard psychologists William James and Boris Sidis. In the 1890s, they tested their reserve energy theory through the accelerated upbringing of William Sidis, a child prodigy, who reached an adult IQ of 250-300. They then spoke of people using only a fraction of their mental potential, and sparked the 10 percent theory.

The theory has been demolished by scientists, who point out that the parts of our brain all depend on each other; even if only a small part were damaged, we'd be negatively impacted. Scientists also note that the average size of the human brain has increased over time, something that wouldn't have happened if we weren't using our brains to their full potential.

The “Five Second” Rule

The five second rule is simple: if you drop something on the floor and pick it up before five seconds are up, then it's safe to eat. It's a rule that almost everyone has followed at least once in their life, but recent research has revealed that the five second rule isn't as straightforward as it seems.

Obviously, the longer anything is on the ground the more bacteria it will pick up, but as researchers at Manchester Metropolitan University discovered, the amount of bacteria different foods pick up in a short period of time differs.

For certain items, like processed food or a biscuit, dropping it on the floor and picking it up is no big deal, as they don't pick up much bacteria. With processed food, the high salt content gives bacteria less of a chance to survive, while the low water content of biscuits means they don't sustain any bacterial growth.

Other items fared a lot worse. In the case of dried fruit and cooked pasta, after just three seconds, signs of a potentially dangerous bacterium called Klebsiella were present. While the levels of bacteria were extremely low, the experiments still show that the five second rule definitely isn't the most trustworthy rule of thumb to follow.

Reading in the Dark Harms your Eyes

Reading under the blankets with a flashlight in hand is guaranteed to get any kid in trouble if discovered by their parents. In actuality, however, while reading in the dark does strain your eyes, it isn't detrimental to your eyesight in the long term.

While reading, your eyes have to focus on text. To do this, the muscle that controls the shape of your lens contracts, along with your iris, to keep the image focused on the retina. In low light, your eye adapts by doing a couple of things. Cells on your retina produce more light-sensitive chemicals, which detect light and transmit an electrical signal to your brain. Your iris muscles also relax, which opens your eye wider and enlarges your pupil, and the nerve cells in the retina adapt to working under low light.

Because what your eye does while reading differs from what it does while adapting to low light, when you read in low light your visual muscles receive mixed signals. On one hand, they're told to relax so they can take in the most light; on the other, they're instructed to contract so they can focus on the text. Your eyes therefore have to work harder to accomplish both tasks, and you invariably strain your eye muscles.

This strain can lead to headaches, sore eyes, and dryness, but while these may be annoying at first, they go away after a couple of hours. In the long term, there is no evidence that straining the eye muscles while reading at night has any negative impact on your or your child's eyes beyond the initial discomfort it causes.

So there you have it, our list of the six most common health misconceptions. If we've learned one thing while writing this, it's probably that Mom wasn't always right (but don't tell her that!). What misconceptions have you noticed, and how does our list help us understand how to "be" healthy? As always, we'd love to hear your thoughts in the comments.