You likely use plastic so much that you don’t even notice it. That’s unfortunate, because plastic is terrible for the environment. Since it’s non-biodegradable, plastic just sits in landfills, waterways, and oceans. While some plastic can be recycled, the majority of plastic waste will be around forever. To cut down on single-use plastics (one of the largest sources of waste in the United States), here are some tips to help you break your dependence on plastic.
1. Get a reusable water bottle.
Bottled water is extremely convenient. It’s also very wasteful. Plastic bottles are so numerous that there’s actually a huge island of plastic trash in the Pacific Ocean called “The Great Pacific Garbage Patch.” This infamous island is about the size of Texas and continues to grow.
Plastic bottles are especially wasteful when we consider that we have clean drinking water all around us that is as pure as most bottled water options available. To break your plastic water bottle habit, simply carry around a reusable water bottle and refill it at drinking fountains.
2. Bring your own bags to the store.
People are now realizing just how wasteful plastic bags actually are. Some cities have banned them completely, while others charge customers who choose plastic over paper. That’s a good thing, because there are affordable alternatives for getting your groceries home.
Aside from selecting paper bags at checkout, another option is to bring reusable tote bags to the store. You can keep some stored in your car for those spontaneous after-work trips to the supermarket so that you will always have at least one or two at your disposal. Just one tote bag can potentially save you from taking three or four plastic bags and eventually disposing of them in your garbage can.
3. Bring metal cutlery to work.
We don’t think about it, but that plastic fork, knife, and spoon are all wasteful. The problem is, at work, some people don’t have any other options.
That’s easy to fix, though: just bring some metal cutlery and keep it at your desk. At lunch, you can use it and wash it off in the break room. Plastic cutlery may not be a large source of waste on its own, but thrown away daily, it adds up.
4. Bring your lunch in reusable steel containers.
Sandwich bags are convenient, but they aren’t good for the environment. Fortunately, there’s an alternative that will keep your food fresh and can easily be washed and reused.
Stainless steel food containers are durable, simple to clean, and can be used again and again. You can also use plastic Tupperware, but stainless steel is a much better choice.
5. Cut down on your online shopping.
Shopping online is something that has become routine for most of us. If we need something, we just hop onto Amazon and have it delivered to our house within a week. While this is a win for those of us who value convenience, it’s becoming a major source of waste in our environment.
How many times have you bought something small only to have it arrive in a much larger box? And how many times have you bought multiple items, only to have them shipped in multiple boxes? All of that waste adds up. While cardboard can be recycled, it may be best to travel to your nearest brick-and-mortar store and avoid that unnecessary box altogether.
“Measles,” according to the CDC, “is a highly contagious virus that lives in the nose and throat mucus of an infected person… If other people breathe the contaminated air or touch [an] infected surface, then touch their eyes, noses, or mouths, they can become infected.”
“Measles is so contagious that if one person has it, 90% of the people close to that person who are not immune will also become infected.”
This terrifying transmission rate has snowballed into a dire situation that has caused Germany to consider fining families who refuse to vaccinate their children.
“Measles typically begins with high fever, cough, runny nose (coryza), and red, watery eyes,” writes the CDC.
After a couple of days, “Tiny white spots (Koplik spots) may appear inside the mouth,” followed by the breakout of a rash, which travels from the infected individual’s head all the way down to their feet — often accompanied by a high fever.
“After a few days, the fever subsides and the rash fades.”
Hasn’t the measles vaccine been around for more than 50 years?
The first measles vaccine was released in 1963, and this life-saving shot is still commonly administered as part of the MMR (measles, mumps, and rubella) vaccine.
“In the decade before 1963 when a vaccine became available,” the CDC shares, “nearly all [American] children got measles by the time they were 15 years of age.”
Millions of Americans were infected each year and hundreds died from the disease. Tens of thousands required hospital care, and some suffered serious complications, like encephalitis (swelling of the brain), as a result of their measles infection.
After decades of hard work and near universal immunization, scientists in the US declared victory over measles in the year 2000.
While doctors were winning the battle with measles in the United States in the late twentieth century, European physicians were making similar progress.
An ally to measles has been gaining power recently, though: Ignorance.
Measles has returned to the United States and was never completely eliminated from Europe because some individuals began to believe rumors that immunizations were potentially harming the development of their children. There was no reputable science behind these beliefs, but many people began choosing to raise their children without vaccinations.
As a result of these misleading rumors, the US saw 667 reported cases of measles in 2014, which remains the highest annual total since the disease was declared eliminated in 2000.
Germany experienced its own massive outbreak in 2015, when there were 2,466 cases of measles in the country, up more than fivefold from the previous year’s total (443). Sadly, the 2015 outbreak also included the death of an 18-month-old child.
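That fivefold figure follows directly from the two case counts reported above (a quick check using only those numbers):

(2,466 − 443) ÷ 443 ≈ 4.6, an increase of roughly 460 percent in a single year.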
Time for action.
German health minister Hermann Gröhe was outraged by the child’s death: “The irrational fearmongering of some vaccination opponents is irresponsible,” he told The Guardian. “Anyone who refuses their child protection endangers not only their own child but others as well.”
Leaders in Germany’s parliament began calling for the mandatory vaccination of infants. Finally, this year, a bill calling for action on measles was introduced in Germany’s lower house of parliament.
Part of the bill calls for fining parents up to €2,500 ($2,800) if they “fail to seek medical advice on vaccinating their children.”
“Under the plan,” according to the BBC, “the children of parents who fail to seek vaccination advice could be expelled from their daycare centre” as well.
There are some lingering questions about enforcing this law, but the bill is headed to the upper chamber of Germany’s parliament. “The law,” writes the BBC, “is expected to be adopted next month.”
In 2016, Germany returned to a somewhat more reasonable 326 cases of measles, but in just the first three months of 2017, they saw 411 cases, stirring the government to action.
Is it ethical for a government to force its citizens to seek medical care? Is it ethical to deny children proven medical care, exposing others to risks? These are tough questions that Germany is hoping to address with this forthcoming law.
Pippa Middleton has one of the most famous figures in the world. This petite brunette leads a healthy, active lifestyle that focuses on eating the right kinds of foods in order to maintain her weight.
But with so many fad diets, juice cleanses, and other options for healthy eating, many people have wondered just what Pippa eats to stay so slim and fit. The answer might surprise you.
Kate Middleton, Pippa’s older sister, swears by the Dukan diet. This high-protein, low-carbohydrate diet seeks to identify essential foods that ensure a person gets all the nutrients their body needs throughout the day. While this works for her sister, Pippa takes a different approach to staying healthy.
Because Pippa leads such an active lifestyle, she actually eats more calories than people might expect. Her diet changes depending on any activities she may have coming up. If she’s preparing for something more strenuous, she makes sure to get the nutrients her body needs to store the energy she’ll need to compete.
In general, Pippa believes that everything should be done in moderation. She tries to eat a healthy, balanced diet that focuses on portion control. She also tries to get regular exercise to keep her body healthy.
She does not, however, deprive herself of what she calls “the naughty stuff.” In a recent interview with Hello! magazine, Pippa revealed that she’ll eat some chocolate on occasion, enjoy a glass of wine from time to time, or even eat some potato chips. The trick is to limit those “naughty” items instead of overindulging.
Along with her balanced diet, Pippa tries to get regular exercise. While she’s a very busy woman, she strives to get between three and five exercise sessions each week. That could mean a thirty-minute run, a game of tennis, or a swim. She says this exercise “boosts my mood and energy, and helps me sleep and digest better…so I make it a priority, whatever the weather or my work schedule.”
If she has a big activity coming up, such as the Birkebeiner Ski Race, a 33-mile cross-country ski race in Norway that she recently completed, she’ll adjust accordingly. The week before a race, she’ll generally cut out any sweets or alcohol and eat three balanced meals a day. She focuses on eating whole grain, high-energy carbohydrates to ensure that her body is ready for the latest endurance test.
As for what she eats specifically, Pippa says it includes “lots of brown rice, lentils, quinoa and sweet potato, and for breakfast, porridge and rye toast.” Those are excellent sources of healthy energy for anyone, but especially someone looking to push their body.
If you’d like a body like Pippa’s, try following a similar routine. Make sure you’re getting enough exercise (150 minutes of moderate exercise, 75 minutes of vigorous exercise, or a combination of the two per week), focus on eating three healthy meals a day built around nutrient-rich foods, and keep the unhealthy habits to a minimum. If you can follow this simple plan, you should begin to see results before long.
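If you want a concrete way to tally a mixed week, public health guidelines generally treat one minute of vigorous activity as roughly equivalent to two minutes of moderate activity, so a week “counts” when:

moderate minutes + (2 × vigorous minutes) ≥ 150

For example, two 30-minute runs (vigorous) plus one 30-minute leisurely swim (moderate) works out to 30 + (2 × 60) = 150 moderate-equivalent minutes, right in line with Pippa’s three to five weekly sessions.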
When we were kids, we maybe had a pager. Then a cell phone in high school. The most advanced social media we had access to was texting, and it usually cost a quarter a text.
Those days are long gone, and today’s young people are growing up in the hashtag generation, with “challenges” that promote self-harm in ways we (fortunately) never thought of.
The Eraser Challenge
Some people do things that are silly, yet for a good cause, like the ice bucket challenge that helped raise awareness for ALS. But there are other people who come up with dangerous ideas for no real reason at all.
Thus, we give you “the eraser challenge,” in which kids rub an eraser vigorously on their arms while reciting something like the alphabet.
The result? A nasty burn on their skin and an Instagram post that all their friends will like and comment on. No charity. No good cause.
It’s painful and can become pretty dangerous if you erase enough skin to break through and bleed. “Anytime the skin barrier is broken down, there is an increased risk of skin infections,” Dr. Angela Mattke, a pediatric and adolescent specialist at the Mayo Clinic, told USA Today.
“Burns, whether from heat or chemicals, result in a break of the natural skin barrier. The skin barrier’s job is to keep bad things out like potential infection causing bacteria (that live normally on the skin).”
The activity might seem dumb—and it is. But it can be much worse if you get an infection, especially if you’re sharing erasers with other kids who have done the same thing. Purell won’t stop this one, but it will surely make it sting.
Salt and Ice
For us to fully understand the danger of this challenge, we have to remember a little science.
Thanks to the internet’s wealth of informative resources (and because high school was unfortunately a long time ago), we can tell you this about salt and ice:
“When salt is added to ice it lowers the melting point. In other words the ice begins melting at a temperature lower than 0° C. This is why salt is added to ice on the roads in the winter. It causes ice that would have otherwise remained as a solid in sub-zero temperatures to turn to water. Note that the temperature of the water has not changed. It’s still at a sub-zero temperature but, as mentioned above, the salt allows it to remain as a liquid at the lower temperature. Don’t think that just because the salted-ice has become water the temperature has risen.” – Stuff Explained
Now back to the challenge. Kids are taking an ice cube, putting salt on their arm, and then letting the ice cube melt on it. As we just learned, the salt lowers the melting point so the ice melts more quickly, but the resulting salt water stays at the same sub-zero temperature.
Therefore these kids are getting frostbite, which is effectively the same as burning themselves. The kicker is that the ice is numbing them while it’s happening, so they don’t realize how bad the burns are at first.
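For the science-minded, a rough back-of-the-envelope calculation shows just how cold that salty slush can get. Freezing-point depression for a dilute solution follows a standard formula (the concentration below is purely illustrative, not taken from the challenge videos):

ΔT ≈ i × Kf × m

Here i is about 2 for table salt (it splits into sodium and chloride ions), Kf is 1.86 °C·kg/mol for water, and m is the salt concentration in moles per kilogram of water. A heavily salted mix at around 4 mol/kg gives ΔT ≈ 2 × 1.86 × 4 ≈ 15 °C, so the slush can sit as liquid at roughly −15 °C, and fully saturated brine stays liquid down to about −21 °C. Pressed against bare skin, that is easily cold enough for frostbite.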
This trend was so bad that the National Society for the Prevention of Cruelty to Children in the United Kingdom issued a warning to parents about the craze. At least we can keep an eye on these kids to a certain extent, because they feel like they have to upload the challenge to social media or it never happened. Doesn’t this make you happy we grew up without it?
The Pass Out Challenge
You don’t need to know a lot of science to understand that this one is potentially deadly.
This ridiculous trend involves kids filming themselves choking or restricting their breathing ability to the point that they pass out.
This is such a disturbing fad. And with today’s added pressure of social media, some young people are trying this alone and then recording the whole thing. So we are seeing too many heartbreaking cases in which kids are literally filming their own death.
This is what happened with 11-year-old Davorius Gray, whose mom, Latrice Hurst, found his dead body in his bedroom. “If I could rewind time, I would go back and heavily monitor his use of social media, YouTube and the Internet,” she said. “I don’t believe young people should be on social media and it should be limited to adults, or at the very least, with extreme adult supervision—where the parents can see everything that takes place on the sites—should be a requirement.”
It’s crucial that parents heed this warning: these challenges are not just harmless pranks, and kids’ internet and social media activity should be closely monitored.
The Duct Tape Challenge
This is another challenge that doesn’t require any knowledge of science, but you do need a little “help” from your friends. The problem is that if your friends are dumb enough to wrap you up in duct tape like a broken leg of a chair at a frat house, then they’re probably not going to be much help if something bad happens.
Just ask the family of Skylar Fish, who did the duct tape challenge and wound up in the hospital. His friends had done this challenge before. Usually they tape the person to a pole, but this time they taped Skylar standing up.
The problem is that if you fall, like Skylar did, then your hands are still taped to your sides. So the only thing to break your fall is your head. Skylar fell into the corner of a window ledge and then hit the concrete.
The result: “It crushed his whole eye socket and pinched off nerves in his eye. It’s unknown whether Skylar will regain vision in that eye. He now not only has a brain aneurysm from his head slamming into the concrete, but he also had to get 48 staples,” according to his mom, Sarah Fish.
Skylar at least has found some purpose after the tragic events, saying he wants to “teach other kids not to do it. When I think about it, I become sad and then really happy, because I’m happy, because I survived it. I almost died.”
Fire Spray Challenge
The name alone makes you question just what goes through the mind of a teenager. We are sure that at some point everyone has taken a can of hairspray and a lighter and made a fireball. But this challenge has taken that one step further.
The hairspray + fire + indoors combination is dumb enough, but in some instances kids are covering themselves with the hairspray then lighting themselves on fire and jumping into a pool.
As one worried father told the Daily Mail after he discovered his sons had been doing this: “I felt sick and worried when I heard what they’d done. It could easily have been a phone call saying my sons are in hospital or they’ve died. They have been punished more than I have ever punished them before. It’s shocking that kids are able to watch videos like that.”
We think the word shocking is a perfect description of these dangerous fads. We hope to see more schools start educating students on the harm these “challenges” can cause.
This just in, y’all: TV and Hollywood films are not the best place to get facts about mental illness (or probably anything). Whether you’re basing the entirety of your perception of eating disorders on a Lifetime series or you think you know everything there is to know about schizophrenia because you’ve seen A Beautiful Mind three times, you should be aware that you do not have the full picture.
Stereotypes being perpetuated through popular media is nothing new. Edify yourself, and read on for eight psychological “facts” about mental illness we’ve been fed by television and movies:
People with Tourette’s syndrome cuss all the time.
When you hear Tourette’s syndrome, you probably think of one particular feature for which it’s become famous: letting out strings of extremely offensive curse words, usually at inopportune times. For an example, you could watch this YouTube clip of this dude in a courtroom from a 1989 episode of the television series LA Law.
Contrary to popular belief, though, swearing isn’t always a feature of the neurological disorder. “[T]his only occurs in about 1 in 10 children with Tourette’s syndrome,” according to Patient. (The condition develops in people between ages 2 and 14, typically around age 7.)
As Patient also notes, “it must be emphasised that if this occurs, the child cannot help swearing. It is not a reflection on their moral character or upbringing.”
Other tics include eye blinking, touching or smelling objects, head jerking, repeating observed movements, shoulder shrugging, stepping in a certain pattern, eye darting, obscene gesturing, nose twitching, bending or twisting, mouth movements, or hopping.
You can talk yourself out of schizophrenia.
A Beautiful Mind is a moving depiction of an incredibly gifted mathematician, John Nash, whose life is altered indelibly by schizophrenia. Although the biopic’s plot does in some ways work against the stereotype of schizophrenics as always violent and erratic, it’s still not the whole story for most people who suffer from the illness.
“Regardless of the biographical exclusions, John Nash, in reality and in the movie, is unlike other schizophrenic patients,” Roberto Gil writes for In Vivo: News from Columbia Health Sciences. “He has a superior intellectual capacity, while most schizophrenic patients suffer from impaired cognition.”
Additionally, Nash is depicted as eschewing medication and instead reasoning his way out of his hallucinations. Gil notes:
“Deprived of medications and treatment, many schizophrenics lose, if they ever had them, jobs, family, friends, financial stability, and homes. It is not just a coincidence that homelessness is so common among schizophrenics. The real and fictional Dr. Nash kept ties with professionals, family, and friends because they were very tolerant of his symptoms.”
Treatment is evil.
The conversation about overmedicating is one certainly worth having, specifically in a culture that pathologizes the natural complexity of human emotion, often in gendered ways. But there remains a glut of misinformation about mental illness and its treatments, which results in many people who would greatly benefit from medication never receiving help.
One of the more damaging misconceptions is that medication is only for the weak. On par with this is the idea that medication fundamentally changes who you are. As Angelica Jade Bastién writes in Vulture:
“Out of all the tropes on this list, this is the most dangerous. Treatment varies from person to person, of course, but the idea that medication robs you of your personality is odious. Contrary to what’s often shown on TV, psychiatrists and mental-health professionals aren’t manipulative villains or incompetent caregivers.
“These claims contribute to the fear that prevents people from finding the right treatment. In recent decades, TV shows like Monk, Pretty Little Liars, Ally McBeal, Star Trek: The Next Generation, and Buffy the Vampire Slayer have all contributed to this trend.”
OCD sufferers are all terrified of germs.
Obsessive compulsive disorder (OCD), for those who don’t actually know someone with the disorder (and perhaps some who do), is synonymous with a fear of germs, an obsession with order and cleanliness, and a propensity for counting.
Although these traits certainly do show up in some OCD sufferers, this is only a very superficial understanding of the condition, which is defined by unfounded anxieties that can attach themselves to any topic and resultant attempts to quell that anxiety through irrational compulsive behaviors or rituals.
Depictions of characters such as the protagonist of the award-winning show Monk, whose fears include germs, needles, milk, death, snakes, mushrooms, heights, crowds, and elevators, focus more on external behaviors without taking a deep dive into the OCD sufferer’s inner world. Plus, as Dr. Suck Won Kim, associate professor of psychiatry at the University of Minnesota, tells the Chicago Tribune regarding Monk’s fears:
“Those are phobic disorders. They’re not related to OCD at all. Many of them are forms of agoraphobia. I’ve seen over 2,000 patients with OCD, and none of them has complained of having trouble going on an airplane.”
Sociopaths eat brains.
Speaking of tropes that perpetuate negative views of mental health professionals, Hannibal Lecter is a cannibalistic psychiatrist (fun!). The villain of The Silence of the Lambs is perhaps our favorite psychopath. Er, sociopath?
(There is debate about the distinction between a sociopath and a psychopath, but according to one person on the internet, there’s no diagnostic difference. We’re certainly not mental health professionals, but for the sake of simplicity, let’s just use the term interchangeably in this section.)
But is Lecter even really a psychopath? According to health, science, and tech reporter Rachel Feltman, no.
On top of that, despite what years of ingesting way too many MSNBC specials would suggest, doing something completely heinous like eating brains isn’t even a prerequisite for being a psychopath. As Feltman points out in Quartz:
“Sometimes, a psychopath can look a lot like your friendly neighborhood neuroscientist; James Fallon made headlines when he accidentally diagnosed his own brain scan as showing psychopathic features. After further research and self-evaluation, Fallon categorized himself as a ‘pro-social psychopath’—one who can keep his behavior within socially-acceptable bounds, despite not feeling true empathy for others. …At the end of the day, though, he’d ‘rather beat someone in an argument than beat them up.'”
Dissociative identity disorder is just like “Fight Club.”
Apparently, there’s a whole lot of confusion about this illness, which used to be called multiple personality disorder but is now referred to as dissociative identity disorder. Even the professionals are a bit unsure. As psychiatrist Jason Hunziker tells The Scope, University of Utah health sciences radio:
“There is so much controversy, even in the mental health industry about dissociative identity disorder. There are those that swear by almost the Hollywood version of what this looks like. And then there are others who say people clearly use dissociation to help protect themselves, and that’s kind of where I fall in line. I think that people use that mechanism to get out of a stressful situation, and they then have a different personality style that interacts with you during those moments that [their] real self is not present.”
According to one Cracked contributor, Fight Club doesn’t get it right, either. “Those with [dissociative] identity disorder don’t just wake up and realize they’ve been living as another person,” they write. “They always know about the other personalities, and don’t black out and live as another person. Amnesia and fugue states do happen, but what you see in movies is writers combining them to suit their narrative.”
Mentally Ill = Violent
Speaking of Fight Club, let’s talk about the other popular myth surrounding mentally ill people: They’re all violent. It just ain’t true.
“It will surprise most people—and disappoint Hollywood—but the fact is that the mentally ill are rarely violent and contribute very little to overall violence in the United [States],” Richard A. Friedman writes in Alternet. “It is estimated that only 3 percent to 5 percent of all violence in the country can be attributed to mental illness.”
Those who are much more likely than mentally ill folks to be violent are people who misuse drugs or alcohol. “The fact is that you have far more to fear from an intoxicated businessman in a suit than from a homeless schizophrenic man muttering on the street corner,” Friedman writes.
“People with no mental disorder who abuse alcohol or drugs are nearly seven times as likely as those without substance abuse to be violent, according to the National Institute of Mental Health.” (Then again, wouldn’t drug and alcohol abuse imply addiction, and isn’t addiction a mental illness?)
You can get over an eating disorder in a few days.
We can all agree that ‘90s television and culture have taught us a lot—like, for example, how to wear a denim hat every day for an entire summer with few or no social repercussions. (Mom, how did you let this happen?) What it did not teach us was how to recognize or treat an eating disorder.
Many have called into question the depiction of disordered eating in D.J. Tanner from Full House. In season four, D.J. begins starving herself to try and look thinner for an upcoming pool party, and then, within just a handful of episodes, the problem magically disappears. One viewer summarizes the nearly instantaneous resolution to the problem, which comes—surprise—during one of Full House‘s famous heart-to-hearts:
“…Danny tells D.J. that people come in all shapes and sizes, and that he himself struggled with body image issues growing up because he was so tall and skinny. Oh yeah, Danny, being a tall thin white man is a real hill to climb! Poor Danny! So anyway, Danny tells D.J. that it’s what’s inside that counts and that her friends shouldn’t judge her for looking terrible in a bathing suit and I guess that if more dads gave that same brief, ill-conceived speech then anorexia wouldn’t be such a problem.”
What Full House misses is that an eating disorder is a deadly mental illness that sufferers often struggle with for life, even with professional help. Heart-to-hearts are kind of beside the point.
Eye color, hair color, whether someone can roll their tongue or not—they’re all traits that are often said to come down to genetics. As it turns out, though, we’ve been oversimplifying the reasons behind these traits for decades.
For anyone who can roll their tongue, it seems almost unfathomable that someone couldn’t do this simple trick. It’s actually something that up to 81 percent of all people on this planet are able to do, after all. If it’s so easy, though, why are there still so many people who aren’t able to do it?
The answer that’s been passed around for decades is that it’s genetic, but as it turns out, that explanation isn’t exactly correct. And the truth is more complicated than you might realize.
Way More Complicated, Actually
The story usually goes something like this: People who can roll their tongues were lucky enough to have inherited an awesome (but ultimately pretty useless) tongue-rolling gene from their parents. Those who can’t, of course, are then free to bother their parents about it forever, wondering why they were destined for such a cruel fate.
However, according to John H. McDonald, a University of Delaware professor in the department of biological sciences, there’s more to it. “If that were true, you could never have two non-rolling parents that have a tongue-rolling kid,” he explained. “Yet people have looked at families and find you do see that.”
McDonald says this common explanation is grossly oversimplified. It originated with a study that took place in 1940, but its findings were debunked pretty quickly. “By the early 1950s, people knew pairs of twins where one could roll and one couldn’t,” he said.
“That pretty clearly tells you it’s not all genetic. Yet I ask even today my students, ‘how many of you have been told that tongue rolling is a simple genetic characteristic?’ and most raise their hands.”
So, what’s the truth?
Ever heard of nature vs. nurture? You know, how as you move through life, certain things are said to be influenced by your heredity and genes and others are influenced by your environment?
McDonald says it’s the true driving force behind why some people can roll their tongues with ease and others find it impossible. In some cases, he says something as simple as your positioning as a fetus in the womb could prevent the trait from developing, whereas others are able to overcome the odds and teach themselves how to do it.
Why has this inaccurate reasoning persisted, then? It’s not really clear, but McDonald wants to put an end to it. “It is an embarrassment to the field of biology education that textbooks and lab manuals continue to perpetuate these myths,” he said. “If students took it seriously, a large proportion of students would look at mom and dad and conclude that the mom was sleeping around and dad wasn’t really their dad.”
Believe it or not, it’s not just tongue rolling, either. As it turns out, this type of oversimplified explanation applies to a lot of things we’ve learned about throughout our lives.
Eye Color
Does anyone else have vivid memories of going over the Punnett square in high school while learning about genetics?
Eye color is one of those things that’s talked about a lot when it comes to making genetic predictions, and there are certain things we all think are true when it comes to this type of science. For example, it’s long been said that it’s impossible for two parents with blue eyes to end up with a kid whose eyes are brown.
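To see why the single-gene story forbids this, sketch the classic Punnett square from biology class, with B standing for the dominant brown version of the gene and b for the recessive blue one (this is the myth’s model, not how eye color actually works). Two blue-eyed parents would both be bb, and crossing them gives:

        b    b
   b   bb   bb
   b   bb   bb

Every child would be bb, and therefore blue-eyed, with no way to produce brown. That’s the logic of the myth, anyway.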
The Truth?
It’s actually very possible, because something like eye color isn’t determined by just one single dominant or recessive gene.

“Eye color is determined by variation at several different genes and the interactions between them,” McDonald said. “This makes it possible for two blue-eyed parents to have brown-eyed children.”
Hair Color
Hair color is usually another popular topic when it comes to discussing genes. Teenagers studying biology often try to figure out how they got the hair color they did or the color hair their kids would have if they ended up marrying their crush of the moment.
When it comes to red hair in particular, most people are under the assumption that red hair will always be a trait that runs in the family, and any baby with at least one redheaded parent will end up with it.
The Truth?
Although there is one main gene that controls whether someone will end up with red hair, it actually has quite a few different variations. Not only that, but it can be easily influenced by other genes, particularly the ones that would give someone brown hair.
Despite what you’ve heard, parents who both have red hair can definitely produce a child that’s either blonde or brunette.
Attached vs. Detached Earlobes
Most of us have probably heard that there are two main types of earlobes we can have—attached and detached.
An attached earlobe is one that’s actually connected to the side of your head, whereas one that’s detached is separated, causing it to dangle slightly. Like many other traits, it is said that one gene determines which you end up with.
The Truth?
There really aren’t just two categories of earlobes at all. Instead, there’s actually a wide range of how attached or detached a person’s earlobes can be.
Not only that, but it’s still not even completely clear which trait is considered to be more dominant than the other.
Hitchhiker’s Thumb
Yes, even something as simple as your thumb shape has been a topic of debate. Anyone who has a thumb that bends backward at the knuckle is said to have a hitchhiker’s thumb and everyone else is just…normal?
If you’ve got them, it’s frequently said that it’s because of a single variation within just one gene.
The Truth?
“If the myth were true, two parents with hitchhiker’s thumb could not have a child with a straight thumb,” McDonald said. Not only that, but there’s also no clearly defined line or angle that dictates what is a hitchhiker’s thumb and what isn’t. Everyone is different when it comes to flexibility, including their fingers, so it can be difficult to determine if someone even has the trait.
“It’s completely arbitrary where you draw the line between straight and angled,” McDonald added.
Hand Clasping
Yes, even the way you clasp your hands is often believed to come down to biology.
Apparently a single gene is responsible for whether you clasp your hands with your right or left thumb on top.
The Truth?
That’s really not the case. It’s been determined that even twins seem to have their own preferences when it comes to the position of their hands when they’re clasped. Not to mention that your preferences might change throughout life, and another position may feel more natural and comfortable at some point in time.
So, what’s the bottom line here? Although there are certain ways we can predict genetic traits, genes are actually way more complicated than we’ve been led to believe in our high school biology classes. For anyone interested in learning more, McDonald actually suggests studying cats instead of other humans.
“Cats do have a number of traits—long versus short hair, orange versus black hair, white boots or not—that are nice, simple, one-gene traits,” he said. Oh, and: “Everyone either has a cat or knows someone else’s cat,” he added.
In 2015, the city of Chicago faced a heartbreaking health crisis.
Not a single person ended up in the hospital due to this threat; no one missed work. Still, the suffering was unbearable.
This illness pulled a dirty trick. It went after man’s best friend.
The H3N2 flu virus doesn’t thrive in human bodies. It infects dogs and, more rarely, cats. The 2015 Chicago outbreak saw 1,000 pets brought low with fever, fatigue, and runny noses. In other words, this virus does to dogs pretty much exactly what the flu does to people.
Veterinarians were able to treat the flu victims in Chicago, and eventually, the epidemic passed. But the virus was only biding its time. In June 2017, the dog flu resurfaced, this time in the Sunshine State. As of this writing, 12 dogs have been diagnosed with canine influenza in Florida, and those numbers are almost certain to rise. Here’s what you need to know.
1. Is your dog at risk?
According to the American Veterinary Medical Association (AVMA), canine influenza is caused by a newer virus. As a result, dogs haven’t yet developed immunity.
“Almost all dogs, regardless of breed or age…, are susceptible to infection if exposed to the active virus,” the AVMA states on its website. “Virtually all dogs exposed to the virus become infected, and nearly 80 percent show clinical signs of the disease.”
2. How serious is dog flu?
The symptoms of canine influenza are similar to the flu we’re all familiar with. In its most common form, dog flu causes coughing, fever, lethargy, and runny nose. In rare cases, dog flu is accompanied by pneumonia and fevers of up to 106 degrees. That’s when the illness becomes truly dangerous.
3. Is the dog flu ever fatal?
Just like its human counterpart, canine influenza rarely leads to death. However, fatalities are not unheard of. The AVMA places the fatality rate of dog flu at less than 10 percent.
4. How long does dog flu last?
The vast majority of infected animals get over the illness in two or three weeks.
5. This latest outbreak is in Florida. Are dogs in other states at risk?
The short answer is “yes.” Since its U.S. debut in 2004, canine influenza has popped up in at least 40 states. The risk of infection rises considerably when dogs have contact with each other, as in kennels, dog parks, and animal shelters.
6. What can we do to protect our pets from the dog flu?
The best way to prevent infection is to prevent contact with a sick animal. The virus spreads through nose-to-nose contact between dogs, said Keith Poulsen, a veterinarian who teaches at the University of Wisconsin-Madison.
However, Dr. Cynda Crawford of the University of Florida College of Veterinary Medicine said in a news conference that the virus “is also spread very effectively and efficiently with contact through contaminated clothing, contaminated hands, contaminated objects in the environment.”
Talk to your veterinarian. If dog flu has been a problem in your area, your vet might recommend vaccinating your pet.
“The more dogs in a community that we can vaccinate to build up community immunity, the better chance we have of keeping that virus out,” Crawford said.
When T.S. Eliot wrote that “April is the cruellest month,” he must have been talking about allergies.
Just as the sun finally peeks out to warm our chilled bones, the air fills with pollen, dust, and a devil’s brew of allergens designed to make those of us who suffer from seasonal allergies want to go back to bed for a few months.
You can stuff pills down your gullet. You can walk around in a gas mask. But why do that when you can make a few simple changes to your diet for an allergy-free season? This summer, eat these foods to enjoy the sunshine without choking and dripping and being generally miserable.
1. Local Honey
Angels are real, and they are called bees. Not only do our buzzing buddies pollinate our crops for us, free of charge, they manufacture the world’s finest defense against seasonal allergies (maybe—more on that later).
The reasoning is pretty tidy. Local honey is made from local pollen. Local pollen is a serious allergen. Therefore, if you get your system used to the stuff by sampling bits of local honey early in the season, you won’t get all scratchy-voiced and stuffed up.
But does it really work? Maybe, says a doctor named Lawrence Rosen, who consulted with WebMD about foods that treat allergies.
“If you take small doses of honey early in the season, you may develop a tolerance toward pollen in your area,” Rosen said. The key word here is “may.” The jury’s still out, but it’s worth a try. Besides, honey is delicious.
2. Pineapples
Pineapples aren’t just kitschy cups for tropical beverages anymore. The fruit might also help keep your nasal passages clear during allergy season.
The fruit contains this stuff called bromelain, which is an enzyme that can soothe irritation caused by allergies. It might even help with asthma, Rosen told WebMD. Just don’t try to eat the spiny skin. They may be called “pineapples,” but you can’t bite into them like a non-pine apple.
3. Salmon and Tuna
You might recognize these fish from the semi-recent national obsession with Omega-3 fatty acids. Well, the “good” fats found in tuna, salmon, and certain other ocean residents are also good for toning down the effects of allergies.
It’s all about reducing inflammation, which Omega-3s do quite nicely, thank you.
4. Broccoli (Sorry)
Your mom was right. You should eat your broccoli. That goes double if you suffer from seasonal allergies. Broccoli contains a flavonoid called quercetin that can help to reduce histamine levels.
Histamines are the shock troops of the allergic response. They’re natural chemicals that your body releases when it senses allergens sneaking into your system. They make your eyes water, your nose run, and your skin itch, which is basically the trifecta of seasonal allergies.
5. Hot Peppers
If you thought broccoli was loaded with the histamine-busting flavonoid quercetin, you’ve got to try ancho peppers. These dried poblanos contain roughly nine times the amount of quercetin that broccoli can boast. Jalapenos and serrano peppers are also rich in quercetin, so get that hot sauce brewing!
Having a comfortable bedroom is essential. Not only is it the place you relax after a long day at work, but a comfortable bedroom can ultimately lead to a better night’s sleep. It’s no secret that getting a good night’s sleep helps us wake up refreshed, allows us to be more productive throughout the day, and aids with overall mental and physical health. If you’re struggling to catch those ZZZs, here are some indoor plants and flowers you should consider putting in your bedroom.
1. Snake Plant
Fresh oxygen will help make your room feel more comfortable and support a good night’s sleep. The charming snake plant releases oxygen into the room day and night, keeping the air fresh and helping you sleep soundly.
The best part of all? The snake plant is very resilient and easy to care for. These hardy plants can live in just about any light, making them suitable for any room. They also only need to be watered about once every two weeks, which makes them great for people without a green thumb.
2. Peace Lily
This plant acts as a natural air purifier, removing toxins, dust, and other debris from the air in your bedroom. This will help keep your allergies at bay and enable you to breathe easier as you’re sleeping.
The peace lily is a good low light plant, meaning it can be hung just about anywhere in the bedroom. It only needs to be watered once a week, but if it’s near a window or your room is especially warm, you may want to water it twice per week.
3. Pink Jasmine
Pink Jasmine not only looks beautiful and makes the room more inviting, it also produces a pleasant scent. This scent serves as a natural relaxer, making it easier to fall asleep when you climb into bed.
However, this pretty plant requires a little more care than others on this list. It does well in a sunny spot and should not be overwatered. It’s also best to buy it already potted. Fortunately, pink jasmine is relatively easy to find at most garden stores.
4. Gardenia
Like the peace lily, this plant also produces a lovely scent, making your bedroom smell clean and fresh. The aroma also has a sedative-like effect, helping you feel naturally tired as you unwind in bed. If you’re someone who tosses and turns, this plant can be a miracle worker.
The only negative aspect of the gardenia is that it may be difficult for some people to keep alive. The gardenia requires bright, indirect light. If it gets too much direct light, the leaves can burn. If your room doesn’t have a spot like the one previously described, the gardenia may not be the best option for you.
5. Lavender
Lavender has been shown to have a naturally relaxing aroma. This is why you’ll find many laundry detergents, scented candles, and air fresheners with this incredible scent. While dried lavender is effective, live lavender gives off a much stronger fragrance.
This plant does well in direct sunlight, so find a spot on a windowsill where it can bask in the sun all day long. You can water as needed. Sweet dreams!
Dementia is one of the worst health conditions a human being can face. It robs us of our ability to function, muddles our cherished memories, and can eventually become terminal. While there is, unfortunately, no known cure for dementia, there are ways to slow its progress. Here are some warning signs that all women should be aware of. If you notice any of these symptoms, talk to a medical professional as soon as possible.
1. Short-Term Memory Loss
We all have moments when we struggle with our short-term memory. Things like forgetting why you walked into a room, where you just set down your car keys, or what that person just told you their name was, and other similar short-term memory lapses, happen to us all.
However, this memory loss becomes a serious problem when situations like the ones mentioned above happen multiple times throughout the day. If you find that you’re becoming more and more forgetful, this could be an early sign of dementia.
2. Difficulty Choosing Words
Language skills come naturally for most people. Because it’s so natural, we often speak without really thinking about the words we choose. When someone has to stop mid-sentence to find the words they want to use, that’s a warning sign.
This is especially true if someone is struggling to recall common words or a word that they use all the time. If you find that you’re rummaging through your brain for words you use daily, talk to your doctor right away.
3. Problems With Concentration
Focus and concentration actually require higher-order brain function. They seem second nature, so we don’t normally think of them that way. When someone has dementia, however, focus and concentration become extremely difficult.
If you find that you’re having difficulty making plans, struggling with numbers, or noticing that following your favorite TV show or game is harder than ever, that might be a sign of dementia.
4. Difficulty Understanding Time
Time is something we take for granted. It’s natural for us to understand the function of time and its role in our lives. People with dementia lose the concept of time.
Often, those in the early stages of dementia will lose track of time, their minds wandering off without paying attention to the clock. If you find that time is passing by very quickly or that you’re having difficulty measuring time in your head, you should speak with your doctor.
5. Difficulty Recognizing Where You Are
Like time, we often take spatial relationships for granted. We know our homes, our workplaces, and other familiar spaces like the back of our hands. With dementia, these common places suddenly become foreign.
If you find that you don’t recognize rooms in your home, forget how to get to work, or easily forget where you are, that could be due to a decline in mental function.
6. Lethargy
People in the early stages of dementia often become depressed about their condition. This depression can eventually lead to lethargy.
This in and of itself is not a sign of dementia, but when combined with other factors, could be. If you’re lethargic and depressed, talk to a doctor to try to identify the underlying health problem causing it.