Car crashes are a leading cause of death for children ages 1 to 14, and many tragedies can be prevented with appropriate foresight. According to the National Highway Traffic Safety Administration (NHTSA), a properly fitted car seat can help to ensure the safety of a child, but about 80 percent of parents use car seats incorrectly.
When you are choosing a car seat, it’s important to take your child’s age, height, and weight into consideration, along with your vehicle model. Although there are many options available, all follow the same general guidelines from the NHTSA, as required by law.
Here’s what you need to know.
Infants and toddlers: All children who are under the age of 2 or weigh under 35 pounds should ride rear-facing in the back seat of a vehicle.
Toddlers and preschoolers: When children reach the maximum weight limit for rear-facing car seats, they can ride facing forward. Most forward-facing car seats support children up to 80 or 90 pounds, depending on the model, and should always be placed in the back seat.
School-aged children: Belt-positioning booster seats should be used once a child exceeds the weight limit of their forward-facing car seat (around 90 pounds). Kids typically use a booster seat from about ages 8 to 12.
Older children: Most children will not fit an adult seat belt properly until they are at least 4’9″ tall and about 10 to 12 years old. Even then, children should keep riding in the back seat until they’re at least 13.
LATCH your kids in.
The Lower Anchors and Tethers for Children (LATCH) system is used to safely harness car seats in a vehicle. LATCH uses specific anchor points from the car and connectors on the car seat to ensure the safety of your child.
This system works well, but it does have a weight limit, which includes the weight of the car seat itself.
Many parents are unaware that the weight limit of LATCH isn’t just the weight of their child, and this can lead to miscalculations. Some car seats can weigh up to 20 pounds, so be sure to note the seat weight when choosing between products.
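If you want to sanity-check your own setup, the math is simple subtraction. Here is a minimal sketch in Python, assuming a hypothetical combined LATCH limit of 65 pounds; the numbers are for illustration only, and your vehicle and car seat manuals have the figures that actually apply to you.

```python
def max_child_weight_for_latch(combined_limit_lb, seat_weight_lb):
    """Heaviest child (in pounds) who can still be secured with the LATCH
    anchors, since the limit covers the child AND the car seat together."""
    return combined_limit_lb - seat_weight_lb

# Hypothetical example numbers -- check your own manuals.
combined_limit = 65   # assumed combined LATCH weight limit, in pounds
seat_weight = 20      # a heavier convertible seat

print(max_child_weight_for_latch(combined_limit, seat_weight))  # prints 45
```

In other words, with a heavy seat, a child may outgrow the LATCH anchors well before outgrowing the seat itself; at that point the seat belt installation takes over.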
After the Purchase
Consumer research is only half the battle when picking out a car seat. Once you’ve made a decision, you have to make sure you install the seat correctly. You should also register the seat so you can stay up to date with any future product issues.
Recalls happen, of course, and registering your car seat is the best way to make sure you receive important information right away if a problem arises.
As children get older, you may have an increasingly difficult time getting them to use their seat belt. Stay strong and don’t back down in these situations. In 2015, seat belts saved the lives of nearly 14,000 children and adults. Make strong rules and work early to establish good habits.
Wondering if you stink right now? Go for it. Give your pits a sniff.
Unfortunately, simply sticking your nose in your armpit isn’t a reliable way to tell whether (or how bad) you smell.
According to a study published by the academic journal Frontiers in Behavioral Neuroscience, humans tend to be less aware of smells the longer we’re around them.
The phenomenon is known as “olfactory fatigue” or “olfactory habituation.” Whatever you choose to call it, the takeaway is that bad smells seem less bad and good smells seem less good the longer we’re exposed to them.
This is exactly the sort of thing that Pamela Dalton, a psychologist at the Monell Chemical Senses Center, spends her days studying.
“The olfactory system is one of the world’s best difference detectors, and that’s how it was designed,” she told The Washington Post.
One of the chief roles of the nose (in conjunction with the part of the brain that processes information coming from it) is to sense when something’s new, different, or unfamiliar. When a scent is familiar or lingers for a long time, the part of your brain that deals with smells tends to filter it out as unnecessary information.
In one study, Dalton placed air fresheners in the bedrooms of subjects for a few weeks. After just a couple of days, the participants reported that they didn’t notice the scent when they entered the room and were also less sensitive to the same scent when exposed to it in her lab.
“What seems to happen in long-term adaptation is that the receptors that would normally respond to these smells almost turn off after being bombarded for a few weeks,” she said.
“You don’t see that in vision or hearing. You can be adapted to a sound or sight, but generally the systems recover pretty quickly. The fact that it takes two or three weeks to regain sensitivity is very unique.”
So if simply giving yourself a sniff doesn’t work, how can you know whether you stink?
As Dalton told The Washington Post, “Unfortunately, you really just have to rely on the opinion of a close friend or spouse.”
If you don’t have someone you can trust to tell you when you smell, there is another pretty solid rule of thumb: If you’ve sweated recently, you probably stink.
According to the Centers for Disease Control and Prevention, once we go through puberty, almost all humans’ sweat glands produce a substance that smells when it comes into contact with the naturally occurring bacteria on our skin.
It’s not all bad news, though.
Two studies—one from 2011 and one published this year—suggest that your natural odor might make you more attractive to potential partners. Another study that appeared in Frontiers in Psychology in 2016 indicates that your natural scent can even reveal your personality traits.
So, the goal shouldn’t be to totally eliminate body odor but to minimize it when it’s bad. Fortunately, the best way to do that is simple: Maintain good personal hygiene.
Wonder Woman was 2017’s top-grossing summer movie.
Think, just for a second, about the superhero films featuring female leads. Women aren’t really equally represented—and, unfortunately, that’s not specific to the superhero genre.
The Bechdel Test
The Bechdel Test dates back to 1985 and sets a very low bar for evaluating women’s representation in films. The test has three simple criteria: “(1) it has to have at least two [named] women in it, who (2) talk to each other, about (3) something besides a man.”
British event organizing company Twizzle compiled a list of the last 25 Marvel and DC comic book movies and ran them through the Bechdel Test. Fewer than half of the films passed the modest test.
“In the time it takes to make a movie,” Davis says, “we can change what the future looks like. There are woefully few women CEOs in the world, but there can be lots of them on screen. How do we encourage a lot more girls to aspire to lead? By casting droves of women in STEM, politics, law and other professions today in movies.”
A Brighter Future
Ashley’s mother, Christine Keller, is doing her part to provide a positive role model to little girls too. Keller published the book Danica Dreamer in 2014.
“Danica Dreamer is a smart, adventurous and curious young girl with a wild imagination and big dreams for her future,” reads the Amazon description
of the book. “Join her on an amazing journey to discover what it would be like to be the President of the United States of America.”
It looks like Ashley has more than a couple of strong female role models to look up to.
An important closing note: The value of representation is not just a gender thing. Having positive role models from all underrepresented communities is vitally important to helping us create a more equitable world.
According to the Environmental Working Group (EWG), “natural flavors” is one of the most common ingredients listed on food labels—only surpassed by salt, water, and sugar.
So if you pay any mind to the labels on the food you buy, you’re probably familiar with the term—but what does it actually mean?
The answer isn’t quite as straightforward as you might have thought.
While it would seem intuitive that “natural flavors” are precisely the opposite of “artificial flavors,” the reality is that the two have more similarities than differences.
According to David Andrews, senior scientist at EWG, “The differentiation is really down to the origin of those molecules, whether synthetically produced in a lab or purified in a lab but from a natural source.”
“Most often, as far as I could find, the actual chemicals themselves could be identical or extremely close in terms of natural versus artificial,” he told CNN.
There’s still more to it than that, though.
A given flavor, whether natural or artificial, can contain anywhere from 50 to 100 different ingredients, and they’re not all as healthful—or as natural—as you might have hoped.
“The mixture will often have some solvent and preservatives—and that makes up 80 to 90 percent of the volume,” said Andrews. “In the end product, it’s a small amount, but it still has artificial ingredients.”
According to Andrews, the solvents and preservatives in natural flavors are present in such small amounts that there’s no real risk of adverse health effects directly from them.
They do present a less obvious problem, however.
“Natural and artificial flavors play an interesting role in food. They’re essentially providing the taste and often they’re added to make the food more appealing, or to potentially replace something that’s lost through processing, storage or in some cases even from pasteurizing,” Andrews says.
“One concern we have is the ability to make things more appealing than they may necessarily be,” he continued. “You can make [foods that aren’t as healthy] more appealing or even taste as if they’re extremely fresh when they may not be.”
Ultimately, Andrews says, the purpose of these additives “is to make a short intense flavor that quickly dissipates so you come back for more.”
Basically, if you consume them on a regular basis, foods containing these additives can play tricks on your body.
“As a consumer, it is important to be savvy about ingredients,” Sheth said. “Recognize that any food consumed in excess of your needs is going to affect your weight loss journey.”
Overall, the takeaway seems to be this: Although you don’t need to eliminate all natural (or even artificial) flavors from your diet until the end of time, you’re better off sticking with whole, unprocessed foods as much as you can.
If I had a nickel for every time someone asked me when I was going to settle down and have kids after my thirtieth birthday, well…let’s just say I’d have a lot of nickels.
I was never overly concerned about having kids. I enjoyed the single lady life, and being a furmom was all the responsibility I cared to take on in my twenties.
It wasn’t until my younger brother and his wife not only had a baby, but lapped me with a second child, that I started to panic about having children.
I was approaching 30 years old. Despite being married, actually being a mom felt farther away than it had when I was 20.
As I blew out the candles on my thirtieth birthday cake, the one question that lurked in the back of my mind was, “How long can I really wait to have a baby?”
Historically speaking, statistics are scary.
You’ve probably seen the numbers on a poster or pamphlet in your OB-GYN’s office. According to oft-cited statistics, only 67 percent of women over the age of 35 will conceive within a year. After a woman is 40 years old, that number drops to around 40 percent. By age 43, natural conception percentages plummet to under 5 percent.
Yikes.
In a time when researchers study everything from how to unboil an egg to intense make-out sessions, it’s reasonable to assume there’s plenty of modern fertility research being done.
Surprisingly, that is not the case. Those scary fertility statistics that make the rounds in women’s magazines every few months are actually based on data from 300-year-old French church records.
According to a report by the BBC, researcher Jean Twenge found that “the data on which that statistic is based is from 1700s France. They put together all these church birth records and then came up with these statistics about how likely it was [someone would] get pregnant after certain ages.”
In the 1700s, doctors (all men, naturally) really believed that the womb could wander all over a woman’s body, so pregnant women didn’t exactly have access to the best health care. Nutrition was poorly understood, and the average life span during this time was only about 40 years.
Women who knew they were reaching peak life expectancy probably tried to avoid pregnancy. Although they didn’t have access to modern birth control, there were other ways to prevent pregnancy.
Yet modern scientists and doctors continue to cite these statistics, striking fear in the hearts of 30-something women hoping to get pregnant one day.
So what do modern statistics say?
There is surprisingly little modern natural conception research being done today. One of the problems with current research is that many studies are based on women undergoing fertility treatments such as in vitro fertilization (IVF). In large part this is because studies about unaided conception are often difficult to accurately report.
It’s not that studies of women undergoing fertility treatments are inaccurate. Instead, the problem is that they only represent a small fraction of women trying to get pregnant at different ages.
Only about 1.5 percent of babies born each year are conceived with fertility treatments. Most current fertility research simply doesn’t apply to the other 98.5 percent, who are conceived naturally.
When it comes to a woman’s ability to get pregnant after a certain age, most current research is almost as bleak as the data from 300 years ago. But it paints an incomplete picture of fertility.
In her revealing essay in The Atlantic, Twenge interviewed Dr. Allen Wilcox, who shed light on why infertility rates among women over the age of 35 seem so high.
According to Wilcox, “The observed lower fertility rates among older women presumably overestimate the effect of biological aging. …If we’re overestimating the biological decline of fertility with age, this will only be good news to women who have been most fastidious in their birth-control use, and may be more fertile at older ages, on average, than our data would lead them to expect.”
Though they are few, recent natural conception studies suggest promising news for hopeful moms-to-be.
A widely cited 2004 study by Dr. David Dunson looked at over 700 women who were actively trying to get pregnant across a wide age spectrum. Dunson concluded that 82 percent of women between the ages of 35 and 39 would naturally conceive within one year, compared with 86 percent of women in their twenties.
Corroborating Dunson’s research is a larger 2013 study by Dr. Kenneth Rothman, which found that 77 percent of women between the ages of 35 and 40 conceive naturally within a year, compared with 83 percent of women in their twenties. Both studies found only a marginal difference of 4 to 6 percentage points in conception rates for women over age 35.
These research studies, while small, are encouraging as more women than ever delay starting a family.
What does influence fertility?
Women over the age of 35 often shoulder the burden of fertility issues in the mistaken belief that their age alone is the reason that they struggle to get pregnant.
In reality, there are many factors that influence a woman’s fertility: genetics, fallopian tube dysfunction, and endometriosis, to name a few. Fallopian tube disorders and endometriosis alone account for as many as 55 percent of infertility cases, and these can occur in women of any age.
Additionally, about 35 percent of infertility cases can actually be traced back to a problem with the man.
Although it is true that a woman’s fertility does sharply decrease after age 40, the odds of getting pregnant naturally after 40 are still pretty good—around 50 percent. Pregnancy after 40 does carry a higher risk of chromosomal abnormalities, however.
For a woman in her twenties, the risk of abnormality is about 1 in 500. By the time women reach age 40, that number jumps to about 1 in 60.
That number does seem frightening, but David James of the National Institute for Health and Care Excellence says, “Turning that on its head, it does mean that 59 out of 60 women aged 40 will have no chromosomal problems in their baby at all.”
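To see how that flip works, take the complement of the quoted risk: a 1-in-60 chance of a problem is the same as a 59-in-60 chance of no problem. Here is a quick sketch in Python using only the figures quoted above (the numbers themselves are the article’s, not mine).

```python
def chance_of_no_abnormality(risk_one_in_n):
    """Turn a '1 in N' risk into the probability that no abnormality occurs."""
    return 1 - 1 / risk_one_in_n

# Figures quoted above: roughly 1 in 500 in a woman's twenties, 1 in 60 at 40.
for label, n in [("twenties", 500), ("age 40", 60)]:
    print(f"{label}: {chance_of_no_abnormality(n):.1%}")
# twenties: 99.8%
# age 40: 98.3%
```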
So what’s a woman to do?
The jury is still out on how long women can really wait to have a baby. However, current research does show that most women can wait to have children until well after age 30.
For healthy couples with no known hereditary issues that might affect a fetus’s development, a later pregnancy is most likely safe. But ultimately, it’s a decision women should discuss with their doctor and partner.
Dr. Norbert Gleicher, founder of the Center for Human Reproduction, is optimistic about the future of pregnancy at any age. In an interview with Business Insider, he remarked, “We will reach a threshold where age no longer matters and women will be able to conceive probably pretty much independent of their age.”
As society progresses (we hope) with time, it’s not uncommon to look back at the way things were done “in the olden days” with a hefty dose of WTF.
Putting butter on a burn (a folk remedy that can actually make things worse), treating a croupy baby with a spoonful of sugar…garnished with a few drops of kerosene (NOPE), or raw chicken applied to a cold sore (?) would all probably strike most of us today as questionable, if not extremely foolish.
It’s no wonder that children have often borne the brunt of our stupidity. Being completely helpless and often incapable of expressing their own perspectives, kids make the perfect guinea pigs for adults’ “innovations.” One of these that would be regarded with suspicion by modern-day people? Mailing babies and small children.
In the early 20th century, the postal service increased the weight allowance for individual packages sent through the mail to 11 pounds. It was only a matter of time before folks started pushing the envelope (heh) on what could legally be carried by the mailman.
“While private delivery companies flourished during the 19th century, the Parcel Post dramatically expanded the reach of mail-order companies to America’s many rural communities, as well as the demand for their products,” reports the Smithsonian.
“When the Post Office’s Parcel Post officially began on January 1, 1913, the new service suddenly allowed millions of Americans great access to all kinds of goods and services.”
A New York Times article from that year describes one such good—a baby boy in Ohio who was sent by mail to his grandmother:
“Vernon O. Lytle, mail carrier on rural route No. 5, is the first man to accept and deliver under parcel post conditions a live baby. The baby, a boy weighing 10-3/4 pounds, just within the 11 pound weight limit, is the child of Mr. and Mrs. Jesse Beagle of Glen Este.
“The boy was well wrapped and ready for ‘mailing’ when the carrier received him to-day. Mr. Lytle delivered the boy safely at the address on the card attached, that of the boy’s grandmother, Mrs. Louis Beagle, who lives about a mile distant. The postage was fifteen cents and the parcel was insured for $50.”
Another article, from 1915, describes a 3-year-old girl named Maude Smith who weighed 30 pounds and who was sent through the mail for 33 cents in Kentucky.
“The child was seated on a pack of mail sacks between the mail carrier’s knees and was busily eating away at some candy it carried in a bag,” reports The Courier-Journal. “In the other hand it carried a big red apple and it smiled when the curious folks waved their hands and called to her.”
The reasoning behind some parents’ willingness to send their little ones through the Parcel Post seems to have been threefold: postage was cheaper than a train ticket, a lot of trust was placed in mailmen, and the idea of tiny living creatures carried in satchels like inanimate objects was funny and adorable (and, hey, some things never change).
But, lest we get things twisted, mailing infants and toddlers was by no means common practice. The fact-checking site Snopes makes sure to point out that “it was neither a regular occurrence nor a routine aspect of the Parcel Post service for people to wrap up children, slap some stamps on them, and ship them cross-country.” Phew!
Snopes also reports that “the few documented examples of children being sent through the mail were nearly all publicity stunts, instances of people who knew the postal workers in their area asking them to carry their babies a relatively short distance along their routes to some nearby relatives, or cases in which children were listed as ‘mail’ so they could travel on trains without the necessity for purchasing a ticket.”
Furthermore, to our disappointment/relief, the pictures showing babies hanging in mailbags alongside stone-faced postal carriers are, as Snopes reports, “simply vintage cute posed humor shots taken from a collection of historic Smithsonian Institution (SI) photographs uploaded to Flickr.”
We’re not sure whether the relatively few instances of baby-mailing were more brilliant, comedic life hack or lax (era-appropriate?) parenting. Either way, the past holds an endless supply of ill-advised things people used to do to kids.
Here are a few more examples for your cringing pleasure.
Newborns’ worth was tested by plunging them in cold streams.
Dunking a baby into a nearby body of water (say, a cold stream) after its birth certainly sounds jarring, but it may not seem so strange in the context of ancient times, when a rinse in a stream could have been a fairly reasonable way to clean off a gunky newborn.
Until you hear why some people were dunking their babies in streams, that is. According to Mark Sloan, MD, author of Birth Day: A Pediatrician Explores the Science, the History, and the Wonder of Childbirth, this functioned as a test to see whether a newborn baby deserved to continue living.
Sloan points to this quote from Aristotle (384-322 B.C.):
“To accustom children to the cold from the earliest years is also an excellent practice, which greatly conduces to health, and hardens them for military service. Hence many barbarians have a custom of plunging their children at birth into a cold stream.”
If babies couldn’t handle the plunge, Sloan says, they “were left outside to die.”
Infants were fed from bacteria-infested bottles.
If you want to see something that looks sadistic—and that actually was, unbeknownst to parents in those times—search for images of Victorian baby bottles.
These glass bottles equipped with rubber straws acted like petri dishes for illness and led to the death of thousands of babies in the late 1800s, when only 1 in 5 infants was expected to live to the age of 2.
The bottles weren’t always branded this way, of course. Originally they went by names like “The Little Cherub” and “Mummie’s Darling.” How did a dangerous item elicit such sweet talk?
“The long India rubber tubing that connected the bottle to the nipple made it easier for the busy housewife to feed the child,” writes the American Academy of Pediatrics (AAP). “You didn’t have to put the bottle up to the baby’s mouth, or even hold the baby. These allowed the babies to practically feed themselves when they were hungry!”
“This was considered a major move forward in the science of child care as well as a significant advancement for women’s rights, freeing them from the inconvenience of breastfeeding, including the difficulty of managing the mechanics with corsets and the need to be constantly accessible for feedings.”
Unfortunately, hygiene science had not yet made its way into everyday life, and many women were told they could go weeks without washing the tubing their babies drank from. Adding to the problem, the bottle itself had a faulty design.
“The rubber tubes connecting the bottle to the nipple were nearly impossible to clean and developed cracks over time, making them potent breeding grounds for numerous diseases that caused horrifying and painful deaths,” writes the AAP.
Livestock used to nurse human babies directly from their teats.
When you read this headline, your face scrunched up and you silently mouthed “whiskey tango foxtrot” to yourself, didn’t you? Strange as it sounds, this is for real. Back in the day, if a baby couldn’t be breastfed by Mom, the options were limited.
Having animals like goats and donkeys nurse human infants was especially popular between the 16th and 19th centuries, before pasteurization and before the vulcanization of rubber (the chemical processing of crude or synthetic rubber that makes it stretchy and elastic) allowed for soft artificial nipples.
Its popularity also coincided with “the era of syphilis, which in 16th-century France prompted many mothers to reject wet nurses out of fear their babies would be infected,” as The Washington Post reports.
If the idea of a farm animal nursing a human baby strikes you as strange, we’d like to direct you to this quote from The Washington Post about human–animal breastfeeding, but in the reverse setup: “Women in the far eastern Russia peninsula of Kamchatka suckled baby bears, which they’d later kill for their meat and valuable gall bladders.”
So there’s that. Not a whole lot of words that come to mind—mainly just WTF.
Jim Rosenthal had a table at one of the best restaurants in the world.
It was 2009, and the British sportscaster joined his wife, Chrissy, and a few friends for dinner at the Fat Duck.
Dinner was excellent. Rosenthal’s bill came to around £1,300
(close to $2,000 at the time). Two days later, everyone in the party was ill.
That habit of skimping on proper hand washing might have played into the Great Fat Duck Poisoning of 2009. A Health Protection Agency (HPA) investigation of the incident found that “direct infection from shellfish could have produced this outbreak.”
“However, there is also some evidence to support other possible routes of transmission through food. The complex nature of food preparation in this restaurant, with extensive handling of foods, would require excellent food management systems to assure safety… Alcohol gel, which is not fully effective against norovirus, was widely used.”
In other words, lack of hand washing could have spread an infection from one batch of bad oysters to every item on the menu. This might be a good time to find out if your favorite spot serves raw shellfish.
2. Your steak might not be the freshest thing in the kitchen.
A full third of the kitchen staff who responded to the survey admitted that their workplaces served meat that was actively going bad. The scientific term for this, apparently, is “meat on the turn”—or at least that’s how the British chefs described it.
An article on the website for UK cooking show Ramsay’s Kitchen Nightmares quotes one chef as saying:
“The first task we gave someone who came to us looking for a cheffing job was to make a meal with the chicken that was on the turn… That’s important to a kitchen because it means you can get another day or two days out of your meat. If a chef could do this, I knew he was experienced in restaurant kitchens.”
If that’s how you spot a good chef, we’re eating PB&J from here on out. We’ll handle our own food prep, thanks.
3. Kitchen staff might not take sick days when they really, really should.
Restaurants are high-pressure, fast-paced workplaces. Workers bond like soldiers in the trenches, and when one of them misses a shift, they know their friends and colleagues will have to work even harder to make up for the absence.
Maybe that’s why nearly a third of the kitchen staff in the survey said they’d been to work within 48 hours of vomiting and/or having diarrhea, which the study adorably shortens to “D&V.”
D&V are the exact sorts of symptoms you’d expect from a norovirus infection. In fact, HPA investigators did track the 2009 Fat Duck outbreak directly to norovirus communities thriving on raw oysters. But even diners who didn’t order the oysters went home sick.
“Several staff members were infected with norovirus and may have been infectious while at work,” says the HPA report, offering a possible explanation for probable cross-contamination.
That wouldn’t surprise the authors of the PLOS ONE study. When restaurants win awards and accolades, they suggest, workers refuse to stay away—even when they’re sick. Kitchen staff at these decorated eateries were 28 percent more likely than others to work a shift within 48 hours of D&V.
But let’s get back to Jim Rosenthal and his wife’s disastrous birthday party.
No one would have predicted a food-borne illness when the diners sat down at their swanky table on that dark day in 2009. They happily celebrated Chrissy’s 58th birthday with culinary masterpieces.
Boxing promoter Frank Warren was among the guests. He later described the outing to Sky News, saying that “everything was fabulous about the evening.”
“The food, the setting, the service, it was unbelievably good,” Warren said. “But unfortunately, afterwards, all of us were ill.”
Let’s be clear: The HPA investigation blamed the Fat Duck’s downfall on raw oysters that spent their afternoons filtering sewage water and growing great blooms of norovirus. No one faulted restaurant staff, at least not officially.
“No breaches of hygiene standards were identified in the preparation processes as described by staff,” the investigators wrote in their report.
But authors of the PLOS ONE study tie the extent of the outbreak to some of the risky behaviors discussed above. So does the HPA report. It’s enough to make you think twice about your own city’s foodie Mecca.
In the end, the Fat Duck closed for two weeks of intensive cleaning and, presumably, soul-searching. It was the single worst outbreak of norovirus at a restaurant in history. The Fat Duck reopened after receiving a “clean bill of health,” a spokesperson for the restaurant told The Guardian in 2011. They would not be serving oysters this time, the spokesperson said.
The point is, if it can happen at one of the top restaurants in the world, it can happen anywhere. Just ask Jim Rosenthal.