We Can’t Believe We Have to Tell You This, But Please Don’t Drink Your Pee

I have done some pretty weird things in my life (including eating bugs for this blog), but I am not sure there is anything quite so bizarre as using your own pee-pee as a health tonic. However, in a video recently posted by the New York Post, Julia Sillaman does just that.

Sillaman claims that drinking her own urine and using it as a face wash has cleared up her acne and helped her lose weight. She also says that fasting has made her pee taste more like coconut water. Not only is this really a thing she does (not a joke or prank), but the practice of using urine for health purposes has been around forever. In this, the third article we’ve had to write about pee, we dig into the inexplicably long history of people using urine as a health tonic. (Hint: It does not work.)

Ask a Scientist: Did the EPA Really Change Their View on Asbestos?

Welcome to Ask a Scientist, where we answer questions from our readers on a wide range of scientific topics. Got a scientific question? Drop us a line.

Q: I saw an article that the EPA recently changed their view on asbestos and also made it easier for companies to get asbestos-containing products approved. Is that true? How dangerous is asbestos? – AD, Hamden, CT

Thanks for the question, AD. Here’s the deal:

Asbestos is really, really dangerous. When you go to toxicology school (yes, that exists), one of the model chemicals they teach you about is asbestos. We know asbestos causes cancer, and we even know how it causes cancer. Read our talc post here for an earlier description of the mechanisms of asbestos toxicity. There is no scientific debate about the relationship between asbestos and cancer – asbestos is nasty stuff, and you don’t want to be breathing it in.

Asbestos was briefly banned in the US in the late 1980s, but it came back on the market in a very limited number of products in 1991 thanks to lawsuits by manufacturers; all new uses have remained banned. These companies argued (correctly) that as long as the asbestos in asbestos-containing products is not broken up into dust (technically, fibers) that can be inhaled, its use is safe. This is technically true for the people using asbestos products. However, in order to make these products, people need to work with raw asbestos, and that can be dangerous if you don’t take your protective equipment very seriously. Fifty-five countries have banned asbestos outright, and most developed countries no longer allow it to be mined.

Continue reading…    

Why Does My Urine Smell When I Eat Asparagus?

Welcome to Ask a Scientist, where we answer questions from our readers on a wide range of scientific topics. Got a scientific question? Drop us a line.

Q: Why does my pee smell when I eat asparagus? – D.T., Rutland, VT

A: OK, maybe this isn’t an important scientific question, but it’s kind of interesting. So here’s the science behind asparagus pee.

People have known for a long time that asparagus causes the urine of many (but not all) people to smell pungent. Benjamin Franklin famously wrote about it, stating that “a few stems of asparagus shall give our urine a disagreeable odor.” Asparagus pee stinks, but asparagus itself doesn’t have a similar smell, even when cooked, which suggests that the chemical responsible for the smell is a metabolite – something made in our bodies out of something in the asparagus. This is indeed the case: asparagus contains a chemical boringly named “asparagusic acid.” Asparagusic acid contains sulfur, the element behind many stinky compounds, including the rotten-egg smell of hydrogen sulfide. When most people eat asparagus, they metabolize asparagusic acid into several small volatile chemicals, one of which, methanethiol, is believed to be the major source of the smell. Methanethiol is volatile – it has a low boiling point – so it is freely released into the air as a gas, hence the strong smell even several feet away from your toilet. This metabolism occurs rapidly – you can generally detect the smell about 30 minutes after eating asparagus. Besides methanethiol, there are several other stinky sulfur-containing metabolites found at lower levels, including dimethyl disulfide, dimethyl sulfone, and 2,4-dithiapentane.

The metabolism of asparagusic acid

For many years it was believed that a large percentage of the population did not metabolize asparagusic acid to methanethiol (and the other sulfur compounds) and hence did not experience the strong smell. It turns out this was only partially true. There are definitely many people who don’t produce the smell at all, likely due to currently unidentified differences in their metabolism. However, there are also a significant number of people who simply cannot smell the metabolites of asparagusic acid even when they are present. This was traced to a single variant in a gene involved in olfactory function, identified by the genetic testing company 23andMe. This finding was confirmed in a 2011 study that involved asking people to smell not only their own urine, but that of other people who had eaten asparagus. Science isn’t always sexy, folks.

The exact percentage of people who don’t experience the smell, either because they don’t metabolize asparagusic acid or because they are incapable of smelling the metabolites, is a bit unclear at this time. In the 2011 study, only 8% of subjects failed to produce the odor, while only 6% failed to detect it. Other, older studies have reported much higher numbers of people who don’t experience the smell – up to 50% in one study. Based on the results across populations with different ethnicities, it seems likely that there is quite a bit of variation depending on your genetic background (in particular, people from China or Israel almost all experienced the smell, while people from England or the United States were less likely to notice it). The higher rate of people who don’t get the smell in these populations is likely due to a combination of the single gene variation (for smell) and the unidentified genetic factors affecting the metabolism of asparagusic acid.

Now, if you or someone you know doesn’t experience the smell, you can easily find out if you have the genetic variation. The simplest (and cheapest) way would be to find a friend who does experience the smell and ask to smell their urine after they’ve eaten asparagus. If you still smell nothing, it’s because you are incapable of experiencing the smell. If, however, you do smell the metabolites, then your metabolism is responsible. For those unwilling to smell someone else’s urine, you could just get the DNA test through 23andMe. The variation is on chromosome 1 and is officially termed “rs4481887.” The variation appears to be autosomal recessive, which means you need two copies of the variant gene in order to lose the ability to smell these chemicals. This means that if both you and your spouse are “non-smellers”, then your children will be too. If only one of you is, then each child has a 50/50 chance of being a non-smeller if the other parent is a carrier, and no chance if they are not. Again, you’d have to get the genetic test in order to determine if you or your spouse is a carrier. That is, if knowing the chances of your children enduring the stench of asparagus metabolites for their entire lives is something that is important to you. Which would be kinda weird.
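If you’d like to see that inheritance arithmetic spelled out, here is a minimal sketch in Python. It assumes simple Mendelian inheritance of a single recessive “non-smeller” allele; the genotype labels and function name are ours, purely for illustration, not a clinical genetics tool.

    from itertools import product

    def chance_of_non_smeller(parent1, parent2):
        """Chance a child is a 'non-smeller' (two copies of the recessive
        allele 'a'), assuming each parent passes on one of their two
        alleles at random. Illustration only."""
        combos = list(product(parent1, parent2))
        return sum(1 for child in combos if child == ("a", "a")) / len(combos)

    # ("a", "a") = non-smeller, ("A", "a") = carrier, ("A", "A") = non-carrier
    print(chance_of_non_smeller(("a", "a"), ("a", "a")))  # 1.0 -> all children are non-smellers
    print(chance_of_non_smeller(("a", "a"), ("A", "a")))  # 0.5 -> the 50/50 case above
    print(chance_of_non_smeller(("a", "a"), ("A", "A")))  # 0.0 -> no non-smeller children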

Ask a Scientist: Does being cold make you sick?

Welcome to Ask a Scientist, where we answer questions from our readers on a wide range of scientific topics. Got a scientific question? Drop us a line.

Does being cold really make you sick, or is that just an old wives’ tale? – M.H., Woolwich, ME

Thanks for the question, M.H. You hear this one all the time – is it true?

The short answer is no. “Sick” in this context is generally assumed to mean the flu or the common cold. These diseases are caused by viruses – influenza virus and rhinovirus, respectively. You cannot catch a cold or the flu without being exposed to these viruses, no matter how cold you get – even if you go out without a jacket, or don’t wear a hat, or forget your mittens. Now, if you get really, really cold and your core body temperature drops below 95°F (normal is 98.6°F), that’s called hypothermia, which can be very dangerous. If you classify hypothermia as “sick”, then I guess the old saying is true, but not if you are just talking about catching a cold or the flu.

Don’t forget to wear a jacket!

There is a long answer too, and it’s… kinda. While being cold doesn’t make you sick, two side effects of cold weather make it easier for viruses to infect you. The first is the low humidity associated with low temperatures (and indoor heating systems), which dries out the mucous membranes of your nose, making them more susceptible to infection. The second is that cold weather tends to make people spend more time indoors in close contact with other people, where they are more likely to spread viruses.

A few years ago, a group of researchers at Yale University also demonstrated that cells and mice at lower temperatures have a more difficult time fighting off viruses. It’s not entirely clear if this same effect occurs in humans, and if so, how much it might increase the odds of someone catching a cold or the flu, but it’s pretty cool.

Also, the study’s authors assumed that cold air in your lungs and airways was the culprit here. When you think about this, it means that going out in the cold might increase your chances of catching a cold or the flu regardless of how cold you actually feel. In other words, even if you are bundled up in a coat, hat, scarf, and mittens, you are still breathing the same cold air when you go outside, so your risk of getting sick would be unaffected by how you dress or how cold you feel. The same is true for the effects of low humidity in cold weather.

So in the end, even though cold weather itself might increase your risk of getting sick, being cold (or not dressing warmly enough) will not. The best things you can do are wash your hands, avoid friends and family who are sick, and of course, get the flu vaccine.

Ask a Scientist: Is urine really sterile?

Welcome to Ask a Scientist, where we answer questions from our readers on a wide range of scientific topics. Got a scientific question? Drop us a line.

Is urine really sterile? Can you use it to disinfect things? – SF, The Woodlands, TX

Thanks, SF! Let’s talk about pee!

The answer to your first question is a very clear “no.” Urine is not sterile. Sure, it’s sterile when it’s first made in your kidneys, but then it passes through your urethra, and like pretty much every part of you that comes in contact with the outside world, your urethra contains bacteria – and some of them end up in your urine. In some cases (like urinary tract or bladder infections) the bacterial levels can be quite high, but in general they are low – not sterile, but probably safe enough to drink, at least from a bacterial contamination standpoint. However…

It’s never a good idea to drink your own urine. Some survivalists say they have done it, including that guy who had to cut off his own arm in Utah after getting pinned under a boulder, but it’s probably not something he made a habit of – and for good reason. While urine is 95% water, the other 5% is stuff that is not good for you. This 5% contains urea, excess electrolytes, and other stuff your body is trying to get rid of. To put this in context, salt water from the ocean contains about 96.5% water and 3.5% salt. If you drink salt water, the salt just dehydrates you – your kidneys end up excreting more water to flush out all that extra salt than you took in by drinking it – defeating the purpose of drinking water in the first place. Over time, if you drink enough of it, seawater can kill you. Urine will do the same thing, only it is also full of urea and other body waste products and is totally gross. Don’t drink urine, ever.

Continue reading…    

Ask a Scientist: Are Humans Still Evolving?

Welcome to Ask a Scientist, where we answer questions from our readers on a wide range of scientific topics. Got a scientific question? Drop us a line.

Q: Are humans still evolving? – B.N., Bernalillo, NM

This is a fantastic question, B.N.! A solid argument can be made that humans are no longer evolving because of our advanced healthcare and medicine. Sir David Attenborough, who has done more for science than UYBFS ever will, and has been knighted for his efforts, has suggested, based on this reasoning, that humans are no longer evolving, saying:

“We stopped natural selection as soon as we started being able to rear 95–99 per cent of our babies that are born.”

Say a child is stricken with juvenile diabetes on their 10th birthday. Prior to the invention of modern medicine, this was a death sentence – the child would have died from hyperglycemia and diabetic ketoacidosis within a short period of time, effectively eliminating their genes from the population. However, scientists were able to determine that a loss of insulin production in the pancreas is the cause of this form of diabetes, and since 1922, insulin (first from animals, now produced in bacteria) has been available to treat these children. Today, while the effects of juvenile diabetes can still be quite severe, a child who can maintain their blood glucose levels appropriately can expect to live a relatively normal life, certainly one that will be long enough to reproduce if that is their desire.

Behold, the modern science blogger!

So in this case, science and modern medicine have enabled the spread of genes responsible for (or predisposing people to) juvenile diabetes. The successful treatment of any life-threatening childhood disease with a genetic component (cancer, cystic fibrosis, severe asthma) will have the same effect. On top of this, our ability to successfully treat or prevent potentially deadly infectious diseases in children (measles, rubella, pneumonia, diphtheria, etc.) will, in theory, lead to an increased number of people who are more susceptible to these diseases over time.

Continue reading…    

Ask a Scientist: Does the Full Moon Affect Behavior?

Welcome to Ask a Scientist, where we answer questions from our readers on a wide range of scientific topics. Got a scientific question? Drop us a line.

It seems like an accepted fact that a full moon makes kids “hyper” or makes people behave in a crazy way. Is there any truth to this, or is it just an old wives’ tale? – BH, Watertown, ME

Thanks for the question, BH! This is one of those “common knowledge” beliefs that seems to be everywhere. About 45% of people actually think this is true, which is in line with the roughly 50% of Americans who think astrology is a science. This should make us all sad, because astrology is not a science, and more people believe in astrology today than in 2004.

As for the full moon and behavior, this was a big enough question on people’s minds that several groups of scientists actually ran studies to see if there really was an effect on behavior. Universally, they found that there is none, though there was a very small decrease in the amount of sleep that children got during a full moon, which may be related to all that moonlight coming into their windows, or to their parents repeatedly telling them that they were “hyper” because of the moon.

Exactly why people believe this is a bit of a mystery. Certainly, the idea has been around for a long time – it’s where the words “lunacy” and “lunatic” come from: the Latin lunaticus, which referred to madness or epilepsy because people thought the moon caused these conditions. It could be that this is just a relic of the time before electricity, when nights with a bright moon in the sky would have allowed for much more activity than dark, moonless nights, or maybe just kept people from getting a good night’s sleep.

Many people have anecdotally suggested that the full moon increases the rates of crimes, accidents, suicides, or trips to the emergency room. Scientists have studied these links as well, and there is no consistent effect of the phase of the moon on any of them. The same is true for animal behavior.

Others think that the behavioral effects might be related to the gravitational pull that causes tides, but this is not how tides work – tides are driven not by the overall strength of the moon’s pull, but by the tiny differences in that pull across the enormous distances between different regions of the ocean. This is why the pond behind your house doesn’t have tides. And the local gravity a person experiences is influenced far more by their distance from the center of the Earth (and by the large objects around them) than by the moon.
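To put some rough numbers on that last point, here is a back-of-the-envelope sketch in Python using Newton’s law of gravitation and standard textbook values (the variable names and rounding are ours):

    # Gravitational acceleration on a person from the Earth vs. the Moon,
    # using Newton's law: a = G * M / r^2
    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    M_EARTH = 5.97e24    # mass of the Earth, kg
    M_MOON = 7.35e22     # mass of the Moon, kg
    R_EARTH = 6.371e6    # distance from Earth's center to its surface, m
    D_MOON = 3.84e8      # average Earth-Moon distance, m

    a_earth = G * M_EARTH / R_EARTH**2   # roughly 9.8 m/s^2
    a_moon = G * M_MOON / D_MOON**2      # roughly 3.3e-5 m/s^2

    print(f"Pull from the Earth: {a_earth:.2f} m/s^2")
    print(f"Pull from the Moon:  {a_moon:.2e} m/s^2")
    print(f"The Earth's pull is about {a_earth / a_moon:,.0f} times stronger")

The moon’s pull on your body works out to be roughly 300,000 times weaker than the Earth’s, which is why a “lunar gravity” explanation for behavior doesn’t hold up.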

The bottom line is that the moon doesn’t affect behavior in any consistent way. If your kids are “hyper”, it’s not the moon, and it’s not sugar – maybe they are just kids having a good time?

Ask a Scientist: Is microwaved food bad for you?

Welcome to Ask a Scientist, where we answer questions from our readers on a wide range of scientific topics. Got a scientific question? Drop us a line.

My coworker said that microwaves are bad for you and you should avoid using them. As evidence she stated that if you microwave dirt nothing will grow in the dirt, so microwaved food is similarly hazardous to your health. Is this true? – JK, Burlington, VT

Let’s ignore the fact that your coworker is microwaving dirt for some reason and talk about microwaves and microwave ovens!

Let’s start with the waves themselves. Microwaves are a form of electromagnetic radiation. While the term “radiation” scares some people, in this context it describes everything from radio waves to visible light to gamma rays. All electromagnetic waves carry energy, but the amount of energy they possess depends on their wavelength. Gamma rays and X-rays have the shortest wavelengths and therefore the most energy, and these can hurt you. They can cause cancer at low levels, and at very high levels (or doses) they can be fatal in just a few days.

Luckily, our atmosphere filters out most of the gamma ray and X-ray wavelengths coming from space. However, longer-wavelength (and therefore lower-energy) light from the sun does make it to the surface of the earth in the form of ultraviolet (UV) and visible light. UV light, which has a shorter wavelength than visible light (and hence more energy), can be dangerous – it also causes cancer over time, though all animals have evolved systems for repairing the DNA damage it causes, which take care of most of it. If the “dose” of UV light is high enough, it will burn you – something most of us have experienced as a sunburn.
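To make the wavelength-energy relationship concrete, here is a small sketch in Python using the standard photon energy formula E = hc/λ. The example wavelengths are approximate, typical values we chose for illustration:

    # Energy of a single photon at different wavelengths: E = h * c / wavelength
    H = 6.626e-34   # Planck's constant, J*s
    C = 2.998e8     # speed of light, m/s
    EV = 1.602e-19  # joules per electron-volt

    wavelengths = {
        "microwave oven (~12 cm)": 0.12,
        "visible light (green, ~550 nm)": 550e-9,
        "ultraviolet (UVB, ~300 nm)": 300e-9,
        "X-ray (~1 nm)": 1e-9,
    }

    for name, wl in wavelengths.items():
        energy_ev = H * C / wl / EV
        print(f"{name}: {energy_ev:.2e} eV per photon")

The shorter the wavelength, the more energy each photon carries – a microwave oven photon has hundreds of thousands of times less energy than a photon of visible light, let alone UV or X-rays.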

Continue reading…