Tuesday, June 18, 2013

The Ugly Cry Face

The human body is an immensely complicated thing. Each organ and bodily fluid has at least one, if not multiple, functions. From your eyelashes to your toenails, humans have evolved to thrive in their surroundings. This includes tear production and the act of crying, which has its obvious perks: protection and lubrication of the eyes, delivery of important nutrients and electrolytes to the cornea, and the removal of irritants. Additionally, think of the great relief a sobfest can bring at the end of a hard day. The latter refers to "emotional tears", which act as a sign of emotional distress. For decades scientists have tried to determine why it is we cry, linking it to sadness and to signals of harmlessness. Randomized trials have been conducted to determine whether tears elicit a change in mood or increase empathy. These responses make sense to us because of our own personal experience with crying. But then came Dr. Sobel at the Weizmann Institute of Science in Israel, and the world of emotional tears has never been the same.

In 2011 Dr. Sobel's research group published an article reporting that emotional tears from women elicit a seemingly odd response in men: reduced arousal and sexual attraction. To conduct this research, the scientists had women watch sad movies and collect their tears, so that the tears would be of the emotional kind. As a control, the women were also asked to trickle saline solution down their cheeks and collect the drops just as they did their own tears. Men in their mid-to-late twenties were then asked to smell either the tears or the saline and perform different tasks. One of these tasks involved looking at pictures of women like the ones below and rating their sadness and attractiveness. 



Men who were smelling tears were no better at rating a woman's sadness than the men who were smelling saline. They were, however, less attracted to the women in the photos. The researchers attributed the men's inability to rate sadness to the fact that the men themselves were not sad. Therefore, the researchers repeated the study but first had the men watch a sad movie. Instead of asking the men to rate a woman's attractiveness, they measured arousal through both self-reports and proxy measurements (heart rate, skin temperature, and respiration rate, among others). Salivary concentrations of testosterone were also measured. Again, the men that smelled the tears had decreased levels of arousal - both self-reported and measured - as well as a decrease in salivary testosterone concentrations. Interestingly, even after watching a sad movie the men that sniffed tears did not report any difference in mood compared to the men that smelled saline. If tears contained a chemical signal that elicits sadness or empathy, the tears should have made the men sadder than the saline did, yet that was not the case. 

Finally, to be sure that what they were seeing was a true effect, the scientists asked the men to watch an "erotic film" while they conducted brain imaging. This last portion of the study probably made for some awkward social interactions, and conjures up the image of a mad scientist creepily studying his subjects from behind a one-way mirror. Nonetheless, areas of the brain that "lit up" while viewing an erotic film had reduced activity when the men sniffed the tears, again indicating that a woman's tears reduce sexual arousal in men. 

While this study may leave us with more questions than answers, it does give us some insight into the evolutionary reason for tears. Seeing someone cry may still very well be a visual sign of sadness, but the chemical signaling compounds in tears do not seem to change the empathetic response. Now the question turns to why it would be beneficial to reduce arousal in times of sadness, and whether we would see a similar result if we were to reverse the genders or change the age group. 

Perhaps crying was the original, ahem, excuse. As in "I have a headache..." type of excuse. You get my gist. 


Gelstein, S. et al. Human Tears Contain a Chemosignal. Science (2011)






Friday, February 1, 2013

Labradors and Laboratories

Each Friday morning I awake with anticipation of our weekly lab meeting. Because the research interests of our lab range from epigenetic inheritance to colon cancer and obesity, conversations at these lab meetings are pretty versatile, but usually go something like this:

Scheduled start of lab meeting: Is anyone coming to this meeting? Where is everyone?
15 minutes after the scheduled start of lab meeting: Lab meeting actually begins.
The next 30 minutes: Mostly professional discussion about a recently published scientific paper or a presentation of new data from the lab.
The final 20 minutes: Transition into discussion about poop, mouse colonoscopies, checking the cervical mucous plug, how to perform the best sperm extraction, which unfortunate lab technician is going to have to collect fecal samples...

As you can tell, the end of the lab meeting is the most lively part of our discussion. In today's meeting we got to learn about this little gem of a study published back in 2011 in which dogs were trained to sniff out colon cancer. Do you have a funny image of a dog smelling someone's rectum yet? How about now?

Turns out the idea of having dogs "diagnose" cancer by smelling a patient's breath, urine or stool samples is not all that new, but it has somehow gotten by me for the past decade and I'm just starting to look at the published research. In the study I'm going to share with you, the researchers worked with trained dogs that would smell either a breath sample or a watery stool sample from a study participant (the latter collected during a colonoscopy, which also provided the official cancer or cancer-free diagnosis). Samples of the patients' breath were used to train the dogs. The researchers explain it as follows:
Each cancer detection training session was considered complete when the dog could correctly distinguish between breath samples from a cancer patient and four controls consecutively in dozens of trials. The fundamental training method was a reward-based approach in which the correct behaviour is rewarded by simultaneous play with a tennis ball.
What I think is important to note here is that (1) I've definitely chosen the wrong subject for my thesis because it did not include playing fetch with a dog, and (2) the dogs were trained using samples from cancer types other than colon cancer. At the end of the training period the dogs could sniff out patients with esophageal, breast, lung, gastric, pancreatic, hepatocellular, cholangiocarcinoma (bile duct), prostate, uterine, ovarian, bladder or colorectal cancer simply from samples of the patient's breath. This means that some volatile compound (or compounds) exists in both the breath and bodily excretions of cancer patients that is missing or different in cancer-free patients. Moreover, these volatile compounds are seemingly similar between cancer types.

After putting the dogs to the test 74 times (38 watery stool tests + 36 breath tests), the dogs picked out the samples from the cancer patients with amazing accuracy. Remember how I mentioned above that each study participant underwent a colonoscopy to determine whether they had colon cancer or were cancer-free? The dogs' cancer diagnosis agreed with the doctors' diagnosis 91% of the time for the breath test, and 97% of the time for the watery stool test. 97%! Maybe one day we'll see Labrador retrievers roaming the halls of cancer clinics, smelling out the patients needing assistance. Or, in a more boring, practical world, we'll discover what it is in the breath and stools of cancer patients that the dogs are detecting and come up with a lab technique to measure it. Personally, I'm keeping my fingers crossed for doggy doctors.
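For the numerically curious, this is roughly how an agreement (concordance) rate like 91% or 97% gets calculated. The correct-call counts below are hypothetical, chosen only because they are roughly consistent with the 36 breath and 38 stool tests and the reported percentages; the actual tallies are in the paper.

```python
# Rough sketch of how those agreement rates are calculated.
# The correct/total counts are hypothetical, chosen only to be roughly
# consistent with the 36 breath and 38 stool tests and the reported rates.

def agreement(correct_calls, total_tests):
    """Fraction of tests where the dog's call matched the colonoscopy diagnosis."""
    return correct_calls / total_tests

breath_rate = agreement(correct_calls=33, total_tests=36)
stool_rate = agreement(correct_calls=37, total_tests=38)

print(f"Breath test agreement: {breath_rate:.1%}")       # ~91.7%
print(f"Watery stool test agreement: {stool_rate:.1%}")  # ~97.4%
```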



Sonoda, H. et al. Colorectal cancer screening with odour material by canine scent detection. Gut (2011)




Wednesday, October 24, 2012

A Penny For Your Thoughts, and $100 For Your Fat!

It's been some time since my last post. Turns out trying to start something new is not easy, as anyone who has tried to make or break a habit knows. This is something that many people who struggle with weight loss already know. But what can make changing your lifestyle easier? In the U.S.A. we have greatly diminished smoking through legislation that makes it inconvenient and expensive to smoke. This idea is now being actively applied to creating a more healthful food environment that promotes weight loss. Probably the most publicized "bad boy" of the food world is sugar sweetened beverages. Vending machines are being removed from public schools, limits are being placed on the size of sugar sweetened beverages you can purchase, and proposals for taxes on sodas have been made (but usually are not actually passed). So is all of this political interference doing anything? Science says... maybe, maybe not (of course science gives such a flimsy answer). The research on whether attacking sugar sweetened beverages makes a difference in our waist sizes or diabetes rates is thus far inconclusive for many reasons, the most notable being that this is all so new and it takes time to see large changes.

So, let's address this issue from the other side. Instead of punishing people for wanting to drink a root beer each and every day, could we instead give them money if they don't? You may have heard about schools in Ohio and Washington D.C., among others, that are paying their students to attend class and behave. While this has been met with plenty of controversy, it is an interesting idea that may expand to weight loss. In 2008 a study was published in the Journal of the American Medical Association that demonstrated the effect that paying people to lose weight has on their ability to do so. Participants in this study were obese (BMI between 30 and 40) but otherwise healthy. They were randomized into one of three groups. The control group was given a weight loss goal and a scale. The Deposit contract group was allowed to contribute $0.01 to $3.00 a day, and that amount was matched by the researchers; at the end of a month, if this group reached their weight loss goal they were refunded their own money plus the matched deposit from the researchers, but if they did not make their weight loss goal for the month they did not get any money back. This group was essentially mimicking what would happen if we both rewarded people for weight loss and punished them for a lack of it. The final group, called the Lottery group, got to choose a two-digit number. If this number was selected at the end of each month and that participant had met their weight loss goal, they got either a $10 or a $100 monetary reward. The Lottery group therefore represents what would happen if we simply rewarded weight loss without the punishment. So what happened?
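Before we get to the results, here's a back-of-the-envelope sketch of the money involved in each incentive scheme. The deposit amounts and the 1:1 match come straight from the study description above; the lottery win probability is a number I made up purely for illustration, since the odds aren't spelled out here.

```python
# Back-of-the-envelope payoffs for the two incentive schemes described above.
# The deposit amounts and 1:1 match follow the study description; the lottery
# win probability is an assumption for illustration only.

DAYS_PER_MONTH = 30

def deposit_contract_net(daily_deposit, met_goal):
    """Net change in the participant's pocket for one month.
    Deposits of $0.01-$3.00/day are matched 1:1 by the researchers; meeting
    the goal returns the deposit plus the match, missing it forfeits the deposit."""
    stake = daily_deposit * DAYS_PER_MONTH
    return stake if met_goal else -stake

def lottery_expected_reward(win_probability, prize, met_goal):
    """Reward is paid only if the chosen number comes up AND the goal was met."""
    return win_probability * prize if met_goal else 0.0

print(deposit_contract_net(3.00, met_goal=True))    # +90.0: the $90 match (and the deposit comes back too)
print(deposit_contract_net(3.00, met_goal=False))   # -90.0: the whole deposit is forfeited
print(lottery_expected_reward(0.01, 100, met_goal=True))  # 1.0 expected dollar, with assumed 1% odds
```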


After 4 months of the study, both of the incentivised groups had lost weight (yay!), and the amount lost did not differ between the Deposit contract and the Lottery group. That means, in this study at least, it doesn't seem to matter whether you reward and punish, or just reward people for losing weight. The study ended after 4 months, but after 7 months the researchers returned to the participants to see if there was weight regain. You can see above that even after 7 months those that were in either the Deposit contract or the Lottery group had still lost weight relative to their initial weight, but these differences in weight loss were no longer significant when compared to the control group.

These same researchers published another article about the same participants, this time looking only at the comparison between the control group and the Deposit contract group. This study lasted 8 months, and you can see below that the Deposit contract group decreased their weight relative to their baseline measurement. Again, at 17 months post-study initiation when the participants were on their own there was no difference in weight loss between the two groups.


So what do you think? Is it worth paying people to lose weight? And how would this cost compare to the financial burden obesity and its related illnesses already place on us? Most importantly, would something like this ever get passed at the federal level in my lifetime without an anti-socialist revolt? Is Scienticklish getting political???



Volpp, KG, et al. Financial Incentive-Based Approaches for Weight Loss: A Randomized, Controlled Trial. JAMA (2008)

John, LK et al. Financial Incentives for Extended Weight Loss: A Randomized, Controlled Trial. Journal of General Internal Medicine (2011)

This is a great review of studies through 2010 that have looked at incentivising weight loss:
Jeffery, RW. Financial incentives and weight control. Preventive Medicine (2012)

Thursday, July 12, 2012

The Atkins Diet Debate Continues

A recent article in the New York Times has gotten a lot of attention among dieters and nutritionists alike. In this piece, a long-time obesity researcher and physician is asked about a study published in the Journal of the American Medical Association last month. In this study, a group of overweight or obese people were placed on a calorie restricted diet until they lost 10-15% of their body weight. The study participants then received, in random order, a high protein, high fat and low carbohydrate diet (similar to the Atkins diet), a low fat, high carbohydrate diet (similar to what the USDA recommends Americans eat), and a diet in between, with the carbohydrates being more complex in nature and having a low glycemic index, meaning that these carbs came from fiber-containing whole grains, fruits and vegetables as opposed to processed foods or white flour. Each of these diets was formulated to contain the same amount of calories, so that the researchers could focus on how the source of the calories, rather than the amount, affects weight loss.

The participants received each diet type for 4 weeks, and their resting energy expenditure (REE - the amount of calories your body needs to burn just to keep you alive) and total energy expenditure (TEE - the amount of calories you burn for everything you do) were measured. If you're trying to lose weight you would hope that your REE and TEE are high, because the more calories you burn, the less fat you're going to be walking around with. You can see from the graph below that there was a slight decrease in REE and a more substantial decrease in TEE when the participants ate the low fat, high carbohydrate diet relative to the high protein, low carbohydrate, Atkins-like diet.



The authors of the study go on to say that they can't explain this change in TEE by an increase in metabolism caused by changes in thyroid hormone concentrations, or by the amount of physical activity each participant did. They hypothesize that a low carbohydrate diet may increase your TEE by changing other hormones in the body, or that differences in how proteins, fats and carbohydrates are metabolized may be responsible. Dr. Hirsch disagrees, stating that this TEE effect is probably due to the fact that when you are on a low carbohydrate diet you lose a lot of your water stores, and because lean body mass includes those water stores, your lean body mass will actually decrease. Because TEE is usually expressed as calories per unit of lean body mass, you can calculate an artificial increase in TEE when the participant's lean body mass goes down.
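Here's a quick worked example of Dr. Hirsch's point, using numbers I made up: if the absolute calories burned stay the same but the lean body mass (which includes water weight) shrinks on the low carbohydrate diet, the TEE expressed per kilogram of lean mass goes up anyway.

```python
# Illustration of Dr. Hirsch's critique, with made-up numbers.
# Absolute energy expenditure is held constant; only the lean body mass
# (which includes water stores) changes between the two diets.

def tee_per_kg_lean_mass(total_kcal_per_day, lean_body_mass_kg):
    """TEE normalized to lean body mass, as commonly reported."""
    return total_kcal_per_day / lean_body_mass_kg

tee_kcal = 2700.0        # hypothetical total energy expenditure, unchanged between diets
lbm_high_carb = 60.0     # kg of lean mass on the high carbohydrate diet
lbm_low_carb = 57.0      # kg after losing ~3 kg of water on the low carbohydrate diet

print(tee_per_kg_lean_mass(tee_kcal, lbm_high_carb))  # 45.0 kcal/kg/day
print(tee_per_kg_lean_mass(tee_kcal, lbm_low_carb))   # ~47.4 kcal/kg/day - an apparent "increase"
```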

I have to say that I like Dr. Hirsch's perspective on this, not because I'm strongly against a low carbohydrate diet (just strongly against a high meat one), but because he ultimately concludes that if you want to lose weight, you should just eat less of whatever it is you're eating already. As exemplified by the Twinkie diet, this works. You can eat Twinkies all day if you want and still lose weight, as long as you only take a micro-bite of the Twinkie and stop the instant you're no longer starving.


Ebbeling, CB et al. Effect of Dietary Composition on Energy Expenditure During Weight-Loss Maintenance. Journal of the American Medical Association (2012)


Friday, June 15, 2012

Raisins Lower Your Blood Pressure!

Wait, what? I just got this study in my inbox and felt it was worth a Scienticklish reply. I have 30 minutes before my lab protocol requires my attention, so let's see if I can get through this post quickly.

Looks like some researchers in Louisville, Kentucky were looking at the effects on participants' blood pressure of eating either raisins or what they called "snacks" (cookies and crackers). Turns out that if you eat the same amount of calories from raisins as you would from cookies and crackers three times a day, and you are at risk for high blood pressure, you may actually lower your blood pressure. The lowering effect was anywhere from 4 to 8 mmHg for systolic blood pressure and 2 to 6 mmHg for diastolic blood pressure, meaning that if your normal blood pressure was 133 over 82, like these study participants, the end result of eating raisins 3 times a day for a minimum of 4 weeks is a blood pressure of about 128 over 78. Still higher than what's considered a healthy blood pressure, but better.
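For the record, that "about 128 over 78" is just midpoint arithmetic on the reported ranges; taking the middle of each range is my own simplification.

```python
# Midpoint arithmetic behind the "about 128 over 78" figure.
# Baseline pressure and reduction ranges are the ones quoted above;
# using the midpoint of each range is my own simplification.

baseline_systolic, baseline_diastolic = 133, 82
systolic_drop = (4 + 8) / 2     # midpoint of the 4-8 mmHg range
diastolic_drop = (2 + 6) / 2    # midpoint of the 2-6 mmHg range

print(baseline_systolic - systolic_drop)    # 127.0 mmHg, roughly the "128" above
print(baseline_diastolic - diastolic_drop)  # 78.0 mmHg
```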

What does this mean? Of course! Raisins are better for you than cookies! The researchers point to the fiber or antioxidants you get from raisins that you're missing in crackers and cookies. I point to the fact that it's raisins vs. cookies. I think you can figure out which one is better for you.


Should we be trying to get people to snack on raisins? Well, this is just an abstract that was presented at a recent conference, meaning I can't find the details of the study (easily). So, I would have to say that it depends on the amount of total calories these snacks are adding to the diet. I would like to see the California Raisins go head-to-head with the Cookie Monster.


Bays HE. et al. Raisins and Blood Pressure: A Randomized Controlled Trial. Journal of the American College of Cardiology (2012)

Friday, June 1, 2012

Colonizing the Colon


Over the recent Memorial Day weekend Mr. Scienticklish and I were lucky enough to participate in not one, not two, but THREE barbecues! There was a lot of corn, burgers and sweet treats to go around, but probably the most interesting food we consumed was Water Kefir, thanks to a new friend met at BBQ #2. While in conversation, this new friend was referred to as the "King of Fermentation" - a title that I'm sure only a small demographic is excited to have. Our new friendship resulted in him offering up various bacterial and yeast cultures, including a sourdough bread starter and Kefir grains. Similar in theory to fermented Kombucha beverages, Water Kefir uses various bacteria and yeast strains (together called Kefir grains) to break down sugars. When you make Water Kefir yourself you add table sugar or fruit juice to feed the cultures, which they use to produce CO2 and a small amount of ethanol, making the drink bubbly and about 0.5% alcoholic. In the end you get a fruity, carbonated drink that is very low in calories, high in active bacterial cultures and free of the vinegar taste you find in Kombucha. The one we sampled the next day was so good, in fact, that we wondered why producers choose to market such drinks as pro-biotic, fermented beverages rather than as a healthy alternative to soda. Anyway, if you can find some you should try it for the taste alone, if not for any health benefits you may get from the bacteria.

Don't be fooled by those good-for-nothing amateur biotics

Pro-biotic foods and beverages have been getting a lot of hoopla lately. I think it's pretty exciting stuff too. The more we learn about the functions of all of those bacteria living in our gut, the more we realize how integral they are for our own health. These colonies of bacteria exist mostly in the colon; some make it past the ileocecal valve into the small intestine, but when we think of the gut microflora we're generally thinking of the colon. So how did this bacteria get there? When you were a fetus you were sterile; it wasn't until you were born that mom passed her bacteria on to you as you traveled through her birth canal. As expected, babies that are born via cesarean section have a different composition of gut flora than those born vaginally. After delivery, it seems like almost everything, and then nothing, can change your gut flora. Some studies show that what the infant eats (breast fed versus formula fed, for example) and whether the baby was born in a hospital or at home can affect which bacteria colonize their gut. However, when adults are given pro-biotic foods it's still not certain what the results are. For example, the live cultures in the yogurt you eat may be doing something beneficial while they're passing through your gut, but whether they hang around and make your colon their new home isn't really known yet.

The war in your colon?
A potential way to change which bacteria grow in your gut is to change what you feed them. For example, when comparing vegans vs. vegetarians vs. meat eaters, you see different types of bacteria growing in the colon. Additionally, lean people and obese people have different worlds of bacteria taking up residency. To study the latter in more detail, researchers looked at the types of bacteria growing in the colons of obese and lean mice. They also looked at the contents of the fecal matter (taken from the caecum) and found differences in the levels of short-chain fatty acids between lean and obese mice (part a in the figure below). Next, they looked at differences in the fecal material itself, finding that the obese mice had less energy (kcal) per gram of waste, perhaps showing that the obese mice were better able to absorb that energy into their blood stream as opposed to pooping it out (part b). Finally, in what I think is the coolest part of this experiment, the researchers did a "fecal transplant" into mice that were devoid of bacteria in their gut. This means that someone, a lowly grad student I'm sure, took poop from the lean and obese mice and fed it to the bacteria-free mice so that those bacteria would grow in the colons of the new mice. And now we know why people think scientists are crazy... BUT look at the cool results in part c below! The mice that got poop from lean mice (+/+) had less than half the percent increase in body fat compared to mice that got poop from obese mice (ob/ob). Keep in mind that these mice were on the same diet, and the mice that gained more weight didn't eat any more food than the mice that didn't. Perhaps the bacteria from the obese mice helped break down nutrients to make them more readily absorbed, thus permitting those mice to take in the energy and store the fat. So now we just need to convince the world that the future of weight loss lies in fecal transplants.

I'm realizing that this is becoming quite a long blog post about poop and colons. I'm also realizing that I may have a strange fascination with these topics. I haven't even really got into what these little creatures may do for our health. Perhaps a future post about Jamie Lee Curtis and her empty promises with her Bifidus Regularis will be necessary. As for now I'm going to leave you all to digest this fecal-filled post, just like those poor mice.

Zimmer, J. et al. A vegan or vegetarian diet substantially alters the human colonic faecal microbiota. European Journal of Clinical Nutrition (2012)

Penders, J. et al. Factors Influencing the Composition of the Intestinal Microbiota in Early Infancy. Pediatrics (2006)

Turnbaugh, P.J. et al. An obesity-associated gut microbiome with increased capacity for energy harvest. Nature (2006)

Wednesday, May 16, 2012

Shiver Me Thinner?

In making the move from the sunny desert of the southwest to the chilly northeast I've often wondered what this harsh climate is doing to my body. Of course my mind automatically goes to all of the bad things caused by extreme cold - hypothermia, slipping on ice, liver failure from all of the whiskey needed to keep warm - but it turns out there may be some metabolic advantages to being in a cold environment, due to a potential increase in a type of fat tissue called brown adipose. While brown adipose has become a growing area of research in recent years, it was originally discovered decades ago. The reason for the newfound interest is that until recently it was thought that only newborn babies and non-human mammals had brown adipose; the belief was that as newborn babies grew into adolescents they lost the brown adipose and were left with only the normal white adipose tissue. So why does the discovery of this special type of fat in human adults matter? Because it may increase the use of energy from the food that we eat for heat production, in a process called non-shivering thermogenesis, which gives the body a mechanism to provide heat when living in a cold environment. Because this energy is being "burnt off" as heat, it won't be stored in your growing beer belly. So maybe you can't shiver yourself thinner, but maybe you can non-shiver yourself thinner.

The name "brown adipose tissue" comes from the look of this tissue under a microscope. It's brown, and it's adipose tissue. Sometimes science can be straight-forward. The brown color is from extra mitochondria, which are the organelles within the cell that make energy (in the form of ATP) for your body. In order to do this, the mitochondria keep an electrical gradient across its inner membrane by separating the negatively charged electrons from the positively charged protons (in the form of hydrogen atoms). The electrons then are transported through a series of proteins collectively called the electron transport chain (again, straight-forward), until the end of the chain. At this point the positive hydrogen atoms are allowed to flow back through the membrane, releasing energy that is refashioned into ATP molecules by the enzyme ATP synthase (See the picture below for the cartoon rendition of the ETC). These molecules are then passed on to every cell in the body, keeping us alive. So you can imagine that if this simple flow of hydrogen atoms down the energy gradient holds enough energy to keep each cell going, it also must be able to release that energy as heat. This is where we get back into brown adipose tissue and non-shivering thermogenesis. There is a protein called Uncoupling Protein (UCP) that is highly expressed in brown adipose. UCP will basically give the hydrogen atoms another channel to travel through the mitochondrial membrane down the energy gradient. This time instead of using that energy to make ATP it is released as heat, thereby raising the body temperature. This switch-over from ATP production to heat production causes the body to want to make more ATP, so the energy stores in your body that are the starting substrates of the reaction start to get used up.

Remember the ETC?

Those of you that are pharmacologically minded may ask: why don't we just make a weight-loss drug that uncouples the electron transport chain? Well... because it's a poison. Back in the 1930s people started taking an ETC-uncoupling chemical called dinitrophenol, DNP for short, to lose weight. It worked great! But it also resulted in your mitochondria not being able to make enough ATP for your cells to survive. No ATP = no energy = no you! Unfortunately it looks like there is still an illegal market for DNP and people are still taking it. This is pretty scary stuff; I don't think you want to mess with cellular respiration. Plus, it seems the body does this for you to a degree if you have brown adipose tissue, which you probably do. The amount of brown adipose that you have seems to be correlated with age (the younger you are, the more you have), sex (females have more), fasting blood glucose (a lower fasting glucose is associated with more brown adipose), and BMI (the leaner you are, the more non-shivering you may potentially do). Interestingly, the activity of the brown adipose also seems to correlate with the mean outdoor temperature where you live, such that the colder it is, the more glucose is taken up by the brown adipose and potentially used for heat production (see graphs below). This study just so happened to take place in Boston, so these subjects also get to experience the New England chill alongside me.



Yes, you read that right, those with a lower BMI actually seem to have more brown adipose. Why? Well, maybe they need more heat because they have less fat to act as insulation, or maybe it's the extra brown adipose that is letting them stay thinner in our obesogenic environment; no one really knows yet. Also interesting to note: those with a lower BMI or % body fat had more active brown adipose tissue, meaning the tissue is taking up more glucose from the blood (see graph below). 


Okay, so I guess I take back my previous statements, you probably can't shiver yourself thinner, or non-shiver yourself thinner, but maybe when you are thin you can non-shiver yourself warmer? 



van Marken Lichtenbelt, WD. et al. Cold-Activated Brown Adipose Tissue in Healthy Men. New England Journal of Medicine (2009)

Cypess, AM. et al. Identification and Importance of Brown Adipose Tissue in Adult Humans. New England Journal of Medicine (2009)

Bartelt, A., Merkel, M. & Heeren, J. A new, powerful player in lipoprotein metabolism: brown adipose tissue. Journal of Molecular Medicine (2012)