Risk for Alcoholism Linked to Risk for Obesity


The researchers noted that the association between a family history of alcoholism and obesity risk has become more pronounced in recent years. Both men and women with such a family history were more likely to be obese in 2002 than members of that same high-risk group had been in 1992.

“In addiction research, we often look at what we call cross-heritability, which addresses the question of whether the predisposition to one condition also might contribute to other conditions,” says first author Richard A. Grucza, PhD. “For example, alcoholism and drug abuse are cross-heritable. This new study demonstrates a cross-heritability between alcoholism and obesity, but it also says — and this is very important — that some of the risks must be a function of the environment. The environment is what changed between the 1990s and the 2000s. It wasn’t people’s genes.”

Obesity in the United States has more than doubled in recent decades, from 15 percent of the population in the late 1970s to 33 percent in 2004. Obese people — those with a body mass index (BMI) of 30 or more — have an elevated risk for high blood pressure, diabetes, heart disease, stroke and certain cancers.
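The BMI threshold above is computed from weight and height. A minimal sketch in Python (function names are illustrative, not from the study):

```python
# BMI = weight (kg) / height (m) squared; a BMI of 30 or more is classed as obese.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def is_obese(weight_kg: float, height_m: float) -> bool:
    return bmi(weight_kg, height_m) >= 30

# Example: 95 kg at 1.75 m gives a BMI of about 31, just over the threshold,
# while 70 kg at the same height gives a BMI of about 23.
```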

Reporting in the Archives of General Psychiatry, Grucza and his team say individuals with a family history of alcoholism, particularly women, have an elevated obesity risk. In addition, that risk seems to be growing. He speculates that may result from changes in the food we eat and the availability of more foods that interact with the same brain areas as addictive drugs.

“Much of what we eat nowadays contains more calories than the food we ate in the 1970s and 1980s, but it also contains the sorts of calories — particularly a combination of sugar, salt and fat — that appeal to what are commonly called the reward centers in the brain,” says Grucza, an assistant professor of psychiatry. “Alcohol and drugs affect those same parts of the brain, and our thinking was that because the same brain structures are being stimulated, overconsumption of those foods might be greater in people with a predisposition to addiction.”

Grucza hypothesized that as Americans consumed more high-calorie, hyper-palatable foods, those with a genetic risk for addiction would face an elevated risk of obesity because of the effects of those foods on the reward centers in the brain. His team analyzed data from two large alcoholism surveys from the last two decades.

The National Longitudinal Alcohol Epidemiologic Survey was conducted in 1991 and 1992. The National Epidemiologic Survey on Alcohol and Related Conditions was conducted in 2001 and 2002. Almost 80,000 people took part in the two surveys.

“We looked particularly at family history of alcoholism as a marker of risk,” Grucza explains. “And we found that in 2001 and 2002, women with that history were 49 percent more likely to be obese than those without a family history of alcoholism. We also noticed a relationship in men, but it was not as striking in men as in women.”

Grucza says a possible explanation for obesity in those with a family history of alcoholism is that some individuals may substitute one addiction for another. After seeing a close relative deal with alcohol problems, a person may shy away from drinking, but high-calorie, hyper-palatable foods also can stimulate the reward centers in their brains and give them effects similar to what they might experience from alcohol.

“Ironically, people with alcoholism tend not to be obese,” Grucza says. “They tend to be malnourished, or at least under-nourished because many replace their food intake with alcohol. One might think that the excess calories associated with alcohol consumption could, in theory, contribute to obesity, but that’s not what we saw in these individuals.”

Grucza says other variables, from smoking to alcohol intake to demographic factors such as age and education level, don’t seem to explain the association between alcoholism risk and obesity.

“It really does appear to be a change in the environment,” he says. “I would speculate, although I can’t really prove this, that a change in the food environment brought this association about. There is a whole slew of literature out there suggesting these hyper-palatable foods appeal to people with addictive tendencies, and I would guess that’s what we’re seeing in our study.”

The results, he says, suggest there should be more cross-talk between alcohol and addiction researchers and those who study obesity. He says there may be some people for whom treating one of those disorders also might aid the other.

This work was supported by grants from the National Institute on Alcohol Abuse and Alcoholism and the National Institute on Drug Abuse of the National Institutes of Health.

The above story is reprinted (with editorial adaptations by ScienceDaily staff) from materials provided by Washington University School of Medicine.

More underage drinkers end up in ER on New Year’s

By Michelle Healy, USA TODAY

Alcohol-related New Year’s celebrations send an alarmingly high number of young people to hospital emergency rooms, says a report out today.

On New Year’s Day 2009, 1,980 hospital emergency department visits involved underage drinking, according to the report from the federal Substance Abuse and Mental Health Services Administration (SAMHSA). That’s nearly four times the daily average number of alcohol-related emergency department visits by people under 21, the report says. It’s also two to three times the number of visits recorded on other “party” holidays, namely Fourth of July weekend (942) and Memorial Day weekend (676).

The study looked at all alcohol-related ER visits, but it did not specify whether they involved traffic accidents, alcohol poisoning or other issues.

The huge rise in drinking-related incidents on New Year’s “should startle us. It should wake us up,” says Peter Delany, director of SAMHSA’s Center for Behavioral Health Statistics and Quality, which did the analysis.

Not only is any underage alcohol consumption cause for concern, but drinking can also increase the likelihood of other risky behaviors, Delany says.

The findings are in line with other research showing more alcohol-related problems over the winter holidays, SAMHSA says.

Two to three times more people die in alcohol-related vehicle crashes during that time than during comparable periods the rest of the year, the National Institute on Alcohol Abuse and Alcoholism says. And 40% of traffic fatalities during winter holidays involve an alcohol-impaired driver, compared with 28% for other dates in December.

Fueling the underage drinking problem, especially at this time of the year, is “a combination of greater access to alcohol, less parental oversight and mixed messages” about celebrating with alcohol, Delany says.

Young people are told “don’t drink, don’t do that, but in every third commercial in recent weeks, we see something linked to alcohol and drinking,” he says.

And there’s also the issue of “what kind of message parents may give,” Delany adds. “Maybe they’re drinking a lot. Kids see that it’s OK.”

What is needed is a long-term message “that underage drinking is not OK,” he says. “But adolescents don’t do well with ‘Just say no.’ We have to find ways to help young people make good decisions.”

Procrastination and the Perfectionism Myth

by Dr. Piers Steel

Do you have high standards? Do you expect a lot from yourself, day in and day out? Do you love it when life is organized and orderly? Do you try to do your best at everything you do? There is a word for people like you: perfectionists. You worry over life’s details, anxious to make every event just so. And you might like to know that some believe your perfectionism is the root cause of procrastination.

But does perfectionism really cause procrastination? Lots of people think so. It’s a neat theory you’ll often hear repeated around the water cooler. There’s just one problem with it: it’s wrong. Research shows that perfectionists actually procrastinate less than other people, not more.

According to the myth, procrastination is caused by anxiety in one of its myriad forms. Sigmund Freud, for example, thought it was due to death anxiety: we delay because we live in fear of life’s ultimate deadline. In particular, the anxiety perfectionism produces supposedly induces procrastination. We delay because of our fear of failure, anxious about living up to sky-high standards. Shame on your aspirations to do better!

So how did anxiety and procrastination get all mixed up together? There is a relationship, just not the one you hear about. Most people are indeed apprehensive as the deadline looms, especially if they haven’t left themselves enough time. People can become almost paralyzed over the work they left themselves for tomorrow, knowing that they should act but remaining immobile with anxiety. But this is an expression of having procrastinated, not a cause of procrastination. For anxiety to cause procrastination, the two have to be connected; that is, anxiety-prone people have to put things off more than others. But according to analysis of about a hundred studies involving tens of thousands of participants, anxiety produces a negligible amount of procrastination at best, and even that tiny amount disappears completely after you take into account other personality characteristics, especially impulsiveness.

As best as we can figure, task anxiety is just as likely to get you to start early as to start late. That is, worrying about a deadline will make you procrastinate more if you are impulsive, the sort of person to whom avoiding a dreaded task or blocking it from your awareness makes perfect sense from a short-term perspective. If you aren’t impulsive, anxiety is a cue that you should get cracking, and as a result, you actually start earlier. The real culprit is impulsiveness, not anxiety. (But you can’t be expected to discern this effect through personal reflection; relying only on your own experiences, you would never know that anxiety decreases procrastination for many others.)

The myth that perfectionism creates procrastination makes even less sense. What traits do you associate with procrastination? A) Being messy and disorganized, or B) Being neat and orderly? If you chose option A, good for you; you are right. Perfectionists best fit description B, being neat and orderly, and unsurprisingly, they don’t tend to procrastinate. The research, from Robert Slaney, who developed the Almost Perfect Scale to measure perfectionism, to my own meta-analytic research article, The Nature of Procrastination, shows this clearly.

For example, there is a recent article by Dr. Caplan from Anadolu University entitled: “Relationship among Perfectionism, Academic Procrastination and Life Satisfaction of University Students.” Dr. Caplan takes a fine-grained approach to studying perfectionism, breaking perfectionists down into three strains: other-oriented, socially prescribed, and self-oriented. Only the last of these, self-oriented perfectionism, includes the features we typically associate with perfectionism, i.e., having high personal standards and being rather critical if you don’t meet them.

Dr. Caplan reconfirmed what has been found many times before: that “Other-oriented and socially-prescribed perfectionism traits did not predict academic procrastination” and that “self-oriented perfectionism and academic procrastination are negatively correlated,” that is, an increase in one is associated with a reduction in the other. In short, perfectionists tend to procrastinate as much as or less than other people, not more. Of course, there are still some people who are both procrastinators and perfectionists, but not as many as there are procrastinators who are non-perfectionists (or perhaps, imperfectionists?). Odds are, you don’t even believe that perfectionism causes dilly-dallying yourself. Across several surveys, only 7 percent of procrastinators blamed their sloppy habits on perfectionism.

So how did this myth come about? Why did we ever think the two traits were connected? The December 24th issue of the Globe & Mail provides a relevant excerpt from my book, The Procrastination Equation. Here’s a summary.

The confusion comes from an unexpected source. As noted above, procrastinators themselves do not blame their delaying on perfectionism; instead, this misinformation comes from clinicians and counselors. Perfectionists who procrastinate are more likely to seek help from such professionals, creating a self-selection phenomenon that gives the illusion that the two traits are linked. Clinicians tend to see a lot of perfectionist procrastinators because non-perfectionist procrastinators (and, for that matter, non-procrastinating perfectionists) are less likely to seek professional help. You see, perfectionists are more motivated to do something about their dilly-dallying because, by their very nature, they are more likely to feel worse about putting things off. Consequently, it is not perfectionism per se that is the problem but the discrepancy between high standards and less-than-stellar performance.

Since diagnosis typically precedes treatment, understanding the real reasons behind procrastination is critical to stopping it. If we feel certain that perfectionism causes procrastination, then our cures will confidently head off in the wrong direction. This isn’t to say perfectionism and fear of failure aren’t important in their own right; each has the potential to become crippling. It is just that they aren’t important here, with regard to procrastination. But we do know what is.

The research shows that there are three major, empirically confirmed causes of procrastination: expectancy, value and impulsiveness. I will tackle each one individually in the upcoming weeks. In the meantime, I want to hear from the perfectionists out there about how much you procrastinate. You can take this short quiz on Facebook to measure your level of procrastination. Are you a garden-variety dilly-dallier, or do you have “tomorrow” tattooed across your back? I’m interested to know which group is the more vocal: the perfectionists who procrastinate or the ones who don’t procrastinate much at all.

A National Institute of Depression?

by Jonathan Rottenberg, Ph.D

I am reading Siddhartha Mukherjee’s wonderful The Emperor of All Maladies: A Biography of Cancer. One of the stories it tells is about the formation of the National Cancer Institute in 1937. Here is the current mission statement of the NCI:

* Supports and coordinates research projects conducted by universities, hospitals, research foundations, and businesses throughout this country and abroad through research grants and cooperative agreements.
* Conducts research in its own laboratories and clinics.
* Supports education and training in fundamental sciences and clinical disciplines for participation in basic and clinical research programs and treatment programs relating to cancer through career awards, training grants, and fellowships.
* Supports research projects in cancer control.
* Supports a national network of cancer centers.
* Collaborates with voluntary organizations and other national and foreign institutions engaged in cancer research and training activities.
* Encourages and coordinates cancer research by industrial concerns where such concerns evidence a particular capability for programmatic research.
* Collects and disseminates information on cancer.
* Supports construction of laboratories, clinics, and related facilities necessary for cancer research through the award of construction grants.
Over time, the NCI became a major institute within the National Institutes of Health.

In other words, the NCI is a national coordinating body for research and training to reduce the menace of cancer.

If we look across the National Institutes of Health, we see that many conditions have an institute. Alcohol has an institute. Drug Abuse does. Diabetes is covered. Stroke. Allergies. Check, check.

But not depression.

Depression is soon to become the emperor of all maladies. Serious depression affects nearly a fifth of the population, and it is booming, especially in the young. If you need a call to action, look at the graph below from the National Comorbidity Survey Replication, a comprehensive national survey of mental health in the United States. It shows that young people aged 18 to 29 have already experienced as much depression as people aged 60 and over, even though they have lived less than half as long.

Although it is hard to quantify the suffering caused by depression, it is straightforward to make sound estimates of its overall burden — for example the financial costs of lost work time and increased use of health care dollars. When you do the math, depression already ranks up there with the most burdensome disorders. The World Health Organization projects that in less than 10 years, depression will be the 2nd most burdensome condition. That’s greater, by the way, than cancer.

A National Institute on Depression makes sense not only because there is an urgent public health need but because there is so much about depression that we still don’t know. In my last post, I pointed out one striking example: a recent search revealed only one well-designed prospective study of the depressive prodrome (i.e., warning symptoms that herald depression). And much of our knowledge about depression isn’t well coordinated. We don’t do a good job of even knowing what we already know.

Poor integration of existing knowledge is one of the major limitations of the current national approach to depression. Thus, in addition to giving a new NID the broad mission along the lines of the NCI, it would be important for the NID to allow room for many approaches, not only the exclusive focus on genes and brains that has yielded relatively modest results so far but cognitive, sociological, and anthropological perspectives as well.

In our times of fiscal retrenchment, it’s easy to shoot down any new initiative. But at the same time, wouldn’t keeping our current approach to depression be far riskier to public health, and more costly in the long run? The time is right to make a modest investment to expand federal research on depression and bring together the work that’s currently done in several NIH institutes (Mental Health, Aging, Child Health and Development) into a stand-alone National Institute of Depression. If not now, when?

Countering ‘Memory Loss’ In The Immune System

On December 26, 2010, in Immunology, by Christopher Fisher, PhD

After recovering from a cold or other infection, your body’s immune system is primed to react quickly if the same agent tries to infect you. White blood cells called memory T cells specifically remember the virus or bacterium and patrol the body looking for it. Vaccines work on the same principle: Harmless fragments of a virus or bacterium provoke the immune system to generate memory T cells that can attack the real thing later on.

As time passes, however, this specific immunity can wear off. That is because not all memory T cells live long enough to foster long-term immunity.

MIT biologists have now demonstrated the conditions that favor development of long-term memory T cells over short-term memory T cells, which can respond quickly but do not stick around for very long after the initial infection. That discovery could help vaccine designers better tailor their formulas to elicit long-term memory immunity, says Jianzhu Chen, MIT professor of biology and member of the David H. Koch Institute for Integrative Cancer Research.

Chen and Herman Eisen, emeritus professor of biology, are senior authors of a paper on the work that appeared in the Proceedings of the National Academy of Sciences the week of Dec. 13.

In the PNAS study, the MIT team looked at mice infected with influenza. In mice, as in humans, influenza virus stimulates T cells, whose job is to kill infected cells. Every T cell is programmed to recognize different foreign proteins (also called antigens) located on the surfaces of infected cells. When a T cell binds to the antigen, the T cell becomes activated and starts rapidly reproducing, creating an army of cells that can identify and destroy the invader.

Once the infection is eliminated, most of the activated T cells die off, but a few of them stick around, in case the virus comes back. These are short-term memory T cells. Because they have already battled the virus and reproduced many times, they survive only weeks or months after the initial infection. (T cells can only divide a certain number of times before they die.)

A set of long-term memory T cells also develops during infection. These cells are programmed differently, so they can persist for decades. Recipients of the smallpox vaccine, for example, have been shown to still have T cells against the virus up to 70 years later, says Eisen.

Until now, it has been unclear how these different cell types develop. In their new study, Eisen and Chen investigated the role of three factors: T-cell location, the amount of antigen exposure, and the length of exposure.

Scientists already knew that T cell contact with a large amount of virus provokes development of short-term memory T cells, says Eisen. Chen and colleagues discovered that large amounts of antigen also suppress development of long-term memory T cells. Those cells only develop when exposed to a small amount of the antigen for a short period of time.

For example, if you have an infection in the respiratory tract, nearby T cells will be exposed to many viruses and become short-term memory cells. Those cells hang around the respiratory tract, ready to pounce quickly if the same virus re-infects you, but they eventually die off.

In more distant parts of the body, T cells are exposed to only small amounts of the virus, and some of those cells become long-term memory T cells specific to that virus. These maintain a low level of constant vigilance in case the virus ever returns.

Ulrich von Andrian, professor of immunopathology at Harvard, says the new study’s major contribution is its experimental support of existing theories. “It builds on ideas that have been around for a while, that were not rigorously tested by experiments, for the most part,” says von Andrian, who was not part of the research team.

When developing vaccines, the goal is usually to generate a stable population of long-term memory T cells. This study suggests that the best way to do that is to give a small amount of antigen, and, for vaccines that require multiple injections, not to give them too frequently.

“The general rule of thumb is that you don’t want to give a large amount of antigen on a short-term basis,” says Chen. He adds that the amount of antigen needed to induce long-term memory T cells likely varies depending on the route of immunization and the form of the antigen, so the dosage for each vaccine will have to be determined through experiments.

He says the findings will likely not impact flu-vaccine design because existing dosages have already been optimized over many decades. However, the findings should be applicable to vaccines now under development for other diseases, such as HIV, tuberculosis and dengue fever, says Chen.

Material adapted from MIT.